
Graduate Medical Education in Singapore



BACKGROUND

Prior to 2010, Singapore’s graduate medical education had been modelled after the UK system, which was based on apprenticeship and summative assessments. Doctors spent their first year after graduation as interns (House Officers) to obtain their licence to practise, after which they could choose to undertake graduate medical studies. Specialty education began with basic specialty training for three years (which was broad-based for internal medicine and the surgical specialties), followed by more focused advanced specialty training for a further three years (Figure 1). Training was traditionally structured around hospital rotations under supervisors appointed by Heads of Departments. There was a high-bar intermediate examination between the basic and advanced years, and an exit examination before the doctor exited as a specialist. Over the years, the duration of broad-based basic training for the surgical specialties was reduced to two years, and the bar for the intermediate examinations was lowered.

Figure 1: Graduate Medical Education System Prior To 2010



THE NEED FOR CHANGE

2. In 2006 and 2007, MOH conducted cross-sectional surveys and interviews with specialist and family medicine trainees on graduate medical training. The main concerns raised included the lack of training structure, and insufficient supervision and protected time for training. These could be attributed to three factors: the increase in service demands, the introduction of a fee-for-service model for doctors’ salaries, and the increase in the number of junior doctors undergoing training.

3. Previous efforts had been made to facilitate training. For example, there was an attempt to secure protected time for trainees through funding earmarked to protect 40% of their time. However, this protected time was rarely honoured. In addition, training was usually carried out on a voluntary and ad hoc basis, and commonly occurred after office hours. Emphasis was placed on learning by the individual, rather than on training. Despite other efforts, such as the retrospective accreditation of past postings for specialty training and the introduction of electronic logbooks, the nature of specialty training remained suboptimal.

4. Secondly, it was found that the high-bar British examinations had been a deterrent to universal graduate medical education for our doctors. The pass rate for most of the UK membership examinations, especially the Membership of the Royal College of Physicians, among local trainees was approximately 40 to 60% per sitting. This, together with the lack of a conducive training environment, deterred a significant proportion of doctors from continuing their training. In each cohort of medical graduates, about 35% qualified as specialists and about 10% as family physicians, and this was largely achieved through their own efforts. The remaining graduates left the public sector after completing their service obligation (bond) to become general practitioners in the community. This was a wasted opportunity to develop better-skilled doctors, especially given the service obligation imposed on our locally-trained doctors.

5. Thirdly, there had been an overall increase in demand for specialist manpower. In response to the need for more specialists to staff the upcoming hospitals, the quota on specialist training was lifted in November 2007. With the ramp up in the number of specialists being trained, there was a need to also ensure quality of training.

6. The urgency for change was catalysed by our larger and ageing population, the increased number of trainees, the gradual erosion of the apprenticeship system due to service pressures, and the need to improve graduate medical education, as shown by the feedback received. In addition, there was a need to optimise our human capital by providing quality training, to ensure a good quality of patient care.

7. Between 2006 and 2007, MOH and the Specialists Accreditation Board (SAB) studied the graduate training systems of Australia, US, UK and Europe, and it was noted that there was a consistent trend towards more structured and formative training in these countries. Many of our senior clinicians, senior management of our restructured hospitals, the Director of the School of Graduate Medical Studies, NUS, the Master of the Academy of Medicine, Singapore and the Deans of our medical schools were engaged to explore how we could incorporate the best practices of these training systems while preserving the strengths of our existing framework.

8. MOH also invited the Accreditation Council for Graduate Medical Education (ACGME), which had decades of experience in structured formative training, to Singapore, to carry out a needs analysis for our specialty training system, and to advise the Ministry on measures which could be taken to improve it. Details of the collaboration with ACGME can be found in Annex A.

9. It was concluded that the most efficient training system was the US residency system, in which training was highly structured and formative, and assessment was continuous and standardised across all institutions in the US. However, it was not our intention to adopt any particular country’s training system completely or to discard our current system altogether.

10. In 2008, the SAB recommended to MOH that our graduate medical education be reformed and structured along the lines of the US residency model to provide a high yield of well-trained doctors for Singapore.



NATURE OF CHANGES

11. The Residency postgraduate training system was introduced in 2010. Implementation was carried out in phases, with an incremental increase in the number of specialties being brought into the system. The first phase of residency was rolled out in July 2010 with eight programmes offered, including a Transitional Year residency programme, which allowed residents a one-year exposure to various clinical disciplines to facilitate the choice of, and preparation for, a chosen specialty.

12. In July 2011, the second phase was rolled out, comprising 12 programmes. This included anaesthesia, ophthalmology, ENT, orthopaedics, radiology, O&G, and a common-trunk Surgery-in-General leading to training in five surgical sub-specialties. Various internal medicine specialties were included in the third phase in July 2013.

13. There were four key components of graduate medical education which were restructured – curriculum, assessments, people and systems. A structured curriculum was established to achieve a defined set of objectives and core competencies, and residents were given graded responsibilities. Training was provided by designated faculty, and residents had to undergo regular formative assessments. These four components created a comprehensive matrix to ensure the quality of teaching and learning, and therefore, the quality of specialists.

Curriculum

14. The curriculum was contextualised to suit local needs (Figure 2). It detailed how the conventional competencies of medical knowledge and patient care should be taught progressively, and visibly demonstrated and practised, so that residents could increase their responsibilities in a stepwise manner. Besides specialty-specific competencies, emphasis was also placed on general competencies, which were becoming increasingly important.

15. Newer competencies such as professionalism, communication, practice-based learning, scholarly activities and system-based practice were also included in the curriculum.

Figure 2: The Curriculum

Assessment

16. In the prevailing British system, the emphasis was on the intermediate and exit summative examinations, which, because of their high-stakes, punitive nature, drove learning. There was insufficient rigour in the training prior to the examinations, and training was not monitored or assessed on a regular basis. Formative assessment processes and tools, such as mini clinical examinations and in-training examinations, were not given specific attention or importance. Hence, there was a disconnect between training and examinations.

17. In addition, the UK membership examinations were usually moderated according to the overall performance of the examination candidates. In contrast, the US Specialty Board examinations were competency based, and not moderated to achieve a pass rate.

18. Under the Residency system, regular formative assessments and examinations were introduced. Trainees were assessed at regular intervals and provided with qualitative feedback. Many of these tools of assessment were psychometrically-validated, enabling international benchmarking. The assessments were also contextualised for local use, and all questions were scrutinised by the local test design workgroup.

People

19. Under the apprenticeship system, teaching was considered an honourable or respectable thing to do, and it was assumed that this would be done with dedication, despite the faculty not being allocated protected time for it. However, due to the increasing workload and the fact that doctors’ incomes were closely tied to service, few senior doctors found the time to devote themselves to teaching or following up on the development of their trainees.

20. Unlike the prior specialist training system, the development of residency programmes involved the separation of the roles of training providers and regulators. Thus, the individual Sponsoring Institutions (SIs) were responsible for the selection of residents and administration of residency programmes. Designated faculty were identified and given protected time to provide oversight of specialty training for the residents.

21. Every SI had to ensure that the Designated Institutional Official (DIO), Programme Directors (PD), Core Faculty and Programme Coordinators (Figure 3) appointed by them had sufficient financial support and protected time to effectively carry out their educational and administrative responsibilities. This included the resources (e.g. time, space, technology and other infrastructure) to allow for effective administration of the Graduate Medical Education Office. Attention was also given to the training and development of the faculty. In addition, teaching activities were supported by administrative staff, such as the Programme Coordinators. The funding principles for the residency programme are outlined in Annex B.

Figure 3: Key People in Sponsoring Institutions

22. The roles of the DIO were to:
 
a. Establish and implement policies and procedures (decisions of the Graduate Medical Education Committee, GMEC) regarding the quality of education and the work environment for the residents in all programmes;
b. Ensure quality of all training programmes;
c. Implement institutional requirements;
d. Oversee institutional agreements;
e. Ensure consistent resident contracts;
f. Develop, implement, and oversee an internal review process;
g. Establish and implement procedures to ensure that he/she, or a designee in his/her absence, reviews and co-signs all programme information forms and any documents or correspondence submitted to ACGME and MOH by PDs; and
h. Present an annual report (covering the GMEC’s activities during the past year, with attention at a minimum to resident supervision, resident responsibilities, resident evaluation, compliance with duty-hour standards, and resident participation in patient safety and quality of care education) to the governing body or bodies of the SI and major participating sites.

23. The roles of the PD were to:

a. Be responsible for all aspects of the residency training programme;

b. Recruit residents & faculty;
c. Develop curriculum with faculty assistance;
d. Assess residents’ progress through programme;
e. Certify graduates’ competence to practise independently;
f. Ensure residents are provided with a written agreement of appointment/contract outlining the terms and conditions of their appointment to a programme;
g. Ensure residents are informed of and adhere to established educational and clinical practices, policies, and procedures in all sites to which residents are assigned; and
h. Administer and maintain an educational environment conducive to educating residents in each ACGME competency area.

24. The roles of the Core Faculty were to:

a. Devote sufficient time to an educational programme to fulfil their supervisory and teaching responsibilities and to demonstrate a strong interest in the education of residents;
b. Administer and maintain an educational environment conducive to educating residents in each of the competency areas; and
c. Establish and maintain an environment of inquiry and scholarship with an active research component.

Systems

25. A system of supporting organisational structures was put in place to provide the necessary checks and balances to ensure that residents were having the desired learning experience, and that teaching was conducted regularly and in accordance with plans (Figure 4).

Figure 4: Organisational Structure

26. Both internal and external reviews were carried out on a regular basis so that improvements in the system could be made. Internally, each SI was required to form the Graduate Medical Education Committee (GMEC), which was responsible for establishing and implementing policies and procedures regarding the quality of education and the work environment for the residents in all programmes.

27. Residency Advisory Committees (RACs) were appointed by the SAB to oversee the respective specialties and the Transitional Year training programmes. The roles of the various RACs were:

a. To develop the various specialties taking into consideration the healthcare needs of the nation;
b. To organise the In-Training Exams (ITEs) annually for the specialties and review the results of ITEs to improve the specialty training programme;
c. To work with the SAB, Joint Committee on Specialist Training (JCST) and MOH in matters pertaining to the specialty training and to assist in the selection and interview of residents/trainees;
d. To advise MOH and ACGME-I, where necessary or requested, on the standards, programme structure and curriculum for the specialties in Singapore;
e. To conduct accreditation site visits when necessary;
f. To work with the institutions, DIOs and PDs to ensure that the programmes comply with the requirements and standards set by ACGME-I and the local RAC, in consultation with MOH;
g. To advise, together with the specialty’s panel of subject matter experts for examinations, on the necessary improvements for assessments of residents; and
h. To conduct or oversee examinations or evaluations (American Board of Medical Specialties International (ABMS-I) or equivalent) for residents, based on the format approved by SAB/JCST.

28. The composition of the GMEC included the following:

  • All PDs
  • Associate DIOs
  • Representative Associate PD from major participating sites
  • Peer selected resident representatives
  • Institutional Coordinator
  • Representative from Human Resource
  • Representative from the Undergraduate Medical School

29. In addition, the GMEC consisted of the following Sub-Committees:

  • Resident Welfare Sub-Committee
  • Curriculum Sub-Committee
  • Evaluation Sub-Committee
  • Internal Review Sub-Committee
  • Faculty Development Sub-Committee
  • Research Sub-Committee
  • Work Improvement Sub-Committee

30. The GMEC was also required to develop, implement, and oversee an internal review process by forming an internal review committee for each programme.

31. In addition to the institutional education committees, external reviews were conducted by the ACGME’s residency review committee. As the accrediting body, the review committee conducted regular audits on the integrity of the institutions’ systems and the volume and quality of the teaching processes.



IMPLEMENTATION

32. There was a need to contextualise the ACGME-I Institutional, Foundational and Advanced specialty requirements when we implemented them. Three levels of difficulty were recognised. Firstly, there were requirements which we could adopt completely, albeit with some effort. Secondly, there were requirements which had to be modified to suit our local needs. Thirdly, there were requirements which were incompatible with the design of our system and our needs, and could not be implemented.

Requirements that were Adopted Completely

33. SIs and participating sites for each institution were identified, and the organisational structure for graduate medical education was developed. Protected time was established for the faculty, and faculty development was carried out through workshops and educational symposia.

34. Continuity clinics were implemented for specialist and family medicine training. Previously, there was a lack of continuity of patient care as trainees were not assigned to specific clinics. This was deemed unacceptable. Hence, continuity clinics were implemented, with residents assigned to one to two continuity clinics each week.

Requirements that were Modified and Adopted

35. With regard to case load and work hour requirements, while we recognised the value of appropriate case loads and work hours, we were unable to match ACGME standards due to our service needs and manpower constraints. Hence, we had to accept a compromise for at least the short to mid-term, as we increased manpower among junior staff. For example, ACGME stipulated that a Postgraduate Year 1 (PGY1, equivalent to the House Officer year) doctor should clerk no more than 12 cases per shift. However, in reality, our PGY1 doctors were clerking around 30 to 40 cases per shift. Hence, a compromise was made for them to clerk no more than 20 cases per shift on average.

Requirements that were Incompatible with our System Design

36. Some requirements were not implementable due to our system design. For example, to be licensed to practise, our graduates had to spend at least 3 months each in medicine and surgery during PGY1. Hence, ACGME curricular requirements had to be modified. Fortunately, there were sufficient elective opportunities in most residency requirements to enable our licensing requirements to be met.

37. In addition, we decided to retain a broad-based approach to training, especially in the internal medicine-related specialties and some surgical specialties, to facilitate holistic patient care and minimise fragmentation of care. Our duration of training was also retained: specialty training in the US typically took 3 to 4 years, while ours usually took 5 to 6 years. As there was insufficient evidence to indicate the optimum duration of training with certainty, and competency-based training and assessment was still evolving, we chose to retain our current training durations, to minimise disruptions to our current system.



SUMMARY

38. The Residency system has allowed for the development of structured postgraduate medical education, with clearly defined curricula and regular formative assessments. Designated faculty were given protected time to provide specialty training to residents.

39. A current gap in our assessment system is the reliance on external bodies to conduct assessments. While the help of these bodies will continue to be required in the near future, in the long run, there is a need for our clinical educators to develop our own assessment programmes, and for such expertise to be rooted in Singapore.

40. Postgraduate medical education will continue to evolve with time, and formal reviews will be undertaken at regular intervals. We must continue improving our system, building on what has worked well and eliminating what has not. This will ensure that our system remains relevant and contextualised to meet Singapore’s needs.
