
BACKGROUND

 

Prior to 2010, Singapore’s graduate medical education had been modelled after the UK system, which was based on apprenticeship and summative assessments. Doctors spent their first year after graduation as interns (House Officers) to obtain their license to practise, after which they could choose to undertake graduate medical studies. Specialty education began with basic specialty training for 3 years (broad based for internal medicine and the surgical specialties), followed by more focused advanced specialty training for a further 3 years (Figure 1). The traditional focus was training, structured around hospital rotations under supervisors appointed by Heads of Departments. There was a high-bar intermediate examination between the basic and advanced years, and an exit examination before doctors exited as specialists. Over the years, the duration of broad-based basic training for the surgical specialties was reduced to two years, and the bar for the intermediate examinations was lowered.

 


Figure 1: Graduate Medical Education System Prior To 2010

 


 


 

THE NEED FOR CHANGE

 

2. In 2006 and 2007, MOH conducted cross-sectional surveys and interviews with specialist and family medicine trainees on graduate medical training. The main concerns raised included the lack of training structure, and insufficient supervision and protected time for training. These could be attributed to three factors – the increase in service demands, the introduction of a fee-for-service model for doctors’ salaries, and the increase in the number of junior doctors undergoing training.

 

3. Previous efforts had been made to facilitate training. For example, there was an attempt to secure protected time for trainees through appropriate funding to protect 40% of the trainees’ time. However, it was found that protected time was rarely honoured. In addition, training was usually carried out on a voluntary and ad hoc basis, and commonly occurred after office hours. Emphasis was placed on learning by the individual, rather than on training. Despite other efforts such as retrospective accreditation of past postings for specialty training and the introduction of electronic logbooks, the nature of specialty training remained suboptimal.

 

4. Secondly, it was found that the high-bar British examinations had been a deterrent to universal graduate medical education for our doctors. The pass rate among local trainees for most of the UK membership examinations, especially the Membership of the Royal College of Physicians, was approximately 40 to 60% per sitting. This, together with the lack of a conducive training environment, deterred a significant proportion of doctors from continuing their training. For each cohort of medical graduates, about 35% qualified as specialists and about 10% qualified as family physicians, and this was largely achieved through their own efforts. The remaining graduates left the public sector after completing their service obligation (bond) to become general practitioners in the community. This was a wasted opportunity to develop better-skilled doctors, especially given the service obligation imposed on our locally-trained doctors.

 

5. Thirdly, there had been an overall increase in demand for specialist manpower. In response to the need for more specialists to staff the upcoming hospitals, the quota on specialist training was lifted in November 2007. With the ramp up in the number of specialists being trained, there was a need to also ensure quality of training.

 

6. The urgency for change was catalysed by our larger and ageing population, the increased number of trainees, the gradual erosion of the apprenticeship system due to service pressures, and the need to improve graduate medical education, as shown by the feedback received. In addition, there was a need to optimise our human capital by providing quality training, to ensure good quality of patient care.

 

7. Between 2006 and 2007, MOH and the Specialists Accreditation Board (SAB) studied the graduate training systems of Australia, the US, the UK and Europe, and noted a consistent trend towards more structured and formative training in these countries. Many of our senior clinicians, the senior management of our restructured hospitals, the Director of the School of Graduate Medical Studies, NUS, the Master of the Academy of Medicine, Singapore, and the Deans of our medical schools were engaged to explore how we could incorporate the best practices of these training systems while preserving the strengths of our existing framework.

 

8. MOH also invited the Accreditation Council for Graduate Medical Education (ACGME), which had decades of experience in structured formative training, to Singapore, to carry out a needs analysis for our specialty training system, and to advise the Ministry on measures which could be taken to improve it. Details of the collaboration with ACGME can be found in Annex A.

 

9. It was concluded that the most efficient training system was the US residency training system in which there was highly structured formative training, and where the assessment was continuous and standardised across all the institutions in the US. However, it was not our intention to adopt any particular country’s training system completely or to discard our current system altogether.

 

10. In 2008, the SAB recommended to MOH that our graduate medical education be reformed and structured along the lines of the US residency model to provide a high yield of well trained doctors for Singapore.

 


 


 

NATURE OF CHANGES

 

11. The Residency postgraduate training system was introduced in 2010. Implementation was carried out in phases, with an incremental increase in the number of specialties being brought into the system. The first phase of residency was rolled out in July 2010 with eight programmes offered, including a Transitional Year residency programme which allowed residents a one-year exposure to various clinical disciplines to facilitate the choice of, and preparation for, a chosen specialty.

 

12. In July 2011, the second phase was rolled out, comprising 12 programmes. This included anaesthesia, ophthalmology, ENT, orthopaedics, radiology, O&G, and a common-trunk Surgery-in-General leading to training in five surgical sub-specialties. Various internal medicine specialties were included in the third phase in July 2013.

 

13. There were four key components of graduate medical education which were restructured – curriculum, assessments, people and systems. A structured curriculum was established to achieve a defined set of objectives and core competencies, and residents were given graded responsibilities. Training was provided by designated faculty, and residents had to undergo regular formative assessments. These four components created a comprehensive matrix to ensure the quality of teaching and learning, and therefore, the quality of specialists.

 

Curriculum

14. The curriculum was contextualised to suit local needs (Figure 2). It detailed how conventional competencies of medical knowledge and patient care should be taught progressively, and visibly demonstrated and practised, so that residents could increase their responsibilities in a stepwise manner. Besides specialty-specific competencies, emphasis was also placed on general competencies, which were increasingly becoming more important.

 

15. Newer competencies such as professionalism, communication, practice-based learning, scholarly activities and system-based practice were also included in the curriculum.

 


Figure 2: The Curriculum

 

Assessment

16. In our prevailing British system, the emphasis was on intermediate and exit summative examinations, which, because of their high-stakes nature, drove learning. There was insufficient rigour in the training prior to the examinations, and training was not monitored or assessed on a regular basis. Formative assessment processes and tools, such as mini clinical examinations and in-training examinations, were not given specific attention or importance. Hence, there was a disconnect between training and examinations.

 

17. In addition, the UK membership examinations were usually moderated according to the overall performance of the examination candidates. In contrast, the US Specialty Board examinations were competency based, and not moderated to achieve a pass rate.

 

18. Under the Residency system, regular formative assessments and examinations were introduced. Trainees were assessed at regular intervals and provided with qualitative feedback. Many of these tools of assessment were psychometrically-validated, enabling international benchmarking. The assessments were also contextualised for local use, and all questions were scrutinised by the local test design workgroup.

 

People

19. Under the apprenticeship system, teaching was considered an honourable or respectable thing to do, and it was assumed that this would be done with dedication, despite the faculty not being allocated protected time for it. However, due to the increasing workload and the fact that doctors’ incomes were closely tied to service, few senior doctors found the time to devote themselves to teaching or following up on the development of their trainees.

 

20. Unlike the prior specialist training system, the development of residency programmes involved the separation of the roles of training providers and regulators. Thus, the individual Sponsoring Institutions (SIs) were responsible for the selection of residents and administration of residency programmes. Designated faculty were identified and given protected time to provide oversight of specialty training for the residents.

 

21. Every Sponsoring Institution had to ensure that the Designated Institutional Official (DIO), Programme Directors (PDs), Core Faculty and Programme Coordinators (Figure 3) it appointed had sufficient financial support and protected time to carry out their educational and administrative responsibilities effectively. This included the resources (e.g. time, space, technology and other infrastructure) to allow for effective administration of the Graduate Medical Education Office. Attention was also given to the training and development of the faculty. In addition, teaching activities were supported by administrative staff, such as the Programme Coordinators. The funding principles for the residency programme are outlined in Annex B.

 


Figure 3: Key People in Sponsoring Institutions

 

22. The roles of the DIO were to:

a. Establish and implement policies and procedures (decisions of the GMEC) regarding the quality of education and the work environment for the residents in all programmes;

b. Ensure quality of all training programmes;

c. Implement institutional requirements;

d. Oversee institutional agreements;

e. Ensure consistent resident contracts;

f. Develop, implement, and oversee an internal review process;

g. Establish and implement procedures to ensure that he/she, or a designee in his/her absence, reviews and co-signs all programme information forms and any documents or correspondence submitted to ACGME and MOH by PDs; and

h. Present an annual report (includes GMEC’s activities during the past year with attention to, at a minimum, resident supervision, resident responsibilities, resident evaluation, compliance with duty-hour standards, and resident participation in patient safety and quality of care education) to the governing body(s) of the SI and major participating sites.

 

23. The roles of the PD were to:

a. Be responsible for all aspects of the residency training programme;

b. Recruit residents & faculty;

c. Develop curriculum with faculty assistance;

d. Assess residents’ progress through programme;

e. Certify graduates’ competence to practise independently;

f. Ensure residents are provided with a written agreement of appointment/contract outlining the terms and conditions of their appointment to a programme;

g. Ensure residents are informed of and adhere to established educational and clinical practices, policies, and procedures in all sites to which residents are assigned; and

h. Administer and maintain an educational environment conducive to educating residents in each ACGME competency area.

 

24. The roles of the Core Faculty were to:

a. Devote sufficient time to an educational programme to fulfil their supervisory and teaching responsibilities and to demonstrate a strong interest in the education of residents;

b. Administer and maintain an educational environment conducive to educating residents in each of the competency areas; and

c. Establish and maintain an environment of inquiry and scholarship with an active research component.

 

Systems

25. A system of supporting organisational structures was put in place to provide the necessary checks and balances to ensure that residents were having the desired learning experience, and that teaching was conducted regularly and in accordance with plans (Figure 4).

 


Figure 4: Organisational Structure

 

26. Both internal and external reviews were carried out on a regular basis so that improvements in the system could be made. Internally, each Sponsoring Institution (SI) was required to form the Graduate Medical Education Committee (GMEC), which was responsible for establishing and implementing policies and procedures regarding the quality of education and the work environment for the residents in all programmes.

 

27. Residency Advisory Committees (RACs) were appointed by the SAB to oversee the respective specialties and TY training programmes. The roles of the various RACs were:

a. To develop the various specialties taking into consideration the healthcare needs of the nation;

b. To organise the In-Training Examinations (ITEs) annually for the specialties and to review the results of the ITEs to improve the specialty training programmes;

c. To work with the SAB, Joint Committee on Specialist Training (JCST) and MOH in matters pertaining to the specialty training and to assist in the selection and interview of residents/trainees;

d. To advise MOH and ACGME-I, where necessary or on request, on the standards, programme structure and curriculum for the specialties in Singapore;

e. To conduct accreditation site visits when necessary;

f. To work with the institutions, DIOs and PDs to ensure that the programmes comply with the requirements and standards set by ACGME-I and the local RAC, in consultation with MOH;

g. To advise, together with the specialty’s panel of subject matter experts for examinations, on the necessary improvements for assessments of residents; and

h. To conduct or oversee examinations or evaluations (American Board of Medical Specialties International (ABMS-I) or equivalent) for residents, based on the format approved by the SAB/JCST.

 

28. The composition of the GMEC included the following:

  • All PDs
  • Associate DIOs
  • Representative Associate PD from major participating sites
  • Peer selected resident representatives
  • Institutional Coordinator
  • Representative from Human Resources
  • Representative from the Undergraduate Medical School

 

29. In addition, the GMEC consisted of the following Sub-Committees:

  • Resident Welfare Sub-Committee
  • Curriculum Sub-Committee
  • Evaluation Sub-Committee
  • Internal Review Sub-Committee
  • Faculty Development Sub-Committee
  • Research Sub-Committee
  • Work Improvement Sub-Committee

 

30. The GMEC was also required to develop, implement, and oversee an internal review process by forming an internal review committee for each programme.

 

31. In addition to the institutional education committees, external reviews were conducted by the ACGME’s residency review committee. As the accrediting body, the review committee conducted regular audits on the integrity of the institutions’ systems and the volume and quality of the teaching processes.

 


 


 

IMPLEMENTATION

 

32. There was a need to contextualise the ACGME-I Institutional, Foundational and Advanced specialty requirements when we implemented them. Three levels of difficulty were recognised. Firstly, there were requirements which we could adopt completely, albeit with some effort. Secondly, there were requirements which had to be modified to suit our local needs. Thirdly, there were requirements which were incompatible with the design of our system and our needs, and could not be implemented.

 

Requirements that were Adopted Completely

33. SIs and participating sites for each institution were identified, and the organisational structure for graduate medical education was developed. Protected time was established for the faculty, and faculty development was carried out through workshops and educational symposia.

 

34. Continuity clinics were implemented for specialist and family medicine training. Previously, there was a lack of continuity of patient care as trainees were not assigned to specific clinics. This was deemed unacceptable. Hence, continuity clinics were implemented, with residents assigned to one or two continuity clinics each week.

 

Requirements that were Modified and Adopted

35. With regard to case load and work hour requirements, while we recognised the value of an appropriate case load and work hours, we were unable to match ACGME standards due to our service needs and manpower constraints. Hence, we had to accept a compromise, at least for the short to mid term, as we increased our manpower among junior staff. For example, ACGME stipulated that a Postgraduate Year 1 (PGY1, equivalent to the House Officer year) doctor clerk no more than 12 cases per shift. In reality, however, our PGY1 doctors were clerking around 30 to 40 cases per shift. Hence, a compromise was made for them to clerk no more than 20 cases per shift on average.
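To make the averaged cap concrete, here is a minimal sketch. Treating the 20-case figure as a mean over a roster of shifts, and the function name itself, are assumptions made purely for illustration:

```python
def within_average_caseload_cap(cases_per_shift, cap=20):
    """Check the compromise cap described above: no more than 20 cases
    clerked per shift on average. Interpreting the cap as a mean over a
    roster period is an assumption for this sketch, not MOH policy text.

    cases_per_shift: list of case counts, one entry per shift.
    """
    return sum(cases_per_shift) / len(cases_per_shift) <= cap

# Shifts of 18, 25 and 16 cases average about 19.7, within the cap.
print(within_average_caseload_cap([18, 25, 16]))   # True
# The pre-compromise reality of 30 to 40 cases per shift is well over it.
print(within_average_caseload_cap([30, 40, 35]))   # False
```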

 

Requirements that were Incompatible with our System Design

36. Some requirements were not implementable due to our system design. For example, to be licensed to practise, our graduates had to complete at least 3 months each in medicine and surgery during PGY1. Hence, ACGME curricular requirements had to be modified. Fortunately, there were sufficient elective opportunities in most residency requirements to enable our licensing requirements to be met.

 

37. In addition, we decided to retain a broad-based approach to training, especially in the internal medicine-related specialties and some surgical specialties, to facilitate holistic patient care and minimise fragmentation of care. Our duration of training was also retained. Specialty training in the US typically took 3 to 4 years, while ours usually took 5 to 6 years. As there was insufficient evidence to indicate the optimum duration of training with certainty, and competency-based training and assessment was still evolving, we chose to retain our current training durations to minimise disruption to our current system.

 


 


 

ENROLMENT INTO THE RESIDENCY PROGRAMME

 

Eligibility Criteria for Residency Applications

38. General criteria for residency applications include the following:

a. Graduands or graduates of Singapore medical schools and those with primary medical qualifications registrable under the Medical Registration Act (First Schedule) are eligible to apply. Graduands from overseas must secure an offer of employment as a doctor from MOHH or a local healthcare institution before they are eligible to apply.

b. Candidates should already be on Provisional, Conditional or Full Registration with the Singapore Medical Council at the start date of residency. Temporary registered doctors working in public healthcare institutions who wish to apply for residency may do so on the recommendation of their Heads of Department and Programme Directors, and must switch from cluster contracts to MOHH employment if successful.

c. In general, a resident who wishes to switch programme has to resign from his or her residency and wait one year before applying for residency again. However, in circumstances where a resident wishes to transfer to a less popular residency programme, where there are unfilled positions in the programme, this penalty may not be imposed. Such cases will be reviewed on a case-by-case basis.

d. Candidates should be employed by MOHH at the point of acceptance into the residency.

 

39. Programme-specific criteria (Table 1) include the following:

a. For Direct Entry Programmes (in broad-based specialties), individuals could commence the programme during Postgraduate Year 1 (PGY1), while fulfilling their licensing requirements.

b. For Post-House Officer (HO) / Transitional Year (TY) Programmes (which include Surgery-in-General-based Programmes and all phase 2 programmes), entrants must have completed HO/ TY before commencing Residency Year 1 at PGY2 and above. Fresh medical graduates matched to these programmes (categorical position awarded) would need to complete a categorical TY programme in their PGY1 before commencing Residency Year 1.

 

 

Table 1: Residency Programmes by Entry Route

Direct Entry Programmes
  • Emergency Medicine
  • General Surgery
  • Internal Medicine
  • Paediatric Medicine
  • Preventive Medicine
  • Psychiatry

Post HO/TY Programmes
  • Anaesthesiology
  • Diagnostic Radiology
  • Obstetrics & Gynaecology
  • Ophthalmology
  • Orthopaedic Surgery
  • Otorhinolaryngology
  • Family Medicine
  • Pathology

Surgery-in-General-Based Programmes
  • Cardiothoracic Surgery
  • Hand Surgery
  • Neurosurgery
  • Plastic Surgery
  • Urology

 

Application Process and Matching Exercise

40. Applicants can choose to apply for a maximum of two specialties. A third specialty is allowed only if this is for Clinician Scientist (CS) residency (e.g., Internal Medicine, General Surgery, and CS in Paediatrics). All applicants for clinician-scientist residencies will need to meet the requirements for the clinical track in the same specialty before being considered for the clinician scientist track.

 

41. Final-year student applicants can, if they do not wish to commence a specialty training programme, apply for the Generic Transitional Year (TY) Programme instead. This programme was designed to fulfil the educational needs of graduands who desire a well-balanced, broad-based year in multiple disciplines within the structured framework of the residency system. From 2014 onward, all graduands who are unsuccessful in obtaining entry into a residency programme will undergo the TY programme, unless they choose not to.

 

42. Applicants would need to submit their portfolio, which includes their personal statements for each of the chosen specialties, as well as their letters of reference. For local graduands, one of the referees would be the Dean of their respective medical schools. Where applicable, House Officers and Medical Officers would also need to submit their academic records, transcripts, documentation of their work experience, supervisor reports, posting grades, postgraduate examinations, awards, and other accomplishments.

 

43. Applicants would be interviewed by a national panel. Interviews are conducted through a four-station Multiple Mini Interview (MMI) process. The panel would generally comprise:

a. Representative(s) from the RAC for that specialty, who would chair the panel;

b. The PDs (or their appointed representatives) for each sponsoring institution.

 

44. Applicants for the clinician-scientist track would undergo the MMI interview as a first requirement before proceeding to the CS interview conducted separately by the SAB Research Committee.

 

45. Applications are generally evaluated based on candidates’ academic scores, Student Internship Programme performance, letters of reference, interview performance and, where applicable, work performance (for those applying in PGY1 or later). This ensures that acceptance into residency is on the basis of merit.

 

46. MOH and MOHH would shortlist candidates to be matched to a specialty based on the factors above, taking into account the number of positions available for each specialty. Applications from final year medical students are assessed and ranked separately from the House Officer/ Medical Officer group.

 

47. Shortlisted candidates would then meet with the sponsoring institutions of the specialty programmes for which they had applied, and submit their rankings of programmes in order of preference to MOHH, while PDs would also submit their rankings of candidates to MOHH. Following this, matching of candidates to programmes would be done by MOHH. For final-year student applicants, match results would be released only after their final examinations (e.g., MBBS) have been completed, while those for the rest of the applicants would be released two to three weeks after the matching exercise has closed.
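The document does not specify the mechanics MOHH uses to reconcile the two sets of rankings. Purely as an illustration of how two-sided rank-order lists can be matched, the sketch below implements applicant-proposing deferred acceptance (the Gale-Shapley procedure, a variant of which underlies the US residency Match); all applicant names, programme names, preferences and capacities are hypothetical.

```python
from collections import deque

def deferred_acceptance(applicant_prefs, programme_prefs, capacity):
    """Applicant-proposing deferred acceptance with programme capacities.

    applicant_prefs: {applicant: [programmes in order of preference]}
    programme_prefs: {programme: [applicants in order of preference]}
    capacity:        {programme: number of positions available}
    Returns a dict mapping each matched applicant to a programme.
    """
    # Lower index means more preferred; applicants a programme did not
    # rank are treated as unacceptable to it.
    rank = {p: {a: i for i, a in enumerate(order)}
            for p, order in programme_prefs.items()}
    held = {p: [] for p in programme_prefs}   # tentative acceptances
    next_choice = {a: 0 for a in applicant_prefs}
    free = deque(applicant_prefs)             # applicants still to propose

    while free:
        a = free.popleft()
        prefs = applicant_prefs[a]
        if next_choice[a] >= len(prefs):
            continue                          # list exhausted: unmatched
        p = prefs[next_choice[a]]
        next_choice[a] += 1
        if a not in rank[p]:
            free.append(a)                    # programme did not rank a
            continue
        held[p].append(a)
        held[p].sort(key=lambda x: rank[p][x])  # best-ranked first
        if len(held[p]) > capacity[p]:
            free.append(held[p].pop())        # bump the lowest-ranked
    return {a: p for p, accepted in held.items() for a in accepted}

# Hypothetical example: amy is displaced from IM by ben, then takes GS,
# displacing cho, who exhausts the list and ends up unmatched.
match = deferred_acceptance(
    {"amy": ["IM", "GS"], "ben": ["IM"], "cho": ["GS", "IM"]},
    {"IM": ["ben", "amy", "cho"], "GS": ["amy", "cho"]},
    {"IM": 1, "GS": 1},
)
print(match)  # {'ben': 'IM', 'amy': 'GS'}
```

Deferred acceptance guarantees a stable outcome: no candidate and programme that rank each other above their assigned matches are left paired elsewhere.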

 


 


 

PROGRESSION IN THE RESIDENCY PROGRAMME

 

48. A resident’s performance is monitored by the PD and core faculty. Progression in the residency programme is based upon the resident meeting the SI and ACGME-I graduate medical education standards and clinical competencies required to advance to the next level of training.

 

49. The PD, with the advice of the faculty of the programme, will decide whether to progress a resident. Each programme should have written criteria for progression based on the specialty and subspecialty requirements of the ACGME-I. The method of evaluation would consist of direct observation of the resident, as well as indirect observation through rotations, evaluations and the ITEs.

 

50. If an evaluation indicates unsatisfactory performance, the resident would be provided with a remedial plan for correcting any deficiencies. However, should the remediation be unsatisfactory, this may be cause for probation or termination from the residency programme.

 

51. The SAB and the Medical Education Coordinating Committee (MECC) approved a common entry into Residency Year 1 (R1) for all residents. However, PDs could recommend residents assessed to be suitable for accelerated progression (within a period of 3 months in R1) to the RAC. Accelerated progression would be subject to the approval of the SAB, based on a set of accelerated progression criteria. Maximum progression would be capped at R2.

 

52. The duration of residency programmes ranges from 5 to 6 years (R1 to R5 or R6). For doctors in training, the term “Senior Resident” would be used when residents progress to R4. The earliest point during training that a resident can be progressed to the position of Senior Resident in any specialty is:

a. Upon certification by the PD that the resident has completed R3 and acquired all the necessary competencies specified for an R3 resident; and

b. When the resident has fulfilled the criteria laid down by the RAC for progression to Senior Resident (Annex C).

 



 


 

EXIT CERTIFICATION

 

53. At the end of specialist training, trainees are assessed before being certified competent to exit as specialists. Currently, there are a variety of exit assessments, with some specialties having more structured assessments compared to others. The details of the various specialty exit criteria are found in Annex D.

 

54. The SAB has made recommendations for all specialty exit assessments to consist of both a written assessment and a clinical assessment. The written assessment can take the form of multiple choice questions, essays or short answer questions, while the clinical assessment can include clinical case assessments or discussions, and Objective Structured Clinical Examination (OSCE) stations (which can include an oral viva).

  • Written Assessment
    The written assessment should be done in the final or penultimate year of senior residency training.

  • Clinical / OSCE / Oral
    There should be stations for proper assessment of relevant core competencies. Station scenarios should be developed by the appointed examination committees.

  • Other Components
    For specialties with a very small number of residents/trainees, a workplace-based assessment e.g. conducting ward rounds, could be an alternative to clinical stations.

 

55. The development of structured exit assessments for each specialty is currently underway. Some specialties have also collaborated with the American Board of Medical Specialties (ABMS) to develop structured assessments for exit certification which are contextualised to our setting, psychometrically valid, reliable, and benchmarked to the US Boards. The details of the collaboration with the ABMS can be found in Annex E.

 


 




SUMMARY

 

56. The Residency system has allowed for the development of structured postgraduate medical education, with clearly defined curricula and regular formative assessments. Designated faculty were given protected time to provide specialty training to residents.

 

57. A current gap in our assessment is the reliance on external bodies to help with conducting assessments. While the help of these bodies will continue to be required in the near future, in the long run, there is a need for our clinical educators to develop our own unique assessment programmes, and for such expertise to be rooted in Singapore.

 

58. Postgraduate medical education will continue to evolve with time, and formal reviews will be undertaken at regular intervals. We must continue improving our system, building on what has worked well and eliminating what has not. This will ensure that our system remains relevant and contextualised to meet Singapore’s needs.

 


 




ANNEX A

 

COLLABORATION WITH THE ACCREDITATION COUNCIL FOR GRADUATE MEDICAL EDUCATION (ACGME)

 

1. Following the decision to model our postgraduate medical education after the Residency system, MOH invited the ACGME to provide consultancy services to Singapore for the development of the Residency programme.

 

Ground Assessment

2. A ground assessment was carried out by ACGME, which expressed concerns that, due to the heavy clinical workload and the constraints in manpower resources in our institutions, it would be difficult for Singapore’s residency programmes, especially Internal Medicine, to comply with all the common and specialty-specific programme requirements. Hence, ACGME recommended that a phased approach be adopted to meet the different sets of requirements. It was recommended that institutions and their programmes first comply with the institutional requirements beginning May 2010, i.e. each SI having a DIO, as well as PDs and Programme Coordinators for each of its sponsored specialty programmes, to monitor the training of residents. This ensured the presence of a structured framework for training.

 

Formal Contract between MOH and ACGME-I

3. A formal contract was then secured between MOH and ACGME-I, for ACGME-I to carry out the following:

a. Provision of consultancy services on the establishment of structured residency programmes for Singapore;

b. Conducting the accreditation of residency programmes in Singapore in phases which include pre-accreditation visits and site accreditations at regular intervals;

c. Conducting training workshops (including the sharing of relevant findings on educational activities in Singapore) for DIOs, PDs and Programme Coordinators involved in the administration of residency programmes in Singapore; and

d. Provision of tools for monitoring and assessment of residents that are adapted for Singapore needs.

 

4. These services to MOH would be delivered in a phased approach over a five-year period by a dedicated experienced team of ACGME senior leadership and managers. In addition, ACGME's efforts would be augmented by US expert faculty from other healthcare and educational institutions as required.

 


 


 

ANNEX B

 

FUNDING PRINCIPLES

 

Funding for Sponsoring Institutions (SIs)

1. ACGME-I charges SIs sponsor fees and programme fees, which are funded by MOH. All fees for accreditation of the SIs’ programmes by ACGME-I will be borne by MOH, while SIs will bear the additional charges for re-accreditation should they fail any accreditation.

 

Funding for Faculty

2. Funding for clinical faculty is based on the Total Education Time (TET) for approved training. Faculty funding can be claimed only for the recruitment of backfill to cover the clinical service displaced by education. This ensures that a faculty member’s remuneration from service is not affected by the reduction in patients seen due to teaching activities.

 

3. Funding for APDs, Clinician Scientist Mentors (CSMs) and Core Clinical Faculty Members (CCFMs) is based on each faculty member’s actual gross salary, while funding for DIOs, PDs, CEs and RAC Chairs is based on actual salary.

 

4. Accountability for funding is based on educational outcomes and process indicators. Educational outcomes include the successful progression of residents and their performance in intermediate and exit assessments.

 

Funding for Residents’ Salaries

5. Residents must achieve at least 4 hours of Protected Training Time (PTT) per week, averaged over each month, to qualify for full funding. Activities that count towards PTT include simulator teaching, subspecialty teaching, case-based teaching, journal watch, core lectures, tutorials, workshops, operative simulation and continuity clinics. Funding will be forfeited for a resident who does not progress to the next year.
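Because the 4-hour weekly PTT threshold is averaged within each month, a single light week does not by itself cost a resident full funding. A minimal sketch of that check (the function name and the weekly-log format are illustrative assumptions, not part of any actual MOH system):

```python
def meets_ptt_requirement(weekly_ptt_hours, threshold=4.0):
    """Return True if Protected Training Time (PTT) averages at least
    `threshold` hours per week over the month.

    weekly_ptt_hours: hypothetical list of PTT hours logged per week.
    """
    average = sum(weekly_ptt_hours) / len(weekly_ptt_hours)
    return average >= threshold

# A light week is acceptable if the monthly average still meets 4 hours:
print(meets_ptt_requirement([6, 2, 5, 4]))  # average 4.25 -> True
print(meets_ptt_requirement([3, 4, 3, 4]))  # average 3.5  -> False
```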

 

6. Co-payments for popular specialties [1] apply to residents who are PGY2 and above. An additional allowance will be given to Pathology residents from PGY2 onwards, similar to the BST/AST track. Co-payment and allowance rates for residents are subject to yearly review.

 

Audits

7. Audits will be carried out by MOH’s appointed auditors to ensure accountability, with proper tracking and utilization of resources for residency programmes. Supporting documents must be systematically filed and ready for audit checks at any time, and should account for the computation of education and training ratios, Full-Time Equivalents, working hours and the headcount recruited to backfill faculty. Auditors may also request interviews with faculty and residents on their schedules, training sessions, sufficiency of GME infrastructure, etc.

 

[1] Popular Specialties are: Otorhinolaryngology, Obstetrics and Gynaecology, Ophthalmology, Orthopaedic Surgery, Paediatric Medicine and Plastic Surgery.

 


 


 

ANNEX C

 

CRITERIA FOR PROGRESSION TO SENIOR RESIDENT

 

1. The criteria for progression to Senior Resident are as follows:

a. Certification by the PD that the resident has completed R3 and acquired the necessary competence specified for a R3 resident. The PD’s certification should be based on robust, objective and standardised criteria and should include regular multiple clinical formative assessments by core faculty members. This process could be further strengthened by inter-institution / inter-cluster workplace assessment of residents in R3; and

b. Fulfilment of the requirements laid down by the RAC for progression to Senior Resident (see table below).

 

2. Any extension in the duration of training needed to fulfil the above requirements will not receive residency funding.

 

Specialty

Intermediate Assessment

Anaesthesiology

  • Pass final M.Med (Anaesthesiology)

Cardiothoracic Surgery

  • Pass MRCS
  • Successful completion of Surgery-In-General programme
  • Satisfactory SESATS and Progress Reports

Diagnostic Radiology

  • Pass FRCR 2A

Emergency Medicine

  • Pass MCEM and/or M.Med (Emergency Medicine)

Family Medicine

  • N.A.

General Surgery

  • Pass the M.Med (Primary) / MRCS
  • Achieve a minimum rating of 5 (Competent) on the Competency Assessment

Hand Surgery

  • Pass MRCS
  • Successful completion of Surgery-In-General programme

Intensive Care Medicine

  • N.A.

Internal Medicine related specialties [2]

Nuclear Medicine

For 2010-2012 in-flight residents:

  • Pass MRCP/M.Med (Internal Medicine) and/or “ABIM(S)”
  • PD certification of successful completion of R3

 

For 2013 intake of new IM residents:

  • Pass local clinical exam and/or “ABIM(S)” exam
  • PD certification of successful completion of R3

Neonatology

  • N.A.

Neurosurgery

  • Pass MRCS
  • Successful completion of Surgery-In-General programme

Obstetrics & Gynaecology

  • Pass M.Med (O&G) or MRCOG

Ophthalmology

  • Pass M.Med (Ophthalmology)

Orthopaedic Surgery

  • Pass M.Med (Orthopaedic Surgery) Part 2

Otorhinolaryngology

  • Pass MRCS

Pathology

(Chemical Pathology, Forensic Pathology, Microbiology)

  • FRCPA (Part 1) or MRCPath (Part 1) or its equivalent
Pathology (Histopathology)

  • Pass “ABMS MCQ” (Pass FRCPA / FRCPath Part 1 for first batch of residents entering R4 in 2013)

Paediatric Medicine

  • Pass M.Med (Paediatric Medicine) and/or “ABP(S)”

Paediatric Surgery

  • N.A.

Palliative Medicine

  • N.A.

Plastic Surgery

  • Pass MRCS and M.Med (Surgery)
  • Successful completion of Surgery-In-General programme

Preventive Medicine

  • Completed and passed the MPH or equivalent

Psychiatry

  • Pass M.Med (Psychiatry) or MRCPsych (parts 1-4)

Radiation Oncology

  • FRCR / FRANZCR Part 1

Sports Medicine

  • N.A.

Urology

  • Pass MRCS
  • Successful completion of Surgery-In-General programme

 

[2] Cardiology, Endocrinology, Gastroenterology, Geriatric Medicine, Haematology, Infectious Diseases, Medical Oncology, Renal Medicine, Respiratory Medicine, Rheumatology, Dermatology, Neurology, Rehabilitation Medicine

 


 


 

ANNEX D

 

EXIT CRITERIA FROM SPECIALTY TRAINING

 

Specialty

Training Duration

Current Exit Examinations / Assessments

Anaesthesiology

TY/HO + R1-R5

  • Clinical and case management review
  • Paper critique
  • Logbook review

Cardiology

R1 – R6.5

  • MCQs
  • Data interpretation

Cardiothoracic Surgery

TY + R1-R6/R7

  • FRCS
  • Clinical assessment
  • Oral assessment

Dermatology

R1-R6.5

  • Dermatopathology Slide Assessment
  • Paper critique
  • Oral assessment

Diagnostic Radiology

TY + R1 - R5

  • Exit interview
  • Publication of first authorship paper
  • Attendance at a minimum of 75% of CME lectures

Emergency Medicine

R1 - R5

  • Paper critique
  • SAQs (short-answer questions)
  • Oral assessment

Endocrinology

R1-R6

  • Written assessment: case write-ups
  • Oral assessment

Family Medicine

TY/HO + R1-R3

  • Written assessment
  • Clinical assessment
  • Oral assessment

Gastroenterology

R1 - R6

  • MCQs

General Surgery

R1 - R5

  • Conjoint exam (FRCS Ed)
    • Written assessment
    • Clinical assessment
    • Oral assessment

Geriatric Medicine

R1 - R6

  • Clinical case assessment
  • Oral assessment

Haematology

R1-R6

  • Written assessment: essay and SAQs
  • Oral assessment

Hand Surgery

TY+R1-R6

  • MCQs
  • Clinical assessment
  • Oral assessment

Infectious Diseases

R1-R6

  • Clinical assessment
  • Clinical case discussions

Intensive Care Medicine

2 years

  • MCQs
  • Clinical assessment

Internal Medicine

R1-R5

  • Paper critique
  • Oral assessment

Medical Oncology

R1-R6

  • MCQs
  • Oral assessment

Neonatology

2 years

  • Write-up on neonatal article
  • Oral assessment

Neurology

R1-R6

  • Oral assessment

Neurosurgery

TY+R1-R6

  • FRCS(Surgical Neurology)(Ed)

Nuclear Medicine

R1-R5

  • MCQs
  • Clinical case discussions
  • Oral assessment

Obstetrics & Gynaecology

TY/HO+R1-R6

  • OSCE stations
  • Panel interview

Ophthalmology

TY+R1-R5

  • Oral assessment

Orthopaedic Surgery

TY+ R1-R6

  • Conjoint exam (FRCS (Orth))
    • Written assessment
    • Clinical assessment
    • Oral assessment

Otorhinolaryngology

TY+R1-R5

  • Oral assessment

Pathology

(Chemical Pathology, Forensic Pathology, Microbiology)

3 years BST+ 2 years AST or

5 years seamless

  • RCPA Part 1 & 2 or
  • FRCPath Part 1 & 2

Pathology (Histopathology)

TY+R1-R5

  • RCPA Part 1 & 2 or
  • FRCPath Part 1 & 2

Paediatric Medicine

R1-R6

  • Paper critique
  • Oral assessment

Paediatric Surgery

3 years

  • Clinical assessment: clinical short cases
  • Oral assessment

Palliative Medicine

3 years (direct track)

2 years (specialist track)

  • Clinical case assessment and discussion
  • Oral assessment

Plastic Surgery

TY+R1-R6

  • MCQs
  • OSCE stations
  • Oral assessment

Preventive Medicine

R1-R5

  • Discussion of projects / assignments undertaken during training
  • Paper Critique
  • Oral assessment

Psychiatry   

R1-R5

  • Written assessment: thesis
  • Oral assessment
    • Paper critique
    • Case discussion
    • Topical discussion

Radiation Oncology

5 years seamless

  • FRCR / FRANZCR Part 2

Rehabilitation Medicine

R1-R6

  • OSCE stations
  • Oral assessment

Renal Medicine

R1-R6

  • OSCE stations

Respiratory Medicine

R1-R6

  • MCQs
  • Oral assessment

Rheumatology

R1-R6

  • Written assessment, including case write up
  • Clinical assessment
  • Oral assessment

Sports Medicine

3 years AST

  • Paper critique
  • MCQs
  • OSCE stations

Urology

TY+R1-R6

  • MCQs
  • OSCE stations
  • Oral assessment
  • Exit interview: Research work and logbook review

 


 


 

ANNEX E
 

 

COLLABORATION WITH THE AMERICAN BOARD OF MEDICAL SPECIALTIES (ABMS)

 

Following the roll-out of Residency, MOH organised a postgraduate medical education retreat to assess the progress of transformation. The retreat recommended collaborating with an internationally-recognised body to strengthen Singapore’s system of assessments and to grow local expertise in examination development. The ABMS was identified as a potential collaborator which met these requirements as it represented an organisation of medical specialty boards with shared goals and standards, and had strong collaboration with ACGME.

 

Planning Grant Agreement

2. MOH subsequently entered into a biphasic planning grant agreement (PGA) with ABMS in September 2010 to study the feasibility and benefits of a long-term collaboration between MOH and ABMS to develop structured joint certification examinations in Singapore. The MOH Steering Committee for Joint Exams with ABMS (“MOH Steering Committee”) was appointed to guide the negotiations and implementation.

 

3. Following ABMS’ first progress report in April 2011, the SAB and the MOH Steering Committee were supportive of the collaboration, with ABMS serving as the umbrella body providing services to MOH on behalf of its Member Boards. Approval was then granted in May 2011 for MOH to proceed with the second half of the biphasic PGA.

 

Accelerated Examination Preparations for 2013 Examinations

4. The ABMS coordinated various meetings between its Member Boards and their clinical counterparts in Singapore. In view of the two-year lead time needed to develop and implement examinations in Singapore, MOH entered into an Advanced Board contract for the internal medicine, paediatrics and pathology specialties concurrently with the PGA, so that examinations could be rolled out for the first eligible batch of residents from the three specialties in 2013. Contractual agreements for a 10-year collaboration covering 11 specialties were subsequently secured in 2013.

 

Examination Development

5. The collaboration with ABMS allowed a Singapore examination to be developed to suit local needs, rather than importing the US Board examinations wholesale. Both parties had also established that the Singapore certificate would not rival the US certificate for licensing and employment purposes. However, the Singapore certificate would feature the ABMS brand, together with an appropriate Singapore authority, in recognition of the examination standards and the rigour of the psychometric process.

 

6. The Singapore examination, like the Board examinations, would be designed to complement the formative training of residency, and timed to provide a summative assessment of the standards and competencies achieved by residents after completing a period of training.

 

7. The examination developed for Singapore was validated through a structured process of examination blueprint development and subjected to a rigorous process of standard setting. The performance of local examinees on US items [3] was also benchmarked against that of US examinees, allowing comparisons of standards between the US and Singapore.

 

Knowledge Transfer

8. The collaboration saw ABMS and local examiners jointly involved in examination development, enabling knowledge transfer in core examination development skills such as blueprint development, item writing and standard setting, as well as related areas such as item banking and security protocols. This allowed the consolidation of a critical mass of trained examiners and educationalists, who would be expected to sustain the drive towards excellence in medical education beyond the term of the collaboration.

 

9. In addition, the contract with ABMS was designed so that Singapore would not be overly dependent on ABMS. Singapore was conferred a perpetual and irrevocable license to use the intellectual property (IP) jointly developed by ABMS and local examiners, and would continue to have access to the items beyond the 10-year collaboration. Similarly, the examination infrastructure developed (e.g. the item bank) would remain available for Singapore’s use in the long term.

 

[3] At least 50% of the Singapore examination items would be common to the US counterpart, and be used for comparisons and benchmarking.

 


 


 

Last Updated :  05 Feb 2015 11:01