In 2014, a team was formed from the regulated jurisdictions to review, update, and revise the 2012 PCs/PIs. The team completed its work and presented recommendations to FOMTRAC in June 2016. The four FOMTRAC member Colleges independently reviewed and adopted the revised PCs/PIs, and updates to the document were completed by the end of September 2016.
The outcome is the 2016 Inter-jurisdictional Practice Competencies and Performance Indicators for Massage Therapists at Entry-to-Practice.
As of May 2018, all CMTNL examinations will be based on the new 2016 PCs/PIs.
The CMTNL Examination Process
Questions and tasks for the registration examination are developed by a group of CMTNL subject matter experts (SMEs) who are massage therapists. Questions and tasks undergo three levels of editing, and all items must be reviewed and approved by the College of Massage Therapists of Newfoundland and Labrador (CMTNL) before they are included in CMTNL examinations.
The CMTNL participates in the formulation of examinations. Using the modified Angoff method, a selected group of massage therapists establishes a fair passing score, which determines the number of correct answers a candidate must achieve in order to pass the examination. This process ensures that, if all candidates tested demonstrate competency, all will pass.
Item analysis is performed for both the multiple-choice examination and the OSCE to confirm that all items are performing statistically within the parameters set for the examinations.
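To make the idea of item analysis concrete, the sketch below computes two classic item statistics in Python: difficulty (the proportion of candidates answering an item correctly) and point-biserial discrimination (how well an item separates stronger from weaker candidates). The response data and the choice of statistics are illustrative assumptions only; the actual criteria applied to CMTNL examinations are set by its psychometric procedures.

# Illustrative item analysis on hypothetical data (not CMTNL data).
# responses[c][i] = 1 if candidate c answered item i correctly, else 0.
responses = [
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [1, 1, 1, 1],
    [0, 1, 0, 1],
    [1, 1, 1, 0],
]

n_candidates = len(responses)
n_items = len(responses[0])
totals = [sum(row) for row in responses]  # each candidate's total score

def mean(xs):
    return sum(xs) / len(xs)

for i in range(n_items):
    item = [row[i] for row in responses]
    p = mean(item)  # difficulty: proportion of candidates answering correctly
    # Point-biserial discrimination: correlation between the item score and
    # the candidate's total score with that item removed.
    rest = [totals[c] - item[c] for c in range(n_candidates)]
    mi, mr = mean(item), mean(rest)
    cov = mean([(item[c] - mi) * (rest[c] - mr) for c in range(n_candidates)])
    sd_i = mean([(x - mi) ** 2 for x in item]) ** 0.5
    sd_r = mean([(x - mr) ** 2 for x in rest]) ** 0.5
    r_pb = cov / (sd_i * sd_r) if sd_i and sd_r else 0.0
    print(f"Item {i + 1}: difficulty p = {p:.2f}, discrimination r_pb = {r_pb:.2f}")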
Rigorous quality assurance procedures are followed to ensure the accuracy of reported test scores. Psychometric review, content expert review, and these quality assurance procedures together ensure that reported scores are accurate and defensible.
Development and Validation of the CMTNL Examination
In 2002, the College of Massage Therapists of Newfoundland and Labrador (CMTNL) undertook a detailed review and comparison of the job analysis study conducted in Ontario by the consulting firm of psychometricians, Schroeder Measurement Technologies (SMT), and the College of Massage Therapists of Ontario (CMTO). The CMTNL determined that the job analysis accurately described the practice of massage therapy in the province of Newfoundland and Labrador. As a result, the CMTNL adopted the Massage Therapy Core Competencies and the examination of the College of Massage Therapists of Ontario. The Massage Therapy Competency Standards document is based on the job analysis, and the data collected from the job analysis ensure that the test content is relevant to the competent practice of the profession.
The CMTNL examinations are now developed solely by the CMTNL. They are based on the Act Respecting the Practice of Massage Therapy and its Regulations, and on the CMTNL Code of Ethics and Standards of Practice, and are subsequently reviewed by the CMTNL.
The CMTNL registration examination consists of two components: a Multiple Choice Questionnaire (MCQ) and an Objective Structured Clinical Examination (OSCE). Candidates must pass both components in order to apply for registration in Newfoundland and Labrador.
About scoring and psychometric principles: recognizing competency
Standards-based testing offers all competent candidates the opportunity to demonstrate their abilities. Old-style “bell curve” scoring models set passing rates arbitrarily: a fixed percentage of candidates would pass or fail the exam regardless of how they performed on the test. Passing was a function of where candidates fell along the score distribution continuum, an unsupportable scoring model. For our purposes, basing the minimum passing score on a measurable level of performance is accomplished through the use of a Modified Angoff scoring model. A defensible score value is established for each test question based upon Subject Matter Experts’ responses to the following query: What percentage of 100 minimally competent, entry-level practitioners would be able to answer this question correctly?
By collecting Angoff data for each question and task, a defensible cut score can be established for each assembled examination. When a test is developed using these psychometric principles and a legally defensible scoring model is used, all competent candidates will have the opportunity to pass and practice, and the examination will do its job of restricting access to the profession for candidates who fail to demonstrate competency.
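As a rough illustration of how a cut score could be assembled from Angoff data, the following Python sketch averages judge estimates for each item and sums the results across the examination. The judges' ratings and item names are invented for demonstration and do not reflect actual CMTNL data or procedures.

# Modified Angoff sketch (hypothetical ratings, not actual CMTNL data).
# Each judge estimates, for each item, what percentage of 100 minimally
# competent entry-level practitioners would answer it correctly.
angoff_ratings = {
    "item_1": [70, 65, 75],  # three judges' estimates, in percent
    "item_2": [55, 60, 50],
    "item_3": [85, 80, 90],
    "item_4": [60, 65, 70],
}

# Expected score on each item = mean of the judges' estimates, as a proportion.
item_expectations = [sum(r) / len(r) / 100 for r in angoff_ratings.values()]

# Raw cut score = sum of item expectations: the number of correct answers a
# minimally competent candidate would be expected to achieve on this form.
raw_cut_score = sum(item_expectations)

print(f"Raw cut score: {raw_cut_score:.2f} of {len(angoff_ratings)} items")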
About scaled scores: ensuring fair comparisons
Candidates’ raw scores are converted to scaled scores in order to ensure comparability and fairness: scaled scores allow candidates’ performance on one form of the examination to be compared with candidates’ performance on another form of the same examination. A common scale makes it possible to compare the raw scores of candidates who took more difficult or easier forms of the examination. The pass (cut) score for an examination is transformed to a scaled score of 70, and all raw scores are converted to the corresponding scaled scores. This process is called scaling (pass mark = scaled score of 70).
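One common way to carry out such a conversion is a linear transformation anchored so that the form’s raw cut score maps to the scaled pass mark of 70. The sketch below uses invented raw cut and maximum scores for illustration; the actual anchor points and form-specific cut scores are determined through the examination program’s standard-setting and equating procedures, not by these numbers.

# Hypothetical linear scaling: the raw cut score maps to a scaled score of 70
# and the maximum raw score maps to 100. All values below are invented.
RAW_CUT = 112      # raw cut score for this form (e.g., from the Angoff study)
RAW_MAX = 150      # number of scored items on this form
SCALED_CUT = 70
SCALED_MAX = 100

def to_scaled(raw_score: float) -> float:
    slope = (SCALED_MAX - SCALED_CUT) / (RAW_MAX - RAW_CUT)
    return SCALED_CUT + slope * (raw_score - RAW_CUT)

print(to_scaled(112))  # 70.0 -> exactly at the pass mark
print(to_scaled(102))  # about 62.1 -> fail, even though 102/150 is 68%
print(to_scaled(150))  # 100.0 -> a perfect raw score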
Candidates cannot use their reported scaled score to determine the number or percentage of items for which credit was obtained. That is, a scaled score of 68 is not equivalent to 68%.