Foundations of the NBCOT Certification Examinations
Learn more about how the exams are developed, constructed, and scored.
Exam Development & Construction
The procedures used to prepare the NBCOT certification examinations are consistent with the technical guidelines recommended by the American Educational Research Association, the American Psychological Association, the National Council on Measurement in Education, the Institute for Credentialing Excellence, and the Association of Test Publishers. NBCOT test development and administration procedures also adhere to relevant sections of the Uniform Guidelines on Employee Selection Procedures (EEOC, 1978). NBCOT maintains dual accreditation with the National Commission for Certifying Agencies and the American National Standards Institute, which further ensures NBCOT meets specific assessment standards.
OTR and COTA certification examinations are constructed using a combination of scored (pre-equated) items and non-scored (field-test) items. Items for each examination are selected from the respective item banks in proportions consistent with the OTR and COTA exam content outlines. Items selected as scored items for each examination must be field-tested on a sufficiently large sample of candidates and have acceptable item-level psychometric statistics. The newly constructed exams undergo a rigorous review and validation process with a committee of subject matter experts to ensure the examinations contain content reflective of current entry-level occupational therapy practice.
Professional testing standards and regulations require that certification exam content is based on a study of the profession called a practice analysis. We conduct practice analyses every five years to identify the essential tasks performed by entry-level occupational therapy professionals and the knowledge required for proficient performance of those tasks. The results of the practice analyses are used to develop examination content outlines that guide content development for the OTR and COTA exams. Basing the exam content on a practice analysis ensures that the knowledge tested on the exams is linked directly to practice, thereby ensuring exam validity.
In the first phase of the practice analysis process, we get input on current entry-level practice from a panel of OTRs and COTAs who represent the demographic and practice characteristics of our certificant population. The panelists identify the essential tasks that define current entry-level practice and the knowledge required to perform those tasks successfully. In the second phase of the study, we survey thousands of entry-level OTR and COTA certificants to gather feedback on the tasks and knowledge that the panelists identified. The survey respondents rate the importance of the identified tasks to entry-level practice and the frequency with which they perform the tasks. We also ask them to evaluate the knowledge required for proficient entry-level performance of the tasks. Finally, respondents can optionally identify critical elements of occupational therapy practice that are not captured within the survey. More detail about the practice analysis process is provided in the summary reports.
The most recent practice analyses for each credential were conducted in 2022.
NBCOT annually recruits OTR and COTA subject matter experts (SMEs) to develop new items for the certification examinations in accordance with the accreditation standards relating to the use of qualified personnel for examination development. Specifically, NCCA standards (2014) indicate that SMEs must represent the appropriate demographics of the population to be certified and provide insight and guidance into examination processes.
SMEs are OTR and COTA certificants who represent the profession in terms of practice experiences, geographic regions, gender, and ethnicity. After questions or “items” for NBCOT examinations are developed, the items then undergo a rigorous review process by an additional committee of SMEs. This review is designed to: validate that the knowledge and tasks measured are compatible with the domain-level content specifications; assess the relative importance and frequency of each item to occupational therapy practice; and confirm that each item meets generally accepted fairness guidelines.
NBCOT takes into account the fairness of its examinations during item development. NBCOT adheres to recognized item writing, test development, and review procedures to ensure readability, neutral language, and the universal accuracy of terms used in its items. Additional fairness criteria include, but are not limited to:
- Editing items for issues of bias and stereotyping;
- Coding items to the approved examination content outline;
- Referencing items to approved and published resources in occupational therapy;
- Selecting subject matter experts who are OTR and COTA practitioners and educators from diverse geographical areas, practice experiences, and cultures;
- Field-testing items prior to their use as scored items on the exam.
During the examination process, fairness is addressed through standardized procedures regarding the registration process, accessibility issues, roles of proctors, and security of test materials and equipment. Fairness is addressed after the examination by consideration of confidentiality, accuracy of scoring, and timeliness of reporting the results.
NBCOT undertakes a series of quality control measures to maintain the fairness, integrity, reliability, and validity of each version of the examination. Content validity is preserved by constructing exams using the same content outline across all versions of the OTR or COTA examination. Additionally, a committee of subject matter experts validates each item on an exam using specific frequency and importance criteria. Finally, the passing score on the certification examinations is determined through a rigorous statistical process that is widely used in the professional testing industry. This method, called the Modified Angoff method (Angoff, 1971), is a way of determining the performance standard required for safe and competent occupational therapy practice, and then determining the number of examination questions candidates must answer correctly to demonstrate that they meet that performance standard. Consistent with all criterion-referenced examinations, once the passing standard is set, it cannot be changed. Future versions of the examination are then statistically equated to this standard to ensure that the passing standard remains constant over time, regardless of which version of the exam a candidate takes.
Exam Format
The COTA examination consists of single-response multiple-choice items and six-option multi-select items.
The single-response multiple-choice items contain a stem and three or four possible response options. Of the response options presented, there is only one correct or best answer. Candidates receive credit for selecting the correct response option. Points are not deducted for selecting incorrect response options.
The six-option multi-select items include a question stem followed by six possible response options. Of the options provided, three are correct responses and the other three are incorrect responses. The candidate must select three response options. Candidates receive credit for selecting the correct response options. Points are not deducted for selecting incorrect response options.
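The scoring rule described above can be sketched in a few lines. This is a hypothetical illustration based only on the rules stated here (credit for correct selections, no deduction for incorrect ones); whether partial credit is awarded per correct selection is an assumption of this sketch, not something the page specifies.

```python
# Hypothetical scoring for a six-option multi-select item. Assumption:
# one point per correctly selected key; incorrect selections cost nothing.

def score_multiselect(selected: set[str], keys: set[str]) -> int:
    """Return points earned: +1 per selected option that is a key."""
    return len(selected & keys)

# Item with keys A, C, E; candidate selects A, C, F.
print(score_multiselect({"A", "C", "F"}, {"A", "C", "E"}))  # 2: A and C earn credit, F costs nothing
```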
Examples of multi-select items are available. These sample items do not replicate the candidate experience on exam day. Please view the exam tutorial for more information.
Candidates are allotted four hours to complete the examination. The COTA examinations consist of 200 multiple-choice items based on the exam content outline. Multiple-choice items include the single-response items, in which the candidate selects the single BEST option, as well as multi-select items, in which the candidate must select the three BEST options out of six options. Multiple-choice and multi-select items are presented one at a time to the candidates. Some of the items include a picture or chart that contains information needed to answer the question. During the examination, candidates can highlight text in the stem of an item that they deem important. A strikeout feature is also available to help candidates visually eliminate response options. Candidates can flag items for review and change their item responses as testing time allows, or until they submit the examination for scoring. If time runs out before a candidate reviews the flagged items, the selected response(s) will be submitted for scoring. No credit will be given for flagged items that have no response option(s) selected. Candidates also have the ability to modify the color scheme by changing the background and text colors of the exam at any time.
At the start of the examinations, candidates have the option of taking a tutorial about the functionality of the test screens. Time spent on the examination tutorials is not deducted from the four-hour test clock. Details on the features of the computer, as well as additional functionality of the exam, can be viewed by accessing the online tutorial.
Candidates can access the exam tutorial. Please note that candidates can revisit the tutorial at any time during the exam; however, the exam timer will continue to run.
The OTR exam comprises three clinical simulation test (CST) items as well as single-response, multiple-choice items. Each CST item consists of three main components:
- Opening scene – This contains general background information about a practice-based situation that sets the scene for the CST item.
- Section headers – Following the opening scene, each CST item has four parts that each begin with a section header. These section headers contain information specific to the OT process addressed in the section. The candidate is asked a specific question based on this information.
- Response options and feedback – This includes a list of potential options the OTR may consider in response to the question posed in the section header. The list of options in the CST item consists of positive and negative options. Candidates must select either Yes or No for each option before proceeding to the next part of the CST item. Selecting Yes will cause a feedback box to appear to the right of the option. The feedback provides additional information related to the outcome selected but does not give information on whether or not the candidate’s response is correct. Feedback is not provided when No is selected.
Candidates receive credit for selecting the correct response options. Points are not deducted for selecting incorrect response options.
Candidates can navigate to previous screens within the CST item to review the information there, the selections made, and the feedback received in response to the selections; however, candidates cannot make changes to the responses. Candidates must complete the CST items in the order they are presented.
Candidates can access the exam tutorial. Please note that there is one exam tutorial presented before the exam that describes the CST and MC portions of the exam. The time allotted for the tutorial prior to beginning the exam is separate from the exam timer. Candidates can revisit the tutorial at any time during the exam; however, the exam timer will continue to run.
Candidates are allotted four hours to complete the OTR exam. The exam comprises three clinical simulation test (CST) items and 170 single-response, multiple-choice items. The CST items are presented one at a time. Candidates are required to select Yes or No for each response option presented. Once a response is selected, it cannot be changed. Candidates can click “Next” to proceed to the next part of the item after making selections for all response options in the present part. Candidates can navigate to previous screens within the CST item, but selections for the response options cannot be changed. Candidates must complete the CST portion of the OTR exam before beginning the multiple-choice section. NBCOT offers access to sample CST problems so candidates can orient themselves to the Yes/No structure of the items and the feedback boxes. These sample items do not replicate the candidate experience on exam day. Please review the exam tutorial for more information.
Candidates cannot access the CST portion of the exam once it is complete. In the multiple-choice portion, candidates can flag items for review and change their choices, as testing time allows or until they submit the exam for scoring. If time runs out before the candidate reviews the flagged items, the selected response will be submitted for scoring. No credit will be given to a flagged item that has no response option selected.
Throughout the entire exam, candidates can highlight text they deem important; this can be done in the CST opening scenes and section headers and in the multiple-choice stems. A strike-out feature is also available in the multiple-choice portion of the exam to help candidates visually eliminate possible response options. Candidates also have the ability to modify the color scheme by changing the background and text colors of the exam at any time.
Details on all the features, as well as additional functionality of each section of the exam, can be viewed by accessing the online tutorial.
The OTR and COTA examinations are computer-delivered at testing centers located throughout the United States and internationally. The candidate can schedule to take the examination any day of the week during the business hours of the testing center the candidate selects. Scheduling instructions are provided in the candidate’s Authorization to Test Letter.
NBCOT provides reasonable and appropriate accommodations for qualified individuals with a disability who submit appropriate documentation. Additional information can be found on the Testing Accommodations page.
Scoring
NBCOT certification examinations are criterion-referenced, meaning a candidate must obtain a score that is equal to or greater than the minimum passing score in order to pass the examination. Overall performance is reported on a standardized scale ranging from 300 to 600. A total scaled score of at least 450 is required to pass the OTR or COTA certification examination. It is important to note that the passing standard is based on candidate performance across the entire exam. Pass-fail decisions are based only on the total number of exam questions answered correctly, and there is no domain-level passing standard. Likewise, for the OTR examination, the passing standard is based on the responses provided across the entire exam. Separate scores are not calculated for the CST and multiple-choice sections.
A scaled score is a mathematical conversion of the number of items that a candidate correctly answered (raw score) that is transformed so that a consistent scale is used across all forms of the test. This transformation is similar to converting a weight from pounds to kilograms or a temperature from Celsius to Fahrenheit. The weight or temperature has not changed, only the format used to report the units.
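The conversion can be illustrated with a simple sketch. NBCOT's actual transformation is derived from IRT equating and is not published, so the piecewise-linear function and all numbers below (form-specific passing raw scores, total item counts) are invented purely to show how different raw cut scores on forms of different difficulty can map to the same scaled passing score of 450.

```python
def to_scaled(raw: int, passing_raw: int, max_raw: int,
              lo: int = 300, hi: int = 600, passing_scaled: int = 450) -> float:
    """Map a raw score onto the 300-600 reporting scale so that the
    form-specific passing raw score always lands on 450.

    Hypothetical piecewise-linear transformation: below the cut, interpolate
    between (0, lo) and (passing_raw, passing_scaled); above it, between
    (passing_raw, passing_scaled) and (max_raw, hi).
    """
    if raw <= passing_raw:
        return lo + (passing_scaled - lo) * raw / passing_raw
    return passing_scaled + (hi - passing_scaled) * (raw - passing_raw) / (max_raw - passing_raw)

# Two forms of different difficulty: the harder form has a lower raw cut,
# yet both cuts map to the same scaled passing score.
print(to_scaled(130, 130, 185))  # 450.0 on the harder form
print(to_scaled(138, 138, 185))  # 450.0 on the easier form
```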
Reporting scaled scores is standard practice on certification examinations and other standardized tests. The use of scaled scores allows for direct comparisons of scores across multiple test forms. Scaled scoring applies the same passing standard to all test forms, ensuring that the passing standard remains constant over time regardless of which version of the exam a candidate takes.
Scaled scores provide consistent and comparable scoring across exam forms. Percent-correct and number-correct scores are simply other ways of reporting raw scores and therefore do not resolve the issue of comparability of scores across different versions of the exam. Although each version of the examination tests the same domains of occupational therapy practice, each form contains a different set of test items, which means that questions on one test form may differ in difficulty from the questions appearing on another test form. Simply using a raw score does not account for this difference.
Norm-referenced scoring is used to indicate performance differences among test takers, whereas criterion-referenced scoring uses a pre-defined minimum standard, or criterion, that all candidates must achieve as an indicator of whether the candidate has acquired specific knowledge as defined by a valid content outline. Best testing practice uses scaled scoring to allow direct comparison of scores across multiple versions of the examination, not comparison to other candidates' performance.
The NBCOT certification examinations are criterion-referenced, meaning a candidate’s performance is compared to a pre-determined minimum standard. In keeping with accreditation standards, NBCOT completes standard setting studies to establish the passing standard for the OTR and COTA examinations. The methodology used for the most recent standard setting studies, the Modified Angoff methodology (Angoff, 1971), requires panels of subject matter experts to identify the minimum level of competency required to pass the examinations (Cizek, 2012).
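The core computation behind an Angoff-style standard setting can be sketched briefly: each judge estimates, for each item, the probability that a minimally competent candidate would answer it correctly; averaging across judges and summing across items yields the panel's recommended passing raw score. This is a generic illustration of the method, not NBCOT's actual procedure or data; the ratings below are invented.

```python
# Sketch of the core Modified Angoff computation. ratings[j][i] is judge
# j's probability estimate that a minimally competent candidate answers
# item i correctly. All values here are invented for illustration.

def angoff_cut_score(ratings: list[list[float]]) -> float:
    """Average each item's ratings across judges, then sum across items
    to get the expected raw score of a borderline candidate."""
    n_items = len(ratings[0])
    item_means = [sum(judge[i] for judge in ratings) / len(ratings)
                  for i in range(n_items)]
    return sum(item_means)

judges = [
    [0.70, 0.60, 0.90, 0.50],  # judge 1's estimates for four items
    [0.80, 0.50, 0.85, 0.55],  # judge 2
    [0.75, 0.55, 0.95, 0.60],  # judge 3
]
print(round(angoff_cut_score(judges), 2))  # 2.75 out of 4 items
```

In practice the recommended cut is rounded to a whole number of items, and panels may revise their estimates after discussion or after reviewing candidate performance data, which is what makes the method "modified."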
Candidates who pass the examination receive a performance feedback report that includes a congratulatory letter and the candidate’s earned exam score.
NBCOT also provides performance feedback to those candidates who do not achieve the passing standard on the examination. This performance feedback report includes the candidate’s score, as well as the average score of new graduates who recently passed the examination. The report also includes a domain-level performance chart to help a candidate identify areas of relative strength and weakness and additional information regarding how individual domain scores can be used. Finally, a list of frequently asked questions is presented to address queries regarding the determination of the passing score, use of scaled scores, candidate performance comparisons, score reporting, exam preparation, and exam content.
Sample OTR performance feedback report for candidate with failing score
Sample COTA performance feedback report for candidate with failing score
Calibration of the test items using Item Response Theory statistics is only one of the quality control procedures NBCOT uses as part of its score verification procedures. Additional quality control measures regarding test administration processes and procedures take place after administration of an examination and prior to sending an official feedback report in order to ensure candidates receive accurate information about their final scores.
Test Metrics
Each NBCOT exam includes a pre-selected number of field-test items. Although these items are not considered when determining candidates' scores, performance data is collected and analyzed for each field-test item. The statistical analysis of the field-test items is an important quality control measure NBCOT uses to preserve the reliability and validity of the examinations. Field-test items are presented randomly throughout the exams. Candidates are not able to distinguish between the scored and field-test (non-scored) items.
Once a sufficient number of responses are collected on a field-test item, the item statistics are reviewed based on pre-determined psychometric measures. Field-test items meeting these metrics are entered into the bank of items that can be used as scored items on subsequent exams. Item-level statistics falling below these metrics for field-test items are flagged for additional review and revision before undergoing further levels of field-testing.
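Screening of this kind is commonly done with standard item-analysis statistics such as the proportion of candidates answering correctly (item difficulty) and the item-total point-biserial correlation (discrimination). The sketch below uses those generic measures with invented threshold values; NBCOT's actual pre-determined psychometric criteria are not published here.

```python
# Illustrative screening of a field-test item. The statistics are standard
# item-analysis measures; the threshold values are invented, not NBCOT's
# actual criteria.
from statistics import mean, pstdev

def point_biserial(item_scores: list[int], total_scores: list[float]) -> float:
    """Correlation between a dichotomous item (0/1) and the total score."""
    p = mean(item_scores)
    sd = pstdev(total_scores)
    m1 = mean(t for s, t in zip(item_scores, total_scores) if s == 1)
    m0 = mean(t for s, t in zip(item_scores, total_scores) if s == 0)
    return (m1 - m0) / sd * (p * (1 - p)) ** 0.5

def passes_screen(item_scores: list[int], total_scores: list[float],
                  p_range=(0.30, 0.90), min_rpb=0.15) -> bool:
    """Flag the item unless difficulty and discrimination meet thresholds."""
    p = mean(item_scores)
    rpb = point_biserial(item_scores, total_scores)
    return p_range[0] <= p <= p_range[1] and rpb >= min_rpb
```

An item answered correctly mostly by high-scoring candidates would pass this screen; one answered correctly mostly by low-scoring candidates would have a negative point-biserial and be flagged for review.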
NBCOT uses Item Response Theory (IRT) methodology to analyze and calibrate examination items and to pre-equate test forms. IRT statistics provide test developers with valuable information on the psychometric properties of each item in the item bank. Access to this information during test construction facilitates the selection of appropriate items for the new test forms. NBCOT psychometricians routinely evaluate the statistical performance of examination items against pre-determined metrics. This process contributes to the validity and reliability of the certification exams.
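At the heart of IRT calibration is an item response function giving the probability that a candidate of a given ability answers an item correctly. The sketch below shows the three-parameter logistic (3PL) model, a common choice for multiple-choice exams; the source does not state which IRT model NBCOT uses, and the parameter values are invented.

```python
# Minimal sketch of a 3PL item response function, a common IRT model for
# multiple-choice items. Model choice and parameter values are assumptions.
import math

def p_correct(theta: float, a: float, b: float, c: float) -> float:
    """Probability that a candidate of ability theta answers correctly.

    a: discrimination, b: difficulty, c: pseudo-guessing lower asymptote.
    """
    return c + (1 - c) / (1 + math.exp(-a * (theta - b)))

# For an item of average difficulty (b=0), a candidate of average ability
# (theta=0) answers correctly 60% of the time when c=0.2.
print(round(p_correct(0.0, 1.2, 0.0, 0.2), 2))  # 0.6
```

Because every banked item carries calibrated parameters like these, test developers can assemble a new form whose statistical properties match previous forms before anyone takes it, which is what makes pre-equating possible.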
Using IRT methodology allows NBCOT to construct pre-equated test forms. This helps reduce the waiting period between the administration of the exam and the release of feedback performance reports to candidates.
References
Americans with Disabilities Act of 1990, Pub. L. No. 101-336, 104 Stat. 328 (1990).
Angoff, W. H. (1971). Scales, norms, and equivalent scores. In R. L. Thorndike (Ed.), Educational measurement (2nd ed.). Washington DC: American Council of Education.
Cizek, G. (2012). Setting performance standards: Foundations, methods, and innovations (2nd ed.). New York, NY: Routledge.
Equal Employment Opportunity Commission [EEOC], Office of Personnel Management, U.S. Department of Justice, & U.S. Department of Labor. (1978). Uniform guidelines on employee selection procedures. 41 CFR Part 603.
International Organization for Standardization [ISO]. (2012). ISO/IEC 17024, Conformity assessment – General requirements for bodies operating certification of persons (2nd ed.).
National Board for Certification in Occupational Therapy. (2023a). Practice Analysis. https://www.nbcot.org/exam-info/practice-analysis
National Board for Certification in Occupational Therapy. (2023b). 2022 COTA Examination Content Outline. https://www.nbcot.org/exam-info/exam-outline
National Board for Certification in Occupational Therapy. (2023c). 2022 OTR Examination Content Outline. https://www.nbcot.org/exam-info/exam-outline
National Commission for Certifying Agencies. (2021). The NCCA's standards for the accreditation of certification programs. Institute for Credentialing Excellence.