Are “qualified” personnel merely good at recalling the high points from training? Or are you certain they truly understand how to apply their skills and knowledge on the job?
Camber’s Assessment Team employs proven industry standards and innovative techniques to support high-stakes testing and skills certification. We provide evidence of employees’ capability, pinpoint areas for improvement at the individual and organizational levels, and make your training budget stretch further.
A High-Quality Process Yields High-Quality Results
Using assessment development and program management industry standards, our psychometricians, industrial/organizational psychologists, evaluation specialists, and instructional systems designers collaborate with organizational subject matter experts to develop test items supporting job-critical knowledge, skills, and abilities. Team capabilities include:
- Job/task analysis (JTA)
- Front end analysis (FEA)
- Situational analysis
- Test specification design
- Development and maintenance of assessment items
- Non-adaptive and adaptive test design
  - Classical test theory and item response theory
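Item response theory, listed above, models the probability of a correct response as a function of examinee ability and item parameters. As an illustrative sketch only (the parameter values here are hypothetical, not drawn from any Camber program), the widely used three-parameter logistic (3PL) model can be written as:

```python
import math

def irt_3pl(theta, a, b, c):
    """Probability that an examinee of ability theta answers an item
    correctly under the three-parameter logistic (3PL) IRT model.

    a: discrimination (slope), b: difficulty, c: pseudo-guessing floor.
    """
    return c + (1.0 - c) / (1.0 + math.exp(-a * (theta - b)))

# An average-ability examinee (theta = 0) on an item of matching
# difficulty (b = 0) lands halfway between the guessing floor and 1.
p = irt_3pl(theta=0.0, a=1.2, b=0.0, c=0.2)  # 0.2 + 0.8 * 0.5 = 0.6
```

Classical test theory, by contrast, characterizes items with sample-dependent statistics such as proportion-correct difficulty and item-total correlation rather than a latent-ability model.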
Test Like You Work
We develop test items that evaluate the full range of cognitive functions used on the job, assessing job knowledge as applied to realistic organizational scenarios. Throughout, we follow mature, standards-based processes consistent with industry best practices, and we apply innovative technology-based tools to create, administer, and score tests reliably and efficiently.
- Secure, worldwide administration
- Automated test assembly
- Online survey data collection with full qualitative and quantitative analysis
- In-person and virtual item development workshops
Accurate, Effective Measurement Is Our Goal
We consistently and rigorously evaluate test item performance and flag any items performing outside statistically acceptable norms. Our item scoring methods incorporate an expressed standard expected of qualified personnel (the modified Angoff method). This yields confidence that those who pass are, indeed, qualified.
- Actionable results reporting, including feedback reports with recommended training and development resources
- Tools for detection of skill decay
- Psychometric evaluation of test and item performance
- Evaluation of program effectiveness
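The modified Angoff method referenced above sets a cut score from expert judgment: each judge estimates, for every item, the probability that a minimally qualified examinee would answer it correctly, and the cut score is the sum of the per-item mean estimates. A minimal sketch, with purely illustrative judge ratings:

```python
def angoff_cut_score(ratings):
    """ratings[j][i]: judge j's estimate of the probability that a
    minimally qualified examinee answers item i correctly.

    Returns the Angoff cut score: the sum over items of the mean
    judge estimate, i.e. the minimum passing raw score.
    """
    n_judges = len(ratings)
    n_items = len(ratings[0])
    item_means = [
        sum(judge[i] for judge in ratings) / n_judges
        for i in range(n_items)
    ]
    return sum(item_means)

# Three hypothetical judges rating a four-item test.
ratings = [
    [0.9, 0.7, 0.6, 0.8],
    [0.8, 0.6, 0.5, 0.9],
    [1.0, 0.8, 0.4, 0.7],
]
cut = angoff_cut_score(ratings)  # 2.9 out of 4 items
```

In practice, "modified" variants add steps such as judge training, discussion rounds, and review of empirical item statistics before ratings are finalized.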
Conferences and Presentations
Our staff members share their expertise regularly in professional conferences, webinars, and workshops.
Automatic Response Option Sampling for Situational Judgement Items
Anne Thissen-Roe, Ph.D., and Stephen Gunter, Ph.D.
Society for Industrial and Organizational Psychology (SIOP) Conference, 2016
Adaptive Testing: Adapt and Overcome the Shortfalls of Traditional Proficiency Assessments
Robert "Mac" McLaughlin, Stephen Gunter, Ph.D., and Jeff Pearson
Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2015
Are They Mission Ready? Using the Modified Angoff Method To Set Cut Scores
Ingrid Mellone and Carol Faben
Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2014
Evaluate Training and Performance Effectively, Quickly, and Inexpensively - Using the Situational Judgment Test (SJT)
Stephen Gunter, Ph.D., Ingrid Mellone, Kate Oakley, and Carol Faben
Interservice/Industry Training, Simulation, and Education Conference (I/ITSEC) 2013