The Summit program document provides the presentation times, descriptions, and presenters for the 44th PTC Summit. Download it by clicking the title above.
Advances in technology are creating opportunities for us to assess skills in ways that were unimaginable even a few years ago. AI, machine learning, telemetry, cognitive services, virtual reality, and augmented reality, to name just a few, will revolutionize how we identify and measure skills, and we will be much more accurate in our assessments. As the PTC, are we thinking about performance testing too narrowly? Should we be expanding how we think about skills assessment given the doors that technology is opening for us? Does an exam have to be delivered in a test center or through online proctoring? Why can't we assess people as they are doing the work (in-work assessment)--the purest form of performance testing imaginable?! Why haven't we been talking about this as an organization? The purpose of this session is to share some potentially outrageous ideas about assessment that might shape the future of PT.
Presenter – Liberty Munson, Microsoft
Liberty Munson is the Lead Psychometrician for the Microsoft Worldwide Learning organization and is responsible for ensuring that the skills assessments in Microsoft Technical Certification and Employability Programs are valid and reliable measures of the content areas they are intended to measure. Prior to Microsoft, she worked at Boeing in their Employee Selection Group, assisted with the development of their internal certification exams, and acted as a co-project manager of Boeing’s Employee Survey. She received her BS in Psychology from Iowa State University and her MA and PhD in Industrial/Organizational Psychology, with minors in Quantitative Psychology and Human Resource Management, from the University of Illinois at Urbana-Champaign.
Presenters – Jonathan Parnell, ITI and Wallace Judd, Authentic Testing
The Construction Hazards Assessment works both in VR and on a computer monitor. We'll show segments of the on-screen exam.
Then we will show the types of analysis required to conform to ISO/IEC 17024, and the design elements that are invisible to the test taker but necessary to keep the test fair, secure, and ADA-compliant.
We will demo a segment of the Construction Hazards Assessment, then show how it was designed and how we resolved the issues listed below.
We will help you to understand the following issues and how to solve them:
Security – What steps do you take to secure something as memorable as a Virtual Reality scenario?
Scoring – How can you score not only on speed but also on the variable difficulty of live tasks?
Mechanics – With limited controls (point, shoot, click) how can candidate performance be modeled?
JTA – What components does a VR JTA have that a conventional (multiple-choice) JTA doesn’t?
Administration – How can a VR cert be administered at conventional brick-and-mortar sites?
ADA – What do you do with the 5% of candidates who cannot wear the headset without becoming disoriented?
Reporting – How can you report results to candidates when you have differential items for each candidate?
Audit – What is a reasonable audit trail & how do you maintain it?
Presenters – Jonathan Parnell, ITI and Wallace Judd, Authentic Testing
Jonathan Parnell is Director of Product Development for ITI. ITI is the foremost provider of educational and technical services for those who use cranes, rigging, and load handling equipment.
Wallace Judd is President of Authentic Testing Corp., a company which conducts analytical studies for certifications, both performance-based and conventionally administered.
Presenter – Greg Applegate, NREMT
Session – Adapting to a Pandemic: Changing Administration Guidelines for a Performance Examination
A review of lessons learned and open questions about how to adapt a performance examination that requires human contact to a non-contact environment. This will be more a discussion than a seminar and participants are encouraged to bring and share their own experiences with performance testing during a pandemic.
Presenter – Greg Applegate, NREMT
Greg Applegate is the Chief Science Officer for the National Registry of Emergency Medical Technicians, with over 30 years’ experience as an educator, psychometrician, and manager. He holds a PhD in Educational Psychology and a Master of Business Administration degree. His interests include item development, testing accommodations, and performance testing. At the National Registry, Greg directs the research, examinations, and psychometric teams with a focus on high-quality EMS research and on ensuring that certification examinations are fair, valid, reliable, and legally defensible. He is an active member of the education, science, and certification communities, serving as a commissioner for the Commission on Accreditation of Allied Health Education Programs (CAAHEP) and as a member of the National Commission on Certifying Agencies (NCCA) guidelines revision committee. He is also a member of the American Association for the Advancement of Science (AAAS), the National Council on Measurement in Education (NCME), and the Performance Testing Council.
Our Cornerstone Members who support us all year are:
And thanks to our Summit Sponsors
Presenter – Sara Clark, NICET
Representing one of the newest members of the Performance Testing Council, Sara Clark of the National Institute for Certification in Engineering Technologies (NICET) will describe its certification programs, its performance testing progress, and why NICET joined the Performance Testing Council.
Presenter – Sara Clark, NICET
Sara is an avid reader and a hands-on aunt. Her favorite childhood game was dress-up, and she wanted to be The Boss when she grew up. Sara is the Manager of Certification Program Development at the National Institute for Certification in Engineering Technologies (NICET).
With more than 10 years’ experience in test development, Sara oversees NICET’s program development department. The department’s current challenge is integrating performance examinations alongside certification examinations.
Presenter – Amin Saiar, PSI Services
Every assessment needs to be designed in a thoughtful way before its creation to ensure that it can be an effective and meaningful measure of a job role or skill-set -- or something in between. Job analysis is a family of mixed-method approaches to defining the scope of an assessment. In credentialing, job analysis for traditional-format (i.e., multiple-choice) assessments is well established and performed routinely. In comparison, designing performance-based assessments is terra incognita for many psychometricians. This session will discuss how the job analysis process can be applied to develop examination programs that focus on performance-based assessment methods. The presenter will discuss how different classifications of competencies (knowledge, skills, and abilities) can be leveraged to inform assessment formats and item types, what additional design decisions are necessary at the job analysis phase for a performance-based exam, and how content expertise can be leveraged to design customized high-fidelity assessment formats without losing sight of applicable measurement principles.
Presenter – Amin Saiar, PSI Services
Dr. Amin Saiar is the Vice President, Psychometrics at PSI Services and joined PSI in January 2013. During his 12+ years of experience in the assessment industry, he has conducted job analyses, psychometric evaluation of credentialing programs, accreditation consulting, examination development, and standard setting for numerous certification and licensure programs. He is experienced in a variety of assessment formats including written examinations with alternative item types and performance-based examinations as well as assessment applications outside of credentialing including medicine, public health, licensing regulation, career development, and organizational psychology. Dr. Saiar served as a commissioner for the National Commission for Certifying Agencies (NCCA) for 6 years and is a current member of the Association of Test Publishers (ATP) Certification and Licensure Division leadership. He received his B.A. in Psychology from the University of California, Los Angeles and his Ph.D. in Industrial/Organizational Psychology from the California School of Professional Psychology.
Presenter - Vincent Lima, President of The Performance Testing Council
On Day 1, we will get to know each other and practice performance-testing skills. You’ll join a team creating a certification test for one of these tasks:
Build and fly an elegant paper airplane
Design, make, and model a scary face mask
Draw a realistic portrait in pencil
Make an ornamental edible object
Your group will have 30 minutes or so to design the test—and make or renew connections. Then, by the end of the day, you’ll create a short video of yourself demonstrating your competence (perhaps!) in the skill, and you’ll have a chance to rate the performance of the other members of your group.
On Day 2, we will review the test statistics for each team’s test and discuss issues and insights that may come up during this exercise.
Presenter – Vincent Lima, Authentic Testing
Vincent has guided numerous programs through the test-development process and accreditation by ANSI under certificate or personnel-certification standards. He was instrumental in developing the process for efficiently creating, reviewing, pilot testing, and scoring performance-based item sets for the Uniform CPA Exam. He served for five years as an assessment specialist at the Educational Testing Service, writing multiple-choice items for the GRE and GMAT. Outside the testing industry, he has worked as a newspaper and journal editor.
Vincent serves as director of test development at Authentic Testing Corporation, providing psychometric and test-development support to several test sponsors.