
Aviation Instructor's Handbook


Types of Assessments

Instructors continuously evaluate a learner's performance in order to provide guidance, suggestions for improvement, and positive reinforcement.

Traditional assessment involves the kind of written testing and grading that is most familiar to instructors and learners. This includes things like multiple-choice tests and brief quizzes.

Authentic assessment requires the learner to demonstrate not just the rote and understanding levels of learning, but also the application and correlation levels. The learner must exhibit in-depth knowledge by generating a solution instead of merely selecting a response. Learners know the standards of success in advance.

A rubric is a guide used to score performance assessments in a reliable, fair, and valid manner. It is generally composed of dimensions for judging learner performance, a scale for rating performances on each dimension, and standards of excellence for specified performance levels.
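
To make those three parts concrete, the sketch below (in Python) models a very small rubric: a list of dimensions, an ordered rating scale, and a standard that says every dimension must be rated at least "satisfactory." The dimension names, scale labels, and passing standard are illustrative assumptions, not taken from any FAA rubric.

    # Illustrative only: the dimensions, scale labels, and passing standard
    # below are hypothetical, not drawn from an FAA publication.
    RATING_SCALE = ["unsatisfactory", "marginal", "satisfactory", "exemplary"]
    RUBRIC_DIMENSIONS = ["altitude control", "airspeed control", "checklist usage"]

    # Standard of excellence for this sketch: every dimension must be rated
    # at least "satisfactory".
    PASSING_LEVEL = RATING_SCALE.index("satisfactory")

    def meets_standard(ratings):
        """ratings maps each dimension name to one of the scale labels."""
        return all(
            RATING_SCALE.index(ratings[dim]) >= PASSING_LEVEL
            for dim in RUBRIC_DIMENSIONS
        )

    sample = {"altitude control": "satisfactory",
              "airspeed control": "exemplary",
              "checklist usage": "marginal"}
    print(meets_standard(sample))   # False: one dimension is below the standard

Scoring each dimension separately, rather than assigning a single overall impression, is what lets a rubric stay reliable and specific from one assessment to the next.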

Diagnostic assessments assess learner knowledge or skills prior to a course of instruction.

Formative assessments provide a wrap-up of the lesson and set the stage for the next lesson. They are not graded.

Summative assessments measure how well learning has progressed to that point. They are used periodically throughout the training.


Purpose of Assessment

An effective assessment provides critical information to both the instructor and the learner. A good assessment provides practical and specific feedback to learners, with direction and guidance indicating how they may raise their level of performance.

A well-designed assessment can help the instructor see where more emphasis is needed. If several learners falter when they reach the same step in a weight-and-balance problem, the instructor should recognize the need for special emphasis.
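
For context, the step that typically trips learners up in a weight-and-balance problem is the arithmetic chain of multiplying each weight by its arm to get a moment, totaling the moments, and dividing by total weight to find the center of gravity. A minimal Python sketch with made-up weights and arms (not from any actual aircraft) shows that chain:

    # Hypothetical station weights (lb) and arms (inches aft of datum).
    stations = [
        ("empty aircraft", 1650.0, 39.0),
        ("pilot and passenger", 340.0, 37.0),
        ("fuel", 240.0, 48.0),
        ("baggage", 60.0, 95.0),
    ]

    total_weight = sum(weight for _, weight, _ in stations)
    total_moment = sum(weight * arm for _, weight, arm in stations)
    cg = total_moment / total_weight   # the division step many learners miss

    print(f"Total weight: {total_weight:.0f} lb")
    print(f"CG: {cg:.2f} in aft of datum")   # about 41.1 in with these numbers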


General Characteristics of Effective Assessment

An effective assessment displays several characteristics: it is objective, flexible, acceptable, comprehensive, constructive, organized, thoughtful, and specific.


Traditional Assessment

A traditional assessment generally refers to written testing — multiple choice, matching, true/false, fill in the blank. There is a single, correct response for each item and an established time limit to complete the assessment.

The assessment, or test, assumes that all learners should learn the same thing, and relies on rote memorization of facts. Responses are often machine scored and offer little opportunity for a demonstration of the thought processes characteristic of critical thinking skills.

While the measure of performance is limited, such tests are useful in assessing the learner's grasp of information, concepts, terms, processes, and rules. This becomes the foundation needed for higher levels of learning.
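
Because each item has exactly one keyed response, machine scoring amounts to comparing answers against a key and counting matches, which is why such tests say little about a learner's reasoning. A minimal Python sketch (item numbers, key, and responses are invented for illustration):

    # Hypothetical answer key and learner responses for a short quiz.
    answer_key = {1: "B", 2: "D", 3: "A", 4: "C"}
    responses  = {1: "B", 2: "A", 3: "A", 4: "C"}

    correct = sum(1 for item, keyed in answer_key.items()
                  if responses.get(item) == keyed)
    score = 100 * correct / len(answer_key)
    print(f"{correct}/{len(answer_key)} correct ({score:.0f}%)")   # 3/4 correct (75%)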

Characteristics of a Good Written Assessment

A good written assessment is reliable (it yields consistent results), valid (it measures what it is supposed to measure), usable, objective, comprehensive (it samples liberally whatever is being measured), and discriminating (it detects small differences in achievement among learners).

Authentic Assessment

Authentic assessment asks the learner to perform real-world tasks and demonstrate a meaningful application of skills and competencies. Learners must rely on critical-thinking skills. Instructors rely upon open-ended questions and established performance criteria.

Authentic assessment focuses on the learning process, enhances the development of real-world skills, encourages higher order thinking skills, and teaches learners to assess their own work and performance.

Learner-Centered Assessment

Collaborative critique is a form of learner-centered grading. The instructor begins by using a four-step series of open-ended questions (replay, reconstruct, reflect, redirect) to guide the learner through a complete self-assessment. Through this discussion, the instructor and the learner jointly determine the learner's progress.

Questions that may help with the "Reflect" step of a collaborative critique include:

  1. What was the most important thing you learned today?
  2. What part of the session was easiest for you? What part was hardest?
  3. Did anything make you uncomfortable? If so, when did it occur?
  4. How would you assess your performance and your decisions?
  5. How did your performance compare to the standards in the ACS?

Questions that may help with the "Redirect" step of a collaborative critique include:

  1. How does this experience relate to previous lessons?
  2. What might be done to mitigate a similar risk in a future situation?
  3. Which aspects of this experience might apply to future situations, and how?
  4. What personal minimums should be established, and what additional proficiency flying and/or training might be useful?


(Figure: Rubric for assessing flight training maneuvers.)

Rubrics

The collaborative assessment process in learner-centered grading uses two broad rubrics:

For maneuvers or procedures, at the completion of the scenario the learner is able to describe, explain, practice, or perform the maneuver or procedure, with "perform" as the desired outcome.

For assessing risk management skills, the learner is able to explain, practice, or manage/decide the risks associated with the scenario, with "manage/decide" as the desired outcome.
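
To show the two scales side by side, the Python sketch below orders them from lowest to highest level and checks a set of jointly agreed grades against the desired outcomes of "perform" and "manage/decide." The lesson entries and the desired-outcome check are assumptions for illustration, not a prescribed grading procedure.

    # Learner-centered grading scales, ordered from lowest to highest level.
    MANEUVER_SCALE = ["describe", "explain", "practice", "perform"]
    RISK_SCALE = ["explain", "practice", "manage/decide"]

    def at_or_above(scale, grade, desired):
        """True when the agreed grade meets or exceeds the desired outcome."""
        return scale.index(grade) >= scale.index(desired)

    # Hypothetical grades agreed on during a collaborative critique.
    scenario_grades = {"steep turns": "practice", "risk management": "manage/decide"}

    print(at_or_above(MANEUVER_SCALE, scenario_grades["steep turns"], "perform"))        # False
    print(at_or_above(RISK_SCALE, scenario_grades["risk management"], "manage/decide"))  # True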


Choosing an Effective Assessment Method

When deciding how to assess learner progress, aviation instructors can follow a four-step process:

  1. Determine level-of-learning objectives.
  2. List indicators of desired behaviors.
  3. Establish criterion objectives.
  4. Develop criterion-referenced test items.

Authentic assessment may not be as useful as traditional assessment in the early phases of training. The learner does not have enough information about the concepts or knowledge to participate. When exposed to a new topic, learners first tend to acquire and memorize facts. When learners possess the knowledge needed to analyze, synthesize, and evaluate, they can participate more fully in the assessment process.

Determine Level-of-Learning Objectives

The first step is to determine the level of learning the learner is expected to reach: rote, understanding, application, or correlation. The objective states what the learner should be able to do at the end of instruction.

List Indicators/Samples of Desired Behaviors

Next, the instructor lists observable behaviors that indicate the learner has reached the desired level of learning. Because it is not practical to measure every possible behavior, the instructor samples the most important and representative ones.

Establish Criterion Objectives

A criterion objective states the desired behavior, the conditions under which it is to be performed, and the criteria, or standards, the performance must meet, such as the tolerances published in the ACS.

Develop Criterion-Referenced Assessment Items

Finally, the instructor develops assessment items that measure each criterion objective directly. A criterion-referenced item judges the learner against the stated standard rather than against the performance of other learners.
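
One way to picture a criterion-referenced item is as a pass/fail check against a fixed standard rather than a ranking against other learners. The Python sketch below uses placeholder tolerances (assumed for illustration, not quoted from the ACS):

    # Assumed criterion for this sketch: hold altitude within 100 ft and
    # airspeed within 10 kt of the assigned values.
    def meets_criterion(alt_error_ft, airspeed_error_kt):
        return abs(alt_error_ft) <= 100 and abs(airspeed_error_kt) <= 10

    # Every learner is measured against the same fixed standard,
    # not against how other learners performed (norm-referenced testing).
    print(meets_criterion(alt_error_ft=60, airspeed_error_kt=4))    # True
    print(meets_criterion(alt_error_ft=150, airspeed_error_kt=4))   # False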


Critiques and Oral Assessments

An effective critique considers good as well as bad performance, the individual parts, relationships of the individual parts, and the overall performance.

Instructor/Learner Critique

The instructor leads a group discussion in which members of the class are invited to offer criticism of a performance. The instructor keeps the discussion focused and constructive.

Learner-Led Critique

The instructor asks a learner to lead the assessment. This can generate learner interest and involvement, but the discussion may lack the direction and control of an instructor-led critique.

Small Group Critique

The class is divided into small groups, each assigned a specific area to analyze. Each group presents its findings to the class, and the instructor combines them into a complete assessment.

Individual Learner Critique by Another Learner

The instructor asks another learner to present the entire assessment of a peer's performance. The instructor must maintain firm control of the process and add any important points the learner overlooks.

Self-Critique

The learner critiques their own performance. Self-assessment must be controlled and supervised by the instructor, who ensures the learner does not leave with erroneous impressions or an inaccurate picture of the performance.

Written Critique

A written critique gives the instructor time to consider the assessment carefully, provides the same information to every learner, and gives the learner a permanent record to review later. Its main disadvantage is that other class members do not benefit from the discussion.


Oral Assessment

The most common means of assessment is direct or indirect oral questioning of learners by the instructor. Proper quizzing by the instructor can have a number of desirable results, such as:

  1. Revealing the effectiveness of the instructor's training methods
  2. Checking retention of what has been learned
  3. Reviewing material already presented
  4. Retaining learner interest and stimulating thinking
  5. Emphasizing the important points of training
  6. Identifying points that need more emphasis
  7. Checking comprehension of what has been covered
  8. Promoting active learner participation

Characteristics of Effective Questions

The instructor should devise and write pertinent questions in advance. To be effective, questions must:

  1. Apply to the subject of instruction
  2. Be brief and concise, but also clear and definite
  3. Be adapted to the ability, experience, and stage of training of the learner
  4. Center on only one idea (limited to who, what, when, where, how, or why, not a combination)
  5. Present a challenge to the learner

Types of Questions to Avoid

Effective quizzing avoids puzzle, oversize, toss-up, bewilderment, trick, and irrelevant questions. Asking "Do you understand?" or "Do you have any questions?" should also be avoided, because learners often answer yes or no even when they are unsure.

Answering Learner Questions

Before answering, the instructor should make sure the question is clearly understood and should show genuine interest in it. After responding, the instructor should confirm that the answer actually satisfied the learner. When a question raises material beyond the current lesson, it is often better to acknowledge it briefly and defer the full answer to a later point in training.

Scenario-Based Training

Scenario-based training uses a highly structured script of real-world experiences to address aviation training objectives in an operational environment. A well-constructed scenario has a clear objective, is tailored to the needs of the learner and the local environment, and does not have a single correct answer, which makes it well suited to authentic, learner-centered assessment.


Flight Instructor Test Questions

Proper oral quizzing by the instructor during a lesson identifies points which need more emphasis.

Before students willingly accept their instructor's critique, they must first accept the instructor.

To be effective in oral quizzing during the conduct of a lesson, a question should be of suitable difficulty for that stage of training.

A written test has validity when it measures what it is supposed to measure, not when it yields consistent results (reliability).

A written test is said to be comprehensive when it samples liberally whatever is being measured.

Supply-type test items cannot be graded with uniformity.

A characteristic of supply-type test items is that the same test graded by different instructors would probably be given different scores.

Effective multiple-choice test items should keep all alternatives of approximately equal length.

One of the major difficulties encountered in the construction of multiple-choice test items is inventing distractors that are attractive to students lacking knowledge or understanding.

In a written test, matching-type selection items reduce the probability of guessing correct responses.

Practical Test Standards: Flight Instructor

I. Fundamentals of Instructing
Task D: Assessment and Critique

Objective: To determine that the applicant exhibits instructional knowledge of assessments and critiques by describing:

Assessment:

  1. Purpose of assessment
  2. General characteristics of effective assessment
  3. Traditional assessment
  4. Authentic assessment
  5. Oral assessment
  6. Characteristics of effective questions
  7. Types of questions to avoid

Critique:

  1. Instructor/student critique
  2. Student-led critique
  3. Small group critique
  4. Individual student critique by another student
  5. Self-critique
  6. Written critique

Oral Exam Questions

  1. What is the purpose of assessment?
  2. What makes an assessment effective?
  3. What are the different types of assessments you can expect to use as an instructor?
  4. Why would you use a traditional assessment? An authentic assessment?
  5. What is the purpose of an oral assessment?
  6. What is a rubric?
  7. What are the characteristics of good questions?
  8. What types of questions should you avoid?
  9. What is the purpose of critique?
  10. What are some different types of critique?
  11. What are the four steps of a collaborative critique?

Robert Wederquist   CP-ASEL - AGI - IGI
Commercial Pilot • Instrument Pilot
Advanced Ground Instructor • Instrument Ground Instructor

