College of Education and Human Development - George Mason University

2023 CAEP Annual Reporting

Measure 1. Completer Effectiveness (R4.1)

COMPLETER IMPACT IN CONTRIBUTING TO P-12 STUDENT-LEARNING GROWTH: As reported in AY20-21, school divisions across the Commonwealth closed schools in March 2020 to prioritize the health and safety of Virginia children and their teachers, and to lessen the spread of COVID. The majority of Virginia public schools went fully remote for Academic Year 2020-2021, as guidelines were continuously revised in response to new information and emerging questions.

In March 2021, the Virginia Department of Education released revised Guidelines for Uniform Performance Standards and Evaluation Criteria for Teachers. The Guidelines noted that the teacher performance standards cannot be modified; however, school divisions may modify the performance indicators, the performance rubrics, and the approach to and incorporation of observation, documentation, and other data sources. The measure of student progress (i.e., completer effectiveness) must remain a significant component of the evaluation; yet “how student academic progress is met in the evaluation is the responsibility of the local school division.” Essentially, all Virginia teachers are required to be evaluated against the standards, but how they are evaluated, and whether this evaluation is collected in a systematic, automated process, is decided at the local level.

In preparation for AY2021-22, school divisions began to re-evaluate the COVID climate, trying to determine what best suited the safety and educational needs of their local communities. Approaches ranged from fully in-person, to blended, to remaining fully remote. As one employer commented in the VEAC Employer survey (more details below), “The 2021-2022 School Year was more like [XXXXX]'s first year of teaching, as she was a virtual teacher last year, and only returned to in-person learning with 50% of her class last year.” There continued to be substantial COVID-related challenges as protocols for handling positive COVID cases changed throughout the school year.

During this same period, the Virginia Department of Education stopped providing each EPP with a list of its completers and their places of employment (see below). This added another barrier to each EPP's knowing which school divisions employed its completers.

Based on the continued COVID challenges and the lack of supporting documentation from the Virginia Department of Education, in AY2021-22 the EPP did not contact local school divisions to collect anonymized and aggregated Teacher Performance Evaluation data on EPP completers that would serve as evidence of contributions to P-12 student-learning growth. The EPP felt that the request would be too burdensome for the local school divisions and that, due to the potential range in teacher evaluation implementation, the data might not be valid or reliable for ongoing program evaluation. As a stakeholder in the Virginia Education Assessment Collaborative (VEAC), the EPP plans to collaborate with other EPPs to determine the most beneficial approach to collecting evidence of completers' contributions to P-12 student-learning growth.

In late spring 2022, the EPP did revisit the possibility of collecting Teacher Evaluation data with one local school division. However, the EPP also realized that, due to COVID, school divisions had experienced severe turnover and school division contacts had changed. It has therefore been important for the EPP to identify appropriate school division contacts again and to establish new relationships from which both the EPP and the school divisions could mutually benefit. In late spring, the EPP designated an internal contact to reach out to school divisions and re-determine the approach for potentially collecting teacher evaluation data.

The long-term goal is that, once the EPP re-establishes a reporting system with this local school division, the EPP can collaborate with other EPP stakeholders in VEAC to collect Teacher Evaluation data on all Virginia EPP completers in this school division and to share this information so that all Virginia EPPs have equivalent, benchmarkable data as evidence of contributing to P-12 student-learning growth.

COMPLETER EFFECTIVENESS IN APPLYING PROFESSIONAL KNOWLEDGE, SKILLS, AND DISPOSITIONS: In the meantime, as noted last year, the EPP has re-focused on collecting more concrete, in-depth qualitative information about how completers apply professional knowledge, skills, and dispositions in the P-12 classroom. The EPP plans to collect information for its licensure programs in groups, rather than for all licensure programs at once, to make each collection more useful and targeted.

As noted below, the EPP participates in the VEAC Completer Surveys. The EPP provides an in-house-generated list of EPP completer emails to VEAC’s survey manager (the University of Virginia), which sends out the surveys and collects responses as evidence for CAEP R4.2, R4.3, RA4.1, and RA4.2.

During AY22-23, the EPP plans to increase its response rate by 1) focusing on a selected group of licensure completers, 2) obtaining better non-EPP email addresses for this selected group, and 3) having program faculty email this selected group in advance of the survey as a “heads up” notice (using a VEAC template email, of course!).

Within this “heads up” email, program faculty will ask for volunteers to participate in focus groups. VEAC has created focus group questions and a protocol that will be piloted first by James Madison University and then by Mason. The focus groups would be conducted by PhD students, giving them the opportunity to practice focus group moderating and data collection skills, and ensuring unbiased moderators. Also, with the increased use of Zoom, remote focus groups can be considered without the need for travel and additional time. Planning is in place for implementation in late spring 2023.

Measure 2 (Initial and Advanced): Satisfaction of employers and stakeholder involvement (R4.2, R5.3, RA4.1)

As noted in previous Annual Reports, in the Commonwealth of Virginia there is no clear mechanism for collecting and sharing data across the state education agency, EPPs, and P-12 school divisions. Collection of accurate emails and post-graduation contact information is a continuous challenge. Additionally, in summer 2021, the EPP learned of the retirement of a critical, long-term Virginia Department of Education senior official, as well as of a long-serving, beloved VDOE staff member who provided each Virginia EPP its completer list during the summer months. Both of these changes greatly impacted the EPP’s ability to provide the Virginia Education Assessment Collaborative (VEAC) with accurate completer lists and emails for completer satisfaction surveys.

Despite these challenges, VEAC distributed and collected survey information for 29 Virginia EPPs for the third year in AY 2021-2022.

Employer Satisfaction Data
As noted in the Initial Licensure Employer Survey VEAC Report, “for our 2021-2022 cycle, VEAC fielded the Employer Survey to employers of completers from 29 EPP Initial Standard 4 partners” (up from 27 EPPs the previous year). Upon closing the survey in August 2022, VEAC had collected 1,169 complete and partial employer responses (a 29% response rate) and 874 complete and partial completer responses (a 29% response rate). For George Mason University, the EPP had a 6% (n=16) employer response rate and a 7% (n=30) completer response rate on the VEAC Surveys, based on the total number of contacts submitted to VEAC minus the number of failed/bounced emails. It was this disappointing response rate that led Mason to re-evaluate its approach in favor of a more focused, targeted licensure population for the next academic year.

The average readiness of the EPP's completers, as rated by employers, was 4.31 (on a 1-5 scale); this score was lower than the overall VEAC score of 4.43 and reflected the EPP’s overall results in the VEAC Employer survey. For every VEAC item, Mason completers scored lower than the overall VEAC group except one: “Maintains a commitment to professional ethics, communicates effectively, and takes responsibility for, and participates in professional growth that enhances student learning.” For this item, Mason was only marginally higher (3.38 versus 3.36). The EPP plans to hold an event where licensure programs can deeply review these data to understand how improvements can be made at the program level to increase Mason’s scores.

Looking at the VEAC Employer data in conjunction with the VEAC Completer data, for George Mason University, the average overall satisfaction of Mason completers was marginally lower than last year at 4.43, and marginally lower than the overall VEAC score of 4.49. Mason completers scored measurably lower on “Work results in acceptable, measurable, and appropriate student academic progress.” This is a rating that program faculty need to review, reflecting on how each program prepares candidates for this measure.

Looking again at the areas of Technology and Diversity/Cultural Awareness, the Employer responses seem to reflect the need for Mason to review its completer preparation:

Technology:
Item H: Selects technologies, informed by research, to promote learning for all students.
  Completer: EPP mean 3.23 vs. VEAC mean 3.23
  Employer: EPP mean 3.12 vs. VEAC mean 3.26

Item I: Integrates technology into instructional materials.
  Completer: EPP mean 3.23 vs. VEAC mean 3.31
  Employer: EPP mean 3.12 vs. VEAC mean 3.32

Diversity and Cultural Awareness:
Item J: Brings multiple perspectives to instruction, including the learners' personal, family, and community experiences.
  Completer: EPP mean 3.23 vs. VEAC mean 3.28
  Employer: EPP mean 3.12 vs. VEAC mean 3.24

Item K: Integrates diverse language and cultures into instruction to promote the value of multilingual/multicultural perspectives.
  Completer: EPP mean 3.06 vs. VEAC mean 3.07
  Employer: EPP mean 3.12 vs. VEAC mean 3.17

In the open-ended items, employers gave few responses. One quote is noted above regarding the working environment; the two other responses were positive.

These reports were widely shared with faculty to review and to inform their program goals and improvement; however, a deeper dive needs to be conducted to determine why Mason's results are overall lower than the Virginia average. These VEAC surveys are critical in informing our licensure programs about how their preparation is perceived.


In previous years, the advanced licensure faculty have expressed concern about survey distribution to advanced license completers who may choose to remain in their initial licensure employment, and to their employers. There was a concern that those surveyed would not be able to clearly differentiate between skills gained in the initial licensure program and those gained in the advanced licensure program. Quantitative data were collected but did not satisfy the need for relevant and actionable data, which may be more readily available through a qualitative approach.

Based on these findings, the EPP advanced licensure programs instead identified open-ended qualitative questions that were shared with their advisory councils.

As noted in last year’s Annual Report, over AY2021-22, the Advanced programs of the EPP submitted an EPP Advanced Interim CAEP Report and participated in a Virtual Site Visit in fall 2021. In May 2022, the Advanced Programs were granted CAEP accreditation until our next review, which will occur in fall 2025.

In preparation for this Advanced Interim Report, the programs developed plans for the collection of employer satisfaction data. The plans outlined the desire to increase the participation of Advisory Councils for the following reasons:

  • Advisory Council activities support the efforts of Advanced CAEP Standards 2, 3, 4, and 5 through the EPP’s DARE.
  • Advisory Council membership is inclusive and diverse. Members may include clinical experience partners, candidates, completers, employers, superintendents, school division HR experts, and other active community members. By having a diversity of positions and points of view, the Advisory Council discussions can inform and activate new program direction and improvement.
  • The Advisory Council approach allows for the collection of more in-depth, qualitative information that can directly inform program content and effectiveness for the specialization field. An active Advisory Council also strengthens the partnership and collaborative efforts of each program and its stakeholders. The structure and activities of the Advisory Council are based on the needs of the advanced program and work to best enhance and support program improvement.

In AY21-22, as part of the CAEP Interim Report, each Mason advanced program provided a description of their advanced program's current Advisory Council, their participants, and supporting evidence of their participation in program evaluation, improvement, and/or identification of models of excellence. The evidence of Advisory Council active discussions related to employer satisfaction can be extracted from the Advisory Council minutes. The feedback is content-specific, but informs the level of employer satisfaction.

As noted above, and in the Interim Advanced CAEP Report, the EPP had initially expressed concern about using surveys for advanced completers and employers. The EPP is revisiting this approach, as VEAC has created advanced surveys, specific to each licensure area, that align with the CAEP standards as well as the appropriate Virginia standards. The EPP plans to pilot this approach in spring 2023.

Stakeholder Involvement (R5.3): The EPP has a well-established Quality Assurance System called DARE (Data Assessment Review and Evaluation), which gives programs the opportunity to examine multiple measures to inform their program goals on an annual basis. As noted above, “Advisory Council activities support the efforts of Advanced CAEP Standards 2, 3, 4, and 5,” and advisory councils are made up of a variety of stakeholders and representatives who bring their own voices and perspectives to each licensure program. The Advisory Council's actionable feedback, formal and informal, informs all aspects of each licensure program and serves as a measure for our DARE and as evidence of stakeholder involvement.

In addition to our program Advisory Councils, as an active member of VEAC, the EPP also collects additional stakeholder feedback through the VEAC Surveys. The quantitative results, as well as the qualitative responses, of our completers and employers serve as some of the multiple measures for the EPP’s DARE, as evidence informing the EPP’s mastery of Standard 4, and as evidence of stakeholder participation.

Measure 3 (Initial and Advanced): Candidate competency at completion.

The following candidates successfully completed all state licensure requirements and were recommended for licensure:

Number of completers in programs leading to initial teacher certification or licensure: 340

Number of completers in advanced programs or programs leading to a degree, endorsement, or some other credential that prepares the holder to serve in P-12 schools (Do not include those completers counted above.): 187

Measure 4 (Initial and Advanced): Ability of completers to be hired.

Initial and Advanced Programs: Mason’s One Year Out Career Survey collects information about employment status and further education up to one year after obtaining a degree from Mason. This survey is administered to degree recipients once per year in early summer.

At this point, results after 2020-2021 have not yet been posted. Results from the 2020-21 survey had responses from 23 initial licensure and 32 advanced licensure completers. Initial licensure completers represented Health and Physical Education; ESOL and Foreign/World Languages; Early Childhood Education; Elementary Education; Secondary Education; and Art Education. Of the 23 respondents, 100% are employed; 87% (20 completers) are working in their field.

All five advanced programs were represented among the 32 respondents (Education Leadership; Math Ed Leadership; Literacy; School Counseling; School Psychology). All but one of the respondents (97%) are employed in the field; the one outlier noted that they are not seeking employment.

The ongoing shortage of education professionals in Virginia creates many employment opportunities for educators in the Commonwealth. VDOE is making efforts to recruit people into teaching, but data on the results of these efforts are not yet available.

Average GPA for Program Completers

Traditional Program            Group                             Degree Level  GPA
George Mason University (017)  All program completers, 2017-18   Bachelor’s    3.52
George Mason University (017)  All program completers, 2017-18   Master’s      3.96
George Mason University (017)  All program completers, 2016-17   Bachelor’s    3.48
George Mason University (017)  All program completers, 2016-17   Master’s      3.97
George Mason University (017)  All program completers, 2015-16   Bachelor’s    3.44
George Mason University (017)  All program completers, 2015-16   Master’s      3.97

Institutional Licensure Pass Rates

Traditional Program            Group                             Taking tests  Passing tests  Pass rate (%)
George Mason University (017)  All program completers, 2017-18   334           334            100
George Mason University (017)  All program completers, 2016-17   337           337            100
George Mason University (017)  All program completers, 2015-16   375           375            100

Institutional Report Card