College of Education and Human Development - George Mason University

2022 CAEP Annual Reporting

Measure 1 (Initial Licensure): Completer effectiveness.

As noted in the AY19-20 reporting, AY20-21 continued to be affected by the school closures that began in March 2020. Fairfax County Public Schools (FCPS), the main employer of Mason’s graduates, was fully remote from August 2020 to March 2021. In March 2021, students and teachers were given the option of returning in person or continuing virtually for the remainder of the school year. As a result, all classrooms became “concurrent” classrooms that mixed face-to-face and online students and teachers. For classrooms whose teachers chose to remain online, FCPS recruited “classroom monitors” to cover the in-school component.

FCPS, like all other school divisions, appropriately prioritized careful, measured steps to ensure the safety and health of all students, teachers, and staff. The EPP recognized the school divisions’ need to focus on these issues, as well as the ongoing challenge of collecting teacher evaluation data that would accurately reflect how EPP programs prepared candidates for the classroom.

As noted last year, the EPP did not contact local school divisions to collect anonymized and aggregated Teacher Performance Evaluation data on EPP completers. The EPP had hoped to restart this request over AY20-21, but again felt that it would be too burdensome for the school divisions and that the resulting data might not be valid or reliable for ongoing program evaluation.

At the end of AY20-21, the EPP began to study the new CAEP Workbook with the revised CAEP standards, to reassess appropriate measures for R4.1, and to consider the guidance that “while the most recent three cycles of data must be provided as part of the accreditation review, in the course of a seven year accreditation cycle data will be representative of all programs…” This approach allows the EPP to consider more options. Previously, the EPP had focused on collecting data for all licensure programs at all times. The EPP is now reconsidering smaller-scale approaches, such as case studies and focus groups, to collect more concrete, in-depth qualitative information about how completers effectively contribute to P-12 student-learning growth and apply their knowledge. With the re-opening of schools, observations and case studies can be arranged, and with the increased use of video conferencing, remote focus groups can also be considered without the travel and additional time they would otherwise require. A review of these options will continue into AY21-22.

Measure 2 (Initial and Advanced): Satisfaction of employers and stakeholder involvement.

INITIAL LICENSURE
As noted in previous Annual Reports, in the Commonwealth of Virginia there is no clear mechanism for collecting and sharing data across the state education agency, EPPs, and P-12 school divisions. Collecting accurate emails and post-graduation contact information is a continuous challenge. Additionally, in summer 2021 the EPP learned of the retirement of a critical, long-serving Virginia Department of Education (VDOE) senior official, as well as of a long-serving, beloved VDOE staff member who provided each Virginia EPP with its completer list during the summer months. Both of these changes greatly impacted the EPP’s ability to provide the Virginia Education Assessment Collaborative (VEAC) (https://projectveac.org) with accurate completer lists and emails for completer satisfaction surveys.

In AY19-20, VEAC launched the pilot completer and employer surveys, and it launched the same surveys again for AY20-21. VEAC partners submitted contact information for program completers to VEAC in January 2021. Initial recruitment for the surveys began on February 16, 2021, and the surveys remained open, with reminders, through April 26, 2021.

Employer Satisfaction Data
As noted in the Initial Licensure Employer Survey VEAC Report, “for our 2020-2021 cycle, VEAC fielded the Employer Survey to employers of completers from 27 EPP Initial Standard 4 partners. Upon closing the survey in April 2021, VEAC collected 1,405 complete and partial responses (40% response rate)… For George Mason University, the EPP had a 39% response rate on the VEAC Employer Survey based on the total number of contacts submitted to VEAC minus the number of failed/bounced emails.”

Mason had more employers (107) than completers (33) complete the surveys. The EPP suspects that more accurate email addresses for employers, along with employers seeing value in providing feedback, may explain the stronger employer response.
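As a worked illustration of the response-rate calculation described in the VEAC report, the rate is the number of responses divided by the number of contacts submitted minus the failed/bounced emails. The contact and bounce counts below are hypothetical (Mason’s actual counts are not reported here) and are chosen only so that the arithmetic is consistent with the reported 39% employer response rate:

\[
\text{response rate} = \frac{\text{responses}}{\text{contacts submitted} - \text{bounced emails}} \approx \frac{107}{300 - 25} = \frac{107}{275} \approx 39\%
\]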

Across all EPPs in 2020-2021, the average rated readiness of program completers was 4.52 (on a 1-5 scale). For George Mason University, the average overall satisfaction with Mason completers was marginally higher at 4.55, a notable increase from 4.26 the previous year. Looking again at the areas of interest of Technology and Diversity/Cultural Awareness, the employer responses appear comparable to the completer responses:

Technology:
Item H: Selects technologies, informed by research, to promote learning for all students.
Completer EPP Mean: 3.16 vs VEAC Mean: 3.27
Employer EPP Mean: 3.32 vs VEAC Mean: 3.37

Item I: Integrates technology into instructional materials.
Completer EPP Mean: 3.41 vs VEAC Mean: 3.38
Employer EPP Mean: 3.43 vs VEAC Mean: 3.44

Diversity and Cultural Awareness:
Item J: Brings multiple perspectives to instruction, including the learners' personal, family, and community experiences.
Completer EPP Mean: 3.13 vs VEAC Mean: 3.02
Employer EPP Mean: 3.36 vs VEAC Mean: 3.32

Item K: Integrates diverse language and cultures into instruction to promote the value of multilingual/multicultural perspectives.
Completer EPP Mean: 3.31 vs VEAC Mean: 3.32
Employer EPP Mean: 3.36 vs VEAC Mean: 3.25

In the open-ended items, employers offered very few negative comments. Among the numerous positive comments, one best summarized the overall feedback:

[Completer] dedicated to her craft as an educator and consistently seeks out additional learning experiences to develop the tools in her toolbox so she positively impact[s] students. She came with the necessary skills and mindset to tackle the challenges presented while thriving in a school setting. She is confident and secure in her beliefs about what is right for students and what will have the most impact on their learning. Thank you for pouring into her and preparing her to prepare future generations to live happy and successful lives as positive contributors to our community.

These reports were widely shared with the faculty to review and inform their program goals and improvement.

ADVANCED LICENSURE

In previous years, the advanced licensure faculty have expressed concern about distributing surveys to advanced licensure completers who may choose to remain in their initial licensure employment, and to their employers. The concern was that those surveyed would not be able to clearly differentiate between skills gained in the initial licensure program and those gained in the advanced licensure program. Quantitative data were collected but did not satisfy the need for relevant, actionable information that may be more readily available through a qualitative approach.

Based on these findings, the EPP’s advanced licensure programs instead identified open-ended qualitative questions that were shared with their advisory councils. A summary of this approach was reported in the EPP Advanced Interim CAEP Report and included in the Plans developed for that report.

As indicated in the Plan, the pandemic did impact the EPP in AY20-21; AERO lost critical staff involved in this process, and advanced program leaders needed to focus on the shift to online coursework and experiences. Advisory Councils, made up primarily of school representatives, were given the space to focus on their own immediate roles and employment activities. In AY20-21, each advanced program collaborated with stakeholders to respond to immediate needs, while internally creating broad questions to start conversations related to the CAEP Advanced Standards.

The Advanced Programs saw numerous advantages to using their Advisory Council:

  • Advisory Council activities support the efforts under Advanced CAEP Standards 2, 3, 4, and 5 through the EPP’s DARE.
  • Advisory Council membership is inclusive and diverse. Members may include clinical experience partners, candidates, completers, employers, superintendents, school division HR experts, and other active community members. Because members bring a diversity of positions and points of view, Advisory Council discussions can inform and activate new program directions and improvements.
  • The Advisory Council approach allows for the collection of more in-depth, qualitative information that can directly inform program content and effectiveness for the specialization field. An active Advisory Council also strengthens the partnership and collaborative efforts of each program and its stakeholders. The structure and activities of each Advisory Council are based on the needs of the advanced program and work to best enhance and support program improvement.

In AY21-22, as noted in the Plans, each advanced program intends to reconnect with its Advisory Council to collect responses that will inform program improvement and be included as measures in the EPP’s DARE.

Measure 3 (Initial and Advanced): Candidate competency at completion.

The following numbers of candidates successfully completed all state licensure requirements and were recommended for licensure.

Number of completers in programs leading to initial teacher certification or licensure: 310

Number of completers in advanced programs or programs leading to a degree, endorsement, or some other credential that prepares the holder to serve in P-12 schools (not including completers counted above): 213

Measure 4 (Initial and Advanced): Ability of completers to be hired.

Initial and Advanced Programs: Mason’s One Year Out Career Survey collects information about employment status and further education up to one year after obtaining a degree from Mason. The survey is administered to degree recipients once per year in early summer; at this point, results for years after 2019-20 have not yet been posted. Results from the 2019-20 survey show an overall 20% return rate. Of those who responded under the Curriculum and Instruction degree (which covers the majority of our licensure programs), 93% were employed in their field, 6% were not employed but seeking employment, and 1% were not employed and not seeking employment. Three completers of the Health and Physical Education BSEd responded that they were employed. For completers of the Education Leadership or Mathematics Education Leadership programs, 97% reported that they were employed in their field, while the remaining 3% were not seeking employment. The ongoing shortage of education professionals in Virginia creates many employment opportunities for educators (http://www.doe.virginia.gov/teaching/workforce_data/). It is also important to note that although the EPP receives only limited employment data from VDOE, we make every effort to gather completer information.

Average GPA for Program Completers

Traditional Program              Group                              Degree Level   GPA
George Mason University (017)    All program completers, 2017-18    Bachelor’s     3.52
George Mason University (017)    All program completers, 2017-18    Master’s       3.96
George Mason University (017)    All program completers, 2016-17    Bachelor’s     3.48
George Mason University (017)    All program completers, 2016-17    Master’s       3.97
George Mason University (017)    All program completers, 2015-16    Bachelor’s     3.44
George Mason University (017)    All program completers, 2015-16    Master’s       3.97

Institutional Licensure Pass Rates

Traditional Program              Group                              Number taking tests   Number passing tests   Pass rate (%)
George Mason University (017)    All program completers, 2017-18    334                   334                    100
George Mason University (017)    All program completers, 2016-17    337                   337                    100
George Mason University (017)    All program completers, 2015-16    375                   375                    100

Institutional Report Card