External Reporting Requirements for Teacher Licensure
2025 CAEP Accountability Measures
Measure 1. Completer Impact and Effectiveness
Completer Impact

As noted last year, the EPP continued to collect anonymized Teacher Evaluation data from school districts. In early 2024, the Virginia Department of Education (VDOE) reinstituted sharing a list of completers with employer information, and the EPP used this document as its main source of information. As described in the EPP’s recently submitted Self Study Report, based on this list the EPP decided to make a reasonable effort to collect anonymized and aggregated Teacher Evaluation data from other school districts that employ GMU completers. Of the 933 initial completers on the VDOE list, 35% had no information on where they may be employed. Of the remaining 605 names, over 500 were employed in the three large local school divisions in which collecting this data had proved inefficient and administratively challenging. For currency of the program, the EPP removed the 2019 completers, and for completer confidentiality, only school divisions with three or more completers were contacted. Data from three of the five remaining school districts was not collected for the following reasons:
The remaining two school districts provided limited data. After this multi-faceted approach, the result was anonymized Teacher Evaluation scores for 15 EPP 2020-2023 completers. As part of the EPP’s Quality Assurance System, each program received the results to review. The data was not representative of the initial licensure completers, but a modest inference can be made that, overall, completers are effective and meeting the Virginia Uniform Performance Standards for teachers in Virginia. In particular, of the 15 EPP completers scored, 14 were meeting or exceeding Standard 8: Student Academic Progress.

Completer Effectiveness

As noted last year, the EPP also continued its efforts to conduct focus groups to collect completer feedback on their effectiveness in the classroom. In preparation for spring 2024, the focus group protocol was moderately revised to clarify question meaning, based on the results of the spring 2023 focus groups. In addition, the solicitation sent via Qualtrics had every completer receive a message with the signature of a program contact. Names of completers interested in participating in the focus groups were again collected, this time yielding 41 names representing a broader range of programs. A pre-qualified PhD candidate facilitated the focus groups; however, only eight completers participated. The transcription of each focus group was reviewed for themes and points of interest, and similar results resonated each year. Overall, the completers felt that they effectively contributed to P-12 student learning growth and were able to apply what they had learned in the EPP programs. Themes included the need for more classroom management training, but completers also recognized and appreciated the EPP’s emphasis on the importance of building relationships. The discussions were fruitful and rich, and can be used to inform continuous improvement.
As part of the EPP’s Quality Assurance System, each program received detailed Focus Group Summaries to review.

In an effort to collect more evidence of EPP teacher effectiveness, the EPP conducted pilot Case Studies in spring 2024. An educator preparation faculty member with extensive experience observing pre-service teachers was compensated to conduct these pilot Case Studies. Participation was solicited in early spring 2024; 19 completers expressed interest, but only three of them participated. Another three were identified by the Case Study faculty member. Overall, the findings concluded that the completers were effective teachers and were able to apply the knowledge gained in their EPP programs. The Case Study participants also echoed the themes of the focus groups. As part of the EPP’s Quality Assurance System, each program received detailed Case Study Summaries to review.

One overall observation is that the personal connection between EPP faculty and their school district or completer helps secure stakeholder participation. Building and maintaining strong, mutually beneficial relationships is important.
Measure 2. Satisfaction of employers and stakeholder involvement
As noted in previous Annual Reports, in the Commonwealth of Virginia there is no clear mechanism for collecting and sharing data across the state education agency, EPPs, and P-12 school divisions. Collecting accurate post-graduation contact information, and therefore post-graduation information, is a continuous challenge. The Virginia Education Assessment Collaborative (VEAC) (https://projectveac.org) has been the primary driver of collecting this information across the state for several years.

Initial Licensure Programs

To recap, members of the 36 Virginia EPPs formed VEAC to create a state-wide system for data collection; specifically, to build a common set of assessment measures, tools, and activities that all Virginia EPPs may use in response to the CAEP requirements. In subsequent years, the GMU EPP’s growing and strong relationship with other Virginia EPPs grew out of the benefits of the VEAC partnership. One of VEAC’s first developments was the creation and implementation of employer surveys in response to CAEP Standard R4.2. As noted in previous years, the initial employer response rate (N=16 at 6%) was very low and not representative; that survey was conducted as schools were emerging from COVID. For employers, VEAC provides a comprehensive list of principals’ names and emails for each school in the Commonwealth, and the EPP has made better efforts to reference this list to provide complete information for each identified employer and reduce email bounce-back on the initial survey request. Each year thereafter, the number and rate of responses have grown (AY22-23, N=79 at 26%; AY23-24, N=108 at 34%). VEAC provides each EPP with its raw and prepared response data, as well as benchmarking results, on employer satisfaction.
As part of the EPP’s Quality Assurance System (QAS), each initial licensure program reviewed these results, as well as its disaggregated data, to evaluate qualitative comments from the employers. In response to CAEP R4.2, for the representative sample years, the employer responses to the VEAC overall satisfaction question indicate that employers are satisfied with GMU EPP completers. For the question, "Based on your experience with this teacher, what best describes the extent to which they were ready to meet the needs of students in your school?" on a scale of 1-5, with 5 being "Fully ready," the EPP overall mean was 4.21 in AY22-23 and improved to 4.29 in AY23-24.

Stakeholder Involvement

Our commitment to collaborative partnerships begins with the EPP leadership, including the dean of the College of Education and Human Development, the director of the School of Education, and other senior academic leaders. All of these leaders participate in high-level partnerships that are mutually beneficial to PreK-12 schools and community agencies and that focus on preparing candidates to be future leaders. Each program is in continual contact with partners. The programs with formal broad advisory councils meet on a regular basis with representatives from partner school administrators, community leaders, alumni, clinical faculty, and adjunct faculty. These meetings have action-related discussions in which current pressing candidate issues are documented; these records were collected and submitted as evidence for the EPP’s recent Self Study Report. Specific partnerships of note are the Mason Elementary Professional Development School (PDS) Network and the Adapted and Blind/Visual Impairment Consortium Boards.

Advanced Programs

Similar to the initial program approach noted above, VEAC provided each EPP with individual and benchmarking results on employer satisfaction.
As part of the EPP’s QAS, each advanced program reviewed these results, as well as its disaggregated data, to evaluate qualitative comments from the employers. Based on the data collected, EDLE graduates overall scored better than other participating Virginia EDLE graduates on dispositional skills such as "Collaboratively works with parents and school personnel to ensure that students with disabilities are included," "Commitment to continuous professional learning," and "Intentionally and purposefully models professional, moral and ethical standards." Despite low response numbers in AY22-23, employers indicated positive satisfaction with the EDLE completers. As more employers responded in AY23-24, more representative conclusions could be drawn. In response to CAEP RA4.1, results indicated that EDLE employers were satisfied with completer preparation. For the question, "Based on your experience with this completer, what best describes the extent to which they were ready to effectively work with diverse P-12 students and their families?" on a scale of 1-5, with 5 being "Fully ready," 87% of the employer responses across the five advanced programs scored GMU EPP completers "mostly ready" to "fully ready." After reviewing the scores and comments, programs made the following observations: The Literacy program noted that "employers rated 50% fully and 50% mostly ready. Faculty noted that the years of prior teaching/working experience of these completers may have impacted the ratings of full/mostly. Faculty noted that a completer’s ability to have (and display) confidence can impact scores." The MEL program observed that, of the 10 employers, 9 noted that the completers were "fully ready" (one completer received a "Not ready"). One employer praised a completer whose "deep understanding of mathematical development is impressive." Within the ratings, the use of resource materials and collaboration each received four scores of "proficient" while overall scores were "exemplary."
Stakeholder Participation

The advanced programs have multiple EPP- and program-level formal and informal mechanisms for collecting employer satisfaction. The stakeholder activities outlined in CAEP RA2 represent another method of collecting employer satisfaction information.

EDLE Advisory - The EDLE program has had an active advisory council for many years. The council focuses on two-way conversations about actions that matter to the stakeholders: recruitment, retention, issues of equity/burnout/a healthy candidate pool, and how to address the critical shortages. Participants include superintendents, assistant superintendents, directors of leadership development, and central division leaders. The council meets twice a year.

MEL Advisory - The MEL program has had an active advisory council of school district math supervisors and alumni that meets twice a year. This group has discussed ways to bring other stakeholders into MEL discussions (e.g., "had 18 different businesses at our math night. The ask was how you use math in your job"), incorporating technology (e.g., "focusing on how we can get teachers to see technology as an interactive support for learning"), needed skills and opportunities, and current challenges (e.g., "intervention, intervention, intervention!").

Literacy Executive Advisory Board - This group is made up of Curriculum and Instruction, Teaching and Learning, and Literacy leaders throughout the region. The purpose of this board is to provide guidance and feedback to the Literacy program with regard to its ability to meet the literacy demands of the region through its course offerings, proposed new degree programs, and engagement with local schools.

Information on School Counseling employer satisfaction was collected through the Counseling Advisory Board.
The Counseling Advisory Board provides feedback and insight into the Counseling program on an annual basis, which informs the program’s annual reports [CAEP Evidence: A2.2 Counseling Annual Reports and Board Survey]. The board’s responses are then reviewed by the program faculty to determine how recommendations may be implemented given program capacity and needs.

The School Psychology program does not currently have a formal advisory council, but it maintains ongoing communication with a variety of partners to exchange information about the program and discuss ways the program can improve its curriculum and training to produce graduates who are highly skilled and relevant. This is a mutually beneficial process for school systems, graduate training, and practice in Virginia. The faculty of the program meet at least annually with lead psychologists in neighboring school districts (in Virginia, DC, and Maryland) to discuss the program and the needs of the local school districts. The faculty also meet regularly with alumni and current students. University-based supervisors (all core program faculty) meet with school-based practicum supervisors once a year and with internship supervisors twice a year to discuss candidates’ progress. All three full-time school psychology faculty members are on the board of the Virginia Academy of School Psychologists, and they interact and exchange ideas with state leaders in school psychology at the annual convention and at quarterly board meetings. Supporting evidence for these program partnerships has been submitted [Evidence A2.1].
Measure 3. Candidate competency at completion
The following candidates successfully completed all of the state licensure requirements and were recommended for licensure: 383

Number of completers in programs leading to initial teacher certification or licensure: 297
Number of completers in advanced programs or programs leading to a degree, endorsement, or some other credential that prepares the holder to serve in P-12 schools (not including the completers counted above): 86
Measure 4. Ability of Completers to be Hired in Education Positions for Which They Have Been Prepared
As noted last year, "Turning the Tide: Addressing the Educator Shortage in Virginia" is a strategic plan that includes background on the educator shortage and provides recommendations for moving forward over the next three years. It is hoped that this plan will be helpful to Virginia’s school divisions, institutions of higher education, VDOE employees, and other stakeholders committed to securing a diverse, highly qualified educator workforce, and that it will contribute to fulfilling our shared vision of maximizing the potential of all learners. As of April 2023, significant milestones related to these grants include the following:
Initial and Advanced Programs: The most recent results from Mason’s Career Outcomes Survey cover Summer 2022, Fall 2022, and Spring 2023 graduates. The information was identified via participation in the 2022-23 Career Plans Survey, LinkedIn, and the National Student Clearinghouse. Information is presented by degree level, which may incorporate multiple licensure programs:
Section 6.1 Response:
Over AY23-24, the EPP, as well as the larger College of Education and Human Development (CEHD), initiated college-wide discussions on the structure of the college. Currently the college has divisions rather than departments, and the EPP’s initial licensure and advanced programs are spread across five of the seven divisions of the School of Education in the CEHD. Over AY23-24, an initial group was formed to examine the structure of similar R1 state institutions and the purpose of those structures. The findings of this group were then shared with the larger community, at which point the college leadership began to identify potential structures for the future of CEHD. The establishment of these structures and requirements by internal stakeholders was critical to creating clear parameters and expectations for all current discussions about the CEHD structure, and it allowed CEHD faculty and staff to focus their concerns or thoughts within these parameters. This clear dissemination of data, shared and understood by all, provided clarity and transparency for each step moving forward and reduced potential uncertainty and misinformation in the process. I look forward to reporting on AY24-25!
Average GPA for Program Completers
Traditional Program | Group | Degree Level | GPA |
---|---|---|---|
George Mason University (017) | All program completers, 2017-18 | Bachelor’s | 3.52 |
George Mason University (017) | All program completers, 2017-18 | Master’s | 3.96 |
George Mason University (017) | All program completers, 2016-17 | Bachelor’s | 3.48 |
George Mason University (017) | All program completers, 2016-17 | Master’s | 3.97 |
George Mason University (017) | All program completers, 2015-16 | Bachelor’s | 3.44 |
George Mason University (017) | All program completers, 2015-16 | Master’s | 3.97 |
Institutional Licensure Pass Rates
Traditional Program | Group | Number taking tests | Number passing tests | Pass rate (%) |
---|---|---|---|---|
George Mason University (017) | All program completers, 2017-18 | 334 | 334 | 100 |
George Mason University (017) | All program completers, 2016-17 | 337 | 337 | 100 |
George Mason University (017) | All program completers, 2015-16 | 375 | 375 | 100 |