External Reporting Requirements for Teacher Licensure
2024 CAEP Accountability Measures
Measure 1. Completer Impact and Effectiveness
As planned in the previous report, as school systems began to re-emerge from COVID, the EPP tried to re-establish a reporting system with local school divisions to collect anonymized teacher evaluation data for George Mason completers. (As also noted in the previous report, VDOE instructed that "how student academic progress is met in the evaluation is the responsibility of the local school division." Essentially, all Virginia teachers are required to be evaluated against the standards, but how they are evaluated, and whether that evaluation is collected in a systematic, automated process, is decided at the local level. During AY22-23, VDOE also continued not to provide a completer employment list to each EPP.)

During AY22-23, the EPP reached out to one large school district to collect this information; at the end of AY22-23, the EPP was still working with that district to answer questions about the request. This experience repeated itself over the next academic year as the EPP expanded its teacher evaluation request to multiple school districts: some came back with questions but then denied the request because of the staff time needed to comply, one district was willing but charged an hourly fee to conduct the work, and others that graciously responded with teacher evaluation data provided it in a variety of aggregated forms (EPP level, like-program groupings, program level). More information will be provided in the EPP's Self Study and next year's Annual Reporting. Without a state-wide collection system in place, or EPP access to state-level data on student performance, the EPP continues to try to build stronger relationships with school district-level administrators to establish an efficient and effective way to provide this information to the EPP without burdening local resources.

As noted last year, the EPP also attempted to collect more concrete, in-depth qualitative information about how completers apply professional knowledge, skills, and dispositions in the P-12 classroom by conducting focus groups. Thirteen completers participated, out of 458 targeted emails sent to completers in spring 2023 inviting them to take part in pilot focus groups in May/June 2023. The EPP arranged focus groups with completers from the Elementary, Early Childhood, Physical Education, Foreign/World Languages, and ESOL programs. George Mason PhD candidates trained in qualitative methods conducted the focus groups over Zoom, and the recordings and transcripts were then analyzed for themes.

Completers felt that building relationships, open communication, and collaboration with colleagues and children were highly important skills for being an effective teacher and were emphasized within their programs. Although all completers thought that the lesson plan activities in their programs were very long and detailed, the activities built confidence in some completers and, in others, an appreciation of the intentionality of each component of the lesson plan, making them more effective teachers. Cultural responsiveness and the belief that all children can learn were two strong dispositional themes that emerged and were discussed as important parts of their program preparation. Real-life clinical experiences and applicable resources were noted as very useful and carried into their own classrooms as teachers.
Many completers noted that more systematic training in IEP/504 preparation and in handling "non-ideal" situations, including behavioral issues, would be very useful. They also indicated that knowing how to navigate school/county administration and systems would make them feel more prepared, and that learning how to co-teach and collaborate with people in different roles in the building would be appreciated. Learning how to cope with the stresses of the job was also emphasized; some completers noted that program faculty talked about this issue, but that the mental health of teachers needs to be recognized. Overall, completers felt that their programs prepared them with appropriate skills and knowledge of their content area, and that appropriate dispositions were strongly affirmed. More recent completers, as well as those whose clinical experience was impacted by COVID, expressed less confidence about their preparation than those who had been in the field for a few years. All identified issues outside their content area, such as handling "non-ideal" situations and understanding school systems/dynamics, as their biggest challenges, but they also recognized how difficult it is for programs to familiarize candidates with the systems and procedures of each county and school system.
Measure 2. Satisfaction of Employers and Stakeholder Involvement
As noted in previous Annual Reports, in the Commonwealth of Virginia there is no clear mechanism for collecting and sharing data across the state education agency, EPPs, and P-12 school divisions. Collection of accurate emails and post-graduation contact information is a continuous challenge. Additionally, the position of the VDOE staff member who provided each Virginia EPP its completer list during the summer months has still not been filled. Both of these changes greatly impacted the EPP's ability to provide the Virginia Education Assessment Collaborative (VEAC) (https://projectveac.org) with accurate completer lists and emails for completer satisfaction surveys.

Initial Licensure

As noted in the Initial Licensure Employer Survey VEAC Report, 31 EPPs participated in the 2022-2023 cycle (up from 29 the previous year). Upon closing the survey in May 2023, VEAC had collected 1,317 complete and partial employer responses (up from 1,169 responses last year). George Mason University received 79 responses, a 26% response rate (a marked improvement over the previous year's 6%), based on the total number of contacts submitted to VEAC minus the number of failed/bounced emails. Employers rated the readiness of the EPP's completers at 4.21 on average (on a 1-5 scale).

Since the EPP had recently begun graduating undergraduate as well as graduate completers, the EPP examined the differences in these scores. For the undergraduate completers, 60% were considered "Mostly Ready" and 26.67% "Fully Ready"; the mean for "Best described the extent to which completers are ready" was 4.00. For the graduate completers, the scores flipped: 24.56% were considered "Mostly Ready" and 56.14% "Fully Ready," with a mean of 4.26. Across every item of the survey, undergraduates scored lower than graduates. Both groups had their lowest means on the following items:

ID: Systematically gathers, analyzes, and uses all relevant data to measure student academic progress, guide instructional content and delivery methods, and provide timely feedback to students, caregivers, and other educators. Undergraduate mean: 2.80; graduate mean: 3.09.

IN: Engages in reflection on the impact of their teaching practice and adapts to meet the needs of each learner. Undergraduate mean: 2.93; graduate mean: 3.23.

Looking through the lens of Mason completer year for these two items, overall scores have dropped across the 2019, 2020, 2021, and 2022 cohorts.
In most cases, both groups' scores dropped in 2019, 2020, and 2021. The graduate completers improved in 2022, while the undergraduates continued to drop. Several factors could explain this. For the graduate programs, the first post-COVID group was coming out of George Mason from established programs. For the undergraduate programs, the N was much smaller and included the first completers of new undergraduate programs. Overall, graduate completers appear to be leaving better equipped and readier for the classroom. This could reflect age, maturity, and experience, or the fact that the graduate programs are more established and that, at the graduate/post-baccalaureate level, candidates have the breadth and ability to focus on pedagogical practice. Faculty who teach at both levels have commented on the scaffolding needed in the undergraduate programs. This continues to be an issue.

Twenty-seven employers provided open-ended responses. Overall these were positive, with two particularly negative responses. These reports were widely shared with faculty to review and to inform their program goals and improvement. The VEAC surveys are critical in informing our licensure programs about how their preparation is perceived.

Advanced Licensure

As noted in the Interim Advanced CAEP Report, the EPP had initially expressed concern about using surveys for advanced completers and employers, and focused instead on the qualitative responses of each Advisory Council. The programs have continued to have active advisory and stakeholder participation, but decided to supplement this work with the VEAC Advanced Survey. Over 2022-2023, the EPP returned to the VEAC Advanced Licensure Surveys for completers and employers, using a more targeted email and personalized, program-specific message approach. The following describes the methods advanced programs used to collect employer satisfaction data.

VEAC Surveys

The survey used a 4-point scale: Unacceptable (1.00), Needs Improvement (2.00), Proficient (3.00), and Exemplary (4.00).

Administration & Supervision

For the item "Uses public relations and public engagement strategies and processes for building and sustaining positive relationships with families, caregivers and community partners for the benefit of school improvement and student development," the EPP mean was 3.67, while the overall VEAC mean was 3.41. Since this data was collected, the program has earned NELP recognition. Preparation for the NELP SPA Report included re-aligning the current assessment in EDLE 612 Education Law (the Ethics Code, Case Study, and Analysis), in which candidates develop an ethical code and identify and write a case study about an ethical dilemma in their school system, using their Code of Ethics to analyze the case study. The results of this revised assessment will be closely monitored.

Counselor Education

Math Specialist

School Psychology

Reading Specialist

Stakeholder Involvement

In addition to our program Advisory Committees, as an active member of VEAC, the EPP is also collecting additional stakeholder feedback through the VEAC surveys. The quantitative results, as well as the qualitative responses, of our completers and employers serve as some of the multiple measures for the EPP's DARE, as evidence informing the EPP's mastery of Standard 4, and as evidence of stakeholder participation. At the program level, the clinical experience of each licensure program involves multiple school/site personnel and their feedback on our candidates and programs.
For the initial licensure programs, Mentor Teacher feedback is collected formally through the Internships survey, which helps collect feedback related to CAEP Standards 1 and 2. At the EPP level, the college has created high-level school partnerships that have opened excellent opportunities for discussion about the needs of today's schools and students. These senior school stakeholders provide feedback that informs CAEP Standards 1 and 2.
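For reference, the sketch below illustrates how the VEAC response-rate and overall readiness-mean figures cited above are defined. It is illustrative only and not part of the VEAC methodology: the deliverable-contact count and group sizes in the example are hypothetical placeholders chosen to be consistent with the reported 26% rate and 4.21 overall mean, not values drawn from the report.

```python
# Illustrative sketch only: how the response rate and overall readiness mean
# referenced above are computed. Placeholder inputs are hypothetical.

def response_rate(responses: int, contacts_submitted: int, bounced: int) -> float:
    """Percent of deliverable contacts (submitted minus failed/bounced) that responded."""
    return responses / (contacts_submitted - bounced) * 100

def weighted_mean(means: dict[str, float], ns: dict[str, int]) -> float:
    """N-weighted average of group means (e.g., undergraduate vs. graduate completers)."""
    total_n = sum(ns.values())
    return sum(means[g] * ns[g] for g in means) / total_n

if __name__ == "__main__":
    # 79 responses is reported; 314 contacts and 10 bounces are made-up figures
    # used only to show the calculation behind the cited ~26% rate.
    print(f"Response rate: {response_rate(79, 314, 10):.0f}%")
    # Group means 4.00 (undergraduate) and 4.26 (graduate) are reported;
    # the group sizes here are hypothetical and merely reproduce the 4.21 overall mean.
    print(f"Overall mean: {weighted_mean({'UG': 4.00, 'GR': 4.26}, {'UG': 15, 'GR': 60}):.2f}")
```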
Measure 3. Candidate Competency at Completion
The following candidates successfully completed all of the state licensure requirements and were recommended for licensure:

Number of completers in programs leading to initial teacher certification or licensure: 441

Number of completers in advanced programs or programs leading to a degree, endorsement, or some other credential that prepares the holder to serve in P-12 schools (not including those counted above): 186
Measure 4. Ability of Completers to Be Hired in Education Positions for Which They Have Been Prepared
Data for Measure 4: Ability of Completers to Be Hired is currently unavailable, as the main administrative survey office of the university discontinued its one-year-out survey as of AY2020-2021. The EPP's Office of Institutional Effectiveness and Planning (OIEP) transitioned to collecting this information through the Career Plans Survey: "Formerly known as the Career Census Survey, the Career Plans Survey is a collaborative project between OIEP, University Career Services, and the Office of Alumni Affairs to collect information about Mason graduates' (undergraduate and graduate) employment status, use of job search resources, and plans for further education." (https://oiep.gmu.edu/institutional-effectiveness/surveys/instruments/#cps) Preparation of AY2022-2023 data from the Career Plans Survey is currently underway (summer 2024). Meanwhile, the EPP is working on alternative ways to improve data collection for this measure and hopes to resume EPP-level data reporting by CAEP Annual Reporting AY23-24.
Average GPA for Program Completers
Traditional Program | Group | Degree Level | GPA |
---|---|---|---|
George Mason University (017) | All program completers, 2017-18 | Bachelor’s | 3.52 |
George Mason University (017) | All program completers, 2017-18 | Master’s | 3.96 |
George Mason University (017) | All program completers, 2016-17 | Bachelor’s | 3.48 |
George Mason University (017) | All program completers, 2016-17 | Master’s | 3.97 |
George Mason University (017) | All program completers, 2015-16 | Bachelor’s | 3.44 |
George Mason University (017) | All program completers, 2015-16 | Master’s | 3.97 |
Institutional Licensure Pass Rates
Traditional Program | Group | Number taking tests | Number passing tests | Pass rate (%) |
---|---|---|---|---|
George Mason University (017) | All program completers, 2017-18 | 334 | 334 | 100 |
George Mason University (017) | All program completers, 2016-17 | 337 | 337 | 100 |
George Mason University (017) | All program completers, 2015-16 | 375 | 375 | 100 |
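For reference, the pass rate column in the table above is simply the number passing divided by the number taking, expressed as a percentage. A minimal sketch, using only the figures in the table:

```python
# Illustrative recomputation of the pass-rate column from the table above.
rows = [
    ("All program completers, 2017-18", 334, 334),
    ("All program completers, 2016-17", 337, 337),
    ("All program completers, 2015-16", 375, 375),
]
for group, taking, passing in rows:
    print(f"{group}: {passing / taking * 100:.0f}% pass rate")
```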