Narrative Analysis Tool
State | Monitoring and evaluation of the quality and improvement of adult education activities |
---|---|
Alabama | The Alabama Community College System (ACCS) is responsible for ensuring that each grant recipient of AEFLA funds is monitored and evaluated on a continuing basis. Regional directors work directly with local providers, conducting quarterly virtual desktop monitoring along with a systematic comprehensive review of each provider within the grant cycle.
The quarterly monitoring includes a review of each program's AAESAP performance as detailed in its custom dashboard. The dashboard contains NRS data on the program's enrollment by educational functioning level, measurable skill gains, post-test rate, career pathway achievement, high school equivalency attainment, and certificate completion. Desktop monitoring, along with routine data collection and reports received from providers, provides insight into how local programs are performing against expected results. Programs not making continuous improvement receive technical assistance related to the area of need from their respective regional director and, as needed, system staff.
On-site compliance monitoring during the state's three-year grant cycle is carried out by a team consisting of regional directors and, as needed, system-level subject matter experts. The monitoring schedule allows for an evaluation of at least eight programs per year. The order in which programs are monitored is determined by a risk analysis based on indicators that reflect program performance, fiscal responsibility, and data reporting (an illustrative sketch follows this entry). Programs are placed in quartiles, with the fourth quartile being the "goal quartile." Programs in the first quartile may have the greatest risk of not meeting performance measures and are monitored earliest in the grant period, with follow-up targeted monitoring if needed. Programs are assessed using an ACCS-approved monitoring instrument, which is based on program performance and management and follows five modules aligned with WIOA standards:
|
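The quartile placement described above can be illustrated with a short sketch. This is a hypothetical illustration only: the indicator names, weights, and scoring below are invented for demonstration, since the report does not publish the ACCS instrument's actual calculation.

```python
# Illustrative sketch of a quartile-based risk analysis like the one Alabama
# describes. Indicator names and scores are hypothetical.

def risk_score(program: dict) -> float:
    """Combine hypothetical indicator values into one risk score
    (higher score = higher risk of not meeting performance measures)."""
    return (
        program["performance_risk"]       # e.g., distance from MSG targets
        + program["fiscal_risk"]          # e.g., audit or spend-down concerns
        + program["data_reporting_risk"]  # e.g., data quality errors
    )

def assign_quartiles(programs: list[dict]) -> list[tuple[str, int]]:
    """Rank programs by risk score and split them into four quartiles.
    Quartile 1 holds the highest-risk programs (monitored earliest);
    quartile 4 is the 'goal quartile'."""
    ranked = sorted(programs, key=risk_score, reverse=True)
    size = -(-len(ranked) // 4)  # ceiling division: programs per quartile
    return [(p["name"], i // size + 1) for i, p in enumerate(ranked)]

programs = [
    {"name": "Program A", "performance_risk": 3, "fiscal_risk": 1, "data_reporting_risk": 2},
    {"name": "Program B", "performance_risk": 1, "fiscal_risk": 0, "data_reporting_risk": 1},
    {"name": "Program C", "performance_risk": 2, "fiscal_risk": 2, "data_reporting_risk": 3},
    {"name": "Program D", "performance_risk": 0, "fiscal_risk": 1, "data_reporting_risk": 0},
]
for name, quartile in assign_quartiles(programs):
    print(f"{name}: quartile {quartile}")
```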
Alaska | The AAE Office traveled to Anchorage, Fairbanks, Nome, and Mat-Su to conduct on-site monitoring of four programs. The AAE Office used on-site monitoring, financial monitoring, and teacher observation tools to evaluate each program. Local programs were provided with technical assistance and recommendations and, if necessary, were placed on a Corrective Action Plan (CAP). The AAE Office also highlighted best practices during on-site monitoring that can be utilized during technical assistance and training of other providers.
The following best practices were noted during on-site monitoring:
|
American Samoa | The evaluation of the program is based on measurable objectives and indicators as they appear in the Adult Education Plan within the American Samoa State Combined Plan. The number of classes held, the annual count of participants, program reports, and the number of GED/HiSET participants passing the test for their high school diplomas are used as data for monitoring the program's progress. The intended use of CASAS data and TOPSpro in the future will continue to improve this area. |
Arizona | Comprehensive Monitoring: This was the third year of using the monitoring tool developed and aligned to the current grant contract. Revisions were made based on state staff feedback, and changes were communicated to local providers. Local providers knew what to expect, such as the kind of evidence that would be collected, and most program administrators used the monitoring tool for their internal audits. The decision was made to keep the tool in place for the fourth and final year of the current grant contract; the tool will be updated to align with the PY 2024-2028 grant contract and requirements. The process necessary for continuing with desk monitoring during the COVID-19 pandemic allowed state staff to become proficient in gathering most needed evidence in a digital format. To protect students' personally identifiable information (PII) and to comply with the Arizona Department of Education's data policies, local providers granted access to digital files, such as intake forms and attendance records, rather than emailing files. All adult education providers were desk monitored between July 1 and December 10, 2022, to identify testing administration errors for students assessed in the previous 12-month period. Findings from this monitoring led to the establishment of an error rate below 10% as the threshold for compliance with the state assessment policy. Seventeen of 21 providers had error rates above 10% in the fall and received follow-up desk monitoring six to eight weeks later, as well as a spring monitoring conducted between January 1 and May 31, 2023. After the initial fall monitoring, technical assistance was provided to each program director and assessment coordinator on the TABE assessments and their administration under the AZ Assessment Policy. By the conclusion of the spring assessment monitoring, an additional seven programs had brought their error rates below the 10% threshold. By June 1, the average program testing error rate was 12.7%. This was used to establish an error-rate goal of 7.3% for PY23-24 and a compliance goal of 5% for PY24-25. In all, over 6,000 students and their 20,000+ TABE administrations were evaluated. The resulting technical assistance (TA) to the field led to a drastic decrease (7.0 percentage points) in students receiving tests administered in error and an increase in educational functioning level (EFL) gains due to proper test administration. (A worked sketch of the error-rate calculation follows this entry.)

Local Provider Final Narrative Reporting: Local providers were required to submit narrative reports by July 31 providing analysis of primary indicators of performance data; progress testing and progress test success rates (as a means of better understanding MSG outcomes); implementation of IET programming; professional learning activities; integration of digital literacy; and collaboration with workforce partners, particularly in the area of the MOU/IFA. For primary indicators of performance, programs specifically addressed 1) areas of strength, 2) any challenges faced and how they were resolved, and 3) areas targeted for improvement. PY 2022-2023 was the third year of including these details in the final report template, and state staff have found that the reports provide more information than previous versions did. Additionally, state staff provided comments and feedback to programs on the submitted reports. This process was positively received by program administrators.
Data and Data Management System: The Arizona Adult Education Data Management System (AAEDMS) was monitored daily by state staff, who collaborated with the vendor to address issues as they arose and ensure accurate data was readily available. Statistical program data was analyzed monthly, so state staff had a timely understanding of program performance and could address low-performance issues or anomalies. Program liaisons reviewed the previous month's data with local providers during monthly virtual meetings for targeted technical assistance and broader discussions on program performance. |
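The error-rate compliance check described in Arizona's entry reduces to simple arithmetic: a provider's testing error rate is the share of its test administrations flagged as out of compliance. The sketch below is illustrative only; the function name and example counts are hypothetical and do not reproduce Arizona's actual desk-monitoring procedure.

```python
# Minimal sketch of the testing-error-rate check described above.
# Example counts are hypothetical.

def error_rate(errored_administrations: int, total_administrations: int) -> float:
    """Percent of test administrations found out of compliance with the
    state assessment policy."""
    return 100.0 * errored_administrations / total_administrations

COMPLIANCE_THRESHOLD = 10.0  # percent, per the PY 2022-23 policy described above

# Hypothetical provider: 1,200 TABE administrations, 150 flagged in error.
rate = error_rate(150, 1200)
print(f"Error rate: {rate:.1f}%")            # -> Error rate: 12.5%
print("In compliance" if rate < COMPLIANCE_THRESHOLD
      else "Follow-up desk monitoring required")
```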
Arkansas | Mandatory Administrators' Meetings are held semiannually, during which policies and procedures are discussed, information is disseminated, and providers can gain additional professional development from state staff, guest speakers, and each other. ADWS/AES monitors local programs through submitted quarterly reports, annual one-day site visits, and intensive three-day program reviews performed on each program every four years or as determined by its level of risk. One hundred percent (37/37) of providers received an annual site visit in 2022-23. Following completion of all visits, the program review and site visit instruments were updated in order to emphasize topics such as collaboration with one-stop partners, student credential completions, and the integration of career pathways, workplace literacy, digital literacy, etc. across the curriculum. ADWS/AES also evaluates each program annually through an Effective and Efficient (E&E) calculation, currently based upon the programs' progress on the federally negotiated benchmarks in NRS Table 4. Programs not meeting E&E are provided with intense technical assistance to develop and implement a program improvement plan, with follow-up throughout the remainder of the program year. A new E&E calculation using all performance measures on NRS Tables 4 and 5 was developed during 2022-23 and shared with local program directors during the Spring Administrators' Meeting for 2023-24 implementation. |
California | The AEO at the CDE provides monthly desktop monitoring and uses a risk-based analysis to select sub-grantees for a more formal comprehensive review process. Criteria used to determine which agencies are reviewed include new administration, overall funding amounts, chronic late deliverables, and similar issues. The Federal Program Monitoring (FPM) Office at the COE coordinated and scheduled FPM reviews for all programs required to monitor federal funds at the COE. Agencies selected for review attend several training workshops where they receive detailed instruction on the monitoring process, the Adult Education Instrument used to guide federal reviews, and all evidence requests agencies are expected to upload. Moreover, to ensure the AEO conducts fair, thorough, and consistent reviews of all agencies, reviewers meet yearly for a formal discussion of performance and several times throughout the year informally to debrief all reviews conducted throughout the state. |
Colorado | AEI conducted an annual risk assessment of grantees but did not conduct onsite monitoring based on the results due to continued COVID-related restrictions. AEI conducted quarterly monitoring calls with all grantees to ensure alignment with AEFLA requirements and to identify best practices and innovative activities in programming. The information collected was provided to grantees in follow-up reports and during Office Hours. At the request of grantees, in-person monitoring occurred during these quarterly calls. Additionally, grantees received quarterly data performance reports displaying their key NRS data with trends, prior-year comparisons, performance target reminders, and key recommendations. In the 22-23 program year, AEI continued to monitor grantee data monthly to identify any concerns about enrollment, post-testing, and measurable skill gains (MSG). AEI utilized a custom virtual grantee dashboard in the statewide data system, LACES. This monitoring was used to provide technical assistance to grantees to support improved performance and accuracy in data reporting. AEI also increased awareness around the importance of data by highlighting specific data topics in each bi-monthly Office Hours webinar and during quarterly Data Talks webinars. AEI oversaw the implementation of five IET programs in the 22-23 program year. The team worked with grantees to ensure compliance at every level: industry selection, development of shared objectives, and implementation of co-enrollment. The IET toolkit AEI used in 22-23 focuses on alignment between the IET program and the CCRS. AEI also defined and documented its IET toolkit review and approval processes to aid grantees in progressing from IET design to IET implementation. |
Connecticut | The CSDE regularly and consistently provides updates to the field and monitors and evaluates the quality and performance of its providers by several means. Adult Education providers are expected to be in compliance with all State and Federal regulations.
Policy Forums/Updates
In order to effectively inform programs, the Adult Education Unit holds Policy Forums in September, January, and May each year. The purpose is to inform Adult Education Directors and other key personnel of policy or procedural changes and provide updates regarding State and Federal regulations and requirements. At the conclusion of each Policy Forum, a copy of the presentation is emailed to all local directors along with any accompanying documents or forms. Additionally, Operations Memorandums (OpMemos) are written and widely distributed as necessary to inform the field of policy changes in real time. This year, we requested additional clarification from our technical assistant at OCTAE on the counting of family literacy attendance hours and the eligibility requirements of IELCE, and these responses were shared directly with the field. In addition to the Policy Forum and OpMemos, the CSDE provides timely updates at grantee meetings and meetings specific to sharing best practices.
Program Quality and Compliance Review (PQCR)
The Adult Education Program Quality and Compliance Review (PQCR) is a comprehensive on-site monitoring process. The review is an opportunity to focus on program quality and improvement, as well as to ensure compliance with state and federal requirements. The criteria for selecting districts for review involve a process that combines the analysis of adult education data via the Connecticut Adult Reporting System (LACES) with the Connecticut State Department of Education's (CSDE) focus on program quality and performance. The selection process also accounts for the size of the program and the date of the last official site visit by the CSDE. At the end of each PQCR, a detailed written report highlighting commendations, recommendations, and compliance requirements is sent to the provider's district superintendent. Each selected district has 30 days to respond in writing with its corrective action plan for all recommendations and compliance issues. A follow-up interview or on-site visit is conducted by the local CSDE technical assistant one year later, at which time an additional report is written and submitted identifying areas of improvement and continued recommendations.
Mid-Year and End-of-Year Grant Reports
CSDE consultants regularly review all grants for their respective technical assistance regions and provide guidance to the directors. All grantees submit both mid-year and end-of-year reports, which are reviewed by the consultants. This year, the CSDE spent a considerable amount of time revising and updating the PEP Mid-Year Report and End-of-Year Report to better align with federal and state outcomes and performance expectations and to provide opportunities for programs to reflect on and highlight areas of growth and innovation. The forms now include sections, specific to the grant requirements listed in the RFP, that each grantee must respond to regardless of the priority area for which they applied for funding. See below.
|
Delaware | Program performance data was reviewed on a quarterly basis at the state and local levels. In addition, programs submitted their LACES Delaware Outcomes Report Cards on Schoology prior to the monthly ABE Administrator meeting, ensuring that programs reviewed their performance at least monthly. Combining these statistics with the self-reported "Ah-ha" moments expressed by program providers at monthly meetings resulted in the sharing of promising practices focused on increased student progress. Quarterly data chats rounded out the statewide monitoring process. The chats, held prior to the state data review date, served as discussion points for better data management. Information from the chats was stored in the Schoology MIS group for permanent access by all administrators and data staff at the state and local levels. In PY22, four community programs received monitoring visits by peer reviewers and state-level coordinators. The state team hosted TA in advance of each visit to discuss review requirements, schedules, student and staff interviews, and classroom observations with programs. Process and document reviews were conducted prior to the in-person meeting via the monitored program's upload of documents into Schoology a minimum of 10 days prior to the monitoring visit. The monitoring team reviewed program documents prior to the monitoring date and asked questions during the in-person visit. The monitoring criteria focused on program administration, standards-based instruction, evidence-based reading instruction (STAR) implementation, and data and fiscal management, comprising approximately 80 indicators for measurement. On the monitoring visit day, programs offered in-person and virtual meetings with students and staff to accommodate diverse schedules. These visits reconfirmed that strong relationships between instructional staff and students fostered a greater understanding of student barriers, resulting in increased student persistence, increased goal achievement, and increased resolution of obstacles to student progress. |
District of Columbia | OSSE AFE monitors sub-grantees to evaluate local program performance via quarterly monitoring reviews, monthly and/or quarterly check-in meetings, desk reviews, and final annual monitoring. Additionally, the AFE team conducts classroom observations, folder samplings, and fiscal monitoring verifications. Local program providers are required to submit quarterly statistical and narrative reports with evidence that includes student roster reports, the National Reporting System (NRS) fundable Student Roster Report, NRS tables, the CASAS Current Year Pre- and Post-test Assessment Report, student core goal attainment reports, and other related documents. Local programs are also required to participate in an annual final monitoring review and to develop and implement a continuous improvement plan, as applicable. The OSSE AFE Quarterly Reports, Continuous Improvement Plans, Final Monitoring Tool, classroom observation tool, and student surveys, as applicable, continue to be used to assess the effectiveness of local programs and the improvement of adult education activities, as described in section 223(1)(d). The state also uses performance data from local program providers via the monitoring process to address the specific professional development, technical assistance, and/or resource allocation needs of local program providers and to work with them to develop and implement plans for continuous improvement. |
Florida | Florida's monitoring and compliance system has effectively identified areas of compliance and non-compliance with federal law, state regulations, and grant guidance. The state's comprehensive Quality Assurance and Compliance System has served as the foundation for ongoing improvement, fostering accountability, collaboration, targeted technical assistance, and systemic enhancements. In 2022-23, Florida utilized a risk-based approach to evaluate WIOA Adult Education and Family Literacy Act (AEFLA) grant recipients, employing risk assessments to tailor monitoring strategies, including resolution action plans and grant reviews. Various assessment methods, such as site visits and data quality checks, were incorporated into the evaluation process, and best practices were disseminated through workshops and conferences. The outcomes of these evaluations informed state-imposed program improvement action plans to ensure the effective monitoring and enhancement of adult education activities, aligning with WIOA's requirement for monitoring and evaluation of quality and improvement in adult education. FDOE added two new positions to the state team in 2022-23: a Program Director of IET, and an Adult Education Program Performance Analyst created by restructuring the former FDOE educational consultant role. The Program Director of IET for Adult Education at the FDOE assists in meeting federal AEFLA reporting requirements by overseeing the quality and compliance of IET programs, offering technical assistance, and researching guidelines. The Director's role contributes to identifying and disseminating effective practices, aligns with progress in monitoring and evaluation, and ensures that results are utilized for program improvement, all in accordance with AEFLA's mandates. The Program Performance Analyst plays a crucial role in helping the state meet federal reporting requirements under AEFLA. Together, the IET Program Director and Program Performance Analyst provide technical assistance, review program standards and reports, analyze data sets, and aid in the development of data-driven performance reports, all of which support the state in identifying and disseminating successful models and practices to fulfill AEFLA reporting requirements effectively. |
Georgia | In PY22, GOAE monitored and evaluated adult education programs through desktop monitoring, onsite monitoring, and performance reports.
|
Guam | Each month, the local program submits a Cumulative Monthly Activity Report (CMAR) to the SAO describing its progress and challenges in program activities, along with supporting documentation. Through the CMARs, the SAO can monitor the progress of the local program and provide feedback through State Monthly Responses (SMRs). The SMR feedback may consist of clarifying questions, recommendations, action items to ensure compliance, improvements to data collection, and ways to improve activities that would allow the program to meet the goals and objectives stipulated in the program agreement. The Adult Education Program conducted various activities to increase student enrollment in basic literacy, ESL, and AHS. Community activities consisted of communicating with village mayors, local agencies, and the Federated States of Micronesia General Consulate to 1) provide information to village residents interested in the Adult Education program, 2) discuss MOUs to hold classes off-campus and at designated locations, and 3) continue the partnerships with each office to increase awareness of the Adult Education programs throughout the island. The AEP partnered with local agencies and efforts such as the Bureau of Women's Affairs, the Homeless Relocation Initiative, the Residential Substance Abuse Treatment Cohort, and the FSM General Consulate, to name a few, to raise awareness of the Adult Education program and provide information to potential students. Although AEP is active throughout the community and works persistently to recruit students, one of the challenges it faces is locating individuals who took the CASAS test for the first time and have yet to return for services or enroll in the program; AEP captured 39% of those test takers. The State and local programs planned to investigate why students did not return, which could assist the AE team in addressing those barriers and potentially increasing enrollment. Faculty and student surveys continue to be pivotal components of program success. The surveys allow faculty and student voices to be heard in identifying barriers, ways to improve services, and what AEP is doing well. There was an 18% increase in enrollment this program year compared to PY 2021-2022 for students with at least 12 hours of instruction. Further, the number of students who attained a high school diploma or its equivalent (45) increased by 96% compared to 23 last program year. The significant redesign for GED students was the accessibility of the adult education counselor and the emphasis on taking advantage of tutoring services, counseling, and adult basic education courses. One hundred thirty-three students achieved at least one EFL gain this program year, compared to 118 last year, a 13% increase. Although enrollment has increased since the previous program year, there was a decrease in the total percentage of students completing a level. This decrease can be attributed to students who had not yet taken the post-test. Some students could not take the post-test due to Super Typhoon Mawar (equivalent to a Category 5 Atlantic hurricane), which hit Guam in May 2023. The island had widespread power and water outages that took months to recover from.
Program Year | Program | Number of students with at least 12 hours of instruction [NRS Table 1] | Number of students who achieved at least one educational functioning level gain [NRS Table 4] | Percentage completing a level
---|---|---|---|---
2022-2023 | ABE | 161 | 91 | 56.52%
2022-2023 | ASE | 21 | 11 | 52.38%
2022-2023 | ESL | 44 | 31 | 70.45%
2022-2023 | Total | 226 | 133 | 58.85%
|
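The level-completion percentages in Guam's table follow directly from the reported counts: students achieving at least one EFL gain divided by students with 12 or more hours of instruction. A quick sketch to reproduce them:

```python
# Reproduce the level-completion percentages from the counts reported above.
rows = {
    "ABE":   (161, 91),
    "ASE":   (21, 11),
    "ESL":   (44, 31),
    "Total": (226, 133),
}
for program, (enrolled, gained) in rows.items():
    print(f"{program}: {100 * gained / enrolled:.2f}%")
# Output: ABE: 56.52%  ASE: 52.38%  ESL: 70.45%  Total: 58.85%
```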
Hawaii | The State conducted a comprehensive audit of the Hawaii AEFLA program, including the State office and the local program. The final report was completed in August 2023 and found five areas needing improvement. One area is the documentation and sharing of program policies and procedures. Guidelines for fiscal operations and for eligibility and intake were developed and shared, and more guidelines will be developed to address all areas of the program. The State procured a program evaluation of the local program's implementation of the college and career readiness standards for adults. The report will be completed in December 2023, and the results will be used to inform actions for improvement. The State issued a request for proposals for marketing research services. Desired outcomes include the development of an exit survey of program participants and an annual report to determine whether required AEFLA services were provided to participants. |
Idaho | In PY22, Idaho adult education implemented monitoring and evaluation measures. These measures are discussed below.
The state implemented the new State of Idaho Office of Adult Education Local Program Monitoring Guide, which includes comprehensive procedures and tools for state monitoring of local programs. The guide states that virtual and onsite monitoring visits will assess local providers in the areas of:
|
Illinois | Adult education leadership and program support staff actively monitored adult education program quality through ongoing communication, site visits, and desktop reviews. The adult education staff participated in ongoing staff meetings and staff retreats to jointly review program data and discuss program needs. This process created a collaborative environment where promising and innovative practices were identified and then disseminated through the adult education system. Active monitoring of the PY22 leadership activities and the efficacy of both the technical assistance and ongoing professional development also occurred throughout the year. One hundred percent of adult education programs had a program support visit to ensure compliance with adult education program expectations, foster positive relationships between programs and the ICCB, and identify areas of support needed to ensure high-quality programming that leads to student outcomes.
Formal, on-site programmatic monitoring occurs to directly review compliance with all applicable governing laws and grant deliverables as outlined in the AEFLA Notice of Funding Opportunity/Grant Application and the Uniform Grant Agreement. During the monitoring process, information was requested and analyzed to determine compliance on specific reviewed items. A formal risk assessment using a quantitative system for rating and ranking grantees and their ICCB-funded programs was used to identify programmatic and fiscal risk. Each grantee was allocated points based on the criteria below (not an all-inclusive list) and was assigned a risk level of elevated, moderate, or low based on the total number of points allocated relative to other grantees (an illustrative sketch of this tiering follows this entry). The criteria used in the risk assessment are evaluated and updated annually and included the following:
|
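Illinois's point-based risk assessment, as described above, ranks grantees by total points and assigns each a level of elevated, moderate, or low relative to the others. The sketch below is purely illustrative: the point values, cut lines, and equal-thirds split are assumptions, since the report does not enumerate the ICCB's actual criteria or thresholds.

```python
# Illustrative sketch of a relative, point-based risk ranking.
# Point totals and the equal-thirds split are hypothetical.

def assign_risk_levels(points_by_grantee: dict[str, int]) -> dict[str, str]:
    """Rank grantees by total risk points relative to one another and
    label the top third 'elevated', the middle third 'moderate', and
    the remainder 'low'."""
    ranked = sorted(points_by_grantee, key=points_by_grantee.get, reverse=True)
    third = -(-len(ranked) // 3)  # ceiling division: grantees per tier
    levels = {}
    for i, grantee in enumerate(ranked):
        levels[grantee] = ("elevated" if i < third
                           else "moderate" if i < 2 * third
                           else "low")
    return levels

print(assign_risk_levels({"Grantee A": 42, "Grantee B": 17, "Grantee C": 8}))
# {'Grantee A': 'elevated', 'Grantee B': 'moderate', 'Grantee C': 'low'}
```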
Indiana | To monitor and evaluate the quality of adult education activities, program management, fiscal management, data management, and performance measures are continuously assessed. Formal and informal monitoring, desk audits, data checks, and program visits were conducted by state office staff and the InTERS data team. Low-performing programs were identified, in part, based on NRS Table 4 results. Visits (in person and virtual) were made to struggling programs. At the same time, the Indiana Mentoring Project was established through collaboration with the state's professional organization. A retired director of adult education, who had been employed by urban and rural programs, mentored new and existing program administrators and offered support to local staff. The program was immensely popular and provided targeted professional development at a provider level. The mentoring project coordinator analyzed enrollment trends and marketing strategies and offered guidance to assist low-performing programs. In addition to in-person visits, he provided tips on increasing enrollment, program administration and budgeting, professional development opportunities, and promoting student success. The project was a springboard to share promising practices gleaned from monthly visits with providers across the state. Local program personnel commented that the project provided a "safe place" to bounce ideas, discuss successes and challenges, and receive valuable feedback from a former colleague. Likewise, a comprehensive risk assessment was conducted as a prelude to monitoring while local programs developed professional development plans, targeted measurable skill gains to increase academic gains, and developed strategies to increase enrollments and reduce student separations. "Report cards" were provided monthly to local programs outlining key metrics, and analysis was provided during regional meetings. Report cards presented comparisons to state and local data based on points in time and progress toward meeting goals. Quarterly reports submitted by professional development facilitators (PDFs) were utilized to identify promising practices, technical assistance, and gaps in service. Promising practices were highlighted monthly during statewide webinars, where local program personnel were placed on the agenda to share innovations. |
Iowa | State staff assess providers' implementation of the Iowa Program Standards Framework with on-site and virtual monitoring. The risk-based monitoring process allows staff to gauge compliance with WIOA provisions, identify areas of improvement, determine technical assistance needs, and note innovative or promising practices. In PY 2022-23, state staff employed an updated risk analysis tool that incorporated new data elements related to performance, fiscal management, services, assessment, data quality, and Integrated Education and Training (IET). The results help the Department identify which local programs are at high, moderate, or low risk of noncompliance and administer strategies appropriate for each tier, including virtual monitoring and the development of improvement plans (high risk), virtual monitoring (moderate risk), and consultation with state staff to select and disseminate noteworthy practices (low risk). The Department conducts an on-site visit to each program during the five-year federal grant cycle (PY21-25), regardless of risk, for a total of three every year. While virtual monitoring involves a targeted review of select standards, on-site visits address all program standards. State staff expanded the on-site schedule to three days, with providers presenting a virtual program overview on the first day, the site visit on day two, and a virtual exit meeting for the preliminary report and discussion on the third day. The team met its goal of conducting a full monitoring of three programs: Iowa Central Community College (6/26/2023), Des Moines Area Community College (5/9/2023), and Northeast Iowa Community College (5/23/2023). Nine of the 12 programs received targeted virtual monitoring based on areas of need identified by the annual risk assessment. The remaining three programs consulted with the Department to identify and share a best practice or model with the Iowa AEL community. The ultimate goal of the Department's monitoring process, regardless of strategy, is continuous program improvement. |
Kansas | KBOR maintains its commitment to monitoring at least 20 percent of local programs each program year, with a preference for onsite visits whenever possible. In PY2022, KBOR conducted five formal monitoring visits face-to-face, with 15 additional informal onsite program visits. KBOR staff also took the opportunity to attend 11 commencement ceremonies, including ceremonies held in correctional facilities. This led to a discussion at the summer 2023 Program Leaders Meeting about graduation celebration best practices, giving directors an opportunity for the first time to share ideas on this topic and gain insight from peers. During program monitoring, KBOR staff visit with program directors and other staff to discuss program activities, instruction, performance outcomes, data collection, professional development, and integration with one-stop partners. Monitoring results, including findings, recommendations, and promising practices, are shared with local program directors and with the head of the sponsoring institution. Documentation of corrections and responses to recommendations are required within 60 days. In addition, KBOR conducts an annual risk assessment for all programs, measuring actions and outcomes that may place programs at increased risk of noncompliance. Programs may be assessed as negligible risk, potential risk, moderate risk, or high risk. In PY2022, Kansas had no programs assessed as high risk. Programs identified as potential or moderate risk received technical assistance from the state; one program was placed on a formal Program Improvement Plan, which included detailed goals, frequent progress checks, and intense collaboration among the program, the sponsoring institution, and KBOR. The state conducts multiple less formal program assessments throughout the year to guide programs, provide assistance, and identify potential issues. Data accuracy, compliance, and outcomes were reviewed at least quarterly, with in-depth data reviews for 10% of programs each month. Funding and spending for each program were evaluated quarterly. Professional development activities and spending were reviewed each month, and other program activities were monitored as needed, such as weekly check-ins during Cross-TREK sessions. |
Kentucky | Pursuant to the local provider's contract with OAE for the provision of adult education services, local providers were required to operate a program in compliance with the KYAE Professional Learning Requirements for all provider roles. The Professional Learning minimum hour requirement was reviewed by OAE and, based on local provider feedback and best practices solicited from other states, was reduced from 25 hours to 18. All eligible provider adult education staff, regardless of compensation status, are subject to KYAE Professional Learning Requirements. Per the PY22 KYAE Program Manual, page 70, "Should the provider fail to meet this requirement, a Notice of Noncompliance will be placed in the program file and will be considered in any evaluation of the program's performance." At the conclusion of PY22, 19 of the 26 local providers were fully compliant with professional learning requirements with no stipulations. The remaining non-compliant providers and their specific circumstances were addressed during the first quarter of PY23. The monitoring and evaluation process is based on a comprehensive continuous improvement model that focuses on progress toward performance requirements and the identification and mitigation of risks. All US ED, OCTAE, and state-specific requirements are monitored and assessed throughout the PY, including performance indicators, finance/budget processes, and coordination across other WIOA titles in support of co-enrollment and risk management. The Program Administration, Performance, and Compliance (PAPC) Branch leads the monitoring and evaluation effort, which includes periodic site visits to identify risks and to provide technical assistance. The process involves identifying performance levels across the eligible provider network and developing a plan via the Performance Improvement Plan (PIP) process and, based on fulfillment of PIP requirements, the Technical Assistance Plan (TAP) process. PAPC has been able to identify existing and potential risks across the network, as well as best practices that can be implemented across the network to enhance compliance and performance in support of the overall state monitoring process. These best practices included organizational practices to mitigate risks regarding records management and finance processes. Risks were assessed monthly and quarterly to determine which providers required technical assistance. All monitoring and evaluation content is addressed in the KY Program Manual. Policies are updated annually to address best practices and to incorporate any modifications by US ED and/or the Commonwealth of Kentucky. A Mobile Response Team (MRT) process was developed to measure the quality of activities and enhance the monitoring and evaluation process. The PAPC Branch leads this effort, which includes representation from all branches across the state staff. The MRT conducts site visits upon request by eligible providers and when risks and/or performance gaps are identified based on the weekly performance reports and the budget/finance analysis process. The results of the monitoring and evaluation process for PY 22 were used to guide and shape updates to the Monitoring and Evaluation section of the KY Program Manual for PY 23. This is an iterative process that includes data analysis but goes beyond the numbers to include engagement with eligible providers to determine root causes.
The KY Program Manual is disseminated to all eligible providers and quarterly updates are provided throughout the PY. |
Louisiana | In 2022-2023, WRU moved monitoring visits from the spring to the fall to ensure that programs had adequate time to work on any issues or share best practices with the network during the year. Louisiana's onsite monitoring instrument, guided by a risk assessment model, comprised six vital modules: data, recruitment/retention, classroom activities, records/reports, partnerships, and finance. Aligned with USDOE/OCTAE guidelines, this instrument served as both a monitoring instrument and a training and planning tool for local providers. Based on federal requirements and performance measures, monitoring activities addressed fiscal and programmatic risks through a comprehensive assessment. The monitoring process included fiscal risk assessment criteria, such as federal award amounts and single audit findings, and programmatic risks, such as student enrollment and participant progress. An electronic grants management system, eGrants, facilitated budgeting, revisions, and reimbursement requests, with providers trained in the WRU Recipient Grant Management Handbook. Statewide compliance teams conducted onsite visits, analyzed student files and attendance records, prepared monitoring reports, and ensured resolution of non-compliance. Thirteen subrecipients were visited in 2022-2023. Seven visits consisted of a full review, with four programs having to write Corrective Action Plans (CAPs). The other site visits were considered technical assistance visits, where the monitoring instrument was used as a guide for training purposes. All subrecipient findings with CAPs were successfully resolved after conversations and correspondence with state staff to ensure compliance within those areas. |
Maine | Maine has not conducted statewide onsite program monitoring for several years; instead, we have relied on desk monitoring, the submission of standardized quarterly reports, targeted technical assistance to address any issues coming up in reports (e.g., SSN collection), evaluations of professional development courses offered, and one-on-one visits initiated by a program or the state office as needed. In PY 22, Maine was monitored by OCTAE and issued a Corrective Action Plan (CAP). Part of the CAP included the creation of an approved Risk Management tool and AEFLA Monitoring Guide. In PY 23, Maine will conduct onsite monitoring of all AEFLA/IELCE recipients. |
Maryland | Adult education program specialists conducted program evaluation and monitoring throughout the reporting period through a combination of desk review of quarterly data, midyear and final reports, and virtual site visits. In PY22, the State undertook a total of four comprehensive monitoring visits. Classroom observations were carried out both in person and virtually. Virtual access facilitated a notably greater number of classroom observations compared to on-site visits. The team systematically assessed the efficacy of online instruction, pinpointing noteworthy best practices and identifying areas in need of technical support. Following each monitoring visit, a detailed written report was furnished, encompassing observations, recommendations, and any necessary corrective actions. Additionally, five IELCE-IET programs underwent monitoring, incorporating a combination of online and in-person classroom observations, along with visits to assess IET training content. Fiscal monitoring visits and annual enrollment data verification audits are performed through the DWDAL Office of Monitoring and Compliance (OMC). In PY22, monitors from OMC conducted enrollment data verification for all local programs. State monitors have been able to conduct successful audits virtually, improving efficiency. Programs that fail to meet data quality standards are required to submit corrective action plans consistent with the federal data quality checklist and to provide professional development to staff on the importance of consistent data collection methods. MD Labor conducted fiscal monitoring for one program during PY22. |
Massachusetts | Because ACLS rebid the adult education system in PY2022-2023, ACLS staff did not conduct any onsite program reviews or site visits and instead monitored via desk reviews. The ACLS program quality review team worked on revising the evaluation protocols to prepare for the new funding cycle, which began July 1, 2023. |
Michigan | Michigan uses a multi-faceted, team approach to its monitoring and evaluation activities. Topics covered include, but are not limited to, grant activities, allowable costs, data collection, data reporting, and data quality. Michigan monitors 100% of its grantees via desk review. On a regular basis, the Fiscal Analyst runs reports that track budgetary activities in the Next Generation Grant, Application, and Cash Management System (NexSys) to ensure grantees are complying with federal and state fiscal regulations and policies. Concerns or instances of non-compliance are discussed with program staff, and follow-up action is taken to address them with providers. In addition, MAERS reports containing provider enrollment, measurable skill gains, outcomes, and other data are run on a regular basis and reviewed quarterly by the MAERS Team and Adult Education staff. Any concerns or instances of non-compliance are discussed internally, and follow-up action is taken, as necessary and appropriate, to address them with providers. The Adult Education staff also review grantee narratives, modification requests, and final narrative reports to ensure grantee compliance with federal laws, regulations, and guidance, and with state policy. Again, any concerns or instances of non-compliance are addressed with providers. Onsite monitoring and evaluation visits are intended to complement the desk reviews and provide an opportunity for state staff to deliver targeted technical assistance. LEO-WD is revising its onsite monitoring tools, which will be sent to OCTAE for review and approval, and will resume on-site monitoring during PY 2023. |
Minnesota | The state Adult Education Leadership Team monitored the quality of adult education activities through the following: ongoing data system development and training to equip local and state staff to record and monitor adult education data; review of NRS data; expenditure verification via submission of audit-certified expenditure reports; site visits to local adult education programs (in-person and virtual); annual submission of assurances by grantees; and compilation and distribution of the annual “report card,” which ranks programs on several accountability measures including Measurable Skill Gains (MSGs) and post-testing rates. In addition, accountability training was provided at the following events: support services conference, ABE summer institute, fall and spring “regional” events, statewide local administrator meetings, quarterly webinars, and other events. Additional details can be found online at: www.mnabe.org/accountability-reporting. |
Mississippi | The OAE's monitoring procedures included analysis of data and program performance through monthly data submissions and desk reviews. Follow-up onsite visits were conducted when warranted. Monitoring and evaluations are accomplished by multiple methods: desktop monitoring and on-site review visits make up the process used to evaluate success and/or areas for program improvement. The OAE utilizes a Desktop Monitoring Tool (DMT) based on the National Reporting System (NRS) Educational Functioning Levels (EFL), Measurable Skill Gains (MSG), High School Equivalency (HSE) attainment, and postsecondary education and training. Mississippi included four additional state indicators: 1) post-test rate goal, 2) Smart Start Credential attainment, 3) National Career Readiness Certificate attainment, and 4) Career Pathway enrollment. After the completion of the desk audit and quarterly submission of the DMT, the OAE contacts programs by phone, email, or an on-site visit to discuss recommendations for improvement and/or to provide technical assistance. As a follow-up, programs are required to provide written documentation addressing areas of concern. Each year, six programs are selected for on-site monitoring. Adult education programs are monitored on a three-year rotation, and for the 2022-2023 fiscal year, the OAE created the Program Quality and Compliance Review (PQCR) instrument to follow up on recommendations and technical assistance provided during prior on-site monitoring visits. The PQCR identifies six vital components in the areas of program quality: demonstrated past effectiveness; efficient data quality; relevant professional development; WIOA and State Plan coordination; transition opportunities through career pathways/IET; and fiscal management. The Assistant Director for WIOA Compliance provided training to all program directors and the OAE staff on updates to the 2022-2023 monitoring process. Programs selected for monitoring are notified via email no less than 30 days before the on-site monitoring visit. In the email, directors are provided an official letter of notice, the PQCR, the day-of-monitoring agenda, and the Cumulative Folder Checklist. Programs are expected to complete the PQCR with responses and return it to the OAE within two (2) weeks of the official notice. During the 30-day notification period, with support from adult education directors, the OAE implemented unannounced classroom visits for observation and review of permanent folders. A week before the on-site monitoring visit, OAE staff meet to discuss the program's PQCR responses and confirm specific evidence or further elaboration needed on the day of the review. Adult education programs create binders with evidence for each program quality area of the PQCR for the monitoring team to review before interviewing the program director and/or staff. The creation of binders has become a best practice for MS since it allows the team to be more intentional and efficient in the on-site review process. Within four weeks of the visit, the OAE sends a letter to the program director noting any commendations, recommendations, and/or findings, if applicable. If it is determined that a program is in noncompliance with state and federal policies related to local data management and program services, the program is placed on a Corrective Action Plan.
Programs are given 90 days (updated to 45 days for the upcoming 2023-2024 calendar year) to prepare and submit a written plan of action describing how findings will be resolved. In addition to these formal monitoring and evaluation methods, review of program data and other data analysis frequently prompts targeted technical assistance in specific performance areas, which generally includes a deeper assessment/evaluation of the area being analyzed. Programs that do not meet the annual state performance target are placed on a Program Improvement Plan (PIP) and are required to complete the DMT monthly, attend monthly or quarterly virtual meetings (based on need) to review goals, and receive intensive technical assistance. |
Missouri | DESE AEL divides local service providers into three cohorts to distribute on-site monitoring over the three-year grant cycle. Using a risk assessment, DESE AEL determines which programs receive on-site reviews and analyzes data to identify specific classrooms for observation based on performance, attendance, and testing. The state monitoring team utilizes virtual reviews before traveling to program locations. This protocol included secure file transfers, virtual monitoring of classrooms, and virtual interviews with students and instructors. Programs in the scheduled cohort not completing a virtual on-site review completed a self-assessment, which DESE AEL staff reviewed and discussed with local programs. DESE AEL worked to incorporate the department's tiered monitoring tracking system this year and will deploy this tool next year. Once the DESE review team collects all necessary information, it examines self-assessments or on-site reviews and supporting documentation. During team meetings, DESE AEL capitalizes on areas of strength within each program and offers suggestions on areas of weakness and risk. The review team identifies two types of risk through a report issued to each local program after the review. "Findings" are typically issues that involve non-compliance with grant requirements or state policy, fiscal/accounting concerns, or performance results below state standards, and they require a plan of action. "Comments" are for clarification purposes. Some programs must submit a written response and complete a corrective action plan. DESE AEL reviews teacher certification, professional development, student data entry, and financial and administrative procedures. In addition, monitoring includes interviews of adult students and adult educators for front-line feedback. |
Montana | During the 2022-2023 year, each program was monitored via desk audits and monthly reports. Regularly scheduled desk audits ensured that data was accurate and that programs satisfied grant requirements. Desk audits also helped programs maintain compliance with federal and state rules and allowed the state to find areas of deficit within individual programs. The state has implemented and published a monitoring calendar. Programs are encouraged to regularly and routinely self-monitor to ensure compliance with policies because accuracy and accountability are tied to performance funding. Monthly report submissions were mandatory for all programs. These reports had two sections: data analysis and a partner collaboration/activity log. The data analysis section required programs to report and analyze data on the gains for each educational functioning level, the number of students exited, the number of students post-tested, total attendance hours, and the number of high school equivalency completers. The data section also provided an opportunity for programs to visualize progress between consecutive years. A component was built in for quarterly reflection on data goals and to help programs stay on track to meet or exceed a projected increase in general performance measures. The partner collaboration/activity log section documented the programs' ongoing work with agency partners to support their career pathway integration and coordination of services. Programs were required to report on all monthly activities with current partners and to identify new partner meetings and activities. In addition, programs were required to report on the outcome, or anticipated outcome, with each partner. The partner collaboration/activity log also required that programs document ongoing strategies for increasing educational gains and specify what recruitment activities were completed throughout the month. The partner collaboration/activity log will become the basis for identifying models of promising practices. State staff continue to utilize the written portion of the monthly reports, which coincides with programmatic goals and continuous improvement. Narrative components include areas for improvement, areas of success, professional development, motivational strategies implemented, reaching distance learners, and more. State staff completed the most recent WIOA Title II Adult Education Request for Proposals (RFP) competition during spring 2022. All RFP documents were updated with concise language and clearly defined expectations. The recent competition will guide ongoing efforts to monitor local programs either onsite or virtually. The state has always encouraged peer monitoring as well, onsite or virtual. Programs are encouraged to collaborate, partner, and share clients to foster a statewide team approach for continuity of services. The state has revised its risk assessment processes and plans to update on-site monitoring tools to better align with federal monitoring expectations. |
Nebraska | Nebraska Adult Education’s comprehensive monitoring plan continuously evolved to ensure successful performance outcomes and compliance at the local and state levels. The State monitoring plan consisted of full on-site, targeted and desktop monitoring activities. Consistent communication with directors and program staff helped to ensure understanding and compliance during the program year, thus resulting in a streamlined focus and positive performance outcomes.
Additionally, an elaborate structure for local program self-monitoring requirements was established. Self-monitoring proved highly beneficial, enabling local programs to analyze data sets, identify errors, and make necessary corrections on a regular schedule to maximize compliance and performance.
Professional development activities were evaluated with a multi-faceted approach. Evaluation surveys, provider discussions and internal evaluation of trainings through observation and through data analysis were all used to assess the quality of professional development activities. Training needs were solicited through targeted questions on quarterly reports, self-assessments, data quality checklists, data analysis, and from direct communication. Professional development activities were selected to address these needs utilizing the best available research and expected benefit.
Assessment and Monitoring Processes:
|
Nevada | State Leadership planned and offered Nevada Adult Education directors' meetings, designed to provide opportunities to share best practices, policies, and tools to support program improvement. During the 2022-2023 program year, meetings were more frequent but shorter to accommodate their virtual format. A risk-based monitoring system and a process for placing programs under corrective action are used with local program directors and staff to drive program improvement, and the PD contractor provided targeted technical assistance based on areas in need of improvement. The work with local programs included reviewing and analyzing National Reporting System (NRS) data. Program director meetings included information on contextualization and program management. During PY22, Nevada achieved Measurable Skill Gains above 50%, making three years in a row that the state has achieved Measurable Skill Gains at this level. Five of seven programs achieved MSGs over 50%. The two programs below 50% were significantly below that mark and were placed under Warning Status for PY23; those two programs will receive intensive TA during PY23. Monitoring during the 2022-2023 program year was both virtual and on-site. In addition to monitoring, state staff visited programs to provide technical assistance. Desk monitoring continued throughout the program year, and promising practices were identified and communicated to all local programs. Work has continued on the state longitudinal data system, and a public-facing dashboard is still under development to showcase AEFLA program performance in each of the core measures. Ensuring the dashboard reflects accurate performance data has been somewhat difficult using raw de-identified data outside of the Management Information System used for federal reporting. |
New Hampshire | NH rolled out a new monitoring policy in 2021-2022 and updated it in 2022-2023. The updated policy includes a comprehensive risk assessment aligned to the same categories as the original Request for Proposals.
Programs are selected for on-site monitoring if they are in the high-risk category; additional programs are selected to ensure that all programs are monitored on a four-year cycle. Four sites had an on-site monitoring visit in this program year. During the monitoring visits, both best practices and areas needing improvement were identified and discussed. State staff met with program administrators, designated role-specific staff (i.e., Intake & Assessment Specialists, Counselors, and Data Entry Specialists), instructors, and students. Topics covered included:
|
New Jersey | The monitoring and evaluation of the quality and improvement of adult education activities, as per 223(1)(d), continued virtually and in person for PY22. Two regional coordinators work in tandem with central OAL staff to complete monthly desk audits of all Title II programs by reviewing trends in the LACES database. Areas of focus for PY22 included reasonable/allowable spending, client intake/exit, digital literacy planning and execution, and curriculum monitoring based on the annual risk assessment. All Title II providers receive an annual detailed “report card” noting their agency’s progress, in addition to a statewide summary of overall performance. Report cards were disseminated and discussed at the required Title II Directors’ meeting held by the State Director in October 2022. |
New Mexico | Monitoring ensures that funded programs comply with the federal and state requirements of the grant funding. This monitoring includes the review of fiscal and data processes and procedures, expenditures, provided services, eligible students, and other aspects of the grant agreement and federal and state regulations and requirements. Another critical purpose of program monitoring is to promote continuous program improvement. Our monitoring activities around compliance, fiscal management, data integrity and performance, and program management are carried out on an ongoing and regular basis throughout the year, as well as through formal site visits at least once every four years. NMHED-AE publishes Monitoring Guidelines, a Monitoring Form, and other items at Propelnm.org so that local programs may prepare and know what to expect from monitoring activities. Monitoring and evaluation of data integrity and performance are carried out through monthly desk review, which may prompt one-on-one technical assistance and/or correction of individual programs. Programs must submit simple monthly performance reports online through a brief survey. In addition to monthly desk reviews, data integrity and performance are also reviewed with programs on a quarterly basis in synchronous virtual meetings, both to further monitor the work and to provide TA as needed. Monthly fiscal desk review involved analyzing expenditures against program budgets and allowable cost rules and processing requests for reimbursement for state and federal grants. This review prompted one-on-one corrections or TA for individual program administrators, or teams of administrators, on a nearly daily basis. In FY 22, NMHED-AE initiated a fiscal monitoring improvement project. The result of this project was a series of changes that included: (1) the in-depth review, supported by a formal review checklist, of all local programs’ financial statements on an annual basis, (2) an overhaul of the Business Policies and Procedures to include more information about allowable costs and other specific financial management information, and (3) formalization of monthly fiscal desk review processes and record-keeping. In FY 23, this project will continue with the addition of new, streamlined reimbursement processes for local programs, targeted technical assistance in response to problems identified in monitoring activities, and improved and formalized organization of records and processes at the state level. Programs submitted their annual reports by September 1. We read these reports and used the information gleaned from them, as well as information about each program contributed by each staff member, to evaluate local programs for risk. In November of PY 22, NMHED-AE performed the annual risk assessment, which helped identify the order of in-person site visits for that year. While NMHED-AE saw significant staff turnover in PY 22, the state office was nonetheless able to perform five monitoring site visits, and plans are in place to increase this number considerably in PY 23 as we approach full staffing levels again. Each site visit culminates in a site visit report, an exit meeting, and a program enhancement plan (PEP). The PEP is collaboratively developed with the local program and provides a structured way for programs to address the key findings of the site visit. While a Program Probation Policy is in place, we did not have occasion to use it in FY 22.
We have also begun an initiative to involve our state professional learning system, Propel, in providing TA to help programs complete the items on the PEP. This initiative will be expanded in PY 23. Ongoing monitoring and evaluation of program management, including WIOA partnerships, IET, and funded work under WIOA Sections 225 and 243, are carried out through frequent, one-on-one interaction between NMHED-AE and local program staff. |
New York | As noted in the table above, AEPP, along with the RAEN and the Accountability Office, provided 421 monitoring activities across the state. All resulting narrative reports and related program data are stored in secure, password-protected program accounts. The website and all secure documents are maintained through the Office of Accountability for adult education under contract/direction from AEPP. The AEPP staff and program staff have full access to all accounts maintained on the Accountability website through password-protected access. All technical support, monitoring activity, and communication (emails, phone calls, recorded web meetings) were posted to ensure constant and supportive assistance was provided to every program throughout the state. Monitoring in Program Year 2022 included, but was not limited to, desk monitoring. The NRS team reviewed all data and created a program evaluation based on current and previous data elements, including Measurable Skill Gain, post-test rate, and follow-up outcome data. Subsequently, web meetings were scheduled with those programs presenting the most significant deficiencies in these same areas of performance. These web meetings included the AEPP regional team, the regional RAEN Director, and the NRS Director. Based on a review of standard data-checking documents, data errors, data omissions, and coding errors were identified and shared with local program staff. Itemized lists of action steps were shared at the conclusion of each web meeting; programs were then scheduled for a second review and meeting to confirm all data correction expectations. When necessary, remote tutorial sessions were scheduled with local data teams to guide data corrections. Program Year 2022 marked the return to issuing NYSED Adult Education Report Cards. The report cards are designed to rank Measurable Skill Gain (MSG) and post-test rate by quartile as compared to all other WIOA-funded programs. MSG is also weighted to reflect differences in learning trends among the populations served: more credit is given to programs that served students for whom Measurable Skill Gain is most challenging, based on NYS and federal data trends. Measurable Skill Gains include educational gain evidenced by a pre- and post-test, the attainment of the high school equivalency diploma, and industry credentials earned by students. The report cards also provide the follow-up outcomes of employment and median wage, matched both through the electronic data match conducted with NYSDOL and through manual surveys conducted by the local programs. NYSED requires all funded programs to enter data on a monthly basis. Enrollment, attendance, assessment, and follow-up data were entered by the end of the month following the month in which the service was delivered. AEPP has found this policy ensures programs monitor their student attendance, identify both upward and downward trends, and have ample time to react accordingly. NYS local funding (used as part of the MOE for New York) is calculated on a contact-hour algorithm; consequently, it has become increasingly important for programs to keep a pulse on their monthly accrual of this NYS funding. This requirement also encouraged programs to employ program improvement strategies throughout the program year. The RAEN centers and the NRS Accountability Office provided additional training related to strategic data management and opportunities to interpret data trends along with upward or downward movement. |
North Carolina | Risk Assessment
Before on-site and virtual monitoring sessions can be carried out, funded providers are first evaluated using a risk assessment instrument. All Title II funded providers were evaluated based upon the following criteria: 1) New Director (3 years or less), 2) MSG percentage, 3) Time since last monitoring, 4) Budget Expenditures.
Fiscal and Programmatic Monitoring and Reporting
During the 2022-23 program year, the North Carolina Community College System (NCCCS), Office of Adult Education conducted a combination of on-site comprehensive and desktop continuous fiscal and programmatic monitoring. To ensure compliance, all Title II funded programs were reviewed monthly via the learning management system, Moodle. Programs are responsible for submitting a monthly expenditure report and monthly or semi-annual time and effort reports. The NCCCS Compliance Team manually reviews and audits at least 25% of the required documentation submitted monthly. If a local provider has issues with specific computations required for monthly submission, a member of the NCCCS Compliance Team provides technical assistance office hours. These one-on-one meetings allow providers to ask questions specific to their program. Also, at the midpoint and endpoint of the year, a member of the NCCCS Compliance Team confirms with providers that the time and effort reports have been reconciled with their local programs’ business office.
On-Site Comprehensive Monitoring
Providers selected for monitoring received a memorandum on August 11, 2022, announcing that all comprehensive monitoring sessions would be scheduled via email. Upon receipt of the official monitoring memorandum, providers had at least 90 business days to prepare and submit the necessary documentation for their upcoming monitoring session. Providers engaged in a one-hour webinar scheduled on September 8, 2022, and were required to have at least one member of the administrative team attend the session. The webinar covered topics including the Title II Monitoring Procedures Manual, the Title II Comprehensive Monitoring Checklist, and the Moodle submission site. During the statewide monitoring webinar, members of the Office of Adult Education provided an in-depth overview of the contents of the Title II Monitoring Manual, which covers pre-monitoring, on-site monitoring activities, and post-monitoring requirements.
Number of Providers Monitored
During the program year, we monitored 11 providers on-site: six providers for the 231 AEFLA Federal award, three providers for the 225 Corrections Education award, and two providers for the 243 IEL/CE Federal award.
Post-Monitoring Activities and Meetings
Upon completion of the on-site comprehensive monitoring sessions, providers are required to engage in a one-hour post-monitoring meeting at which members of the NCCCS, Office of Adult Education discuss commendations, recommendations, observations, and findings. Providers are encouraged to extend invitations to relevant members of the leadership team, faculty, and staff in order to facilitate open communication, ensure that all parties understand the topics discussed, promote collaboration, and ensure that all perspectives are considered.
Written Report
Once the on-site comprehensive monitoring cycle is completed, the NCCCS Compliance Team writes an in-depth final report which includes documented findings, recommendations, and the exact type of evidence submitted for evaluation.
Corrective Action Plan(s)
A total of 72% of providers monitored were placed on a Corrective Action Plan (CAP) during the 2022-23 program year. Providers placed on a CAP were required to submit written responses via Moodle within 30 business days of receipt of their agency’s final monitoring report and had up to 180 business days to remedy the issues identified in that report. To support providers, programs placed on CAPs were required to engage in monthly meetings with the NCCCS Compliance Team, scheduled for up to 180 days from the initial CAP communication. These meetings ensured that providers understood the requirements of their agency’s CAP. Additionally, during the CAP meetings, members of the NCCCS Compliance Team reviewed proposed timelines for remedying issues. Furthermore, the CAP meetings provided technical assistance and analysis of program functionality. For example, the NCCCS Compliance Team dedicated extensive office hours to providing holistic technical assistance for student intake, financial management, student retention, and contextualized lesson planning. Currently, a total of 75% of the CAPs have been remedied and closed by the Office of Adult Education. To close a CAP, the local program director and the Assistant State Director must both sign the document, and an official memo is sent to the program director’s supervisor and Chief Executive Officer (president or board chair).
Security of Monitoring Information
All financial and programmatic information is stored on a secured drive. Only members of the NCCCS Office of Adult Education State Office have access to the evidence, written reports, and other pertinent correspondence. The NCCCS Compliance Team maintains a record of programs that are currently on CAPs and those that did not comply with the continuous monitoring requirements. |
North Dakota | Federal and state compliance on-site and virtual monitoring continues. Formal on-site monitoring visits occurred in 2022-2023, with a review of fiscal and programming protocols, virtual touch points to discuss data and protocol items, and a review of progress toward targets and goals. Informal visits, consisting of data review, general networking, and questions about a wide range of needs, were conducted in 2021-2022 and will be conducted again in 2023-2024; formal visits are conducted every other year. We continue to be available for Teams meetings as needed, both individually and as a group. Desk audits of data happen regularly, and a full data review occurs twice a year. Lastly, Adult Education holds an annual director meeting to discuss the previous year’s data, current issues, best practices, large-scale changes, and other items to steer Adult Education for the future. Bi-monthly meetings are held with the ALC directors to offer support, share information, and collaborate on solutions. All information collected from monitoring, evaluation, desk audits, trending questions, etc. is used (in real time) to help adjust future professional development topics and the technical assistance guidance communicated to the field. Each site also shares quarterly reports with the state office for ABE, IELCE, and state grants. |
Northern Mariana Islands | Our office participates in a college-wide program review process in which we evaluate the services we offer to our students. We look at how we gather and share data, assess student and program outcomes, review the professional credentials and training of our faculty and staff, and examine community engagement. Our office also regularly participates in professional development training on adult education, online instruction, curriculum improvement, and intake process delivery, as well as inclusivity, cultural sensitivity, and disability awareness and support. The state core partners participated in an Evaluation Peer Learning Cohort (EvPLC) training in Fall 2021. This provided our core team members with the guidance and foundation to apply what was learned to monitor and evaluate program activities in a way that brings value back into the programs. Core program members regularly meet and communicate through email, phone calls, or Zoom. We discuss the MOU we have between us and monitor the services we offer through our programs. |
Ohio | The monitoring and evaluation of the quality of education in Aspire programs continues to be performed primarily by ODHE Aspire program managers, with increased support and collaboration from the Kent State University PDN. We continue to use a risk assessment tool with 12 established criteria, such as NRS performance measures, enrollment, allocation, and audit findings. For PY 2023, Ohio will release a new risk assessment tool with additional criteria that incorporates suggestions and ideas from the State Director Training in spring 2023. For monitoring purposes, in PY 22, programs were divided as follows, with a Program Manager overseeing each of these divisions:
|
Oklahoma | Programs were monitored continuously throughout the year using the LACES data system, CareerTech’s financial system, and onsite visits. Risk assessments were completed on all programs to assist state staff in effectively monitoring potential risk factors associated with grants funded by federal pass-through funds. ODCTE evaluated each subrecipient’s risk of non-compliance with federal statutes, regulations, and the terms and conditions of the subaward. This focus ensured that grant programs adhered to the grant’s guidelines, carried out appropriate services, and had proper internal controls in place. Oklahoma AEFL staff updated monitoring documents and procedures during the 2022-2023 year; these updates mirror some of the components used for federal monitoring. The new monitoring documents and procedures were utilized throughout the year. State staff monitored seven out of thirty-two programs during the 2022-2023 year. Programs were selected for monitoring based on the risk assessment and a rotation of programs. Four programs were monitored using a desktop/virtual method; those monitored using the traditional method hosted two state staff on-site for monitoring and interviews. All programs with findings developed corrective action plans with technical assistance from state staff. |
Oregon | Throughout the 2022-23 reporting period, the State ABS Team executed a multifaceted approach to monitoring activities. These activities were not limited to standard grant monitoring but also included the use of specialized tools and templates. The team's efforts encompassed direct meetings with sub-grantees, careful creation of monitoring reports, and active involvement of grantees in the monitoring report process. The standard grant monitoring activities were comprehensive and varied. They entailed requirements such as document submission, desk monitoring, and various online monitoring activities. Each local ABS program was responsible for submitting a range of documents, including a Final Financial Status Report, Federal NRS (National Reporting System) Tables through TOPSpro Enterprise, TOPSpro Enterprise Data Integrity Reports, and other specific TOPSpro Enterprise reports as required. Communication between each local ABS program and the State ABS Team was consistent and multi-channeled, involving routine emails and Zoom meetings. This communication also included the submission of records detailing local staff professional development, invoice submissions, and comprehensive reviews of program operations. The State ABS Team was diligent in maintaining detailed documentation of all monitoring efforts and provided tailored feedback to each local ABS program. This feedback included risk assessments and identified areas for improvement, thereby ensuring that each local program was well-equipped to address any deficiencies. Moreover, the state team provided responsive training and technical assistance, guiding each local ABS program toward corrective actions when necessary. A notable aspect of the monitoring process was the creation of a Risk Assessment for each local ABS provider. This assessment was twofold, consisting of a Financial Risk Assessment component and a Program Performance Risk Assessment component. Subsequently, local ABS directors developed Program Improvement Plans (PIPs). These plans were comprehensive, incorporating aspects of program administration and innovative teaching and learning strategies, all designed to effectively meet the diverse needs of students. The PIPs were thoroughly reviewed and discussed during conferences with the State ABS Team. During these discussions, the State ABS Team offered feedback and support, helping each local ABS director refine their proposed deliverables and strategies. |
Palau | The Palau Adult Education Program served 101 adult students during this reporting year. Of the 101 adult students served, 70 took the CASAS pre-test and post-test, 37 gained improvement and entered HiSET testing, and 2 graduated from the program. |
Pennsylvania | Division advisors monitor program data for issues in real time to assist programs to improve the quality and accuracy of the data that they report and to identify both positive and negative trends. They have frequent conversations with program staff to discuss progress towards meeting contracted enrollment and program performance targets and identify areas for improvement. During these conversations, advisors also gather information about promising practices and ask those programs to present information to their peers on division webinars. In the February 2023 webinar, four local programs presented information on how they assess English language learners using multiple assessment instruments. In the March 2023 webinar, two local programs shared how they ran successful program improvement team meetings that focused on using data to guide improvement activities. In response to the pandemic, state staff developed tools and procedures to conduct remote monitoring reviews. As the pandemic waned, advisors began to use a hybrid approach that incorporates best practices from both in-person and remote reviews. During 2022-23, advisors completed nine monitoring reviews. Advisors use a risk rubric at the beginning of each year to determine which programs will receive a monitoring review. They updated the rubric in summer 2021 to remove subjective items so that the rubric is based on objective criteria and uses data for decision making. For the 2022-23 kickoff, advisors designed an interactive session in which agencies applied the new risk rubric to themselves and then examined a section of the monitoring tool to determine what evidence they would present during a monitoring review. The goal of this activity was to ensure that agencies were well prepared for monitoring. Using data for decision-making and for continuous program improvement is an ongoing focus of state leadership activities. The PDS provided technical assistance, training, and support to local programs in the collection, entry, reporting, use, and analysis of program data with the goals of ensuring accurate data and improving program services and student outcomes. To assist both the division and programs with monitoring progress and using data, the MIS Project created and annually updates an Access database template, which is linked to the web-based data collection system. Program staff can produce reports for individual teachers and classes to evaluate the impact of program improvement and professional development activities. The MIS Project produced monthly agency data check reports for program staff and division advisors. The MIS Project continued to improve the template to provide relevant and accurate data queries for division staff and programs to use. In 2022-23, they focused on reports that relied on accurately calculating educational functioning level (EFL) gains. While the Access database is not used for annual federal reporting, PDE and agencies rely on these reports for interim progress analysis, and they are now more accurate. |
Puerto Rico | The program has established a monitoring procedure implemented by the Federal Funds Division of the Department of Education. The Federal Funds Division performs desk-review monitoring of how local programs provide access to career services and of the types of services provided through the one-stop system.
|
Rhode Island | State Leadership funds were used to monitor and evaluate the quality and improvement of local programs’ adult education and literacy activities in FY 2022. RIDE launched a new accountability system with Title II grantees in PY 2022, the first year of a new five-year grant cycle. The Accountability Tool is a shared spreadsheet of compliance metrics which is updated on a quarterly basis by grantees as part of the quarterly reporting process. Metrics on the Accountability Tool are aligned with federally negotiated targets for MSGs and other federal Indicators of Performance, as well as with state targets for local enrollment and post-testing rates. IET MSGs are also broken out for those grantees who offer this type of programming. The shared spreadsheet serves as a dashboard for progress and provides a transparent framework for quarterly data audits and biannual performance conversations between grantees and state office staff. The data on the Accountability Tool also provide a basis for risk assessment. Grantees who do not meet enrollment or post-testing rate targets receive individualized technical assistance and are at risk of a reduction in funding after two consecutive years of poor performance on these measures. RIDE staff conduct quarterly virtual evaluations of local providers, which include three components: Accountability Tool data, a narrative report, and an update on programming planned for the remainder of the year. The narrative report includes questions related to promising practices and implementation challenges over the previous quarter, as well as progress made on one-stop integration activities and employer partnerships. Local providers are required to submit their local NRS Tables 4, 4b, and 5 with the narrative report. Information gathered from the narrative helps state office staff to identify promising practices at the local level, which are highlighted in monthly grantee meetings or in peer-led PD sessions. In addition to grantees’ quarterly reporting submission, RIDE held one-on-one virtual meetings with each funded eligible provider during quarters one and three to review program implementation and outcomes year to date. These conversations were an opportunity to reflect on lessons learned and to identify areas potentially in need of monitoring or technical assistance, or promising practices to be shared. Apart from the biannual check-ins, the state office has developed a system of regular touch points with local programs to identify emergent technical assistance needs, including issues related to data collection and reporting in the state MIS, LACES. In PY 2022 these included: twice-monthly meetings with local Data Managers; drop-in times for programs to ask state staff database questions; an archive of supporting documentation related to data collection and reporting; a state-wide Google Group to communicate with data system users; quarterly review of key NRS reporting tables at the local level; and quarterly audits of local program data. This system of monitoring and review supported a high level of data hygiene and helped ensure that good work at the local level was accurately reflected in the state’s federally reportable data. |
South Carolina | Monitoring and Evaluation of Adult Education Activities
The OAE used funds made available under section 223 to monitor and evaluate funded providers in the following ways: The State Director of Adult Education assigns a Compliance Monitoring Review (CMR) Team to formally monitor all school district programs and community-based organizations (CBOs) receiving federal funds and/or state aid to support approved adult learning services once every four years. The CMR process is a systematic approach designed to assess the educational opportunities and the effectiveness of adult education programs and services in school districts and CBOs. One-third of the programs are reviewed each year of the grant cycle by a team of OAE staff; the remaining programs are reviewed through an informal monitoring process that uses desktop monitoring tools and informal site visits. To be successful, the CMR effort requires continuous follow-up and support activities, including professional development and virtual and on-site technical assistance. The formal review process includes both virtual and on-site monitoring. The on-site monitoring includes an onsite record and attendance review, a facility tour, and instructional observations. The two-day virtual portion includes teacher interviews, a review of the director’s written documentation, and financial monitoring. |
South Dakota | Subrecipient Monitoring
As a required State Leadership activity, the agency conducted two (2) AEFLA subrecipient monitoring reviews during PY2022-23. These monitoring and evaluation efforts consisted of remote, virtual, and onsite components. Moreover, the programmatic and fiscal reviews included student-file review [with data validation], staff interviews, financial monitoring tools, MIS demonstrations, data-flow discussion, voucher review, staff-development conversations, and One-Stop Alignment topics. Subsequently, the agency issued formal Monitor Reports detailing a Summary, Scope, any Findings [and resultant Required Actions], Areas of Concern, Recommendations, and Noteworthy Practices.
Desk Monitoring
State staff provided continuous technical assistance through telephonic and electronic correspondence, desk monitoring, conference calls, webinars, video teleconferencing, and even an occasional site visit. Furthermore, local administrators, instructors, and data specialists took advantage of the fact that they could contact state staff with any questions regarding programmatic policies and data-quality issues with the assurance they would receive timely responses. Adult Education’s web-based Administrators’ Meetings also provided opportunities to review participation rates, performance, data quality, policy changes, and program-goal updates. The Quarterly Reports assisted state staff with monitoring new or ongoing issues while concurrently providing agencies with more meaningful documentation and evaluative processes; the consistent submission of quarterly data sets affords the local subrecipient providers and the agency easy access to longitudinal comparisons across different points of the program year.
Evaluation of Quality and Improvement
South Dakota’s Adult Education and Literacy Program continues to consider the challenges to and efficacy of juxtaposing outcomes of co-enrolled participants with those enrolled in only one WIOA Core Program. As with the Technical Assistance priorities, evaluation of quality and improvement was primarily highlighted in the context of Co-Enrollments’ outcomes via the Participants Reaching Employment Potential service-delivery model. Moreover, the agency and its providers continue to use the program’s WIOA Title II Funding, Participation Levels, and Performance Ratings [reference] as a framework for conversations about progress and improvement. This reference details figures, outcomes, and percentages for a host of primary, secondary, and tertiary metrics. South Dakota’s State Director of Adult Education continues to make AEFLA data readily available to DLR’s Data and Evaluation Specialist, Policy and Data Analyst, and Executive Team. Additionally, the Workforce Development Council Members, WIOA Core Programs, WIOA Required Partners, and the One-Stop Managers are regularly encouraged to review South Dakota’s Statewide Performance Report(s) and NRS Tables, as well as submit any other AEFLA-related data queries to the agency.
Dissemination of Proven or Promising Practices
The agency made concerted efforts to highlight Noteworthy Practices [from the subrecipient monitoring reviews] and Program Highlights from the AEFLA providers' Quarterly Reports during the monthly AEL Administrators' Meetings. Just as proven Data Quality practices are emphasized during the monthly LACES NexGen Webinar Trainings, local promising practices often are contextualized and promoted during monthly Third Thursday Trainings (T3).
Because local administrators and Data Specialists each have their own monthly venue, the T3 Series was specifically designed for local AEFLA instructors, as well as for our colleagues at Job Corps and Tribal Colleges. These monthly forums are often facilitated by the instructors themselves; topics have included Student Engagement, Artificial Intelligence in the Adult Education Classroom, Family Literacy, Distance Education Supports, and Teacher Talk. Occasionally the forum hosts guest speakers from partner agencies. |
Tennessee | In PY22, TDLWD continued to refine our monitoring and evaluation processes, including clearly defining the monitoring process with a timeline of events for completing it. In addition, we clearly defined the corrective action plan (CAP) process to ensure the local programs understand what to expect, as well as when and how CAPs are reviewed and closed. Early in the program year, we conducted a risk assessment to determine the programs most in need of monitoring. Based on the risk assessment, we conducted a formal, in-depth monitoring of four local providers. These providers were monitored on the ABE grants, IELCE grants, and the State AE General grants (state funding only). We were able to conduct more visits in person; however, adjustments were made to include virtual visits as needed. Virtual visits were challenging because they were more impersonal and made it difficult to gain a comprehensive understanding of the service delivery experience in particular service areas. As we expand our reach into correctional facilities, we will need to refine the monitoring process for carceral settings. TDLWD staff members continued to focus much of their work this year on informal/desktop monitoring of the local providers’ fiscal practices and data and performance. State staff worked regularly to review program expenditures and assist locals with issues related to allowable costs, accurate reporting, and spending rates. The data specialist worked regularly to review the statewide MIS performance data and to assist locals with troubleshooting errors in their data entry. TDLWD leaders regularly reviewed the providers whose performance was lagging, created program improvement plans, and provided technical assistance accordingly. |
Texas | For quality assessment and improvement of AEL activities under Section 223(1)(d), TWC's Sub-Recipient Monitoring (SRM) team performs annual risk assessments and audits, addressing fiscal and programmatic aspects. Data validation reviews ensure OCTAE Memo 19-1 compliance. Any findings are resolved by AEL providers in collaboration with TWC's Audit Resolution Department.
Unresolved or repeat findings result in Technical Assistance Plans (TAPs) or Corrective Action Plans (CAPs) to ensure standards are met. State-led on-site or desk reviews cover all AEL providers at least once in the grant cycle. Monitoring outcomes inform TA resources, including webinars and one-on-one assistance.
In addition, each year, TRAIN PD evaluates professional development needs to respond to changes within the field of adult education and helps improve the outcomes for the following program year.
The data sources for this year include the following:
|
Utah | Utah conducted monthly desk monitoring with programs. Program reviews provide a deep dive into program practices and yield actionable recommendations for program improvement. Monthly data reports are compiled and distributed statewide to quickly identify possible issues. Regular communication with the programs allows deeper understanding of what is happening at the local level and facilitates the sharing of best practices among providers. |
Vermont | Agency of Education staff audit student records quarterly to ensure active educational functioning levels and appropriate services that lead to graduation. Other monitoring activities included quarterly performance monitoring for each provider and evaluation of program improvement.
In the spring of 2023, the Agency of Education held the AEL grants competition and facilitated a database migration. Applications were received from the same four AEL providers and each was awarded funding to serve specific counties. The process of reviewing grant applications provided an opportunity for evaluation of programs with input from colleagues at the AOE.
The AOE conducts quarterly desk monitoring in several ways. For those students receiving reimbursable services under the state-funded High School Completion Program, each student record is audited for accuracy, updated information such as goals and achievements, valid assessments, and appropriate instructional services. Other quarterly monitoring includes a review with each AEL provider of progress toward performance targets to identify challenges and successes. Beginning in PY21, the AOE included in AEL grant agreements a list of expectations, known as “benchmarks,” for progress toward meeting established Federal and State targets. For example, at mid-year, AEL providers are expected to have reached these benchmarks:
By December 31, 2022, the subrecipient will be expected to meet the following benchmarks to demonstrate progress toward meeting the annual targets:
|
Virgin Islands | For FY22, all subgrantees were monitored and monitoring reports were issued. The monitoring visits resulted in the State launching intensive TA and training as described above. Programs were placed on a reimbursement basis to ensure that spending is compliant with the regulations. The SOCTAE continues to provide ongoing evaluation of programs based on the monitoring feedback. Additionally, all AEFLA-funded programs are required to continue to submit reports with draw requests that outline program successes and challenges, as well as recruitment and retention measures. The State has also begun to monitor each program's LACES data entry and provide feedback, or requests for information, if no movement of data entered into LACES is noted. |
Virginia | In support of the requirement in Sec. 223(1)(d), monitoring and evaluation of quality, VDOE uses state leadership and administrative funds to support monitoring and evaluation activities, which include not only evaluating the quality of and improvement in local adult education activities but also the effectiveness of efforts by the VALRC. The requirement to disseminate information about models and proven and/or promising adult education practices within the state is discussed as an integral component of the work that the VALRC delivers as PD and technical support under 223(a)(1)(b). The VALRC collects and analyzes evaluation data from participants in its PD offerings, using the information to inform future topics, modes of delivery, degrees of intensity, and clarity of delivery, and to evaluate the distribution of its efforts across the providers. The VDOE’s system for assessing the quality of providers of adult education and literacy activities is based on five major activities, which can occur both consecutively and concurrently throughout the program year: (1) a program self-assessment survey completed by each provider, (2) the distribution and review of an annual risk rubric for each program, (3) ongoing data monitoring of information entered by each program in the state Management Information System (MIS) and the Online Management of Education Grant Awards (OMEGA) fiscal system throughout the program year, (4) technical assistance calls, the content of which is based on activities 1-3 above, between the VDOE and each funded program, and (5) the identification of programs for site-visit reviews, based on information collected in activities 1-3 above. In PY2022-2023, activities 1-3 were successfully completed, starting with the distribution of the Program Self-Assessment Survey, which is designed as a means for programs to document their leadership and management processes and to assist the VALRC and state office staff in the development of program-specific technical assistance plans. Risk rubrics were created from various data related to program operations and performance and distributed to all funded programs. Technical assistance calls (activity 4) were conducted with programs throughout the year as needed but were not scheduled with each program due to the January announcement of the 2023-2025 adult education grant competition. Responses to the 2023-2025 competitive adult education grant application led the VDOE team to add a targeted monitoring visit to the types of site visits in PY2022-2023. A targeted monitoring visit, either in person or virtual, will be conducted on an as-needed basis and may occur at any time throughout the year. Depending on the issue(s), a peer reviewer may be recruited to participate. A targeted monitoring visit does not exempt the program from a full in-person site visit. One program was identified in June for a targeted monitoring visit, which will occur next program year. The VDOE team also issued terms and conditions to the 2023-2025 grant recipients. All recipients are required to document that, through the 2023-2025 grant cycle, staff members will receive training on serving individuals with disabilities and on the science of reading. Progress toward meeting these requirements will be monitored throughout the 2023-2025 grant cycle. Monitoring activities from PY2021-2022 carried over into PY2022-2023.
An on-site review on June 15 and 28-30, 2022, and a virtual review on September 15, 2022, were conducted with both Region 13, serving the counties of Brunswick, Halifax, and Mecklenburg, and Region 14, serving the counties of Amelia, Buckingham, Charlotte, Cumberland, Lunenburg, Nottoway, and Prince Edward in Southside Virginia. Southside Virginia Community College serves as the fiscal agent for regions 13 and 14; therefore, the on-site monitoring reviews occurred concurrently, as many of the administrative processes and staff members are shared between the programs. A Report of Findings was issued to each region, and the corrective action plans (CAPs) were approved. Region 10, serving the counties of Albemarle, Fluvanna, Greene, Louisa, and Nelson and the city of Charlottesville, received an on-site monitoring visit in PY2021-2022 and continued to work on its CAP throughout PY2022-2023. |
Washington | Virtual Program Review and Technical Assistance Visits were conducted by BEdA staff with 19 providers. Monitoring visits were as follows:
|
West Virginia | WVAdultEd is dedicated to delivering high-quality education to the most underserved individuals in the state. All programs undergo regular monitoring to ensure adherence to NRS standards. The implementation of LACES has strengthened accountability and significantly improved program quality by enhancing data analysis at the local, regional, and state levels.
Three correctional programs were visited; a fourth visit was postponed due to program staffing changes. Seventeen adult education programs were visited, while 10 programs received technical assistance. During these visits, data quality audits were completed to verify data accuracy and ensure the completion of all necessary documents. Monitoring pinpoints areas for improvement within local programs. A training session at the annual fall conference taught program staff and administrators how to prepare for and conduct data quality audits. LACES Online Registration was piloted with seven classes to assist with registration and the accuracy of data.
Additionally, an Annual Performance Profile Report was compiled to assess individual programs' progress in meeting Federal Core Measures and Indicators. This report not only ranks individual classes to recognize exceptional programs but also identifies programs in need of technical assistance or monitoring.
WVAdultEd uses the following criteria to monitor classes and provide technical assistance:
|
Wisconsin | Monitoring and Evaluation
The WTCS coordinated a series of activities during the reporting period to monitor and evaluate the quality and improvement of Wisconsin AEFLA providers and adult education services. Each AEFLA provider must submit student-level data through the WTCS Client Reporting System on at least a quarterly basis. The data submitted are analyzed and compiled into the WTCS AEFLA Reporting and Performance Accountability Monthly Report. This report presents outcomes data, including the number of participants served, pre-/post-test rates, Measurable Skill Gain rates, and fiscal indicators such as grant spend-down rates. The WTCS Office staff review this report each month to inform provider-specific monitoring discussions throughout the year. In addition, all AEFLA-funded providers submit tri-annual reports that are reviewed by WTCS Office staff to monitor grant performance, implementation of programming adjustments to meet goals, and grant expenditures. The WTCS also coordinates the Wisconsin AEFLA Program Review Process, which includes the annual provider Risk Assessment process and comprehensive AEFLA monitoring activities. |
Wyoming | Monitoring and evaluation of local providers takes multiple forms. First, all providers are on a two-year virtual compliance monitoring rotation. This virtual monitoring utilizes a 16-chapter compliance tool found at: https://docs.google.com/document/d/1B9xCFnm2fEO51Mejxqzi4yAqavRrkQp6/edit. Local providers are required to provide evidence for each item found on the checklist, when applicable. Despite being extremely time-consuming, virtual monitoring visits provide the SEA with extended time to review and comment on evidence submitted by the local providers as part of their monitoring tool checklist. Once the SEA’s review process is completed, a virtual meeting is held between the SEA and the local provider to review each chapter in the compliance checklist and to provide technical assistance. For FY 22/23, six programs were monitored, and all were found to be in compliance; “In Compliance” letters were subsequently sent to all six AE local providers. At the end of each year, providers submit an end-of-year report that summarizes compliance with the 13 considerations as well as other items outlined in the RFP and/or reapplication process. The SEA reviews these narrative reports, and each provider is scored on a four-point scale. These reports are also used to help complete each program’s ‘Risk Assessment.’ Providers who do not make MSG targets in a full category and/or overall are required to complete a targeted monitoring process. This year, one provider failed to meet targets in every ESL category, which has resulted in targeted monitoring for FY 23/24. The quality of a program’s data is critical to understanding what is truly happening in a program. Because of this, the State began a two-year process in FY 21/22 in which local directors learned how to conduct a data dive for program improvement purposes. This task was finalized this year with a capstone project that required each local provider to conduct a data dive on any area of their program they wanted to examine. Once completed, the data had to be used to create a marketing brochure or an informational report that could be submitted to the provider’s governing institution. Monitoring and evaluation are also done on a monthly and quarterly basis, both by the local programs and at the State level. Each month, the state reviews local programs’ monthly report submissions and utilizes the comprehensive data to complete longitudinal research comparing previous-year performance to the current year. The monthly report evaluates progress toward targets, fiscal responsibilities, and individual cost per student for career services, while providing local directors with an option to submit a short narrative on activities for the current month. The quarterly report summarizes activities for the quarter and outlines challenges and successes, requests for technical assistance, and student success stories. |