Narrative Analysis Tool


The Narrative Report responses below can be further filtered by one or more states, as well as keywords.

For more information on Narrative Reports, please see the technical assistance documents.


Monitoring and evaluation of the quality and improvement of adult education activities, by state
Alabama Regional directors are primarily responsible for monitoring and evaluating the quality and improvement of adult education providers and for ensuring that practices within local programs are compliant with state and federal policies. These directors conduct quarterly desktop monitoring, which includes a review of each program's AAESAP performance as detailed in their custom dashboard. The dashboard contains NRS data regarding the program's enrollment by educational functioning level, measurable skills gains, post-test rate, career pathway achievement, High School Equivalency attainment, and certificate completion. Desktop monitoring, along with routine data collection and reports received from providers, provides insight into how the local programs are performing against expected results. Programs not making continuous improvement receive technical assistance from their respective regional director. All programs also receive at least one on-site compliance monitoring review during the state's three-year Request for Funding Proposal (RFP) cycle. On-site monitoring is conducted by the state team consisting of all three regional directors. The monitoring schedule covers a minimum of eight programs per year, which ensures that each local program participates in a full compliance monitoring review at least once every three years. The order in which programs are monitored is determined by a federally required risk analysis based on indicators that reflect program performance. Programs are placed in quartiles, with the fourth quartile being the "goal quartile" for all programs to reach. Programs in the first quartile, based on the risk analysis, may have the greatest risk of not meeting performance measures and are monitored earliest in the RFP period (the first program year (PY) of the RFP period).
Programs in the second quartile will be monitored in the second PY of the RFP period, and programs in the third and fourth quartile will be monitored in the third PY of the RFP. Programs are assessed using an ACCS-approved monitoring instrument. The monitoring instrument is based on program performance and management and follows five modules aligned with WIOA standards:
  1. Performance Accountability Standard
  2. Program Performance Standard
  3. Adult Education and Literacy Activities
  4. Fiscal Considerations Standard
  5. Supplemental One Stop Partnerships
The ACCS uses Corrective Action Plans (CAP) to support grantees in their continuous improvement efforts. A CAP is required when programs receive a score less than a "3 – Needs Improvement" (based on a five-point scale) on any part of the monitoring instrument. A score less than a "3" results in a finding. A CAP template is provided to guide local program directors/grantees in documenting strategies, improvement steps, timelines, and results. Additionally, each plan provides an opportunity for thought partnership and dialogue between ACCS and grantee. For PY 2021-22 the following providers received an on-site monitoring:

Provider Monitored | Date of Monitoring | Date CAP Closed
Lawson State Community College | 10/2021 | Ongoing
Shelton State Community College | 10/2021 | 7/7/21, Closed
Wallace State Community College – Selma | 11/2021 | 7/7/21, Closed
Central Alabama Community College | 11/2021 | 7/7/21, Closed
Jefferson State Community College | 1/2022 | Ongoing
Chattahoochee Valley Community College | 1/2022 | Ongoing
Bishop State Community College | 2/2022 | 7/7/21, Closed
Wallace State Community College – Dothan | 3/2022 | 7/7/21, Closed

Targeted monitoring occurs as a follow-up to verify the satisfactory completion of findings identified during an on-site monitoring. Once targeted monitoring occurs and the program has been found to be following state and federal guidelines, the CAP is then closed.
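Alabama's quartile-based schedule (first quartile monitored in PY 1, second in PY 2, third and fourth in PY 3) can be sketched in a few lines of Python. The scoring values, program names, and ranking logic below are hypothetical illustrations, not ACCS's actual risk instrument:

```python
# Hypothetical sketch of the quartile-based monitoring schedule described
# above: programs are ranked by a risk score (higher = greater risk),
# split into quartiles, and assigned to a program year of the three-year
# RFP cycle. Scores and names are illustrative only.

def schedule_monitoring(risk_scores):
    """Map {program: risk_score} to the PY of the RFP cycle it is monitored in.

    Quartile 1 (highest risk) -> PY 1; quartile 2 -> PY 2;
    quartiles 3 and 4 (lowest risk) -> PY 3.
    """
    ranked = sorted(risk_scores, key=risk_scores.get, reverse=True)
    n = len(ranked)
    schedule = {}
    for i, program in enumerate(ranked):
        quartile = (4 * i) // n + 1  # 1..4, highest risk first
        schedule[program] = 1 if quartile == 1 else 2 if quartile == 2 else 3
    return schedule

scores = {"A": 92, "B": 75, "C": 61, "D": 40, "E": 88, "F": 55, "G": 30, "H": 70}
print(schedule_monitoring(scores))
```

With eight programs, this yields two programs per quartile, so the two highest-risk programs are monitored in PY 1, consistent with the eight-programs-per-year minimum described above.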
Alaska Adult education monitoring activities are tracked through various outlets. The State AAE Office monitors student information digitally through AlaskaJobs (MIS system) and GEDTS®. Records are compared for accuracy in reporting student outcomes and credential attainment. The State AAE Office, in partnership with WIOA Title I (Workforce Innovation and Opportunity Act Adult, Dislocated Worker, and Youth) and Title III (Wagner-Peyser) programs, continues to provide common data validation procedures for monitoring participant case files and validating data. The State AAE Office pulled five percent of all student files to monitor and validate data from PY 2021. Programs that had files out of compliance were given examples and were asked to correct the data. Under the data validation protocol, severe data accuracy issues may result in a program improvement plan (PIP). In PY 2021, data validation revealed no severe data accuracy issues resulting in PIPs. Smaller issues were immediately addressed and corrected. Financial reimbursement reports are submitted monthly or quarterly to the State AAE Office through the grant management system. Records, receipts, and allowable costs are evaluated against regulations in Uniform Administrative Requirements, Cost Principles, and Audit Requirements for Federal Awards (2 CFR 200) and state policies prior to processing for payment. During PY 2021, DETS worked with a third-party vendor to develop a new grant management system, which is still in the planning stages. The new system is expected to go live by PY 2022 and will provide financial reports and monitoring tools to both the eligible provider and the AAE State Office. The AAE Office traveled to Juneau, Bethel, Dillingham, and Kotzebue to visit four programs for on-site monitoring. The AAE Office used monitoring tools, including on-site monitoring, financial monitoring, and teacher observational tools, to evaluate each program.
The local programs were provided with technical assistance and recommendations, and were placed on a Corrective Action Plan (CAP) where findings warranted one.
American Samoa Evaluation of the program is based on the measurable objectives and indicators that appear in the Adult Education Plan within the American Samoa State Combined Plan. The number of classes held, the annual count of participants, program reports, and the number of HiSET participants successfully passing the test for the high school equivalency diploma, along with the intended use of CASAS data and TOPSpro in the future, will continue to improve this area.
Arkansas Mandatory Administrators' Meetings are held semiannually, during which policies and procedures are discussed, information is disseminated, and providers can gain additional professional development from state staff, guest speakers, and each other. ADWS/AES monitors local programs through submitted quarterly reports, annual one-day site visits, and intensive three-day program reviews performed on each program every four years or as determined by their level of risk. One hundred percent (37/37) of providers received an annual site visit in 2021-22. Due to more relaxed COVID restrictions, site visits and program reviews were again conducted on-site, following a year of primarily virtual monitoring in 2020-21. During the 2021-22 Spring Administrators’ meeting, several topics identified through these visits as areas of difficulty for local providers were addressed, such as recruitment planning and IET implementation. ADWS/AES also evaluates each program annually through an E&E (Effective and Efficient) calculation, currently based upon the programs’ progress on the federally negotiated benchmarks in Table 4. Programs not meeting E&E are provided with intense technical assistance to develop and implement a program improvement plan.
California The AEO at the CDE provides monthly desktop monitoring and uses a risk-based analysis to select sub-grantees for a more formal comprehensive review process. Criteria used to determine which agencies are reviewed include new administration, overall funding amounts, and chronic late deliverables and similar issues. The Federal Program Monitoring (FPM) Office at the CDE coordinated and scheduled FPM reviews for all programs required to monitor federal funds at the CDE. Agencies selected for review attend several training workshops where they receive detailed instruction on the monitoring process, the Adult Education Instrument used to guide federal reviews, and all evidence requests agencies are expected to upload. Moreover, to ensure the AEO conducts fair, thorough, and consistent reviews of all agencies, reviewers meet yearly for a formal discussion of performance and several times throughout the year informally to debrief all reviews conducted throughout the state.
Colorado AEI conducted an annual risk assessment of grantees but did not conduct onsite monitoring based on the results due to continued COVID-related restrictions. AEI conducted quarterly monitoring calls with all grantees to ensure alignment with AEFLA requirements and to identify best practices and innovative activities in programming. The information collected was provided to grantees in follow-up reports and during Office Hours. At the request of grantees, in-person monitoring occurred during these quarterly calls. Additionally, grantees received quarterly data performance reports displaying their key NRS data with trends, prior year comparisons, performance target reminders, and key recommendations. In the 21-22 program year, AEI continued to monitor grantee data monthly to identify any concerns about enrollment, post-testing, and measurable skill gains (MSG). AEI utilized a custom virtual grantee dashboard in the statewide data system, LACES. This monitoring was used to provide technical assistance to grantees to support improved performance and accuracy in data reporting. AEI also increased awareness around the importance of data by highlighting specific data topics in each bi-monthly Office Hours webinar and during quarterly Data Talks webinars. AEI oversaw the implementation of four IET programs in the 21-22 program year. The team worked with grantees to ensure compliance at every level: industry selection, development of shared objectives, and the implementation of co-enrollment. The IET toolkit AEI used in 21-22 focuses on alignment between the IET program and the CCRS. AEI also defined and documented its IET toolkit review and approval processes to aid grantees in progressing from IET design to IET implementation.
District of Columbia OSSE AFE monitors sub-grantees to evaluate local program performance via quarterly monitoring reviews, monthly and/or quarterly check-in meetings, desk reviews and final annual monitoring. Additionally, the AFE team conducts classroom observations, folder samplings, and fiscal monitoring verifications. Local program providers are required to submit quarterly statistical and narrative reports with evidence that includes student roster reports, National Reporting System (NRS) fundable Student Roster Report, NRS Tables, CASAS Current Year Pre- and Post-test Assessment Report, student core goal attainment reports, and other related documents. Local program participation in an annual final monitoring review and the development and implementation of a continuous improvement plan are also required. The OSSE AFE Quarterly Reports, Continuous Improvement Plans, Final Monitoring Tool, classroom observation tool, and student surveys, as applicable, continue to be used to assess the effectiveness of local programs and the improvement of adult education activities, as described in section 223(1)(d). The state also uses the performance data from local program providers via the monitoring process to address the specific professional development, technical assistance, and/or resource allocation needs of local program providers and to work with local program providers to develop and implement plans for continuous improvement.
Georgia In FY22, GOAE supported monitoring and evaluation of adult education programs through desktop monitoring, onsite monitoring, program evaluations, and disseminating models and promising practices.

Desktop Monitoring: GOAE assigns each adult education program a Grant Program Support Coordinator (GPSC) who serves as their main contact. In FY22, GPSCs provided technical assistance and support to programs, with a focus on ensuring programs provide high-quality, standards-based, and evidence-based instruction to students. GPSCs communicate with adult education programs weekly and conduct frequent, informal monitoring to ensure program compliance. In FY22, GOAE conducted quarterly fiscal monitoring audits for 2-3 programs each quarter. The quarterly desktop monitoring process looks at a program's budget, expenditures, and supporting documentation for a short time period (between one and three months) and serves as an additional internal control to ensure Adult Education federal and state funds are being used in accordance with federal and state statutes, regulations, and the terms and conditions of the federal award.

Onsite Monitoring: GOAE selects programs to monitor based on the results of an annual fiscal and programmatic risk assessment. In FY22, GOAE prepared risk assessments for each program based on a total of 11 fiscal, programmatic, and performance categories. GOAE selected the five programs with the highest level of risk for monitoring visits in FY22. Based upon the monitoring visits in FY22, additional technical assistance and training will be implemented in FY23 to ensure program understanding of and compliance with Time and Effort reporting requirements.

Performance Reports: In January 2022, program organizational heads, as well as adult education program administrators, received a mid-year report, which included year-to-date information on spending, enrollment, performance outcomes, distance education, IET performance, and professional development.
The reports provided a status update on each program’s progress towards their negotiated targets and other important benchmarks. GOAE provided technical assistance to programs below 40% of their negotiated targets. Additionally, GOAE began sending out monthly program metric spreadsheets to all programs to help them monitor progress towards their negotiated targets. Programs used this information to drive new recruitment and retention initiatives during the year.
Guam Each month the program submits a Cumulative Monthly Activity Report (CMAR) to the SAO describing its progress on identified program activities. Successes and challenges were reported with supporting documentation. The State reviewed the report and provided feedback through a State Monthly Report (SMR). The feedback can be clarification, recommendations, or actionable items to ensure compliance, improvements in data collection, and ways to expand and improve activities that would increase recruitment and retention efforts. Program monitoring is pivotal to student and program success. SAO conducted periodic reviews of documents, data collection, and data entry to gauge both the students' and the program's successes and challenges. The State widely used TOPSpro to monitor the number of individuals who took the CASAS test and did not return to the program, including the number of students with less than 12 hours of instructional time. SAO would contact the provider to identify strategies to determine the cause of the stop-out and how the program can further assist students in getting back to their educational or career pathway. Faculty and student surveys were highly encouraged to identify barriers in teaching, student learning, faculty and student needs, and satisfaction. SAO and the provider are committed to improving student recruitment, retention, and completion. There was a 2% decrease in enrollment from the last program year. This program year, 118 participants (61.45%) achieved at least one (1) EFL gain. Moreover, it is essential to note that the highest percent completing level, 64.86 percent (ESL), exceeded the 50% negotiated level.
PROGRAM YEAR 2021-2022
Program | Entering EFL Enrollment with at least 12 hours of instruction [NRS Tables 1 and 2] | Number achieving at least 1 EFL gain or attaining a High School (HS) diploma or its equivalent [NRS Table 4] | Percent Completing Level
ABE | 120 | 73 | 60.83%
ASE | 35 | 21 | 60.00%
ESL | 37 | 24 | 64.86%
TOTAL | 192 | 118 | 61.45%
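The Percent Completing Level column in the Guam table above is simple arithmetic: participants achieving at least one EFL gain divided by EFL enrollment with at least 12 hours of instruction. A minimal Python sketch (illustrative only, not official NRS tooling) recomputes the column from the table's figures:

```python
# Recompute "Percent Completing Level" from the Guam table:
# (participants with >= 1 EFL gain) / (EFL enrollment with >= 12 hours).
# Figures come directly from the table; this is not an official NRS tool.

rows = {"ABE": (120, 73), "ASE": (35, 21), "ESL": (37, 24)}

def pct_completing(enrolled, gained):
    """Percent completing level, rounded to two decimal places."""
    return round(100 * gained / enrolled, 2)

for program, (enrolled, gained) in rows.items():
    print(program, pct_completing(enrolled, gained))

total_enrolled = sum(e for e, _ in rows.values())  # 192
total_gained = sum(g for _, g in rows.values())    # 118
# Rounding 118/192 gives 61.46; the report shows the truncated value 61.45.
print("TOTAL", pct_completing(total_enrolled, total_gained))
```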
Hawaii Monitoring and Evaluation In PY 2021 – 2022, the monitoring and evaluation process was not implemented due to the state director’s position being vacant. In previous years when there was a state director, the following process was utilized:
  • Quarterly submission of WIOA performance data by the local service provider to the State Office.
    • Data submissions were followed by quarterly meetings with the local service provider staff responsible for inputting the data. These meetings included discussions related to the identification and resolution of questionable data.
    • Quarterly meetings were held with the local service provider's administrative staff and data-inputting staff to review the data results.
  • Desk monitoring of 50% of the local service provider sites during the second semester.
  • Onsite visits to the local service provider sites that were desk monitored.
    • Onsite visits generally consisted of a presentation by the local service provider site, followed by a discussion related to the desk monitoring review. When possible, classes were observed, and feedback was provided to the site administrator.
Idaho Adult education programs are on a regularly scheduled monitoring rotation. The number of monitoring visits the state conducts each year is based upon this rotation. In addition to in-person monitoring, programs were monitored virtually for specific areas of need or concern. The quality and improvement of AE activities are typically reflected by increased program performance; however, this reporting period was a year of rebuilding from the pandemic. The evaluation of program quality emphasized the programs' abilities to offer a mixture of classroom types with flexible scheduling: face-to-face, virtual, hybrid, and distance learning. This variety of learning platforms increased the confidence of both new and returning students that their health and safety were of the highest priority and that their academic and career goals would be met. Enrollment increased by approximately two-thirds, bringing Idaho's enrollment close to pre-pandemic levels. Progress toward improving a risk-based tool was made in collaboration with local directors. The tool identifies low-, mid-, and high-risk programs. Depending on the outcome of the risk analysis, additional monitoring and site visits may be deemed necessary. The monitoring procedures were improved, and a much more comprehensive procedure was developed and shared with local programs. Professional development covered the expectations and timelines involved with a monitoring visit. Progress toward planned monitoring and evaluation is solidified in conjunction with Perkins site visits and MOA visits. A state team was identified to conduct monitoring. During biweekly AEFLA director meetings, programs presented which models worked and which failed to produce the desired program outcomes. Programs shared best practices and implemented them accordingly.
Despite the rebuilding challenges, programmatic improvements and/or modifications were made so that they were able to better serve participants, particularly those with health concerns needing continued distance learning opportunities.  Finally, a cornerstone for quality improvement involves quarterly desk audits.  In addition to MSGs, post-testing rates, and core performance data, programs report on quarterly successes and challenges which resulted in technical assistance and special recognitions. 
Illinois Adult Education and Program Support staff oversaw adult education program quality through ongoing communication, desktop and on-site monitoring, and regular review of data. Additionally, the adult education division held weekly staff meetings and staff retreats to jointly review program data and discuss program needs. This process created a collaborative environment where promising and innovative practices were identified and then disseminated through the adult education Professional Development Network. In PY21, 100 percent of ICCB-funded AEL programs received ongoing and monthly desktop monitoring facilitated by Adult Education Program Support Specialists. The intent of formal programmatic monitoring was to directly review compliance with all applicable governing laws and grant deliverables as outlined in the AEFLA Notice of Funding Opportunity/Grant application and the Uniform Grant Agreement. During the monitoring process, information was requested and analyzed to determine the compliance of specific reviewed items. A formal risk assessment using a quantitative system for rating and ranking grantees and their ICCB-funded programs was used to identify programmatic and fiscal risk. Each grantee was allocated points based on the criteria below (not an all-inclusive list) and was assigned a risk level of elevated, moderate, or low based on the total number of points allocated relative to other grantees. Criteria used in the risk assessment are evaluated and updated annually and include the following:
  • Unspent grant funds
  • Completion of grant deliverables
  • Number of material weaknesses or significant deficiencies in the grantee’s most recent audit
  • Number of conditions assessed in the most recent Internal Controls Questionnaire (ICQ)
  • Timeliness of required submissions (performance, programmatic, financial and final reports)
  • Number of findings in previous grant monitoring review
  • Amount of grant funding
  • Years since last monitoring visit
Monitoring activities were dependent on the grantee's risk designation and included either a full virtual review (elevated risk), a desk review (moderate risk), or fiscal and programmatic technical assistance (low risk). 100% of adult education programs had a program support visit to ensure compliance with adult education program expectations, foster positive relationships between programs and the ICCB, and identify areas of support needed to ensure high-quality programming that leads to improved student outcomes.
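As a rough illustration of the points-based tiering Illinois describes, the sketch below ranks grantees by total risk points and maps each tier to its monitoring activity. The point values, thresholds, and grantee names are assumptions for illustration, not ICCB's actual criteria or weights:

```python
# Assumed sketch of a points-based risk assessment: grantees accumulate
# points across criteria (unspent funds, audit findings, late reports, etc.),
# and totals relative to other grantees set the risk level, which selects
# the monitoring activity. Thresholds and values here are illustrative.

RISK_ACTIVITY = {
    "elevated": "full virtual review",
    "moderate": "desk review",
    "low": "fiscal and programmatic technical assistance",
}

def assess(points_by_grantee):
    """Rank grantees by total points; top third elevated, middle moderate, rest low."""
    ranked = sorted(points_by_grantee, key=points_by_grantee.get, reverse=True)
    n = len(ranked)
    levels = {}
    for i, grantee in enumerate(ranked):
        if i < n // 3:
            levels[grantee] = "elevated"
        elif i < 2 * n // 3:
            levels[grantee] = "moderate"
        else:
            levels[grantee] = "low"
    return {g: (lvl, RISK_ACTIVITY[lvl]) for g, lvl in levels.items()}

points = {"Grantee A": 14, "Grantee B": 6, "Grantee C": 9}
print(assess(points))
```

The relative ranking (rather than fixed cutoffs) mirrors the report's statement that risk levels are assigned "relative to other grantees."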
Indiana To monitor and evaluate the quality of adult education activities, program management, fiscal management, data management, and performance measures are continuously assessed. Informal monitoring, desk audits, data checks, and program visits were conducted by state central office staff and the InTERS data team. Low-performing programs were identified, in part, based on NRS Table 4 results. Visits (in-person and virtual) were made to struggling programs. Plans were outlined to contract with IAACE to employ a former local director of adult education to work with the state office to mentor new administrators and assist struggling programs. The position was to start in October 2022. Virtual monthly meetings were held to discuss program goals, outcomes, and continuous improvement with providers. Likewise, a comprehensive risk assessment was conducted in the spring of 2022. Local programs developed professional development plans, targeted measurable skill gains to increase academic gains, and developed strategies to increase enrollments and reduce student separations. "Report cards" were provided monthly to local programs outlining key metrics, and analysis was provided during regional meetings. Report cards presented comparisons to state and local data based on points in time. Quarterly reports submitted by PDFs were utilized to identify promising practices, technical assistance, and gaps in service. Promising practices were highlighted monthly during statewide webinars. Local program personnel were placed on the agenda to share innovations on the call.
Iowa State staff assess providers’ implementation of the Iowa Program Standards Framework with on-site and virtual monitoring. The risk-based monitoring process allows staff to gauge compliance with WIOA provisions, identify areas of improvement, determine technical assistance needs, and note innovative or promising practices. In PY2021-22 state staff employed an updated risk analysis tool that incorporated new data elements related to performance, fiscal management, services, assessment, data quality, and Integrated Education and Training (IET). The results help the Department identify which local programs are at high, moderate, or low risk of noncompliance and to administer strategies appropriate for each tier, including virtual monitoring and the development of improvement plans (high risk), virtual monitoring (moderate risk), and consultation with state staff to select and disseminate noteworthy practices (low risk). The Department conducts an on-site visit to each program during the five-year federal grant cycle (PY21-25), regardless of risk, for a total of three every year. While virtual monitoring involves a targeted review of select standards, on-site visits address all program standards. State staff expanded the on-site schedule to three days, with providers presenting a virtual program overview on the first day, the site visit on day two, and a virtual exit meeting for the preliminary report and discussion on the third day. The team met its goal of conducting a full monitoring of three programs: Southeastern Community College (3/29-31/2021), Iowa Valley Community College District (4/26-28/2021), and Western Iowa Tech Community College (5/24-26/2021). Nine of the 12 programs received targeted virtual monitoring based on areas of need identified by the annual risk assessment. The remaining three programs consulted with the Department to identify and share a best practice or model with the Iowa AEL community. 
The ultimate goal for the Department’s monitoring process, regardless of strategy, is continuous program improvement. 
Kansas KBOR is committed to monitoring at least 20 percent of local programs each program year, which is performed with an on-site or virtual visit. Face-to-face monitoring is preferred, which was able to occur in the second half of PY2021, while virtual visits were conducted in the first half of the year due to travel restrictions. Local programs are monitored in multiple areas, including program activities, performance outcomes, data collection, professional development, and integration with one-stop partners. Monitoring results are discussed with local program directors and shared with the head of the sponsoring institution, with any findings required to be addressed within 60 days. For PY2022, KBOR has further refined the monitoring instrument and process for more complete information and improved internal communication and workflow. KBOR conducts an annual risk assessment for all programs, measuring actions and outcomes that place programs at increased risk of noncompliance. This matrix was updated in PY2021 with four assessment levels: negligible risk, potential risk, moderate risk, and high risk. Programs identified as potential or moderate risk will be engaged in interventions from the state and may be subject to a Program Improvement Plan, which includes more technical assistance and increased monitoring and is tailored to the needs of the individual program, with state and local staff collaborating to develop goals, measures of success, and interim progress checks. Any programs assessed as a high risk are immediately required to engage in a Program Improvement Plan, or, if appropriate, KBOR will explore other remedies for noncompliance as described in the Code of Federal Regulations. In addition to these formal monitoring methods, KBOR takes multiple opportunities throughout the program year to assess program quality and performance. Data accuracy, compliance, and outcomes were reviewed at least quarterly, with monthly checks scheduled in PY2022. 
Funding and spending for each program were evaluated semiannually, with more frequent reviews conducted for programs indicating fiscal challenges. Each month, progress in required professional development activities is reviewed and communicated to programs. For PY2022, plans are being developed for more regular and structured financial evaluations, on top of the dedicated and intensive fiscal audits performed by the KBOR Finance Office. KBOR has also instituted regular informal follow-ups with programs to discuss their goals and progress in PY2022.
Kentucky Monitoring and Evaluation Pursuant to the local provider’s contract with OAE for the provision of adult education services, local providers were required to operate a program in compliance with the KYAE Professional Learning Requirements for all provider roles. Staff who deliver adult education services and are not paid are still subject to KYAE Professional Learning Requirements. Per the PY21 KYSU Implementation Guidelines, page 90, “Should the provider fail to meet this requirement, a Notice of Noncompliance will be placed in the program file and will be considered in any evaluation of the program’s performance.” At the conclusion of PY21, 14 of the 27 local providers were fully compliant with professional learning requirements with no stipulations. The remaining 13 were compliant with the stipulation that new instructors hired near the conclusion of PY21 must complete all required new instructor courses in PY22.
Louisiana During 2021-2022, Louisiana’s onsite monitoring instrument used a risk-assessment model that incorporates six vital modules -- data, recruitment/retention, classroom activities, records/reports, partnerships, and finance -- in an effort to model the USDE/OCTAE instrument and place emphasis on what is valued and consistent with the WRU mission. Programs were trained on the monitoring instrument. The monitoring instrument can be used as a training and planning tool for local providers.  Monitoring activities are based on federal requirements and performance measures. These measures are associated with risks that are high, medium, or low. This assessment includes both programmatic and fiscal functions or activities that include: Fiscal Risk Assessment
  • Federal Award Amount
  • Single Audit Findings
  • Previous Monitoring Findings
  • Unresolved Corrective Actions
  • Length of Time since the last Comprehensive Fiscal Monitoring
  • Timely Submission of Reports
  • High Balance Remaining of Expenditures
Programmatic Risks
  • Program Performance
  • Student Enrollment
  • Personnel Turnover
  • Participant Progress
WRU continued utilizing an electronic grants management system (eGrants). Recipients entered all budgets, revisions, and reimbursement requests in this system. Providers were trained on the WRU Recipient Grant Management Handbook as well. The purpose of the handbook is to provide recipients with a single point of reference for managing/expending all federal AEFLA funds and to set forth the policies, procedures, and guidelines intended to assist in the proper administration of programs at the local level. The statewide compliance team's monitoring procedures included data analysis such as program performance and fiscal information. State staff requested local data documentation based on desk reviews conducted according to a risk assessment determination. Five programs were selected for monitoring in FY 21-22. Onsite visits included examining student files, student attendance records, and program data submitted through the statewide data management system. Monitoring reports were prepared after each onsite monitoring visit. Sites that were non-compliant or had findings received recommendations for program improvements. Programs were given 30 days to prepare and submit a written action plan describing a resolution plan. State staff members were assigned to ensure all plans were adhered to and non-compliance was addressed in a detailed follow-up process.
Maryland Adult education program specialists conducted program evaluation and monitoring throughout the reporting period through a combination of desk reviews of quarterly data, midyear and final reports, and virtual site visits. The State conducted three week-long full virtual monitoring visits during PY21. Using a virtual monitoring model developed in-house allowed for significantly more classroom observations than are possible during onsite visits. The entire state team (or the majority of team staff) could more readily evaluate the effectiveness of online instruction and identify best practices and problem areas requiring technical assistance. A written report detailing observations, recommendations, and required corrective action, if indicated, is provided at the conclusion of each monitoring visit. Two IELCE-IET programs were also monitored during this time, combining online class observation with in-person visits to review IET training content. Fiscal monitoring visits and annual enrollment data verification audits are performed through the Division’s Office of Monitoring and Compliance (OMC). In PY21, monitors from OMC conducted enrollment data verification for all local programs. State monitors have been able to conduct successful audits virtually, improving efficiency and avoiding delays due to COVID closures. Programs that fail to meet data quality standards are required to submit corrective action plans consistent with the federal data quality checklist and to provide professional development to staff on the importance of consistent data collection methods. Fiscal monitoring resumed in PY21, with five local programs receiving monitoring from OMC.
Michigan Michigan uses a multi-faceted, team approach to its monitoring and evaluation activities. Topics covered include, but are not limited to, grant activities, allowable costs, data collection, data reporting, and data quality. Michigan monitors 100% of its grantees via a desk review. On a regular basis, the Fiscal Analyst runs reports that track budgetary activities in the Next Generation Grant, Application, and Cash Management System (NexSys) to ensure grantees are complying with federal and state fiscal regulations and policies; concerns or instances of non-compliance are discussed with program staff, and follow-up action is taken with providers. In addition, MAERS reports containing provider enrollment and performance information are run on a regular basis and reviewed by the MAERS team and Adult Education staff. Any concerns or instances of non-compliance are discussed internally, and follow-up action is taken, as necessary and appropriate, with providers. The Office of Adult Education staff also review grantee narratives, modification requests, and final narrative reports to ensure grantee compliance with federal laws, regulations, and guidance, and with state policy. Again, any concerns or instances of non-compliance are addressed with providers. Onsite monitoring and evaluation visits are intended to complement the desk reviews and also provide an opportunity for state staff to deliver targeted technical assistance. LEO-WD is resuming onsite monitoring during PY 2022.
Minnesota The state Adult Education Leadership Team monitored the quality of adult education activities through the following: ongoing data system development and training to equip local and state staff to record and monitor adult education data; review of NRS data; expenditure verification via submission of audit-certified expenditure reports; site visits to local adult education programs (in-person and virtual); annual submission of assurances by grantees; and compilation and distribution of the annual “report card,” which ranks programs on several accountability measures including Measurable Skill Gains (MSGs) and post-testing rates. In addition, accountability training was provided at the following events: support services conference, ABE summer institute, fall and spring “regional” events, statewide local administrator meetings, quarterly webinars, and other events. Additional details can be found online at:
Mississippi MS OAE’s monitoring procedures included analysis of data and program performance through monthly data submissions and desk reviews. Follow-up onsite visits were conducted when warranted. During 2021-2022, Mississippi’s monitoring instrument, the MS Adult Education Monitoring Tool, incorporated five (5) vital components: WIOA and State Plan coordination, career pathways, data quality, curriculum and instructional practices, and finance. In 2021-2022, the six programs selected for on-site monitoring and classroom observations utilized the tool to guide discussion about the implementation of minimum components and other compliance requirements for providers receiving funding from the MCCB’s Office of Adult Education. Programs being monitored completed the tool and provided specific evidence in each area as requested. Monitoring reports were prepared after each monitoring visit. Within four weeks of the visit, the OAE sent a letter to the program director noting any commendations, recommendations, and findings, if applicable. If corrective action was needed, the letter outlined the timeline within which a reply was required from the program. Programs were given 30 days to prepare and submit a written plan of action describing the planned resolution. State staff were assigned to ensure all plans were adhered to and non-compliance was addressed in a detailed follow-up process, ensuring a resolution was determined and put into effect. Monitoring and evaluations are accomplished by multiple methods. Desktop monitoring and on-site review visits make up the process used to evaluate success and identify areas for program improvement. The OAE utilizes a Desktop Monitoring Tool based on the National Reporting System (NRS) Educational Functioning Levels (EFL), Measurable Skill Gains (MSG), High School Equivalency attainment, and postsecondary education and training.
Mississippi included four additional state indicators: 1) post-test rate goal, 2) Smart Start Credential attainment, 3) National Career Readiness Certificate attainment, and 4) career pathway enrollment. After completion of the desk audit, programs are contacted by phone, email, or a visit to discuss recommendations for improvement and to provide technical assistance. Programs are required to follow up on how to increase performance in those areas. The state continues to revise and add new features to the compliance review and technical assistance process. Desktop monitoring is completed for all programs at least quarterly to assist programs with staying on track and meeting the annual state performance target. When formal on-site program monitoring determines that a program is in noncompliance with state and federal policies related to local data management and program services, the program is placed on a Corrective Action Plan. In addition to these formal monitoring and evaluation methods, review of dashboard data and other data analysis frequently prompts targeted technical assistance in specific performance areas, which generally includes a deeper assessment/evaluation of the area being analyzed. Programs that do not meet the annual state performance target are required to complete the Desktop Monitoring Tool monthly, in lieu of a Performance Improvement Plan, as well as to receive intensive technical assistance. On-site monitoring visits are formal, scheduled visits with local program providers and are on a three-year rotation cycle. These visits consist of examining the progress made in the project against the agreed-upon goals set forth in the application for funds. Monitoring visits also provide an opportunity to make constructive suggestions and recommendations, learn best practices, and note areas in need of specific professional development.
Monitoring also employs systematic collection of data and on-site observations to show stakeholders the extent of progress toward and achievement of objectives, the proper and lawful use of funds, and compliance with federal and state policies and guidelines. On-site review visits resumed in the 2021-2022 fiscal year.
Nebraska The State Office Monitoring Team regularly monitored AEFLA programs throughout the program year in multiple ways to assess compliance with WIOA requirements, address non-compliance issues, and identify best practices and programmatic progress. Quarterly desktop monitoring of all programs proved beneficial for identifying the needs of providers and offering assistance in correcting issues. Assessing risk established priorities for both full and targeted monitoring during the program year. With the continuation of issues related to the pandemic, additional targeted monitoring visits proved beneficial in communicating ongoing discoveries of non-compliance. Local programs conducted self-assessments, which proved invaluable for identifying potential issues. Continuous evaluation of data quality and program progress through both informal and formal means identified issues and ultimately laid the framework for successful performance.
Nevada State Leadership planned and offered Nevada Adult Education directors’ meetings, designed to provide opportunities to share best practices, policies, and tools to support program improvement. During the 2021-2022 program year, meetings were more frequent but shorter to accommodate their virtual format. A risk-based monitoring system and a process for placing programs under corrective action are used with local program directors and staff to drive program improvement, and the PD contractor provided targeted technical assistance based on areas in need of improvement. The work with local programs included reviewing and analyzing National Reporting System (NRS) data. Program director meetings included information on leadership and program management. During PY21, Nevada achieved its highest level of Measurable Skill Gains in at least a decade. All programs achieved MSG rates over 40%, which was the initial threshold set as a minimum for programs to avoid being placed under a Warning Status. Most monitoring during the 2021-2022 program year was virtual; however, state staff did visit several programs to provide technical assistance. Desk monitoring continued throughout the program year, and promising practices were identified and communicated to all local programs. Work has continued on the state longitudinal data system, and a public-facing dashboard is under development to showcase AEFLA program performance on each of the core measures.
New Jersey The monitoring and evaluation of the quality and improvement of adult education activities as per 223(1)(d) continued virtually and in person for PY21. Three regional coordinators work in tandem with central OAL staff to complete monthly desk audits of all Title II lead agency programs and partner agencies. Areas of focus for PY21 included continued fiscal, client intake/exit, digital literacy planning, and curriculum monitoring based on the annual risk assessment. NJ DOL OAL staff continue to analyze, review, and monitor the following: provider budgets for reasonable and allowable spending; contracts and modifications; monthly draw-downs of expenditure reports; and NRS data in LACES, including progress toward negotiated performance benchmarks. All Title II providers receive a detailed “report card” noting their agency’s progress, in addition to a statewide summary of overall performance. Report cards were disseminated and discussed at the required Title II Directors’ meeting in October 2021. These reports provide an overall provider “grade” as well as a ranking of each Title II consortium program statewide against negotiated performance metrics. The impact of COVID contributed to some providers not meeting some of the required federally negotiated targets. Many directors shared frustration with student attendance and persistence, and the State Director is developing professional development around these topics for Program Year 2022.
New Mexico The primary purpose of program monitoring is to provide program oversight. This oversight monitoring ensures that funded programs comply with the federal and state requirements of the grant funding. This monitoring includes the review of fiscal and data processes and procedures, expenditures, provided services, eligible students, and other aspects of the grant agreement and federal and state regulations and requirements. Another critical purpose of program monitoring is to promote continuous program improvement. In our program improvement monitoring, we examine data and performance records, curriculum, program management, teacher and staff development and training, and other program-related components in order to assist programs in meeting students’ needs. Our monitoring activities around compliance, fiscal management, data integrity and performance, and program management are carried out on an ongoing and regular basis throughout the year, as well as through formal site visits. Ongoing monitoring and evaluation of fiscal management and data integrity are both carried out through monthly desk review, which may prompt one-on-one technical assistance and/or correction of individual programs. In addition to monthly desk reviews, data integrity and performance are also reviewed with programs on a quarterly basis in synchronous virtual meetings, and technical assistance is further provided through monthly data and performance webinars led by LiteracyPro. Fiscal desk review involved processing requests for reimbursement for state and federal grants, and the comparison of these requests to program budgets and allowable-cost rules. Ongoing monitoring and evaluation of program management, including WIOA partnerships, IET, and funded work under WIOA Sections 225 and 243, are carried out through frequent, one-on-one interaction with our program staff.
Technical assistance provided to programs was reinforced through monthly all-program meetings and twice-weekly email updates from the state office, as well as through targeted professional development opportunities. Programs turned in their annual reports by September 1. We read these reports and used the information gleaned from them, as well as information about each program contributed by each staff member according to their area of expertise, to evaluate local programs for risk. The NMHED-AE team completed our Risk Assessment Tool in the fall of 2021 on all 26 AEFLA-funded programs and identified six programs as priorities to visit in program year 2021-2022. The programs identified for site visits during the 2021-2022 program year were Northern New Mexico College (NNMC), Gordon Bernell Community School (GBCS), Youth Development, Inc. (YDI), Catholic Charities (CC), Southeast New Mexico College (SENMC), and Luna Community College (LCC). A NMHED-AE staff member’s surgery and wildfires that threatened one of the communities caused the postponement of two of the visits (SENMC and LCC) to October 2022. Formal site visits provide us the opportunity to take a deeper and more holistic look at each program. In PY 21/22, site visits followed a hybrid model with an online Entrance Meeting, followed by in-person program, data, and fiscal meetings, and an online Exit Meeting in which the Site Visit Report and Program Enhancement Plan (PEP) were reviewed between the NMHED-AE team and AEFLA program staff and administration. NMHED-AE staff systematically monitored the PEPs in PY 21/22, and continue to do so now, to ensure that technical assistance is provided as needed and that deadlines for each item are met. NMHED-AE revised its monitoring procedures and implemented more thorough practices in PY 21/22.
Our revised documents are posted online for our programs in an effort toward increased transparency and partnership in continuous program improvement. Our revised monthly data review survey, the Monthly Program Performance Collection, is also posted online.
North Carolina During the 2021-22 program year, the North Carolina Community College System (NCCCS), Office of Adult Education conducted a combination of virtual comprehensive monitoring and virtual continuous fiscal and programmatic monitoring. All Title II funded programs were monitored via the learning management system Moodle. A total of 11 programs engaged in the virtual comprehensive monitoring sessions. Additionally, a total of 69 providers engaged in the monthly continuous fiscal and programmatic monitoring.
Risk Assessment Information
Prior to setting the schedule for monitoring, each year the NCCCS, Office of Adult Education conducts a comprehensive Title II Risk Assessment on providers. The risk assessment is used as an instrument to determine each provider’s level of risk regarding non-compliance with Title II funds. To ensure an equitable and objective selection process, providers were selected to participate in the virtual comprehensive monitoring based upon the results of their program’s risk assessment. If a program scored a total of at least 20 on its risk assessment, the provider was selected to engage in virtual comprehensive monitoring. Providers were assessed on the following criteria:
  1. New Director (3 years or less)
  2. MSG percentage
  3. Time since last monitoring
  4. New WIOA grantee
  5. Budget Expenditures
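As a purely illustrative sketch of this selection rule: the report names the five criteria and the threshold of 20, but does not publish the per-criterion point values, so the weights, field names, and cutoffs below are invented for demonstration only and do not represent the actual NCCCS scoring rubric.

```python
# Hypothetical illustration of the NCCCS risk-assessment selection rule.
# The five criteria and the >= 20 threshold come from the report; all
# point values, field names, and cutoffs below are assumptions.

SELECTION_THRESHOLD = 20  # a total score at or above this triggers monitoring


def risk_score(provider: dict) -> int:
    """Sum hypothetical point values across the five assessed criteria."""
    score = 0
    if provider["new_director"]:                 # director in role 3 years or less
        score += 5
    if provider["msg_pct"] < 40.0:               # illustrative MSG cutoff
        score += 5
    # Longer gaps since the last monitoring add more points, capped at 10.
    score += min(provider["years_since_monitoring"] * 2, 10)
    if provider["new_wioa_grantee"]:
        score += 5
    if provider["pct_budget_unspent"] > 25.0:    # illustrative spending cutoff
        score += 5
    return score


def selected_for_monitoring(provider: dict) -> bool:
    """Apply the report's >= 20 selection threshold to the computed score."""
    return risk_score(provider) >= SELECTION_THRESHOLD
```

Under this sketch, a provider with a new director, low MSG rate, four years since its last monitoring, new-grantee status, and a high unspent balance would score 28 and be selected, while an established, well-performing provider would fall well below the threshold.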
NCCCS Virtual Monitoring
All comprehensive monitoring session dates were announced via email in August 2021. This allowed providers at least 90 business days to prepare and submit the required documentation for their impending monitoring session. Selected providers engaged in two-day virtual monitoring sessions from March to April 2022. Furthermore, due to COVID-19 protocols and restrictions, providers were required to participate in comprehensive monitoring completely at a distance. Facilitating monitoring completely at a distance was a new protocol for providers; hence, the NCCCS, Office of Adult Education set the monitoring dates to allow time to provide ample technical assistance. To provide holistic fiscal and programmatic support, selected providers participated in a multitude of pre-monitoring activities. Providers were required to attend a two-hour statewide monitoring webinar and engage in pre-monitoring meetings. During the statewide monitoring webinar, programs were provided with information about the purpose of the Title II Virtual Comprehensive Monitoring. Also, members of the NCCCS, Office of Adult Education team reviewed the expectations and requirements of the upcoming comprehensive monitoring sessions. Per the monitoring webinar, providers were monitored and reviewed based upon the following objectives:
  1. Ensure local providers meet the WIOA, Title II, Adult Education and Family Literacy Act (AEFLA) requirements;
  2. Improve the quality of federally funded activities;
  3. Provide programs with assistance in identifying and resolving Title II accountability problems; and
  4. Ensure the accuracy, validity, and reliability of data collection and data reporting, as well as policies and procedures for program accountability.
Furthermore, providers that were selected for comprehensive virtual monitoring were provided with a detailed Title II Monitoring Procedures Manual. During the statewide monitoring webinar, members of the Office of Adult Education provided an in-depth overview of the contents of the manual. Please see the link for the Title II Monitoring Procedures Manual: Monitoring Procedures Manual. Also, during the statewide webinar, programs were provided an in-depth tutorial on how to use the Title II Monitoring Checklist document. Upon completion of the statewide monitoring webinar, providers that were selected for virtual comprehensive monitoring were responsible for submitting both fiscal and programmatic evidence for one grant. Per the Title II Monitoring Procedures Manual, providers were required to upload all monitoring documents via Moodle 10 days prior to their agency’s selected monitoring date. For example, if a provider held all three grants (231, 225, and 243), the provider was selected to engage in comprehensive monitoring for only one of the three grants. To ensure validity and fidelity, the providers were monitored using a comprehensive checklist. Please see below the checklists utilized to monitor the providers: Title II Monitoring Checklist(s) 231, 243, and 225. Providers that engaged in comprehensive monitoring for the 231 Adult Education and Family Literacy Act (AEFLA) grant were reviewed based upon the 13 Considerations. To ensure accurate review of fiscal and programmatic evidence, the 13 Considerations were divided into five modules. Please see the module breakdown below:
  1. Module 1 - Instruction
  2. Module 2 - Program Practices
  3. Module 3 - Data and Performance Accountability
  4. Module 4 - Partnerships
  5. Module 5 - Financial Management
Additionally, providers that engaged in comprehensive monitoring for the 243 Integrated English Literacy and Civics Education (IELCE) grant were reviewed based upon the 13 Considerations and the IELCE program requirements. To ensure accurate review of programmatic evidence, the IELCE review was divided into six modules. Please see the module breakdown below:
  1. Program Participants and Services
  2. Instructional Programs
  3. Integrated Education and Training
  4. IELCE Civics Education
  5. Workforce Prep Activities
  6. Professional Development
Moreover, providers that engaged in comprehensive monitoring for the 225 Corrections Education grant were reviewed based upon the 13 Considerations and the Corrections Education program requirements. To ensure accurate review of programmatic evidence, the Corrections Education review was divided into four modules. Please see the module breakdown below:
  1. Adult Education and Literacy Services
  2. Post-Release and Transition Services
  3. Integrated Education and Training
  4. Instructor Professional Development
Upon completion of the virtual comprehensive monitoring sessions, providers were required to engage in a one-hour post-monitoring meeting. During the post-monitoring meeting, members of the Office of Adult Education discussed commendations, recommendations, and findings. Local programs were encouraged to have pertinent members of their Title II team participate in the post-monitoring meeting, and to invite their leadership team as well. Allowing a diverse audience to participate in the post-monitoring meetings cultivated broad programmatic conversations. Upon review of the submitted evidence for the three grants, members of the NCCCS adult education team were required to document findings, recommendations, and the exact type of evidence submitted for evaluation. All Title II monitoring reports were developed based upon a team consensus. Each monitoring team consisted of a Subject Matter Expert (SME) for instruction, performance accountability, compliance, and management. Upon completion of the reports, all documents were reviewed and approved by the NCCCS, Title II Assistant State Director. All providers that engaged in virtual comprehensive monitoring received their official monitoring memorandum, monitoring report, and Corrective Action Plan within 60 business days of their agency’s official monitoring session.
Corrective Action Plan(s)
Of the 11 providers that engaged in comprehensive virtual monitoring, eight were placed on Corrective Action Plans: five for the 231 AEFLA grant award, two for the 225 Corrections Education federal award, and one for the 243 IELCE grant award.
Providers that were placed on Corrective Action Plans were required to submit written responses via Moodle within 30 business days of receipt of their agency’s final monitoring report. Upon receipt of the written responses, members of the NCCCS, Office of Adult Education reviewed the proposed timelines for remedying the issues. Furthermore, providers that were placed on Corrective Action Plans were required to engage in monthly meetings with members of the NCCCS, Office of Adult Education Compliance Team. The monthly Corrective Action Plan meetings were set to last up to 180 days from the initial Corrective Action Plan communication. The purpose of the monthly meetings was to ensure that providers understood the requirements of their agency’s Corrective Action Plans. Additionally, the Corrective Action Plan meetings served as an apparatus to provide in-depth technical assistance and analysis of program materials. For example, team members from the NCCCS, Office of Adult Education dedicated countless hours to providing holistic technical assistance around content-standard-aligned lesson plans. Per the requirements of the Corrective Action Plan, providers that received a required action regarding lesson planning were required to submit new lesson plans that met the requirements of the state office. This activity required a great amount of coaching and scaffolding to ensure that students were receiving adequate instruction. Currently, 75% of the Corrective Action Plans have been remedied and officially closed by the NCCCS, Office of Adult Education. Both the local program director and the Title II Assistant State Director signed all closed Corrective Action Plans. At this time, 25% of the programs that were placed on Corrective Action Plans are working to remedy the identified issues by January 2023.
Continuous Fiscal and Programmatic Monitoring
To ensure that providers remain compliant with Title II fiscal and programmatic regulations, each provider is required to submit documentation monthly. Programs engaged in programmatic monitoring from July 1, 2021 to June 30, 2022. During the 2021-22 program year, community colleges were required to submit an XDBR (Monthly Department Budget Report) for each grant, along with Time and Effort Reports. Community-Based Organizations were required to submit their agency’s Request for Reimbursement (ROR) and Time and Effort Reports. For the Continuous Fiscal and Programmatic Monitoring, all documents were submitted by the 15th of the month. Upon receipt of the documents from the Title II funded providers, members of the NCCCS, Office of Adult Education Compliance Team hand reviewed each document. Specifically, the XDBRs and RORs were hand calculated and reviewed for accuracy. If there were any questions and/or discrepancies with the submitted information, a member of the NCCCS Compliance Team would reach out to providers to remedy the issue. Furthermore, on an as-needed basis, NCCCS, Office of Adult Education Compliance Team members provided one-to-one technical assistance sessions regarding providers’ fiscal and programmatic documents. The one-to-one meetings allowed space for providers to ask questions specific to their program and provided more context regarding additional professional development that might be required to support programs. For example, during the 2021-22 program year, many of the Community-Based Organizations had questions about completing Requests for Reimbursement. Subsequently, a fiscal training was established to support the CBOs and answer any questions related to submitting an accurate Request for Reimbursement. Time and Effort Reports were hand calculated and reviewed by members of the NCCCS, Office of Adult Education.
Again, if discrepancies were found with the Time and Effort documentation, a member of the state office would reach out to remedy the issue. Furthermore, because there were various questions about Time and Effort reporting, a statewide online webinar was facilitated to support programs. During the statewide online webinar, providers were allowed the opportunity to ask specific scenario-based questions that applied exclusively to their programs. Please note, a shared list of providers identified for comprehensive monitoring is maintained on a secure drive accessible only by NCCCS, Office of Adult Education staff members. Upon completion of both the comprehensive and continuous monitoring sessions, all information is stored via the Moodle site for five program years. Additionally, a shared list is maintained of programs that did not comply with the requirements of the Continuous Fiscal and Programmatic Monitoring.
Northern Mariana Islands Our office participates in a college-wide program review process where we evaluate the services we offer to our students. We look at how we gather and share data, assess student and program outcomes, the professional credentials and training of our faculty and staff, and community engagement. Our office also regularly participates in professional development training on adult education, online instruction, curriculum improvement, intake process delivery, as well as inclusivity, cultural sensitivity, and disability awareness and support. There are plans for the state core partners to participate in an Evaluation Peer Learning Cohort (EvPLC) training in Fall 2021. We hope that this will provide our core group team members the guidance and foundation to apply what is learned in order to monitor and evaluate program activities in a way that brings value back into the programs.
Ohio The monitoring and evaluation of the quality of education in Aspire programs continues to be performed primarily by ODHE Aspire program managers, with additional support and collaboration from the Kent State University PDN. We continue to use a risk assessment tool with criteria such as NRS performance measures, enrollment, allocation, and audit findings. Toward the end of FY 2021, the state director, who began in May 2022, decided to restructure how programs are split for monitoring. Instead of a basic geographic split by region, the state director, in discussion with other team members, split programs into four groups:
  1.  College, University, and Community-Based Organizations
  2. K12, ESC, and Additional Career Centers
  3. Adult Diploma Associated Programs
  4. The Targeted Technical Assistance cohort, based on two calculations: enrollment rank and MSG rank.
The senior program manager was assigned to work with the Targeted Technical Assistance cohort to provide more intense monitoring, TA, and PD in combination with the PDN. The four ODHE program managers monitor each program quarterly via the local Program Improvement Consultation Plan (PICP). The decision was made in FY21 that the ODHE program manager and a PDN trainer would both participate in the PICP discussion to assist with monitoring, TA, and the creation of a proactive PD plan. In FY 2021, the ODHE team decided to pursue a new desk review template and asked LiteracyPro Systems (LACES) to create a customized desk review. In addition, each Aspire program received a data report before the final data submission in FY21 identifying data errors, missing pieces, and general components that needed to be verified within their data portal. This new procedure and tool gave programs more accountability in reviewing and cleaning up their data at the end of the year. Aspire providers submitted an annual Local Program Data Certification Checklist modeled after the federal checklist. This document certifies programs’ compliance with NRS data quality standards. Aspire staff monitor compliance with this document annually. These monitoring tools help the state staff and PD staff work collaboratively with local programs to implement strategies for program improvement and stay on top of local performance issues. The state office and PD providers used various methods to ensure information about evidence-based practices and promising models was disseminated to Aspire practitioners. These methods included:
  • Offered “just-in-time” virtual trainings at the state and local levels to meet programs’ immediate needs. By focusing on the specific needs of the program, more local staff were able to participate and see that data improvement is a collective process. 
  • Sent a weekly electronic digest with information about training opportunities and quality resources.
  • Provided more peer-facilitated best practices webinars, webchats, and facilitated practitioner discussion listservs.
Oklahoma Programs were monitored continuously throughout the year using the LACES data system, the CareerTech Information Management System (CTIMS) financial system, and onsite visits. Risk assessments were completed on all programs to assist state staff in effectively monitoring potential risk factors associated with grants funded by federal pass-through funds. ODCTE evaluated each sub-recipient's risk of non-compliance with Federal statutes, regulations, and the terms and conditions of the sub-award. Monitoring focused on ensuring that grant programs adhered to the grant's guidelines, carried out appropriate services, and had proper internal controls in place. State staff monitored six programs during the 2021-2022 year. Due to the pandemic, three programs were monitored using a hybrid method, with one state staff member onsite and the other completing interviews virtually. Programs monitored using the traditional method had two to three state staff onsite monitoring and interviewing. All monitored programs received commendations and opportunities for improvement, and none had significant findings during the 2021-2022 year. Oklahoma AEFL staff are updating monitoring documents and procedures to mirror components used during federal monitoring, with plans to roll out the new documents and procedures during the 2022-2023 year.
Oregon The state ABS Team carried out a variety of monitoring activities during the 2021-22 reporting period. These activities included standard grant monitoring activities, use of program-specific tools and templates, meetings with grantees, creation of monitoring reports, and grantee involvement in monitoring reports. Standard grant monitoring activities included document submission, desk monitoring, and online monitoring activities. Each local ABS program submitted a Final Financial Status Report, Federal NRS Tables (through TOPSpro Enterprise), TOPSpro Enterprise Data Integrity Reports, and additional TOPSpro Enterprise reports as requested. Each local ABS program communicated with the state ABS Team via routine emails and Zoom meetings, and submitted records of local staff professional development, invoices, and reviews of program operations. The state ABS Team maintained documentation of monitoring efforts and provided feedback to each local ABS program, citing each program's strengths and areas for improvement. The state ABS Team also provided responsive training and technical assistance, thus ensuring that each local ABS program took appropriate actions to address deficiencies. The ABS Team created a Risk Assessment for each local ABS provider, incorporating a Financial Risk Assessment component and a Program Performance Risk Assessment component. Local ABS Directors then wrote Program Improvement Plans (PIPs), incorporating program administration and teaching and learning strategies to meet the needs of students. These PIPs were discussed during conferences with the state ABS Team, at which time the state ABS Team was able to offer feedback and support for each local ABS Director's proposed deliverables and strategies.
Puerto Rico During the 2021-2022 PY, the AEP, through the Adult Information System (AIS), continued reviewing local data-gathering activities on educational functioning level gains and on the number of participants post-tested. The AEP maintained a service log for reference and monitoring. The AEP conducted workshops for 48 education centers on data collection and upload to the AEP participant information system (SIA, by its Spanish acronym). It also maintained an incident log that allowed education centers to submit inquiries about issues and difficulties with data collection for NRS reporting. A total of 395 inquiries were addressed.
South Carolina Monitoring and Evaluation of Adult Education Activities The OAE used funds made available under section 223 to monitor and evaluate funded providers in the following ways: The State Director of Adult Education assigns a Compliance Monitoring Review (CMR) Team to formally monitor all school district programs and community-based organizations (CBO) receiving federal funds and/or state aid to support approved adult learning services once every four years. The CMR process is a systematic approach designed to assess the educational opportunities and the effectiveness of adult education programs and services in school districts and CBOs. One-third of the programs are reviewed each year by a team of OAE staff; the other two-thirds are informally reviewed through desktop monitoring tools and informal site visits. To be successful, the CMR effort requires continuous follow-up and support activities, including professional development and on-site technical assistance. Because of the COVID-19 pandemic, the formal review process was modified to include both virtual and in-person processes. The updated process includes an onsite record and attendance review.
South Dakota Subrecipient Monitoring Admittedly, the agency was remiss in not conducting any Subrecipient Monitors during PY2021-22. Because of COVID-related challenges, PREP priorities, and the anticipated multiyear grant-competition [due to increased State General Funds], the WIOA Program decided instead to focus on Pre-Award Risk Assessments for the Applicants of Request for Proposal # 2735 in spring 2021. (At the time of this report-submission, the agency has already conducted two PY2022-23 Subrecipient Monitors.) Desk Monitoring State staff provided continuous technical assistance through telephonic and electronic correspondence, desk monitors, conference calls, webinars, video teleconferencing, and even an occasional site visit. Furthermore, local administrators, instructors, and data specialists took advantage of the fact that they could contact state staff with any questions regarding programmatic policies and data-quality issues, with the assurance they would receive timely responses. Adult Education's web-based Administrators' Meetings also provided opportunities to review participation rates, performance, data quality, policy changes, and program-goal updates. The Quarterly Reports assisted state staff with monitoring new or ongoing issues while concurrently providing agencies with more meaningful documentation and evaluative processes; the consistent submission of quarterly data-sets affords the local subrecipient providers and the agency easy access to longitudinal comparisons across different points of the program year. Evaluation of Quality and Improvement South Dakota's Adult Education and Literacy Program continues to consider the challenges to and efficacy of juxtaposing outcomes of co-enrolled participants with those enrolled in only one WIOA Core Program.
As with the Technical Assistance priorities, evaluation of quality and improvement was primarily highlighted in the context of Co-Enrollments' outcomes via the Participants Reaching Employment Potential service-delivery model. Moreover, the agency and its providers continue to use the program's WIOA Title II Funding, Participation Levels, and Performance Ratings [reference] as a framework for conversations about progress and improvement. This reference details figures, outcomes, and percentages for a host of primary, secondary, and tertiary metrics. South Dakota's WIOA Title II Program Specialist continues to make AEFLA data readily available to DLR's Data and Evaluation Specialist, Policy and Data Analyst, and Executive Team. Additionally, the Workforce Development Council Members, WIOA Core Programs, WIOA Required Partners, and the One-Stop Managers are regularly encouraged to review South Dakota's Statewide Performance Report(s) and NRS Tables, as well as to submit any other AEFLA-related data queries to the agency.
Tennessee Throughout PY21, TDLWD utilized four of our staff as a “curriculum and instruction team” who led technical assistance efforts related to curriculum and instruction for local staff. These staff included a director of ESL services, a director of academic services, a director of professional development, and an administrator of program development and operations. These staff members worked diligently to observe a variety of classrooms (both in-person and virtual) utilizing our instructor observation form and providing feedback to help teachers improve their practices, as well as to identify effective practices to consider for future statewide training and best practice dissemination.  We held many in-person training events and class observations as well as virtual options. This included a regular video conference of local providers’ curriculum and instruction liaisons to discuss ideas around instructional practice. It also included regular training for teachers on using the Schoology learning management system, WIN, NorthStar, BurlingtonEnglish, PowerSchool, and Aztec. We provided technical assistance to IELCE program instructors. The director of ESL services trained all IELCE instructors on the observation form. It was then used to train IELCE instructors on the requirements of an IELCE program and was used during monitoring activities. We continued observing classes and using the observation tool to collect data. We continued collaboration amongst IELCE instructors to share best practices. In 2021-22 we continued to implement an IELCE instructors professional learning community group, facilitated by our ESL director, which regularly convened virtually to discuss best practices.   In addition, IELCE instructors received multiple professional development opportunities throughout the year. At the 2021 statewide conference, instructors received training on English Language Proficiency Standards. 
Additional training was provided to each IELCE program on contextualizing lesson plans and instruction to incorporate student goal sheets. We also continued to expand the "Workforce and Civics Warm-up" curriculum for IELCE instructors to use in the classroom to meet the requirement for civics integration. Finally, IELCE lesson plan examples were shown to program instructors to demonstrate how civics instruction can be implemented in the classroom. Fulfilling local providers' role to provide access to employment, education, and training services as required one-stop partners. In PY21, our regional education and workforce coordinators continued to provide technical assistance to local providers concerning their role as one-stop partners. They assisted local AE programs with increasing Workforce Development Initiatives, including: integrated education and training (IET) programming (+11), pre-apprenticeships (+3), workplace literacy programs, and dual-enrollment opportunities (+1). TDLWD continued to update the comprehensive "WIOA Partner Guidance" and "Workforce Development Initiatives Guidance" documents to make sure the most up-to-date and accurate information is included. TDLWD regional education and workforce coordinators took part in local partner meetings as well as cross-agency training and business services team meetings. TDLWD AE staff, along with Workforce Services staff, developed virtual access points for the virtual American Job Center and created a virtual pathway for Adult Education. This allows AE students to access AJC services virtually as well as allowing AJC participants to access virtual AE services at the AJC. Assistance in the use of technology, including for staff training, to eligible providers, especially the use of technology to improve system efficiencies. In October 2021, TDLWD released the new professional development platform for all adult education staff, called Tennessee Adult Education Professional Development (TAEPD).
Program Directors were trained on the use of TAEPD. A user webinar was released and an ongoing user toolkit made available. TAEPD allows on-demand access to state-sponsored virtual professional development content for those who could not attend live sessions, and contains registration, attendance, and feedback survey information. It houses additional professional development content to impact program effectiveness. Content varies, and users can choose courses based on individual need or as recommended by program directors. Content includes topics on distance learning, the HyFlex model of instruction, blended learning, Schoology, using Zoom, and "how-to" topics for the various state-sponsored software. The platform uses Schoology as the LMS to house the course content; this also serves as a model to instructors for using Schoology as their distance learning tool with students. Schoology continued to be the state-sponsored distance learning platform. All new employees were provided an account when hired. Communication with users occurred within state-level supported user groups. A series of professional development sessions on its use were provided monthly from January 2022 to April 2022, and additional user courses were housed in TAEPD. Our membership with the IDEAL Consortium from World Education has been extended, which allows TDLWD staff members to participate with staff from other states in learning about and discussing best practices in distance education and education technology. Resources on the HyFlex model from World Education are linked within TAEPD for ease of access to relevant content. In addition, TDLWD staff spearheaded the implementation of a new pilot program for corrections education, which allows incarcerated individuals to use Android tablets (via the vendor "APDS") to access education and training resources. The pilot includes 24 tablets at three county jails in Tennessee.
The tablets can be checked out and used by students in jail to work on basic academic skills curriculum and HiSET preparation. This project is a joint effort between TDLWD AE, TDLWD Workforce Services, the Office of Criminal Justice Programs (OCJP), and the Tennessee Corrections Institute. The pilot program was funded through CARES funds. In the next program year, live video conferencing between student and teacher will be available on the tablets. TDLWD staff provided training to local staff on how to implement and utilize the tablets. The APDS tablet-based program in county jails has expanded to include an additional 16 counties, which received grants through OCJP for evidence-based programs. We have been able to expand adult education services to 10 facilities and are awaiting access to the additional six facilities to provide adult education services. To help support this effort, our staff members presented information about our services to the Tennessee Sheriffs' Association and the Tennessee Corrections Institute (TCI), the governing body for local and county jails across the state.
Texas To evaluate the quality and continuous improvement of adult education activities, as described in Section 223(1)(d), TWC's Sub-Recipient Monitoring (SRM) team completes an annual risk assessment and then proceeds with desktop audits or on-site reviews of each local program that include both fiscal and programmatic criteria. SRM also conducts a separate review cycle for data validation to ensure compliance with OCTAE Memo 19-1. Upon completion of monitoring, AEL providers work with TWC's Audit Resolution department to resolve any findings and update procedures or local policies to ensure proper oversight. Unresolved or repeat findings are elevated to Technical Assistance Plans (TAPs) or Corrective Action Plans (CAPs) to ensure that appropriate strategies are put in place and that AEL providers have adequate oversight to meet acceptable standards and compliance expectations. On-site or desk reviews are conducted by the state team to ensure that all AEL providers are monitored at least once during the grant cycle. Results from monitoring and data validation reviews are used to produce TA tools and resources that support AEL providers through various opportunities, including special webinars, business meetings, trainings, and one-on-one assistance. In addition to compliance monitoring, TWC also initiates evaluation projects for specific purposes. In PY 21-22, TWC approved using 223 funds for a statewide evaluation project that is currently in the procurement process. The project scope will include:
  • Integration and co-enrollment among WIOA Titles I, II, and IV programs
  • Student retention, persistence, skill gains, and credential achievement
  • Program staffing and organizational structures
  • Service models
  • Access to AEL services, including digital literacy assessment
This project will help to inform TWC of needed PD, TA, policy, and future state leadership initiatives.
Virgin Islands Ongoing program evaluation and feedback are paramount to program success. As such, SOCTAE is committed to continuously improving program performance. Programs are required to submit reports with drawdown requests that outline successes, challenges, students served, outreach, recruitment and retention measures, pain points (if any), and LACES entry. Accurate and timely data collection and entry, ongoing program monitoring, and professional development are key components to meeting SOCTAE's mandates. SOCTAE monitors program data quality through regularly scheduled trainings and evaluations of the LACES database. In addition, SOCTAE analyzes performance measures in the territory twice a year and monitors data quality and integrity using diagnostic tools on a monthly basis. SOCTAE coordinated and provided ongoing virtual trainings to Sub-grantees regarding data entry and use. The Assessment Policy, which incorporates NRS guidelines and measures necessary for program compliance, was reviewed with and made available to every Sub-grantee. Programs received ongoing technical assistance from the State on their budgets, spending plans, and allowable costs, and each Sub-grantee was provided with a copy of EDGAR for reference. During FY 2020, SOCTAE experienced challenges with onsite monitoring due to COVID constraints. However, desk and virtual monitoring for both fiscal and programmatic facets of the grant award were conducted. SOCTAE provided continuous technical assistance through telephonic and electronic correspondence, desk monitors, conference calls, webinars, and video conferencing. Program Administrators, instructors, and data specialists contacted state staff with questions regarding programmatic policies and data-quality issues. The State also ensured that programs received ongoing TA from the MIS provider LiteracyPro.
A virtual evaluation was conducted to demonstrate the challenges and issues that each sub-recipient endured during the pandemic, which included low enrollment of students and data entry into LACES.
Washington Virtual Program Review and Technical Assistance Visits were conducted by BEdA staff with 20 providers. Monitoring visits were as follows:
  • 13 were full program reviews; revisits were scheduled with 11 providers for continued monitoring of corrective action plans.
  • 7 were follow-up visits to check progress on Corrective Action Plans.
In addition, BEdA staff:
  • Moved 2 providers to Enhanced Monitoring with more frequent check-ins.
  • Followed up on newly established corrective actions.
  • Conducted desk audits and followed up with providers to ensure issues were resolved, confirming resolution with evidence of corrected practice.
Wisconsin The WTCS coordinated a series of activities during the reporting period to monitor and evaluate the quality and improvement of Wisconsin AEFLA providers and adult education services. Each AEFLA provider must submit student-level data through the WTCS Client Reporting System on at least a quarterly basis. The data submitted are analyzed and compiled into the WTCS AEFLA Reporting and Performance Accountability Monthly Report. This report presents outcomes data, including the number of participants served, pre-/post-test rates, Measurable Skill Gain rates, and fiscal indicators such as grant spend-down rates. The WTCS Office staff review this report each month to inform provider-specific monitoring discussions throughout the year. In addition, all AEFLA-funded providers submit tri-annual reports that are reviewed by WTCS Office staff to monitor grant performance, implementation of programming adjustments to meet goals, and grant expenditures. The WTCS also coordinates the Wisconsin AEFLA Program Review Process, which includes the annual provider Risk Assessment process and comprehensive AEFLA monitoring activities. Six providers engaged in the comprehensive AEFLA monitoring process during the reporting period. Monitoring activities focused on recruitment and retention, instruction, and data reporting and continuous improvement. All comprehensive monitoring sessions were coordinated virtually.
Wyoming Adult Education programs in Wyoming are on a two-year monitoring visit rotation; consequently, the number of monitoring visits conducted each year is based on this rotation. Virtual monitoring visits have proven to be a very effective way to monitor and evaluate the quality of local AE programming. Although extremely time-consuming, virtual monitoring visits provide the SEA with extended time to review and comment on evidence submitted by the local providers as part of their monitoring tool checklist. Once the SEA's review process is completed, a virtual meeting is held between the SEA and the local provider to review each chapter in the compliance checklist and to provide technical assistance. For FY 21/22, five programs were monitored and all were found to be in compliance; "In Compliance" letters were subsequently sent to all five providers. New End of Year (EOY) report cards were issued to providers, along with their risk assessments, upon the submission and review of an EOY narrative report. These two documents gave providers information about weaknesses and challenges the State identified in their programs and provided their State ranking for both performance and program size. The monthly State monitoring of local program data identified a systematic weakness in how career service hours were being recorded by local programs, as not all hours were being recorded. The State conducted a training session with local directors and data staff so that the data could be corrected before the end of the year. To ensure that this type of problem does not recur, the State now requires that local monitoring of career service hours be completed as part of the monthly desk monitoring report.