By Jim Schlimmer, Director of Enrollment Analytics


As the 2023 admissions recruitment season comes to a close, campus personnel, from presidents to cabinet members to each member of a school’s enrollment management team, are beginning to raise questions:

What happened with new student goals? 

Why did our application pool grow while enrollment declined?

Why did conversion rates decline for first-year students? For scholarship recipients?

What happened to our net revenue projections?

While these questions are not new to the enrollment management profession, they are more pressing in an environment of declining numbers of high school graduates, declining consumer confidence in higher education, and narrowing net tuition revenue margins.

How should the consummate enrollment manager react? And how can these questions, and others like them, help plan for a stronger institution?

Besides searching your inbox for 30-day cruise specials beginning the day after final census day, the next proactive response is to make plans now for a thorough admissions audit, to be conducted after census day and concluded before the NACAC national meeting.

What is an Admissions Audit?

Simply put, an admissions audit is a thorough review of each important enrollment metric for the past recruitment cycle. 

What worked? What did not? How can your past results inform your master plan for 2024?

Note: it’s not too late to update that plan!

We recommend freezing the year-end data and using that snapshot for all of the current year’s audits.

Second, the frozen data can serve as a baseline for comparisons with past and future years. Use this time in August to clean up the data (you may be surprised how many ZIP code entries are mistyped and how many state fields are empty) and prepare your EM database for a final look by the time census day arrives.
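If your CRM can export the frozen file to CSV, a short script can surface those data-entry problems quickly. The sketch below is illustrative only: the file name and the zip_code and state column names are assumptions about your export, not any real CRM schema.

```python
import pandas as pd

# Load the frozen year-end export. The file and column names here are
# assumptions about a CRM export; substitute your own.
df = pd.read_csv("frozen_2023_census.csv", dtype=str)

# Flag ZIP codes that are not exactly five digits (US-style).
bad_zip = df[~df["zip_code"].str.fullmatch(r"\d{5}", na=False)]

# Flag records whose state field is missing or blank.
missing_state = df[df["state"].isna() | (df["state"].str.strip() == "")]

print(f"{len(bad_zip)} records with malformed ZIP codes")
print(f"{len(missing_state)} records with an empty state field")

# Hand the problem records to the staff doing the cleanup.
bad_zip.to_csv("zip_fixes_needed.csv", index=False)
missing_state.to_csv("state_fixes_needed.csv", index=False)
```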

What Enrollment Metrics to Include in Your Audit

Traditional metrics used to track admissions effectiveness begin with the following: inquiry types, state counts, major counts, scholarships awarded (especially premier awards), ZIP codes, territory analysis, campus visitors, and other key performance indicators unique to your institution (such as church connections or special niches your institution maintains from its mission and history).

These KPIs have always been critical in any enrollment office. However, with student search programs changing and social media creating more stealth applicants, the metrics above are, perhaps, less important than they once were.

These traditional metrics should still be analyzed, but we can now create new KPIs that reflect the capabilities of modern CRMs and good enrollment data management.

The opportunities to analyze an EM program are limitless. The new standard KPIs might begin with the following:

[Table: suggested new standard KPIs]


Whether an EM office focuses on the traditional metrics above or the “new standard” KPIs, a simple, traditional horizontal funnel might be the model to follow for determining “success.” This model would resemble the outline below—most likely in a spreadsheet environment:

# of inquiries, applications, acceptances, deposits, enrolled students, and mean ACT

This outline could be used for the standard metrics (above), the KPIs (above), and other metrics to be determined by the college. Once the data are extracted, a spreadsheet environment makes it convenient to show management ratios: share of the inquiry and application pools, inquiry-to-application conversion, accepted-to-enrolled conversion, and more.
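For teams that prefer scripting to spreadsheets, a minimal sketch of the same funnel-and-ratio logic might look like the following. The file name, the 0/1 stage flags, and the act column are assumptions about a record-level export, not a standard schema.

```python
import pandas as pd

# One row per student, with 0/1 flags for each funnel stage and an ACT
# score. Column names are assumptions, not a CRM standard.
students = pd.read_csv("frozen_2023_census.csv")

stages = ["inquired", "applied", "accepted", "deposited", "enrolled"]
students[stages] = students[stages].astype(bool)

# Horizontal funnel: a count for each stage, plus mean ACT of enrollees.
funnel = {stage: int(students[stage].sum()) for stage in stages}
funnel["mean_act"] = students.loc[students["enrolled"], "act"].mean()

# Management ratios cited above.
ratios = {
    "inquiry-to-application": funnel["applied"] / funnel["inquired"],
    "accepted-to-enrolled": funnel["enrolled"] / funnel["accepted"],
}

print(pd.Series(funnel))
for name, value in ratios.items():
    print(f"{name}: {value:.1%}")
```

The same frame can then be grouped by source, state, or major to produce one funnel row per segment.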

A final model for initial inquiries might resemble the example below:

[Chart: horizontal funnel by initial inquiry source]

Final Efforts: Chart, Publish, and Distribute

Once these data have been cleaned and sorted into horizontal funnels, it is time to chart the data and distribute the outcomes to the EM staff and pertinent voices around campus.

How can the data collected assist an EM office with planning?

  • Budgeting: As college name-buying programs change, the chart above gives you a chance to evaluate the significance of your name-buy program. In the example above, 20% of the enrolled pool comes from student name buys, while only 11% comes from college fairs. These two figures can inform budgeting decisions for the programs behind them.
  • Student-generated inquiries: In the example above, 58% of the enrolled pool came from student-generated inquiries. Would this information support investing more in website revisions? What if these results are decreasing year over year? (A share-of-pool calculation like the sketch after this list makes the check easy to repeat.)
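Here is a minimal sketch of that share-of-pool calculation, assuming the same hypothetical export as above plus a hypothetical initial_inquiry_source column:

```python
import pandas as pd

students = pd.read_csv("frozen_2023_census.csv")
enrolled = students[students["enrolled"].astype(bool)]

# Share of the enrolled pool by initial inquiry source. With data like the
# example above, this might show name buys near 20%, college fairs near
# 11%, and student-generated inquiries near 58%.
shares = enrolled["initial_inquiry_source"].value_counts(normalize=True)
print(shares.map("{:.0%}".format))
```

Run against two frozen year-end files, the same few lines answer the year-over-year question as well.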

Comparing Your Data with National and Regional Trends

One natural outcome of this assessment is to compare your results with those of peer or aspirational colleges.

For most of the management ratios cited above, national data are available for benchmarking your results. If your accepted-to-enrolled rate has declined over a two-year period, how does that compare with similar colleges?
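As a worked illustration of that two-year check, the sketch below compares accepted-to-enrolled yield across two cycles with a benchmark figure; every number in it is a hypothetical placeholder, not vendor data.

```python
# Accepted-to-enrolled (yield) over two cycles, against a benchmark.
# All counts and the 28% benchmark are hypothetical placeholders.
accepted = {2022: 1480, 2023: 1555}
enrolled = {2022: 430, 2023: 405}
peer_benchmark = 0.28

for year in (2022, 2023):
    yield_rate = enrolled[year] / accepted[year]
    gap = yield_rate - peer_benchmark
    print(f"{year}: yield {yield_rate:.1%} ({gap:+.1%} vs. benchmark)")
```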

There are numerous national vendors that provide reliable benchmarking metrics for your review.

Conducting Your Audit with The Parish Group 

For many EM professionals, these suggestions are already part of your team’s assessment process. If your EM team needs assistance constructing any of the above audit models, The Parish Group is here to assist you.

We will provide the data modeling support, allowing your team to plan your future successes. 

Enjoy your August. Happy data mining!