
Data review helps institutions complete APR puzzle


Apr 9, 2007 9:32:23 AM

By Gary T. Brown
The NCAA News

Here’s a pop quiz: What do you do when the NCAA national office calls to review the Academic Progress Rate data you’ve submitted?

a. Hit the speed-dial for your favorite compliance consultant.
b. Update your resume.
c. Rehearse the story you’re going to tell your AD.
d. None of the above.

The correct answer is “d.”

Actually, according to some campus officials recently selected to go through the newly implemented APR data review process, the correct answer is indeed “d,” but from the following list instead:

a. Don’t panic.
b. Carve out some additional workload time.
c. Prepare to benefit from the review.
d. All of the above.

The APR data review process was implemented this year — not to scare institutions, but to help ensure their data are accurate and complete. Fifteen schools were selected for the initial review, some because of irregularities in previously submitted data (the review covers APR data submitted for the 2003-04 and 2004-05 academic years) and others randomly. None was chosen on suspicion of wrongdoing.

“That’s an important distinction,” said Bill Regan, NCAA director of membership services who oversees the review process. “The review is an educational tool that helps institutions implement processes that produce accurate data, not an audit that assumes wrongdoing. Presidents said when they adopted reform that a condition of the APR is for data to be reliable. This not only supports that premise, but it also helps educate institutional personnel about the submission process.”

So far, the people responsible for filing welcome the assistance. And why not, since the review is intended to help ensure accuracy in all aspects of the program, including whether institutions are receiving all possible points and are taking advantage of all available waivers and exceptions.

“I am so pleased that the national office is doing this,” said Texas Tech University Associate Athletics Director John Anderson. “There are issues out there about the consistency of APR data, and clearly, this has helped us catch errors we were making in our own procedures. Now we know how to avoid them in the future.”

Many of the errors are simple, from data-entry mistakes to misinterpretation of definitions — the kinds of honest errors people wouldn’t catch unless they went through this kind of review.

“I said at the beginning of the process that I’m not worried about any of the data, but sure enough, the review pointed out eight to 10 honest errors of application,” Anderson said, noting that most were of the data-entry variety.

Holly Kerstner, assistant athletics director and senior woman administrator at Oakland University, said she trusted her school’s APR data, too, until the review revealed issues. “I saw the request simply as the NCAA needing to use our institution to gain more information and for us to make sure we knew what we were doing,” she said.

Data entry was the culprit in her case, too. “Some of our student-athletes were not given a retention point, but it was because the person doing the entry saw that they didn’t graduate in this amount of time but overlooked they were part time one of those semesters — things like that in which you have to dig for answers and make phone calls to the previous compliance people,” Kerstner said.

The NCAA’s Regan said track trips people up as well. Some schools that sponsor cross country, indoor track and outdoor track incorrectly count a student-athlete who participates in all three in only one sport for APR purposes. Other common mistakes revolve around financial aid, especially instances in which a student-athlete’s athletics financial aid is for less than one academic year.

Benefits of the process

To the Division I Committee on Academic Performance, the group that created the review process, the exercise is accomplishing exactly what was intended.

CAP member Jack Evans, faculty athletics representative at the University of North Carolina, Chapel Hill, said it helps ensure a level playing field for all Division I programs, among other benefits.

“First, institutions going through the review may discover some instances in which they had misinterpreted definitions — the review provides accurate interpretations,” he said. “Second, they may discover a process that would benefit from some improvement to produce consistently accurate data.”

Also, Evans said, NCAA staff and committee members who participate in the process can identify improvement opportunities that can be shared through training seminars. To that end, the NCAA’s Regan said staff members already are applying that feedback to improve future reviews.

“Institutions can and do use this as a catalyst for conversations and training on their campuses,” he said. “They also use it to initiate changes to enhance the processes on their campuses. The NCAA national office also uses the knowledge gained through this review to improve our educational efforts and share what we learned with the broader membership.”

Given all of that, while being called in for a review may be inconvenient, it shouldn’t cause heartburn.

“People need to understand it’s not an audit,” Oakland’s Kerstner said. “It’s to help you better understand how your process works and to understand your specific problems. We’ve seen the APR presentations at the NCAA regional seminars, but it doesn’t resonate until someone asks you why this person is coded this way, or why this person is in your cohort. Those specific questions applied to real cases make you examine your data more closely.”

Campus integration

Corey Bray, assistant director of athletics for administration at Eastern Kentucky University, said his advice to peers who receive the review memo would be to first understand that no NCAA police cars are waiting outside.

“Second,” he said, “realize that it could be a significant amount of work — depending on the number of issues the national office has with the data — added to your normal daily routine regarding APR submission. And third, don’t go into it with an adversarial attitude toward the people from membership services. All they’re trying to do is make the process better and help you do things more efficiently.

“In the end, the benefit will be better data now, and assurance for more accurate data in the future. It takes a little more work, but it’s a win-win.”

Another benefit is that, like the athletics certification process, the APR data review encourages broader campus involvement in compiling the data. To Bray, for example, the top priority during APR data compilation is to involve the registrar. That may already be ingrained in most institutions, he said, but not all.

“Also,” Bray said, “implement a system that works for your university — a standard process to acquire the data you need in a timely and accurate fashion. That will be different for every school, but it’s necessary to rely on a system that will cover your basics. Then once you get into that mode, you can cover some of the outlier student-athlete situations that take more time to investigate.”

Oakland’s Kerstner agreed, saying a good rapport with the registrar and the faculty athletics representative is essential.

“Make sure there’s not just one person compiling APR,” she said. “The review helps you determine who should be involved and the notes that should be taken in conjunction with compiling the data.”

Regan said about 20 institutions will be selected annually to go through the review process. So far, institutions are accepting the responsibility.

“When you first see that letter, you think to yourself, ‘OK, here’s another 20 hours per week,’ ” Anderson said. “But in the end, the additional review benefits the institution. It is time consuming and laborious ... but very worthwhile.”

