Kansas Legislative Division of Post Audit

Reviewing Admissions to the KSU College of Veterinary Medicine and KU School of Medicine

Audit Team
Supervisor
Andy Brienzo
Manager
Matt Etzel
Auditors
Matt Fahrenbruch
Katie Merrill
Meghan Reynolds
Cade Graber
Published April 2026

Introduction

Senator Mike Thompson requested this audit, which was authorized by the Legislative Post Audit Committee at its May 12, 2025 meeting.

Objectives, Scope, & Methodology

Our audit objective was to answer the following questions:

  1. Do the admissions policies at the Kansas State University College of Veterinary Medicine create potential preferential treatment for certain applicants?
  2. Do the admissions policies at the University of Kansas School of Medicine create potential preferential treatment for certain applicants?

To answer these questions, we reviewed information from 3 admission cycles (2023, 2024, and 2025) from the Kansas State University (KSU) College of Veterinary Medicine and University of Kansas (KU) School of Medicine. This included documentation showing the policies and processes these schools used to review and evaluate applicants for admission during these years. It also included data showing the details of the populations who applied for admission, received interview offers, and received admission offers in these years. As part of this work, we created logistic and linear regression models to determine whether certain factors were associated with possible preferential treatment. Finally, we talked with officials from both schools to discuss our results and gather their perspective.

More specific details about the scope of our work and the methods we used are included throughout the report as appropriate.

Important Disclosures

We conducted this performance audit in accordance with generally accepted government auditing standards. Those standards require that we plan and perform the audit to obtain sufficient, appropriate evidence to provide a reasonable basis for our findings and conclusions based on our audit objectives. We believe the evidence obtained provides a reasonable basis for our findings and conclusions based on our audit objectives.

Audit standards require us to report our work on internal controls relevant to our audit objectives. Standards also require us to report deficiencies we identified through this work. In this audit, we reviewed the controls the KSU College of Veterinary Medicine and KU School of Medicine have in place to prevent preferential treatment for certain applicants. We identified deficiencies in both schools’ controls, which we detail later in the report.

Our audit reports and podcasts are available on our website www.kslpa.gov.

Unintended preferences didn’t appear to ultimately influence who was admitted to the KSU College of Veterinary Medicine, but control weaknesses may have allowed preferences at certain points in the process.

KSU College of Veterinary Medicine Background

The Kansas State University College of Veterinary Medicine uses a multi-step application process for its Doctor of Veterinary Medicine degree program.

  • The College of Veterinary Medicine (CVM) offers several degree programs. We focused on the Doctor of Veterinary Medicine (DVM) program because it’s CVM’s flagship program. CVM also offers several master’s, PhD, and certificate programs related to things like veterinary medicine, biology, and public health, but these were not included in this audit.
  • Admission to the DVM program is very competitive. Figure 1 shows the number of applications CVM received during 2023-2025. As the figure shows, CVM only interviews and offers admission to a small percentage of applicants.
  • Because the process is competitive, even strong applicants may not be offered admission. The average admitted applicant during 2023-2025 had a 3.69 GPA in prerequisite science courses and 3.86 GPA in their last 45 credit hours of any courses prior to application. The average rejected applicant had a 3.42 GPA and a 3.50 GPA, respectively, in these categories. Both groups’ averages are far above the 2.8 GPA minimums.
  • The DVM admission process includes 2 key points at which the CVM determines whether to advance an applicant to the next stage.
    • First, the CVM uses a score-based formula to evaluate certain parts of an applicant’s file to determine whether to invite them for an interview.
    • Second, the CVM uses a second score-based formula to evaluate the remainder of the applicant’s file and their interview performance to rank them for admission. This formula includes 2 scores based on standardized rubrics. These processes are detailed later in the report.

Applicants must meet several academic requirements to qualify for admission to the DVM program.

  • Applicants begin the admission process by applying through the Veterinary Medical College Application Service (VMCAS). The American Association of Veterinary Medical Colleges operates this service, which allows applicants to apply to multiple DVM programs simultaneously. VMCAS verifies the academic information applicants submit. CVM officials said they only review verified applications.
  • To qualify academically and begin KSU’s DVM program, applicants must have at least 64 credit hours in prerequisite courses like chemistry, biology, physics, and communications. Applicants must have at least a 2.8 GPA in these prerequisite courses, a 2.8 GPA in just the science prerequisite courses, and a 2.8 GPA in their last 45 credit hours of any courses prior to application. Applicants don’t have to hold an undergraduate degree, however. Some applicants only complete the prerequisites to become eligible to apply.
  • In addition to academic information, applicants provide information related to things like their age, residency, and amount of veterinary, animal (non-veterinary, such as farm or ranch experience), research, employment, extracurricular, and volunteer experience. Applicants also provide information about their achievements, awards, and honors. Lastly, they must supply a personal essay, at least 3 letters of recommendation, and a reference for each relevant prior experience. CVM officials told us non-U.S. citizens are eligible to apply.
  • Finally, applicants may provide their gender, race, and ethnicity. These questions are optional in VMCAS, but nearly all applicants provided this information in the 3 years we reviewed.

Methodology for Reviewing the CVM’s Admission Process

We looked for certain controls in the CVM’s admission process to determine whether there were significant opportunities for preferential treatment.

  • Because admission to the DVM program is so competitive, legislators have raised concerns about whether preferential treatment may be affecting certain applicants’ admission outcomes. To determine whether this might be the case, we reviewed CVM’s controls for preventing preferential treatment. We also evaluated 4,117 qualified applicants’ outcomes from the 2023-2025 admission cycles.
  • We reviewed relevant academic literature and the U.S. Government Accountability Office’s standards for internal controls relevant to preventing preferential treatment in admission processes. We chose these standards because they are generally seen as authoritative and widely applicable.
  • Based on these standards, we reviewed whether the CVM had:
    • documented policies and procedures for each step of the process,
    • trained the reviewing personnel on the admission process requirements, relevant controls, and unintended preference prevention,
    • required multiple independent reviewers to score applicant materials,
    • randomly assigned reviewers to each applicant, without management interference,
    • as much as possible, withheld applicant information that could result in preference (e.g., name, gender, race, ethnicity, background),
    • restricted access to application information that’s been withheld and scores that have been finalized,
    • used standardized rubrics, templates, and scoring systems, and
    • used standardized questions for applications and interviews.
  • We talked to CVM officials and reviewed documentation and data to determine whether they had these controls in place during the 2023-2025 admission cycles. This included things like policy and procedure documents, digital access controls, scoring rubrics, and formulas used to rank applicants.
  • Overall, we found that the CVM had implemented 5 of these 8 controls. For instance, they randomly assigned multiple reviewers to each applicant, used standardized application and interview questions, and withheld as much applicant information as possible. We found a few deficiencies, however. These were primarily related to training the reviewing personnel involved in the admission process. We also found a deficiency related to interviewers being able to input application materials scores after the interview. We detail these issues later in the report.

We used regression models to analyze the CVM’s 2023-2025 admission data to determine whether certain factors might have affected applicants’ outcomes.

  • We worked with CVM officials to access the applicant data they collected or created during the 2023-2025 DVM admission cycles. This included 4,117 qualified applicants across all 3 years. We combined the 3 years of data to increase our regression models’ statistical power. For our analysis, we excluded academically unqualified applicants, incomplete applications, and a small number of applicants with special circumstances or who applied to special programs, such as early admission.
  • The data we analyzed included 19 variables of interest, listed in Appendix B.
    • Some variables reflect pre-application information, such as academic performance, age, gender, race, ethnicity, or residency. Applicants self-identified using dozens of racial and ethnic categories in the years we reviewed. For our analysis, we combined them into racial categories used by the U.S. Census Bureau: White, Black, Asian, Pacific Islander, American Indian, and Multiracial. The ethnic categories were Hispanic or Latino and not Hispanic or Latino. A small number of applicants didn’t provide their race or ethnicity, which we categorized as “no response.”
    • Other variables reflect scores generated during the admission process, such as how CVM reviewers scored the applicant’s application materials or interview performance.
  • To analyze the data, we created 5 logistic and linear regression models. Unlike a simple trend analysis, regression models help isolate the impact of certain key variables on outcomes. In this audit, we used regression models to determine which, if any, key variables (e.g., race, gender, ethnicity) appeared to influence applicants’ chances of progressing through the admissions process. A statistically significant relationship between a variable (e.g., race) and an outcome (e.g., admission score) could indicate the presence of some preferential treatment. Our models used a 95% confidence level, meaning that if a variable truly had no effect, there would be less than a 5% chance of observing a relationship this strong by chance alone.
  • Specifically, our models analyzed the 19 variables’ association with receiving:
    • an interview offer,
    • a higher application materials score,
    • a higher interview performance score,
    • an admission offer, and
    • a “not recommended for admission” designation.
  • Finally, we followed up with CVM officials to discuss our results.
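
The analytic approach above (fit a regression model, then flag which variables are statistically significant predictors of an outcome) can be sketched in miniature. Everything below is hypothetical: the synthetic data, the single GPA-style feature, and the bare-bones fitting routine are stand-ins for the audit's actual 19-variable models and formal significance testing.

```python
import math
import random

def fit_logistic(X, y, lr=0.5, epochs=3000):
    """Fit a logistic regression (intercept plus one weight per feature)
    by plain gradient descent. Illustrative only: the audit's models
    also produced standard errors and significance tests."""
    n = len(X)
    k = len(X[0])
    w = [0.0] * (k + 1)                      # w[0] is the intercept
    for _ in range(epochs):
        grad = [0.0] * (k + 1)
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            p = 1.0 / (1.0 + math.exp(-z))   # predicted probability
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    """Predicted probability of the outcome (e.g., an interview offer)."""
    z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical data: one centered "GPA" feature that truly drives the
# outcome, so the fitted weight should come back clearly positive.
random.seed(0)
X = [[random.uniform(-1.0, 1.0)] for _ in range(400)]
y = [1 if random.random() < 1.0 / (1.0 + math.exp(-(0.5 + 2.0 * x[0]))) else 0
     for x in X]
w = fit_logistic(X, y)
```

In the audit's setting, a coefficient like `w[1]` being statistically distinguishable from zero (which requires the standard errors this sketch omits) is what would flag a variable as significant.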

Findings for CVM Process Review — Interview Stage

CVM policy is to invite all academically qualified Kansas residents for an interview.

  • The first step in the CVM’s process is to determine which applicants will be offered an interview. Figure 2 shows the steps in the DVM admission process. As the figure shows, CVM begins by screening out unqualified applicants. Then, the process goes through 2 key points at which CVM determines whether to advance an applicant to the next stage. The Dean makes the final selection based largely on the applicants’ final rankings.
  • CVM officials said that by policy, all academically qualified Kansas residents are invited for an interview. According to CVM officials, they adopted this policy because it aligns with their mission as a public institution that receives Kansas tax dollars. Of the 4,117 qualified applicants across the 3 years we reviewed, 360 (9%) were Kansas residents.
  • Non-Kansas applicants are evaluated for an interview offer using a formula reflecting 3 factors from their applications. These factors include the applicant’s GPAs from science prerequisite courses and from their last 45 credit hours of any courses prior to application. The formula also reflects the hours of relevant prior veterinary, animal (non-veterinary), research, employment, extracurricular, and volunteer experience the applicant reported. The formula weighs an applicant’s GPAs more than their relevant prior experience.
  • In the 3 years we reviewed, most applicants were eliminated at this stage. Many applicants don’t qualify to be considered for admission, as Figure 1 showed. But of the 4,117 qualified applicants we analyzed, just 1,398 (34%) received interviews. Of those, 358 (26%) were Kansas residents. That’s despite Kansas applicants only making up 9% of all qualified applicants we reviewed.

We didn’t find any unintended preferences in how the CVM selected applicants for an interview.

  • Applicants aren’t offered admission until they are ranked by the CVM’s second formula and selected by the Dean. But there are several steps in the CVM’s admission process where preferential treatment could influence these final selections. So, in addition to looking for preferences at just the admissions step, we also looked for them at various stages in the process. The first step we evaluated was who the CVM invited for an interview. According to CVM policy, this should be based on Kansas residency and the result of the CVM’s formula, and no other preferences should be visible. 
  • Figure 3 shows the results of our model evaluating which of the 19 factors we evaluated were associated with a higher likelihood of receiving an interview offer. As the figure shows, 4 factors were statistically significant. We expected to see all of them based on the CVM’s process for deciding who to interview. Because no other variables (e.g., race, gender, or ethnicity) were identified as significant, being selected for an interview does not appear to be associated with factors other than the criteria CVM officials described.
  • Kansas residency had an extremely strong effect, in alignment with CVM policy. Figure 4 shows how Kansas residency affected whether an applicant would receive an interview offer. As the figure shows, the average Kansan applicant with the minimum 2.8 GPA in their last 45 credit hours of any courses prior to application would have a 100% chance of receiving an interview offer. By contrast, the average non-Kansas applicant with the same 2.8 GPA had less than a 1% chance. Even with a perfect 4.0 GPA, a non-Kansas applicant would have a 91% chance.
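
The size of that residency gap follows from the S-shaped logistic link between a model's inputs and its predicted probability. The coefficients below are hypothetical values chosen only so the curve mimics the pattern described above (a near-certain interview for a qualified Kansan, a tiny probability for a non-resident at the minimum GPA); they are not the model's actual estimates.

```python
import math

def interview_prob(resident, gpa, b0=-28.0, b_res=20.0, b_gpa=7.5):
    """Predicted probability from a logistic model:
    p = 1 / (1 + exp(-(b0 + b_res*resident + b_gpa*gpa))).
    The default coefficients are hypothetical, for illustration only."""
    z = b0 + b_res * resident + b_gpa * gpa
    return 1.0 / (1.0 + math.exp(-z))

kansan_minimum   = interview_prob(resident=1, gpa=2.8)  # effectively certain
nonresident_min  = interview_prob(resident=0, gpa=2.8)  # well under 1%
nonresident_perf = interview_prob(resident=0, gpa=4.0)  # high but not certain
```

A large positive residency coefficient pushes Kansans onto the flat upper shoulder of the curve, while non-residents remain on the steep middle section where GPA differences matter most.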

However, we found several weaknesses in CVM’s process that could allow unintended preference in how applicants are scored at the interview stage.

  • Applicants receiving an interview are randomly assigned to a 3-person interview team from a pool of veterinary professionals, such as CVM faculty. Each of the 3 interviewers uses standardized scoring rubrics to evaluate the applicant on 2 dimensions. After the interviewers input their scores, each dimension’s scores are totaled and used in CVM’s second formula. This formula determines the applicants’ final rank. CVM officials said these interviewers were required to receive training that covered things like how to use the scoring rubrics and recognize their implicit biases.
    • Application materials score: Before the interview, each interviewer uses a standardized rubric to score the applicant’s application materials. This reflects the applicant’s personal essay, letters of recommendation, and the quality of their relevant prior experience. Other application elements, like the applicant’s GPA and hours of relevant prior experience, were previously factored into whether they received an interview offer.
    • Interview performance score: After the interview, each interviewer uses a second standardized rubric to score the applicant’s interview performance. This reflects things like the applicant’s communication skills, personal attributes, and knowledge and interest in veterinary medicine.
    • Also, after the interview, each interviewer designates the applicant as “highly recommended,” “recommended,” or “not recommended” for admission. This is separate from the other 2 scores.
  • However, we found control weaknesses in these processes that might have affected the applicants’ scores.
    • First, the CVM didn’t freeze interviewers’ data access after these scoring rounds were completed. We saw a couple of instances where an interviewer submitted an application materials score after interviewing the applicant. We expected these scores to be submitted and locked in prior to the interview, to prevent interviewers from changing an applicant’s score after meeting them. CVM officials said the timing of application materials score submissions simply hadn’t been closely controlled, but that this had been addressed.
    • Further, the CVM didn’t actively monitor whether interviewers took required anti-preference training. CVM officials prepared training videos for all individuals contributing to the interview process, which covered topics including how to use the scoring rubrics and recognize their unintended biases. We expected CVM to document that all reviewers received such training, to ensure they would score applicants based on CVM policy. Instead, they relied on interviewers to self-report their own compliance. For the 2023 admissions cycle, 11 of 56 (20%) reviewers did not report to CVM officials that they took the training. This number rose to 17 of 67 (25%) reviewers for the 2024 admissions cycle. CVM officials said they didn’t require interviewers to confirm they completed the training for the 2025 admissions cycle.
    • Finally, the CVM encouraged interviewers to complete optional anti-bias training provided by Harvard University. But because it wasn’t required, CVM officials said they didn’t track whether anyone took this training in the years we reviewed. The Harvard training is more focused on preventing preferential treatment than CVM’s required training, so it would be beneficial to require it and track interviewers’ completion.

Our models showed some possibility of preferential treatment at the application materials scoring stage, but this didn’t appear to have a large effect on who received admission offers.

  • Applicants submit application materials such as personal essays, letters of recommendation, and information about their relevant prior experience. The 3 assigned interviewers use standardized rubrics to score these materials. These scores are totaled to create the application materials score. That score becomes 1 of 5 factors in the formula CVM uses to determine applicants’ final rankings. As a result, any unintended preferences that affect this application materials score would in turn feed into applicants’ final ranking scores and their prospects for admission.
  • Figure 5 shows the results of our model evaluating which of the 19 factors we evaluated were associated with higher application materials scores. As the figure shows, 6 factors were statistically significant. We expected to see some of them, such as more hours of relevant prior experience, which would likely improve the quality of an applicant’s materials. But others, such as gender and race, may indicate preferential treatment for certain applicants. This could mean the scores the interviewers gave the applicants’ materials were influenced by these applicant characteristics rather than just the quality of the materials. This would not align with CVM policy.
  • By themselves, these preferences don’t seem to have had a large effect on which applicants received admission offers. Applicants’ application materials scores are used in CVM’s second formula, which affects their admission chances when compared to other applicants. But applicants’ scores aren’t compared to a specific threshold to determine whether to move them forward beyond the interview stage of the process.
    • Admitted applicants’ application materials scores overlapped significantly with rejected applicants’ scores. The application materials rubric allows for a maximum combined total of 3,000 points. The average admitted applicant we reviewed scored 2,587, whereas the average rejected applicant scored 2,305. The lowest admitted applicant’s score was 1,815. By contrast, the highest rejected applicant’s score was 2,961. This is possible because other factors are also included in CVM’s second formula, especially academic performance.
    • Within this context, the differences the model identified are too small to meaningfully affect most applicants. All else being equal, the average male applicant scored 36 points higher on the 3,000-point scale than the average non-male applicant at this point in the process. Conversely, the average American Indian applicant scored 135 points lower, and the average Kansan applicant scored 127 points lower. This is largely because it’s easier for Kansan applicants to get to this point in the process. For most applicants, differences of this size are unlikely to move them into or outside of the large overlapping range for admitted and rejected applicants.
    • Our model’s results are statistically valid at a high confidence level. But an applicant’s application materials score is just 1 of 5 factors that go into their final ranking formula. It is also weighted less than other factors, such as their academic performance. That’s why these results’ effects are limited, and they don’t carry through to the model analyzing admission outcomes. 

Our models also showed some possibility of preferential treatment at the interview scoring phase, but this also didn’t appear to have a large effect on who received admission offers.

  • The 3 assigned interviewers also score applicants on their interview performance using a standardized rubric. The 3 scores are totaled to create the interview score. As with their application materials scores, the interview score becomes 1 of 5 factors in the formula CVM uses to determine applicants’ final ranking. Any unintended preferences that affect this scoring would in turn feed into applicants’ final rankings and their prospects for admission.
  • Figure 6 shows the results of our model evaluating which of the 19 factors we evaluated were associated with higher interview performance scores. As the figure shows, 5 factors were statistically significant. We again expected to see some of them, such as more hours of relevant prior experience, which would likely improve applicants’ interview performance. But others, such as military affiliation and race, may indicate preferential treatment for certain applicants. Our results could mean the scores the interviewers gave the applicants were influenced by characteristics other than the quality of their performance during the interview. This would not align with CVM policy.
  • As with the application materials scores, these preferences don’t seem to have had a large effect on which applicants ultimately received admission offers. These scores are included in CVM’s second formula and affect applicants’ admission chances. But applicants’ scores aren’t compared to a specific threshold to determine whether to move them forward beyond the interview stage of the process.
    • Admitted applicants’ interview performance scores again overlapped with rejected applicants’ scores. The interview performance rubric allows for a maximum combined total of 6,000 points. The average admitted applicant we reviewed scored 5,405, whereas the average rejected applicant scored 4,628. The lowest admitted applicant’s score was 3,384, and the highest rejected applicant’s score was 6,460. (This applicant came from a North Dakota exchange program with a fourth interviewer and a higher possible total score.) Again, this is possible because of the other factors included in CVM’s second formula.
    • Within this context, the differences the model identified are too small to meaningfully affect most applicants. All else being equal, the average military-affiliated applicant scored 258 points higher on the 6,000-point scale than the average non-military-affiliated applicant at this point in the process. Conversely, the average Kansan applicant scored 241 points lower, the average Asian applicant scored 231 points lower, and the average Hispanic or Latino applicant scored 207 points lower. For most applicants, differences of this size are unlikely to move them into or outside of the large overlapping range for admitted and rejected applicants.
    • Our model’s results are statistically valid at a high confidence level. But an applicant’s interview performance score is just 1 of 5 factors that go into their final ranking formula. It is also weighted less than other factors, such as their academic performance. That’s why these results’ effects are limited, and they don’t carry through to the model analyzing admission outcomes. 

Findings for CVM Process Review — Admission Stage

After the interview stage, the CVM calculates applicants’ final scores and rankings, which help decide which applicants will be offered admission.

  • After applicants have been interviewed, CVM uses a second formula to determine their final score and ranking for admission. The formula reflects 5 factors from the admission process. These factors include the 2 required GPAs and experience information included in the first formula, as well as the 2 scores for the applicant’s application materials and interview performance.
    • 60% of the formula is based on 2 GPAs: the applicant’s GPA in the science prerequisite courses and their GPA in the last 45 credit hours of any courses prior to application.
    • 11% of the formula is based on the hours of relevant prior veterinary, animal (non-veterinary), research, employment, extracurricular, and volunteer experience the applicant reported.
    • 29% of the formula is based on the 2 scores the applicant received on their application materials and interview performance. These are the totals of the 3 interviewers’ scores.
  • CVM officials said a committee of those involved in the admissions process meets to review the final list of applicants. The committee considers the formula-based ranking of applicants, the interviewers’ designations of “highly recommended,” “recommended,” or “not recommended,” and whether applicants are from Kansas. The Dean determines how many admission spots will go to Kansans and nonresidents.
  • The committee then sends the Dean a list of applicants they believe should receive admission offers, alternate offers, or denials. CVM officials said the Dean makes the final admissions decisions but has historically followed the final rankings closely.
  • The total number of offers the Dean makes depends on how many seats are available in the incoming class. CVM officials said they try to admit as many Kansans as possible but must balance this with CVM’s financial needs, since students from outside Kansas pay higher tuition. 
  • The remaining unselected applicants are eliminated at this stage of the process. Of the 4,117 qualified applicants we analyzed for the 3 years we reviewed, just 759 (18%) received admission offers.
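
The weighting above can be sketched as a simple scoring function. The report does not describe how CVM normalizes each component onto a common scale, so the scales assumed below (GPAs out of 4.0, application materials out of 3,000 points, interview performance out of 6,000 points, and experience pre-scaled to a 0-1 value) are our own illustrative assumptions, not CVM's actual method.

```python
def final_score(sci_gpa, last45_gpa, experience, app_materials, interview):
    """Weighted 0-100 ranking score mirroring the report's stated weights:
    60% GPAs, 11% experience, 29% rubric-based scores.
    All normalizations here are assumptions for illustration."""
    gpa_part    = 60 * ((sci_gpa / 4.0) + (last45_gpa / 4.0)) / 2
    exp_part    = 11 * experience                    # assumed already 0-1
    rubric_part = 29 * ((app_materials / 3000.0) + (interview / 6000.0)) / 2
    return gpa_part + exp_part + rubric_part

# Average admitted vs. rejected profiles from the report, paired with
# hypothetical experience values of 0.9 and 0.6 respectively.
admitted_avg = final_score(3.69, 3.86, 0.9, 2587, 5405)
rejected_avg = final_score(3.42, 3.50, 0.6, 2305, 4628)
```

Even under these assumed normalizations, the heavy GPA weighting separates the average admitted and rejected profiles, which is consistent with the report's observation that the lower-weighted rubric scores overlap substantially between the two groups.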

CVM does not have controls over the “not recommended for admission” designation, which can disqualify otherwise competitive applicants.

  • The 3 interviewers designate applicants after their interviews as “highly recommended,” “recommended,” or “not recommended” for admission. These designations are separate from the application materials and interview performance scores. It’s possible that applicants with high GPA scores or interview scores may nonetheless be designated “not recommended for admission.”
  • The CVM doesn’t have strong criteria or controls over these designations. For instance, there isn’t a standardized rubric or criteria for who should be recommended or not recommended. CVM officials said this subjective designation is intended to allow interviewers to express concerns about applicants who may otherwise score highly. CVM officials said it mainly reflects applicants’ communication skills during the interview, since this is imperative for veterinary medicine.
  • These designations can directly impact who is offered admission to the program. As such, we expected CVM to have criteria or guidance to help govern when to use these designations. For example, a standardized rubric could help ensure interviewers are consistent in when and why they give these designations.
  • The CVM’s process for incorporating these designations into the admission process may allow just 1 or 2 interviewers to negate the rest of the process. Receiving “not recommended for admission” designations from 2 interviewers triggers automatic disqualification, regardless of the applicant’s final ranking. Beginning with the 2025 admission cycle, receiving a “not recommended for admission” designation from 1 interviewer triggered a second application review to determine whether to disqualify the applicant.
  • Applicants who were “not recommended for admission” were unlikely to receive admission offers. In the years we reviewed, 241 out of 1,398 (17%) interviewed applicants received 1 or more “not recommended for admission” designations. Of these, 211 (88%) were not offered admission, whereas 30 (12%) were. We can’t say for sure why these 241 applicants received these designations.
  • In addition to the lack of controls over this designation, our regression models identified a possible preference in how it was used. One of our regression models showed that Asian applicants were associated with a higher likelihood of receiving 1 or more “not recommended for admission” designations. This suggests the designation may be influenced by preferential treatment. CVM officials said they were unaware of language barriers or other factors that might explain this result. However, we didn’t find a statistically significant preference against Asian applicants (or applicants of any other race) at the admission offer stage. It’s possible that the relatively small share of applicants (17%) who received a “not recommended for admission” designation limited this designation’s impact on the total applicant pool.
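
The disqualification rule the report describes reduces to simple logic, sketched below. The function name and return labels are ours, not CVM terminology.

```python
def check_designations(not_recommended_count, cycle_year):
    """Apply the 'not recommended for admission' rule as described:
    2 or more designations disqualify the applicant outright regardless
    of final ranking; starting with the 2025 cycle, a single designation
    triggers a second application review instead of a decision here."""
    if not_recommended_count >= 2:
        return "disqualified"
    if not_recommended_count == 1 and cycle_year >= 2025:
        return "second review"
    return "advance"
```

Because as few as 2 of the 3 interviewers can trigger the first branch, this rule lets a small subset of reviewers override the rest of the formula-based process, which is why the report flags the absence of a standardized rubric for the designation.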

Although there were small indicators of preferential scoring along the way, we did not find evidence of unintended preferences in which applicants ultimately were offered admission to the DVM program.

  • CVM officials said the Dean receives the final formula-based ranking of applicants and uses this to determine who will receive an offer. This ranking also reflects applicants’ “highly recommended,” “recommended,” or “not recommended” for admission designations, as well as whether applicants are from Kansas.
  • Figure 7 shows the results of our model evaluating which of the 19 factors we evaluated were associated with a higher likelihood of receiving an admission offer. As the figure shows, 7 factors were statistically significant. We expected to see all of them based on CVM’s formula-driven admission process. Because no other variables (e.g., race, gender, or ethnicity) were identified as significant, being selected for admission does not appear to be influenced by preferential treatment. Any preferences our models identified at earlier stages in the process weren’t strong enough to carry through to which applicants received admission offers.
  • Kansas residency again had a strong effect, in alignment with CVM policy. Figure 8 shows how much Kansas residency affected whether an applicant would receive an admission offer. As the figure shows, the average Kansan with a perfect score of 6,000 on CVM’s interview performance scale (1 of the 5 factors that go into the final ranking formula) had a 99% chance of admission during 2023-2025. By contrast, the average non-Kansan applicant with the same 6,000-point score had an 85% chance of admission. In the end, Kansas applicants accounted for 21% of admission offers, even though they were only 9% of qualified applicants.
  • No unexpected preferential treatment appeared at this stage of CVM’s process, likely because of the controls CVM put in place.
    • The formula CVM uses at this phase is heavily weighted toward applicants’ GPAs and relevant prior experience (71% of the formula). The rubric-based scores (application and interview scores) carry less weight (29% of the formula). Our regression models showed possible slight preferences in the lower-weighted rubric-based scores. The small effects the models showed did not carry through to final rankings.
    • Although the Dean has discretion, they followed the final formula rankings and generally selected the strongest applicants during 2023-2025. The average admitted applicant had a final score of 91.8 out of 100, whereas the average rejected applicant scored 63.1. Of the 625 applicants whose final scores were 90 or above out of 100, 580 (93%) were offered admission, and just 45 (7%) were rejected. All 45 who were rejected were from outside Kansas, in alignment with CVM’s preference for Kansas applicants.
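
The weighting described above can be sketched as a simple composite formula. This is an illustrative reconstruction, not CVM’s actual formula: the 71%/29% split comes from the report, but the component names and example values are invented, and the real formula combines 5 factors rather than the two groups shown here.

```python
# Illustrative sketch of a weighted final-score formula like the one
# described above. The 71% / 29% split comes from the report; the
# component names and example values are invented for demonstration.

def final_score(gpa_experience_score, rubric_score):
    """Combine component scores (each on a 0-100 scale) using the
    report's stated weights: 71% GPA/experience, 29% rubric-based."""
    return 0.71 * gpa_experience_score + 0.29 * rubric_score

# A small preference in the lower-weighted rubric scores moves the
# final score only slightly:
baseline = final_score(90, 80)
with_rubric_bump = final_score(90, 85)  # +5 rubric points
print(baseline, with_rubric_bump)
```

Because rubric scores carry only 29% of the weight, even a several-point rubric preference shifts the final 100-point score by a fraction of that amount, which is consistent with the report's finding that small rubric-stage effects didn't carry through to final rankings.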

Unintended preferences didn’t appear to ultimately influence who was admitted to the KU School of Medicine, but control weaknesses may have allowed preferences at certain points in the process.

KU School of Medicine Background

The University of Kansas School of Medicine uses a multi-step application process for its Doctor of Medicine degree program.

  • The School of Medicine (SOM) offers several degree programs. We focused on the Doctor of Medicine (MD) program because it’s SOM’s flagship program. SOM also offers several master’s, PhD, and certificate programs related to things like public health, statistics, and clinical research, but they are not included in this audit.
  • Admission to the MD program is very competitive. Figure 9 shows the number of applications SOM received during 2023-2025. As the figure shows, SOM only interviews and offers admission to a small percentage of applicants.
  • Because the process is competitive, even strong applicants may not be offered admission. The average admitted applicant during 2023-2025 had a 3.83 total GPA and 510 score on the Medical College Admission Test (MCAT). The average rejected applicant had a 3.69 total GPA and a 506 MCAT score, which is still meaningfully higher than the minimum 497 MCAT score.
  • The MD admission process includes 2 key points at which SOM determines whether to advance an applicant to the next stage.
    • First, a single reviewer considers factors like GPA and MCAT score to generate a score and determine whether to recommend each applicant for an interview. Certain applicants with Kansas ties qualify automatically, however.
    • Second, 2-3 interviewers use a rubric to score applicants’ interview performance. A selection committee then uses these scores, along with their previously reviewed application materials, to score and rank applicants. These rankings ultimately help SOM decide who to admit to the program. These processes are detailed later in the report.

Applicants must meet several academic requirements to qualify for admission to the MD program.

  • Applicants begin the admission process by applying through the American Medical College Application Service (AMCAS). The Association of American Medical Colleges operates this service, which allows applicants to apply to multiple MD programs at once. AMCAS verifies the academic information applicants submit, and SOM officials said they only review verified applications.
  • To qualify academically for KU’s MD program, applicants must have a bachelor’s degree that included prerequisite courses like biology, chemistry, and physics. Although SOM doesn’t require a minimum GPA, applicants with GPAs below 3.25 in science and math courses are reviewed for automatic rejection. This GPA threshold was 3.0 during the 2023 admission cycle. As of the 2025 admission cycle, all applicants must have a minimum 497 MCAT score. Previously, there wasn’t a minimum required MCAT score.
  • In addition to academic information, applicants provide information related to things like their age and home state, as well as their employment, leadership, and health care experience. Applicants also provide information about their achievements, awards, and honors. Finally, they provide at least 1 personal statement and 3-5 letters of recommendation. SOM officials told us non-U.S. citizens are not considered for admission.
  • Finally, applicants may provide their gender, race, and ethnicity. These questions are optional in AMCAS, but nearly all applicants provided this information in the 3 years we reviewed.

Methodology for Reviewing SOM’s Admission Process

We looked for certain controls in SOM’s admission process to determine whether there were significant opportunities for preferential treatment.

  • Because admission to the MD program is so competitive, legislators have raised concerns about whether preferential treatment may be affecting certain applicants’ admission outcomes. To determine whether this might be the case, we reviewed SOM’s controls for preventing preferential treatment. We also reviewed 6,229 applicants’ outcomes from the 2023-2025 admission cycles.
  • We reviewed relevant academic literature and the U.S. Government Accountability Office’s standards for internal controls relevant to preventing preferential treatment in admission processes. We chose these standards because they are generally seen as authoritative and widely applicable.
  • Based on these standards, we reviewed whether SOM had:
    • documented policies and procedures for each step of the process,
    • trained the reviewing personnel on the admission process requirements, relevant controls, and unintended preference prevention,
    • used multiple independent reviewers to score applicant materials,
    • randomly assigned reviewers to each applicant, without management interference,
    • as much as possible, withheld applicant information that could result in preference (e.g., name, gender, race, ethnicity, background),
    • restricted access to application information that’s been withheld and scores that have been finalized,
    • used standardized rubrics, templates, and scoring systems, and
    • used standardized questions for applications and interviews.
  • We talked to SOM officials and reviewed documentation and data to determine whether they had these controls in place during the 2023-2025 admission cycles. This included things like policy and procedure documents, digital access controls, and scoring rubrics.
  • Overall, we found that SOM had implemented 4 of these 8 controls. For example, they randomly assigned reviewers to each applicant, restricted access to application information that’s been withheld, and used standardized questions for applications and interviews. We found a few other deficiencies, however. These were primarily related to SOM’s initial file review for applicants without Kansas ties who didn’t meet automatic interview criteria. We also found deficiencies related to training the reviewing personnel involved in the admission process. We detail these issues later in the report.

We used regression models to analyze SOM’s 2023-2025 admission data to determine whether certain factors might have affected applicants’ outcomes.

  • We worked with SOM officials to access the applicant data they collected or created during the 2023-2025 MD admission cycles. This included 6,229 qualified applicants across all 3 years. We combined 3 years of data to strengthen our regression models’ statistical power. For our analysis, we excluded academically unqualified applicants, incomplete applications, deferred applicants from prior years, and a small number of applicants who applied to special programs, such as admission as an undergraduate.
  • The data we analyzed included 17 variables of interest, listed in Appendix C.
    • Some variables reflect pre-application information, such as academic performance, age, gender, race, ethnicity, and home state. Applicants self-identified using dozens of racial and ethnic categories in the years we reviewed. For our analysis, we combined them into racial categories used by the U.S. Census Bureau: White, Black, Asian, Pacific Islander, American Indian, and Multiracial. The ethnic categories were Hispanic or Latino and non-Hispanic or Latino. A small number of applicants didn’t provide their race or ethnicity and were categorized as “no response.”
    • Other variables reflect scores generated during the admission process, such as how SOM reviewers scored the applicant’s interview performance or their overall assessment score on suitability for medical school.
  • To analyze the data, we created 4 logistic and linear regression models. Unlike a simple trend analysis, regression models help isolate the impact of certain key variables on outcomes. In this audit, we used regression models to determine which, if any, key variables (e.g., race, gender, ethnicity) influenced applicants’ chances of progressing through the admissions process. A statistically significant relationship between a variable (e.g., race) and an outcome (e.g., admission score) could indicate the presence of some preferential treatment. Our models used a 95% confidence level, meaning there is less than a 5% probability that an effect we identified as statistically significant would have appeared by chance if no real relationship existed.
  • Specifically, our models analyzed the variables’ association with receiving:
    • an interview offer,
    • a higher interview performance score,
    • a higher overall assessment of suitability score, and
    • an admission offer.
  • Finally, we followed up with SOM officials to discuss our results.

Findings Related to SOM Process Review — Interview Stage

SOM policy is to offer an interview to all applicants with Kansas ties who meet certain GPA and MCAT standards.

  • The first step in SOM’s process is to determine which applicants will be offered an interview. Figure 10 shows the steps in the MD admission process. As the figure shows, SOM begins by screening out unqualified applicants. Then, the process goes through 2 key points at which SOM determines whether to advance an applicant to the next stage. The selection committee makes the final selection based on the applicants’ final rankings.
  • Applicants with Kansas ties who meet certain GPA and MCAT thresholds are automatically invited for an interview. This requires a minimum 3.4 GPA in science and math courses and a 502 MCAT score. These thresholds are higher than what SOM requires to simply be considered for admission. SOM officials define Kansas ties as either state residency or things like graduating from a Kansas high school or university or having a parent who is an SOM employee or alumnus.
  • Applicants with Kansas ties who don’t meet the automatic interview criteria can still be offered an interview. They receive the same review as applicants without Kansas ties. This is conducted by a single file reviewer. SOM doesn’t use a standardized rubric for this file review. We found issues with this part of SOM’s process, which are described more below.
  • In the 3 years we reviewed, most applicants were eliminated at this stage. Many applicants don’t qualify to be considered for admission, as Figure 9 showed. Of the 6,229 qualified applicants we analyzed, just 1,418 (23%) received interviews. Of those, 1,068 (75%) had Kansas ties.

We found several weaknesses in SOM’s file review process that could allow unintended preference in who is invited for an interview.

  • Applicants with Kansas ties who meet GPA and MCAT thresholds higher than the minimum required for admission are automatically offered interviews. Other applicants receive file reviews to determine this. SOM officials said these reviewers were required to receive training that covered how the admission process works. After the U.S. Supreme Court ruled against race-conscious admissions in Students for Fair Admissions v. Harvard in June 2023, this training also told reviewers to disregard race and ethnicity when evaluating applicants.
  • We found control weaknesses in this file review process that might have affected who received interview offers.
    • First, only 1 file reviewer evaluated each applicant’s file to determine whether to recommend them for an interview. We expected to see at least 2 file reviewers for each applicant because best practices require multiple independent reviewers. Having 2 file reviewers for each applicant would help ensure no one can make interview decisions alone based on things that don’t align with SOM policy.
    • In addition, SOM didn’t require file reviewers to use a standardized rubric. SOM has a pool of SOM-affiliated individuals who can act as the single reviewer. Reviewers were instructed to spend 2-3 minutes evaluating whichever elements of the applicant’s file they chose. Reviewers were told their decisions should be based on whether they thought the applicant was likely to become a competent and compassionate physician, preferably in an underserved area in Kansas. We expected SOM to use a rubric to help ensure reviewers are consistently looking at the factors SOM wants.
    • Next, SOM didn’t withhold certain information from file reviewers that might have allowed preferential treatment. SOM officials said applicants’ races and ethnicities were withheld from reviewers after the U.S. Supreme Court ruled against race-conscious admissions in Students for Fair Admissions v. Harvard in June 2023. But reviewers still had access to all other materials in applicants’ files. This included information like name, gender, and background information such as whether the applicant’s family was socioeconomically disadvantaged. These details help build a complete picture of each applicant, and totally withholding them would undermine this. However, we expected as much of this information to be withheld as possible, to ensure it didn’t influence whether applicants received interview offers. Details like the applicant’s gender or childhood socioeconomic status likely could have been withheld.
    • Further, SOM officials said they didn’t systematically track whether file reviewers attended required training. SOM officials said this training covered things like how to use the scoring rubrics and avoid preferential treatment. We expected SOM to document that all reviewers received such training, to ensure they would score applicants based on SOM policy. SOM officials said they informally monitored whether everyone attended this training. There was no formal documentation of who attended this training in the years we reviewed.
    • Finally, SOM officials said they stopped requiring anti-bias training for people involved in the admission process. They said they eliminated this required training after 2024 House Bill 2105 required such training materials to be made publicly available. However, this training is more focused on preventing preferential treatment, so it would be beneficial to require and track file reviewers’ completion of it.

Our models showed some possibility of preferential treatment at the interview invitation stage.

  • Applicants aren’t offered admission until they are ranked and selected by SOM’s selection committee. But there are several steps in SOM’s admission process where preferential treatment could influence these final selections. So, in addition to looking for preferences at just the admission step, we also looked for them at various stages in the process. The first step we evaluated was who SOM invited for an interview. According to SOM policy, this should be based on Kansas ties, academic performance, and a full file review, depending on the applicant.
  • Figure 11 shows the results of our model evaluating which of the 17 factors we evaluated were associated with a higher likelihood of receiving an interview offer. As the figure shows, 8 factors were statistically significant. We expected to see some of them based on SOM’s criteria for deciding who to interview, such as Kansas ties, a high math and science GPA, and a high MCAT score. Several other variables the model identified may show preferential treatment for certain applicants, influenced by gender, race, and ethnicity. This could mean these applicants’ interview offers were influenced by these characteristics, rather than just the quality of their application files. This would not align with SOM policy and would be despite SOM withholding race and ethnicity from file reviewers.
  • Kansas ties had an extremely strong effect, in alignment with SOM policy. Figure 12 shows how having Kansas ties affected whether an applicant would receive an interview offer. As the figure shows, the average applicant with Kansas ties and the 2025 minimum MCAT score of 497 had a 72% chance of receiving an interview offer. By contrast, the average applicant without Kansas ties with the same 497 MCAT score had just a 1.5% chance of receiving an interview offer. Even with a perfect 528 MCAT score, the average applicant without Kansas ties would have a 64% chance.
  • Figure 11 also shows that gender, race, and ethnicity had statistically significant effects, which was not supported by SOM policies. For example, Figure 13 shows being Black increased an applicant’s chance of receiving an interview offer. All else being equal, the average Black applicant with the minimum 497 MCAT score had a 72% chance of being invited for an interview, whereas the average non-Black applicant with the same 497 MCAT score had a 4% chance. Even with a perfect 528 MCAT, the average non-Black applicant would have an 83% chance.
  • SOM officials stated their current policy is for reviewers not to consider race or ethnicity when determining which applicants to invite for an interview. SOM policy provided preference to applicants in groups underrepresented in medicine, including Black applicants, before the U.S. Supreme Court ruled against this in Students for Fair Admissions v. Harvard in June 2023. After that ruling, SOM policy eliminated race as a consideration for admission. They also required reviewers to attend training on the admission process, including not making decisions based on applicants’ race or ethnicity. As noted above, though, we found that SOM doesn’t formally track whether reviewers take this training. And they said that even with such training, it’s impossible to completely eliminate reviewers’ possible preferences.
  • Our model’s results are statistically valid at a high confidence level. And to ultimately receive admission, an applicant must first be offered an interview. However, this is only the first step of the admission process. As detailed later, SOM has created stronger controls over later steps. That’s why these results’ effects are limited, and they don’t ultimately carry through to the model analyzing admission outcomes.
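
Predicted probabilities like those in Figures 12 and 13 come from plugging an “average” applicant’s values into a fitted logistic model and varying one characteristic at a time. A minimal sketch, using an invented intercept and coefficients rather than the audit’s actual estimates:

```python
# Illustrative sketch of how a fitted logistic model is converted into
# the predicted probabilities shown in the figures. The intercept and
# coefficients below are invented for demonstration, not the audit's
# actual estimates.
import math

def predicted_probability(intercept, coef_mcat, mcat, coef_flag, flag):
    """Logistic response: probability = 1 / (1 + exp(-linear_predictor))."""
    linear_predictor = intercept + coef_mcat * mcat + coef_flag * flag
    return 1 / (1 + math.exp(-linear_predictor))

# Hypothetical model with a strong positive coefficient on one flag
# (e.g., Kansas ties), evaluated at the minimum 497 MCAT score:
with_flag = predicted_probability(-55.0, 0.105, 497, 4.0, 1)
without_flag = predicted_probability(-55.0, 0.105, 497, 4.0, 0)
print(with_flag, without_flag)
```

Because the logistic function is nonlinear, the same coefficient can translate into a very large gap in predicted probability near the middle of the curve, which is why a single flag can separate a roughly 70% chance from a single-digit chance.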

We also found weaknesses in SOM’s process that could allow unintended preferences in how applicants are scored at the interview stage.

  • Interviewed applicants are randomly assigned to 2- or 3-person interview teams from a pool of SOM-affiliated individuals, such as faculty. These interviewers include 1 “open file” interviewer and 1-2 “closed file” interviewers. The former has access to all the information in the applicant’s file, similar to the reviewers who recommend which applicants to interview. After June 2023, this excludes the applicant’s race and ethnicity. The closed file interviewers only have access to limited information, such as the applicant’s name, personal statement, relevant prior experiences, and responses to application questions.
  • After the interview, each of the 2-3 interviewers uses a standardized rubric to rate each applicant they interviewed. These ratings are used to generate 2 scores that are later incorporated into the selection process.
    • Interview performance score: 1 score based on this rubric reflects the applicant’s interview performance, including things like the applicant’s understanding of and motivation for medicine, commitment to service, and leadership experience.
    • Overall assessment of suitability score: The other score reflects an overall assessment of the applicant’s suitability for medical school, including things like communication skills, maturity, and empathy.
  • However, the training-related weaknesses outlined above would also affect the interviewers involved at this stage of the process (in addition to those who did the original file reviews for interview offers). SOM officials said they used a manual process and didn’t systematically track whether these interviewers took the required training. They also said they no longer required anti-bias training for them.

Our models showed some possibility of preferential treatment in how interviews were scored, but this didn’t appear to have a large effect on who received admission offers. 

  • Applicants who are interviewed receive a score for their interview performance that is based on a standardized rubric. The applicant receives a score from each of their 2 or 3 assigned interviewers. Although 1 interviewer has access to all the applicant’s information, the other 1 or 2 only have limited access to information. The applicant’s score is used later as part of the final selection process, which ranks applicants for admission. Any unintended preferences that affect this scoring would in turn feed into that selection process and applicants’ chance of admission.
  • Figure 14 shows the results of our model evaluating which of the 17 factors we evaluated were associated with higher interview performance scores. As the figure shows, 7 factors were statistically significant. We expected to see some of them, such as high overall GPA or total MCAT score. These things reflect the applicant’s academic achievement and preparation, which likely help improve their interview performance. But others may show preferential treatment for certain applicants, influenced by gender or race. This could mean the scores the interviewers gave the applicants were influenced by characteristics other than the quality of their performance during the interview. This would not align with SOM policy.
  • By themselves, these possible preferences don’t seem to have had a large effect on which applicants received admission offers. These scores affect how the SOM selection committee considers applicants at a later stage. But applicants’ scores aren’t compared to a specific threshold to determine whether to move them forward beyond the interview stage of the process.
    • Admitted applicants’ interview performance scores overlapped significantly with rejected applicants’ scores. We used a normalized 0-1 point scale for our analysis because SOM’s scoring scale changed during 2023-2025. The average admitted applicant scored 0.89, whereas the average rejected applicant scored 0.73. The lowest admitted applicant’s score was 0.63, and the highest rejected applicant’s score was 1.0. This is possible because many other factors are also considered when SOM makes admission offer decisions.
    • Within this context, the differences the model identified are too small to meaningfully affect most applicants. All else being equal, being female increased the average applicant’s interview performance score by 0.04 on our 0-1 point scale compared to the average non-female applicant at this point in the process. Similarly, the average Black applicant’s score was 0.06 higher, and the average Asian applicant’s score was 0.02 higher. For most applicants, differences of this size are unlikely to move them into or outside of the large overlapping range for admitted and rejected applicants.
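
The 0-1 point scale used in this analysis can be sketched as follows. The report doesn’t specify the exact normalization method, so this assumes a simple min-max rescaling against each year’s possible score range; the scales and scores shown are invented.

```python
# Illustrative sketch: rescaling interview scores from different
# admission cycles onto a common 0-1 scale so they can be compared.
# Min-max rescaling against each year's possible score range is one
# common approach; the report doesn't specify the exact method used.

def normalize(score, scale_min, scale_max):
    """Map a raw score onto a 0-1 scale given that year's score range."""
    return (score - scale_min) / (scale_max - scale_min)

# Hypothetical example: a scoring scale that changed between cycles.
score_2023 = normalize(45, 0, 50)   # 45 out of 50
score_2025 = normalize(90, 0, 100)  # 90 out of 100
print(score_2023, score_2025)       # identical on the common scale
```

On the common scale, equivalent performances from different cycles map to the same value, which is what allows averages like 0.89 for admitted applicants and 0.73 for rejected applicants to be computed across all 3 years.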

Our models also showed some possibility of preferential treatment in how applicants’ overall suitability was scored, but this also didn’t appear to have a large effect on who received admission offers.

  • In addition, applicants who are interviewed receive another score reflecting an overall assessment of their suitability for medical school. This is also based on a standardized rubric. The applicant’s score is used later as part of the final selection process, which ranks applicants for admission. Any unintended preferences that affect this scoring would in turn feed into that selection process and applicants’ chance of admission.
  • Figure 15 shows the results of our model evaluating which of the 17 factors we evaluated were associated with higher overall assessment of suitability scores. As the figure shows, 6 variables were statistically significant. Again, we expected to see some of these, such as high overall GPA or total MCAT score. These things reflect the applicant’s academic achievement and preparation, which are directly related to their overall suitability for medical school. But some of these variables may show preferential treatment for certain applicants, influenced by gender or race. This could mean the interviewers’ perceptions of the applicants’ suitability were influenced by characteristics like these. This would not align with SOM policy.
  • As with the interview performance scores, these possible preferences don’t seem to have had a large effect on which applicants received admission offers. These scores are also among those used by the SOM selection committee at a later step. But applicants’ scores aren’t compared to a specific threshold to determine whether to move them forward beyond the interview stage of the process.
    • Admitted applicants’ overall assessment scores again overlapped significantly with rejected applicants’ scores. Using another 0-1 point normalized scale, the average admitted applicant scored 0.91 in 2023-2025. By contrast, the average rejected applicant scored 0.71. The lowest admitted applicant’s score was 0.58, and the highest rejected applicant’s score was 1.0. Again, this is possible because of the other factors considered by SOM’s selection committee.
    • Within this context, the differences the model identified are too small to meaningfully affect most applicants. All else being equal, the average female applicant scored 0.05 higher on our 0-1 point normalized scale compared to the average non-female applicant at this point in the process. The average Black applicant scored 0.07 higher. For most applicants, differences of this size are unlikely to move them into or outside of the large overlapping range for admitted and rejected applicants.
  • The results of our interview performance and overall assessment scoring models are statistically valid at a high confidence level. But although these scores influence the selection committee’s discussion about the applicants, they don’t ultimately decide whether an applicant receives admission. This is decided by a later score determined by the selection committee, and SOM has created stronger controls over this step. That’s why these results’ effects are limited, and they don’t ultimately carry through to the model analyzing admission outcomes.

Findings Related to SOM Process Review — Admission Stage

After the interview stage, SOM uses a standardized rubric to determine applicants’ final rankings for admission.

  • After applicants have been interviewed and received their 2 associated scores, a committee made up of 18-20 SOM-affiliated individuals reviews each applicant. The open file interviewer who has been involved in the applicant’s interview and has access to the applicant’s entire file presents the applicant to the selection committee for their consideration. The other members of the selection committee don’t have access to the applicant’s personal information, such as name, gender, or race.
  • The format of the presentation depends on the strength of the applicant. This is determined using their science and math GPA, MCAT score, interview performance score, and overall assessment of suitability score. The strongest and weakest applicants are presented for only 30-90 seconds, and the presenter has discretion over what to discuss. The remaining applicants are presented for 3-5 minutes, and the presenter must cover information from all sections of the applicant’s file. SOM officials told us the applicant’s name, age, gender, race, and ethnicity are withheld from the selection committee.
  • Based on these verbal presentations, the selection committee creates a final score for each applicant using the same overall assessment of suitability rubric the interviewers used. The 18-20 committee members’ scores are averaged to determine the applicant’s final score and rank relative to the other applicants. The final selection is based on the applicants’ rankings and the number of available seats in the incoming class.
  • The remaining unselected applicants are eliminated at this stage of the process. Of the 1,418 qualified applicants who made it to the interview stage in the 3 years we reviewed, 757 (53%) received admission offers. Of these, 588 (78%) had Kansas ties.

Although there were small indicators of preference along the way, we did not find evidence of unintended preferences in which applicants ultimately were offered admission to the MD program.

  • Figure 16 shows the results of our model evaluating which of the 17 factors we evaluated were associated with a higher likelihood of receiving an admission offer. As the figure shows, 4 factors were statistically significant. We expected to see all of them based on how SOM officials described their process for deciding who to admit. Because no other variables (e.g., race, gender, or ethnicity) were identified as significant, being offered admission is not associated with factors other than the criteria SOM officials described. Any preferences our models identified at earlier stages in the process weren’t strong enough to carry through to which applicants received admission offers.
  • An applicant’s ties to Kansas again had a strong effect, in alignment with SOM policy. Figure 17 shows how much having Kansas ties affected whether an applicant would receive an admission offer. As the figure shows, the average applicant with Kansas ties with a perfect 1.0 interview performance score on our normalized scale had a 94% chance of being offered admission. By contrast, the average applicant without Kansas ties with the same 1.0 interview performance score had a 79% chance.
  • No unexpected preferential treatment appeared at this stage of SOM’s process because of the controls SOM had put in place.
    • Applicants’ final scoring and ranking was determined by a new score created by the selection committee. It wasn’t determined by the rubric-based scores for which our models identified possible preferences. This mitigated the possible preferences’ effects.
    • SOM officials said they withheld personal details from the selection committee during the applicants’ final presentations. This helped ensure preferences couldn’t affect the committee’s final scoring of the applicants, regardless of any preferences that might have affected prior stages.
    • Finally, SOM officials said they simply follow the final ranking as determined by the selection committee until all seats in the incoming class are filled. As such, they generally selected the strongest applicants during 2023-2025. The average admitted applicant had a final score of 3.53 out of 4, whereas the average rejected applicant scored just 2.21. Of the 453 applicants whose final scores were 3.5 or above, 450 (99%) were offered admission, and just 3 (1%) were rejected. We don’t know why these 3 were rejected.

Conclusion

Overall, we didn’t find that unintentional preferences significantly affected which applicants were admitted to either school. However, both schools set up processes to admit applicants with Kansas residency or ties ahead of applicants without a Kansas connection. This intentional preference showed up in our regression results: applicants with Kansas residency or ties were more likely to be admitted to both programs, all else being equal.

We also found some process weaknesses that might have introduced minor bias at certain points in both schools’ processes. It’s possible these unintended preferences could have some impact on applicants’ ability to progress through the admissions process. However, our models showed that these potential biases generally didn’t have a statistically significant impact on which types of applicants were ultimately offered admission to the programs. Both schools generally admitted their strongest applicants in the years we reviewed.

Recommendations

Recommendations for the KSU College of Veterinary Medicine

  1. The KSU College of Veterinary Medicine should take steps to eliminate the possible preferences our analyses identified. This could include things like adding additional training to their process to help avoid these types of preferences.
    • KSU CVM response: The College of Veterinary Medicine affirms agreement that preferences should be avoided in all selection processes. The college provides interviewer training for all individuals that contribute to student interviews. It is essential that individuals involved with this process are properly trained to specifically avoid any potential preferences in the selection process. In the LPA report, the reviewers acknowledge that each CVM interviewer is required to participate in anti-preference training by stating in the report: “Interviewers were required to complete training on how to use the scoring rubrics and avoid implicit biases.” The CVM fully intends to maintain the current requirement for pre-interview training completion that includes anti-preference content. Additionally, the CVM prioritizes that all individuals completing the interview process must complete available anti-preference training.
  2. The KSU College of Veterinary Medicine should develop processes to consistently track whether reviewers have completed all required training. They should also monitor and enforce these requirements for people involved in their admission processes.
    • KSU CVM response: The CVM agrees that tracking the completion of training is essential. The college will continue to track CVM admissions-specific training practices. The college will further expand the training confirmation processes to verify and track that all training is completed by all individuals who contribute to interview activities.
  3. The KSU College of Veterinary Medicine should create clear criteria for the “not recommended for admission” designation. It should also modify its use of this designation to ensure it cannot independently negate the rest of the process for any applicant without scrutiny of this outcome.
    • KSU CVM response: The CVM agrees that the review and reporting process for all candidates needs to be clear and consistent. In addition to the existing rubric that is used for the interview reporting, the college will include information pertaining to the classification of “Not recommended for admission” in this rubric. This classification will include the designation for individuals that should not be granted admission into the DVM program. This designation is in alignment with American Veterinary Medical Association Council on Education (COE) accredited U.S. veterinary college admissions processes and is carefully considered by the trained professionals that contribute to the selection process. Criteria that will be included in this rubric will include an expectation of demonstrated professional behavior, communication effectiveness, awareness of veterinary professional obligations and responsibilities. This designation is thoroughly scrutinized by the CVM Admissions Committee and is verified by the CVM Assistant Dean of Admissions. The college admits students that will be successful with their education and professional career.

Recommendations for the KU School of Medicine

  1. The KU School of Medicine should take steps to eliminate the possible preferences our analyses identified. This could include things like adding additional training to their process to help avoid these types of preferences.
    • KU SOM response: The KU School of Medicine is committed to a fair, merit-based admissions process that serves all Kansans and produces an excellent physician workforce for our state. We will take this recommendation into consideration under guidance from KU’s Office of the General Counsel to ensure compliance with State laws and regulations. In the meantime, we will continue our practice of instructing file reviewers, interviewers, and the admissions committee that decisions should be based on merit and strong Kansas ties and not on race or ethnicity.
  2. The KU School of Medicine should develop processes to consistently track whether reviewers have completed all required training. They should also monitor and enforce these requirements for people involved in their admission processes.
    • KU SOM response: Interviewer training is mandatory for all admissions interviewers and has remained consistent in its core requirements, even as the delivery mode has modernized. The Admissions Office does monitor annual training and verifies that each interviewer has completed requirements prior to interviewing any applicants. The SOM Admissions Office is in the process of implementing new admissions software that will facilitate standardized tracking of training requirements; anticipated implementation is Summer 2026.
  3. The KU School of Medicine should strengthen its controls over the interview selection phase of the admission process to ensure it is objective. This might include creating a standardized interview invitation formula similar to the KSU College of Veterinary Medicine’s. Alternatively, it might include requiring a concurring reviewer, withholding as much personal information about the applicant as possible, and using a rubric to standardize what file reviewers consider as part of their review.
    • KU SOM response: As confirmed in the audit report, our processes are designed to admit the best applicants who will serve the State of Kansas. Using the feedback from LPA, we will consider how best to refine, document, more clearly articulate, and continually evaluate our processes while continuing to remain consistent with both best practices in medical education and the expectations of the Kansas Legislature and the public. We understand and are committed to ensuring that admissions decisions, especially interview selections, are fair, transparent, and aligned with state priorities.

Agency Response

On March 9, 2026, we provided the draft audit report to the KSU College of Veterinary Medicine and the KU School of Medicine. Their responses are below. CVM and SOM officials generally agreed with our findings and conclusions.

KSU College of Veterinary Medicine Response

To:       Andy Brienzo, Principal Auditor

Kansas Legislative Division of Post Audit

From: Elizabeth Davis, Interim Dean

College of Veterinary Medicine

Re:      Response to Legislative Post Audit report – Reviewing Admissions to the KSU College of Veterinary Medicine and University of Kansas School of Medicine March 2026

The administration from Kansas State University College of Veterinary Medicine appreciates the dedication of time and effort from the Kansas Legislative Division of Post Audit for their review and report on the college admissions process. Upon review of the document, the administration would like to applaud the agency for the comprehensive review of this process. Through this activity and feedback, the college will have the opportunity to review and refine the admissions process. Comments are provided in response to LPA recommendations:

  1. The KSU College of Veterinary Medicine should take steps to eliminate the possible preferences our analyses identified. This could include things like adding additional training to their process to help avoid these types of preferences.

CVM Response:

The College of Veterinary Medicine affirms agreement that preferences should be avoided in all selection processes. The college affirms that increased training will be provided to those contributing to the student interview process.  It is essential that individuals involved with this process are properly trained to eliminate any potential preferences in the selection process.

The college would like to highlight the point that in the LPA report the reviewers acknowledge that each CVM interviewer is required to participate in anti-preference training by stating in the report: “Interviewers were required to complete training on how to use the scoring rubrics and avoid implicit biases.”   

The CVM fully acknowledges the feedback on this review and will increase pre-interview training completion that includes anti-preference content. The CVM is in complete agreement that all individuals completing the interview process must complete available anti-preference training.

  2. The KSU College of Veterinary Medicine should develop processes to consistently track whether reviewers have completed all required training. They should also monitor and enforce these requirements for people involved in their admission processes.

CVM Response:

The CVM agrees that tracking the completion of pre-interview training is essential. The college will continue to track CVM admissions-specific training practices. The college will further expand the training confirmation processes to verify and track that all training is completed by all individuals who contribute to interview activities. Individuals that have not completed the required training will not be permitted to contribute to the interview process until required training is completed. 

  3. The KSU College of Veterinary Medicine should create clear criteria for the “not recommended for admission” designation. It should also modify its use of this designation to ensure it cannot independently negate the rest of the process for any applicant without scrutiny of this outcome.

CVM Response:

The CVM agrees that the review and reporting process for all candidates needs to be clear and consistent.  In addition to the existing rubric that is used for the interview reporting, the college will include additional information pertaining to the classification of “Not recommended for admission” in this rubric. This classification will include the designation for individuals that should not be granted admission into the DVM program. This designation is in alignment with American Veterinary Medical Association Council on Education (COE) accredited US veterinary college admissions processes and is carefully considered by the trained professionals that contribute to the selection process. Criteria that will be included in this rubric will include an expectation of demonstrated professional behavior, communication effectiveness, awareness of veterinary professional obligations, and responsibilities for a veterinarian. This designation is thoroughly scrutinized by the CVM Admissions Committee and is verified by the CVM Assistant Dean of Admissions. The college admits students that will be successful with their education and professional career.

Additional observations

Summary findings include:

  • CVM policy is to invite all academically qualified Kansas residents to the college for an interview.
  • This process review did not find any unintended preferences on how the college selected applicants for an interview.
  • This process review did not find any evidence of unintended preferences regarding which applicants ultimately were offered admission to the DVM program.
  • The reviewers fully acknowledged the use of a multiple step application process and scoring formulas eliminated unintended preferences regarding which applicants were offered admission to the DVM program. 

Again, the college would like to thank and commend the agency on this thorough review process. We value this input and welcome the opportunity to continue to refine and improve the college admissions process.

Respectfully,

Elizabeth Davis, DVM, PhD, DACVIM

Interim Dean and Professor

Kansas State University College of Veterinary Medicine

KU School of Medicine Response

Chris Clarke, Legislative Post Auditor

Legislative Division of Post Audit

800 SW Jackson St, Suite 1200

Topeka, KS 66612-2212

Dear Ms. Clarke:

Thank you for the opportunity to respond to the audit, “Reviewing Admissions to the KSU College of Veterinary Medicine and KU School of Medicine.” We appreciate the professionalism of the audit team throughout the course of the review. The Legislative Post Audit (LPA) report clearly confirmed that our admissions practices are fair and based on merit.

The University of Kansas Medical Center’s School of Medicine strives to make Kansas the healthiest state in the nation through innovative education, cutting-edge research, and advancing the health of communities. The most important thing we do is educate the next generation of physicians. This work is an honor and a privilege, and it comes with enormous responsibilities. One of those responsibilities is selecting the highest quality applicants through a complex, specialized process that is consistent with state and federal law, best practices outlined by the Association of American Medical Colleges (AAMC), and the highest standards promulgated by our accrediting body, the Liaison Committee on Medical Education (LCME).

Our foremost obligation is to help meet Kansas’ healthcare needs by educating physicians who are likely to practice in our state. Selecting applicants with a meaningful connection to Kansas is a central mechanism for fulfilling that mission. As such, our policy is to show preference to applicants with strong ties to Kansas; this is intentional and directly aligned with our public mission as a state-supported medical school.

We are pleased that LPA’s report confirmed our commitment to Kansas residents and the quality of our admissions processes in selecting the best applicants based on their merit and strong ties to the State. We appreciate LPA’s diligence in evaluating our policies and processes and recommending opportunities for improvement. As an institution of higher education with a focus on continuous improvement, we welcome LPA’s recommendations.

Our commitment to serving Kansans by training health care practitioners for the state of Kansas and beyond is unwavering. We remain steadfast in our commitment to admissions practices that are aligned with our mission and with the health needs of our communities.

Sincerely,

Douglas Girod, M.D.

Chancellor

Appendix A – Cited References

This appendix lists the major publications we relied on for this report.

  1. Effectiveness of Implicit Bias Training. Federal Judicial Center.
  2. Investigating the Road to Equity: A Scoping Review of Solutions to Mitigate Implicit Bias in Assessment within Medical Education (2025). Kristin E. Mangalindan, et al.
  3. Is Implicit Bias Training Effective? (September 2021). National Institutes of Health.
  4. Standards for Internal Control in the Federal Government (May 2025). United States Government Accountability Office.
  5. Validated Names for Experimental Studies on Race and Ethnicity (2023). Charles Crabtree, et al.

Appendix B – Factors Used in KSU CVM Analyses

This appendix lists the 19 variables we included in the logistic and linear regression models we created for our analysis of the College of Veterinary Medicine. The data available for our models depends on what CVM collects from applicants and what scores are generated during the review process.

  • Residency: Kansas, out-of-state, international
  • GPA in science prerequisite courses
  • GPA in last 45 credit hours
  • Hours of relevant prior experience
  • Interview performance score
  • Application materials score
  • 1 or more “not recommended for admission” designations
  • Prior degree: bachelor’s, master’s, PhD, none
  • Race: White, Black, Asian, Pacific Islander, American Indian, Multiracial, no response
  • Ethnicity: Hispanic or Latino, non-Hispanic or Latino
  • Gender: male, female, nonbinary, no response
  • Geographic background: urban, rural, suburban, no response
  • Citizenship: U.S. citizen, non-U.S. citizen
  • Age at time of application
  • Military affiliation, including veterans, active duty, and military dependents
  • Pell grant eligibility
  • Employed during college
  • Prior DVM program applicant
  • Interview preference: remote, in-person, no response

Appendix C – Factors Used in KU SOM Analyses

This appendix lists the 17 variables we included in the logistic and linear regression models we created for our analysis of the School of Medicine. The data available for our models depends on what SOM collects from applicants and what scores are generated during the review process.

  • Residency: Kansas residency or ties, no Kansas ties
  • Total GPA
  • GPA in science and math courses
  • Total MCAT score
  • MCAT score in biology sections
  • Interview performance score
  • Overall assessment of suitability for medical school score
  • SOM applicant categorizations for selection committee presentations: weakest applicants, strongest applicants, neither
  • Race: Black, Asian, Pacific Islander, American Indian, Multiracial, no response
  • Ethnicity: Hispanic or Latino, non-Hispanic or Latino
  • Gender: male, female, nonbinary, no response
  • Geographic background: rural, non-rural
  • Family college background: first generation, non-first generation
  • Socioeconomic background: disadvantaged, non-disadvantaged, no response
  • Medically underserved background: medically underserved area, non-medically underserved area
  • Combined MD/PhD program applicant
  • Prior MD program applicant