
Balancing the potentials and pitfalls of AI in college admissions

January 2, 2024

by Ellen Evaristo, University of Southern California

There is not a typical day in an admissions office, according to Ryan Motevalli-Oliner ME '20, associate dean for enrollment operations at Kenyon College in Gambier, Ohio.

As a small private school, Kenyon receives approximately 8,500 applications a year and had a 29% acceptance rate in 2023. Motevalli-Oliner's department processes and imports college applications for review. "We try to stay true to our mission, but also try to make sure we're meeting students where they are and give them the resources that they need to go through this unnecessarily complicated process," says Motevalli-Oliner, who graduated from USC Rossier's Master of Education in Enrollment Management online program (EMP online).

Reviewing applications is a community effort at Kenyon. The college uses both the Common Application and the Coalition Application to gather student admissions materials and begins reviewing applications in mid-November. "We have a holistic review process," Motevalli-Oliner says. "We read everything that a student submits to us." Employing a committee-based evaluation method built around a two-person review, teams read applications every day; one person reviews the applicant's academic side while another examines co-curriculars and recommendations.

This approach contextualizes the prospective student. While there is a growing trend in college admissions toward using artificial intelligence, Kenyon does not employ AI in its process at this point. There is an art and a science to Kenyon's review, according to Motevalli-Oliner. "Synthesizing information with AI, I can see that happening, but I don't think you'll ever take away from the human element," he says.

There are, however, a growing number of colleges and universities using AI to assist admissions offices as they evaluate applicants. Texas A&M University–Commerce and Case Western Reserve University use AI tools such as Sia to quickly process college transcripts, extracting information such as student coursework and transfer credits.

Georgia Tech has been experimenting with AI to replicate admissions decisions using machine learning techniques. The technology allows schools to sift through large data sets, evaluating thousands of applications more efficiently. Theoretically, this gives admissions staff more time to thoughtfully consider other aspects of applicants' submitted materials.

But what's at stake when AI is incorporated into the review process? "It's a complicated matter, and it's not the first time that admissions has considered how to use algorithms or formulas in its processes," says Jerome Lucido, founder of USC Rossier's Center for Enrollment Research, Policy and Practice (CERPP) and former chair of and national presenter for the College Board's Task Force on Admissions in the 21st Century.

While related, there are two distinct tools in the college admissions process: algorithms and machine learning, according to Lucido. A college admissions algorithm is a set of rules or instructions an institution uses to evaluate and select applicants for admission. Colleges and universities often have their own unique admissions processes and evaluate applicants against their own criteria.

Many institutions commonly use a holistic approach that considers a combination of factors, including academic records, standardized test scores, extracurricular activities, recommendation letters and interviews. Machine learning, a subset of AI, is a specific technology that can be used to improve data analysis and decision making.
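To make the distinction concrete, here is a minimal sketch, in Python, of what a rule-based admissions "algorithm" of the kind described above could look like. The factor names, weights and 0-to-1 ratings are hypothetical assumptions invented for illustration; they do not reflect any institution's actual criteria.

# A minimal, hypothetical rule-based scoring sketch. The factors mirror the
# ones named above; the weights and ratings are illustrative only.

WEIGHTS = {
    "academic_record": 0.40,
    "test_scores": 0.20,
    "extracurriculars": 0.15,
    "recommendations": 0.15,
    "interview": 0.10,
}

def composite_score(ratings: dict) -> float:
    """Combine normalized (0-1) factor ratings into one weighted score."""
    return sum(WEIGHTS[factor] * ratings.get(factor, 0.0) for factor in WEIGHTS)

applicant = {
    "academic_record": 0.90,
    "test_scores": 0.80,
    "extracurriculars": 0.70,
    "recommendations": 0.85,
    "interview": 0.75,
}
print(composite_score(applicant))  # one weighted number summarizing the file

The point of the sketch is only that a fixed rule set like this is transparent but rigid; machine learning, by contrast, derives its patterns from data.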

According to researchers at the USC Viterbi School of Engineering's Information Sciences Institute, machines are taught to behave, react and respond similarly to humans using collected data. As it applies to college admissions, machine learning combined with admissions algorithms could streamline the process, identify patterns and form predictions based on historical data.
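A hedged sketch of that pattern, assuming scikit-learn and a handful of made-up historical records, might look like the following. The features, labels and example applicant are invented for illustration; a real deployment would involve far more data, validation and, as the admissions officers quoted here stress, human review.

# Illustrative only: fit a simple model on hypothetical past decisions and
# score a new file. Nothing here reflects any institution's real data or process.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical historical records: [GPA, normalized test score, activity count]
X_history = np.array([
    [3.9, 0.92, 4],
    [3.2, 0.61, 2],
    [3.7, 0.80, 5],
    [2.9, 0.55, 1],
    [3.8, 0.88, 3],
    [3.1, 0.58, 2],
])
y_history = np.array([1, 0, 1, 0, 1, 0])  # 1 = admitted in a past cycle

model = LogisticRegression().fit(X_history, y_history)

# The model surfaces a probability; a human reader still weighs the whole file.
new_applicant = np.array([[3.6, 0.75, 3]])
print(model.predict_proba(new_applicant)[0, 1])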

This data-driven approach could potentially help universities identify candidates who possess the characteristics the institution associates with academic success. In a joint statement from the Association for Institutional Research (AIR), EDUCAUSE and the National Association of College and University Business Officers (NACUBO), the organizations endorsed the use of data to help better understand students.

Data also lays the groundwork for developing innovative approaches to student recruiting. However, relying too heavily on quantitative data poses a challenge. AI is efficient at processing data, yes, but it may not capture a student's complete life story, full potential or unique qualities.

For instance, factors like personal challenges, resilience and growth might not be reflected in the data, which could lead to missed opportunities for students who have overcome obstacles.

"Many large public flagships and certainly selective privates were already well down a path that wasn't being called AI," says Don Hossler, senior scholar at CERPP. "They were building in algorithms that help them screen students." The use of AI in the screening process, Hossler says, is really the next natural extension.

For students applying to college, AI's role in admissions initially seems promising, offering several benefits. For example, chatbots, or automated live chats, become pseudo-customer service representatives, providing instant assistance during the application process, answering common questions, offering personalized guidance based on the student's profile and even setting deadline reminders.

It is also important to recognize chatbots' limitations. While useful for routine queries, chatbots may not replace human interaction, especially for complex issues or the emotional support some applicants may require. A balanced approach would combine chatbot assistance with human support from college admissions staff and counselors to ensure a successful and positive application experience for students.
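As a toy illustration of the routine-query side of that balance, the sketch below matches a student's question against a small FAQ by simple word overlap and hands off to a person when no stored answer fits. The questions, answers, matching rule and threshold are all assumptions made up for this example; production chatbots are considerably more sophisticated.

# A deliberately simple FAQ matcher: pick the stored question sharing the most
# words with the user's query, or refer to a human below a small threshold.

FAQ = {
    "when is the application deadline": "The regular decision deadline is posted on the admissions site.",
    "how do i send my transcript": "Transcripts are submitted through your application portal.",
    "is an interview required": "Interviews are optional for most applicants.",
}

def answer(query: str, min_overlap: int = 2) -> str:
    words = set(query.lower().split())
    best_q = max(FAQ, key=lambda q: len(words & set(q.split())))
    if len(words & set(best_q.split())) < min_overlap:
        return "Let me connect you with an admissions counselor."
    return FAQ[best_q]

print(answer("When is the deadline for my application?"))  # matches the FAQ
print(answer("Can you review my essay draft?"))  # falls back to a human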

On the flip side, students are turning to generative AI technology to help them pull together their applications, including using ChatGPT to write their personal essays—the one area of the process where applicants can show universities who they truly are. AI, with its near-humanlike responses, may sound appealing, but it calls academic integrity into question.

Will university admissions offices be able to determine whether an essay was written by a human? "The sad part of that, on the student's side, will be that it may reduce the extent to which they think through the application process on their own," Hossler says. An essay prompt from this year's Common Application asks students to "Recount a time when you faced a challenge, setback, or failure. How did it affect you, and what did you learn from the experience?" An AI-generated response to the prompt would not result in a genuine student answer. However, one benefit for students using a tool like ChatGPT during the drafting stage is that it offers a forum to try out ideas or to formulate arguments.

According to Rick Clark, Georgia Tech's assistant vice provost and executive director of undergraduate admission, AI could act as a sounding board for students who cannot afford an admissions consultant. "Will they use it? Probably. Will we be able to decipher it? Probably not, to be honest," Motevalli-Oliner says. "It's a resource, but at the end of the day, you're going to have to write that essay yourself."

While the essay is one of the most important parts of the review, it's not the only consideration. Kedra Ishop, vice president for enrollment management at USC, sees this next phase as another evolutionary step in admissions.

"We navigate at different levels, at different kinds of institutions," says Ishop. A 25 year higher education veteran and nationally recognized expert, she leads the university's admissions, financial aid and registration functions. "In the admissions space, we always have a sense of healthy, positive skepticism, and we seek more information to know more about the student," she says.

Ishop adds that admissions officers are adept at triangulation during the review process. Through triangulation, admissions professionals identify correlations within an application, looking to see if a student's voice is consistent throughout and ensuring that recommendations align. Admissions officers seek multiple sources of data on each student for that reason.

Ishop acknowledges that various individuals—parents, guardians, teachers or educational consultants—often assist and play a role in assembling admissions materials with students. "We'll see this year in particular what comes from [AI]," says Ishop. "We're not panicked about it." As with any new technological development, she is aware that it is something the admissions team will have to steer through, and she expects that the student's voice will prevail.

Amid the landscape of the U.S. Supreme Court decision on race-blind admissions, the implementation of AI in college admissions has raised equity concerns. On the one hand, these tools can help institutions identify applicants who might have been overlooked through traditional processes; on the other, there are valid concerns about bias.

Can AI learn biases? Bias can seep into the system in a variety of ways. For example, AI systems learn to make decisions from data that may include biased human decisions or may reflect flawed sampling in which certain groups are underrepresented. If not carefully designed and monitored, AI systems could conceivably perpetuate existing biases in the admissions process.

"We know from [UCLA internet studies scholar] Safiya Noble's work and that of many others that technological innovations like Google search engines are often baked with biases that can reproduce inequities," says Royel Johnson, USC Rossier associate professor. "AI is no different.

It's people who design and inform the algorithms, curate the data and make the decisions that shape these systems." This could disproportionately disadvantage certain groups, leading to inequitable results. AI systems may also unintentionally favor applicants who have financial resources to hire college consultants, which could create a class divide and widen the education gap.

According to Hossler, affluent students are likely working with private counselors who tell applicants what they need to say or write rather than acting as an open editor for their applications.

Lucido, an outspoken expert on the affirmative action decision, is cautiously optimistic. "I want to keep an open mind about what this sort of machine learning can do to assist admissions and equity," Lucido adds. "But everything I know about college admissions and how it's done suggests that even currently, we don't have a highly equitable system, particularly in the most selective places."

The most important element of the review is reading in context, according to Ishop. Whether it is AI learning, neighborhood or socioeconomic bias, "our process is designed to read within that environmental context," she says.

Considering information such as an applicant's socioeconomic background and the educational opportunities available at a student's high school—several AP courses at one school versus only a few courses offered at another—provides context for the admissions team. How higher education institutions address equity and AI will require a multifaceted approach.

No system is perfect, and human involvement is still needed. Colleges and universities should invest in training admissions professionals to work with AI tools and carefully assess the recommendations provided by these systems. "You have to have mission directed people and highly trained people to understand how this works," says Lucido.

According to a PricewaterhouseCoopers report, individuals write the algorithms, select the data used by the algorithms and decide how to apply the results. Without diverse teams and rigorous testing of the AI systems created, there is a chance that individual biases may enter the AI. How do you change that? A diverse admissions staff may be one way. Collecting and using data that accurately reflect the backgrounds, experiences and achievements of a range of applicants could also mitigate biases present in historical data and improve an algorithm's ability to identify the potential in all students.

Oversight, monitoring and adjustment of AI systems are needed when they are applied to college admissions. "It's an open question as to how much oversight can and will be given if these systems are used," Lucido says. Regular assessments of AI's impact on equity, combined with improvements, can help address biases and flaws.
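One modest form such a regular assessment could take, sketched here under assumed data, is a periodic check of how often an automated screen recommends applicants from different groups, with large gaps flagged for human review. The group labels, sample records and tolerance are hypothetical; real equity audits rely on richer metrics and institutional context.

# Compare recommendation rates across hypothetical applicant groups and flag
# gaps larger than a chosen tolerance for human review. Illustrative only.

from collections import defaultdict

def recommendation_rates(records):
    """records: iterable of (group_label, was_recommended) pairs."""
    totals, positives = defaultdict(int), defaultdict(int)
    for group, recommended in records:
        totals[group] += 1
        positives[group] += int(recommended)
    return {group: positives[group] / totals[group] for group in totals}

audit_sample = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

rates = recommendation_rates(audit_sample)
gap = max(rates.values()) - min(rates.values())
print(rates)
if gap > 0.20:  # tolerance chosen arbitrarily for this sketch
    print("Gap exceeds tolerance; route these screening results for human review.")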

"Certainly, there are enormous benefits of AI, but we must also be clear about the risks," Johnson adds. "Overreliance without conscientious efforts to mitigate bias will surely exacerbate the very inequalities we seek to address. AI is only as just as the equitable decisions that inform its design." For Liana Hsu ME '20, director of admissions at UC Berkeley Graduate School of Journalism and a graduate of USC Rossier's EMP online program, day to day work in the admissions office differs.

Berkeley's admissions team is focused on holistically supporting prospective students who are interested in learning about and applying to the Master of Journalism program. This work includes designing an equity-centric admissions review process. "We are continually in the midst of evaluating our admissions processes to understand how we are serving our students," Hsu says.

"I want to really understand how we can close the gaps for students to better support them and to think about how we strategically use our resources." AI does not currently play a role in the school's review process. "We want to hear from the students' voices directly—their full lived experiences and how that's shaped their passion for journalism.

These are not intricacies that AI can provide," Hsu says. Hsu sees potential AI benefits both on the university and applicant sides. Colleges could use AI to explore and fine tune marketing and outreach efforts, and candidates could utilize it as a search compilation tool to help them find funding and scholarships, particularly for graduate education.

"Hopefully, there are more conversations," Hsu says. "I think it's important for higher education institutions to always adapt and, in particular, always think about how we use new technologies to increase accessibility, advance educational equity, and leverage them as a tool to empower students." Provided by University of Southern California.