At last year’s Super Bowl, the fourth quarter was a real nail-biter. But while most viewers were worried about the score, I was stewing over a late-in-the-game Microsoft commercial.
The ad gave an eerie glimpse into the emerging shape of generative AI companions. In it, a series of mostly young actors appear with an air of gritty determination. To the tune of “Watch Me” by The Phantoms, a string of platitudes appears on screen: “They say I will never open my business… or get my degree… they say I will never make my movie… or build something… they say I am too old to learn something new… too young to change the world… but I say: watch me.”
As the song crescendos, a user asks her Microsoft Copilot, “Can you help me?” The Copilot obliges: “Yes, I can.”
The “my bot and me against the world” vibe is creepy and depressing. But it’s also emblematic of the times we live in. AI is emerging against a backdrop of young people feeling left to fend for themselves.
Microsoft isn’t the only company selling companion bots to help young people beat the odds. A host of edtech tools designed to help students navigate their education and careers are starting to follow suit.
In a new report out this month, “Navigation and Guidance in the Age of AI,” Anna Arsenault and I explore the various ways bots are being built to provide students with more on-demand information, guidance, and support. We interviewed leaders and advisors at 30 tech companies and hybrid advising organizations spanning the education-to-career continuum.
The more leaders we spoke to, the more an unspoken tension in this market began to emerge: will AI be built to scale access to human help or simply to help more students help themselves? The answer doesn’t just depend on the technology but on the degree to which we treat building a career as a lone pursuit.
A Promising Solution to Chronic Gaps
AI could be a game-changer in democratizing access to college and career guidance. Today, punishing student-to-advisor ratios mean student support significantly lags behind need. On average, there’s one guidance counselor for every 385 high school students. For every one career services employee, there’s an average of 2,263 college students. And while students may turn to other staff and faculty for academic and career support, only half of graduates report access to a mentor.
With looming budget cuts, those numbers are likely to move in the wrong direction, leaving far too many students looking for colleges and jobs on their own.
If scarcity is the mother of innovation, then it’s hardly surprising to see chatbots beginning to fill those gaps.
The range of tools on the market is quite broad, as is the technology powering them. Some tools, like Mainstay, pre-date ChatGPT and are built on machine learning and pre-scripted decision trees designed to nudge students to meet with their advisors and complete time-sensitive administrative tasks. Other tools built on generative AI, such as Axio and CollegeVine, are more elaborate. They’re designed to get to know students over time, offering a constant companion to help them explore and pursue their interests, purpose, and potential.
The variety reflects the breadth and complexity of navigation and guidance itself. It’s a catch-all market for a dynamic—and overwhelming—mix of information, advice, care, reflection, exposure, exploration, nudging, and decision-making support.
Across that mix, leaders and entrepreneurs are grappling with where bots versus humans can excel.
The Blurring of Social and Parasocial Support
We set out to understand whether technology was being built to scale on-demand guidance or human connection. For most leaders we spoke with, the answer was a resounding “both”—we’re entering an era where students should ideally benefit from a smart blend of bot and human support.
To get more specific, we asked leaders what types of support bots versus humans should ideally offer to students in their education and career journeys. Nearly all said bots are optimal for informational support (which we defined as accurate information about education and careers), while humans are better for esteem, emotional, and motivational support (defined as expressing belief in students, addressing social-emotional challenges, and co-constructing a vision of the future).
But despite the common refrain that bots should perform technical tasks so humans can perform “human” ones, some leaders admitted that the distinction is getting much blurrier. When we looked under the hood, over half of respondents said bots should ideally provide some esteem, emotional, or motivational support. And the vast majority agreed that their bots were engineered with an emotionally supportive tone.
“We made all kinds of charts early on about what the human versus the bot would do, and it felt a lot cleaner,” says Kanya Balakrishna, co-founder and CEO of Future Coach, which is releasing its new tool this year. “But we’re finding it’s a little bit messier than that. The bot is already able to help with some of the things that felt like they were going to be inherently human.”
In other words, the more helpful bots become, the more the field will have to grapple with how much students can and should rely on human help.
That trend presents a paradox: in the age of AI, navigation and guidance tools could result in students gaining more access to resources and support but less access to social and professional networks.
Nurturing Networks in the Age of AI
If students spend more time engaging with guidance bots, will they spend less time building human connections? That’s an empirical question, but one where market signals can foreshadow where we’re headed.
Perhaps the most disturbing takeaway from Microsoft’s Super Bowl ad was the idea that young people will accomplish things in spite of, rather than with the help of, other people. There’s little to suggest that’s actually how the world works. An estimated half of internships and jobs come through personal connections. Students with access to mentors are significantly more likely to graduate feeling prepared for life after school. Young people who have an adult encouraging them to pursue their goals are more than twice as likely as those without one to have a promising future.
Based on our interviews, however, the market hasn’t caught up to that research. Navigation and guidance tools are largely optimized around delivering information and supports that lead students through school and toward jobs. It’s harder to build a sustainable business model around technology that scales the opportunities and connections that actually get students jobs or support job mobility.
Technological advancements and market failures are only part of the story. Students’ evolving preferences matter, too. Many leaders told us that bots can draw some students out in ways humans can’t. “Sometimes the kids don’t say everything to us. Even if we ask the question, they won’t say it to us, but they’ll type it to a chatbot,” says Adia Adams, an advisor with College Advising Corps. Several leaders also noted that students preferred the on-demand convenience of bots.
While swapping one-off conversations from a human to a bot may sound trivial, even loose human connections are crucial to unlocking opportunities. In fact, research has long shown that we are more likely to get jobs through our “weak-tie networks”—more distant connections and acquaintances—than through our strong ties with close friends and family. If students turn to bots, rather than people, to explore careers or seek out advice, the outer rings of students’ networks could suffer, in turn shrinking their inroads and options in the labor market.
Weak ties are particularly susceptible to disruption because a well-trained bot can make you feel more supported than a distant human connection. Past research suggests that one-sided or “parasocial” relationships can be perceived as fulfilling emotional needs better than human weak-tie relationships. That leaves an opening in the market. Young people will be tempted to bank more on bots than on one-off human interactions, over time shrinking their networks in favor of digital surrogates.
The irony is that while those bots could indeed provide useful information, support, and even a sense of connection, they can’t actually open doors to opportunities the way our human weak-tie networks do. In short, bots could inadvertently chip away at the very opportunities they’re being engineered to unlock.
Some leaders we interviewed were hopeful that things could take a different turn if schools and colleges repurpose staff to support networking.
“Technology can always provide you with ‘here are all the people that you could potentially talk to.’ But AI can’t give you that warm introduction in a way that a human can, and it just feels different,” says Christine Cruzvergara of Handshake, which piloted its chatbot Coco last year. “That is fundamentally a huge value proposition for career centers as they work to stay relevant alongside gen AI.”
Cruzvergara and a number of the other leaders we interviewed envisioned a future where advisors and coaches step away from transactional support and become connectors and brokers on behalf of students. Other leaders were optimistic that bots could teach students to better mobilize networks on their own. For example, Coach, a bot powered by CareerVillage.org, is engineered to help students learn how to arrange and prepare for informational interviews, take stock of their networks, ask for references, and follow up after interviews.
“AI is helping unlock these opportunities for dynamic, interactive learning and role-play of these absolutely fundamental but often hidden soft skills,” says Jared Chung, the organization’s executive director.
Scaling Self-Efficacy or Perpetuating Rugged Individualism?
Despite these hopes, the fragile fate of students’ networks in an AI-enabled future tells a deeper story about the philosophy behind navigation and guidance solutions.
By the time our interviews wrapped up, a specter of rugged individualism had emerged, even though we never used the word self-help. No one outright discounted the value of human relationships. But there seemed to be an unspoken assumption that if we give students more resources and supports, they will be better able to take care of themselves. That AI could fundamentally build a better compass, but not necessarily a bigger Rolodex.
In theory, that may not be a bad goal: the other side of the self-help coin is student agency. Many entrepreneurs in the space believe that technology will give more individuals the skills and mindsets to build their own futures. A well-trained bot can outcompete the other forms of self-help students may already be engaging in. With more precise and personalized information at their fingertips, students won’t have to rely on word of mouth, meandering Google searches, or infrequent guidance to gain a picture of their prospective colleges’ ROI or their career possibilities. That sense of empowerment might be the most powerful unlock that AI could scale, regardless of the human resources that school systems can, or in most cases, can’t offer.
However, that underlying philosophy could morph into architecting a system based on lone pursuits rather than collective help. In practice, I worry that the virtues of self-help—like empowerment, agency, and self-determination—are colliding with the realities of the market where bots are likely to become a replacement, rather than a supplement, to hard-to-fund human help.
Even if AI systems are built to encourage students to meet with their advisors or engage in other help-seeking behaviors, there’s still the challenge of ensuring there’s somebody there to answer the call. AI can’t fix that.
In other words, if we want to address very real opportunity gaps, we have to ask: are we building systems that don’t just teach students to fish but also stock the pond?
The “pond” should be filled with opportunities and networks. That’s much harder for tech tools to pull off. Pond-stocking is grueling, often underfunded work.
Teaching students how to fish is noble work with a long runway for innovation. But approaches that stop there are unlikely to unlock the transformative potential of AI. They won’t level the playing field of opportunity. They will run up against the inconvenient truth that Microsoft hid from football fans: you can’t build a business, get a degree, or change the world alone.
You also shouldn’t have to.
Julia Freeland Fisher is director of education research at the Clayton Christensen Institute and author of the book “Who You Know: Unlocking Innovations That Expand Students’ Networks.”
