According to EY Americas Consulting’s CTO, Jason Noel, the workforce doesn’t necessarily need to upskill for AI. “This idea of up-skilling the entire workforce to use AI,” he told Business Insider, “I think it’s kind of silly.” That runs counter to what we’ve been hearing for the past few years, but it highlights a practical challenge many companies are dealing with today: how much do workers actually need to know about artificial intelligence to make use of it?
Upskilling has become a go-to solution for AI-human adaptation, shorthand for helping workers build new skills as technology evolves. But the term has taken on a broad meaning. Does it mean learning to code? Understanding how machine learning works? Or just getting comfortable with a new interface?
The answer depends on the job, the company, and how AI is being used. Not every worker may need to understand how AI functions under the hood. “They just know that they have a screen and an application that says, ‘Here’s how much stuff you have now of this,’ and ‘Here’s how many you have coming inbound,’” Noel said. “They don’t need to know how the technology works.” Put simply, AI literacy isn’t about the technical details; it’s about learning how to use the tools.
EY, like other consultancies, is helping companies adopt AI by creating tools tailored to specific roles, where the machine offers suggestions and the human makes the final call. For example, AI can help cruise ship staff predict guest behavior based on weather and onboard activity, suggesting changes to staffing or supplies. The person in charge then decides whether to follow through. That’s not traditional upskilling: it’s getting comfortable with a new interface. “The AI will turn around and list out and build out that process automatically,” Noel said. “The human in the loop says, ‘Okay, that makes sense,’ or ‘I want to change this piece.’”
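To make that division of labor concrete, here is a minimal, hypothetical sketch of the human-in-the-loop pattern Noel describes: the software produces a recommendation, and a person accepts or adjusts it before anything changes. Every name, number, and heuristic below is invented for illustration; this is not EY’s tooling, and a real system would rely on an actual predictive model rather than the toy rule used here.

```python
from dataclasses import dataclass


@dataclass
class StaffingSuggestion:
    """A machine-generated recommendation awaiting human review."""
    area: str
    recommended_staff: int
    rationale: str


def suggest_staffing(expected_guests: int, rain_forecast: bool) -> StaffingSuggestion:
    """Toy heuristic standing in for a predictive model: shift more staff
    indoors when rain is expected and guest traffic is high."""
    base = max(2, expected_guests // 50)
    staff = (base + 2) if rain_forecast else base
    area = "indoor lounge" if rain_forecast else "pool deck"
    return StaffingSuggestion(
        area, staff, f"{expected_guests} guests expected, rain={rain_forecast}"
    )


def human_review(s: StaffingSuggestion) -> StaffingSuggestion:
    """The human in the loop accepts the suggestion or overrides the number."""
    print(f"Suggestion: {s.recommended_staff} staff at {s.area} ({s.rationale})")
    answer = input("Accept? [y / new number]: ").strip()
    if answer.isdigit():
        s.recommended_staff = int(answer)
    return s


if __name__ == "__main__":
    final = human_review(suggest_staffing(expected_guests=600, rain_forecast=True))
    print(f"Final plan: {final.recommended_staff} staff at {final.area}")
```

The point of the pattern sits in human_review: the model’s output is only a draft until someone signs off, which is the kind of interaction a worker can learn without knowing anything about the model behind it.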
Still, the idea that upskilling isn’t needed doesn’t apply to every situation. McKinsey’s Ben Ellencweig notes that AI literacy falls on a spectrum, from basic awareness to more advanced skills, and many companies are still figuring out where their people stand. “What we forget is usually the weakest link is us as humans,” he said. That’s where the knowledge gap shows up: between what workers need to know to use AI well and what they currently understand.
Some jobs will require more than just clicking through screens. Data analysts may need to become data engineers. HR teams using AI in hiring might need guidance on how to interpret algorithmic recommendations fairly. In these cases, companies are creating skill maps to define learning paths and qualifications needed to transition into new roles.
Context plays a big role. Digital-first companies are often more prepared for this shift. Others, especially those with older systems or siloed teams, may need to rethink how work gets done. As Melissa Swift of Anthrome Insight put it, “There’s probably an overemphasis on training and an underemphasis on actually using the technology.” In short, hands-on experience matters more than theoretical instruction.
At MIT’s Sloan CIO Symposium, experts highlighted the importance of a human-centered approach to AI. Workers tend to use AI to offload routine tasks, freeing them up to focus on what humans do best: applying judgment, showing empathy, and offering insight. Research by Isabella Loaiza at MIT Sloan found that jobs supported by AI were more likely to be held by moderately skilled workers: a shift from past automation trends that mainly favored top performers. Seen this way, AI could help level the playing field. “Work is becoming more human,” Loaiza said.
That leads to another question: if AI makes strong performance more accessible, does it narrow the gap between high and low performers? Not necessarily. While AI might raise the baseline, those with a deeper understanding (who know how and why the model produces a result) may still stand out. IBM has pointed out that having a grasp of how generative AI, machine learning, or natural language processing works can be an advantage. Even basic knowledge can help workers spot problems or write better prompts.
This is leading to two different skill tracks. One group learns just enough to use the tools. The other digs deeper, learning how AI works and how to apply it more thoughtfully. Over time, that second group could be the ones driving innovation and leading teams, even if the technology is handling more of the routine work.
But companies can’t put the entire burden on workers. The most effective upskilling programs, whether for store employees or finance professionals, are built around real tasks and supported by leadership. Some partner with universities. Others set up in-house learning programs. Companies like Liberty Mutual start with an internal review: where do things stand, and what needs to change?
It’s also important to recognize that AI is evolving quickly, and training can become outdated fast. “Skills have a shelf life,” Loaiza said. That’s why building a culture where people can experiment (test things out, learn from mistakes) is more valuable than any single training module. The best programs leave space for trial and error.
So, is it silly to upskill the whole workforce? Not if “upskilling” means giving people the tools to work alongside AI, rather than expecting everyone to become an expert.








