
The AI Skills Shift: Where Should Companies Really Be Investing?

The shelf life of a professional skill is getting shorter every year. Enterprise leaders are confronting a fundamental question: traditional L&D cycles, with their annual programs and fixed curricula, are built for a world that no longer exists. Five CHROs and founders tell NervNow what the rethink actually looks like from the inside.

Editorial note: This piece presents direct inputs from five senior enterprise leaders across technology, HR, and cybersecurity. Responses have been edited for length and house style. Views expressed are those of the individual contributors and do not represent NervNow’s editorial position.

AI has not merely accelerated the pace of change in the skills economy. It has rewritten the underlying logic of what a critical skill is. Capabilities that commanded premium salaries 24 months ago are now being replicated, augmented, or outright replaced by tools that update every quarter. Enterprise L&D leaders are running into a structural problem: the old infrastructure of annual cycles, fixed curricula, and credential-led outcomes was built for a world that updated in years, and AI has compressed that to quarters. NervNow spoke with five senior leaders across technology, cybersecurity, HR, and workforce innovation to find out what the rethink actually looks like.

When the Definition of a Critical Skill Changes Overnight

A few years ago, ShepHertz Technologies hired for deep domain expertise and tool-specific proficiency. Today, the company looks for something quite different. Siddhartha Chandurkar, CEO and Founder of ShepHertz, said the organization now prioritizes adaptability, AI fluency, and the ability to translate human judgment into prompts and verification loops. Hard technical skills, particularly in coding, data analysis, and content generation, have a shelf life of roughly 12 to 18 months, given how quickly AI co-pilots evolve.

What has grown in value is harder to put on a job description: critical thinking, ethics, customer empathy, and cross-functional storytelling. Building wAnywhere, ShepHertz’s AI-powered productivity platform, made this shift concrete, because workflows now demand AI literacy alongside human intuition. As Chandurkar put it, “The half-life of skills has shortened, but the half-life of curiosity has only grown.”

“The half-life of skills has shortened, but the half-life of curiosity has only grown.”
Siddhartha Chandurkar, CEO & Founder, ShepHertz Technologies

Chaitra Vedullapalli, Co-Founder of Women in Cloud, framed the shift in similarly sharp terms. The organization has moved from training people in jobs to training people for workflow innovation and adaptability itself. The half-life of technical skills used to be five years; now it sits at 18 months. What is counterintuitive, she noted, is that Women in Cloud is not training more as a result. It is building workflows smarter, embedding adaptability into how work is designed rather than running harder to keep the training current.

Govind Rammurthy, CEO and Managing Director of eScan, identified a structural problem the other leaders touched on but did not name as directly. AI has automated away precisely the tasks that used to train junior employees: writing basic CRUD operations, generating boilerplate code, producing first-draft documentation.

These were entry-level assignments that taught fundamentals while delivering value. The skills they developed are not obsolete in an abstract sense, but the pathway through which people acquired them has compressed. The gradual skill-building that used to unfold over years is no longer guaranteed.

What has not become obsolete, Rammurthy noted, and likely will not, are the skills AI genuinely struggles with: understanding customer requirements buried in ambiguous requests, debugging complex systems where multiple variables interact unpredictably, making architectural decisions that balance competing priorities, and recognizing when technically correct solutions fail business reality tests. The critical skill of the AI era, in his view, is judgment.

Aditya Tiwari, Regional HR Manager for APAC at Kerakoll Group, offered a perspective that pushed back on the pace-of-obsolescence framing entirely. Skills are becoming hyper-customized and role-specific, shaped by how humans work alongside AI, and the bigger shift is in how skills are described rather than how fast they disappear.

“Skills are not becoming obsolete faster,” he said. “Relevance now depends on learning agility.” Curiosity, critical thinking, problem framing, judgment, and ethical decision-making are increasingly valuable and unlikely to become irrelevant. Storytelling, stakeholder influence, contextual thinking, and emotional intelligence matter more as AI takes over routine execution.

“Skills are not becoming obsolete faster. Relevance now depends on learning agility.”
Aditya Tiwari, Regional HR Manager APAC, Kerakoll Group

Manish Mohta, Founder of Learning Spiral, reinforced this from a learning design perspective. Technical skills have a shorter lifespan than before, and what was central to roles three to five years ago may no longer apply today, but skills do not actually disappear. They continue to evolve. The organizations he works with are paying much more attention to building learning agility as an organizational characteristic, creating structures that allow employees to identify, develop, and refresh their capabilities continuously rather than periodically.

Where the Smart Money Is Actually Going

The leaders NervNow spoke with are watching different things, responding to different pressure points, and making distinctly different bets on where capability investment actually compounds.

Vedullapalli framed the choice as a question of leverage. Imagine a $1 million L&D budget, she said. You could teach everyone Python, which is what most organizations are doing. Or you could teach people to ask the right questions to optimize and innovate workflows, which is what Women in Cloud is doing.

The second option multiplies leverage. The team does not need to be data scientists. They need to know enough to ask smart questions and spot bad answers. The rule she has landed on: invest in skills that multiply the value of AI, not skills AI replaces. That means decision-making, judgment, end-to-end workflow activation, and cross-functional navigation.

Chandurkar described an investment thesis anchored to three reference points: where AI is creating new value, where existing roles are being augmented, and where customer expectations are shifting fastest. ShepHertz listens closely to product roadmaps, customer feedback, and the kinds of unfamiliar problems its engineers run into, all of which surface where capability gaps exist.

The organization also tracks internal mobility. When employees are repeatedly stuck because they lack adjacent skills, that becomes an investment priority. Quick experiments, like one-week sprints with new AI tools and live insights from wAnywhere on how teams actually work, have proven more informative than annual training plans. The indicator Chandurkar uses to confirm an investment is working: learning translates into faster shipping.

Rammurthy at eScan invests where the gap between what AI can do and what the business requires is widest, and where that gap is unlikely to close quickly. Currently, that means three areas: security architecture requiring deep domain expertise that AI tools lack, customer-facing communication where trust and relationship matter more than technical correctness, and operational decision-making where context and judgment outperform pattern matching.

One of the more important distinctions eScan has drawn is between skills that need formal training and skills people acquire through exposure. AI tool usage generally does not require extensive L&D investment, because people learn by using tools daily. What requires formal training is knowing when not to use AI, recognizing AI-generated errors, and developing the kind of judgment automation cannot replicate. Those meta-skills around AI usage, Rammurthy argued, prove more valuable than teaching specific AI tools that will change next quarter anyway.

Tiwari’s approach at Kerakoll centers on identifying learning debt and strengthening behavioral capabilities rather than chasing every new technical skill. The priority is understanding how work is evolving and where people are falling behind in mindset, decision-making, and ways of working.

The indicators that guide his choices include recurring performance gaps, resistance to change, declining adaptability, and roles where AI is augmenting work but human judgment and collaboration remain critical. This leads to investment in behavioral upskilling: learning agility, curiosity, problem-solving, critical thinking, and change readiness, with AI literacy embedded into everyday learning rather than treated as a separate program.

Mohta’s approach at Learning Spiral aligns learning objectives directly with corporate strategy, whether around innovation, digital transformation, or operational efficiency, and establishes priority areas through skills gap analyses, performance assessment, and feedback from senior business leaders. The broader shift he has observed across clients is a move from broad-based training to more individualized, role-specific learning interventions with measurable results.

“Invest in skills that multiply the value of AI, not skills AI replaces. Your team does not need to be data scientists. They need to know enough to ask smart questions and spot bad answers.”
Chaitra Vedullapalli, Co-Founder, Women in Cloud
Five Voices, Five Takeaways
Siddhartha Chandurkar, CEO & Founder, ShepHertz Technologies

Peer-led learning circles, small groups sharing AI experiments weekly, outperformed every formal training program ShepHertz ran. Adults learn best when learning is contextual and immediately applicable.

Chaitra Vedullapalli, Co-Founder, Women in Cloud

A weekly session called “What We Broke and Learned” changed the culture more than any training program. When capable people publicly share what failed and why, fear drops, curiosity rises, and the whole organization moves faster.

Govind Rammurthy, CEO & Managing Director, eScan

eScan separated tool evolution from capability evolution. When a new AI tool launches, no training is required; people experiment on their own. Formal investment goes only to genuine capability changes, because that distinction helps employees know what actually needs to be mastered.

Aditya Tiwari, Regional HR Manager APAC, Kerakoll Group

The goal at Kerakoll is to move away from a culture of constant catching-up to one of supported, paced development, where learning feels enabling rather than exhausting. Clarity on what matters now versus what can wait matters more than volume of training.

Manish Mohta, Founder, Learning Spiral

Manager involvement matters as much as the learning content itself. The organizations making progress treat managers not as administrators of training programs, but as active supporters helping teams apply and retain what they have learned.

The L&D Confusion: Tools vs. People, and Who Gets It Wrong

On the question of investment balance, the leaders NervNow spoke with reached a collective verdict: most organizations are misallocating rather than misjudging the amount. The money is flowing, but it is flowing to the wrong places.

Vedullapalli made the arithmetic explicit. Companies are spending roughly 80 percent of their AI budgets on tools (new software, platforms, and consulting) and only 20 percent on people. Her view is that ratio should be inverted.

Tool investment is a depreciating asset, obsolete in 18 months. People investment appreciates: the team gets smarter with each cycle. The organizations that bought tools without building thinking, she said, will be exposed in the next economic downturn. It always happens that way.

Chandurkar described the same pattern in different terms. Most companies are investing too heavily in tools and too little in mindset. Buying licenses or running a single prompt engineering workshop creates an illusion of progress without changing how teams actually work.

The real gap is in cultivating an AI-first mindset, the capacity to reimagine workflows rather than simply optimize them. At ShepHertz, the team separates quick-cycle skill enablement, which covers tool training, weekly hackathons, and sandbox environments, from deeper capability building around systems thinking, ethics, and customer-centric design. Near-term skills are treated as muscle memory and durable capabilities as judgment. Both matter, but skipping the second is what causes companies to plateau regardless of how many tools they buy.

Rammurthy pointed to something visible in eScan’s own hiring process. Candidates arrive knowing ChatGPT, Copilot, Claude, and a dozen other AI tools, but struggle to explain when those tools produce incorrect output or why certain problems should not be delegated to AI at all. Tool familiarity has been commoditized. Judgment remains scarce.

The companies getting this right, he observed, are not teaching AI skills as a separate category. They are teaching their core business with AI integration embedded, which builds both immediate productivity and long-term capability at the same time.

Tiwari offered the most measured framing of the group. AI skill investment is largely a function of organizational affordability and business context. Companies with deep pockets can afford to invest aggressively and experiment. Those with capital constraints are understandably more selective, and both approaches can be valid if they are aligned to business priorities.

In a VUCA and BANI environment (volatile, uncertain, complex, ambiguous; brittle, anxious, nonlinear, incomprehensible), most skills should be viewed through a short-term lens with frequent recalibration, because AI-related requirements are evolving too fast for rigid, long-term skill bets. That said, a small set of capabilities must still be built for the long term: learning agility, critical thinking, problem framing, adaptability, and ethical judgment, because these are what allow people to continuously acquire new AI skills as demands change.

His balance is pragmatic: invest tactically in AI skills that solve immediate business problems, and build enduring human capabilities that allow the organization to keep pace with constant change. The real differentiator, he said, is not how much companies invest, but how dynamically they adjust those investments as the landscape shifts.

Mohta described a layered pathway at Learning Spiral: initially building an understanding of AI, then providing job-specific training to apply that knowledge, then developing more advanced strategic capabilities. The structure allows the organization to meet present business requirements while preparing the workforce for what comes next.

“Tool investment is a depreciating asset, obsolete in 18 months. People investment is an appreciating one. The ratio should be inverted.”
Chaitra Vedullapalli, Co-Founder, Women in Cloud

Experiments That Paid Off, and the Ones That Did Not

The final question produced the most candid answers of the conversation, partly because it asked for failure as well as success, and partly because the catch-up problem is one every leader here is actively managing rather than considering solved.

Vedullapalli described a cultural intervention rather than a training program. Most companies hide their AI failures. A manager experiments with a tool, it fails, and they move on quietly. The psychological effect, she said, is corrosive: everyone else in the organization comes to believe that AI works for everyone except them.

Women in Cloud started a weekly 30-minute session called “What We Broke and Learned,” where someone shares a failed experiment and the team debates it together. When intelligent, capable people publicly say “this AI implementation failed because,” it reframes the relationship with experimentation. Fear drops, curiosity rises, and velocity accelerates, because failure becomes visible and resolved rather than hidden and repeated.

Chandurkar’s biggest win at ShepHertz has been peer-led learning circles: small groups where engineers, product managers, and non-technical staff share AI experiments weekly. They outperformed every formal training program the company ran.

Building talentanywhere.ai reinforced this view: adults learn best when learning is contextual and immediately applicable. What has not worked is content-heavy LMS courses, because people simply skim them. Insights from wAnywhere help the team see where workflows slow down and where data security risks emerge, which allows L&D to be directed where it genuinely matters.

To prevent fatigue, ShepHertz has normalized “not knowing” as a cultural value, encouraging people to share what they are figuring out, not only what they have already mastered. “The goal is not to keep up with every AI release,” Chandurkar said. “The goal is to build resilient learners who can pivot when the next wave hits.”

“The goal is not to keep up with every AI release. The goal is to build resilient learners who can pivot when the next wave hits.”
Siddhartha Chandurkar, CEO & Founder, ShepHertz Technologies

Rammurthy at eScan identified an investment that was completely unsexy but worked: systematic documentation of how the company actually works, not how it wishes it worked. Every team documented their current workflows, decision criteria, and quality standards, and then worked through where AI tools could augment each step.

It worked because it started from operational reality rather than imposing abstract AI training on top of it. What failed, by contrast, was expensive AI certification programs that taught tools generically without connecting to eScan’s actual work.

People completed certifications, earned credentials, and then could not apply anything, because the training covered AI in isolation rather than AI within cybersecurity specifically. As Rammurthy put it, context matters more than credentials.

To prevent the catch-up feeling, eScan has separated tool evolution from capability evolution. When a new AI tool launches, no training is required: people experiment independently. When the security approach changes or customer requirements shift, that gets formal L&D investment, because it represents genuine capability change rather than a product update. This distinction helps employees differentiate between what is nice to know and what needs to be mastered.

Tiwari at Kerakoll found that bite-sized, context-driven learning embedded into real work (short learning nudges, peer learning circles, and problem-based projects supported by AI tools) has worked considerably better than large, generic AI upskilling programs rolled out without role relevance. The latter tend to leave employees overwhelmed, disengaged, or unsure how the learning connects to their day-to-day responsibilities.

To prevent the feeling of constant pressure to keep up, Kerakoll has focused on clarity over volume: being explicit about what skills matter now versus what can wait, providing guided learning paths rather than endless catalogs, and ensuring strong manager involvement so that teams can prioritize effectively. The goal, as Tiwari described it, is to move away from a culture of constant catching-up to one of supported, paced development where learning feels enabling rather than exhausting.

Mohta at Learning Spiral described a comparable outcome. Bite-sized continuous learning through combinations of microlearning modules and direct workforce application has outperformed traditional long-format training programs lacking clear context and follow-up. Employees engage more when learning is flexible, relevant, and immediately applicable.

The structural response to the catch-up problem has been formalized learning pathways with structured routes, regular communication channels throughout the journey, and active manager involvement: not just as administrators of learning, but as supporters helping teams apply and retain what they have learned.

“Context matters more than credentials. People earned certifications and then could not apply anything, because the training covered AI in isolation rather than AI within cybersecurity specifically.”
Govind Rammurthy, CEO & MD, eScan

Across all five conversations, the pattern that holds is this: the organizations making real progress on workforce learning in the AI era are not chasing the pace of tool releases. They are building the conditions under which their people can learn continuously, fail without shame, and exercise judgment in precisely the areas AI cannot cover. That is a culture problem as much as a curriculum problem, and the leaders who are getting it right are treating it as one.

Key Takeaways

What this conversation actually adds up to

  • 01 Most companies are spending on the wrong thing. Tools go obsolete in 18 months. The people who know how to think with AI, ask the right questions, and recognize bad output are the asset that compounds.
  • 02 The skills that matter now are the ones AI cannot do: judgment, context, and knowing when not to use the tool at all. Generic AI certifications are not building these. Domain-specific, applied learning is.
  • 03 What works in L&D is small, specific, and applied immediately. Long programs without role relevance or follow-through do not transfer. The format that keeps appearing: peer learning, short nudges, real problems.
  • 04 The feeling of constantly falling behind is a culture problem as much as a training problem. Organizations that make failure visible, clarify what actually needs to be learned now versus later, and involve managers in application, do not feel like they are always catching up.
Ojasvi Nath
