

Contributed by Casey Werth, Energy Leader, IBM Technology
The electric utilities industry has reached a pivotal stage in its artificial intelligence (AI) journey. As customer focus shifts in favor of reliability, grid resiliency, and affordability, operators must transition from experimentation to implementation. At the utilities-focused annual conference, DISTRIBUTECH (DTECH) 2025, we’re publishing a new study from the IBM Institute for Business Value revealing that while most utilities have a grid modernization strategy, 21% report no progress. The key to catching up lies in applying AI technology to drive business value where it matters most.
To satisfy rapidly growing demand, energy providers must enhance stakeholder communication while ensuring they continue to deliver. This means using data to extract more value from existing assets, applying new technologies to increase visibility and automation, and allaying the data security fears that have slowed progress thus far.
AI is already here
That’s not to say that utilities aren’t using AI. The technology has delivered significant efficiency gains in recent years through initiatives such as AI-driven remote sensing, weather modeling, and vegetation management programs.
While none of these initiatives yet use large language models (LLMs), they are already providing tangible business value. As generative AI (gen AI) continues to develop, however, we’re at the threshold of a new era of innovation. So where should utility operators focus their strategies for scaling AI?
A vast majority of global energy and utility companies say they’re ready to embrace AI or already have, according to new insights gathered by IBM. But as IBM’s global energy industry general manager and a GridWise Alliance board member, I’d caution that AI adoption comes with prerequisites for utilities. Establishing governance structures, for both systems and people, is critical for successful implementation.
Focus on operational value
Assets in the field are where utilities deploy capital and how they deliver service to customers. The closer to that value pool we can apply AI and LLMs, the greater the impact we’ll see, and the easier it will be to quantify.
A chat function is often the easiest place for operators to apply gen AI. However, while chat is great for accessibility and ease of communication, the value of the work it replaces or reduces is relatively limited. My advice is to start by identifying the capital assets where a step change in how you operate them would yield the biggest financial gain.
Applying LLMs across the asset management process
I am particularly excited about the potential for foundation models to help optimize asset management workflows. Field service technician notes, for example, are notoriously hard to use in analytical modeling. But what if you had a model that could review the tickets that have been generated and create work plans? Or, better still, optimize those plans based on what we know about the assets? Fine-tuned LLMs can analyze and classify historical ticket data to identify causes of failure and anticipate future issues. This language-driven approach can optimize work orders, improve asset management, and integrate into product suites, providing significant, quantifiable value.
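To make the idea concrete, here is a minimal sketch of classifying technician ticket notes by failure cause. A simple TF-IDF plus logistic-regression pipeline stands in for a fine-tuned LLM, and the ticket texts and failure labels are invented for illustration, not drawn from any real utility's data:

```python
# Sketch: routing ticket notes to a failure cause so work orders can be
# prioritized. A lightweight classifier stands in for a fine-tuned LLM;
# all tickets and labels below are hypothetical examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical historical tickets with known failure causes
tickets = [
    "transformer overheating, oil level low, cooling fan seized",
    "recloser misoperation after lightning strike on feeder",
    "tree limb contact tripped breaker on lateral",
    "oil leak at transformer bushing, gauge reading low",
    "surge arrester damaged by lightning, replaced unit",
    "vegetation grow-in caused momentary fault on span 14",
]
causes = ["cooling", "lightning", "vegetation",
          "cooling", "lightning", "vegetation"]

# Vectorize the free text and fit a classifier on the labeled history
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(tickets, causes)

# Classify a new ticket so it can feed the right work plan
new_ticket = ["fan failure led to transformer oil temperature alarm"]
predicted_cause = model.predict(new_ticket)[0]
print(predicted_cause)
```

In practice, an LLM fine-tuned on years of ticket history would handle far messier notes and many more failure modes than this toy example, but the workflow is the same: learn from labeled historical tickets, then classify incoming ones to drive the work plan.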
Training geospatial and weather AI foundation models
Moving a step further, we’re very excited about a current project between IBM Research, clients, and partners, including Linux Foundation Energy and national labs, to build a foundation model that understands an electric grid. Just as LLMs are trained to fill in missing words, we’ve applied the same methodology to weather and geospatial data, which has helped build models for weather forecasting and flood risk analysis. The aim is to apply this approach to grid data to create a model that understands what a grid segment looks like and can fill in missing parameters. This could revolutionize asset management and risk analysis in the power distribution sector.
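The "fill in missing parameters" idea can be illustrated with a toy tabular analogue: hide one attribute of a grid segment and predict it from the others. The synthetic segment records, column names, and the simple least-squares fill-in below are all invented for illustration; a real grid foundation model would use transformer-style masked pretraining on far richer data:

```python
# Toy analogue of masked modeling on grid data: mask one parameter of a
# segment and reconstruct it from the visible ones. Data is synthetic.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "grid segments": [conductor_length_km, load_kw, impedance_ohm],
# with impedance roughly proportional to length (a made-up relationship).
length = rng.uniform(1, 10, size=200)
load = rng.uniform(50, 500, size=200)
impedance = 0.4 * length + rng.normal(0, 0.05, size=200)
segments = np.column_stack([length, load, impedance])

# "Mask" the impedance column and learn to reconstruct it from the rest,
# the tabular equivalent of filling in a missing word.
X = segments[:, :2]                        # visible parameters
y = segments[:, 2]                         # masked parameter
X1 = np.column_stack([X, np.ones(len(X))]) # add an intercept column
coef, *_ = np.linalg.lstsq(X1, y, rcond=None)

# Fill in the missing parameter for a new segment of known length and load
new_segment = np.array([5.0, 220.0, 1.0])
predicted_impedance = new_segment @ coef
```

The foundation-model version replaces this single linear fill-in with a network that learns the joint structure of many grid parameters at once, so any subset of missing values can be imputed for state estimation or risk analysis.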
Future-proofing our grids
As loads increase, maintaining an accurate grid connectivity model and optimizing power delivery becomes increasingly challenging. Developing a model that understands these dynamics could help identify blind spots and improve state estimation, co-simulations, and interconnection studies.
Contingency analysis and planning is another exciting area. Imagine a model that can simulate scenarios like a major failure in gas lines. Such a tool could help mitigate worst-case scenarios and inform discussions with boards or regulators. Our partners in the renewable development space believe such a model could reduce simulations from overnight to nearly real-time, enhancing decision-making and scenario planning.
Defining your AI vision and strategy
With all this AI potential out there, it’s important to decide whether you want to take a leadership position or are more comfortable with fast-follower status. Considerations include your comfort with SaaS and sensitive data, cultural adoption, and acceptance of missteps as an important part of the learning process.
Next, work with your IT leadership and technical teams to pick the right place to get started. Assemble a small, knowledgeable team that understands both the software and the process, identifying where technology can enhance operations and deliver measurable benefits.
The focus should be on material improvements, not incremental time savings. If you take a key process from eight hours to 25 minutes because you injected the technology in the right place, that’s a huge win. That’s what you take to the board and your constituents, demonstrating that you’re fundamentally changing the way you do business and delivering value for all.
If you’re heading to DISTRIBUTECH 2025 please do drop by our booth #6326 and say hello!