From Metadata to AI Agents: Highlights from Gartner D&A 2025
You’re sitting at your desk right now, aren’t you? A slight furrow in your brow as you scroll through your inbox, wondering if you’ve missed something important in the flood of AI announcements that seem to drop daily.
That nagging feeling that somewhere, somehow, your competitors might be finding the secret sauce to implementing AI at scale while you’re still struggling with data quality issues.
You’re not alone. In fact, the Gartner Data & Analytics Summit 2025 is proof of just how common that experience is for data leaders today.
This year’s Summit was equal parts therapy session, wake-up call, and roadmap for data leaders trying to navigate the most transformative period in our industry’s history.
So, let’s explore the key insights from this year’s Gartner D&A, because these are likely to affect your data and AI strategy for 2025 and beyond.
“Are You Okay?”: The Opening Keynote That Hit Home
The conference kickoff from Gartner VP Analysts Gareth Herschel and Carlie Idoine began with a surprisingly empathetic question: “Are you okay?”
And the room fell silent. They had tapped into the collective anxiety data leaders have been feeling as the ‘AI tsunami’ hits their businesses.
The keynote, “Scale Data and Analytics on Your AI Journeys,” acknowledged the perfect storm many CDAOs and data leaders are facing: high D&A ambitions, scattered AI pilots, and the growing pressure to deliver consistent execution rather than just more experiments.
The statistics they shared painted a stark picture of this collective challenge:
- 50% of CEOs believe AI will transform their industry in the next three years, yet data availability and quality remain the #1 obstacle to implementation
- 49% of organizations report that demonstrating the value of AI is their top barrier to adoption
- 54% of companies say their current CDAO is the first person to hold the role
- Over half of organizations have serious issues with their data quality and readiness
- “Businesses not understanding D&A” remains a top-three challenge for data & analytics leaders
- Data quality sits as the top risk responsibility for D&A leaders
What struck me most was the insight about data culture being the primary roadblock to analytics success. When only 1 in 5 organizations consistently use their data for AI use cases, it’s clear that people need data that speaks their language, not more training on how to speak “data.”
Three Parallel Journeys for AI Success
The presenters outlined three parallel journeys that successful organizations are navigating simultaneously:
- Journey to business outcomes (Trust = Value)
- Journey to data & analytics capabilities (Adaptability = Scale)
- Journey to behavioral change (People = Transform)
This is exactly the approach we’ve taken at illumex with our Generative Semantic Fabric (GSF) – bringing data to people in their natural business language through trustworthy self-serve analytics, rather than forcing technical literacy.
The session also emphasized turning passive metadata into active metadata. This shift is exactly what enables trustworthy self-service analytics, where business users can confidently access insights without technical expertise. Which brings me to the next highlight.
Mark Beyer’s Active Metadata Revolution
On day two, Mark Beyer delivered what one might call the “rock star session” of the conference. His presentation on metadata management triggered a collective realization throughout the room.
“Stop complaining you don’t have enough metadata — you have enough, you just don’t use it,” he challenged the audience.
The traditional passive approach to metadata, those dusty data dictionaries and static catalogs, is fundamentally flawed. The wall between structured and unstructured data is entirely artificial, and active metadata techniques can bridge these worlds. In fact, active metadata is the secret ingredient that can take you from a tech stack to a trust stack.
The statistics he presented were eye-opening:
- 70% of organizations remain trapped in basic, passive metadata inventorying
- Only 12% have implemented active metadata analysis
- 64% use metadata governance solutions, but few employ truly AI-enabled governance
Stop Calling It “Data About Data”
Metadata isn’t just “data about data.” It’s the complete nervous system of your business, capturing billions of observability points that reveal how your organization actually operates. The metadata from a single business report, from distribution lists to sentiment analysis to usage patterns, contains more contextual richness than most companies manually catalog in a month.
The companies winning with AI aren’t the ones with more data. They’re the ones transforming passive metadata catalogs into living, breathing intelligence systems that show in real time the contextual truth of how their organizations work and use data.
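To make the passive-to-active distinction concrete, here is a minimal Python sketch. The class name, fields, and the `trust_signal` heuristic are my own illustrative assumptions, not illumex’s or Gartner’s model: a passive catalog entry stores only a description, while an active one also observes usage and derives signals from it.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class ActiveMetadataEntry:
    """Catalog entry enriched with behavioral signals, not just a static description."""
    name: str
    description: str
    access_log: Counter = field(default_factory=Counter)  # user -> access count

    def record_access(self, user: str) -> None:
        self.access_log[user] += 1

    def trust_signal(self) -> float:
        """Toy heuristic: broad, repeated usage suggests a trusted asset."""
        distinct_users = len(self.access_log)
        total_accesses = sum(self.access_log.values())
        return distinct_users * (total_accesses ** 0.5)

# Passive metadata stops at the description; active metadata also
# watches how the asset is actually used across the organization.
revenue_report = ActiveMetadataEntry("q3_revenue", "Quarterly revenue by region")
for user in ["ana", "ben", "ana", "chi"]:
    revenue_report.record_access(user)
```

A real active-metadata platform would of course ingest query logs, lineage, and access patterns at scale; the point of the sketch is only that the intelligence comes from observed behavior, not from writing more documentation.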
For those of us building AI-powered enterprises, the message couldn’t be clearer:
The path to trustworthy, production-ready agentic AI isn’t through more prompting hacks or fine-tuning. It’s through a semantic foundation that makes AI truly intelligent.
This is precisely why we designed illumex with active metadata at its core.
Our Generative Semantic Fabric (GSF) learns how your organization uses data (without ever touching it directly), automatically mapping, reconciling semantics, and adding the human context that maximizes the return on your AI investments. It continuously and automatically turns passive information into an active intelligence system that makes trustworthy, inherently governed AI a reality.
AI Agent Reality Check and the Iron Man Blueprint
If Beyer’s session was the wake-up call, Ben Yan’s presentation on AI agents painted the vision of where we’re heading – and faster than most of us had realized.
Yan compared ideal AI implementation to Tony Stark and JARVIS (hello, Marvel fans), highlighting that true AI value comes from augmentation, not replacement. It’s about Iron Man and JARVIS working together, not JARVIS alone.
He outlined how AI systems are evolving from current models that require strong supervision to future systems characterized by:
- Autonomous operation
- Non-deterministic task flows
- Adaptable behavior
- Decentralized architecture
Yan sounded a warning about organizational technical debt. Companies that are creating mountains of isolated, non-interoperable AI agents with minimal reuse potential are setting themselves up for future integration nightmares.
Sound familiar? We’ve all watched as isolated AI implementations pile up across departments, each one a siloed island of potential. These disconnected systems are the opposite of what made JARVIS so powerful.
Yan noted that future AI success depends less on individual models and more on creating persistent environments where AI agents can learn and operate effectively. He also highlighted how the most successful implementations maintain strategic human oversight while allowing AI to handle increasingly complex tasks autonomously.
This is exactly why we built illumex’s Generative Semantic Fabric: to create the perfect “playground” where humans and AI agents can interact with each other, with applications, and with data, effortlessly.
GSF eliminates the looming technical debt long before it cripples your AI strategy, while providing you with trustworthy, inherently governed Agentic Analytics to power your business decisions.
Keep Track Of Your AI Implementation Costs
If you thought implementing Agentic AI was just about technology challenges, Arun Chandrasekaran’s session on cost optimization was the cold shower of reality many needed.
His projections were sobering:
- Through 2028, 50% of GenAI projects will exceed budgets due to poor architectural choices and lack of operational know-how
- Model inference costs will consume 70% of total model lifetime costs through 2028
Looking around, I saw people mentally recalculating their budget forecasts. The collective nods across the room confirmed that these hidden costs of agentic and generative AI are finally being acknowledged industry-wide.
Chandrasekaran outlined 10 best practices across three pillars for optimizing GenAI costs:
Robust Architecture:
1) Be objective about accuracy, performance, and cost trade-offs
2) Create a model garden to promote choice and transparency
3) Balance upfront and operational costs in model augmentation
4) Understand self-hosting trade-offs
Efficient Model Operations:
5) Embrace guided prompt design to reduce costs
6) Implement caching for LLM responses
7) Automate model selection and routing
8) Institute FinOps governance for consumption
Effective Change Management:
9) Educate users on cost-effective practices
10) Continuously analyze both visible and hidden costs
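Practice 6 above, caching LLM responses, is one of the simplest of these to sketch. The snippet below is an illustrative exact-match cache, not a production pattern from Chandrasekaran’s talk; `LLMResponseCache` and `fake_llm` are hypothetical names, and real systems often add semantic (similarity-based) caching on top of this.

```python
import hashlib
import json

class LLMResponseCache:
    """Exact-match cache: identical (model, prompt) pairs skip a paid inference call."""
    def __init__(self):
        self._store = {}
        self.hits = 0
        self.misses = 0

    def _key(self, model: str, prompt: str) -> str:
        payload = json.dumps({"model": model, "prompt": prompt}, sort_keys=True)
        return hashlib.sha256(payload.encode()).hexdigest()

    def complete(self, model: str, prompt: str, call_llm) -> str:
        key = self._key(model, prompt)
        if key in self._store:
            self.hits += 1          # cached: no inference cost
            return self._store[key]
        self.misses += 1
        response = call_llm(model, prompt)  # the expensive inference call
        self._store[key] = response
        return response

# Stub standing in for a real (billable) inference endpoint.
def fake_llm(model: str, prompt: str) -> str:
    return f"answer to: {prompt}"

cache = LLMResponseCache()
cache.complete("model-x", "Define active metadata", fake_llm)  # miss: pays for inference
cache.complete("model-x", "Define active metadata", fake_llm)  # hit: free
```

Given the projection that inference will dominate model lifetime costs, even this naive cache directly attacks the biggest line item: every hit is an inference call you never pay for.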
Illumex addresses these cost concerns head-on by automating context and reasoning while reducing token costs by up to 80% – directly tackling the inference cost challenge Chandrasekaran had highlighted.
By eliminating the need for expensive (and inefficient) model customization techniques like RAG (Retrieval-Augmented Generation) or fine-tuning while still delivering context-aware responses, we’ve built cost efficiency into the architecture itself, not as an afterthought.
If you’re curious about the true ‘cost of success’ of your AI implementation, we’ve built a TCO calculator that reveals all hidden costs in less than 2 minutes.
Analytics Are Evolving From Dashboards to Decisions
Rita Sallam’s session on “The Future of Analytics in the Era of Generative AI” highlighted how analytics is experiencing a monumental transformation.
According to Gartner research, 74% of IT and business leaders believe GenAI accelerates the speed of decision-making, while 65% believe that it leads to better decision outcomes.
Sallam introduced the concept of “perceptive analytics” that uses LLM-powered reasoning and AI agents to achieve proactive, contextual, outcome-driven decision-making. She projected that by 2027, augmented analytics capabilities will evolve into autonomous analytics platforms that fully manage and execute 20% of business processes.
This evolution from dashboards to decision platforms completely changes how organizations derive value from their data investments. This is a shift we’ve been anticipating and addressing with illumex’s decision-centric approach to AI.
By deploying hallucination-free, governed agentic analytics on top of a semantically rich foundation, we’re enabling precisely the kind of perceptive analytics that Sallam described, where the system doesn’t just show data but actively helps understand it in the right business context for smarter business decisions.
The Power of Knowledge Graphs
Maryam Hassanlou’s session on graph analytics and knowledge graphs highlighted a critical but often overlooked aspect of AI success: the relationship between data elements.
Her presentation emphasized the importance of bridging the gap between data and AI by using knowledge graphs, which provide high-quality, semantically rich data that enhances model validity and accuracy.
The statistics were sobering:
- Nearly 55% of organizations struggle to make their data AI-ready
- Many organizations lack foundational data management practices
Yet the opportunity is enormous. By 2026, graph technologies will be used in 80% of data and analytics innovations, up from 10% in 2021, enabling better decision-making across enterprises.
Knowledge graph technology is a core component of illumex’s GSF. We combine semantic vector embeddings with knowledge graphs to create a comprehensive foundation for AI initiatives, capturing both the data itself and the relationships between data elements. Together with unique domain-specific ontologies, this provides the rich context that AI agents need to deliver accurate, hallucination-free responses.
The Common Thread – Context is Everything
As I connected the dots between sessions, a pattern emerged: Context is the critical currency of modern data strategies.
Whether it was:
- Mark Beyer talking about metadata becoming the contextual nervous system of organizations
- Ben Yan demonstrating how AI agents need environmental context to operate effectively
- Arun Chandrasekaran explaining how lack of context awareness drives up costs
- Rita Sallam discussing the next (AI-powered) stage in the evolution of analytics
- Maryam Hassanlou highlighting how knowledge graphs provide semantic context for AI
The message was consistent: isolated data points, disconnected models, and siloed data are the enemies of effective AI implementation and maximum agentic re-use for different workflows.
This is precisely why we built illumex.
While other vendors at the conference were focusing on individual components of the AI puzzle, we’ve taken a holistic approach that addresses the context (and reasoning) challenge head-on.
What This Means for Your Data and AI Strategy
As I packed up to leave Orlando, I couldn’t help but reflect on where all this leaves us as data and analytics leaders. Here are my top takeaways for navigating the next 12-18 months:
- Active Metadata: Stop treating metadata as documentation and start treating it as intelligence. What matters isn’t having more metadata but activating what you already have.
- Environment Over Models: Focus less on individual AI models and more on creating persistent environments where AI agents can learn from organizational context for maximum re-use.
- AI Cost Transparency: Implement mechanisms for tracking both the visible and hidden costs of AI implementations, with particular attention to inference costs.
- Semantic Ontology Foundation: Invest in solutions that create semantic coherence across disparate data sources without requiring massive migration projects.
- Decision Intelligence: Begin the transition from data-driven to decision-centric thinking, focusing on the decisions that matter most to your business.
The Semantic Cornerstone for AI Success
If I had to distill the entire Gartner D&A Summit 2025 into a single insight, it would be this: The future belongs to organizations that build an automated semantic foundation for their AI initiatives.
This isn’t just about labeled data or knowledge graphs in isolation. It’s about creating a comprehensive semantic fabric that automatically captures the unique language, relationships, and context of your business, making your data speak the way your employees do.
This is exactly what we’ve built with illumex’s Generative Semantic Fabric. By automatically translating structured data into meaningful, context-rich business language (with governance built in), we’ve created a base for trustworthy AI that can truly transform how organizations make decisions.
In a world where 49% of organizations struggle to demonstrate AI value, the ability to automatically activate metadata, create context-aware AI agents, and deliver hallucination-free responses through single-source-of-truth ontologies is a business imperative.
As you return to your desk (yes, the one where you started reading this article, possibly with that same furrowed brow), remember that the path forward doesn’t have to be as complicated as most vendors make it seem.
With the right semantic infrastructure, you can navigate the three journeys Gartner outlined – to business outcomes, capabilities, and behavioral change – all at once.
Because at the end of the day, data doesn’t make decisions, people do. And empowering people with trustworthy, context-aware agentic AI isn’t merely a vision for the future; it’s what we’re delivering with illumex today.