Why AI has shifted the balance of what data is really worth
Data has always had value. Yet unlike oil, copper, or even wheat, there is no international exchange rate for data and no universally accepted measure to determine how much a business should pay for a unit of information. A kilobyte in Singapore is not priced against a terabyte in Tokyo, nor does Jakarta operate a futures market for streaming datasets. Data pricing remains unregulated, decentralised and, crucially, subjective.
But with the rise of artificial intelligence, particularly generative and agentic systems, the very notion of data value has been recalibrated. AI has transformed information from a corporate by-product into the essential feedstock of enterprise strategy. The implications are vast for technology vendors, CFOs, and regulators alike.
AI has elevated data from a passive asset to a strategic currency that underpins competitiveness. Vendors now compete to offer unified platforms spanning data, analytics, and AI. The greatest challenge for firms is not volume, but the governance, usability and sovereignty of their data.
From analytics to AI-first data platforms
In previous decades, vendors emphasised the raw power of analytics. Today, that is no longer enough. Businesses want proof that platforms can serve a triad of data, analytics and AI. Cloudera’s CEO Charles Sansbury put it bluntly: most corporate data stacks are a mess. Firms cannot simply “move everything to the cloud” and expect coherent AI outcomes. Instead, private AI deployments, running inside company-controlled environments, are becoming a pragmatic middle ground.
Sansbury points to data sovereignty as one of the biggest hurdles. Borders and identity management rules make data residency a political as well as technical issue. And as Informatica’s $8 billion sale to Salesforce demonstrated, investors have woken up to the gravitational pull of managed, high-quality data.
Clean data or fast rubbish
Salesforce’s Muralidhar Krishnaprasad makes a sharp point: AI without trustworthy data produces “super-fast rubbish”. Agentic AI needs not just quantity but contextual, clean and unified datasets. A fragmented information landscape leads to duplication, bias and poor outcomes. The recalibration, then, is not about piling up petabytes but about ensuring enterprises build platforms that unify structured and unstructured data alike.
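What does “unified” look like in practice? A minimal, illustrative Python sketch follows; the sources and field names are hypothetical. It folds a structured CRM row and an unstructured support ticket into one canonical record shape and deduplicates on a stable key, the groundwork an agent needs before its answers stop being super-fast rubbish.

```python
# A minimal sketch of unifying structured and unstructured sources into
# one canonical record; sources and field names are hypothetical.
from dataclasses import dataclass

@dataclass(frozen=True)
class Record:
    customer_id: str
    channel: str  # provenance of the data
    text: str     # unified free-text context an AI agent can consume

def from_crm(row: dict) -> Record:
    # Structured source: flatten known fields into readable context.
    return Record(row["id"], "crm", f"{row['name']} ({row['segment']})")

def from_ticket(ticket: dict) -> Record:
    # Unstructured source: keep the raw text, tagged with provenance.
    return Record(ticket["customer_id"], "support", ticket["body"].strip())

records = [
    from_crm({"id": "c1", "name": "Acme", "segment": "enterprise"}),
    from_ticket({"customer_id": "c1", "body": " Billing issue on invoice 42. "}),
]

# Deduplicate per (customer, channel): one version of the truth per source.
unified = {(r.customer_id, r.channel): r for r in records}
print(sorted(unified))  # [('c1', 'crm'), ('c1', 'support')]
```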
Hyperscalers and the strategic imperative
Google Cloud’s Yasmeen Ahmad casts data in almost economic terms. It is not a passive ledger entry, but the fuel of AI. Her focus is on “dark data”: the vast majority of enterprise information, hidden in images, videos and documents and invisible to conventional analytics. AI, she argues, acts as a universal translator that makes this information usable, giving firms a richer tapestry to analyse. The opportunity is vast, but so are the risks. If AI can surface all data, firms must double down on permissions, identity governance and compliance.
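As a back-of-envelope illustration, the sketch below (the directory path and format buckets are assumptions) inventories a file tree by bytes per bucket, a crude way to gauge how much of an estate is dark before pointing an AI extraction step at it.

```python
# A rough inventory of "dark" versus structured data in a file tree.
# The extension buckets are illustrative assumptions, not a standard.
from pathlib import Path
from collections import Counter

STRUCTURED = {".csv", ".parquet", ".json"}
DARK = {".pdf", ".docx", ".png", ".jpg", ".mp4"}

def audit(root: str) -> Counter:
    sizes = Counter()
    for path in Path(root).rglob("*"):
        if path.is_file():
            ext = path.suffix.lower()
            bucket = ("structured" if ext in STRUCTURED
                      else "dark" if ext in DARK
                      else "other")
            sizes[bucket] += path.stat().st_size  # bytes per bucket
    return sizes

print(audit("."))  # e.g. Counter({'dark': 512000, 'structured': 2048, ...})
```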
Meanwhile, Databricks CTO Dael Williamson notes that data is not recorded on balance sheets, despite investors valuing it heavily. AI is forcing CFOs to treat information as an asset class. That requires new architectures where operational and analytical data coexist, eliminating fragile pipelines and feeding models with the freshest information. Starbucks’ use of AI to enhance supply chain visibility is one example of what that looks like in practice.
Usability is the differentiator
Governance, scale and security all matter, but what distinguishes one vendor from another is usability. The platforms winning attention are those that juggle streaming data, machine learning and deep analytics within one metadata-driven framework. Gartner now places these competencies at the heart of its assessments, with hybrid cloud flexibility a non-negotiable. From Cloudera’s data lakehouses to Snowflake’s Iceberg tables and Google’s BigLake, the market is coalescing around those who can merge legacy reliability with AI-era agility.
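To make that convergence tangible at the table level, here is a minimal sketch using Apache Iceberg via Spark SQL. It assumes a Spark session already configured with an Iceberg catalog (for example, the iceberg-spark-runtime package); the catalog and table names are hypothetical.

```python
# A minimal lakehouse sketch: one Iceberg table serving streaming writers,
# ML jobs and BI queries alike. Assumes an Iceberg-enabled Spark session.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("lakehouse-sketch").getOrCreate()

# Hidden partitioning (days(event_time)) keeps physical layout a metadata
# concern, so consumers never hard-code partition columns into queries.
spark.sql("""
    CREATE TABLE IF NOT EXISTS demo.events (
        event_id   BIGINT,
        user_id    STRING,
        event_time TIMESTAMP,
        payload    STRING
    ) USING iceberg
    PARTITIONED BY (days(event_time))
""")

# Governance hooks come with the format: every commit is a snapshot,
# so lineage and rollback are queries rather than custom pipeline code.
spark.sql("SELECT snapshot_id, committed_at FROM demo.events.snapshots").show()
```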
What next for data value in the AI era
The recalibration is not finished. Enterprises will judge vendors on their ability to reduce operational costs, simplify infrastructure, and manage data sovereignty while still enabling AI agility. In short, firms need platforms that are both trustworthy and experimental, balancing governance with innovation.
For Asia, where regulatory frameworks differ dramatically across borders, the challenge is acute. Yet the opportunity is greater still. If AI has taught us one thing, it is that the value of data is no longer in how much you own but in how intelligently you activate it.
So the next time someone says data is the new oil, perhaps we should correct them: with AI, data is becoming the new currency. The real question is, how much is yours worth?