Analytics, AI, and data usability: The new frontier for enterprises

By some estimates, businesses worldwide generate an astonishing 175 zettabytes of data. A zettabyte is one billion terabytes. To put that into perspective, even at 400 Gb/s, a single high-speed fiber optic line transmitting continuously would need roughly 111,000 years to move 175 zettabytes.
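That figure is easy to sanity-check. Here is a minimal Python sketch of the arithmetic, using only the numbers quoted above:

```python
# Back-of-envelope: pushing 175 ZB through one 400 Gb/s link, nonstop.
ZETTABYTE_BYTES = 10**21
total_bits = 175 * ZETTABYTE_BYTES * 8      # data volume in bits
link_bps = 400 * 10**9                      # 400 Gb/s line rate
years = total_bits / link_bps / (365.25 * 24 * 3600)
print(f"{years:,.0f} years")                # ~111,000 years
```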
While enterprises are drowning in this deluge of data, they are starving for specific insights.
Despite billions poured into AI and analytics, most organizations still struggle to turn raw data into business value. The problem isn’t a lack of technology, but a lack of data usability. AI is only as good as the data that feeds it, and for most enterprises, that data is fragmented, outdated, or locked in silos.
The future of analytics is shifting. It’s no longer about having the biggest datasets or the most advanced AI models. It’s about real-time, structured, and accessible data pipelines that power true intelligence. And the companies that master this will dominate the next era of enterprise decision-making.
In this episode of The Effortless Podcast, Dheeraj Pandey (Co-Founder and CEO of DevRev, former CEO of Nutanix) and Amit Prakash (Co-Founder and CTO of ThoughtSpot, former engineer at Google and Microsoft) break down the state of analytics. They dive deep into why Snowflake and Databricks are fighting to own enterprise data, how SaaS 2.0 is reshaping analytics with built-in intelligence, and why the real AI winners will be the companies that focus on better data usability.
AI’s silent killer: How poor data stifles intelligence
[Data] is like blood. It’s a supply of blood and if you don’t have that supply of blood that goes to the brain, there is no thinking, there’s no intelligence.
This “cheesy” analogy, as Dheeraj humbly puts it, is anything but hyperbolic. Just as the brain cannot function without a rich, continuous supply of oxygenated blood, AI cannot operate without clean, structured, and accessible data.
Amit immediately agrees with Dheeraj’s analogy but adds another dimension: “It also connects data to the heart.”
If data is the blood and AI models the organs, the heart represents the data infrastructure—pipelines, storage, and transformations that keep the entire system alive. Without this well-functioning “heart,” AI fails.
Consider this: Every major AI advancement—be it LLMs like GPT, real-time fraud detection systems, or recommendation engines—relies on underlying data pipelines. Yet, most enterprises misunderstand or neglect this connection. They chase AI without fixing the foundational data infrastructure, resulting in unreliable outputs and biased decisions.
Bad data leads to bad AI. Incomplete or poorly structured data corrupts models, leading to dangerous consequences. A healthcare AI system skewed toward one demographic risks misdiagnosis for others. A financial AI model, trained on inconsistent transaction data, may fail to detect fraud.
The real issue here is this: Most enterprises don’t have a data problem—they have a data usability problem.
Enterprises are collecting more data than ever before, but they don’t know how to use it effectively. There are three key reasons for this impediment:
- Data silos: Customer data is separate from product data, which is separate from operational data, which is separate from financial data. AI models need to combine these datasets to be truly intelligent (see the sketch after this list).
- Poor data governance: Without clear ownership and rules for maintaining data quality, AI models are trained on inconsistent, outdated, or incorrect data.
- Lack of real-time data processing: Many AI models rely on batch processing, meaning they work with old, historical data instead of real-time insights. In fast-moving industries like finance and cybersecurity, this makes AI far less effective.
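To make the silo problem concrete, here is a toy Python sketch in which the tables, columns, and keys are all invented for illustration: three extracts that normally live in separate systems become model-ready only once they are joined on a shared key.

```python
import pandas as pd

# Hypothetical extracts from three siloed systems, keyed by customer_id.
crm = pd.DataFrame({"customer_id": [1, 2], "segment": ["enterprise", "smb"]})
product = pd.DataFrame({"customer_id": [1, 2], "weekly_active_users": [420, 35]})
finance = pd.DataFrame({"customer_id": [1, 2], "arr_usd": [250_000, 12_000]})

# A model needs one unified view, not three disconnected tables.
unified = crm.merge(product, on="customer_id").merge(finance, on="customer_id")
print(unified)
```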
AI isn’t a magic bullet—it’s an outcome of a well-structured, well-governed, and well-utilized data ecosystem. The businesses that win with AI won’t be the ones with the most advanced models. They’ll be the ones with the best data infrastructure—real-time, accessible, and built for intelligence.
The future of AI isn’t just about better models. It’s about better data. And the companies that understand this first will lead the next era of enterprise intelligence.
Snowflake vs. Databricks: The battle shaping enterprise data supremacy
A lot of this data is sitting in apps. It’s sitting in file storage, NetApp and things like that. So there’s no affinity for compute to go over there. So I think whoever provides the best possible service for doing various AI workloads, whatever they may be in the future, will win that compute.
The battle between Snowflake and Databricks is perhaps the most defining conflict in modern analytics. These two companies have positioned themselves as the backbone of enterprise data infrastructure, each offering distinct solutions for storage, processing, and analytics. But their fundamentally different origins and design philosophies mean they are competing from opposite ends of the spectrum.
As Ashu Garg, General Partner at Foundation Capital, writes about the Databricks vs. Snowflake rivalry: “Both Databricks and Snowflake have recognized this paradigm shift. They understand that whoever controls enterprise data will control the future of enterprise AI. As a result, owning the platform layer for generative AI has become their top strategic priority.”
At its core, the Snowflake vs. Databricks debate is about two fundamentally different ways of thinking about data.
Snowflake’s strength lies in its SQL-first approach, which makes it highly accessible to businesses that have been using relational databases for decades. Meanwhile, Databricks was built with AI and ML in mind, meaning it’s optimized for unstructured, high-velocity, real-time data that needs to be analyzed, transformed, and modeled at scale.
A brief history of Snowflake vs. Databricks
The rivalry became public in June 2023, when both companies held their annual conferences—Snowflake’s Data Cloud Summit in Las Vegas and Databricks’ Data + AI Summit in San Francisco. And the focus at both events? Generative AI, data governance, and AI model customization.
The rivalry escalated throughout 2023 and 2024, with aggressive acquisitions, product launches, and strategic partnerships. In June 2023, Databricks made a headline-grabbing move by acquiring MosaicML for $1.3 billion. This acquisition allowed Databricks’ customers to build, train, and own custom generative AI models using their proprietary data, eliminating reliance on third-party providers like OpenAI.
Snowflake quickly responded by announcing a strategic partnership with Nvidia on the same day as Databricks’ MosaicML acquisition. This partnership enabled Snowflake customers to build bespoke generative AI applications using Nvidia’s NeMo framework and Nvidia GPUs, all within Snowflake’s Data Cloud. Snowflake also acquired Neeva, a search startup, to bolster its natural language querying capabilities—further signaling its AI ambitions.
The battle intensified in 2024, as open-source strategies took center stage. Snowflake open-sourced Polaris, a catalog for Apache Iceberg, an open-source table format designed for interoperability. This move was aimed at reducing vendor lock-in, a longstanding criticism of Snowflake’s proprietary approach.
However, Databricks swiftly countered by acquiring Tabular, a company founded by the original creators of Iceberg, securing its position as the leader in open-source data formats.
ETL’s pivotal role in making AI business-ready
A big chunk of workload, whether it’s Snowflake or Databricks or GBQ, is actually just transforming the data, preparing it so that the business user could query it. And in that case, the latency doesn’t matter as much and the cost does matter.
This is a key insight that Amit highlights about modern enterprise analytics: Most of what happens in data platforms isn’t querying—it’s preparing the data to be queried. Businesses spend enormous resources on ETL (Extract, Transform, Load) processes, which clean, format, and optimize raw data before it can be used for analytics.
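As a minimal sketch of what such a transformation job looks like (the schema is invented, and SQLite stands in for a real warehouse), assuming pandas is available:

```python
import sqlite3
import pandas as pd

# Extract: raw records as they arrive from a source system
# (note the duplicate row and the missing amount).
raw = pd.DataFrame({
    "order_id": [101, 102, 102, 103],
    "amount": ["19.99", "5.00", "5.00", None],
    "ordered_at": ["2024-01-03", "2024-01-04", "2024-01-04", "2024-01-05"],
})

# Transform: dedupe, drop unusable rows, fix types.
clean = (raw.drop_duplicates()
            .dropna(subset=["amount"])
            .assign(amount=lambda df: df["amount"].astype(float)))

# Load: write the analysis-ready table where business users can query it.
with sqlite3.connect("warehouse.db") as conn:
    clean.to_sql("orders", conn, if_exists="replace", index=False)
```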
Databricks built its foundation on data engineering and transformation. Snowflake built its foundation on query performance and storage efficiency. Now, both are trying to master what the other already does well.
For years, Snowflake had a pricing advantage because of its ability to separate compute from storage. This allowed customers to only pay for the computing power they used rather than maintaining always-on infrastructure.
But now, as Amit points out, enterprises are getting smarter about costs, moving ETL workloads to cheaper alternatives when possible. “For the longest time, Snowflake was able to charge this premium, but now, for the ETL part, companies are moving their workloads wherever they find the cheaper cost,” he notes.
Databricks is leveraging this shift to its advantage. By offering more flexible, open-source-driven data formats (such as Iceberg, Delta Lake, and Hudi), it allows enterprises to store data independently from the processing engine—undermining one of Snowflake’s biggest competitive advantages.
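The decoupling is easy to see with plain Parquet, the open columnar format these table formats build on. In this hedged sketch (assuming pandas with pyarrow, and duckdb, are installed), one engine writes the data and a completely different engine queries it, with neither owning the storage:

```python
import pandas as pd
import duckdb

# Write with one engine: pandas persists the data as open-format Parquet.
sales = pd.DataFrame({"region": ["emea", "amer"], "revenue": [1200, 3400]})
sales.to_parquet("sales.parquet")

# Query with an entirely different engine: DuckDB reads the same file.
# The storage layer is independent of the compute that processes it.
print(duckdb.sql("SELECT region, revenue FROM 'sales.parquet'").df())
```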
Snowflake, in response, is racing to develop its AI and ML capabilities. It acquired Streamlit, an open-source app development framework for machine learning, and is pushing into Python-based workloads, aiming to compete directly with Databricks in the data science and ML space.
Now, given that both companies are rapidly expanding their capabilities to overlap with each other’s strengths, a question looms large: Will Snowflake and Databricks eventually look identical?
Amit sums it up: “I think they’re both at the beginning of this journey, and the whole world is at the beginning of this journey. It depends on who can innovate fast and who can move fast. And I won’t consider this race called to any extent.”
This battle is far from over. The winner won’t be the company that perfects its existing strengths—it will be the company that most successfully integrates AI, SQL, and real-time analytics into one seamless platform.
Visualization: The bridge between data and business action
To me, visualization looks like that thing to really complete the picture in analytics.
Data without visualization is like a novel without punctuation—difficult to process, frustrating to navigate, and nearly impossible to interpret at a glance. Companies don’t just need accurate insights; they need those insights to be instantly understandable, easily digestible, and immediately actionable. For instance:
- Finance teams need real-time dashboards to track revenue, costs, and profitability.
- Sales teams need interactive reports to forecast demand and identify opportunities.
- Marketing teams need intuitive charts to track campaign performance and customer engagement.
- Executives need a single-glance summary to make high-stakes decisions quickly.
Yet, while cloud data warehouses and AI-driven analytics have advanced dramatically, data visualization has lagged behind, often treated as an afterthought rather than a core component of the analytics stack. The modern enterprise generates massive amounts of data, but raw numbers don’t mean much unless they can be translated into meaningful narratives.
While cloud platforms like Snowflake and Databricks have revolutionized data processing, they haven’t built their own powerful visualization layers. Instead, the enterprise market for visualization is dominated by three key players: Microsoft Power BI, Salesforce Tableau, and Google Looker.
Each of these platforms has its strengths and weaknesses, but as Amit points out, the current leader in terms of market reach is Power BI, despite its shortcomings like performance issues and limited AI capabilities. “Power BI is definitely taking a lot of revenue away from Tableau, even though it’s a significantly inferior product. But Microsoft knows the enterprise space,” he says.
This is a fundamental truth in enterprise software: The best product doesn’t always win. Power BI isn’t leading because it’s the most innovative—it’s leading because of Microsoft’s unparalleled distribution power. “Looks like they have distribution and they probably do it well enough for the base of the pyramid,” Dheeraj adds.
While Tableau and Looker sell visualization as a standalone product, Power BI is deeply integrated into the Microsoft ecosystem:
- It comes pre-installed with Office 365 Enterprise plans, making adoption almost effortless.
- It’s tightly connected to Azure, SQL Server, and other Microsoft cloud services.
- IT teams are already comfortable with Microsoft tools, making Power BI a natural extension.
This bundling strategy allows Microsoft to price Power BI aggressively, making it an attractive choice for companies looking for a “good enough” solution at a fraction of the cost of Tableau or Looker.
The SaaS 2.0 promise: AI and analytics that work for customers
We realized that, in this new world of SaaS 2.0, we need to do more for the customer. They shouldn’t have to go build a search engine or buy a search engine and retrofit it, buy an analytics engine and retrofit it, or buy a workflow engine and retrofit it.
The first generation of SaaS that emerged in the 2000s revolutionized software delivery by moving it from on-premises data centers to the cloud.
For the first time, businesses didn’t have to manage physical servers, worry about hardware procurement, or handle software upgrades. This “left-shifting” of complexity was groundbreaking, as companies could now consume software over the internet with minimal IT involvement. Dheeraj observes: “20 years ago, SaaS 1.0 was about, ‘Hey, we’ll left-shift a lot of the complexity.’”
However, SaaS 1.0 came with its own set of challenges. While it eliminated hardware worries, it left behind software complexity. Customers still had to stitch together multiple tools to create a comprehensive solution. This resulted in data silos, fragmented user experiences, and slow innovation cycles. The cloud had become a place for collaboration, but it was fragmented collaboration at best.
Another critical limitation of SaaS 1.0 was its one-size-fits-all architecture. Multi-tenancy allowed vendors to serve multiple customers from a shared infrastructure, reducing costs. However, this approach lacked granularity. Large enterprises could demand performance and customizations that smaller businesses couldn’t afford.
Meanwhile, startups and small businesses were often stuck with stripped-down versions of software that didn’t meet their needs. This rigid architecture hindered flexibility, making SaaS 1.0 either too heavy or too light, depending on who you asked.
This is where Dheeraj’s vision for SaaS 2.0 comes into play. To him, SaaS 2.0 means removing friction—not just from infrastructure management, but also from the day-to-day operations of the businesses using the product.
SaaS 1.0 took away the burden of managing IT. But it still left businesses piecing together solutions, integrating multiple products, and managing complexity at the software level. SaaS 2.0, as envisioned by Dheeraj, solves this by delivering:
- Built-in search engines
- Native analytics capabilities
- Integrated workflow engines
All of this comes pre-assembled—no need for customers to piece together third-party solutions. It’s about providing out-of-the-box intelligence that works for both startups and enterprises.
Dheeraj goes on to explain that a robust knowledge graph (an interlinked network of customer, product, employee, work, and user session data) powers the whole SaaS 2.0 architecture. The knowledge graph connects customers, products, people, and every interaction in real time using AI, enabling the trifecta of search, analytics, and workflows.
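As a rough illustration of the concept (not DevRev’s actual implementation; every entity and relation here is invented), a knowledge graph is just typed nodes connected by typed edges that a query can traverse:

```python
import networkx as nx

g = nx.MultiDiGraph()

# Typed nodes: a customer, a product, a work item, a user session.
g.add_node("customer:acme", kind="customer")
g.add_node("product:search-api", kind="product")
g.add_node("ticket:1042", kind="work")
g.add_node("session:9f3", kind="session")

# Typed edges link them into one connected web.
g.add_edge("customer:acme", "product:search-api", rel="uses")
g.add_edge("customer:acme", "ticket:1042", rel="reported")
g.add_edge("session:9f3", "product:search-api", rel="touched")

# Traversals answer questions that would otherwise span silos,
# e.g. everything directly connected to a customer.
print(list(g.successors("customer:acme")))
```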
Analytics and AI: The race is on, but who’s leading?
Bubble burst or not, I think one way or the other, [AI] is a fundamental technology. We will figure out a way to pay back that money and make some more. It will just take longer and, in the process, we will invent new kinds of workloads for which none of the establishment has the advantage over somebody else. Somebody is going to come up with a better way and they’re going to take the cake.
The analytics world has raced to integrate AI, but the truth is, we’re still in the early innings of figuring out what actually works. Both Snowflake and Databricks—two of the most prominent players in data infrastructure—are making aggressive moves into AI, but neither has cemented a dominant position yet.
Despite the industry pouring tons of money into AI research and development, no single company or platform has built an unshakable lead in AI for analytics. “Nobody really has a technological moat right now,” Amit highlights, adding that enterprises are gripped by the anxiety of awaiting returns on the hundreds of billions of dollars they have already invested in AI.
Some questions on AI and analytics that companies don’t have a clear answer to yet are:
- AI can summarize reports, generate insights, and answer questions, but does that justify billions in investment?
- AI-powered analytics can help automate decision-making, but how much is that worth to enterprises compared to traditional analytics?
- Companies are fine-tuning large language models (LLMs), but are they really adding enough business value to justify their enormous compute costs?
Retrieval-augmented generation (RAG) has been a promising step forward, allowing companies to combine structured enterprise data with LLMs for context-aware queries. But it’s still just scratching the surface.
“Everybody is doing essentially RAG. People talk about agents, but it’s a little bit of workflow sitting on top of RAG,” Amit notes. In other words, much of the industry is converging on the same pattern, and many businesses are simply throwing AI at problems without clear objectives.
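A bare-bones version of that RAG loop, sketched in Python with TF-IDF similarity standing in for a real embedding model; the documents are invented, and `call_llm` is a hypothetical placeholder for whichever model provider you use:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Invented enterprise documents; a real system would index far more.
docs = [
    "Q3 revenue grew 12% driven by the enterprise tier.",
    "Churn in the SMB segment rose after the January price change.",
    "The fraud model flags transactions above the rolling 99th percentile.",
]

def retrieve(question, k=2):
    """Rank documents by similarity to the question; return the top k."""
    vec = TfidfVectorizer().fit(docs + [question])
    scores = cosine_similarity(vec.transform([question]), vec.transform(docs))[0]
    return [docs[i] for i in scores.argsort()[::-1][:k]]

question = "What happened to SMB churn?"
context = "\n".join(retrieve(question))

# Augment: the LLM answers grounded in the retrieved context.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
# answer = call_llm(prompt)  # hypothetical LLM call
print(prompt)
```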
Amit predicts that the future of AI in analytics will be shaped by three key trends:
- Unstructured data: AI will need to process messy, unstructured information (documents, emails, logs, PDFs) rather than just structured SQL data.
- Real-time AI processing: Instead of working on batch-processed historical data, AI must analyze live streams of information (a toy sketch follows this list).
- Conversational AI: Instead of clicking through dashboards, users will talk to AI like a human assistant.
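To contrast the second point with batch processing, here is a toy Python sketch: rather than scoring last night’s table, the consumer keeps a small sliding window over a live stream and reacts per event (the feed and the threshold rule are invented):

```python
from collections import deque

def monitor(stream, window=5, factor=3.0):
    """Yield values that spike far above the recent rolling average."""
    recent = deque(maxlen=window)
    for value in stream:
        if len(recent) == window and value > factor * (sum(recent) / window):
            yield value              # act now, not in tonight's batch job
        recent.append(value)

# Simulated live feed, e.g. per-minute transaction volumes.
feed = [10, 12, 11, 9, 10, 95, 11, 10]
print(list(monitor(feed)))           # -> [95]
```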
AI is undoubtedly the future of analytics, but the industry hasn’t figured out its true role yet. The next few years will determine who successfully integrates AI into analytics in a way that creates real, lasting business value. Until then, the AI race in analytics remains wide open—and the real winners have yet to emerge.
Wrapping up: The future of enterprise analytics in an AI-first world
I feel there’s one way we can disrupt SaaS 1.0. It goes through analytics on some level. The path to AI is through analytics. The path to AI is through search. The path to AI is through workflows.
The future of enterprise AI doesn’t lie in standalone models or flashy demos. It lies in the quiet power of infrastructure—in the systems that connect search, analytics, and workflows into a seamless whole. This interconnected trifecta will determine which companies merely adopt AI and which ones truly own it.
Enterprises today are standing at a crossroads. The temptation is to bolt AI onto existing systems and expect transformation. But AI without the backbone of real-time analytics, robust data pipelines, and intelligent workflows is like a brain without a nervous system—impressive but incapable of action. The winners will be those who see AI not as a layer, but as the outcome of a tightly integrated ecosystem.
Search isn’t just about retrieval anymore; it’s about context. The ability to surface the right insights at the right time. Analytics isn’t about dashboards—it’s about intelligence flowing into decisions as they happen. Workflows are no longer static processes; they’re dynamic, self-optimizing engines driven by AI predictions.
The shift from SaaS 1.0 to SaaS 2.0 demands this holistic approach. Enterprises no longer have the luxury of fragmented systems. The complexity of modern business demands platforms that do more—intelligently, contextually, and automatically.
The question for leaders isn’t whether AI will reshape the landscape. It’s whether they’ll lead that reshaping—or watch from the sidelines.
The path to AI leadership is clear. The question is: who will walk it first?
If this conversation on AI, analytics, and the future of SaaS 2.0 sparked new ideas for you, there’s more where that came from. Head over to The Effortless Podcast Substack and stay ahead of the curve. Because the future doesn’t pause—and neither should you.