
Snowflake Summit 2025 Recap: Building the Future of AI and Apps

Snowflake Summit 2025, held in San Francisco, brought together nearly 20,000 attendees from around the world under the theme of “Build the Future of AI and Apps.” This year’s event highlighted the company’s bold vision for driving next-generation data-driven innovation.

What is Snowflake?

Snowflake is an integrated cloud data platform that supports data warehousing, data lakes, data engineering, and AI-driven application development. Compatible with AWS, Azure, and Google Cloud, it can handle structured, semi-structured, and unstructured data with high flexibility. With robust features in security, governance, and scalability, Snowflake provides an end-to-end solution for consolidating, analyzing, and leveraging data across organizations.

Data Strategy as the Foundation of AI Strategy

Source: Snowflake Summit 2025 Opening Keynote with Sridhar Ramaswamy, Sam Altman, and Sarah Guo

There Is No AI Strategy Without a Data Strategy

During the opening keynote, Snowflake CEO Sridhar Ramaswamy declared, “There is no AI strategy without a data strategy.” This highlighted the necessity of having a solid and coherent data foundation before scaling AI initiatives.

In today’s competitive landscape, businesses must be equipped with an infrastructure that allows accurate collection, management, and utilization of both structured and unstructured data. Snowflake’s cloud-native architecture enables seamless integration, governance, and analytics, serving as the backbone of AI readiness.

The Sanctity of Data

Lynn Martin, President of the NYSE Group, emphasized the importance of data quality in the age of AI, referring to the “sanctity of data.” Her point was clear: without a good source of truth, AI can lead to “unfortunate outcomes.”

This message is especially relevant for generative AI, where hallucinations often stem from poor data quality or structure. The importance of strong data governance and integrity was consistently emphasized throughout the event as a crucial foundation for any successful AI strategy.

The Beauty of Simplicity in Technology

Ramaswamy also remarked, “The true magic of a great technology is taking something that’s very complicated and making it feel easy. The key to a great solution is simplicity.” This idea perfectly reflects one of our core principles in product development: simplicity.

In areas such as data science and statistics, the levels of expertise among data users can vary widely. That’s why at dotData, we focus on delivering the best possible user experience for everyone, from advanced users like data scientists to those who just want to explore data more casually.

As AI and data utilization continue to advance, systems are becoming increasingly complex. This growing complexity can make them harder to manage and understand. Snowflake tackles this challenge by putting simplicity at the heart of its product design, emphasizing intuitive and accessible user experiences.

This philosophy strongly aligns with the direction we’re taking in our product development, and we believe it will become even more essential as we move deeper into the age of AI.

Sam Altman: “Just Do It”

One of the highlights of the opening keynote was a fireside chat between Snowflake CEO Sridhar Ramaswamy and OpenAI CEO Sam Altman. When asked what advice he would offer to enterprise leaders navigating the AI landscape in 2025, Altman offered a surprisingly simple yet powerful response: “Just do it.”

His message emphasized that in today’s rapidly evolving AI world, waiting for perfect plans or complete understanding can be counterproductive. Instead, organizations should embrace experimentation, take action, and learn from the process as they go. It was a call to action that resonated deeply with both business and technical executives.

What’s New in Snowflake?

At the Day 2 Platform Keynote, several Snowflake leaders and guest speakers took the stage to share the latest product innovations. Christian Kleinerman, EVP of Product, led the session by presenting updates built around customer needs, framed as “I want…” statements, answered by “You can…” solutions from Snowflake.

I want a data architecture that is future-proof

Snowflake is evolving into a Single Unified Platform that supports the entire data lifecycle—from ingestion and processing to analysis and application deployment. It is designed to handle diverse workloads and adapt to emerging business needs, regardless of where data resides or what computing environments are involved.

Support for Apache Iceberg

At the core of this flexible architecture is native support for Apache Iceberg, an open table format that allows efficient management of large-scale data. This enables organizations to build scalable and interoperable data architectures that are future-ready.

Snowflake is also deeply committed to the open-source community, contributing to projects like Apache Iceberg, including support for complex data types such as Variant. These efforts enhance interoperability and openness across modern data ecosystems.
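As a simple illustration (the external volume, table, and column names below are placeholders), a Snowflake-managed Iceberg table can be created with standard DDL once cloud storage has been configured:

    -- Assumes an external volume has already been set up for your cloud storage
    CREATE ICEBERG TABLE customer_events (
        event_id    BIGINT,
        customer_id BIGINT,
        event_type  STRING,
        event_ts    TIMESTAMP_NTZ
    )
        CATALOG = 'SNOWFLAKE'                   -- Snowflake acts as the Iceberg catalog
        EXTERNAL_VOLUME = 'my_external_volume'  -- placeholder external volume name
        BASE_LOCATION = 'customer_events/';     -- path within the external volume

The resulting table is stored in the open Iceberg format in your own storage, so other Iceberg-aware engines can read the same data.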

Ecosystem Expansion with Apache Polaris Catalog

Through integration with Apache Polaris Catalog, Snowflake can connect to external query engines and analytics platforms. This ensures seamless collaboration with other computing resources and strengthens Snowflake’s position as an extensible, ecosystem-friendly platform for AI-era data workloads.
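As a rough sketch (the endpoint, warehouse name, and credentials below are placeholders, and the available options depend on the Polaris deployment), an external Iceberg catalog served by Apache Polaris can be registered in Snowflake as a catalog integration:

    -- Sketch only: register an Apache Polaris REST catalog as a catalog integration
    CREATE CATALOG INTEGRATION polaris_catalog_int
        CATALOG_SOURCE = POLARIS
        TABLE_FORMAT = ICEBERG
        REST_CONFIG = (
            CATALOG_URI = 'https://<polaris-host>/api/catalog'  -- placeholder endpoint
            WAREHOUSE = 'my_polaris_catalog'                    -- placeholder catalog name
        )
        REST_AUTHENTICATION = (
            TYPE = OAUTH
            OAUTH_CLIENT_ID = '<client-id>'                     -- placeholder credentials
            OAUTH_CLIENT_SECRET = '<client-secret>'
            OAUTH_ALLOWED_SCOPES = ('PRINCIPAL_ROLE:ALL')
        )
        ENABLED = TRUE;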

I want to govern all my data

Governance isn’t just about control—it’s about enabling trusted, compliant access at scale. Enter Snowflake Horizon Catalog, a unified data governance layer.

  • Automated Sensitive Data Tagging
    Snowflake Horizon Catalog includes automated detection and tagging of sensitive data. When a table contains sensitive fields, any derived tables automatically inherit the relevant tags, reducing the risk of data leakage and ensuring compliance (a minimal classification sketch follows this list).
  • Synthetic Data Generation
    Snowflake also provides automated synthetic data generation based on existing tables. Unlike random data, this synthetic data mimics the statistical distribution of the source, preserving analytical value while ensuring data privacy. This makes it especially useful in development and testing environments where direct use of sensitive data is not permitted.
  • Internal Marketplace
    Snowflake introduced an internal marketplace that lets teams search, request access to, and share data and AI assets within a governed interface.
    When integrated with platforms like dotData, this marketplace can expose automatically generated features as reusable “data products” that accelerate cross-functional AI adoption.
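As a minimal sketch of the classification workflow underneath this capability (the table name is a placeholder, and the automated Horizon experience layers more on top of it), Snowflake's built-in classification can scan a table and automatically apply sensitivity tags to the columns it identifies:

    -- Sketch: classify a table and auto-apply sensitivity tags to its columns
    CALL SYSTEM$CLASSIFY(
        'sales_db.crm.customers',      -- placeholder fully qualified table name
        { 'auto_tag': true }           -- apply semantic/privacy category tags automatically
    );

    -- Review the tags that were applied
    SELECT *
    FROM TABLE(
        sales_db.INFORMATION_SCHEMA.TAG_REFERENCES_ALL_COLUMNS('sales_db.crm.customers', 'table')
    );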

I want to integrate all types of data

  • Seamless Data Ingestion with Snowflake Openflow
    Snowflake Openflow is a new pipeline service that enables seamless ingestion of diverse data types—including batch, streaming, and file-based sources—through a unified interface. Initially focused on structured data, Openflow is expanding to support unstructured and real-time data, meeting modern AI-driven demands.
  • Built on Apache NiFi for Flexible Ingestion
    Openflow is built on Apache NiFi, allowing flexible data ingestion pipelines. It supports data sources not natively integrated with Snowflake and provides a visual, GUI-based interface that makes data pipeline building accessible even for non-engineers. This enables smoother ingestion of unstructured and real-time data that previously required complex ETL processes.
  • Technology Integration Through the Acquisition of Datavolo
    The technology behind Openflow originates from Snowflake’s acquisition of Datavolo. Now integrated into the Snowflake platform, it enhances platform compatibility and delivers a robust end-to-end pipeline that streamlines data ingestion through to operational use.

I want to deliver more business impact

From Data Platform to Full Application Support

Snowflake is moving beyond traditional data warehousing to become a unified platform for both data and applications. This allows organizations to seamlessly connect data-driven decisions with execution—all within a single, secure environment.

Snowflake Postgres: Enabling Transactional Workloads

Snowflake introduced Snowflake Postgres, a fully managed, PostgreSQL-compatible database. While Snowflake was primarily built for analytics, this new capability opens the door to transactional and operational application development directly on the platform.

For example, by combining it with Snowpark Container Services—a fully managed service, generally available since last year, for running and managing containerized applications within Snowflake—enterprise applications could eventually be hosted within Snowflake’s secure and governed environment.

This would allow organizations to manage both analytics and application workloads on a single platform, improving efficiency and consistency across development and operations.
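For illustration, a containerized application could be deployed through Snowpark Container Services with a short service specification; the compute pool, image repository path, and endpoint below are placeholders:

    -- Sketch: run a containerized app inside Snowflake via Snowpark Container Services
    CREATE SERVICE order_entry_app
        IN COMPUTE POOL my_compute_pool       -- placeholder compute pool
        FROM SPECIFICATION $$
        spec:
          containers:
          - name: app
            image: /app_db/app_schema/app_repo/order-entry:latest  # placeholder image path
          endpoints:
          - name: web
            port: 8080
            public: true
        $$;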

Delivering End-to-End Value from Data to Action

These new capabilities pave the way for a truly end-to-end workflow—from data ingestion and analysis to real-time application execution. Rather than stopping at insights, businesses can now operationalize data instantly, delivering greater agility, consistency, and ultimately, more tangible business impact.

I want to get faster insights

Snowflake’s newly announced Cortex AI SQL enables direct SQL queries on both structured and unstructured data. Traditionally, analyzing images, audio, or text required complex preprocessing like ETL or feature extraction. With Snowflake Cortex AI SQL, these steps can be skipped, allowing unstructured data to be analyzed directly—marking a significant shift from traditional BI tools.

  • Flexible Querying Powered by Generative AI
    With Snowflake Cortex AI SQL, generative AI assists query execution, enabling use cases that were previously difficult or impossible. These capabilities were showcased through examples such as:
  • Image Similarity Search and Performance Analysis
    Cortex AI SQL can search for visually similar banner images based on previously used creatives and compare their click-through rates using SQL—helping visualize which visuals perform best.
  • Automated News Matching for Target Companies
    Another use case involves automatically joining a watchlist of companies with a stream of news articles, using AI_FILTER to extract only the most relevant content—enabling faster, more informed decisions in marketing or investment (a minimal SQL sketch of this pattern appears below).

Reference: Leveraging Cortex AISQL For Multi-Modal Analytics
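As a rough sketch of that news-matching pattern (the table and column names here are placeholders, not taken from the session), AI_FILTER can be used directly as a join condition:

    -- Sketch: keep only the news articles the model judges relevant to a watchlist company
    SELECT
        w.company_name,
        n.headline,
        n.published_at
    FROM company_watchlist AS w               -- placeholder watchlist table
    JOIN news_articles     AS n               -- placeholder news table
        ON AI_FILTER(
            PROMPT('Is this news article relevant to the company {0}? Article: {1}',
                   w.company_name, n.article_text)
        );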

This seamless integration of SQL with generative AI makes it possible to query and analyze unstructured data—like images and text—as easily as structured data.

Applicable Across a Wide Range of Business Scenarios

The use cases extend far beyond the examples above. Cortex AI SQL can also be applied to:

  • Mapping customer support tickets to internal knowledge bases
  • Matching FAQ documents with chat logs
  • Performing correlation analysis between descriptions in unstructured documents and business flags

All of this can be achieved with SQL knowledge alone, enabling advanced insights without requiring expertise in data science or machine learning.
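As one hedged illustration of the ticket-matching case, Cortex embedding functions can be combined with vector similarity in plain SQL; the tables, model name, and top-1 matching logic below are assumptions for illustration:

    -- Sketch: match each support ticket to its most similar knowledge-base article
    WITH ticket_vectors AS (
        SELECT ticket_id,
               SNOWFLAKE.CORTEX.EMBED_TEXT_768('snowflake-arctic-embed-m', ticket_text) AS v
        FROM support_tickets                  -- placeholder table
    ),
    kb_vectors AS (
        SELECT article_id,
               SNOWFLAKE.CORTEX.EMBED_TEXT_768('snowflake-arctic-embed-m', article_text) AS v
        FROM knowledge_base_articles          -- placeholder table
    )
    SELECT t.ticket_id,
           k.article_id,
           VECTOR_COSINE_SIMILARITY(t.v, k.v) AS similarity
    FROM ticket_vectors t
    CROSS JOIN kb_vectors k                   -- all-pairs comparison; prune for large tables
    QUALIFY ROW_NUMBER() OVER (
        PARTITION BY t.ticket_id
        ORDER BY VECTOR_COSINE_SIMILARITY(t.v, k.v) DESC
    ) = 1;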

I want to get my data ready for AI

A key challenge in adopting AI—especially generative AI—is that it often fails to grasp the meaning behind enterprise data. Metrics like “revenue” or “CTR” vary across organizations, making it difficult to interpret data correctly. Bridging the gap between raw data and business context is essential to fully leverage platforms like Snowflake.

Defining Business Context with Semantic View

To address this challenge, Snowflake introduced Semantic View, a new schema-level object that connects business concepts—such as revenue, profit, or churn—with the actual data sources and logic used in your Snowflake environment. This semantic layer enables generative AI to understand which tables, columns, and calculations to reference when responding to natural language queries.
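As a conceptual sketch only (the table, column, and metric names are placeholders, and the exact clause syntax may differ from the released DDL), a semantic view defining a click-through-rate metric might look roughly like this:

    -- Illustrative only: clause names and details may differ from the released feature
    CREATE SEMANTIC VIEW marketing_semantics
        TABLES (
            ad_stats AS analytics.ads.daily_ad_stats PRIMARY KEY (ad_id, stat_date)
        )
        FACTS (
            ad_stats.impressions AS impressions,
            ad_stats.clicks      AS clicks
        )
        DIMENSIONS (
            ad_stats.stat_date AS stat_date,
            ad_stats.campaign  AS campaign_name
        )
        METRICS (
            ad_stats.ctr AS SUM(ad_stats.clicks) / NULLIF(SUM(ad_stats.impressions), 0)
        )
        COMMENT = 'Shared business definition of click-through rate (CTR)';

Once such definitions exist, natural language questions about CTR can be resolved against a single agreed-upon formula rather than ad hoc SQL.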

Setting the Foundation for Accurate AI Answers

Asking an AI, “What was the click-through rate last month?” might sound simple—but unless the underlying formula, timeframe, and data tables are clearly defined, the AI may not produce the correct SQL or answer. During recent Snowflake sessions, benchmarks clearly demonstrated that using a semantic layer dramatically improved the accuracy of AI-generated responses to natural language queries. These results reaffirm a critical truth: AI needs semantically rich data to deliver trustworthy insights.

Making “Data Meaning” a Shared Organizational Asset

Semantic View goes beyond metadata—it establishes a shared understanding across the organization. With clearly defined semantics, AI can interpret business questions like a human, aligned with company-specific logic. It provides the foundation for accurate, context-aware AI output.

I want to accelerate business growth with AI agents

One of the most talked-about innovations at Snowflake Summit 2025 was Snowflake Intelligence, a new AI agent capability.

Autonomous Execution from Natural Language Prompts

Source: Talk To Your Data: Snowflake Intelligence Demo

For example, a user can simply type, “Show me the ticket sales trend for this month’s festival,” and Snowflake Intelligence will reference Semantic View and the user’s permissions to access the necessary data. It then runs the appropriate analysis automatically and presents the results in a visual format.

This eliminates the need for users to explicitly specify which tables to use or how to structure the query, making advanced analytics accessible even to non-technical users.

Accuracy Anchored by the Semantic Layer

What makes Snowflake Intelligence reliable is its integration with Semantic View. Rather than having a generative AI produce arbitrary SQL, queries are generated based on organization-defined business logic. This ensures the results are not just plausible but business-accurate.

Lowering the Barrier to AI-Driven Insights

The evolution of AI agents like Snowflake Intelligence is dramatically lowering the barrier to advanced data analysis. Tasks that once required data scientists, software developers or specialized teams can now be handled directly by business users. This marks a shift from “using AI” to working alongside AI—a transformative step toward more autonomous and efficient AI-powered decision-making across the enterprise.

Final Thoughts: Powering the AI-Ready Data Foundation

The new capabilities announced at this year’s Snowflake Summit clearly point toward a cloud data platform optimized for the new AI era. As organizations look to unify and activate their data in the cloud, Snowflake continues to deliver compelling solutions to those challenges.

It’s no surprise that an increasing number of companies are turning to Snowflake to gain a competitive edge in the AI era. With deeper integration between Snowflake and dotData on the horizon, we expect to see even greater progress in enterprise AI adoption moving forward.

Supercharging AI-Driven Insights: Snowflake + dotData

While Snowflake provides a robust foundation for managing and governing massive volumes of data, that’s only half of the equation. To turn that data into meaningful business insights and drive action, organizations need intelligent automation—and that’s where dotData comes in.

Accelerating AI-Driven Data Utilization: The Power of dotData and Snowflake Integration

At Snowflake Summit 2025, the phrase “AI Ready” echoed throughout the event. Unlocking business value with AI requires more than simply collecting data—it demands that data be well-organized, well-governed, and readily transformable into actionable insights. This is where the integration of Snowflake and dotData plays a critical role.

Snowflake provides a robust data foundation that enables efficient storage, processing, and governance of both structured and unstructured data at scale. However, raw data alone doesn’t automatically translate into business insight. Unlocking value often requires formulating hypotheses and writing complex SQL queries—skills not always available across business teams.

dotData’s Feature Factory addresses this challenge by automatically generating AI features—hidden data patterns relevant to business objectives—directly from the data stored in Snowflake. These features remain governed within Snowflake and can be securely shared across teams and applications.

When paired with dotData Insight, Snowflake users can go even further—discovering key drivers of business KPIs and taking action through intuitive tools, without needing deep technical expertise. From surfacing insights to identifying target lists, the entire process becomes seamless.

Together, Snowflake and dotData enable an end-to-end workflow:

  • Preparing and transforming stored data
  • Extracting insights with business context
  • Taking data-driven action based on those insights

For organizations that already have a Snowflake account and are looking to take the next step toward accelerating enterprise AI adoption and advanced analytics—or for those aiming to move beyond dashboards to deeper insight—this integration offers a compelling path forward.

If you’re ready to get more value from your data and accelerate your AI journey, we invite you to connect with us.

Takumi Sakamoto, VP of Engineering

Takumi is VP of Engineering at dotData. He leads all engineering efforts, including the development and operations of dotData Cloud, product support, and security. Prior to joining dotData, he served as VP of Engineering at Kaizen Platform. He also held key roles at SmartNews as both a Site Reliability Engineer and Data Engineer, where he helped establish the SRE team and data infrastructure, and at DeNA as an infrastructure engineer responsible for operating large-scale web services. With a career spanning major tech companies and startups, Takumi brings deep technical expertise and a strong track record of solving complex engineering challenges and driving successful projects.
