Our second day at the Snowflake Summit began with an incredible announcement: Hakkoda was awarded Snowflake Americas Innovation Partner of the Year. This award recognized the company’s achievements as part of the Snowflake Data Cloud, helping joint customers migrate, modernize, and unlock innovation with Snowflake.
Our data experts also attended many sessions and keynotes, which gave them great insight into the features being released and how to use them effectively.
The main takeaways from today were:
- The modernization of AI helps users create better data projects.
- New features and announcements were oriented towards better integration.
- Fidelity’s experience in creating a single source of truth.

The Modernization of AI Creates Better Data Projects
Benoit Dageville, Co-founder and President of Product at Snowflake; Sridhar Ramaswamy, Senior Vice President of Snowflake and Founder of Neeva; and Mona Attariyan, Director of AI/ML Engineering at Snowflake, talked about spearheading the efforts to bring AI to the Data Cloud, enabling the democratization of data. Dageville emphasized that the modernization of AI allows for the simplification of technical tasks, such as writing queries and creating visualizations, making them accessible to non-technical individuals. By leveraging generative AI within Snowflake, users gain direct access to run any model in the Data Cloud, streamlining processes and ensuring governance.
Ramaswamy expressed his excitement about empowering all Snowflake customers with the power of AI. Snowflake’s platform, coupled with the integration of large language models (LLMs), provides the necessary context to enhance accuracy. He focused on the importance of combining the capabilities of LLMs with Snowflake’s features, as the models on their own lack the ability to discern what is real. Snowflake’s reputation as a secure and reliable data platform further reinforces its suitability for building easy-to-use data apps that harness the potential of machine learning (ML) and AI.
Finally, Mona Attariyan raised the question of why Snowflake is the ideal environment for shaping the future of AI. She highlighted the ability to interact with computer systems using natural language through the implementation of generative AI. Some of the use cases discussed include improving catalog searches, gaining valuable insights, and facilitating assisted development.
Overall, the combination of Snowflake’s secure data platform, the integration of generative AI, and the ability to interact with systems using natural language provides a powerful framework for democratizing data, simplifying technical tasks, and unlocking the full potential of AI for all Snowflake users.

New Features Are Oriented Towards Better Integration
Christian Kleinerman, the SVP of Product at Snowflake, made several announcements and releases. One of the key updates is the unification of Iceberg Tables into a single mode for interacting with external data. Another is the private preview of Document AI, which lets users handle unstructured data: they can ask questions in natural language about documents stored in Snowflake, and the system extracts fields and structure from each document, making the data available in tables for further analysis. The models can be published and used by other teams, providing opportunities for feedback and retraining.
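To make the Document AI flow above concrete, here is a rough SQL sketch. The feature was in private preview at the time of the announcement, so treat the syntax as illustrative; the model name `DOC_MODEL` and the stage `@invoice_docs` are hypothetical.

```sql
-- Illustrative Document AI sketch: run a trained extraction model over every
-- document in a stage. DOC_MODEL and @invoice_docs are placeholder names.
SELECT
    relative_path,
    DOC_MODEL!PREDICT(GET_PRESIGNED_URL(@invoice_docs, relative_path), 1)
        AS extracted_fields
FROM DIRECTORY(@invoice_docs);
```

The result is a column of structured values per document, which can then be flattened into ordinary tables for downstream analysis.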
Kleinerman also introduced various enhancements and features. Query constraints can be set as policies on datasets, defining the types of queries that can be performed and simplifying compliance and governance. Differential privacy and data clean rooms ensure privacy and secure multiparty computation. Geospatial support has been expanded, and the Snowflake Performance Index (SPI) has shown a 15% improvement in query duration based on customer workloads. Machine-learning-powered functions are in private preview, enabling tasks such as time-series analysis, forecasting, and alert triggering.
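As a sketch of what the ML-powered forecasting functions could look like in practice (they were in private preview, so the exact interface may differ; the table `daily_sales` and its columns are hypothetical):

```sql
-- Train a forecasting model on a time-series table
-- (daily_sales, sale_date, revenue are made-up names).
CREATE SNOWFLAKE.ML.FORECAST sales_model(
    INPUT_DATA => SYSTEM$REFERENCE('TABLE', 'daily_sales'),
    TIMESTAMP_COLNAME => 'sale_date',
    TARGET_COLNAME => 'revenue'
);

-- Generate a 14-day forecast from the trained model.
CALL sales_model!FORECAST(FORECASTING_PERIODS => 14);
```

The appeal is that the training, inference, and the data all stay inside Snowflake, with no export step.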
In terms of developer experience, Snowflake offers a range of tools, including a Python API, a REST API, the Snowflake CLI, and Git integration for maintaining code within the platform. Streaming capabilities have been introduced in public preview, including dynamic tables, which let users build declarative transformation pipelines without writing imperative orchestration code. Snowpark Container Services now includes the computational power of NVIDIA GPUs, providing significant performance improvements and cost savings for training ML models in Snowflake. Overall, the focus at Snowflake is on making customers’ lives easier, as emphasized by Allison Lee, Senior Director of Engineering and a founding engineer at Snowflake.
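The dynamic tables mentioned above boil down to a single declarative SQL statement; Snowflake then keeps the result refreshed within a chosen target lag. A minimal sketch, with an illustrative table, warehouse, and lag:

```sql
-- A dynamic table: define the transformation once, and Snowflake refreshes
-- it automatically as raw_orders changes. Names below are made up.
CREATE OR REPLACE DYNAMIC TABLE order_summary
    TARGET_LAG = '5 minutes'
    WAREHOUSE = transform_wh
AS
    SELECT customer_id,
           COUNT(*)    AS order_count,
           SUM(amount) AS total_spent
    FROM raw_orders
    GROUP BY customer_id;
```

Compared with a task-plus-stream pipeline, the refresh scheduling and incremental logic are handled by the platform rather than by the developer.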

Snowflake Data Processing Enables a Single Source of Truth
During the event, Mihir Shah, CIO and Enterprise Head of Data Architecture & Engineering at Fidelity, shared their experience with Snowflake. Fidelity, a multinational financial services corporation, successfully un-siloed and consolidated their data into a single source of truth using Snowflake. Shah emphasized the importance of getting the data strategy right, stating that when that is in place, everything else falls into place. He highlighted that the technology is no longer a barrier to integrating enterprise data into a single data model; the challenge lies in developing an operating model for execution.
In the demo section, presenters showed how easy it is to publish native apps to the marketplace, putting them in front of Snowflake’s 8,000+ customers. Custom event billing lets developers choose a billing strategy and track usage per account or user. The capacity drawdown feature lets app and dataset usage, including compute, be deducted from an existing capacity commitment. The intellectual property behind a native app stays protected and is never visible to its consumers. Other features mentioned include automatic sync with Git repositories for code stored within Snowflake and the introduction of the new Snowflake CLI, an open-source, developer-centric tool.
A significant highlight was the discussion of Snowpark Container Services, currently in private preview. The feature accelerates time to value by supporting more runtimes and languages: customers can host Docker containers within Snowflake, with support for GPUs.
Through demos, it was shown how NVIDIA GPUs significantly speed up model training, resulting in productivity improvements and cost reductions. By leveraging containerized services, customers can use third-party AI and ML providers within Snowflake, ensuring data security and shared access. Examples of such third-party containerized apps include Astronomer, SAS, Dataiku, Hex, NVIDIA’s NeMo models, and RelationalAI. Importantly, the data never leaves Snowflake, as these apps run natively in containers within the platform.
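As a hedged sketch of what hosting a container in Snowflake might look like: Snowpark Container Services was in private preview at the time, so the statements below are illustrative assumptions, and the compute pool, service, and image names are all made up.

```sql
-- Hypothetical sketch: provision a GPU compute pool, then run a Docker
-- image from an image repository as a service inside Snowflake.
CREATE COMPUTE POOL gpu_pool
    MIN_NODES = 1
    MAX_NODES = 1
    INSTANCE_FAMILY = GPU_NV_S;

CREATE SERVICE training_service
    IN COMPUTE POOL gpu_pool
    FROM SPECIFICATION $$
    spec:
      containers:
        - name: trainer
          image: /my_db/my_schema/my_repo/trainer:latest
    $$;
```

The key point from the session stands regardless of the exact syntax: the container runs next to the data, so nothing has to leave Snowflake.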
What a Day!
If you’re attending the Snowflake Summit, don’t forget to stop by our booth, learn more about what we do, and play Hakkman Legacy Chomp!