
SAC Live Connect to Snowflake – explained step by step

Feature update – 27.11.2025

How does SAC Live Connect to Snowflake work? In this guide, we show you step by step how to establish a live connection between SAP Analytics Cloud and Snowflake. You will get an overview of the architecture, the technical requirements (including the Data Access Agent), modeling, possible restrictions, and the most important advantages of real-time data analysis, explained in a clear and practical way.


1. Introduction – What is SAC Live Connect to Snowflake?

1.1. What is Snowflake?

Snowflake is a cloud-based data platform that helps companies store, manage, and analyze large amounts of data. The data platform was developed specifically for fast, user-friendly, and flexible processing of big data, enabling companies to integrate data from various sources and perform complex analyses.

Snowflake combines a completely new SQL query engine with an innovative architecture designed from the ground up for the cloud. It offers comprehensive features for enterprise analytics databases as well as unique characteristics and capabilities.

1.2. What is SAC Live Connect?

A live connection is a connection that uses data directly from the source system in real time. This means that changes made to the data in the source system are immediately visible when opening a story in SAP Analytics Cloud, as the data is not replicated in the SAC environment.

Figure 1 – Overview diagram for SAC Live connection[2]

1.3. What is SAC Live Connect to Snowflake?

SAC Live Connect to Snowflake means that SAP Analytics Cloud (SAC) establishes a live data connection to the Snowflake data platform—without importing the data into SAC. Every user interaction in SAC (filter, drill, chart load) generates a live SQL query and utilizes the full Snowflake computing power, and the result is visualized in SAC.
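The pushdown principle described above can be illustrated with a small sketch: every chart state (selected measures, dimensions, and filters) becomes one aggregating SQL statement that Snowflake executes, and only the small result set travels back to SAC. The function below is purely illustrative and does not reflect SAC's internal query generation.

```python
# Hypothetical sketch of query pushdown in a live connection: a UI
# interaction (filter, drill, chart load) is translated into SQL that
# Snowflake executes. Names and structure are illustrative only.

def build_live_query(view, measures, dimensions, filters):
    """Translate a chart state into a single aggregating SQL statement."""
    select_list = ", ".join(
        dimensions + [f"SUM({m}) AS {m}" for m in measures]
    )
    where = " AND ".join(f"{col} = '{val}'" for col, val in filters.items())
    sql = f"SELECT {select_list} FROM {view}"
    if where:
        sql += f" WHERE {where}"
    if dimensions:
        sql += f" GROUP BY {', '.join(dimensions)}"
    return sql

# A filter on YEAR and a drill by REGION become one aggregated query:
print(build_live_query("SALES_V", ["REVENUE"], ["REGION"], {"YEAR": "2025"}))
# SELECT REGION, SUM(REVENUE) AS REVENUE FROM SALES_V WHERE YEAR = '2025' GROUP BY REGION
```

Because aggregation and filtering happen inside Snowflake, SAC only ever renders the aggregated result, which is why no data replication into SAC is needed.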

2. Architecture

Snowflake's architecture consists of three main components. These form the basis of the multi-cluster data architecture for Snowflake's cloud data platform:[3]

  • Cloud services: Snowflake uses ANSI SQL for its cloud services, enabling users to optimize their data and manage their infrastructure, and ensuring the security and encryption of stored data. Services include authentication, infrastructure management, query analysis and optimization, metadata management, and access control.
  • Computing power/compute cluster: Snowflake's computing layer consists of virtual cloud data warehouses that enable data to be analyzed through queries. Each virtual Snowflake warehouse is an independent cluster and does not compete for computing resources or affect the performance of others.
  • Storage layer: A Snowflake database stores a company's uploaded structured and semi-structured data sets for processing and analysis.

Figure 2 – Snowflake architecture[4]

3. Requirements for connection

Since the release of SAC QRC1.2026, SAP Analytics Cloud has offered standard integrated support for a live connection to Snowflake. Thanks to the new Data Access Agent, paid add-ons are no longer necessary. This means that both analytical models and (seamless) planning models based on Snowflake views can be created directly.

The following requirements must be met in order to establish SAC Live Connect to Snowflake:[5]

  • A connection to Snowflake is required
  • The SAC client must run on SAP HANA Cloud.
  • The Live Data Access Agent must be enabled.

The Data Access Agent is a data access component that connects to live remote sources via SAP HANA Smart Data Access. It is available for clients running on SAP HANA Cloud and enables live data sources to be connected and queried in real time without having to collect and import data.

You must have the "Execute" right for the SDA Administration (Smart Data Access) authorization to be able to change the status of the Data Access Agent and perform the following steps:

  • Select System → Administration in the page navigation.
  • Go to the Data Source Configuration tab.
  • Scroll down to Data Access Agent.
  • Select Enable Data Access Agent to activate the option.
  • In the confirmation dialog that opens, select Activate.
  • Select Save to apply your changes.

Result: The Data Access Agent is now enabled, and you can create live connections to SQL data sources.

4. Connection – Step-by-step instructions

4.1. A Snowflake Live connection in SAC can be created as follows

In SAP Analytics Cloud, select "Connections" in the side navigation.

Select Snowflake.

Enter a connection name.

Optional: Enter a description.

Enter the Snowflake account name.

Enter the user name.

Select "Select Key File" to locate and select the .p8 private key file.

Optional: If the private key is encrypted, enter the passphrase.

Enter the name of the Snowflake database.

Result: You have now established a live connection between SAP Analytics Cloud and a Snowflake database.
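The inputs collected by the dialog above can be summarized in a small sketch. The class below simply mirrors the dialog fields and checks the basics (required fields, .p8 key file); it is illustrative only and not part of any SAC or Snowflake API.

```python
# Illustrative sketch of the SAC connection dialog inputs for Snowflake,
# with basic validation. Field names mirror the dialog; the class itself
# is an assumption for illustration, not a real SAC or Snowflake API.
from dataclasses import dataclass
from typing import Optional

@dataclass
class SnowflakeLiveConnection:
    name: str                         # connection name in SAC
    account: str                      # Snowflake account name
    user: str                         # Snowflake user name
    key_file: str                     # path to the .p8 private key file
    database: str                     # Snowflake database name
    passphrase: Optional[str] = None  # only needed if the key is encrypted

    def validate(self):
        """Return a list of problems; an empty list means the input is complete."""
        problems = []
        if not self.key_file.endswith(".p8"):
            problems.append("private key must be a .p8 file")
        for field_name in ("name", "account", "user", "database"):
            if not getattr(self, field_name):
                problems.append(f"{field_name} is required")
        return problems

conn = SnowflakeLiveConnection(
    name="SNOW_LIVE", account="myorg-myaccount",
    user="SAC_SERVICE", key_file="rsa_key.p8", database="SALES_DB",
)
print(conn.validate())  # [] – all required fields are present
```

Note that the connection authenticates with a key pair (the .p8 private key), not a password, which is why the dialog asks for a key file and an optional passphrase.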

4.2. A SAC Snowflake Live model can be created as follows:

Select Model on the Modeler home page.

Select Start with data.

In the Select Data Source dialog box, select Snowflake.

Select an existing connection to Snowflake or create a new connection.

In the Select Data Processing Method dialog box, select Live Connection and click Next.

Select a view.

By default, the application names the model after the selected view, but you can change the name manually.

Select Create.

Result: A live connection badge is now displayed in the data source to confirm that the model has live data access to the Snowflake view you selected.

4.3. Restrictions on SAC Live Connect to Snowflake[8]

  • With live data access, the application retrieves data directly from Snowflake. For optimal performance, we recommend avoiding live connections to Snowflake with tables containing more than one million rows. This is a temporary limitation that SAP is actively working to resolve in future updates.
  • The following Snowflake data types are supported: NUMBER, DECIMAL, FLOAT, VARCHAR, DATE, TIME, TIMESTAMP, TIMESTAMP_LTZ, TIMESTAMP_NTZ, TIMESTAMP_TZ. If the application detects data types not listed above, it prevents the creation of models.
  • The data types FLOAT and TIME are interpreted as text by SAP Analytics Cloud. If necessary, the data types can be changed when assigning columns in the Modeler. However, this can lead to performance issues. We recommend changing the data types directly in the source system.
  • These data types are partially supported: ARRAY, GEOGRAPHY, GEOMETRY, OBJECT, VARIANT. Models containing these data types may cause validation or other issues when used in stories.
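The data-type rules above can be sketched as a simple check: supported types pass, FLOAT/TIME and the partially supported types produce warnings, and anything else blocks model creation. The type sets come from this article; the function itself is a hypothetical illustration, not SAC's actual validation logic.

```python
# Sketch of the data-type restrictions listed above. The sets mirror the
# article; the check function is illustrative, not SAC's real validator.
SUPPORTED = {
    "NUMBER", "DECIMAL", "FLOAT", "VARCHAR", "DATE", "TIME",
    "TIMESTAMP", "TIMESTAMP_LTZ", "TIMESTAMP_NTZ", "TIMESTAMP_TZ",
}
PARTIAL = {"ARRAY", "GEOGRAPHY", "GEOMETRY", "OBJECT", "VARIANT"}
TREATED_AS_TEXT = {"FLOAT", "TIME"}  # interpreted as text by SAC

def check_view_columns(columns):
    """columns: dict mapping column name -> Snowflake data type."""
    result = {"blocked": [], "warnings": []}
    for col, dtype in columns.items():
        if dtype in TREATED_AS_TEXT:
            result["warnings"].append(f"{col}: {dtype} is read as text")
        elif dtype in PARTIAL:
            result["warnings"].append(f"{col}: {dtype} only partially supported")
        elif dtype not in SUPPORTED:
            result["blocked"].append(f"{col}: {dtype} prevents model creation")
    return result

# BINARY is not in the supported list, so it would block the model;
# TIME is supported but read as text, so it only warns.
print(check_view_columns({"AMOUNT": "NUMBER", "TS": "TIME", "RAW": "BINARY"}))
```

Running such a check against the source view before modeling makes it easy to spot columns that should be cast in Snowflake first, which is also the article's recommendation for FLOAT and TIME.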

5. Conclusion – The most important advantages at a glance

The live connection to Snowflake Data Cloud data from SAP Analytics Cloud offers the following advantages in addition to fast, efficient, and transparent data access:

  • Thanks to a live connection to SAP Analytics Cloud, the analysis processes work directly with the Snowflake environment, so no data needs to be replicated outside of Snowflake. Snowflake's powerful data processing engine enables extremely fast processing of live queries, with performance tested on over a billion rows of data.
  • The data in the Snowflake Data Cloud is always up to date without data replication or data latency.

6. List of sources

[1] What is Snowflake? Architecture, advantages, costs – Datasolut GmbH

[2] Overview Diagram for Live Data Connections | SAP Help Portal

[3] What is Snowflake? Architecture, advantages, costs – Datasolut GmbH

[4] What is Snowflake? Architecture, advantages, costs – Datasolut GmbH

[5] Create Model for Live Data Access | SAP Help Portal

[6] Managing the Data Access Agent for Live Data Connections to SQL Data Sources | SAP Help Portal

[7] Managing the Data Access Agent for Live Data Connections to SQL Data Sources | SAP Help Portal

[8] Creating a Model for Live Data Access | SAP Help Portal

The live connection between SAC and Snowflake offers enormous potential.

Our experts will help you get the most out of it—from architectural planning to performance tuning. Contact us for a no-obligation initial consultation.

Book an appointment here with our SAP Analytics expert Robert Kehrli.

Published by:

Marie Daipo

author

