Tuesday, September 16, 2025

Break down data silos and seamlessly query Iceberg tables in Amazon SageMaker from Snowflake

Organizations often struggle to unify their data ecosystems across multiple platforms and services. The connectivity between Amazon SageMaker and Snowflake's AI Data Cloud offers a powerful solution to this challenge, so businesses can take advantage of the strengths of both environments while maintaining a cohesive data strategy.

In this post, we demonstrate how you can break down data silos and enhance your analytical capabilities by querying Apache Iceberg tables in the lakehouse architecture of SageMaker directly from Snowflake. With this capability, you can access and analyze data stored in Amazon Simple Storage Service (Amazon S3) through the AWS Glue Data Catalog using an AWS Glue Iceberg REST endpoint, all secured by AWS Lake Formation, without the need for complex extract, transform, and load (ETL) processes or data duplication. You can also automate table discovery and refresh using Snowflake catalog-linked databases for Iceberg. In the following sections, we show how to set up this integration so Snowflake users can seamlessly query and analyze data stored in AWS, thereby enhancing data accessibility, reducing redundancy, and enabling more comprehensive analytics across your entire data ecosystem.

Business use cases and key benefits

The capability to query Iceberg tables in SageMaker from Snowflake delivers significant value across multiple industries:

  • Financial services – Enhance fraud detection through unified analysis of transaction data and customer behavior patterns
  • Healthcare – Improve patient outcomes through integrated access to clinical, claims, and research data
  • Retail – Improve customer retention rates by connecting sales, inventory, and customer behavior data for personalized experiences
  • Manufacturing – Boost production efficiency through unified sensor and operational data analytics
  • Telecommunications – Reduce customer churn with comprehensive analysis of network performance and customer usage data

Key benefits of this capability include:

  • Accelerated decision-making – Reduce time to insight through integrated data access across platforms
  • Cost optimization – Lower costs by querying data directly in storage without the need for ingestion
  • Improved data fidelity – Reduce data inconsistencies by establishing a single source of truth
  • Enhanced collaboration – Improve cross-functional productivity through simplified data sharing between data scientists and analysts

By using the lakehouse architecture of SageMaker with Snowflake's serverless and zero-tuning computational power, you can break down data silos, enabling comprehensive analytics and democratizing data access. This integration supports a modern data architecture that prioritizes flexibility, security, and analytical performance, ultimately driving faster, more informed decision-making across the enterprise.

Solution overview

The following diagram shows the architecture for catalog integration between Snowflake and Iceberg tables in the lakehouse.

Catalog integration to query Iceberg tables in S3 bucket using Iceberg REST Catalog (IRC) with credential vending

The workflow consists of the following components:

  • Data storage and management:
    • Amazon S3 serves as the primary storage layer, hosting the Iceberg table data
    • The Data Catalog maintains the metadata for these tables
    • Lake Formation provides credential vending
  • Authentication flow:
    • Snowflake initiates queries using a catalog integration configuration
    • Lake Formation vends temporary credentials through AWS Security Token Service (AWS STS)
    • These credentials are automatically refreshed based on the configured refresh interval
  • Query flow:
    • Snowflake users submit queries against the mounted Iceberg tables
    • The AWS Glue Iceberg REST endpoint processes these requests
    • Query execution uses Snowflake's compute resources while reading directly from Amazon S3
    • Results are returned to Snowflake users while maintaining all security controls
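The same AWS Glue Iceberg REST endpoint in this flow can be reached by any Iceberg REST client, not only Snowflake. As an illustration only, here is a minimal PyIceberg configuration sketch for that endpoint; the Region, account ID, and namespace shown are placeholder assumptions, not values from this post:

```python
# Sketch: configuring a PyIceberg REST catalog client for the AWS Glue
# Iceberg REST endpoint, with SigV4 request signing enabled.
# REGION and ACCOUNT_ID are hypothetical placeholders.
REGION = "us-east-1"          # assumption: your AWS Region
ACCOUNT_ID = "123456789012"   # assumption: your AWS account ID (Glue catalog ID)

catalog_properties = {
    "type": "rest",
    "uri": f"https://glue.{REGION}.amazonaws.com/iceberg",
    "warehouse": ACCOUNT_ID,           # Glue catalog ID
    "rest.sigv4-enabled": "true",      # sign REST requests with SigV4
    "rest.signing-name": "glue",
    "rest.signing-region": REGION,
}

# With AWS credentials available in the environment, the catalog could
# then be loaded and the table read:
# from pyiceberg.catalog import load_catalog
# catalog = load_catalog("glue_irc", **catalog_properties)
# table = catalog.load_table("iceberg_db.customer")

print(catalog_properties["uri"])
```

This mirrors what Snowflake's catalog integration does under the hood: SigV4-signed REST calls to the Glue endpoint, with data read directly from Amazon S3.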

There are four patterns to query Iceberg tables in SageMaker from Snowflake:

  • Iceberg tables in an S3 bucket using an AWS Glue Iceberg REST endpoint and Snowflake Iceberg REST catalog integration, with credential vending from Lake Formation
  • Iceberg tables in an S3 bucket using an AWS Glue Iceberg REST endpoint and Snowflake Iceberg REST catalog integration, using Snowflake external volumes to Amazon S3 data storage
  • Iceberg tables in an S3 bucket using AWS Glue API catalog integration, also using Snowflake external volumes to Amazon S3
  • Amazon S3 Tables using Iceberg REST catalog integration with credential vending from Lake Formation

In this post, we implement the first of these four access patterns using catalog integration for the AWS Glue Iceberg REST endpoint with Signature Version 4 (SigV4) authentication in Snowflake.

Prerequisites

You must have the following prerequisites:

The solution takes approximately 30–45 minutes to set up. Cost varies based on data volume and query frequency. Use the AWS Pricing Calculator for specific estimates.

Create an IAM role for Snowflake

To create an IAM role for Snowflake, you first create a policy for the role:

  1. On the IAM console, choose Policies in the navigation pane.
  2. Choose Create policy.
  3. Choose the JSON editor and enter the following policy (provide your AWS Region and account ID), then choose Next.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowGlueCatalogTableAccess",
            "Effect": "Allow",
            "Action": [
                "glue:GetCatalog",
                "glue:GetCatalogs",
                "glue:GetPartitions",
                "glue:GetPartition",
                "glue:GetDatabase",
                "glue:GetDatabases",
                "glue:GetTable",
                "glue:GetTables",
                "glue:UpdateTable"
            ],
            "Resource": [
                "arn:aws:glue:<region>:<account-id>:catalog",
                "arn:aws:glue:<region>:<account-id>:database/iceberg_db",
                "arn:aws:glue:<region>:<account-id>:table/iceberg_db/*"
            ]
        },
        {
            "Effect": "Allow",
            "Action": [
                "lakeformation:GetDataAccess"
            ],
            "Resource": "*"
        }
    ]
}

  4. Enter iceberg-table-access as the policy name.
  5. Choose Create policy.
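If you prefer to script this step, the policy document can also be generated programmatically before passing it to your IAM tooling. A minimal sketch (the Region, account ID, and `glue_access_policy` helper name here are illustrative assumptions, not part of the console walkthrough):

```python
import json

def glue_access_policy(region: str, account_id: str, database: str = "iceberg_db") -> dict:
    """Build the IAM policy document above for a given Region and account."""
    arn = f"arn:aws:glue:{region}:{account_id}"
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "AllowGlueCatalogTableAccess",
                "Effect": "Allow",
                "Action": [
                    "glue:GetCatalog", "glue:GetCatalogs",
                    "glue:GetPartitions", "glue:GetPartition",
                    "glue:GetDatabase", "glue:GetDatabases",
                    "glue:GetTable", "glue:GetTables",
                    "glue:UpdateTable",
                ],
                "Resource": [
                    f"{arn}:catalog",
                    f"{arn}:database/{database}",
                    f"{arn}:table/{database}/*",
                ],
            },
            {
                "Effect": "Allow",
                "Action": ["lakeformation:GetDataAccess"],
                "Resource": "*",
            },
        ],
    }

# Example: render the policy for a hypothetical Region and account ID
print(json.dumps(glue_access_policy("us-east-1", "123456789012"), indent=2))
```

Generating the document this way avoids hand-editing the ARNs for each Region and account.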

Now you can create the role and attach the policy you created.

  1. Choose Roles in the navigation pane.
  2. Choose Create role.
  3. Choose AWS account.
  4. Under Options, select Require external ID and enter an external ID of your choice.
  5. Choose Next.
  6. Choose the policy you created (iceberg-table-access).
  7. Enter snowflake_access_role as the role name.
  8. Choose Create role.

Configure Lake Formation access controls

To configure your Lake Formation access controls, first set up the application integration:

  1. Sign in to the Lake Formation console as a data lake administrator.
  2. Choose Administration in the navigation pane.
  3. Select Application integration settings.
  4. Enable Allow external engines to access data in Amazon S3 locations with full table access.
  5. Choose Save.

Now you can grant permissions to the IAM role.

  1. Choose Data permissions in the navigation pane.
  2. Choose Grant.
  3. Configure the following settings:
    1. For Principals, select IAM users and roles and choose snowflake_access_role.
    2. For Resources, select Named Data Catalog resources.
    3. For Catalog, choose your AWS account ID.
    4. For Database, choose iceberg_db.
    5. For Table, choose customer.
    6. For Permissions, select SUPER.
  4. Choose Grant.

SUPER access is required for mounting the Iceberg table in Amazon S3 as a Snowflake table.

Register the S3 data lake location

Complete the following steps to register the S3 data lake location:

  1. As data lake administrator on the Lake Formation console, choose Data lake locations in the navigation pane.
  2. Choose Register location.
  3. Configure the following:
    1. For Amazon S3 path, enter the S3 path to the bucket where you will store your data.
    2. For IAM role, choose LakeFormationLocationRegistrationRole.
    3. For Permission mode, select Lake Formation.
  4. Choose Register location.

Set up the Iceberg REST integration in Snowflake

Complete the following steps to set up the Iceberg REST integration in Snowflake:

  1. Log in to Snowflake as an admin user.
  2. Execute the following SQL command (provide your Region, account ID, and the external ID that you supplied during IAM role creation):

CREATE OR REPLACE CATALOG INTEGRATION glue_irc_catalog_int
  CATALOG_SOURCE = ICEBERG_REST
  TABLE_FORMAT = ICEBERG
  CATALOG_NAMESPACE = 'iceberg_db'
  REST_CONFIG = (
    CATALOG_URI = 'https://glue.<region>.amazonaws.com/iceberg'
    CATALOG_API_TYPE = AWS_GLUE
    CATALOG_NAME = '<account-id>'
    ACCESS_DELEGATION_MODE = VENDED_CREDENTIALS
  )
  REST_AUTHENTICATION = (
    TYPE = SIGV4
    SIGV4_IAM_ROLE = 'arn:aws:iam::<account-id>:role/snowflake_access_role'
    SIGV4_SIGNING_REGION = '<region>'
    SIGV4_EXTERNAL_ID = '<external-id>'
  )
  REFRESH_INTERVAL_SECONDS = 120
  ENABLED = TRUE;

  3. Execute the following SQL command and retrieve the value for API_AWS_IAM_USER_ARN:

DESCRIBE CATALOG INTEGRATION glue_irc_catalog_int;

  4. On the IAM console, update the trust relationship for snowflake_access_role with the value for API_AWS_IAM_USER_ARN:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "<API_AWS_IAM_USER_ARN>"
                ]
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "sts:ExternalId": [
                        "<external-id>"
                    ]
                }
            }
        }
    ]
}
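This trust policy update can also be scripted. A short sketch that fills in the Snowflake-managed IAM user ARN and the external ID (the `snowflake_trust_policy` helper and both example values are hypothetical, for illustration only):

```python
import json

def snowflake_trust_policy(api_aws_iam_user_arn: str, external_id: str) -> dict:
    """Build the trust policy that lets the Snowflake-managed IAM user
    assume snowflake_access_role, gated by the external ID."""
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "",
                "Effect": "Allow",
                "Principal": {"AWS": [api_aws_iam_user_arn]},
                "Action": "sts:AssumeRole",
                "Condition": {
                    "StringEquals": {"sts:ExternalId": [external_id]}
                },
            }
        ],
    }

# Hypothetical values for illustration only
policy = snowflake_trust_policy(
    "arn:aws:iam::123456789012:user/example-snowflake-user",
    "my-external-id",
)
print(json.dumps(policy, indent=2))
```

The external ID condition is what prevents a confused-deputy scenario: only a caller that presents both the Snowflake IAM user identity and your chosen external ID can assume the role.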

  5. Verify the catalog integration:

SELECT SYSTEM$VERIFY_CATALOG_INTEGRATION('glue_irc_catalog_int');

  6. Mount the S3 table as a Snowflake table:

CREATE OR REPLACE ICEBERG TABLE s3iceberg_customer
  CATALOG = 'glue_irc_catalog_int'
  CATALOG_NAMESPACE = 'iceberg_db'
  CATALOG_TABLE_NAME = 'customer'
  AUTO_REFRESH = TRUE;

Query the Iceberg table from Snowflake

To test the configuration, log in to Snowflake as an admin user and run the following sample query:

SELECT * FROM s3iceberg_customer LIMIT 10;

Clean up

To clean up your resources, complete the following steps:

  1. Delete the database and table in AWS Glue.
  2. Drop the Iceberg table and catalog integration in Snowflake:

DROP ICEBERG TABLE s3iceberg_customer;
DROP CATALOG INTEGRATION glue_irc_catalog_int;

Make sure that all resources are properly cleaned up to avoid unexpected charges.

Conclusion

In this post, we demonstrated how to establish a secure and efficient connection between your Snowflake environment and SageMaker to query Iceberg tables in Amazon S3. This capability can help your organization maintain a single source of truth while also letting teams use their preferred analytics tools, ultimately breaking down data silos and enhancing collaborative analysis capabilities.

To further explore and implement this solution in your environment, consider the following resources:

  • Technical documentation:
  • Related blog posts:

These resources can help you implement and optimize this integration pattern for your specific use case. As you begin this journey, remember to start small, validate your architecture with test data, and gradually scale your implementation based on your organization's needs.


About the authors

Nidhi Gupta

Nidhi is a Senior Partner Solutions Architect at AWS, specializing in data and analytics. She helps customers and partners build and optimize Snowflake workloads on AWS. Nidhi has extensive experience leading production releases and deployments, with a focus on data, AI, ML, generative AI, and advanced analytics.

Andries Engelbrecht

Andries is a Principal Partner Solutions Engineer at Snowflake working with AWS. He supports product and service integrations, as well as the development of joint solutions with AWS. Andries has over 25 years of experience in the field of data and analytics.
