
ARA-C01 SnowPro Advanced: Architect Certification Exam Questions and Answers

Questions 4

A group of order_admin users must delete records older than 5 years from an orders table without having DELETE privileges. The order_manager role has DELETE privileges.

How can this be achieved?

Options:

A.

Create a stored procedure with caller’s rights.

B.

Create a stored procedure with selectable caller/owner rights.

C.

Create a stored procedure that runs with owner’s rights and grant usage to order_admin.

D.

This is not possible without DELETE privileges.
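
For context, the owner's-rights pattern referenced in these options can be sketched as follows. This is a minimal, hypothetical example; the orders table, its order_date column, and the procedure name are illustrative:

USE ROLE ORDER_MANAGER;

CREATE OR REPLACE PROCEDURE purge_old_orders()
RETURNS STRING
LANGUAGE SQL
EXECUTE AS OWNER
AS
$$
BEGIN
  -- Runs with the privileges of the owning role (ORDER_MANAGER),
  -- so callers do not need DELETE on the table themselves.
  DELETE FROM orders WHERE order_date < DATEADD(year, -5, CURRENT_DATE());
  RETURN 'purge complete';
END;
$$;

GRANT USAGE ON PROCEDURE purge_old_orders() TO ROLE ORDER_ADMIN;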

Questions 5

Database DB1 has schema S1 which has one table, T1.

DB1 --> S1 --> T1

The retention period of DB1 is set to 10 days.

The retention period of S1 is set to 20 days.

The retention period of T1 is set to 30 days.

The user runs the following command:

DROP DATABASE DB1;

What will the Time Travel retention period be for T1?

Options:

A.

10 days

B.

20 days

C.

30 days

D.

37 days
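
For reference, the retention settings in the stem correspond to DDL along these lines (a sketch of the setup, not part of the original question):

ALTER DATABASE DB1 SET DATA_RETENTION_TIME_IN_DAYS = 10;
ALTER SCHEMA DB1.S1 SET DATA_RETENTION_TIME_IN_DAYS = 20;
ALTER TABLE DB1.S1.T1 SET DATA_RETENTION_TIME_IN_DAYS = 30;

DROP DATABASE DB1;
-- Within the applicable retention window, UNDROP DATABASE DB1; would restore it.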

Questions 6

When using the COPY INTO <table> command with the CSV file format, how does the MATCH_BY_COLUMN_NAME parameter behave?

Options:

A.

It expects a header to be present in the CSV file, which is matched to a case-sensitive table column name.

B.

The parameter will be ignored.

C.

The command will return an error.

D.

The command will return a warning stating that the file has unmatched columns.
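
As a point of reference, the parameter is written as shown below. Its behavior with CSV has varied across Snowflake releases (current versions honor it when the CSV file format sets PARSE_HEADER = TRUE), which is what this question probes; table and stage names here are illustrative:

COPY INTO my_table
FROM @my_stage
FILE_FORMAT = (TYPE = CSV PARSE_HEADER = TRUE)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE;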

Questions 7

A company has an inbound share set up with eight tables and five secure views. The company plans to make the share part of its production data pipelines.

Which actions can the company take with the inbound share? (Choose two.)

Options:

A.

Clone a table from a share.

B.

Grant modify permissions on the share.

C.

Create a table from the shared database.

D.

Create additional views inside the shared database.

E.

Create a table stream on the shared table.

Questions 8

How can the Snowflake context functions be used to help determine whether a user is authorized to see data that has column-level security enforced? (Select TWO).

Options:

A.

Set masking policy conditions using current_role targeting the role in use for the current session.

B.

Set masking policy conditions using is_role_in_session targeting the role in use for the current account.

C.

Set masking policy conditions using invoker_role targeting the executing role in a SQL statement.

D.

Determine if there are ownership privileges on the masking policy that would allow the use of any function.

E.

Assign the accountadmin role to the user who is executing the object.
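
To illustrate how a masking policy can branch on context functions (policy, role, and literal values here are hypothetical):

CREATE OR REPLACE MASKING POLICY pii_mask AS (val STRING) RETURNS STRING ->
CASE
  WHEN CURRENT_ROLE() IN ('PII_READER') THEN val           -- role in use for the current session
  WHEN INVOKER_ROLE() IN ('REPORTING_VIEW_ROLE') THEN val  -- executing role in a SQL statement
  ELSE '*** MASKED ***'
END;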

Questions 9

An Architect wants to integrate Snowflake with a Git repository that requires authentication. What is the correct sequence of steps to be followed?

Options:

Questions 10

How can an Architect enable optimal clustering to enhance performance for different access paths on a given table?

Options:

A.

Create multiple clustering keys for a table.

B.

Create multiple materialized views with different cluster keys.

C.

Create super projections that will automatically create clustering.

D.

Create a clustering key that contains all columns used in the access paths.

Questions 11

The following statements have been executed successfully:

USE ROLE SYSADMIN;
CREATE OR REPLACE DATABASE DEV_TEST_DB;
CREATE OR REPLACE SCHEMA DEV_TEST_DB.SCHTEST WITH MANAGED ACCESS;
GRANT USAGE ON DATABASE DEV_TEST_DB TO ROLE DEV_PROJ_OWN;
GRANT USAGE ON SCHEMA DEV_TEST_DB.SCHTEST TO ROLE DEV_PROJ_OWN;
GRANT USAGE ON DATABASE DEV_TEST_DB TO ROLE ANALYST_PROJ;
GRANT USAGE ON SCHEMA DEV_TEST_DB.SCHTEST TO ROLE ANALYST_PROJ;
GRANT CREATE TABLE ON SCHEMA DEV_TEST_DB.SCHTEST TO ROLE DEV_PROJ_OWN;

USE ROLE DEV_PROJ_OWN;
CREATE OR REPLACE TABLE DEV_TEST_DB.SCHTEST.CURRENCY (
  COUNTRY VARCHAR(255),
  CURRENCY_NAME VARCHAR(255),
  ISO_CURRENCY_CODE VARCHAR(15),
  CURRENCY_CD NUMBER(38,0),
  MINOR_UNIT VARCHAR(255),
  WITHDRAWAL_DATE VARCHAR(255)
);

The role hierarchy is as follows (simplified from the diagram):

    ACCOUNTADMIN
    └─ DEV_SYSADMIN
       └─ DEV_PROJ_OWN
          └─ ANALYST_PROJ

Separately:

    ACCOUNTADMIN
    └─ SYSADMIN
       └─ MAPPING_ROLE

Which statements will return the records from the table DEV_TEST_DB.SCHTEST.CURRENCY? (Select TWO)

Options:

A.

USE ROLE DEV_PROJ_OWN;

GRANT SELECT ON DEV_TEST_DB.SCHTEST.CURRENCY TO ROLE ANALYST_PROJ;

USE ROLE ANALYST_PROJ;

SELECT * FROM DEV_TEST_DB.SCHTEST.CURRENCY;

B.

USE ROLE DEV_PROJ_OWN;

SELECT * FROM DEV_TEST_DB.SCHTEST.CURRENCY;

C.

USE ROLE SYSADMIN;

SELECT * FROM DEV_TEST_DB.SCHTEST.CURRENCY;

D.

USE ROLE MAPPING_ROLE;

SELECT * FROM DEV_TEST_DB.SCHTEST.CURRENCY;

E.

USE ROLE ACCOUNTADMIN;

SELECT * FROM DEV_TEST_DB.SCHTEST.CURRENCY;

Questions 12

A company has a source system that provides JSON records for various IoT operations. The JSON is loaded directly into a persistent table with a VARIANT field. The data is quickly growing to hundreds of millions of records, and performance is becoming an issue. There is a generic access pattern that is used to filter on the create_date key within the VARIANT field.

What can be done to improve performance?

Options:

A.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of timestamp. When this field is used in the filter, partition pruning will occur.

B.

Alter the target table to include additional fields pulled from the JSON records. This would include a create_date field with a datatype of varchar. When this field is used in the filter, partition pruning will occur.

C.

Validate the size of the warehouse being used. If the record count is approaching 100s of millions, size XL will be the minimum size required to process this amount of data.

D.

Incorporate the use of multiple tables partitioned by date ranges. When a user or process needs to query a particular date range, ensure the appropriate base table is used.
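
A sketch of the column-extraction approach described in option A, assuming a table iot_events with a VARIANT column v (all names illustrative):

ALTER TABLE iot_events ADD COLUMN create_date TIMESTAMP_NTZ;

UPDATE iot_events
SET create_date = v:create_date::TIMESTAMP_NTZ;

-- Filtering on the typed column lets Snowflake prune micro-partitions:
SELECT COUNT(*)
FROM iot_events
WHERE create_date >= DATEADD(day, -7, CURRENT_TIMESTAMP());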

Questions 13

An Architect is using SnowCD to investigate a connectivity issue.

Which system function will provide a list of endpoints that the network must be able to access to use a specific Snowflake account, leveraging private connectivity?

Options:

A.

SYSTEM$ALLOWLIST()

B.

SYSTEM$GET_PRIVATELINK

C.

SYSTEM$AUTHORIZE_PRIVATELINK

D.

SYSTEM$ALLOWLIST_PRIVATELINK()
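
These system functions are called like ordinary scalar functions; the allowlist variants return a JSON array of host/port endpoints that SnowCD can then test:

SELECT SYSTEM$ALLOWLIST();             -- endpoints for public connectivity
SELECT SYSTEM$ALLOWLIST_PRIVATELINK(); -- endpoints for private connectivity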

Questions 14

A healthcare company wants to share data with a medical institute. The institute is running a Standard edition of Snowflake; the healthcare company is running a Business Critical edition.

How can this data be shared?

Options:

A.

The healthcare company will need to change the institute’s Snowflake edition in the accounts panel.

B.

By default, sharing is supported from a Business Critical Snowflake edition to a Standard edition.

C.

Contact Snowflake and they will execute the share request for the healthcare company.

D.

Set the share_restriction parameter on the shared object to false.

Questions 15

A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage, leveraging event notifications. Also, the operational complexity, maintenance of the infrastructure (including platform upgrades and security), and the development effort should be minimal.

Which design will meet these requirements?

Options:

A.

Ingest the data using COPY INTO and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.
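
For orientation, the external-function pattern mentioned in option B looks roughly like this; the API integration name and endpoint URL are placeholders:

CREATE OR REPLACE EXTERNAL FUNCTION sentiment(review STRING)
RETURNS VARIANT
API_INTEGRATION = aws_api_integration
AS 'https://example.execute-api.us-west-2.amazonaws.com/prod/comprehend';

-- The function can then be called from the SQL that a task runs, e.g.:
-- SELECT review_id, sentiment(review_text) FROM staged_reviews;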

Questions 16

An Architect has a table called leader_follower that contains a single column named JSON. The table has one row with the following structure:

{
  "activities": [
    { "activityNumber": 1, "winner": 5 },
    { "activityNumber": 2, "winner": 4 }
  ],
  "follower": {
    "name": { "default": "Matt" },
    "number": 4
  },
  "leader": {
    "name": { "default": "Adam" },
    "number": 5
  }
}

Which query will produce the following results?

ACTIVITY_NUMBER | WINNER_NAME
1               | Adam
2               | Matt

Options:

A.

SELECT lf.json:activities.activityNumber AS activity_number,
  IFF(
    lf.json:activities.activityNumber = lf.json:leader.number,
    lf.json:leader.name.default,
    lf.json:follower.name.default
  )::VARCHAR
FROM leader_follower lf;

B.

SELECT
  value:activityNumber AS activity_number,
  IFF(
    value:winner = lf.json:leader.number,
    lf.json:leader.name.default,
    lf.json:follower.name.default
  )::VARCHAR AS winner_name
FROM leader_follower lf,
  LATERAL FLATTEN(input => json:activities) p;

C.

SELECT
  value:activityNumber AS activity_number,
  IFF(
    value:winner = lf.json:leader.number,
    lf.json:leader,
    lf.json:follower
  )::VARCHAR AS winner_name
FROM leader_follower lf,
  LATERAL FLATTEN(input => json:activities) p;
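
To experiment with these query shapes, the one-row table can be recreated as follows (a sketch):

CREATE OR REPLACE TABLE leader_follower (json VARIANT);

INSERT INTO leader_follower
SELECT PARSE_JSON('{"activities":[{"activityNumber":1,"winner":5},{"activityNumber":2,"winner":4}],"follower":{"name":{"default":"Matt"},"number":4},"leader":{"name":{"default":"Adam"},"number":5}}');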

Questions 17

A global company needs to securely share its sales and inventory data with a vendor using a Snowflake account.

The company has its Snowflake account in the AWS eu-west-2 Europe (London) region. The vendor's Snowflake account is on the Azure platform in the West Europe region.

How should the company's Architect configure the data share?

Options:

A.

1. Create a share.
2. Add objects to the share.
3. Add a consumer account to the share for the vendor to access.

B.

1. Create a share.
2. Create a reader account for the vendor to use.
3. Add the reader account to the share.

C.

1. Create a new role called db_share.
2. Grant the db_share role privileges to read data from the company database and schema.
3. Create a user for the vendor.
4. Grant the db_share role to the vendor's users.

D.

1. Promote an existing database in the company's local account to primary.
2. Replicate the database to Snowflake on Azure in the West Europe region.
3. Create a share and add objects to the share.
4. Add a consumer account to the share for the vendor to access.

Questions 18

An Architect has been asked to clone schema STAGING as it looked one week ago, Tuesday June 1st at 8:00 AM, to recover some objects.

The STAGING schema has 50 days of retention.

The Architect runs the following statement:

CREATE SCHEMA STAGING_CLONE CLONE STAGING at (timestamp => '2021-06-01 08:00:00');

The Architect receives the following error: Time travel data is not available for schema STAGING. The requested time is either beyond the allowed time travel period or before the object creation time.

The Architect then checks the schema history and sees the following:

CREATED_ON          | NAME    | DROPPED_ON
2021-06-02 23:00:00 | STAGING | NULL
2021-05-01 10:00:00 | STAGING | 2021-06-02 23:00:00

How can cloning the STAGING schema be achieved?

Options:

A.

Undrop the STAGING schema and then rerun the CLONE statement.

B.

Modify the statement: CREATE SCHEMA STAGING_CLONE CLONE STAGING at (timestamp => '2021-05-01 10:00:00');

C.

Rename the STAGING schema and perform an UNDROP to retrieve the previous STAGING schema version, then run the CLONE statement.

D.

Cloning cannot be accomplished because the STAGING schema version was not active during the proposed Time Travel time period.
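
The rename-and-undrop sequence described in option C would look roughly like this (a sketch; the current schema must be moved aside first because UNDROP cannot overwrite an existing object of the same name):

ALTER SCHEMA STAGING RENAME TO STAGING_CURRENT;
UNDROP SCHEMA STAGING;
CREATE SCHEMA STAGING_CLONE CLONE STAGING AT (TIMESTAMP => '2021-06-01 08:00:00'::TIMESTAMP_LTZ);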

Questions 19

A media company needs a data pipeline that will ingest customer review data into a Snowflake table, and apply some transformations. The company also needs to use Amazon Comprehend to do sentiment analysis and make the de-identified final data set available publicly for advertising companies who use different cloud providers in different regions.

The data pipeline needs to run continuously and efficiently as new records arrive in the object storage leveraging event notifications. Also, the operational complexity, maintenance of the infrastructure, including platform upgrades and security, and the development effort should be minimal.

Which design will meet these requirements?

Options:

A.

Ingest the data using copy into and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

B.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Create an external function to do model inference with Amazon Comprehend and write the final records to a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

C.

Ingest the data into Snowflake using Amazon EMR and PySpark using the Snowflake Spark connector. Apply transformations using another Spark job. Develop a python program to do model inference by leveraging the Amazon Comprehend text analysis API. Then write the results to a Snowflake table and create a listing in the Snowflake Marketplace to make the data available to other companies.

D.

Ingest the data using Snowpipe and use streams and tasks to orchestrate transformations. Export the data into Amazon S3 to do model inference with Amazon Comprehend and ingest the data back into a Snowflake table. Then create a listing in the Snowflake Marketplace to make the data available to other companies.

Questions 20

Company A would like to share data in Snowflake with Company B. Company B is not on the same cloud platform as Company A.

What is required to allow data sharing between these two companies?

Options:

A.

Create a pipeline to write shared data to a cloud storage location in the target cloud provider.

B.

Ensure that all views are persisted, as views cannot be shared across cloud platforms.

C.

Set up data replication to the region and cloud platform where the consumer resides.

D.

Company A and Company B must agree to use a single cloud platform: Data sharing is only possible if the companies share the same cloud provider.

Questions 21

An Architect needs to grant a group of ORDER_ADMIN users the ability to clean old data in an ORDERS table (deleting all records older than 5 years), without granting any privileges on the table. The group’s manager (ORDER_MANAGER) has full DELETE privileges on the table.

How can the ORDER_ADMIN role be enabled to perform this data cleanup, without needing the DELETE privilege held by the ORDER_MANAGER role?

Options:

A.

Create a stored procedure that runs with caller’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

B.

Create a stored procedure that can be run using both caller’s and owner’s rights (allowing the user to specify which rights are used during execution), and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

C.

Create a stored procedure that runs with owner’s rights, including the appropriate "> 5 years" business logic, and grant USAGE on this procedure to ORDER_ADMIN. The ORDER_MANAGER role owns the procedure.

D.

This scenario would actually not be possible in Snowflake – any user performing a DELETE on a table requires the DELETE privilege to be granted to the role they are using.

Questions 22

A user, analyst_user, has been granted the analyst_role and is deploying a SnowSQL script to run as a background service to extract data from Snowflake.

What steps should be taken to allow access from only the required IP addresses? (Select TWO).

Options:

A.

ALTER ROLE ANALYST_ROLE SET NETWORK_POLICY = 'ANALYST_POLICY';

B.

ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';

C.

ALTER USER ANALYST_USER SET NETWORK_POLICY = '10.1.1.20';

D.

USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');

E.

USE ROLE USERADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
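
Combining a policy definition with its assignment, one end-to-end flow would resemble the following sketch (role, policy, and IP values as given in the options):

USE ROLE SECURITYADMIN;
CREATE OR REPLACE NETWORK POLICY ANALYST_POLICY ALLOWED_IP_LIST = ('10.1.1.20');
ALTER USER ANALYST_USER SET NETWORK_POLICY = 'ANALYST_POLICY';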

Questions 23

What Snowflake system functions are used to view and/or monitor the clustering metadata for a table? (Select TWO).

Options:

A.

SYSTEM$CLUSTERING

B.

SYSTEM$TABLE_CLUSTERING

C.

SYSTEM$CLUSTERING_DEPTH

D.

SYSTEM$CLUSTERING_RATIO

E.

SYSTEM$CLUSTERING_INFORMATION
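
The clustering functions take the table name and, optionally, a column list, for example (table and columns illustrative):

SELECT SYSTEM$CLUSTERING_INFORMATION('events', '(c_date, a_id)');
SELECT SYSTEM$CLUSTERING_DEPTH('events', '(c_date, a_id)');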

Questions 24

An Architect runs the following SQL query:

How can this query be interpreted?

Options:

A.

FILEROWS is a stage. FILE_ROW_NUMBER is the line number in the file.

B.

FILEROWS is the table. FILE_ROW_NUMBER is the line number in the table.

C.

FILEROWS is a file. FILE_ROW_NUMBER is the file format location.

D.

FILEROWS is the file format location. FILE_ROW_NUMBER is a stage.

Questions 25

Which steps are recommended best practices for prioritizing cluster keys in Snowflake? (Choose two.)

Options:

A.

Choose columns that are frequently used in join predicates.

B.

Choose lower cardinality columns to support clustering keys and cost effectiveness.

C.

Choose TIMESTAMP columns with nanoseconds for the highest number of unique rows.

D.

Choose cluster columns that are most actively used in selective filters.

E.

Choose cluster columns that are actively used in the GROUP BY clauses.

Questions 26

Which command will create a schema without Fail-safe and will restrict object owners from passing on access to other users?

Options:

A.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS;

B.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

C.

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 1;

D.

create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

Questions 27

A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

Options:

A.

Use, at minimum, the Business Critical edition of Snowflake.

B.

Create Dynamic Data Masking policies and apply them to columns that contain PHI.

C.

Use the Internal Tokenization feature to obfuscate sensitive data.

D.

Use the External Tokenization feature to obfuscate sensitive data.

E.

Rewrite SQL queries to eliminate projections of PHI data based on current_role().

F.

Avoid sharing data with partner organizations.

Questions 28

An Architect entered the following commands in sequence:

USER1 cannot find the table.

Which of the following commands does the Architect need to run for USER1 to find the tables using the Principle of Least Privilege? (Choose two.)

Options:

A.

GRANT ROLE PUBLIC TO ROLE INTERN;

B.

GRANT USAGE ON DATABASE SANDBOX TO ROLE INTERN;

C.

GRANT USAGE ON SCHEMA SANDBOX.PUBLIC TO ROLE INTERN;

D.

GRANT OWNERSHIP ON DATABASE SANDBOX TO USER INTERN;

E.

GRANT ALL PRIVILEGES ON DATABASE SANDBOX TO ROLE INTERN;

Questions 29

An event table has 150B rows and 1.5M micro-partitions, with the following statistics:

Column      | NDV*
A_ID        | 11K
C_DATE      | 110
NAME        | 300K
EVENT_ACT_0 | 1.1G
EVENT_ACT_4 | 2.2G

*NDV = Number of Distinct Values

What three clustering keys should be used, in order?

Options:

A.

C_DATE, A_ID, NAME

B.

A_ID, NAME, C_DATE

C.

C_DATE, A_ID, EVENT_ACT_0

D.

C_DATE, A_ID, EVENT_ACT_4
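
Whichever ordering is chosen, a multi-column clustering key is applied with a single statement such as the sketch below (Snowflake's general guidance is to order keys from lowest to highest cardinality):

ALTER TABLE events CLUSTER BY (c_date, a_id, name);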

Questions 30

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

Options:

A.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.

Configure the client application to issue a COPY INTO <table> command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Questions 31

A table contains five columns and it has millions of records. The cardinality distribution of the columns is shown below:

Columns C4 and C5 are mostly used by SELECT queries in the GROUP BY and ORDER BY clauses, whereas columns C1, C2, and C3 are heavily used in filter and join conditions of SELECT queries.

The Architect must design a clustering key for this table to improve the query performance.

Based on Snowflake recommendations, how should the clustering key columns be ordered while defining the multi-column clustering key?

Options:

A.

C5, C4, C2

B.

C3, C4, C5

C.

C1, C3, C2

D.

C2, C1, C3

Questions 32

What built-in Snowflake features make use of the change tracking metadata for a table? (Choose two.)

Options:

A.

The MERGE command

B.

The UPSERT command

C.

The CHANGES clause

D.

A STREAM object

E.

The CHANGE_DATA_CAPTURE command
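
For context, a stream and the CHANGES clause both read a table's change tracking metadata, along these lines (table name illustrative; the CHANGES clause requires change tracking to be enabled):

ALTER TABLE orders SET CHANGE_TRACKING = TRUE;

CREATE OR REPLACE STREAM orders_stream ON TABLE orders;

SELECT *
FROM orders
CHANGES (INFORMATION => DEFAULT)
AT (OFFSET => -60*10);  -- changes over roughly the last 10 minutes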

Questions 33

A company has an external vendor who puts data into Google Cloud Storage. The company's Snowflake account is set up in Azure.

What would be the MOST efficient way to load data from the vendor into Snowflake?

Options:

A.

Ask the vendor to create a Snowflake account, load the data into Snowflake and create a data share.

B.

Create an external stage on Google Cloud Storage and use the external table to load the data into Snowflake.

C.

Copy the data from Google Cloud Storage to Azure Blob storage using external tools and load data from Blob storage to Snowflake.

D.

Create a Snowflake Account in the Google Cloud Platform (GCP), ingest data into this account and use data replication to move the data from GCP to Azure.

Questions 34

An Architect is designing a Snowflake architecture to support fast Data Analyst reporting. To optimize costs, the virtual warehouse is configured to auto-suspend after 2 minutes of idle time. Queries are run once in the morning after the data refresh, but queries run later in the day are slow.

Why is this occurring?

Options:

A.

The warehouse is not large enough.

B.

The warehouse was not configured as a multi-cluster warehouse.

C.

The warehouse was not created with USE_CACHE = TRUE.

D.

When the warehouse was suspended, the cache was dropped.

Questions 35

When activating Tri-Secret Secure in a hierarchical encryption model in a Snowflake account, at what level is the customer-managed key used?

Options:

A.

At the root level (HSM)

B.

At the account level (AMK)

C.

At the table level (TMK)

D.

At the micro-partition level

Questions 36

A global company with operations in North America, Europe, and Asia needs to secure its Snowflake environment with a focus on data privacy, secure connectivity, and access control. The company uses AWS and must ensure secure data transfers that comply with regional regulations.

How can these requirements be met? (Select TWO).

Options:

A.

Configure SAML 2.0 to authenticate users in the Snowflake environment.

B.

Configure detailed logging and monitoring of all network traffic using Snowflake native capabilities.

C.

Use public endpoints with SSL encryption to secure data transfers.

D.

Configure network policies to restrict access based on corporate IP ranges.

E.

Use AWS PrivateLink for private connectivity between Snowflake and AWS VPCs.

Questions 37

What are purposes for creating a storage integration? (Choose three.)

Options:

A.

Control access to Snowflake data using a master encryption key that is maintained in the cloud provider’s key management service.

B.

Store a generated identity and access management (IAM) entity for an external cloud provider regardless of the cloud provider that hosts the Snowflake account.

C.

Support multiple external stages using one single Snowflake object.

D.

Avoid supplying credentials when creating a stage or when loading or unloading data.

E.

Create private VPC endpoints that allow direct, secure connectivity between VPCs without traversing the public internet.

F.

Manage credentials from multiple cloud providers in one single Snowflake object.
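
A representative S3 storage integration, plus a stage that references it without supplying any credentials (the role ARN and bucket paths are placeholders):

CREATE OR REPLACE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_access'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/raw/', 's3://my-bucket/curated/');

CREATE OR REPLACE STAGE raw_stage
  URL = 's3://my-bucket/raw/'
  STORAGE_INTEGRATION = s3_int;  -- no credentials needed on the stage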

Questions 38

A data platform team creates two multi-cluster virtual warehouses with the AUTO_SUSPEND value set to NULL on one, and '0' on the other. What would be the execution behavior of these virtual warehouses?

Options:

A.

Setting a '0' or NULL value means the warehouses will never suspend.

B.

Setting a '0' or NULL value means the warehouses will suspend immediately.

C.

Setting a '0' or NULL value means the warehouses will suspend after the default of 600 seconds.

D.

Setting a '0' value means the warehouses will suspend immediately, and NULL means the warehouses will never suspend.

Questions 39

Which organization-related tasks can be performed by the ORGADMIN role? (Choose three.)

Options:

A.

Changing the name of the organization

B.

Creating an account

C.

Viewing a list of organization accounts

D.

Changing the name of an account

E.

Deleting an account

F.

Enabling the replication of a database

Questions 40

A company has built a data pipeline using Snowpipe to ingest files from an Amazon S3 bucket. Snowpipe is configured to load data into staging database tables. Then a task runs to load the data from the staging database tables into the reporting database tables.

The company is satisfied with the availability of the data in the reporting database tables, but the reporting tables are not pruning effectively. Currently, a size 4X-Large virtual warehouse is being used to query all of the tables in the reporting database.

What step can be taken to improve the pruning of the reporting tables?

Options:

A.

Eliminate the use of Snowpipe and load the files into internal stages using PUT commands.

B.

Increase the size of the virtual warehouse to a size 5X-Large.

C.

Use an ORDER BY command to load the reporting tables.

D.

Create larger files for Snowpipe to ingest and ensure the staging frequency does not exceed 1 minute.

Questions 41

What are some of the characteristics of result set caches? (Choose three.)

Options:

A.

Time Travel queries can be executed against the result set cache.

B.

Snowflake persists the data results for 24 hours.

C.

Each time persisted results for a query are used, a 24-hour retention period is reset.

D.

The data stored in the result cache will contribute to storage costs.

E.

The retention period can be reset for a maximum of 31 days.

F.

The result set cache is not shared between warehouses.

Questions 42

Assuming all Snowflake accounts are using an Enterprise edition or higher, in which development and testing scenarios would copying of data be required and zero-copy cloning not be suitable? (Select TWO).

Options:

A.

Developers create their own datasets to work against transformed versions of the live data.

B.

Production and development run in different databases in the same account, and Developers need to see production-like data but with specific columns masked.

C.

Data is in a production Snowflake account that needs to be provided to Developers in a separate development/testing Snowflake account in the same cloud region.

D.

Developers create their own copies of a standard test database previously created for them in the development account, for their initial development and unit testing.

E.

The release process requires pre-production testing of changes with data of production scale and complexity. For security reasons, pre-production also runs in the production account.

Questions 43

Which feature provides the capability to define an alternate cluster key for a table with an existing cluster key?

Options:

A.

External table

B.

Materialized view

C.

Search optimization

D.

Result cache

Questions 44

A Snowflake Architect is working with Data Modelers and Table Designers to draft an ELT framework specifically for data loading using Snowpipe. The Table Designers will add a timestamp column that inserts the current timestamp as the default value as records are loaded into a table. The intent is to capture the time when each record gets loaded into the table; however, when tested, the timestamps are earlier than the LOAD_TIME column values returned by the COPY_HISTORY function or the COPY_HISTORY view (Account Usage).

Why is this occurring?

Options:

A.

The timestamps are different because there are parameter setup mismatches. The parameters need to be realigned.

B.

The Snowflake timezone parameter is different from the cloud provider's parameters, causing the mismatch.

C.

The Table Designer team has not used the LOCALTIMESTAMP or SYSTIMESTAMP functions in the Snowflake COPY statement.

D.

The CURRENT_TIMESTAMP is evaluated when the load operation is compiled in cloud services, rather than when the record is inserted into the table.
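
The table design being described typically looks like the following sketch (names illustrative):

CREATE OR REPLACE TABLE landing (
  payload VARIANT,
  load_ts TIMESTAMP_LTZ DEFAULT CURRENT_TIMESTAMP()  -- intended to capture load time
);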

Questions 45

The following table exists in the production database:

A regulatory requirement states that the company must mask the username for events that are older than six months based on the current date when the data is queried.

How can the requirement be met without duplicating the event data, while making sure it is applied when views are created on the table or when the table is cloned?

Options:

A.

Use a masking policy on the username column using an entitlement table with valid dates.

B.

Use a row access policy on the user_events table using an entitlement table with valid dates.

C.

Use a masking policy on the username column with event_timestamp as a conditional column.

D.

Use a secure view on the user_events table using a case statement on the username column.
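
For illustration, a conditional masking policy that evaluates a second column (the approach in option C) can be sketched as follows, assuming a user_events table with username and event_timestamp columns:

CREATE OR REPLACE MASKING POLICY mask_old_usernames
AS (username STRING, event_timestamp TIMESTAMP_NTZ)
RETURNS STRING ->
CASE
  WHEN event_timestamp < DATEADD(month, -6, CURRENT_TIMESTAMP())::TIMESTAMP_NTZ THEN '*** MASKED ***'
  ELSE username
END;

ALTER TABLE user_events MODIFY COLUMN username
SET MASKING POLICY mask_old_usernames USING (username, event_timestamp);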

Questions 46

What is a valid object hierarchy when building a Snowflake environment?

Options:

A.

Account --> Database --> Schema --> Warehouse

B.

Organization --> Account --> Database --> Schema --> Stage

C.

Account --> Schema --> Table --> Stage

D.

Organization --> Account --> Stage --> Table --> View

Questions 47

Which SQL ALTER command will MAXIMIZE memory and compute resources for a Snowpark stored procedure when executed on the snowpark_opt_wh warehouse?

Options:

A.

ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 1;

B.

ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 2;

C.

ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 8;

D.

ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 16;

Questions 48

What are characteristics of Dynamic Data Masking? (Select TWO).

Options:

A.

A masking policy that is currently set on a table can be dropped.

B.

A single masking policy can be applied to columns in different tables.

C.

A masking policy can be applied to the value column of an external table.

D.

The role that creates the masking policy will always see unmasked data in query results.

E.

A masking policy can be applied to a column with the GEOGRAPHY data type.

Questions 49

A company is following the Data Mesh principles, including domain separation, and chose one Snowflake account for its data platform.

An Architect created two data domains to produce two data products. The Architect needs a third data domain that will use both of the data products to create an aggregate data product. The read access to the data products will be granted through a separate role.

Based on the Data Mesh principles, how should the third domain be configured to create the aggregate product if it has been granted the two read roles?

Options:

A.

Use secondary roles for all users.

B.

Create a hierarchy between the two read roles.

C.

Request a technical ETL user with the sysadmin role.

D.

Request that the two data domains share data using the Data Exchange.

Questions 50

A company is storing large numbers of small JSON files (ranging from 1-4 bytes) that are received from IoT devices and sent to a cloud provider. In any given hour, 100,000 files are added to the cloud provider.

What is the MOST cost-effective way to bring this data into a Snowflake table?

Options:

A.

An external table

B.

A pipe

C.

A stream

D.

A copy command at regular intervals
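
For reference, a pipe with auto-ingest is declared roughly as follows (stage, table, and the cloud notification setup are placeholders):

CREATE OR REPLACE PIPE iot_pipe
  AUTO_INGEST = TRUE
AS
COPY INTO iot_raw
FROM @iot_stage
FILE_FORMAT = (TYPE = JSON);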


Questions 51

A company’s table, employees, was accidentally replaced with a new version.

How can the original table be recovered with the LEAST operational overhead?

Options:

A.

Use Time Travel to recover the data using this command:

SELECT *
FROM employees
BEFORE (STATEMENT => '01a5c8b3-0601-ad2b-0067-a503000a1312');

B.

Use Time Travel with a timestamp to recover the data using this command:

SELECT *
FROM employees
AT (TIMESTAMP => '2022-07-22 16:35:00.000 -0700'::TIMESTAMP_TZ);

C.

Revert to the original employees table using this command:

UNDROP TABLE employees;

D.

Rename the new employees table and undrop the original table using these commands:

ALTER TABLE employees RENAME TO employees_bad;
UNDROP TABLE employees;

Questions 52

A company has a Snowflake environment running in AWS us-west-2 (Oregon). The company needs to share data privately with a customer who is running their Snowflake environment in Azure East US 2 (Virginia).

What is the recommended sequence of operations that must be followed to meet this requirement?

Options:

A.

1. Create a share and add the database privileges to the share.
2. Create a new listing on the Snowflake Marketplace.
3. Alter the listing and add the share.
4. Instruct the customer to subscribe to the listing on the Snowflake Marketplace.

B.

1. Ask the customer to create a new Snowflake account in Azure East US 2 (Virginia).
2. Create a share and add the database privileges to the share.
3. Alter the share and add the customer's Snowflake account to the share.

C.

1. Create a new Snowflake account in Azure East US 2 (Virginia).
2. Set up replication between AWS us-west-2 (Oregon) and Azure East US 2 (Virginia) for the database objects to be shared.
3. Create a share and add the database privileges to the share.
4. Alter the share and add the customer's Snowflake account to the share.

D.

1. Create a reader account in Azure East US 2 (Virginia).
2. Create a share and add the database privileges to the share.
3. Add the reader account to the share.
4. Share the reader account's URL and credentials with the customer.

Questions 53

What does a Snowflake Architect need to consider when implementing a Snowflake Connector for Kafka?

Options:

A.

Every Kafka message is in JSON or Avro format.

B.

The default retention time for Kafka topics is 14 days.

C.

The Kafka connector supports key pair authentication, OAuth, and basic authentication (for example, username and password).

D.

The Kafka connector will create one table and one pipe to ingest data for each topic. If the connector cannot create the table or the pipe, it will result in an exception.

Questions 54

Which security, governance, and data protection features require, at a MINIMUM, the Business Critical edition of Snowflake? (Choose two.)

Options:

A.

Extended Time Travel (up to 90 days)

B.

Customer-managed encryption keys through Tri-Secret Secure

C.

Periodic rekeying of encrypted data

D.

AWS, Azure, or Google Cloud private connectivity to Snowflake

E.

Federated authentication and SSO
