
ARA-C01 SnowPro Advanced: Architect Certification Exam Questions and Answers

Questions 4

Two queries are run on the customer_address table:

create or replace TABLE CUSTOMER_ADDRESS (
    CA_ADDRESS_SK NUMBER(38,0),
    CA_ADDRESS_ID VARCHAR(16),
    CA_STREET_NUMBER VARCHAR(10),
    CA_STREET_NAME VARCHAR(60),
    CA_STREET_TYPE VARCHAR(15),
    CA_SUITE_NUMBER VARCHAR(10),
    CA_CITY VARCHAR(60),
    CA_COUNTY VARCHAR(30),
    CA_STATE VARCHAR(2),
    CA_ZIP VARCHAR(10),
    CA_COUNTRY VARCHAR(20),
    CA_GMT_OFFSET NUMBER(5,2),
    CA_LOCATION_TYPE VARCHAR(20)
);

ALTER TABLE DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS ADD SEARCH OPTIMIZATION ON SUBSTRING(CA_ADDRESS_ID);

Which queries will benefit from the use of the search optimization service? (Select TWO).

Options:

A.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where substring(CA_ADDRESS_ID,1,8)= substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,8);

B.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where CA_ADDRESS_ID= substring('AAAAAAAAPHPPLBAAASKDJHASLKDJHASKJD',1,16);

C.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where CA_ADDRESS_ID LIKE '%BAAASKD%';

D.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where CA_ADDRESS_ID LIKE '%PHPP%';

E.

select * from DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS Where CA_ADDRESS_ID NOT LIKE '%AAAAAAAAPHPPL%';
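
For context, here is a hedged sketch of how this configuration can be verified and the kind of predicate that SUBSTRING-based search optimization targets (the literal in the example predicate is illustrative):

-- List the active search optimization methods and target columns
DESCRIBE SEARCH OPTIMIZATION ON DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS;

-- ON SUBSTRING targets wildcard patterns such as LIKE '%...%'
SELECT *
FROM DEMO_DB.DEMO_SCH.CUSTOMER_ADDRESS
WHERE CA_ADDRESS_ID LIKE '%PHPP%';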

Questions 5

An Architect has a VPN_ACCESS_LOGS table in the SECURITY_LOGS schema containing connection and disconnection timestamps, usernames, and summary statistics.

What should the Architect do to enable the Snowflake search optimization service on this table?

Options:

A.

Assume role with OWNERSHIP on future tables and ADD SEARCH OPTIMIZATION on the SECURITY_LOGS schema.

B.

Assume role with ALL PRIVILEGES including ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

C.

Assume role with OWNERSHIP on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.

D.

Assume role with ALL PRIVILEGES on VPN_ACCESS_LOGS and ADD SEARCH OPTIMIZATION in the SECURITY_LOGS schema.
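
As a point of reference, enabling search optimization requires table ownership plus a schema-level privilege; a minimal sketch, assuming a role named ARCHITECT_ROLE (hypothetical):

GRANT ADD SEARCH OPTIMIZATION ON SCHEMA SECURITY_LOGS TO ROLE ARCHITECT_ROLE;

-- A role that owns the table and holds the privilege above can then run:
ALTER TABLE SECURITY_LOGS.VPN_ACCESS_LOGS ADD SEARCH OPTIMIZATION;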

Questions 6

A company has a table named Data that contains corrupted data. The company wants to recover the data as it was 5 minutes ago using cloning and Time Travel.

What command will accomplish this?

Options:

A.

CREATE CLONE TABLE Recover_Data FROM Data AT(OFFSET => -60*5);

B.

CREATE CLONE Recover_Data FROM Data AT(OFFSET => -60*5);

C.

CREATE TABLE Recover_Data CLONE Data AT(OFFSET => -60*5);

D.

CREATE TABLE Recover Data CLONE Data AT(TIME => -60*5);
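
For reference, the documented pattern for cloning a table at a point in the past combines CREATE TABLE ... CLONE with an AT clause; a minimal sketch using the names from the question:

-- OFFSET is in seconds relative to the current time, so -60*5 is 5 minutes ago
CREATE TABLE Recover_Data CLONE Data AT (OFFSET => -60*5);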

Questions 7

A retail company has over 3000 stores all using the same Point of Sale (POS) system. The company wants to deliver near real-time sales results to category managers. The stores operate in a variety of time zones and exhibit a dynamic range of transactions each minute, with some stores having higher sales volumes than others.

Sales results are provided in a uniform fashion using data engineered fields that will be calculated in a complex data pipeline. Calculations include exceptions, aggregations, and scoring using external functions interfaced to scoring algorithms. The source data for aggregations has over 100M rows.

Every minute, the POS sends all sales transactions files to a cloud storage location with a naming convention that includes store numbers and timestamps to identify the set of transactions contained in the files. The files are typically less than 10MB in size.

How can the near real-time results be provided to the category managers? (Select TWO).

Options:

A.

All files should be concatenated before ingestion into Snowflake to avoid micro-ingestion.

B.

A Snowpipe should be created and configured with AUTO_INGEST = true. A stream should be created to process INSERTS into a single target table using the stream metadata to inform the store number and timestamps.

C.

A stream should be created to accumulate the near real-time data and a task should be created that runs at a frequency that matches the real-time analytics needs.

D.

An external scheduler should examine the contents of the cloud storage location and issue SnowSQL commands to process the data at a frequency that matches the real-time analytics needs.

E.

The copy into command with a task scheduled to run every second should be used to achieve the near-real time requirement.
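
To ground the mechanisms these options refer to, here is a minimal, hypothetical sketch of auto-ingest Snowpipe feeding a stream and a scheduled task (all object names are assumptions):

CREATE PIPE sales_pipe AUTO_INGEST = TRUE AS
  COPY INTO raw_sales FROM @sales_stage;

-- Capture newly ingested rows
CREATE STREAM raw_sales_stream ON TABLE raw_sales;

-- Aggregate on a cadence that matches the analytics needs
CREATE TASK sales_rollup_task
  WAREHOUSE = transform_wh
  SCHEDULE = '1 MINUTE'
  WHEN SYSTEM$STREAM_HAS_DATA('RAW_SALES_STREAM')
AS
  INSERT INTO sales_results
  SELECT store_no, SUM(amount)
  FROM raw_sales_stream
  GROUP BY store_no;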

Questions 8

What considerations need to be taken when using database cloning as a tool for data lifecycle management in a development environment? (Select TWO).

Options:

A.

Any pipes in the source are not cloned.

B.

Any pipes in the source referring to internal stages are not cloned.

C.

Any pipes in the source referring to external stages are not cloned.

D.

The clone inherits all granted privileges of all child objects in the source object, including the database.

E.

The clone inherits all granted privileges of all child objects in the source object, excluding the database.

Questions 9

Which SQL alter command will MAXIMIZE memory and compute resources for a Snowpark stored procedure when executed on the snowpark_opt_wh warehouse?

(Options A through D were presented as images of ALTER commands and are not reproduced here.)

Options:

A.

Option A

B.

Option B

C.

Option C

D.

Option D
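
Since the original options are images, one parameter is worth illustrating here: on a Snowpark-optimized warehouse, lowering the concurrency level dedicates more memory and compute to each job. A hedged sketch:

-- Allow only one concurrent job so it receives all of the warehouse's resources
ALTER WAREHOUSE snowpark_opt_wh SET MAX_CONCURRENCY_LEVEL = 1;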

Questions 10

Which of the following are characteristics of how row access policies can be applied to external tables? (Choose three.)

Options:

A.

An external table can be created with a row access policy, and the policy can be applied to the VALUE column.

B.

A row access policy can be applied to the VALUE column of an existing external table.

C.

A row access policy cannot be directly added to a virtual column of an external table.

D.

External tables are supported as mapping tables in a row access policy.

E.

While cloning a database, both the row access policy and the external table will be cloned.

F.

A row access policy cannot be applied to a view created on top of an external table.
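
For orientation, a row access policy on an external table is attached to its VALUE column; a minimal sketch with hypothetical names:

CREATE ROW ACCESS POLICY govern_rows AS (v VARIANT) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'ANALYST';

-- VALUE is the VARIANT column that every external table exposes
ALTER TABLE my_ext_table ADD ROW ACCESS POLICY govern_rows ON (VALUE);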

Questions 11

An Architect needs to automate the daily import of two files from an external stage into Snowflake. One file has Parquet-formatted data, the other has CSV-formatted data.

How should the data be joined and aggregated to produce a final result set?

Options:

A.

Use Snowpipe to ingest the two files, then create a materialized view to produce the final result set.

B.

Create a task using Snowflake scripting that will import the files, and then call a User-Defined Function (UDF) to produce the final result set.

C.

Create a JavaScript stored procedure to read, join, and aggregate the data directly from the external stage, and then store the results in a table.

D.

Create a materialized view to read, join, and aggregate the data directly from the external stage, and use the view to produce the final result set.

Questions 12

An Architect is troubleshooting a query with poor performance using the QUERY_HISTORY function. The Architect observes that the COMPILATION_TIME is greater than the EXECUTION_TIME.

What is the reason for this?

Options:

A.

The query is processing a very large dataset.

B.

The query has overly complex logic.

C.

The query is queued for execution.

D.

The query is reading from remote storage.
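
Compilation and execution times can be compared directly in the query history; a sketch against the ACCOUNT_USAGE view (which can lag by up to 45 minutes):

SELECT query_id,
       compilation_time,  -- milliseconds
       execution_time     -- milliseconds
FROM SNOWFLAKE.ACCOUNT_USAGE.QUERY_HISTORY
WHERE compilation_time > execution_time
ORDER BY start_time DESC
LIMIT 10;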

Questions 13

A Data Engineer is designing a near real-time ingestion pipeline for a retail company to ingest event logs into Snowflake to derive insights. A Snowflake Architect is asked to define security best practices to configure access control privileges for the data load for auto-ingest to Snowpipe.

What are the MINIMUM object privileges required for the Snowpipe user to execute Snowpipe?

Options:

A.

OWNERSHIP on the named pipe, USAGE on the named stage, target database, and schema, and INSERT and SELECT on the target table

B.

OWNERSHIP on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table

C.

CREATE on the named pipe, USAGE and READ on the named stage, USAGE on the target database and schema, and INSERT and SELECT on the target table

D.

USAGE on the named pipe, named stage, target database, and schema, and INSERT and SELECT on the target table
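
The privilege sets in these options translate into grants such as the following sketch for a dedicated ingestion role (role and object names are hypothetical):

GRANT USAGE ON DATABASE ingest_db TO ROLE snowpipe_role;
GRANT USAGE ON SCHEMA ingest_db.raw TO ROLE snowpipe_role;
GRANT USAGE ON STAGE ingest_db.raw.event_stage TO ROLE snowpipe_role;  -- READ applies to internal stages
GRANT INSERT, SELECT ON TABLE ingest_db.raw.event_logs TO ROLE snowpipe_role;
GRANT OWNERSHIP ON PIPE ingest_db.raw.event_pipe TO ROLE snowpipe_role;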

Questions 14

An Architect needs to meet a company requirement to ingest files from the company's AWS storage accounts into the company's Snowflake Google Cloud Platform (GCP) account. How can the ingestion of these files into the company's Snowflake account be initiated? (Select TWO).

Options:

A.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

B.

Configure the client application to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 Glacier storage.

C.

Create an AWS Lambda function to call the Snowpipe REST endpoint when new files have arrived in Amazon S3 storage.

D.

Configure AWS Simple Notification Service (SNS) to notify Snowpipe when new files have arrived in Amazon S3 storage.

E.

Configure the client application to issue a COPY INTO command to Snowflake when new files have arrived in Amazon S3 Glacier storage.

Note: Option E is not relevant because it does not use Snowpipe, but rather the standard COPY command, which is a batch loading method. Moreover, the COPY command does not support ingesting files from Amazon S3 Glacier storage [7].

References:
  • 1: SnowPro Advanced: Architect | Study Guide
  • 2: Snowflake Documentation | Snowpipe Overview
  • 3: Snowflake Documentation | Using the Snowpipe REST API
  • 4: Snowflake Documentation | Loading Data Using Snowpipe and AWS Lambda
  • 5: Snowflake Documentation | Supported File Formats and Compression for Staged Data Files
  • 6: Snowflake Documentation | Using Cloud Notifications to Trigger Snowpipe
  • 7: Snowflake Documentation | Loading Data Using COPY into a Table
Questions 15

    What actions are permitted when using the Snowflake SQL REST API? (Select TWO).

    Options:

    A.

    The use of a GET command

    B.

    The use of a PUT command

    C.

    The use of a ROLLBACK command

    D.

    The use of a CALL command to a stored procedure which returns a table

    E.

    Submitting multiple SQL statements in a single call
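
As background, the SQL REST API submits SQL statements only; file-transfer commands are handled by clients such as SnowSQL. An illustrative sketch of what can and cannot be sent (statement text is hypothetical):

-- Can be submitted through the SQL REST API:
CALL my_proc();                       -- a stored procedure call, including one returning a table
SELECT * FROM t1; SELECT * FROM t2;   -- multiple statements in one request (MULTI_STATEMENT_COUNT set accordingly)

-- Cannot be submitted through the SQL REST API:
-- PUT file:///tmp/data.csv @my_stage;    (client-side file transfer)
-- GET @my_stage/data.csv file:///tmp/;   (client-side file transfer)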

    Questions 16

    A healthcare company is deploying a Snowflake account that may include Personal Health Information (PHI). The company must ensure compliance with all relevant privacy standards.

    Which best practice recommendations will meet data protection and compliance requirements? (Choose three.)

    Options:

    A.

    Use, at minimum, the Business Critical edition of Snowflake.

    B.

    Create Dynamic Data Masking policies and apply them to columns that contain PHI.

    C.

    Use the Internal Tokenization feature to obfuscate sensitive data.

    D.

    Use the External Tokenization feature to obfuscate sensitive data.

    E.

    Rewrite SQL queries to eliminate projections of PHI data based on current_role().

    F.

    Avoid sharing data with partner organizations.

    Questions 17

    An Architect is integrating an application that needs to read and write data to Snowflake without installing any additional software on the application server.

    How can this requirement be met?

    Options:

    A.

    Use SnowSQL.

    B.

    Use the Snowpipe REST API.

    C.

    Use the Snowflake SQL REST API.

    D.

    Use the Snowflake ODBC driver.

    Questions 18

    An Architect for a multi-national transportation company has a system that is used to check the weather conditions along vehicle routes. The data is provided to drivers.

The weather information is delivered regularly by a third-party company and is generated as a JSON structure. The data is then loaded into Snowflake into a column with a VARIANT data type. This table is queried directly to deliver the statistics to the drivers with minimal time lapse.

    A single entry includes (but is not limited to):

- Weather condition: cloudy, sunny, rainy, etc.

    - Degree

    - Longitude and latitude

    - Timeframe

    - Location address

    - Wind

    The table holds more than 10 years' worth of data in order to deliver the statistics from different years and locations. The amount of data on the table increases every day.

    The drivers report that they are not receiving the weather statistics for their locations in time.

    What can the Architect do to deliver the statistics to the drivers faster?

    Options:

    A.

Create an additional table in the schema for longitude and latitude. Define a regular task to fill this information by extracting it from the JSON dataset.

    B.

    Add search optimization service on the variant column for longitude and latitude in order to query the information by using specific metadata.

    C.

    Divide the table into several tables for each year by using the timeframe information from the JSON dataset in order to process the queries in parallel.

    D.

    Divide the table into several tables for each location by using the location address information from the JSON dataset in order to process the queries in parallel.
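
For reference, search optimization can target point lookups on fields inside a VARIANT column; a hedged sketch, assuming the table is WEATHER_EVENTS with a VARIANT column V (both names hypothetical):

-- EQUALITY on a VARIANT column covers the fields nested within it
ALTER TABLE WEATHER_EVENTS ADD SEARCH OPTIMIZATION ON EQUALITY(V);

SELECT *
FROM WEATHER_EVENTS
WHERE V:longitude = '4.8952' AND V:latitude = '52.3702';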

    Questions 19

A company needs to share its product catalog data with one of its partners. The product catalog data is stored in two database tables: product_category and product_details. Both tables can be joined by the product_id column. Data access should be governed, and only the partner should have access to the records.

    The partner is not a Snowflake customer. The partner uses Amazon S3 for cloud storage.

    Which design will be the MOST cost-effective and secure, while using the required Snowflake features?

    Options:

    A.

    Use Secure Data Sharing with an S3 bucket as a destination.

    B.

    Publish product_category and product_details data sets on the Snowflake Marketplace.

    C.

    Create a database user for the partner and give them access to the required data sets.

    D.

    Create a reader account for the partner and share the data sets as secure views.
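
As background, provisioning a reader account and attaching a share are plain SQL; a hypothetical sketch (all names and the password are placeholders):

CREATE MANAGED ACCOUNT partner_reader
  ADMIN_NAME = partner_admin,
  ADMIN_PASSWORD = '<password>',
  TYPE = READER;

CREATE SHARE product_share;
GRANT USAGE ON DATABASE product_db TO SHARE product_share;
GRANT USAGE ON SCHEMA product_db.catalog TO SHARE product_share;
GRANT SELECT ON VIEW product_db.catalog.v_product_catalog TO SHARE product_share;
ALTER SHARE product_share ADD ACCOUNTS = <org_name>.partner_reader;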

    Questions 20

    How is the change of local time due to daylight savings time handled in Snowflake tasks? (Choose two.)

    Options:

    A.

    A task scheduled in a UTC-based schedule will have no issues with the time changes.

    B.

    Task schedules can be designed to follow specified or local time zones to accommodate the time changes.

    C.

    A task will move to a suspended state during the daylight savings time change.

    D.

    A frequent task execution schedule like minutes may not cause a problem, but will affect the task history.

    E.

    A task schedule will follow only the specified time and will fail to handle lost or duplicated hours.
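
For reference, a task can be pinned to a time zone with cron syntax, which determines how it behaves across daylight saving changes; a hypothetical sketch (assumes a single-column audit_log table):

CREATE TASK nightly_rollup
  WAREHOUSE = task_wh
  -- 6 AM daily in America/Chicago, which observes daylight saving changes
  SCHEDULE = 'USING CRON 0 6 * * * America/Chicago'
AS
  INSERT INTO audit_log SELECT CURRENT_TIMESTAMP();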

    Questions 21

    Database DB1 has schema S1 which has one table, T1.

    DB1 --> S1 --> T1

The retention period of DB1 is set to 10 days.

The retention period of S1 is set to 20 days.

The retention period of T1 is set to 30 days.

    The user runs the following command:

    Drop Database DB1;

    What will the Time Travel retention period be for T1?

    Options:

    A.

    10 days

    B.

    20 days

    C.

    30 days

    D.

    37 days

    Questions 22

    Role A has the following permissions:

    . USAGE on db1

. USAGE and CREATE VIEW on schema1 in db1

. SELECT on table1 in schema1

    Role B has the following permissions:

    . USAGE on db2

    . USAGE and CREATE VIEW on schema2 in db2

    . SELECT on table2 in schema2

    A user has Role A set as the primary role and Role B as a secondary role.

    What command will fail for this user?

    Options:

    A.

    use database db1;

use schema schema1;

    create view v1 as select * from db2.schema2.table2;

    B.

    use database db2;

    use schema schema2;

create view v2 as select * from db1.schema1.table1;

    C.

    use database db2;

    use schema schema2;

select * from db1.schema1.table1 union select * from table2;

    D.

    use database db1;

use schema schema1;

    select * from db2.schema2.table2;

    Questions 23

    What are purposes for creating a storage integration? (Choose three.)

    Options:

    A.

    Control access to Snowflake data using a master encryption key that is maintained in the cloud provider’s key management service.

    B.

    Store a generated identity and access management (IAM) entity for an external cloud provider regardless of the cloud provider that hosts the Snowflake account.

    C.

    Support multiple external stages using one single Snowflake object.

    D.

    Avoid supplying credentials when creating a stage or when loading or unloading data.

    E.

    Create private VPC endpoints that allow direct, secure connectivity between VPCs without traversing the public internet.

    F.

    Manage credentials from multiple cloud providers in one single Snowflake object.
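
To ground these options, here is a minimal storage integration sketch for AWS (the role ARN, bucket paths, and object names are placeholders):

CREATE STORAGE INTEGRATION s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake_access'
  ENABLED = TRUE
  STORAGE_ALLOWED_LOCATIONS = ('s3://mybucket/raw/', 's3://mybucket/curated/');

-- Multiple stages can reuse one integration, and no credentials are supplied
CREATE STAGE raw_stage     URL = 's3://mybucket/raw/'     STORAGE_INTEGRATION = s3_int;
CREATE STAGE curated_stage URL = 's3://mybucket/curated/' STORAGE_INTEGRATION = s3_int;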

    Questions 24

    In a managed access schema, what are characteristics of the roles that can manage object privileges? (Select TWO).

    Options:

    A.

    Users with the SYSADMIN role can grant object privileges in a managed access schema.

    B.

    Users with the SECURITYADMIN role or higher, can grant object privileges in a managed access schema.

    C.

    Users who are database owners can grant object privileges in a managed access schema.

    D.

    Users who are schema owners can grant object privileges in a managed access schema.

    E.

    Users who are object owners can grant object privileges in a managed access schema.

    Questions 25

When loading data into a table that captures the load time in a column with a default value of either CURRENT_TIME() or CURRENT_TIMESTAMP(), what will occur?

    Options:

    A.

    All rows loaded using a specific COPY statement will have varying timestamps based on when the rows were inserted.

    B.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were read from the source.

    C.

    Any rows loaded using a specific COPY statement will have varying timestamps based on when the rows were created in the source.

    D.

    All rows loaded using a specific COPY statement will have the same timestamp value.

    Questions 26

    There are two databases in an account, named fin_db and hr_db which contain payroll and employee data, respectively. Accountants and Analysts in the company require different permissions on the objects in these databases to perform their jobs. Accountants need read-write access to fin_db but only require read-only access to hr_db because the database is maintained by human resources personnel.

    An Architect needs to create a read-only role for certain employees working in the human resources department.

    Which permission sets must be granted to this role?

    Options:

    A.

    USAGE on database hr_db, USAGE on all schemas in database hr_db, SELECT on all tables in database hr_db

    B.

    USAGE on database hr_db, SELECT on all schemas in database hr_db, SELECT on all tables in database hr_db

    C.

    MODIFY on database hr_db, USAGE on all schemas in database hr_db, USAGE on all tables in database hr_db

    D.

    USAGE on database hr_db, USAGE on all schemas in database hr_db, REFERENCES on all tables in database hr_db
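
The permission sets in these options map directly to GRANT statements; a sketch for a read-only role (the role name is an assumption):

CREATE ROLE hr_read_only;
GRANT USAGE ON DATABASE hr_db TO ROLE hr_read_only;
GRANT USAGE ON ALL SCHEMAS IN DATABASE hr_db TO ROLE hr_read_only;
GRANT SELECT ON ALL TABLES IN DATABASE hr_db TO ROLE hr_read_only;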

    Questions 27

    What does a Snowflake Architect need to consider when implementing a Snowflake Connector for Kafka?

    Options:

    A.

    Every Kafka message is in JSON or Avro format.

    B.

    The default retention time for Kafka topics is 14 days.

    C.

The Kafka connector supports key pair authentication, OAuth, and basic authentication (for example, username and password).

    D.

    The Kafka connector will create one table and one pipe to ingest data for each topic. If the connector cannot create the table or the pipe it will result in an exception.

    Questions 28

    Which organization-related tasks can be performed by the ORGADMIN role? (Choose three.)

    Options:

    A.

    Changing the name of the organization

    B.

    Creating an account

    C.

    Viewing a list of organization accounts

    D.

    Changing the name of an account

    E.

    Deleting an account

    F.

    Enabling the replication of a database

    Questions 29

    A healthcare company wants to share data with a medical institute. The institute is running a Standard edition of Snowflake; the healthcare company is running a Business Critical edition.

    How can this data be shared?

    Options:

    A.

    The healthcare company will need to change the institute’s Snowflake edition in the accounts panel.

    B.

    By default, sharing is supported from a Business Critical Snowflake edition to a Standard edition.

    C.

    Contact Snowflake and they will execute the share request for the healthcare company.

    D.

    Set the share_restriction parameter on the shared object to false.
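
For context, cross-edition sharing from a Business Critical account is controlled per consumer account as it is added to the share; a hypothetical sketch:

ALTER SHARE patient_share
  ADD ACCOUNTS = medorg.institute_account
  SHARE_RESTRICTIONS = false;  -- permits a non-Business Critical consumer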

    Questions 30

    A large manufacturing company runs a dozen individual Snowflake accounts across its business divisions. The company wants to increase the level of data sharing to support supply chain optimizations and increase its purchasing leverage with multiple vendors.

    The company’s Snowflake Architects need to design a solution that would allow the business divisions to decide what to share, while minimizing the level of effort spent on configuration and management. Most of the company divisions use Snowflake accounts in the same cloud deployments with a few exceptions for European-based divisions.

    According to Snowflake recommended best practice, how should these requirements be met?

    Options:

    A.

    Migrate the European accounts in the global region and manage shares in a connected graph architecture. Deploy a Data Exchange.

    B.

    Deploy a Private Data Exchange in combination with data shares for the European accounts.

    C.

    Deploy to the Snowflake Marketplace making sure that invoker_share() is used in all secure views.

    D.

    Deploy a Private Data Exchange and use replication to allow European data shares in the Exchange.

    Questions 31

    What are some of the characteristics of result set caches? (Choose three.)

    Options:

    A.

    Time Travel queries can be executed against the result set cache.

    B.

    Snowflake persists the data results for 24 hours.

    C.

    Each time persisted results for a query are used, a 24-hour retention period is reset.

    D.

    The data stored in the result cache will contribute to storage costs.

    E.

    The retention period can be reset for a maximum of 31 days.

    F.

    The result set cache is not shared between warehouses.

    Questions 32

You are a Snowflake Architect in an organization. The business team has asked you to deploy a use case that requires loading data that they can visualize through Tableau. Every day new data comes in and the old data is no longer required.

What type of table will you use in this case to optimize cost?

    Options:

    A.

    TRANSIENT

    B.

    TEMPORARY

    C.

    PERMANENT
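
As a companion sketch, a transient table avoids Fail-safe storage and suits short-lived, reloadable data (names are hypothetical):

CREATE OR REPLACE TRANSIENT TABLE daily_sales_feed (
  sale_date DATE,
  store_no  NUMBER,
  amount    NUMBER(12,2)
) DATA_RETENTION_TIME_IN_DAYS = 0;  -- no Time Travel; transient tables never have Fail-safe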

    Questions 33

    An Architect has chosen to separate their Snowflake Production and QA environments using two separate Snowflake accounts.

    The QA account is intended to run and test changes on data and database objects before pushing those changes to the Production account. It is a requirement that all database objects and data in the QA account need to be an exact copy of the database objects, including privileges and data in the Production account on at least a nightly basis.

    Which is the LEAST complex approach to use to populate the QA account with the Production account’s data and database objects on a nightly basis?

    Options:

    A.

    1) Create a share in the Production account for each database

    2) Share access to the QA account as a Consumer

    3) The QA account creates a database directly from each share

    4) Create clones of those databases on a nightly basis

    5) Run tests directly on those cloned databases

    B.

    1) Create a stage in the Production account

    2) Create a stage in the QA account that points to the same external object-storage location

    3) Create a task that runs nightly to unload each table in the Production account into the stage

    4) Use Snowpipe to populate the QA account

    C.

    1) Enable replication for each database in the Production account

    2) Create replica databases in the QA account

    3) Create clones of the replica databases on a nightly basis

    4) Run tests directly on those cloned databases

    D.

    1) In the Production account, create an external function that connects into the QA account and returns all the data for one specific table

    2) Run the external function as part of a stored procedure that loops through each table in the Production account and populates each table in the QA account
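
To make the replication-based approach concrete, here is a hedged sketch of database replication between two accounts in one organization (account and database names are placeholders):

-- In the Production account:
ALTER DATABASE prod_db ENABLE REPLICATION TO ACCOUNTS myorg.qa_account;

-- In the QA account:
CREATE DATABASE prod_db_replica AS REPLICA OF myorg.prod_account.prod_db;
ALTER DATABASE prod_db_replica REFRESH;
CREATE DATABASE prod_db_test CLONE prod_db_replica;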

    Questions 34

    Which command will create a schema without Fail-safe and will restrict object owners from passing on access to other users?

    Options:

    A.

    create schema EDW.ACCOUNTING WITH MANAGED ACCESS;

    B.

create schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

    C.

    create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 1;

    D.

    create TRANSIENT schema EDW.ACCOUNTING WITH MANAGED ACCESS DATA_RETENTION_TIME_IN_DAYS = 7;

    Questions 35

    A company’s daily Snowflake workload consists of a huge number of concurrent queries triggered between 9pm and 11pm. At the individual level, these queries are smaller statements that get completed within a short time period.

    What configuration can the company’s Architect implement to enhance the performance of this workload? (Choose two.)

    Options:

    A.

    Enable a multi-clustered virtual warehouse in maximized mode during the workload duration.

    B.

    Set the MAX_CONCURRENCY_LEVEL to a higher value than its default value of 8 at the virtual warehouse level.

    C.

    Increase the size of the virtual warehouse to size X-Large.

    D.

    Reduce the amount of data that is being processed through this workload.

    E.

    Set the connection timeout to a higher value than its default.
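
For reference, a multi-cluster warehouse runs in maximized mode when its minimum and maximum cluster counts are equal; a hypothetical sketch:

CREATE WAREHOUSE evening_wh
  WAREHOUSE_SIZE = 'SMALL'
  MIN_CLUSTER_COUNT = 4
  MAX_CLUSTER_COUNT = 4;  -- min = max starts all clusters (maximized mode)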

    Questions 36

    Which query will identify the specific days and virtual warehouses that would benefit from a multi-cluster warehouse to improve the performance of a particular workload?

(Options A through D were presented as images of queries and are not reproduced here.)

    Options:

    A.

    Option A

    B.

    Option B

    C.

    Option C

    D.

    Option D
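
Although the options are images, queries for this purpose typically look for queuing in warehouse load history; a hedged sketch against ACCOUNT_USAGE:

SELECT TO_DATE(start_time)  AS load_date,
       warehouse_name,
       SUM(avg_running)     AS sum_running,
       SUM(avg_queued_load) AS sum_queued
FROM SNOWFLAKE.ACCOUNT_USAGE.WAREHOUSE_LOAD_HISTORY
WHERE TO_DATE(start_time) >= DATEADD(month, -1, CURRENT_TIMESTAMP())
GROUP BY 1, 2
HAVING SUM(avg_queued_load) > 0;  -- days with queuing suggest a multi-cluster benefit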

    Questions 37

    What are characteristics of Dynamic Data Masking? (Select TWO).

    Options:

    A.

A masking policy that is currently set on a table can be dropped.

    B.

    A single masking policy can be applied to columns in different tables.

    C.

    A masking policy can be applied to the value column of an external table.

    D.

The role that creates the masking policy will always see unmasked data in query results.

    E.

    A masking policy can be applied to a column with the GEOGRAPHY data type.
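
As a companion example, one masking policy can protect columns in multiple tables; a minimal sketch with hypothetical names:

CREATE MASKING POLICY email_mask AS (val STRING) RETURNS STRING ->
  CASE WHEN CURRENT_ROLE() IN ('PII_ADMIN') THEN val ELSE '*** MASKED ***' END;

-- The same policy applied to columns in different tables
ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY email_mask;
ALTER TABLE employees MODIFY COLUMN contact_email SET MASKING POLICY email_mask;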

    Questions 38

A table, EMP_TBL, has three records as shown:

    The following variables are set for the session:

    Which SELECT statements will retrieve all three records? (Select TWO).

    Options:

    A.

SELECT * FROM $tbl_ref WHERE $col_ref IN ('Name1','Name2','Name3');

    B.

SELECT * FROM EMP_TBL WHERE identifier($col_ref) IN ('Name1','Name2','Name3');

    C.

    SELECT * FROM identifier WHERE NAME IN ($var1, $var2, $var3);

    D.

SELECT * FROM identifier($tbl_ref) WHERE ID IN ('var1','var2','var3');

    E.

SELECT * FROM $tbl_ref WHERE $col_ref IN ($var1, $var2, $var3);
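
For orientation, session variables substitute as literals with $, while object names built from variables go through IDENTIFIER(); a minimal sketch:

SET tbl_ref = 'EMP_TBL';
SET var1 = 'Name1';

-- IDENTIFIER() resolves the variable to a table name; $var1 is an ordinary literal
SELECT * FROM IDENTIFIER($tbl_ref) WHERE name IN ($var1);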

    Questions 39

    Which of the following ingestion methods can be used to load near real-time data by using the messaging services provided by a cloud provider?

    Options:

    A.

    Snowflake Connector for Kafka

    B.

    Snowflake streams

    C.

    Snowpipe

    D.

    Spark

    Questions 40

    Which statements describe characteristics of the use of materialized views in Snowflake? (Choose two.)

    Options:

    A.

    They can include ORDER BY clauses.

    B.

    They cannot include nested subqueries.

    C.

    They can include context functions, such as CURRENT_TIME().

    D.

    They can support MIN and MAX aggregates.

    E.

    They can support inner joins, but not outer joins.

    Questions 41

    The diagram shows the process flow for Snowpipe auto-ingest with Amazon Simple Notification Service (SNS) with the following steps:

    Step 1: Data files are loaded in a stage.

Step 2: An Amazon S3 event notification, published by SNS, informs Snowpipe, by way of Amazon Simple Queue Service (SQS), that files are ready to load. Snowpipe copies the files into a queue.

    Step 3: A Snowflake-provided virtual warehouse loads data from the queued files into the target table based on parameters defined in the specified pipe.

    If an AWS Administrator accidentally deletes the SQS subscription to the SNS topic in Step 2, what will happen to the pipe that references the topic to receive event messages from Amazon S3?

    Options:

    A.

    The pipe will continue to receive the messages as Snowflake will automatically restore the subscription to the same SNS topic and will recreate the pipe by specifying the same SNS topic name in the pipe definition.

    B.

    The pipe will no longer be able to receive the messages and the user must wait for 24 hours from the time when the SNS topic subscription was deleted. Pipe recreation is not required as the pipe will reuse the same subscription to the existing SNS topic after 24 hours.

    C.

    The pipe will continue to receive the messages as Snowflake will automatically restore the subscription by creating a new SNS topic. Snowflake will then recreate the pipe by specifying the new SNS topic name in the pipe definition.

    D.

    The pipe will no longer be able to receive the messages. To restore the system immediately, the user needs to manually create a new SNS topic with a different name and then recreate the pipe by specifying the new SNS topic name in the pipe definition.

    Questions 42

    A Snowflake Architect is designing a multiple-account design strategy.

    This strategy will be MOST cost-effective with which scenarios? (Select TWO).

    Options:

    A.

    The company wants to clone a production database that resides on AWS to a development database that resides on Azure.

    B.

    The company needs to share data between two databases, where one must support Payment Card Industry Data Security Standard (PCI DSS) compliance but the other one does not.

    C.

    The company needs to support different role-based access control features for the development, test, and production environments.

    D.

    The company security policy mandates the use of different Active Directory instances for the development, test, and production environments.

    E.

    The company must use a specific network policy for certain users to allow and block given IP addresses.

    Questions 43

    A company has built a data pipeline using Snowpipe to ingest files from an Amazon S3 bucket. Snowpipe is configured to load data into staging database tables. Then a task runs to load the data from the staging database tables into the reporting database tables.

    The company is satisfied with the availability of the data in the reporting database tables, but the reporting tables are not pruning effectively. Currently, a size 4X-Large virtual warehouse is being used to query all of the tables in the reporting database.

    What step can be taken to improve the pruning of the reporting tables?

    Options:

    A.

    Eliminate the use of Snowpipe and load the files into internal stages using PUT commands.

    B.

    Increase the size of the virtual warehouse to a size 5X-Large.

    C.

    Use an ORDER BY command to load the reporting tables.

    D.

    Create larger files for Snowpipe to ingest and ensure the staging frequency does not exceed 1 minute.

    Questions 44

    What Snowflake features should be leveraged when modeling using Data Vault?

    Options:

    A.

    Snowflake’s support of multi-table inserts into the data model’s Data Vault tables

    B.

    Data needs to be pre-partitioned to obtain a superior data access performance

    C.

    Scaling up the virtual warehouses will support parallel processing of new source loads

    D.

    Snowflake’s ability to hash keys so that hash key joins can run faster than integer joins

    Questions 45

    An Architect with the ORGADMIN role wants to change a Snowflake account from an Enterprise edition to a Business Critical edition.

    How should this be accomplished?

    Options:

    A.

    Run an ALTER ACCOUNT command and create a tag of EDITION and set the tag to Business Critical.

    B.

    Use the account's ACCOUNTADMIN role to change the edition.

    C.

    Failover to a new account in the same region and specify the new account's edition upon creation.

    D.

    Contact Snowflake Support and request that the account's edition be changed.

    Questions 46

    A DevOps team has a requirement for recovery of staging tables used in a complex set of data pipelines. The staging tables are all located in the same staging schema. One of the requirements is to have online recovery of data on a rolling 7-day basis.

    After setting up the DATA_RETENTION_TIME_IN_DAYS at the database level, certain tables remain unrecoverable past 1 day.

    What would cause this to occur? (Choose two.)

    Options:

    A.

The staging schema has not been set up for MANAGED ACCESS.

    B.

    The DATA_RETENTION_TIME_IN_DAYS for the staging schema has been set to 1 day.

    C.

    The tables exceed the 1 TB limit for data recovery.

    D.

    The staging tables are of the TRANSIENT type.

    E.

    The DevOps role should be granted ALLOW_RECOVERY privilege on the staging schema.

    Questions 47

    A company is using Snowflake in Azure in the Netherlands. The company analyst team also has data in JSON format that is stored in an Amazon S3 bucket in the AWS Singapore region that the team wants to analyze.

    The Architect has been given the following requirements:

    1. Provide access to frequently changing data

    2. Keep egress costs to a minimum

    3. Maintain low latency

    How can these requirements be met with the LEAST amount of operational overhead?

    Options:

    A.

    Use a materialized view on top of an external table against the S3 bucket in AWS Singapore.

    B.

    Use an external table against the S3 bucket in AWS Singapore and copy the data into transient tables.

    C.

    Copy the data between providers from S3 to Azure Blob storage to collocate, then use Snowpipe for data ingestion.

    D.

    Use AWS Transfer Family to replicate data between the S3 bucket in AWS Singapore and an Azure Netherlands Blob storage, then use an external table against the Blob storage.

    Questions 48

    An Architect is designing a file ingestion recovery solution. The project will use an internal named stage for file storage. Currently, in the case of an ingestion failure, the Operations team must manually download the failed file and check for errors.

    Which downloading method should the Architect recommend that requires the LEAST amount of operational overhead?

    Options:

    A.

    Use the Snowflake Connector for Python, connect to remote storage and download the file.

    B.

    Use the get command in SnowSQL to retrieve the file.

    C.

    Use the get command in Snowsight to retrieve the file.

    D.

    Use the Snowflake API endpoint and download the file.
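
For reference, GET is issued from a client such as SnowSQL and downloads staged files to the local file system; a sketch with a hypothetical stage and paths:

GET @my_int_stage/failed/load_001.csv file:///tmp/review/;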

    Exam Code: ARA-C01
    Exam Name: SnowPro Advanced: Architect Certification Exam
    Last Update: May 1, 2024
    Questions: 162