
DP-700 Implementing Data Engineering Solutions Using Microsoft Fabric Questions and Answers

Question 4

You need to create the product dimension.

How should you complete the Apache Spark SQL code? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:
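
The code to complete is not reproduced here. As a point of reference, a minimal Spark SQL sketch of building a product dimension with a surrogate key; the schema, table, and column names are assumptions, not values from the case study:

    CREATE TABLE gold.dim_product
    USING DELTA
    AS
    SELECT
        ROW_NUMBER() OVER (ORDER BY ProductID) AS ProductKey,  -- surrogate key
        ProductID,                                             -- business key
        ProductName,
        Category
    FROM silver.products;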

Question 5

You need to recommend a solution to resolve the MAR1 connectivity issues. The solution must minimize development effort. What should you recommend?

Options:

A.

Add a ForEach activity to the data pipeline.

B.

Configure retries for the Copy data activity.

C.

Configure Fault tolerance for the Copy data activity.

D.

Call a notebook from the data pipeline.

Question 6

You need to ensure that WorkspaceA can be configured for source control. Which two actions should you perform?

Each correct answer presents part of the solution. NOTE: Each correct selection is worth one point.

Options:

A.

Assign WorkspaceA to Cap1.

B.

From Tenant settings, set Users can synchronize workspace items with their Git repositories to Enabled.

C.

Configure WorkspaceA to use a Premium Per User (PPU) license.

D.

From Tenant settings, set Users can sync workspace items with GitHub repositories to Enabled.

Question 7

You need to populate the MAR1 data in the bronze layer.

Which two types of activities should you include in the pipeline? Each correct answer presents part of the solution.

NOTE: Each correct selection is worth one point.

Options:

A.

ForEach

B.

Copy data

C.

WebHook

D.

Stored procedure

Question 8

You need to ensure that the data analysts can access the gold layer lakehouse.

What should you do?

Options:

A.

Add the DataAnalysts group to the Viewer role for WorkspaceA.

B.

Share the lakehouse with the DataAnalysts group and grant the Build reports on the default semantic model permission.

C.

Share the lakehouse with the DataAnalysts group and grant the Read all SQL Endpoint data permission.

D.

Share the lakehouse with the DataAnalysts group and grant the Read all Apache Spark permission.

Question 9

You need to schedule the population of the medallion layers to meet the technical requirements.

What should you do?

Options:

A.

Schedule a data pipeline that calls other data pipelines.

B.

Schedule a notebook.

C.

Schedule an Apache Spark job.

D.

Schedule multiple data pipelines.

Question 10

You need to recommend a method to load the POS1 data into the lakehouse medallion layers.

What should you recommend for each layer? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:

Question 11

You need to recommend a solution for handling old files. The solution must meet the technical requirements. What should you include in the recommendation?

Options:

A.

a data pipeline that includes a Copy data activity

B.

a notebook that runs the VACUUM command

C.

a notebook that runs the OPTIMIZE command

D.

a data pipeline that includes a Delete data activity
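
For context on the VACUUM route: in a lakehouse notebook, VACUUM permanently removes data files that are no longer referenced by the Delta transaction log and are older than the retention window, whereas OPTIMIZE only compacts small files into larger ones. A minimal Spark SQL sketch with a hypothetical table name and retention period:

    -- Delete unreferenced files older than 7 days (168 hours).
    VACUUM sales_bronze RETAIN 168 HOURS;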

Question 12

You have a Fabric workspace that contains a warehouse named Warehouse1. Data is loaded daily into Warehouse1 by using data pipelines and stored procedures.

You discover that the daily data load takes longer than expected.

You need to monitor Warehouse1 to identify the names of users that are actively running queries.

Which view should you use?

Options:

A.

sys.dm_exec_connections

B.

sys.dm_exec_requests

C.

queryinsights.long_running_queries

D.

queryinsights.frequently_run_queries

E.

sys.dm_exec_sessions
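
For reference, the sys.dm_exec_* views join on session_id, which is how a login name is paired with its active requests. A hedged T-SQL sketch of that join:

    SELECT s.login_name,
           r.session_id,
           r.status,
           r.command,
           r.start_time
    FROM sys.dm_exec_sessions AS s
    INNER JOIN sys.dm_exec_requests AS r
        ON s.session_id = r.session_id
    WHERE r.status = 'running';   -- only requests executing right now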

Question 13

You are building a data loading pattern by using a Fabric data pipeline. The source is an Azure SQL database that contains 25 tables. The destination is a lakehouse.

In a warehouse, you create a control table named Control.Object as shown in the exhibit.

You need to build a data pipeline that will support the dynamic ingestion of the tables listed in the control table by using a single execution.

Which three actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order.

Options:
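
The exhibit is not reproduced here. In this metadata-driven pattern, the control table usually holds one row per source table for the pipeline to look up and iterate over. A hypothetical shape for such a table, not the actual exhibit:

    CREATE TABLE Control.[Object]
    (
        SourceSchema varchar(50)  NOT NULL,  -- schema in the Azure SQL database
        SourceTable  varchar(100) NOT NULL,  -- one row per table to ingest
        TargetTable  varchar(100) NOT NULL,  -- destination in the lakehouse
        IsActive     bit          NOT NULL   -- lets tables be switched off
    );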

Question 14

You have an Azure key vault named KeyVault1 that contains secrets.

You have a Fabric workspace named Workspace1. Workspace1 contains a notebook named Notebook1 that performs the following tasks:

• Loads staged data into the target tables in a lakehouse

• Triggers the refresh of a semantic model

You plan to add functionality to Notebook1 that will use the Fabric API to monitor the semantic model refreshes. You need to retrieve the registered application ID and secret from KeyVault1 to generate the authentication token.

Solution: You use notebookutils.credentials.getSecret, specifying the key vault URL and the name of the key vault secret.

Does this meet the goal?

Options:

A.

Yes

B.

No

Question 15

You have a Fabric warehouse named DW1 that contains a Type 2 slowly changing dimension (SCD) table named DimCustomer. DimCustomer contains 100 columns and 20 million rows. The columns are of various data types, including int, varchar, date, and varbinary.

You need to identify incoming changes to the table and update the records when there is a change. The solution must minimize resource consumption.

What should you use to identify changes to attributes?

Options:

A.

a direct comparison of the attributes in the source table.

B.

a hash function to compare the attributes in the DimCustomer table.

C.

a direct comparison of the attributes in the DimCustomer table.

D.

a hash function to compare the attributes in the source table.
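
To make the hash-based options concrete: instead of comparing 100 columns one by one, a single hash per row can be compared against a hash stored on the dimension. A hedged T-SQL sketch, assuming HASHBYTES is available in the target SQL surface and using made-up table, column, and hash-column names:

    SELECT s.CustomerID
    FROM stg.Customer AS s
    INNER JOIN dbo.DimCustomer AS d
        ON  d.CustomerID = s.CustomerID
        AND d.IsCurrent = 1
    WHERE HASHBYTES('SHA2_256',
            CONCAT_WS('|', s.FirstName, s.LastName, s.City)) <> d.RowHash;
    -- Note: CONCAT_WS skips NULLs, so NULL-safe sentinels are needed in practice.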

Question 16

HOTSPOT

You have a Fabric workspace that contains two lakehouses named Lakehouse1 and Lakehouse2. Lakehouse1 contains staging data in a Delta table named Orderlines. Lakehouse2 contains a Type 2 slowly changing dimension (SCD) table named Dim_Customer.

You need to build a query that will combine data from Orderlines and Dim_Customer to create a new fact table named Fact_Orders. The new table must meet the following requirements:

• Enable the analysis of customer orders based on historical attributes.

• Enable the analysis of customer orders based on the current attributes.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:
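
The statement to complete is not shown. For orientation, the usual Type 2 pattern joins the fact source to the dimension on the business key plus the row validity window (historical attributes), while also keeping the business key on the fact so current attributes can be resolved later. A Spark SQL sketch with assumed column names:

    SELECT o.OrderID,
           o.OrderDate,
           c.CustomerSK AS CustomerHistKey,  -- dimension row valid at order date
           c.CustomerID AS CustomerBusKey    -- joins to the current dimension row
    FROM Orderlines AS o
    INNER JOIN Dim_Customer AS c
        ON  o.CustomerID = c.CustomerID
        AND o.OrderDate >= c.ValidFrom
        AND o.OrderDate <  c.ValidTo;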

Question 17

What should you do to optimize the query experience for the business users?

Options:

A.

Enable V-Order.

B.

Create and update statistics.

C.

Run the VACUUM command.

D.

Introduce primary keys.
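
For context on the statistics route: a Fabric warehouse accepts standard T-SQL for creating and refreshing statistics. A minimal sketch with hypothetical object names:

    -- Create single-column statistics, then refresh after each daily load.
    CREATE STATISTICS stat_factsales_customerid
        ON dbo.FactSales (CustomerID);

    UPDATE STATISTICS dbo.FactSales;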

Question 18

You need to implement the solution for the book reviews.

What should you do?

Options:

A.

Create a Dataflow Gen2 dataflow.

B.

Create a shortcut.

C.

Enable external data sharing.

D.

Create a data pipeline.

Question 19

You need to resolve the sales data issue. The solution must minimize the amount of data transferred.

What should you do?

Options:

A.

Split the dataflow into two dataflows.

B.

Configure scheduled refresh for the dataflow.

C.

Configure incremental refresh for the dataflow. Set Store rows from the past to 1 Month.

D.

Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Year.

E.

Configure incremental refresh for the dataflow. Set Refresh rows from the past to 1 Month.

Question 20

You need to ensure that the authors can see only their respective sales data.

How should you complete the statement? To answer, drag the appropriate values to the correct targets. Each value may be used once, more than once, or not at all. You may need to drag the split bar between panes or scroll to view content.

NOTE: Each correct selection is worth one point.

Options:
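
The drag targets are not reproduced here. Row-level security of this kind is typically a predicate function bound to the table by a security policy. A hedged T-SQL sketch; the table, column, and object names are assumptions:

    -- Predicate function: returns a row only for the querying author.
    CREATE FUNCTION dbo.fn_AuthorFilter (@AuthorEmail AS varchar(256))
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
    RETURN SELECT 1 AS fn_result
           WHERE @AuthorEmail = USER_NAME();

    -- Security policy: applies the predicate to every query against dbo.Sales.
    -- (Run each statement as its own batch.)
    CREATE SECURITY POLICY dbo.AuthorSalesFilter
        ADD FILTER PREDICATE dbo.fn_AuthorFilter(AuthorEmail)
        ON dbo.Sales
        WITH (STATE = ON);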

Question 21

HOTSPOT

You need to troubleshoot the ad-hoc query issue.

How should you complete the statement? To answer, select the appropriate options in the answer area.

NOTE: Each correct selection is worth one point.

Options:
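
The completion targets are not shown. For reference, a Fabric warehouse exposes historical query telemetry through the queryinsights schema, which is queried like any other view. A minimal T-SQL sketch; the column names follow the queryinsights documentation but should be verified:

    -- The ten slowest completed requests.
    SELECT TOP 10 *
    FROM queryinsights.exec_requests_history
    WHERE status = 'Succeeded'
    ORDER BY total_elapsed_time_ms DESC;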
