Data Integration with Microsoft Azure Data Factory Coursera Quiz Answers

In this article, I am going to share the Data Integration with Microsoft Azure Data Factory Coursera quiz answers with you.

Enrol Link: Data Integration with Microsoft Azure Data Factory


WEEK 1 QUIZ ANSWERS

Knowledge check

Question 1)
Much of the functionality of Azure Data Factory appears in Azure Synapse Analytics as a feature called pipelines. You can use it to integrate data pipelines between which of the following?
Select all options that apply.

  • Apache Hive
  • SQL Serverless
  • SQL Pools
  • Spark Pools

Question 2)
Which of the following provides a cloud-based data integration service that orchestrates the movement and transformation of data between various data stores and compute resources?

  • Azure HDInsight
  • Azure Databricks
  • Azure Data Factory
  • Azure SQL Database

Question 3)
Although ADF has native functionality to ingest and transform data, it will sometimes instruct another service, such as Databricks, to perform the actual work required on its behalf. Which of the following terms best describes this process?

  • Integration
  • Transformation
  • Orchestration

Question 4)
Which of the following terms describes analyzing past data patterns and trends by looking at historical data and customer insights?

  • Prescriptive Analytics
  • Descriptive Analytics
  • Predictive Analytics

Question 5)
Microsoft Azure provides a variety of data platform services that enable you to perform different types of analytics. Predictive analytics can be implemented through which of the following features?
Select all options that apply

  • Azure Data Lake Storage Gen2
  • Machine Learning Services
  • Azure Databricks
  • HDInsight

Question 6)
Data integration includes extraction, transformation, and loading of data. It is commonly referred to as Extract-Transform-Load or ETL.
At which stage in the ETL process is the splitting, combining, deriving, adding, and removing data carried out?

  • Load
  • Transform
  • Extract

 

Knowledge check

Question 1)
You are creating a new Azure Data Factory instance. The instance name must be unique within which of the following?

  • Globally within Azure
  • The region
  • The Resource Group
  • The Azure Subscription

Question 2)
How would you define an Azure Data Factory dataset?

  • A dataset is a named view that points to, or references, the data.
  • A dataset does not exist in Azure Data Factory.
  • A dataset is a table that holds a copy of the data from the source.
  • A dataset is a table that holds a copy of the data for the destination.

Question 3)
How would you define an Azure Data Factory pipeline?

  • A Pipeline is a named view of data.
  • A pipeline is a logical grouping of activities that together perform a task.
  • A pipeline contains the transformation logic, or the analysis commands, of the Azure Data Factory’s work.

Question 4)
When working with Azure Data Factory, how many linked services are required to copy data from Blob storage to a SQL database?

  • 2
  • 3
  • 0
  • 1
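
For context, the Copy Activity needs one linked service for the Blob storage source and a second for the SQL database sink, which is why two are required. The following is a minimal sketch of creating both with the azure-mgmt-datafactory Python SDK; the subscription ID, resource group, factory name, linked service names, and connection strings are all placeholders.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureBlobStorageLinkedService,
    AzureSqlDatabaseLinkedService,
    LinkedServiceResource,
    SecureString,
)

# Placeholder identifiers -- substitute your own values.
client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, factory = "<resource-group>", "<factory-name>"

# Linked service 1: the Blob storage source.
blob_ls = LinkedServiceResource(
    properties=AzureBlobStorageLinkedService(
        connection_string=SecureString(value="<blob-connection-string>")
    )
)
client.linked_services.create_or_update(rg, factory, "BlobStorageLS", blob_ls)

# Linked service 2: the Azure SQL Database sink.
sql_ls = LinkedServiceResource(
    properties=AzureSqlDatabaseLinkedService(
        connection_string=SecureString(value="<sql-connection-string>")
    )
)
client.linked_services.create_or_update(rg, factory, "AzureSqlLS", sql_ls)
```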

Question 5)
What are the three categories of activities within Azure Data Factory that define the actions to be performed on the data?
Select all options that apply.

  • Control
  • Linked Service
  • Data transformation
  • Data movement

Question 6)
When graphically authoring ADF solutions, you can use the control flow within the design to orchestrate which of the following pipeline activities?
Select all options that apply.

  • Execute Pipeline Activity
  • Parameters Activity
  • WebActivity
  • ForEach Activity

 

Visit this link:  Data Integration with Microsoft Azure Data Factory Week 1 | Test prep Quiz Answers

 


 

WEEK 2 QUIZ ANSWERS

Knowledge check

Question 1)
Which of the following methods provides the ability to build code-free data ingestion pipelines with zero transformation during the extraction of the data?

  • Azure Databricks
  • Azure SQL Data Warehouse
  • Azure Copy Activity
  • Azure Machine Learning Studio

Question 2)
Connectors are Azure Data Factory objects that enable your Linked Services and Datasets to connect to a wide variety of data sources and sinks. Which of the following are supported file formats? Select all options that apply.

  • Parquet format
  • YAML Format
  • Binary format
  • ORC format

Question 3)
You want to ingest data from a SQL Server database hosted on an on-premises Windows Server. What integration runtime is required for Azure Data Factory to ingest data from the on-premises server?

  • Self-Hosted Integration Runtime
  • Azure Integration Runtime
  • Azure-SSIS Integration Runtime

Question 4)
In Azure Data Factory, when creating a new Copy Activity, which of the following steps do you carry out?

  • Click on Monitor and add a new pipeline and copy activity
  • Click on Author, and create a new copy activity
  • Click on Manage, and add a new copy activity
  • Open the authoring canvas, create a new Pipeline, or use an existing pipeline and then add a new copy activity
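
In the portal those steps happen on the authoring canvas; the programmatic equivalent is to define a pipeline that contains a Copy Activity. Here is a rough sketch with the azure-mgmt-datafactory Python SDK, assuming the hypothetical datasets BlobInputDataset and SqlOutputDataset already exist in the factory.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    BlobSource,
    CopyActivity,
    DatasetReference,
    PipelineResource,
    SqlSink,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg, factory = "<resource-group>", "<factory-name>"

# The Copy Activity reads from a source dataset and writes to a sink dataset.
copy_activity = CopyActivity(
    name="CopyBlobToSql",
    inputs=[DatasetReference(type="DatasetReference", reference_name="BlobInputDataset")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="SqlOutputDataset")],
    source=BlobSource(),  # how the blob data is read
    sink=SqlSink(),       # how the rows are written to the SQL table
)

# A pipeline is simply a logical grouping of activities.
pipeline = PipelineResource(activities=[copy_activity])
client.pipelines.create_or_update(rg, factory, "CopyPipeline", pipeline)
```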

Question 5)
Azure Data Factory provides the ability to lift and shift existing SQL Server Integration Services workloads and natively execute these packages. Which of the following would you select to do this?

  • An Azure Function
  • Integrate with Azure Databricks
  • Use Azure Machine Learning
  • An Azure-SSIS Integration Runtime

 

Visit this link:  Data Integration with Microsoft Azure Data Factory Week 2 | Test prep Quiz Answers

 


 

WEEK 3 QUIZ ANSWERS

Knowledge check

Question 1)
Azure Data Factory provides a range of methods to perform transformations. Which of the following would you use to perform transformations without writing code?

  • Mapping Data Flows
  • Stored procedure activity
  • HDInsight Pig activity
  • Azure Machine Learning Studio

Question 2)
Which of the following transformations are directly available in the Mapping Data Flows activity? Select all options that apply.

  • Insert
  • Conditional Split
  • Exists
  • Aggregate

Question 3)
Mapping Data Flow follows an extract, load, transform (ELT) approach and works with staging datasets that are in Azure. Which of the following datasets can be used in a source transformation? Select all options that apply.

  • Azure CosmosDB
  • Azure Synapse Analytics
  • Azure Blob Storage (JSON, Avro, Text, Parquet)
  • Microsoft SQL Server

Question 4)
Mapping Data Flows support Debug so that you can interactively watch how the data transformations are executing. When enabling Debug, you are prompted to select the integration runtime to use. If you select AutoResolveIntegrationRuntime, a cluster will be made available automatically. How many cores will be made available in the cluster, and what will the time to live be in minutes?

  • 8 cores and 30 minutes
  • 4 cores and 30 minutes
  • 4 cores and 60 minutes
  • 8 cores and 60 minutes

Question 5)
When working with Azure Data Factory and wrangling data flows, you add a source dataset for your wrangling data flow and select a sink dataset. If you use an Azure Data Lake Storage Gen2 connector, what type of authentication is supported in this instance? Select all options that apply.

  • SQL authentication
  • Anonymous Access
  • Service Principal
  • Account Key

Note: Azure Data Lake Storage Gen2 supports data formats of CSV and Parquet, and authentication via Account Key and Service Principal.

Question 6)
In Azure Data Factory transformations, which transformation in the Mapping Data Flow is used to route data rows to different streams based on matching conditions?

  • Lookup
  • Union
  • Conditional Split
  • Select
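
As a conceptual analogy only (plain Python, not ADF code), a Conditional Split sends each row down the first stream whose condition it matches, with a default stream catching everything else:

```python
# Toy model of a Conditional Split: each row goes to the first stream
# whose predicate matches; unmatched rows fall through to "default".
def conditional_split(rows, conditions):
    streams = {name: [] for name in conditions}
    streams["default"] = []
    for row in rows:
        for name, predicate in conditions.items():
            if predicate(row):
                streams[name].append(row)
                break
        else:  # no condition matched this row
            streams["default"].append(row)
    return streams

rows = [{"country": "US", "amount": 10}, {"country": "DE", "amount": 5}]
streams = conditional_split(rows, {"domestic": lambda r: r["country"] == "US"})
print(streams["domestic"])  # [{'country': 'US', 'amount': 10}]
print(streams["default"])   # [{'country': 'DE', 'amount': 5}]
```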

 

Knowledge check

Question 1)
Star schema design theory refers to common SCD types. In this design approach, the Dimension Table always reflects the latest values, and when changes in source data are detected, the dimension table data is overwritten. What SCD Type is being referred to in this scenario?

  • Type 3 SCD
  • Type 1 SCD
  • Type 6 SCD
  • Type 2 SCD

Question 2)
Star schema design theory refers to common SCD types. In this design approach, the dimension table supports storing two versions of a dimension member as separate columns. What SCD Type is being referred to in this scenario?

  • Type 3 SCD
  • Type 2 SCD
  • Type 6 SCD
  • Type 1 SCD

Question 3)
You want to load a dimension table in Azure Synapse from source data in your Azure Synapse database using mapping data flows. Which of the following hubs in Synapse Studio would you choose to create a Type 1 SCD in a Mapping Data Flow pipeline activity?

  • Develop
  • Integrate
  • Manage
  • Data

Question 4)
Which SCD type would you use to keep history of changes in dimension members by adding a new row to the table for each change?

  • Type 3 SCD
  • Type 1 SCD
  • Type 6 SCD
  • Type 2 SCD

Question 5)
Which SCD type would you use to update the dimension members without keeping track of history?

  • Type 2 SCD
  • Type 6 SCD
  • Type 1 SCD
  • Type 3 SCD
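
To make the Type 1 versus Type 2 behavior concrete, here is a small plain-Python sketch over an in-memory dimension table; the column names are illustrative, not from the course:

```python
from datetime import date

# A toy dimension table: one dict per dimension row.
dim_customer = [
    {"key": 1, "customer_id": "C1", "city": "Seattle",
     "is_current": True, "valid_from": date(2020, 1, 1)},
]

def scd_type1_update(dim, customer_id, new_city):
    """Type 1: overwrite the attribute in place -- no history is kept."""
    for row in dim:
        if row["customer_id"] == customer_id:
            row["city"] = new_city

def scd_type2_update(dim, customer_id, new_city):
    """Type 2: expire the current row and add a new one -- history is kept."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            row["is_current"] = False
    dim.append({"key": max(r["key"] for r in dim) + 1,
                "customer_id": customer_id, "city": new_city,
                "is_current": True, "valid_from": date.today()})

scd_type2_update(dim_customer, "C1", "Portland")
print(dim_customer)  # two rows: the expired Seattle row plus the current Portland row
```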

 

Visit this link:  Data Integration with Microsoft Azure Data Factory Week 3 | Test prep Quiz Answers

 


 

WEEK 4 QUIZ ANSWERS

Knowledge check

Question 1)
Control flow is the orchestration of pipeline activities. The If Condition activity, which is similar to an if-statement in programming languages, is an example of which type of control flow activity?

  • Looping Containers
  • Trigger-based flows
  • Branching Activity
  • Chaining Activity
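
As an illustration of a branching activity, an If Condition can be sketched with the azure-mgmt-datafactory Python SDK roughly as follows; the expression, parameter, and activity names are invented for the example:

```python
from azure.mgmt.datafactory.models import (
    Expression,
    IfConditionActivity,
    WaitActivity,
)

# The expression is evaluated at run time and decides which list of
# inner activities executes -- exactly like an if-statement.
branch = IfConditionActivity(
    name="CheckRowCount",
    expression=Expression(value="@greater(pipeline().parameters.rowCount, 0)"),
    if_true_activities=[WaitActivity(name="HasRows", wait_time_in_seconds=1)],
    if_false_activities=[WaitActivity(name="NoRows", wait_time_in_seconds=1)],
)
```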

Question 2)
In Azure Data Factory, which of the following refers to a combination of activities, such as ingesting and cleaning log data alongside a mapping data flow that analyzes the log data?

  • Looping containers
  • Branching activity
  • Pipeline
  • Custom State passing

Question 3)
In Azure Data Factory there are many activities that are possible in a pipeline. These activities are grouped into three categories. Which of the following are categories of activities?
Select all options that apply.

  • Control activities
  • Data movement activities
  • Data transformation activities
  • Branching Activities

Question 4)
In Azure Data Factory, activity dependency defines how subsequent activities depend on previous activities. An activity that depends on one or more previous activities can have different dependency conditions. What are the four dependency conditions? Select all options that apply.

  • Completed with Errors
  • Completed
  • Succeeded
  • Skipped
  • Pending
  • Failed
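
These dependency conditions are declared on the downstream activity. A minimal sketch with the azure-mgmt-datafactory Python SDK, using hypothetical activity names, might look like this:

```python
from azure.mgmt.datafactory.models import ActivityDependency, WaitActivity

stage = WaitActivity(name="StageData", wait_time_in_seconds=1)

# "CleanUp" runs only after "StageData" finishes with one of the listed
# conditions; valid values are Succeeded, Failed, Skipped, and Completed.
clean_up = WaitActivity(
    name="CleanUp",
    wait_time_in_seconds=1,
    depends_on=[ActivityDependency(activity="StageData",
                                   dependency_conditions=["Succeeded"])],
)
```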

Question 5)
True or False
In Azure Data Factory, pipelines or activities must be published prior to being debugged.

  • True
  • False

Question 6)
In Azure Data Factory you enable Debug by selecting the Debug slider. What features will be enabled when selecting the slider?

  • Enables the Debug feature only
  • Enables the Debug feature, runs the pipeline, and publishes the pipeline
  • Enables the Debug feature and publishes the pipeline
  • Enables the Debug feature and runs the pipeline

 

Visit this link:  Data Integration with Microsoft Azure Data Factory Week 4 | Test prep Quiz Answers

 


 

WEEK 5 QUIZ ANSWERS

Knowledge check

Question 1)
In SQL Server, which of the following is a unit of deployment for SSIS solutions?

  • SSIS Projects
  • SSIS Packages

Question 2)
In Azure Data Factory, an Integration Runtime (IR) can be defined as which of the following?

  • The bridge between the activity and the linked service
  • The Action to be performed
  • A target data store on a compute service

Question 3)
When creating a new SSIS Integration Runtime, which of the following pieces of information are required on the General settings page? Select all options that apply.

  • Node Number
  • Location
  • Description
  • Name
  • Node Size
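
Those General settings map onto the managed integration runtime's properties. Here is a rough sketch using the azure-mgmt-datafactory Python SDK, to the best of my understanding of its models; every value shown is a placeholder:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    IntegrationRuntimeComputeProperties,
    IntegrationRuntimeResource,
    ManagedIntegrationRuntime,
)

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Name, description, location, node size, and node number from the
# General settings page all show up here.
ssis_ir = IntegrationRuntimeResource(
    properties=ManagedIntegrationRuntime(
        description="Azure-SSIS IR for lifted-and-shifted packages",
        compute_properties=IntegrationRuntimeComputeProperties(
            location="WestEurope",
            node_size="Standard_D4_v3",
            number_of_nodes=2,
        ),
    )
)
client.integration_runtimes.create_or_update(
    "<resource-group>", "<factory-name>", "MySsisIR", ssis_ir  # "MySsisIR" = the Name setting
)
```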

Question 4)
When you migrate your database workloads from SQL Server on premises to Azure SQL database services, you may have to migrate SSIS packages as well. The first step required is to perform an assessment of your current SSIS packages to make sure that they are compatible in Azure. Which of the following tools would be most appropriate to check for compatibility issues?

  • SQL Server Data Tools (SSDT)
  • Data Migration Assistant (DMA)
  • SQL Server Management Studio (SSMS)

Question 5)
An SSIS Project is the unit of deployment for SSIS solutions. In which version of SQL Server were SSIS Projects introduced?

  • SQL Server 2012
  • SQL Server 2008
  • SQL Server 2016

Question 6)
Which tool is used to create and deploy SQL Server Integration Services packages on an Azure-SSIS integration runtime, or for on-premises SQL Server?

  • dtexec
  • SQL Server Data Tools
  • SQL Server Management Studio

 

Visit this link:  Data Integration with Microsoft Azure Data Factory Week 5 | Test prep Quiz Answers

 


 

WEEK 6 QUIZ ANSWERS

Knowledge check

Question 1)
In Azure Data Factory you can interact programmatically using multiple languages and SDKs. Which of the following languages and SDKs can you use? Select all options that apply.

  • PowerShell
  • YAML
  • REST APIs
  • .NET
  • Python
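
As one example of that programmatic surface, triggering and then checking a pipeline run from Python might look like the following; all resource names are placeholders:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

# Kick off a published pipeline, then poll the run for its status.
run = client.pipelines.create_run(
    "<resource-group>", "<factory-name>", "CopyPipeline", parameters={}
)
status = client.pipeline_runs.get("<resource-group>", "<factory-name>", run.run_id)
print(status.status)  # e.g. "InProgress" or "Succeeded"
```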

Question 2)
Azure Data Factory allows you to configure a Git repository with either Azure Repos or GitHub.
Which of the following are advantages of using Git integration?
Select all options that apply.

  • Partial saves
  • PowerShell or SDK changes are published directly to Git
  • Source control
  • Better CI/CD

Question 3)
Which of the following would you select when you need to merge changes in your feature Branch to your collaboration branch?

  • Create Merge Request
  • Create Push Request
  • Create pull request

Question 4)
What effect does removing a Git configuration from a Data Factory have on the repository?

  • It does not delete anything from the repository and the factory will still contain all published resources and you can continue to edit the factory directly against the service.
  • It deletes everything from the repository. The factory will still contain all published resources and you can continue to edit the factory directly against the service.
  • It deletes everything from the repository. The factory will also lose access to all published resources, and you cannot continue to edit the factory directly against the service.

Question 5)
Mapping data flows allow you to build code-free data transformation logic that runs at scale. When building your logic, you can turn on a debug session to interactively work with your data. Which of the following is created when turning on Debug?

  • Spark Cluster
  • MS SQL Cluster
  • Blob Storage
  • Data Lake Storage Account

Question 6)
In Azure Data Factory, what is a supported connector for built-in parameterization?

  • Azure Synapse Analytics
  • Azure Data Lake Storage Gen2
  • Azure Key Vault

 

Visit this link:  Data Integration with Microsoft Azure Data Factory Week 6 | Test prep Quiz Answers

 


 

WEEK 7 QUIZ ANSWERS

Knowledge check

Question 1)
True or False
Azure Data Factory provides support for both code-free Extract Transform Load (ETL) and Extract Load Transform (ELT) processes.

  • True
  • False

Question 2)
As a data provider, you can share datasets with third parties or between departments within your organization. You would like to give your data consumers the ability to automatically get refreshed data. In this scenario, which feature would you enable?

  • Data Ingestion
  • Snapshots

Question 3)
Azure Data Share provides open and flexible data sharing, including the ability to share from and to different data stores. Which of the following Data stores support incremental snapshots? Select all that apply

  • Azure Blob Storage
  • Azure Data Lake Storage Gen1
  • Azure SQL Database
  • Azure Data Lake Storage Gen2

Question 4)
When creating an Azure Data Share from within the portal you must specify which of the following? Select all options that apply.

  • Name
  • Version
  • Tag
  • Resource Group Name
  • Location

Question 5)
You are defining the connection information for Azure Data Factory to an external resource that you want to connect with, such as an Azure SQL Database. In this scenario, which of the following do you need to create to connect to an external resource?

  • Dataset
  • Pipeline
  • Linked Service

Question 6)
When creating Azure Data Shares which of the following should be selected to enable access to data at source?

  • In-Place sharing
  • Snapshot sharing

 

Knowledge check

Question 1)
A Mapping Data Flow is a visually designed transformation service in Azure Data Factory, giving you the ability to visually design things such as different types of joins, aggregate functions, and so on. Which of the following join types are supported in Mapping Data Flow? Select all options that apply.

  • Right Outer
  • Left Outer
  • Inner Join
  • Union

Question 2)
In Azure Data Factory, after you have finished joining and transforming a dataset through a mapping data flow, it is important to write the newly created dataset into a destination store. Which of the following options would you use to do this?

  • Union
  • Merge
  • Sink
  • Copy

Question 3)
In a Mapping Data Flow, after you finish transforming your data, you write it into a destination store by using the sink transformation. When using sink transformations, which of the following statements is true?

  • Every data flow requires at least one sink transformation, and you can write to multiple sinks as necessary.
  • Every data flow requires at least one sink transformation, but cannot write to multiple sinks.

Question 4)
In Azure Data Factory, what is a logical grouping of activities referred to as?

  • A Pipeline
  • A Linked Service
  • Data Ingestion
  • A Sink

Question 5)
When adding or removing datasets created with Azure Data Share, which of the following is true?

  • It is not possible to add or remove datasets if created with Azure Data Share.
  • It is possible to add or remove datasets within Azure Data Share after it has been created.
  • It is only possible to remove or add datasets before it’s sent within Azure Data Share.

Question 6)
What must be done when a connector in Azure Data Factory is not supported in a mapping data flow task and you need to transform data from one of those sources?

  • Use a group by activity in the Dataflow activity.
  • Ingest the data into a supported source using the copy activity.
  • Use an aggregate transformation in the Dataflow activity.

 

Visit this link:  Data Integration with Microsoft Azure Data Factory Week 7 | Test prep Quiz Answers

 


 

WEEK 8 QUIZ ANSWERS

Visit this link:  Data Integration with Microsoft Azure Data Factory Week 8 | Course practice exam Quiz Answers