Microsoft Azure Databricks for Data Engineering Week 7 | Test prep Quiz Answers

In this article I am going to share the Coursera Course: Microsoft Azure Databricks for Data Engineering Week 7 | Test prep Quiz Answers with you.

Enrol Link: Microsoft Azure Databricks for Data Engineering

Test prep Quiz Answers

Question 1)
Stream processing is where you continuously incorporate new data into Data Lake storage and compute results. Which of the following are examples of Stream processing? Select all that apply.

  • Invoicing
  • Bank Card Processing
  • Monthly Payroll processing
  • IoT Device Data
  • Game play events

Question 2)
The following example creates a schema.
dataSchema = "Recorded_At timestamp, Device string, Index long, Model string, User string, _corrupt_record string, gt string, x double, y double, z double"
In SQL syntax this is referred to as which of the following?

  • Data Control Language (DCL)
  • Data Definition Language (DDL)
  • Data Manipulation Language (DML)
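In PySpark, a DDL-style string like the one above can be passed straight to a reader's schema() method. A minimal sketch (the column list is taken from the question; the reader call is shown in a comment because it needs a live Spark session):

```python
# The schema from the question as a SQL DDL string. Use straight quotes:
# curly "smart" quotes copied from a web page will break the parser.
dataSchema = ("Recorded_At timestamp, Device string, Index long, "
              "Model string, User string, _corrupt_record string, "
              "gt string, x double, y double, z double")

# Inside a Spark session the string is handed directly to a reader, e.g.:
#     df = spark.readStream.schema(dataSchema).json("/mnt/source")
```

DDL type names are case-insensitive, so `String` and `string` are equivalent here.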

Question 3)
Which of the following will allow you to view the list of active streams in an Azure Databricks workspace?

  • spark.streams
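`spark.streams` returns the session's StreamingQueryManager, whose `active` property lists the running queries. A small helper sketch (it assumes the `spark` session that Databricks predefines in a notebook is passed in):

```python
def list_active_streams(spark):
    """Print the id, name, and status of every active streaming query."""
    for query in spark.streams.active:   # list of StreamingQuery objects
        print(query.id, query.name, query.isActive)
```

A single query can also be fetched by id with `spark.streams.get(...)` and halted with `query.stop()`.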

Question 4)
What happens if the command option("checkpointLocation", pointer-to-checkpoint directory) is not specified?

  • The streaming job will function as expected since the checkpointLocation option does not exist
  • When the streaming job stops, all state around the streaming job is lost, and upon restart, the job must start from scratch
  • It will not be possible to create more than one streaming query that uses the same streaming source since they will conflict
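Setting the checkpoint directory on the writer is what lets a restarted job resume from its last committed offsets and state. A hedged sketch (the path names and the Delta sink format are assumptions):

```python
def start_stream(df, output_path, checkpoint_path):
    # Without "checkpointLocation", progress (source offsets and state)
    # lives only in memory: stopping the job discards it, and a restart
    # must begin from scratch.
    return (df.writeStream
              .format("delta")                            # assumed sink format
              .option("checkpointLocation", checkpoint_path)
              .outputMode("append")
              .start(output_path))
```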

Question 5)
Select the correct option to complete the statement below:
In Azure Databricks, a schema is the definition of column names and data types. For file-based streaming sources in Azure Databricks, the schema is __________.

  • Not Required
  • Defined for you
  • User Defined

Question 6)
In Spark Structured Streaming, what method should be used to read streaming data into a DataFrame?

  • spark.readStream
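`spark.readStream` returns a DataStreamReader, and for file-based sources the schema must be supplied by the user, since a streaming reader does not infer it from the files the way a batch read can. A sketch combining the two points (the source directory and trigger option are assumptions):

```python
def read_device_stream(spark, source_dir, data_schema):
    # File-based streaming sources require a user-defined schema up front.
    return (spark.readStream
                 .schema(data_schema)              # e.g. a DDL string
                 .option("maxFilesPerTrigger", 1)  # assumed: throttle for demo
                 .json(source_dir))
```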

Question 7)
In Azure Databricks when creating a new user access token, the Lifetime setting of the access token can be manually set. What is the default Token Lifetime when creating a new token?

  • Indefinite
  • 90 Days
  • 30 Days
  • 120 Days
  • 60 Days

Question 8)
What is the purpose of “Activities” in Azure Data Factory?

  • To link data stores or compute resources together for the movement of data between resources
  • To represent a data store or a compute resource that can host execution of an activity
  • To represent a processing step in a pipeline
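An activity is one processing step inside a pipeline; linked services connect the data stores and compute, and the pipeline groups the activities. As a rough illustration, a Databricks Notebook activity in a pipeline definition looks something like this (all names and paths here are hypothetical):

```json
{
  "name": "TransformWithNotebook",
  "type": "DatabricksNotebook",
  "linkedServiceName": {
    "referenceName": "AzureDatabricksLinkedService",
    "type": "LinkedServiceReference"
  },
  "typeProperties": {
    "notebookPath": "/Shared/transform",
    "baseParameters": { "input_path": "/mnt/raw" }
  }
}
```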

Question 9)
How can parameters be passed into an Azure Databricks notebook from Azure Data Factory?

  • Use notebook widgets
  • Deploy the notebook as a web service
  • Use the API endpoint option on a notebook
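The Databricks Notebook activity in Data Factory passes its baseParameters by name, and the notebook reads each one through a widget of the same name via dbutils. A sketch (the parameter name is hypothetical, and since `dbutils` exists only inside a Databricks notebook, it is passed in as an argument here):

```python
def read_adf_parameter(dbutils, name="input_path"):
    # Declare the widget with an empty default, then read the value that
    # Data Factory supplied in the activity's baseParameters.
    dbutils.widgets.text(name, "")
    return dbutils.widgets.get(name)
```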