Coursera Answers

Azure Data Lake Storage Gen2 and Data Streaming Solution Coursera Quiz Answers

In this article, I am going to share the Azure Data Lake Storage Gen2 and Data Streaming Solution quiz answers for all weeks with you.

Enrol Link:   Azure Data Lake Storage Gen2 and Data Streaming Solution

WEEK 1 QUIZ ANSWERS

Knowledge check

Question 1)
Azure Data Lake Storage Gen2 supports which of the following authorization mechanisms?

  • Shared access signature (SAS) authorization
  • Shared Key authorization
  • Microsoft SQL Passthrough Authentication
  • Role-based access control (Azure RBAC)
  • Access control lists (ACL)

Question 2)
True or False?
Azure Data Lake Storage organizes the stored data into a hierarchy of directories and subdirectories.

  • True
  • False

Question 3)
Azure Data Lake Storage Gen2 currently supports which of the following features?

  • Geo-redundant storage (GRS) only
  • Locally redundant storage (LRS) only
  • Both LRS and GRS

Question 4)
When creating an Azure Storage account, the name must be unique within which of the following scopes?

  • Resource Group Only
  • Subscription Only
  • Tenant Only
  • Within all of Azure

Question 5)
You have recently created an Azure Storage account and enabled the hierarchical namespace for Data Lake Storage Gen2. Based on your storage requirements, you now decide that this is not needed and you wish to revert to a flat namespace. Where can you go to change the storage account back to a flat namespace?

  • You cannot revert back to a flat namespace
  • The overview page of the Storage Account in the Portal
  • By running a series of PowerShell commands

Question 6)
Azure Data Lake Storage Gen2 plays an important role in a wide range of big data architectures. There are four stages for processing big data solutions that are common to all architectures. Select the four stages from the following options.

  • Store
  • Model and serve
  • Transform Data
  • Prep and train
  • Ingestion

 

Visit this link:  Azure Data Lake Storage Gen2 and Data Streaming Solution Week 1 | Test prep Quiz Answers

WEEK 2 QUIZ ANSWERS

Knowledge check

Question 1)
True or false?
All data written to Azure Storage is automatically encrypted.

  • True
  • False

Question 2)
Azure Storage supports Azure Active Directory and role-based access control (or RBAC) for both resource management and data operations. You can assign RBAC roles that are scoped to which of the following? Select all options that apply.

  • A storage account
  • A table
  • An individual container or queue
  • A resource group
  • A subscription

Question 3)
Azure Storage accounts can create authorized apps in Active Directory to control access to the data in which of the following? Select all options that apply.

  • Azure Tables
  • Azure Blobs
  • Azure Queues
  • Azure Files

Question 4)
In Azure Storage, clients can use a shared key or shared secret for authentication and to restrict access to resources. A shared key is supported by which of the following storage models? Select all options that apply.

  • Tables
  • Disks
  • Queues
  • Blobs
  • Files

Question 5)
True or false?
In Azure Storage accounts, shared keys give access to everything in the account.

  • True
  • False
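
For context on Questions 4 and 5 above: Shared Key authorization works by signing each request with an HMAC-SHA256 over a canonicalized string, using the storage account key, which is why possession of the key grants access to everything in the account. Below is a minimal sketch of just the signing step; the real Azure string-to-sign also includes the HTTP verb, headers, and canonicalized resource, and the key shown is made up.

```python
import base64
import hashlib
import hmac

def sign_request(account_key_b64: str, string_to_sign: str) -> str:
    """Compute a Shared Key style signature:
    Base64(HMAC-SHA256(account_key, string_to_sign)).
    Simplified sketch: Azure's actual string-to-sign contains the verb,
    standard headers, and the canonicalized resource path."""
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return base64.b64encode(digest).decode("utf-8")

# Hypothetical key and string-to-sign, for illustration only.
demo_key = base64.b64encode(b"not-a-real-account-key").decode("utf-8")
signature = sign_request(demo_key, "GET\n\n/myaccount/mycontainer")
print(signature)
```

Because the signature is derived from the account key itself, anyone holding the key can sign any request, which is exactly why shared keys cannot be scoped more narrowly than the whole account.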

 

Visit this link:  Azure Data Lake Storage Gen2 and Data Streaming Solution Week 2 | Test prep Quiz Answers

WEEK 3 QUIZ ANSWERS

Knowledge check

Question 1)
There are generally two approaches to processing data streams. The process of collecting streaming data over time and storing it as static data, to be processed in batches during times when compute costs are lower, is referred to as what?

  • On-Demand processing
  • Controlled processing
  • Live processing

Question 2)
Event processing pipelines provide an end-to-end solution for ingesting, transforming, and analyzing data streams. Which of the following components is responsible for the ingestion and transformation of streaming event data?

  • An event consumer
  • An event producer
  • An event processor

Question 3)
Event processing pipelines provide an end-to-end solution for ingesting, transforming, and analyzing data streams and are made up of three distinct components. Which of the following make up these three components? Select three that apply.

  • An event handler
  • An event consumer
  • An event producer
  • An event processor

Question 4)
Azure Stream Analytics is what type of event processing engine?

  • Software-as-a-Service (SaaS)
  • Infrastructure-as-a-Service (IaaS)
  • Platform-as-a-Service (PaaS)

Question 5)
Which of the following technologies typically provide an ingestion point for data streaming in an event processing solution that uses static data as a source?

  • Azure Event Hubs
  • Azure Blob storage
  • Azure IoT Hub

Question 6)
To consume processed event streaming data in near-real-time to produce dashboards containing rich visualizations, which of the following services should you use?

  • Azure Cosmos DB
  • Power BI
  • Event Hubs

 

Knowledge check

Question 1)
An event is a small packet of information that contains a notification. Events can be published individually, or in batches, but a single publication cannot exceed how much?

  • 2MB
  • 1MB
  • 512KB
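
As a side note on the publication size limit above: a client that batches events itself has to keep each batch under that cap. The official Event Hubs SDKs provide batch objects that enforce this for you, but a rough Python sketch of greedy size-based batching illustrates the idea (the limit constant and event shapes here are illustrative):

```python
import json

MAX_BATCH_BYTES = 1_000_000  # illustrative cap for a single publication

def batch_events(events, max_bytes=MAX_BATCH_BYTES):
    """Greedily pack JSON-serialized events into batches whose combined
    serialized size stays at or under max_bytes. An oversized single
    event still gets its own batch (a real client should reject it)."""
    batches, current, current_size = [], [], 0
    for event in events:
        size = len(json.dumps(event).encode("utf-8"))
        if current and current_size + size > max_bytes:
            batches.append(current)
            current, current_size = [], 0
        current.append(event)
        current_size += size
    if current:
        batches.append(current)
    return batches

# Tiny limit to make the splitting visible: each event serializes to 8 bytes.
batches = batch_events([{"v": i} for i in range(3)], max_bytes=20)
print(batches)  # [[{'v': 0}, {'v': 1}], [{'v': 2}]]
```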

Question 2)
Azure Event Hubs is a cloud-based, event-processing service that can receive and process millions of events per second. An entity that reads data from an event hub is called what? Select two that apply.

  • Subscriber
  • Consumer
  • Reader
  • Publisher

Question 3)
To configure an application to send messages to an Event Hub, which of the following information must be provided?

  • Event Hub name
  • Primary shared access key
  • Shared Access Policy Name
  • Event Hub namespace name
  • Username and Password
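
The pieces of information in the correct options above are exactly what make up a standard Event Hubs connection string. A small illustrative sketch of assembling one (the namespace, hub, policy name, and key below are all hypothetical):

```python
def build_event_hub_connection_string(namespace: str, hub_name: str,
                                      policy_name: str, key: str) -> str:
    """Assemble the four required pieces of information into the
    standard Event Hubs connection-string format."""
    return (
        f"Endpoint=sb://{namespace}.servicebus.windows.net/;"
        f"SharedAccessKeyName={policy_name};"
        f"SharedAccessKey={key};"
        f"EntityPath={hub_name}"
    )

# Hypothetical values, for illustration only.
conn = build_event_hub_connection_string("myns", "myhub", "SendPolicy", "abc123=")
print(conn)
```

Note that no username and password appear anywhere: the shared access policy name and its primary key take their place.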

Question 4)
True or false?
Applications that publish messages to Azure Event Hubs very frequently will get the best performance using Advanced Message Queuing Protocol (AMQP) because it establishes a persistent socket.

  • False
  • True

Question 5)
By default, how many partitions will a new Event Hub have?

  • Three
  • Two
  • Four
  • One

Question 6)
Published events are removed from an event hub based on a configurable, time-based retention policy. What is the shortest possible retention period for published events?

  • 7 Days
  • 1 Day
  • 1 Hour
  • 12 Hours

 

Knowledge check

Question 1)
True or False?
Azure Stream Analytics jobs perform all processing in memory to achieve the low latency required for efficient stream processing.

  • False
  • True

Question 2)
Azure Stream Analytics includes native support for five kinds of temporal windowing functions. Select the correct types of Windowing functions from the following list. Select five that apply.

  • Session
  • StartTime
  • EndTime
  • Sliding
  • Hopping
  • Snapshot
  • Tumbling

Question 3)
In Stream Analytics windowing functions, which of the following generates events for points in time when the content of the window actually changed, in other words, when an event enters or exits the window?

  • Snapshot
  • Session
  • Sliding
  • Hopping
  • Tumbling

Question 4)
Which of the definitions below best describes a Tumbling window?

  • A windowing function that clusters together events that arrive at similar times, filtering out periods of time in which there is no data.
  • A windowing function that segments a data stream into a contiguous series of fixed-size, non-overlapping time segments and operates against them. Events cannot belong to more than one tumbling window.
  • A windowing function that groups events by identical timestamp values.
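
The tumbling-window behaviour described in the second option can be sketched in plain Python rather than Stream Analytics Query Language: each event's timestamp is bucketed into exactly one fixed-size, non-overlapping segment.

```python
from collections import defaultdict

def tumbling_windows(events, window_size):
    """Group (timestamp, value) events into contiguous, fixed-size,
    non-overlapping windows; each event lands in exactly one window."""
    windows = defaultdict(list)
    for ts, value in events:
        start = (ts // window_size) * window_size  # window this event falls in
        windows[start].append(value)
    return dict(windows)

# Events at seconds 1, 4, 5, and 9 with a 5-second tumbling window.
result = tumbling_windows([(1, "a"), (4, "b"), (5, "c"), (9, "d")], 5)
print(result)  # {0: ['a', 'b'], 5: ['c', 'd']}
```

Because the segments never overlap, every event belongs to exactly one window, which is the defining property of tumbling windows as opposed to hopping or sliding ones.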

Question 5)
Which of the following services is a valid input for an Azure Stream Analytics job? Select all that apply.

  • Azure Event Hubs
  • Azure Cosmos DB
  • Blob storage

Question 6)
Below is a list of key benefits of using Azure Stream Analytics to process streaming data. Which of the following statements are correct? Select all that apply.

  • Integration with Azure Blob storage
  • Being able to rapidly deploy queries into production by creating and starting an Azure Stream Analytics job
  • The ability to write and test transformation queries in the Azure portal

 

Visit this link:  Azure Data Lake Storage Gen2 and Data Streaming Solution Week 3 | Test prep Quiz Answers

WEEK 4 QUIZ ANSWERS

 

Visit this link:  Azure Data Lake Storage Gen2 and Data Streaming Solution Week 4 | Course Practice Exam Quiz Answers