Download Microsoft.DP-100.CertDumps.2020-04-20.91q.tqb

Vendor: Microsoft
Exam Code: DP-100
Exam Name: Designing and Implementing a Data Science Solution on Azure
Date: Apr 20, 2020
File Size: 7 MB

Demo Questions

Question 1
You are building an intelligent solution using machine learning models. 
The environment must support the following requirements:
  • Data scientists must build notebooks in a cloud environment 
  • Data scientists must use automatic feature engineering and model building in machine learning pipelines. 
  • Notebooks must be deployed to retrain using Spark instances with dynamic worker allocation. 
  • Notebooks must be exportable to be version controlled locally. 
You need to create the environment. 
Which four actions should you perform in sequence? To answer, move the appropriate actions from the list of actions to the answer area and arrange them in the correct order. 
Correct answer: To work with this question, an Exam Simulator is required.
Explanation:
Step 1: Create an Azure HDInsight cluster to include the Apache Spark MLlib library
Step 2: Install Microsoft Machine Learning for Apache Spark
You install MMLSpark on your Azure HDInsight cluster. 
Microsoft Machine Learning for Apache Spark (MMLSpark) provides a number of deep learning and data science tools for Apache Spark, including seamless integration of Spark Machine Learning pipelines with Microsoft Cognitive Toolkit (CNTK) and OpenCV, enabling you to quickly create powerful, highly-scalable predictive and analytical models for large image and text datasets. 
Step 3: Create and execute the Zeppelin notebooks on the cluster
Step 4: When the cluster is ready, export Zeppelin notebooks to a local environment.
Notebooks must be exportable to be version controlled locally. 
References: 
https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-zeppelin-notebook
https://azuremlbuild.blob.core.windows.net/pysparkapi/intro.html
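
For context, here is a minimal Spark MLlib pipeline of the kind you might run in a Zeppelin notebook on the cluster. The column names and sample data are illustrative only; MMLSpark layers additional estimators on top of the same pyspark.ml Pipeline API.

```python
# Minimal pyspark.ml pipeline: assemble features, then train a classifier.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import VectorAssembler
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("mllib-demo").getOrCreate()

# Hypothetical training data; on HDInsight this would normally come from storage.
train = spark.createDataFrame(
    [(1.0, 2.0, 0.0), (2.0, 1.0, 1.0), (3.0, 4.0, 0.0), (4.0, 3.0, 1.0)],
    ["feature_a", "feature_b", "label"],
)

# MLlib estimators expect a single vector column, so assemble the raw columns first.
assembler = VectorAssembler(inputCols=["feature_a", "feature_b"], outputCol="features")
lr = LogisticRegression(featuresCol="features", labelCol="label")
model = Pipeline(stages=[assembler, lr]).fit(train)

model.transform(train).select("features", "label", "prediction").show()
```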
Question 2
You are developing a hands-on workshop to introduce Docker for Windows to attendees. 
You need to ensure that workshop attendees can install Docker on their devices. 
Which two prerequisite components should attendees install on the devices? Each correct answer presents part of the solution. 
NOTE: Each correct selection is worth one point.
  A. Microsoft Hardware-Assisted Virtualization Detection Tool
  B. Kitematic
  C. BIOS-enabled virtualization
  D. VirtualBox
  E. Windows 10 64-bit Professional
Correct answer: CE
Explanation:
C: Make sure your Windows system supports Hardware Virtualization Technology and that virtualization is enabled. 
Ensure that hardware virtualization support is turned on in the BIOS settings. 
E: To run Docker, your machine must have a 64-bit operating system running Windows 7 or higher. 
References:
https://docs.docker.com/toolbox/toolbox_install_windows/
https://blogs.technet.microsoft.com/canitpro/2015/09/08/step-by-step-enabling-hyper-v-for-use-on-windows-10/
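
As a rough illustration, below is a hypothetical pre-flight script (Windows only) that a workshop facilitator could hand out to check both prerequisites by parsing the output of systeminfo; the exact report lines can vary by Windows edition.

```python
# Hypothetical Docker prerequisite check for Windows attendees.
import platform
import subprocess

def check_prerequisites() -> None:
    # Prerequisite E: a 64-bit Windows installation.
    is_64bit = platform.machine().endswith("64")
    print(f"64-bit OS: {is_64bit}")

    # Prerequisite C: hardware virtualization enabled in the BIOS/UEFI firmware.
    # systeminfo lists this under "Hyper-V Requirements"; note that if Hyper-V is
    # already running, it prints "A hypervisor has been detected" instead.
    output = subprocess.run(["systeminfo"], capture_output=True, text=True).stdout
    virt_enabled = "Virtualization Enabled In Firmware: Yes" in output
    print(f"Virtualization enabled in firmware: {virt_enabled}")

if __name__ == "__main__":
    check_prerequisites()
```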
Question 3
Your team is building a data engineering and data science development environment. 
The environment must support the following requirements:
  • support Python and Scala 
  • compose data storage, movement, and processing services into automated data pipelines 
  • the same tool should be used for the orchestration of both data engineering and data science 
  • support workload isolation and interactive workloads 
  • enable scaling across a cluster of machines 
You need to create the environment. 
What should you do?
  A. Build the environment in Apache Hive for HDInsight and use Azure Data Factory for orchestration.
  B. Build the environment in Azure Databricks and use Azure Data Factory for orchestration.
  C. Build the environment in Apache Spark for HDInsight and use Azure Container Instances for orchestration.
  D. Build the environment in Azure Databricks and use Azure Container Instances for orchestration.
Correct answer: B
Explanation:
In Azure Databricks, you can create two different types of clusters: 
  • Standard: the default cluster type, which can be used with Python, R, Scala, and SQL 
  • High-concurrency 
Azure Databricks is fully integrated with Azure Data Factory. 
Incorrect Answers:
D: Azure Container Instances is suited to development and testing, not to production workloads.
References:
https://docs.microsoft.com/en-us/azure/architecture/data-guide/technology-choices/data-science-and-machine-learning
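
To make the orchestration concrete, here is a sketch of a parameterized Databricks notebook cell that an Azure Data Factory Databricks Notebook activity could invoke. The widget name and storage paths are hypothetical; `spark` and `dbutils` are injected by the Databricks runtime, so this runs as a notebook cell rather than as a standalone script.

```python
# Sketch of a Databricks notebook cell driven by an ADF Databricks Notebook activity.
dbutils.widgets.text("input_path", "/mnt/raw/events")   # default for interactive runs
input_path = dbutils.widgets.get("input_path")          # overridden by ADF at run time

df = spark.read.parquet(input_path)                     # distributed read across the cluster
daily = df.groupBy("event_date").count()                # example transformation
daily.write.mode("overwrite").parquet("/mnt/curated/daily_counts")
```

The same cluster can run Scala cells alongside Python ones, which is what lets a single Databricks plus Data Factory setup orchestrate both data engineering and data science workloads.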
EXAM SIMULATOR

How to Open TQB Files?

Use Taurus Exam Simulator to open TQB files

Taurus Exam Simulator for Windows/macOS/Linux

Download

Taurus Exam Studio
Enjoy a 20% discount on Taurus Exam Studio!