• DP-200: Implementing an Azure Data Solution [DP-200T01-A] including exam voucher

  • Group Training

A perfect course to become an Azure Data Engineer! Learn about data storage services, data transformations, data retention policies and much more!

    Training code
    CGADP200CE
    Spoken Language
    English
    Language Materials
    English
    Dayparts
    6
    Price
    €400,00
excl. VAT. No extra costs.

This course will mostly take place in a group setting. We use several learning methods to help you obtain the knowledge, give you helpful insights and get you inspired. See the Spoken Language field above for language information.

    What is DP-200: Implementing an Azure Data Solution [DP-200T01-A] including exam voucher

In this course, students will implement various data platform technologies into solutions that are in line with business and technical requirements, including on-premises, cloud, and hybrid data scenarios incorporating both relational and NoSQL data. They will also learn how to process data using a range of technologies and languages for both streaming and batch data.
Students will also explore how to implement data security, including authentication, authorization, data policies, and standards. They will define and implement data solution monitoring for both data storage and data processing activities. Finally, they will manage and troubleshoot Azure data solutions, which includes the optimization and disaster recovery of big data, batch processing, and streaming data solutions.
    This course uses MOC (Microsoft Official Courseware) and will be given by an experienced MCT (Microsoft Certified Trainer).
See the modules below for more information:
    Module 1: Azure for the Data Engineer
This module explores how the world of data has evolved and how cloud data platform technologies provide new opportunities for businesses to explore their data in different ways. Students will gain an overview of the available data platform technologies and learn how the Data Engineer's role and responsibilities have evolved to work in this new world to an organization's benefit.
    Lessons

    • Explain the evolving world of data
    • Survey the services in the Azure Data Platform
    • Identify the tasks that are performed by a Data Engineer
    • Describe the use cases for the cloud in a Case Study

    Module 2: Working with Data Storage
This module teaches the variety of ways to store data in Azure. Students will learn the basics of storage management in Azure, how to create a Storage Account, and how to choose the right model for the data they want to store in the cloud. They will also understand how Data Lake Storage can be created to support a wide variety of big data analytics solutions with minimal effort. A short code sketch follows the lesson list below.
    Lessons

    • Choose a data storage approach in Azure
    • Create an Azure Storage Account
    • Explain Azure Data Lake storage
    • Upload data into Azure Data Lake
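
As a taste of the storage tasks this module covers, below is a minimal sketch of uploading a file to Azure Blob storage with the Python azure-storage-blob SDK. The environment variable, container, and file names are placeholders; the course labs may use the portal or other tools instead.

```python
# Minimal sketch: upload a local file to Azure Blob storage.
# The connection-string environment variable, container name, and
# blob path are placeholders, not values from the course labs.
import os

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(
    os.environ["AZURE_STORAGE_CONNECTION_STRING"]  # hypothetical variable name
)
blob = service.get_blob_client(container="raw-data", blob="sales/2019/sales.csv")

with open("sales.csv", "rb") as data:
    blob.upload_blob(data, overwrite=True)  # overwrite any existing blob
```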

    Module 3: Enabling Team Based Data Science with Azure Databricks
This module introduces students to Azure Databricks and how a Data Engineer works with it to enable an organization to perform Team Data Science projects. They will learn the fundamentals of Azure Databricks and Apache Spark notebooks; how to provision the service and workspaces; and how to perform data preparation tasks that contribute to a data science project. An illustrative notebook snippet follows the lesson list below.
    Lessons

    • Explain Azure Databricks
    • Work with Azure Databricks
    • Read data with Azure Databricks
    • Perform transformations with Azure Databricks
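
For illustration, here is a minimal PySpark data preparation step as it might appear in an Azure Databricks notebook. The mount path and column names are hypothetical; `spark` is the session object Databricks predefines in every notebook.

```python
# Minimal sketch of a Databricks notebook cell: read raw CSV data,
# filter and derive a column, and write curated Parquet output.
# Paths and column names are hypothetical.
df = (
    spark.read
    .option("header", "true")
    .option("inferSchema", "true")
    .csv("/mnt/raw-data/sales.csv")
)

# Keep completed orders only and compute a derived revenue column.
clean = (
    df.filter(df.status == "complete")
      .withColumn("revenue", df.quantity * df.unit_price)
)

clean.write.mode("overwrite").parquet("/mnt/curated/sales")
```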

    Module 4: Building Globally Distributed Databases with Cosmos DB
In this module, students will learn how to work with NoSQL data using Azure Cosmos DB. They will learn how to provision the service, and how to load and interrogate data in the service using Visual Studio Code extensions and the Azure Cosmos DB .NET Core SDK. They will also learn how to configure the availability options so that users are able to access the data from anywhere in the world. A rough Python equivalent of this flow follows the lesson list below.
    Lessons

    • Create an Azure Cosmos DB database built to scale
    • Insert and query data in your Azure Cosmos DB database
    • Build a .NET Core app for Cosmos DB in Visual Studio Code
    • Distribute data globally with Azure Cosmos DB
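
The course labs use the .NET Core SDK; as a rough equivalent, here is what the provision, insert, and query flow looks like with the Python azure-cosmos SDK. The account endpoint, key, database, container, and item shape are all placeholders.

```python
# Rough sketch of the Cosmos DB flow using the Python SDK.
# Endpoint, key, and names in angle brackets are placeholders.
from azure.cosmos import CosmosClient, PartitionKey

client = CosmosClient("https://<account>.documents.azure.com:443/", credential="<key>")
db = client.create_database_if_not_exists("retail")
container = db.create_container_if_not_exists(
    id="orders", partition_key=PartitionKey(path="/customerId")
)

# Insert (or update) a single document.
container.upsert_item({"id": "1", "customerId": "c42", "total": 99.50})

# Query documents with the SQL-like Cosmos DB query language.
for item in container.query_items(
    query="SELECT * FROM c WHERE c.customerId = 'c42'",
    enable_cross_partition_query=True,
):
    print(item["total"])
```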

    Module 5: Working with Relational Data Stores in the Cloud
In this module, students will explore the Azure relational data platform options, including SQL Database and SQL Data Warehouse. The students will be able to explain why they would choose one service over another, and how to provision, connect, and manage each of the services. A brief connection example follows the lesson list below.
    Lessons

    • Use Azure SQL Database
    • Describe Azure SQL Data Warehouse
    • Creating and Querying an Azure SQL Data Warehouse
    • Use PolyBase to Load Data into Azure SQL Data Warehouse
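
As an illustration of connecting to the warehouse from code, here is a minimal pyodbc sketch. Server, database, credentials, and the table name are placeholders; PolyBase loads themselves are written in T-SQL (CREATE EXTERNAL TABLE followed by a CTAS or INSERT...SELECT), which you would submit over the same connection.

```python
# Minimal sketch: connect to Azure SQL Data Warehouse (now Azure
# Synapse Analytics) and run a query. Connection values are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=<server>.database.windows.net;"
    "DATABASE=<warehouse>;UID=<user>;PWD=<password>"
)
cursor = conn.cursor()

# An illustrative aggregate query; a PolyBase load would instead
# submit CREATE EXTERNAL TABLE / CTAS statements here.
cursor.execute("SELECT COUNT(*) FROM dbo.FactSales")
print(cursor.fetchone()[0])
conn.close()
```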

    Module 6: Performing Real-Time Analytics with Stream Analytics
In this module, students will learn the concepts of event processing and streaming data and how they apply to Event Hubs and Azure Stream Analytics. The students will then set up a Stream Analytics job to stream data and learn how to query the incoming data to perform analysis. Finally, they will learn how to manage and monitor running jobs. An ingestion example follows the lesson list below.
    Lessons

    • Explain data streams and event processing
    • Data Ingestion with Event Hubs
    • Processing Data with Stream Analytics Jobs
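
For a flavor of the ingestion side, the sketch below sends a JSON event to an Event Hub with the Python azure-eventhub SDK, followed by the kind of SQL-like query a Stream Analytics job might run over that stream. The connection string, hub name, and field names are placeholders.

```python
# Minimal sketch: publish one telemetry event to an Event Hub.
# Connection string and hub name are placeholders.
import json

from azure.eventhub import EventData, EventHubProducerClient

producer = EventHubProducerClient.from_connection_string(
    "<event-hub-namespace-connection-string>", eventhub_name="telemetry"
)
batch = producer.create_batch()
batch.add(EventData(json.dumps({"deviceId": "sensor-1", "temperature": 21.5})))
producer.send_batch(batch)
producer.close()

# A Stream Analytics job could then aggregate the stream with a
# query such as (illustrative only):
#   SELECT deviceId, AVG(temperature) AS avgTemp
#   INTO output
#   FROM telemetry TIMESTAMP BY EventEnqueuedUtcTime
#   GROUP BY deviceId, TumblingWindow(second, 30)
```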

    Module 7: Orchestrating Data Movement with Azure Data Factory
In this module, students will learn how Azure Data Factory can be used to orchestrate data movement and transformation across a wide range of data platform technologies. They will be able to explain the capabilities of the technology and set up an end-to-end data pipeline that ingests and transforms data. A sample pipeline definition follows the lesson list below.
    Lessons

    • Explain how Azure Data Factory works
    • Azure Data Factory Components
    • Azure Data Factory and Databricks
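
As an illustration, here is the general shape of the JSON definition behind a simple Data Factory copy pipeline, expressed as a Python dict. The pipeline, activity, and dataset names are hypothetical; in practice such definitions are authored in the portal or deployed via ARM templates or the management SDKs.

```python
# Sketch of a Data Factory v2 copy-pipeline definition. All names
# are hypothetical; the structure mirrors the JSON ADF stores.
import json

pipeline = {
    "name": "CopySalesToLake",
    "properties": {
        "activities": [
            {
                "name": "CopyFromBlobToDataLake",
                "type": "Copy",
                "inputs": [{"referenceName": "SalesBlobDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SalesLakeDataset", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "BlobSource"},
                    "sink": {"type": "AzureDataLakeStoreSink"},
                },
            }
        ]
    },
}

print(json.dumps(pipeline, indent=2))
```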

    Module 8: Securing Azure Data Platforms
In this module, students will learn how Azure provides a multi-layered security model to protect data. The students will explore how security can range from setting up secure networks and access keys, to defining permissions, to monitoring across a range of data stores. A small example follows the lesson list below.
    Lessons

    • An introduction to security
    • Key security components
    • Securing Storage Accounts and Data Lake Storage
    • Securing Data Stores
    • Securing Streaming Data
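
One concrete layer of that model is the shared access signature. The sketch below issues a short-lived, read-only SAS for a single blob with the Python azure-storage-blob SDK, rather than handing out the account key; account, container, blob, and key values are placeholders.

```python
# Minimal sketch: create a read-only SAS token valid for one hour.
# Account name, key, container, and blob path are placeholders.
from datetime import datetime, timedelta

from azure.storage.blob import BlobSasPermissions, generate_blob_sas

sas = generate_blob_sas(
    account_name="<account>",
    container_name="raw-data",
    blob_name="sales/2019/sales.csv",
    account_key="<account-key>",
    permission=BlobSasPermissions(read=True),  # read-only access
    expiry=datetime.utcnow() + timedelta(hours=1),
)

# Append the token to the blob URL to grant time-limited read access.
url = f"https://<account>.blob.core.windows.net/raw-data/sales/2019/sales.csv?{sas}"
print(url)
```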

    Module 9: Monitoring and Troubleshooting Data Storage and Processing
In this module, students will get an overview of the range of monitoring capabilities available to provide operational support should there be an issue with a data platform architecture. They will explore common data storage and data processing issues. Finally, disaster recovery options are presented to ensure business continuity. A short monitoring example follows the lesson list below.
    Lessons

    • Explain the monitoring capabilities that are available
    • Troubleshoot common data storage issues
    • Troubleshoot common data processing issues
    • Manage disaster recovery
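
As an example of the kind of monitoring this module discusses, the sketch below runs a Kusto (KQL) query against a Log Analytics workspace with the azure-monitor-query SDK to count failed blob requests. The workspace ID is a placeholder, and querying the StorageBlobLogs table assumes resource logs are being collected there.

```python
# Minimal sketch: query a Log Analytics workspace for failed blob
# requests over the last 24 hours. The workspace ID is a placeholder.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())
response = client.query_workspace(
    workspace_id="<workspace-id>",
    query="StorageBlobLogs | where StatusCode >= 400 | summarize count() by StatusText",
    timespan=timedelta(hours=24),
)
for table in response.tables:
    for row in table.rows:
        print(row)
```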
     

    Who should attend the DP-200: Implementing an Azure Data Solution [DP-200T01-A] including exam voucher

The primary audience for this course is Data Professionals, Data Architects, and Business Intelligence Professionals who want to learn about the data platform technologies that exist on Microsoft Azure. The secondary audience is individuals who develop applications that deliver content from those data platform technologies.
You will also receive a voucher to take the exam. Enroll today!

    Prerequisites

    Candidates for this exam must be able to implement data solutions that use the following Azure services: Azure Cosmos DB, Azure SQL Database, Azure Synapse Analytics (formerly Azure SQL DW), Azure Data Lake Storage, Azure Data Factory, Azure Stream Analytics, Azure Databricks, and Azure Blob storage.
    In addition to their professional experience, students who take this training should have technical knowledge equivalent to the following courses:
    Azure Fundamentals
    You can gain a better understanding of Azure by taking this free online training, which will give you the experience you need to be successful in this course.

    Objectives

    After completing this course, you will be able to:

• Explain the evolving world of data and survey the services in the Azure Data Platform
    • Identify the tasks that are performed by a Data Engineer
    • Describe the use cases for the cloud in a Case Study
    • Choose a data storage approach in Azure and create an Azure Storage Account
    • Understand the function of Azure Data Lake Storage
    • Explain and perform transformations with Azure Databricks
    • Create an Azure Cosmos DB database built to scale and query data
    • Build a .NET Core app for Azure Cosmos DB in Visual Studio Code
    • Distribute data globally with Azure Cosmos DB and use Azure SQL Database
    • Describe, create and query Azure SQL Data Warehouse
    • Use PolyBase to Load Data into Azure SQL Data Warehouse
    • Understand Data Ingestion with Event Hubs, data streams and event processing
    • Understand Processing Data with Stream Analytics Jobs
    • Understand Azure Data Factory Components and Databricks
    • Understand key security components, securing Storage Accounts and Data Lake Storage
    • Understand securing Data Stores, securing Streaming Data and monitoring capabilities
    • Troubleshoot common data storage issues and data processing issues
    • Manage disaster recovery
     
    Incompany

A perfect course to become an Azure Data Engineer! Learn about data storage services, data transformations, data retention policies and much more!

    Training code
    CGADP200CE
    Spoken Language
    English
    Language Materials
    English
    Dayparts
    6
    Price
    €400,00
excl. VAT. No extra costs.

    With an Incompany training you have several advantages:

    - You choose the location
    - You experience the training with your colleagues, so it is always in line with your practice
    - The trainer can tailor explanations, examples and assignments to your organization
    - In consultation, exercises can be adapted to organization-specific questions

    Request more information or a quote.

     

  • Related

    Fields of Expertise
    Cloud
     
  • e-CF competences with this course

     

At Capgemini Academy we believe in transparency and clarity in the training landscape. That is why, in the table below, we show you to which e-CF competences this training or certification contributes.

e-Competence                  Level 1  2  3  4  5
A.5. Architecture Design
B.6. Systems Engineering