MCA Microsoft Azure Data Scientist w/ MCA Microsoft Azure Data Engineer COMBO Training & Certification Bootcamp – 6 days

Exams Included & Administered During Camp:
DP-100: Designing and Implementing a Data Science Solution on Azure
DP-200: Implementing an Azure Data Solution
DP-201: Designing an Azure Data Solution
What's Included
Airfare To/From Sarasota or Tampa
3 Microsoft Test Vouchers
3 Microsoft Official Courses
6 Nights of Lodging
1 Retake Voucher (per exam, if needed)
Microsoft Study Labs & Simulations
Ground Transportation
Onsite Pearson Vue Test Center
Instructor Led Classroom Training
This camp combines the MCA Azure Data Scientist and MCA Azure Data Engineer Associate certifications into a unified 6-day boot camp. Azure Data Scientists apply Azure's machine learning techniques to train, evaluate, and deploy models that solve business problems. Azure Data Engineers design and implement the management, monitoring, security, and privacy of data using the full stack of Azure data services to satisfy business needs.
The Microsoft Certified Azure Data Scientist & Data Engineer Associate boot camp is taught using Microsoft Official Courseware:
DP-100T01: Designing and Implementing a Data Science Solution on Azure
DP-200T01: Implementing an Azure Data Solution
DP-201T01: Designing an Azure Data Solution
While attending this 6-day camp, students will take three exams (DP-100 / DP-200 / DP-201) to achieve the MCA Azure Data Scientist and MCA Azure Data Engineer Associate certifications. This hands-on, instructor-led live camp teaches the knowledge needed for the Data Scientist and Data Engineer job roles in addition to the knowledge needed for the certification exams (administered while attending).
Skills Gained:
Implement data storage solutions
Manage and develop data processing
Monitor and optimize data solutions
Design Azure data storage solutions
Design data processing solutions
Design for data security and compliance
Data Science on Azure
Data Science with Azure Machine Learning service
Automate Machine Learning with AML service
Manage and Monitor Machine Learning Models with the AML service
Topics Covered in this Official Boot Camp
Getting Started with Azure Machine Learning
In this module, you will learn how to provision an Azure Machine Learning workspace and use it to manage machine learning assets such as data, compute, model training code, logged metrics, and trained models. You will learn how to use the web-based Azure Machine Learning studio interface as well as the Azure Machine Learning SDK and developer tools like Visual Studio Code and Jupyter Notebooks to work with the assets in your workspace.
Lessons
- Introduction to Azure Machine Learning
- Working with Azure Machine Learning
Lab : Create an Azure Machine Learning Workspace
After completing this module, you will be able to
- Provision an Azure Machine Learning workspace
- Use tools and code to work with Azure Machine Learning
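To give a concrete flavor of this module, here is a minimal sketch of provisioning and connecting to a workspace with the Azure Machine Learning Python SDK (azureml-core); the workspace, resource group, and region names are hypothetical placeholders:

    from azureml.core import Workspace

    # Provision a new workspace (all names and the region are placeholders)
    ws = Workspace.create(name='mlw-demo',
                          subscription_id='<subscription-id>',
                          resource_group='rg-demo',
                          create_resource_group=True,
                          location='eastus')

    # Or connect to an existing workspace using a downloaded config.json
    ws = Workspace.from_config()
    print(ws.name, ws.location)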
No-Code Machine Learning
This module introduces the Automated Machine Learning and Designer visual tools, which you can use to train, evaluate, and deploy machine learning models without writing any code.
Lessons
- Automated Machine Learning
- Azure Machine Learning Designer
Lab : Use Automated Machine Learning
Lab : Use Azure Machine Learning Designer
After completing this module, you will be able to
- Use automated machine learning to train a machine learning model
- Use Azure Machine Learning designer to train a model
Running Experiments and Training Models
In this module, you will get started with experiments that encapsulate data processing and model training code, and use them to train machine learning models.
Lessons
- Introduction to Experiments
- Training and Registering Models
Lab : Run Experiments
Lab : Train Models
After completing this module, you will be able to
- Run code-based experiments in an Azure Machine Learning workspace
- Train and register machine learning models
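As a sketch of the experiment pattern this module teaches (the experiment name, metric value, and model file below are hypothetical), a code-based run with the SDK looks roughly like this:

    from azureml.core import Workspace, Experiment

    ws = Workspace.from_config()
    experiment = Experiment(workspace=ws, name='demo-experiment')

    # Start an interactive run, log a metric, and upload an output file
    run = experiment.start_logging()
    run.log('accuracy', 0.85)   # logged metrics appear in Azure Machine Learning studio
    run.upload_file('outputs/model.pkl', 'model.pkl')   # local file name is hypothetical
    run.complete()

    # Register the uploaded model so it can be versioned and deployed later
    run.register_model(model_name='demo-model', model_path='outputs/model.pkl')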
Working with Data
Data is a fundamental element in any machine learning workload, so in this module, you will learn how to create and manage datastores and datasets in an Azure Machine Learning workspace, and how to use them in model training experiments.
Lessons
- Working with Datastores
- Working with Datasets
Lab : Work with Data
After completing this module, you will be able to
- Create and use datastores
- Create and use datasets
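A minimal sketch of the datastore/dataset workflow covered here, assuming azureml-core and a hypothetical local CSV file:

    from azureml.core import Workspace, Dataset

    ws = Workspace.from_config()

    # Upload a local file to the workspace's default (blob) datastore
    datastore = ws.get_default_datastore()
    datastore.upload_files(files=['./data/diabetes.csv'], target_path='data/', overwrite=True)

    # Wrap the uploaded file in a tabular dataset and register it for reuse
    dataset = Dataset.Tabular.from_delimited_files(path=(datastore, 'data/diabetes.csv'))
    dataset = dataset.register(workspace=ws, name='diabetes-dataset')
    df = dataset.to_pandas_dataframe()   # read it back for exploration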
Working with Compute
One of the key benefits of the cloud is the ability to leverage compute resources on demand, and use them to scale machine learning processes to an extent that would be infeasible on your own hardware. In this module, you'll learn how to manage environments that ensure runtime consistency for experiments, and how to create and use compute targets for experiment runs.
Lessons
- Working with Environments
- Working with Compute Targets
Lab : Work with Compute
After completing this module, you will be able to
- Create and use environments
- Create and use compute targets
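The sketch below shows the two halves of this module with the SDK; the environment file, cluster name, and VM size are hypothetical choices:

    from azureml.core import Workspace, Environment
    from azureml.core.compute import AmlCompute, ComputeTarget

    ws = Workspace.from_config()

    # A reusable environment defined from a conda specification file
    env = Environment.from_conda_specification(name='training-env', file_path='environment.yml')
    env.register(workspace=ws)

    # An autoscaling training cluster to use as a compute target
    config = AmlCompute.provisioning_configuration(vm_size='STANDARD_DS2_V2',
                                                   min_nodes=0, max_nodes=2)
    cluster = ComputeTarget.create(ws, 'aml-cluster', config)
    cluster.wait_for_completion(show_output=True)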
Orchestrating Operations with Pipelines
Now that you understand the basics of running workloads as experiments that leverage data assets and compute resources, it’s time to learn how to orchestrate these workloads as pipelines of connected steps. Pipelines are key to implementing an effective Machine Learning Operationalization (ML Ops) solution in Azure, so you’ll explore how to define and run them in this module.
Lessons
- Introduction to Pipelines
- Publishing and Running Pipelines
Lab : Create a Pipeline
After completing this module, you will be able to
- Create pipelines to automate machine learning workflows
- Publish and run pipelines as services
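A minimal two-step pipeline sketch with the SDK; the script names, source directory, and compute target are hypothetical:

    from azureml.core import Workspace, Experiment
    from azureml.pipeline.core import Pipeline
    from azureml.pipeline.steps import PythonScriptStep

    ws = Workspace.from_config()

    # Two steps; data dependencies between them are omitted for brevity
    prep_step = PythonScriptStep(name='prepare data', script_name='prep.py',
                                 source_directory='scripts', compute_target='aml-cluster')
    train_step = PythonScriptStep(name='train model', script_name='train.py',
                                  source_directory='scripts', compute_target='aml-cluster')

    pipeline = Pipeline(workspace=ws, steps=[prep_step, train_step])
    run = Experiment(ws, 'demo-pipeline').submit(pipeline)

    # Publishing exposes the pipeline as a REST endpoint for scheduled or on-demand runs
    published = pipeline.publish(name='training-pipeline', description='demo', version='1.0')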
Deploying and Consuming Models
Models are designed to help decision making through predictions, so they're only useful when deployed and available for an application to consume. In this module, you will learn how to deploy models for both real-time inferencing and batch inferencing.
Lessons
- Real-time Inferencing
- Batch Inferencing
- Continuous Integration and Delivery
Lab : Create a Real-time Inferencing Service
Lab : Create a Batch Inferencing Service
After completing this module, you will be able to
- Publish a model as a real-time inference service
- Publish a model as a batch inference service
- Describe techniques to implement continuous integration and delivery
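A sketch of real-time deployment to Azure Container Instances; the model, environment, and entry script names are hypothetical (score.py would define init() and run() functions):

    from azureml.core import Workspace, Environment, Model
    from azureml.core.model import InferenceConfig
    from azureml.core.webservice import AciWebservice

    ws = Workspace.from_config()
    model = ws.models['demo-model']                 # a previously registered model
    env = Environment.get(ws, name='training-env')

    inference_config = InferenceConfig(entry_script='score.py', environment=env)
    deploy_config = AciWebservice.deploy_configuration(cpu_cores=1, memory_gb=1)

    service = Model.deploy(ws, 'demo-service', [model], inference_config, deploy_config)
    service.wait_for_deployment(show_output=True)
    print(service.scoring_uri)   # applications POST JSON to this endpoint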
Training Optimal Models
By this stage of the course, you’ve learned the end-to-end process for training, deploying, and consuming machine learning models; but how do you ensure your model produces the best predictive outputs for your data? In this module, you’ll explore how you can use hyperparameter tuning and automated machine learning to take advantage of cloud-scale compute and find the best model for your data.
Lessons
- Hyperparameter Tuning
- Automated Machine Learning
Lab : Tune Hyperparameters
Lab : Use Automated Machine Learning from the SDK
After completing this module, you will be able to
- Optimize hyperparameters for model training
- Use automated machine learning to find the optimal model for your data
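For the hyperparameter tuning lesson, a minimal Hyperdrive sketch; the training script, search values, and names are hypothetical, and the script is assumed to log an 'accuracy' metric:

    from azureml.core import Workspace, Experiment, ScriptRunConfig
    from azureml.train.hyperdrive import (HyperDriveConfig, GridParameterSampling,
                                          PrimaryMetricGoal, choice)

    ws = Workspace.from_config()
    script_config = ScriptRunConfig(source_directory='scripts', script='train.py',
                                    compute_target='aml-cluster')

    # Try each learning rate and keep the run with the best accuracy
    sampling = GridParameterSampling({'--learning_rate': choice(0.01, 0.1, 1.0)})
    hyperdrive = HyperDriveConfig(run_config=script_config,
                                  hyperparameter_sampling=sampling,
                                  primary_metric_name='accuracy',
                                  primary_metric_goal=PrimaryMetricGoal.MAXIMIZE,
                                  max_total_runs=6)
    run = Experiment(ws, 'demo-hyperdrive').submit(hyperdrive)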
Responsible Machine Learning
Data scientists have a duty to ensure they analyze data and train machine learning models responsibly: respecting individual privacy, mitigating bias, and ensuring transparency. This module explores some considerations and techniques for applying responsible machine learning principles.
Lessons
- Differential Privacy
- Model Interpretability
- Fairness
Lab : Explore Differential Privacy
Lab : Interpret Models
Lab : Detect and Mitigate Unfairness
After completing this module, you will be able to
- Apply differential privacy to data analysis
- Use explainers to interpret machine learning models
- Evaluate models for fairness
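For the fairness lesson, one widely used open-source tool is Fairlearn; this sketch (with made-up sample data and a hypothetical 'age_group' sensitive feature) shows how a disaggregated metric is computed:

    import numpy as np
    from fairlearn.metrics import MetricFrame
    from sklearn.metrics import accuracy_score

    y_true = np.array([1, 0, 1, 1, 0, 1])
    y_pred = np.array([1, 0, 0, 1, 0, 1])
    age_group = np.array(['under 40', 'under 40', 'over 40',
                          'over 40', 'under 40', 'over 40'])

    frame = MetricFrame(metrics=accuracy_score, y_true=y_true, y_pred=y_pred,
                        sensitive_features=age_group)
    print(frame.overall)    # accuracy across everyone
    print(frame.by_group)   # accuracy broken out per age group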
Monitoring Models
After a model has been deployed, it’s important to understand how the model is being used in production, and to detect any degradation in its effectiveness due to data drift. This module describes techniques for monitoring models and their data.
Lessons
- Monitoring Models with Application Insights
- Monitoring Data Drift
Lab : Monitor a Model with Application Insights
Lab : Monitor Data Drift
After completing this module, you will be able to
- Use Application Insights to monitor a published model
- Monitor data drift
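A sketch of setting up a drift monitor with the azureml-datadrift package; the dataset names, compute target, and feature list are hypothetical:

    from azureml.core import Workspace, Dataset
    from azureml.datadrift import DataDriftDetector

    ws = Workspace.from_config()
    baseline = Dataset.get_by_name(ws, 'diabetes-baseline')
    target = Dataset.get_by_name(ws, 'diabetes-target')

    # Compare the target dataset against the baseline daily, watching two features
    monitor = DataDriftDetector.create_from_datasets(ws, 'diabetes-drift-monitor',
                                                     baseline, target,
                                                     compute_target='aml-cluster',
                                                     frequency='Day',
                                                     feature_list=['Age', 'BMI'])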
Azure for the Data Engineer
This module explores how the world of data has evolved and how cloud data platform technologies provide new opportunities for businesses to explore their data in different ways. Students will gain an overview of the various data platform technologies that are available and learn how a Data Engineer's role and responsibilities have evolved to work in this new world to an organization's benefit.
Lessons
- Explain the evolving world of data
- Survey the services in the Azure Data Platform
- Identify the tasks that are performed by a Data Engineer
- Describe the use cases for the cloud in a Case Study
Lab : Azure for the Data Engineer
- Identify the evolving world of data
- Determine the Azure Data Platform Services
- Identify tasks to be performed by a Data Engineer
- Finalize the data engineering deliverables
After completing this module, students will be able to:
- Explain the evolving world of data
- Survey the services in the Azure Data Platform
- Identify the tasks that are performed by a Data Engineer
- Describe the use cases for the cloud in a Case Study
Working with Data Storage
This module teaches the variety of ways to store data in Azure. Students will learn the basics of storage management in Azure, how to create a Storage Account, and how to choose the right storage model for the data they want to store in the cloud. They will also understand how Data Lake storage can be created to support a wide variety of big data analytics solutions with minimal effort.
Lessons
- Choose a data storage approach in Azure
- Create an Azure Storage Account
- Explain Azure Data Lake storage
- Upload data into Azure Data Lake
Lab : Working with Data Storage
- Choose a data storage approach in Azure
- Create a Storage Account
- Explain Data Lake Storage
- Upload data into Data Lake Store
After completing this module, students will be able to:
- Choose a data storage approach in Azure
- Create an Azure Storage Account
- Explain Azure Data Lake Storage
- Upload data into Azure Data Lake
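As a sketch of the Data Lake upload workflow using the azure-storage-file-datalake Python package (the account URL, key, file system, and paths are placeholders):

    from azure.storage.filedatalake import DataLakeServiceClient

    # Data Lake Storage Gen2 uses the dfs endpoint
    service = DataLakeServiceClient(account_url='https://<account>.dfs.core.windows.net',
                                    credential='<account-key>')

    file_system = service.create_file_system(file_system='raw')
    file_client = file_system.create_file('sales/2020/sales.csv')

    with open('sales.csv', 'rb') as f:
        data = f.read()
    file_client.append_data(data, offset=0, length=len(data))
    file_client.flush_data(len(data))   # commit the appended bytes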
Enabling Team Based Data Science with Azure Databricks
This module introduces students to Azure Databricks and how a Data Engineer works with it to enable an organization to perform Team Data Science projects. They will learn the fundamentals of Azure Databricks and Apache Spark notebooks; how to provision the service and workspaces; and how to perform data preparation tasks that contribute to the data science project.
Lessons
- Explain Azure Databricks
- Work with Azure Databricks
- Read data with Azure Databricks
- Perform transformations with Azure Databricks
Lab : Enabling Team Based Data Science with Azure Databricks
- Explain Azure Databricks
- Work with Azure Databricks
- Read data with Azure Databricks
- Perform transformations with Azure Databricks
After completing this module, students will be able to:
- Explain Azure Databricks
- Work with Azure Databricks
- Read data with Azure Databricks
- Perform transformations with Azure Databricks
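A sketch of the read-and-transform pattern inside an Azure Databricks notebook, where a SparkSession named spark is provided automatically; the mount paths and column name are hypothetical:

    # Read raw CSV data mounted from a storage account
    df = spark.read.csv('/mnt/raw/sales.csv', header=True, inferSchema=True)

    # A simple preparation pass: drop duplicates and keep valid quantities
    clean = df.dropDuplicates().filter(df['Quantity'] > 0)
    clean.write.mode('overwrite').parquet('/mnt/processed/sales')

    display(clean.limit(10))   # display() is a Databricks notebook helper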
Building Globally Distributed Databases with Cosmos DB
In this module, students will learn how to work with NoSQL data using Azure Cosmos DB. They will learn how to provision the service, and how to load and interrogate data in the service using Visual Studio Code extensions and the Azure Cosmos DB .NET Core SDK. They will also learn how to configure the availability options so that users are able to access the data from anywhere in the world.
Lessons
- Create an Azure Cosmos DB database built to scale
- Insert and query data in your Azure Cosmos DB database
- Build a .NET Core app for Cosmos DB in Visual Studio Code
- Distribute data globally with Azure Cosmos DB
Lab : Building Globally Distributed Databases with Cosmos DB
- Create an Azure Cosmos DB
- Insert and query data in Azure Cosmos DB
- Build a .Net Core App for Azure Cosmos DB using VS Code
- Distribute data globally with Azure Cosmos DB
After completing this module, students will be able to:
- Create an Azure Cosmos DB database built to scale
- Insert and query data in your Azure Cosmos DB database
- Build a .NET Core app for Azure Cosmos DB in Visual Studio Code
- Distribute data globally with Azure Cosmos DB
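The course labs use Visual Studio Code and the .NET Core SDK; for consistency with the other sketches here, the equivalent flow with the azure-cosmos Python package looks roughly like this (account URL, key, and all names are placeholders):

    from azure.cosmos import CosmosClient, PartitionKey

    client = CosmosClient('https://<account>.documents.azure.com:443/', credential='<key>')
    database = client.create_database_if_not_exists('retail')
    container = database.create_container_if_not_exists(
        id='orders', partition_key=PartitionKey(path='/customerId'), offer_throughput=400)

    # Insert a document, then query it back with Cosmos DB's SQL dialect
    container.create_item({'id': '1', 'customerId': 'c42', 'total': 19.99})
    for item in container.query_items("SELECT * FROM c WHERE c.customerId = 'c42'",
                                      enable_cross_partition_query=True):
        print(item['total'])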
Working with Relational Data Stores in the Cloud
In this module, students will explore the Azure relational data platform options, including SQL Database and SQL Data Warehouse. Students will be able to explain why they would choose one service over another, and how to provision, connect, and manage each of the services.
Lessons
- Use Azure SQL Database
- Describe Azure SQL Data Warehouse
- Create and Query an Azure SQL Data Warehouse
- Use PolyBase to Load Data into Azure SQL Data Warehouse
Lab : Working with Relational Data Stores in the Cloud
- Use Azure SQL Database
- Describe Azure SQL Data Warehouse
- Create and Query an Azure SQL Data Warehouse
- Use PolyBase to Load Data into Azure SQL Data Warehouse
After completing this module, students will be able to:
- Use Azure SQL Database
- Describe Azure SQL Data Warehouse
- Create and Query an Azure SQL Data Warehouse
- Use PolyBase to Load Data into Azure SQL Data Warehouse
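As a sketch of the PolyBase load pattern, run from Python with pyodbc; the connection details are placeholders, and an external table ext.Sales over files in Azure Storage is assumed to exist already:

    import pyodbc

    conn = pyodbc.connect('DRIVER={ODBC Driver 17 for SQL Server};'
                          'SERVER=<server>.database.windows.net;DATABASE=<warehouse>;'
                          'UID=<user>;PWD=<password>')
    cursor = conn.cursor()

    # CTAS pulls the external data into a distributed warehouse table in parallel
    cursor.execute("""
        CREATE TABLE dbo.Sales
        WITH (DISTRIBUTION = HASH(CustomerId))
        AS SELECT * FROM ext.Sales;
    """)
    conn.commit()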
Performing Real-Time Analytics with Stream Analytics
In this module, students will learn the concepts of event processing and streaming data and how they apply to Event Hubs and Azure Stream Analytics. Students will then set up a Stream Analytics job to stream data and learn how to query the incoming data to perform analysis. Finally, they will learn how to manage and monitor running jobs.
Lessons
- Explain data streams and event processing
- Data Ingestion with Event Hubs
- Processing Data with Stream Analytics Jobs
Lab : Performing Real-Time Analytics with Stream Analytics
- Explain data streams and event processing
- Data Ingestion with Event Hubs
- Processing Data with Stream Analytics Jobs
After completing this module, students will be able to:
- Explain data streams and event processing
- Ingest data with Event Hubs
- Process data with Stream Analytics jobs
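To illustrate the ingestion side, a minimal Event Hubs producer with the azure-eventhub Python package; the connection string, hub name, and payload are placeholders (a Stream Analytics job would then query this stream with its SQL-like language):

    from azure.eventhub import EventHubProducerClient, EventData

    producer = EventHubProducerClient.from_connection_string('<connection-string>',
                                                             eventhub_name='telemetry')
    batch = producer.create_batch()
    batch.add(EventData('{"deviceId": "sensor-1", "temperature": 21.5}'))
    producer.send_batch(batch)
    producer.close()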
Orchestrating Data Movement with Azure Data Factory
In this module, students will learn how Azure Data Factory can be used to orchestrate data movement and transformation across a wide range of data platform technologies. They will be able to explain the capabilities of the technology and set up an end-to-end data pipeline that ingests and transforms data.
Lessons
- Explain how Azure Data Factory works
- Azure Data Factory Components
- Azure Data Factory and Databricks
Lab : Orchestrating Data Movement with Azure Data Factory
- Explain how Data Factory Works
- Azure Data Factory Components
- Azure Data Factory and Databricks
After completing this module, students will be able to:
- Use Azure Data Factory with Azure Databricks
- Describe Azure Data Factory components
- Explain how Azure Data Factory works
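Pipelines are usually authored in the Data Factory UI, but runs can also be triggered and monitored programmatically; a sketch with the azure-mgmt-datafactory package (subscription, resource group, factory, and pipeline names are placeholders):

    from azure.identity import DefaultAzureCredential
    from azure.mgmt.datafactory import DataFactoryManagementClient

    client = DataFactoryManagementClient(DefaultAzureCredential(), '<subscription-id>')

    run = client.pipelines.create_run('rg-demo', 'adf-demo', 'CopySalesPipeline',
                                      parameters={})
    status = client.pipeline_runs.get('rg-demo', 'adf-demo', run.run_id)
    print(status.status)   # e.g. Queued, InProgress, Succeeded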
Securing Azure Data Platforms
In this module, students will learn how Azure provides a multi-layered security model to protect data. Students will explore how security can range from setting up secure networks and access keys, to defining permissions, to monitoring across a range of data stores.
Lessons
- An introduction to security
- Key security components
- Securing Storage Accounts and Data Lake Storage
- Securing Data Stores
- Securing Streaming Data
Lab : Securing Azure Data Platforms
- An introduction to security
- Key security components
- Securing Storage Accounts and Data Lake Storage
- Securing Data Stores
- Securing Streaming Data
After completing this module, students will be able to:
- Describe the fundamentals of Azure security
- Identify key security components
- Secure Storage Accounts and Data Lake Storage
- Secure Data Stores
- Secure Streaming Data
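As one small example of the access-key lesson, generating a time-limited shared access signature for a blob with the azure-storage-blob package (account name, key, and blob path are placeholders):

    from datetime import datetime, timedelta
    from azure.storage.blob import BlobSasPermissions, generate_blob_sas

    sas_token = generate_blob_sas(account_name='<account>',
                                  container_name='raw',
                                  blob_name='sales.csv',
                                  account_key='<account-key>',
                                  permission=BlobSasPermissions(read=True),
                                  expiry=datetime.utcnow() + timedelta(hours=1))

    # Append the token to the blob URL to grant read access for one hour
    url = 'https://<account>.blob.core.windows.net/raw/sales.csv?' + sas_token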
Monitoring and Troubleshooting Data Storage and Processing
In this module, students will get an overview of the range of monitoring capabilities available to provide operational support should there be an issue with a data platform architecture. They will explore common data storage and data processing issues. Finally, disaster recovery options are explored to ensure business continuity.
Lessons
- Explain the monitoring capabilities that are available
- Troubleshoot common data storage issues
- Troubleshoot common data processing issues
- Manage disaster recovery
Lab : Monitoring and Troubleshooting Data Storage and Processing
- Explain the monitoring capabilities that are available
- Troubleshoot common data storage issues
- Troubleshoot common data processing issues
- Manage disaster recovery
After completing this module, students will be able to:
- Explain the monitoring capabilities that are available
- Troubleshoot common data storage issues
- Troubleshoot common data processing issues
- Manage disaster recovery
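A sketch of pulling platform metrics for a data service with the azure-monitor-query package; the resource ID and metric name are placeholders, and the exact client method name may vary by package version:

    from azure.identity import DefaultAzureCredential
    from azure.monitor.query import MetricsQueryClient

    client = MetricsQueryClient(DefaultAzureCredential())
    response = client.query_resource('<storage-account-resource-id>',
                                     metric_names=['Availability'])

    for metric in response.metrics:
        for series in metric.timeseries:
            for point in series.data:
                print(point.timestamp, point.average)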
Learning Path For Microsoft Azure Data Scientist Associate

Learning Path For Microsoft Azure Data Engineer Associate
Certification Camps has developed a comprehensive training / delivery format which focuses on learning beyond the core content accessible to any Microsoft training provider. Our program incorporates interactive demonstrations with explanations which go beyond the content of the book. Additional content, videos, labs & demonstrations are provided to expand on advanced topics - providing additional insight and perspective. Certification Camps training is not the typical book & PowerPoint presentation found at any local training center.
As a Microsoft Certified Partner with Gold Learning Competency, we adhere to the strict guidelines, standards, and requirements to use Microsoft's exclusive curriculum. Moreover, our standards go beyond the "minimum requirements" set forth by Microsoft Learning.
We leverage our partnership benefits of courseware customization to build end-to-end technology training solutions. Students gain practical skills which can be implemented immediately.
At most training centers - learning starts on the first day of class and ends on the last day. Our boot camp training program is designed to offer resources before, during and after.
CERTIFICATION CAMPS FACILITIES
CAMPUS - Certification Camps built a state-of-the-art training center with spacious classrooms, no sound transference between rooms, new desks, Herman Miller Aeron chairs & great common areas.
CLASSROOM SERVERS - Students work on a dedicated Dell 8700 / 8900 with an Intel i7 (6th generation), 32GB of memory, and a 512GB SSD drive.
CAMPUS INTERNET - The campus is connected with a 300 Mbps Verizon Fios Business Connection which provides complete internet (including VPN) access for students.
COMMON AREA - Unlike any other training facility, ours offers a break room with a MAME arcade, a high-end kitchen with snacks and drinks (coffee, 100% juices, sodas, etc.), and a breathtaking terrace.
LODGING - We use the Marriott Fairfield Inn & Suites Lakewood Ranch. This upgraded hotel offers extremely comfortable beds, a great breakfast, and very fast (Verizon) internet access.
NEARBY AMENITIES - Many shops, restaurants, and grocery options are available within walking distance, and the hotel provides scheduled shuttle service. Restaurants include Cheesecake Factory, California Pizza Kitchen, Panera Bread, Bonefish Grill, Ruby Tuesday's, Five Guys, Chipotle, Quiznos, Chili's, and over 20 additional choices in the immediate area.