• Source Data Integration (EN)

  • Group Training

Data is recorded in various source systems. Collecting and copying this source data is called source access, or data access, and it is essential for companies to make the right decisions.

    Training code
    CGASDAINCE
    Spoken Language
    English
    Language Materials
    English
    Dayparts
    2
    Price
    €800,00
    excl. VAT No extra costs.

    Book Source Data Integration (EN) now

In group training, we use several learning methods to help you gain knowledge, offer helpful insights, and inspire you. Check the Spoken Language and Language Materials fields on the left for language information.

    • 14-4-2025
      Utrecht
      €800,00

    What is Source Data Integration

    In today’s data-rich environment, extracting insights from data is crucial for businesses to make informed decisions. Data related to business operations is recorded in various source systems. By collecting and linking data from these different source systems, we can derive valuable insights. The more data we collect, the more insights we can extract. This process of collecting and copying source data is known as Source Data Integration.
    In this training, you will learn about:

    • The process of determining, based on customer information needs, whether new data sources should be opened up or existing ones expanded.
    • Various data delivery techniques, including file delivery, web-based delivery through an API, database table exports, direct data extraction from a source table, and Change Data Capture (CDC).
    • The technique with which data can be processed and stored in, for example, a data warehouse. You will be introduced to Extract, Transform and Load (ETL).
    • Different data delivery methods, including full, incremental, stackable, and event-driven delivery methods.
    • Building up the history of source data in a data warehouse and deriving “deltas” or new and changed source data based on the delivery method.
    • The agreements that need to be made with the source to specify the data to be supplied, its format, and the delivery technique, method, frequency, and times. You will learn about interfaces, metadata, and Service Level Agreements (SLAs).
    • Determining the quality of data delivery and why it’s important.
    • The benefits of standardizing and reusing ETL jobs and how they can be made generic and metadata-driven.
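
    The interplay between delivery methods and delta derivation listed above can be sketched in a few lines of Python. This is an illustrative sketch, not part of the training material; the source name, columns, and metadata layout are assumptions chosen for the example:

    ```python
    # Hypothetical interface metadata: in the training's terms, the "agreements"
    # with the source (key columns, layout, delivery method) are recorded as
    # metadata, so one generic job can serve many sources.
    SOURCE_META = {
        "name": "customers",
        "key": ["customer_id"],
        "delivery_method": "full",  # a full snapshot is delivered each run
    }

    def derive_delta(previous_rows, current_rows, meta):
        """Compare two full deliveries and derive the "delta":
        rows that are new or changed since the previous delivery."""
        key_cols = meta["key"]
        prev = {tuple(r[k] for k in key_cols): r for r in previous_rows}
        delta = []
        for row in current_rows:
            key = tuple(row[k] for k in key_cols)
            if key not in prev or prev[key] != row:
                delta.append(row)
        return delta

    yesterday = [
        {"customer_id": "1", "name": "Anna", "city": "Utrecht"},
        {"customer_id": "2", "name": "Bram", "city": "Delft"},
    ]
    today = [
        {"customer_id": "1", "name": "Anna", "city": "Utrecht"},    # unchanged
        {"customer_id": "2", "name": "Bram", "city": "Rotterdam"},  # changed
        {"customer_id": "3", "name": "Carla", "city": "Leiden"},    # new
    ]

    delta = derive_delta(yesterday, today, SOURCE_META)
    # delta now holds only the changed row (2) and the new row (3)
    ```

    With an incremental or CDC delivery the source supplies the delta itself; with a full delivery, as sketched here, the data warehouse derives it by comparing successive snapshots.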

    During the training, applying what you learn is central: you are challenged to put the theory to work in practical cases. The training is given by people who work in Data Engineering practice, know what it is about, and are keen to pass on that practical knowledge.

    Our trainers draw on their wealth of practical experience to bring theoretical concepts to life with insights from real-world scenarios and best practices. This Source Data Integration training equips you with the skills to navigate data integration confidently and to extract maximum value from your organization's data assets. The hands-on approach ensures that you not only learn the concepts but also understand how to apply them in real-world situations to drive tangible results.

     
     

    Who should attend Source Data Integration


    • Data Engineers: Professionals who design, build, and manage the data infrastructure. They develop the architecture that helps analyze and process data in the way the organization needs it.
    • Data Analysts: They manipulate large data sets and use them to identify trends and reach meaningful conclusions to inform strategic business decisions.
    • Business Intelligence Professionals: They analyze data to identify market and business trends and develop a clearer picture of where the business stands.
    • Data Science Managers: Professionals who oversee the data science team, guiding the journey from raw data to useful insights.
    • IT Managers: They are responsible for coordinating, planning, and leading computer-related activities in an organization.

    Prerequisites

    There are no specific requirements to participate in this training.
    The use of a laptop is required for this training.

    Objectives

    At the end of the training, you will be able to:

    • Determine when a new source should be integrated.
    • Understand different delivery techniques and delivery methods.
    • Understand what ETL is.
    • Guarantee the completeness and correctness of source data.
     
    Incompany

    With an Incompany training you have several advantages:

    • You choose the location
    • You experience the training with your colleagues, so it is always in line with your practice
    • The trainer can tailor explanations, examples, and assignments to your organization
    • In consultation, exercises can be adapted to organization-specific questions

    Request more information or a quote.

     

  • Related

    Fields of Expertise
    Data
     
  • e-CF competences with this course

     

    At Capgemini Academy we believe in transparency and clarity in the training landscape. That is why the table below shows which e-CF competences this training or certification contributes to. More information on using the e-Competence Framework, and on applying the e-CF within your organization, is available on the Capgemini Academy website.

    e-Competence (Levels 1–5)
    A.6. Application Design
    B.2. Component Integration
    D.10. Information and Knowledge Management