Confluent

Confluent Developer Skills for Building Apache Kafka - Virtual English

21 hours
2125,00 €
Live Virtual Class

Description

The Confluent Platform is a data streaming environment that lets organizations organize and manage the large amounts of data that arrive every second at their entry points, such as social networks. With Confluent, this growing publish/subscribe data flow, often unstructured but incredibly valuable, becomes a unified, easily accessible data platform that is always available for many uses throughout the organization. These uses range from batch Big Data analysis with Hadoop and the feeding of real-time monitoring systems to more traditional large-volume data integration tasks that require a high-performance extract, transform, and load (ETL) backbone. Confluent offers customers different training classes, in particular for administrators (deployment), for developers, and for modern data processing with KSQL.

In this three-day Apache Kafka developer workshop, you will learn how to build an application that can publish data to and subscribe to data from an Apache Kafka cluster. You will learn the role of Kafka in the modern data distribution pipeline, discuss core Kafka architectural concepts and components, and review the APIs for Kafka developers. The course also covers other components in the broader Confluent Platform, such as Kafka Connect and Kafka Streams.

PUE is an official Training Partner of Confluent and is authorized by this multinational company to deliver official training in Confluent technologies.

PUE is also accredited and recognized to deliver consulting and mentoring services on implementing Confluent solutions in business environments, with the added value of the practical, business-centered focus of the knowledge transferred in the official courses.

Audience and prerequisites

This training is designed for application developers, ETL (extract, transform, and load) developers, and data scientists who need to interact with Kafka clusters as a source of, or destination for, data.

Attendees should be familiar with developing in Java, .NET, C#, or Python. Working knowledge of the Apache Kafka architecture is required, gained either through hands-on experience with the platform or by taking the Confluent Fundamentals for Apache Kafka course. You can check your Apache Kafka knowledge with this quiz: https://cnfl.io/fundamentals-quiz

Objectives

At the end of the training, the student will have gained skills related to:

  • How to build an application that can publish data to and subscribe to data from an Apache Kafka® cluster.
  • The role of Kafka in the modern data distribution pipeline, and core Kafka architectural concepts and components.
  • The Kafka developer APIs.
  • Other components in the broader Confluent Platform, such as the Schema Registry, the REST Proxy, and KSQL.

Topics

Fundamentals of Apache Kafka

  • The Streaming Platform
  • The Commit Log & Log Structured Data Flow
  • Data Elements, Topics, Segments and Partitions
  • Log Replication & Log Compaction
  • Kafka Clients - Producers, Consumers & Kafka Connect
  • Producer Design, Serialization and Partitioning
  • Consumer Groups

Kafka’s Architecture

  • Kafka’s Commit Log, High Concurrency and Storage
  • Replicas for Reliability
  • Partitions and Consumer Groups for Scalability
  • Security Overview
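
As a quick taste of how partitions and replicas surface in code (this sketch is not part of the official course materials; the topic name, partition count, and broker address are illustrative assumptions), a topic can be created programmatically with Kafka's Java AdminClient:

  import org.apache.kafka.clients.admin.AdminClient;
  import org.apache.kafka.clients.admin.AdminClientConfig;
  import org.apache.kafka.clients.admin.NewTopic;
  import java.util.Collections;
  import java.util.Properties;

  public class CreateTopicSketch {
      public static void main(String[] args) throws Exception {
          Properties props = new Properties();
          props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

          try (AdminClient admin = AdminClient.create(props)) {
              // 6 partitions for scalability, replication factor 3 for reliability
              NewTopic orders = new NewTopic("orders", 6, (short) 3);
              admin.createTopics(Collections.singletonList(orders)).all().get();
          }
      }
  }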

Developing With Kafka

  • Programmatically Accessing Kafka
  • Writing a Producer in Java
  • Using the REST API to Write a Producer
  • Kafka’s Read Path
  • Writing a Consumer in Java
  • Using the REST API to Write a Consumer
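
To give a feel for this module, here is a minimal sketch of a Java producer (not part of the official materials; the topic name, key, and broker address are placeholders). Records with the same key are routed to the same partition, and the callback reports where each record landed:

  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.ProducerConfig;
  import org.apache.kafka.clients.producer.ProducerRecord;
  import org.apache.kafka.common.serialization.StringSerializer;
  import java.util.Properties;

  public class SimpleProducer {
      public static void main(String[] args) {
          Properties props = new Properties();
          props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
          props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
          props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

          try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
              ProducerRecord<String, String> record =
                  new ProducerRecord<>("orders", "customer-42", "order created");
              producer.send(record, (metadata, exception) -> {
                  if (exception != null) {
                      exception.printStackTrace();
                  } else {
                      System.out.printf("Written to partition %d at offset %d%n",
                          metadata.partition(), metadata.offset());
                  }
              });
          } // closing the producer flushes any buffered records
      }
  }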

More Advanced Kafka Development

  • Message Size & Durability
  • Enabling Exactly Once Semantics (EOS)
  • Specifying Offsets
  • Consumer Liveness & Rebalancing
  • Manually Committing Offsets
  • Partitioning Data
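
As an illustration of manually committing offsets (a sketch with placeholder topic, group id, and broker address, not the official lab code), a consumer can disable auto-commit and only acknowledge records after it has processed them:

  import org.apache.kafka.clients.consumer.ConsumerConfig;
  import org.apache.kafka.clients.consumer.ConsumerRecord;
  import org.apache.kafka.clients.consumer.ConsumerRecords;
  import org.apache.kafka.clients.consumer.KafkaConsumer;
  import org.apache.kafka.common.serialization.StringDeserializer;
  import java.time.Duration;
  import java.util.Collections;
  import java.util.Properties;

  public class ManualCommitConsumer {
      public static void main(String[] args) {
          Properties props = new Properties();
          props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
          props.put(ConsumerConfig.GROUP_ID_CONFIG, "orders-processor");
          props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
          props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
          props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");  // we commit offsets ourselves

          try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
              consumer.subscribe(Collections.singletonList("orders"));
              while (true) {
                  ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                  for (ConsumerRecord<String, String> record : records) {
                      System.out.printf("%s -> %s%n", record.key(), record.value());
                  }
                  // Mark the batch as consumed only after it has been processed
                  consumer.commitSync();
              }
          }
      }
  }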

Schema Management In Kafka

  • An Introduction to Avro and Data Serialization
  • Avro Schemas and Schema Evolution
  • Using the Schema Registry
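
The sketch below shows how a producer might use Avro together with the Schema Registry (the schema, topic name, and registry URL are illustrative assumptions, not part of the course materials). Confluent's KafkaAvroSerializer registers the schema if needed and embeds its id in every message:

  import io.confluent.kafka.serializers.KafkaAvroSerializer;
  import org.apache.avro.Schema;
  import org.apache.avro.generic.GenericData;
  import org.apache.avro.generic.GenericRecord;
  import org.apache.kafka.clients.producer.KafkaProducer;
  import org.apache.kafka.clients.producer.ProducerConfig;
  import org.apache.kafka.clients.producer.ProducerRecord;
  import org.apache.kafka.common.serialization.StringSerializer;
  import java.util.Properties;

  public class AvroProducerSketch {
      private static final String ORDER_SCHEMA =
          "{\"type\":\"record\",\"name\":\"Order\",\"fields\":["
        + "{\"name\":\"id\",\"type\":\"string\"},"
        + "{\"name\":\"amount\",\"type\":\"double\"}]}";

      public static void main(String[] args) {
          Properties props = new Properties();
          props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
          props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
          props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, KafkaAvroSerializer.class.getName());
          props.put("schema.registry.url", "http://localhost:8081");  // Schema Registry endpoint

          Schema schema = new Schema.Parser().parse(ORDER_SCHEMA);
          GenericRecord order = new GenericData.Record(schema);
          order.put("id", "o-1001");
          order.put("amount", 59.90);

          try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
              producer.send(new ProducerRecord<>("orders-avro", "o-1001", order));
          }
      }
  }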

Data Pipelines with Kafka Connect

  • The Motivation for Kafka Connect
  • Types of Connectors
  • Kafka Connect Implementation
  • Standalone and Distributed Modes
  • Configuring the Connectors
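
As an illustration of what a connector configuration looks like (the connector name, file path, and topic below are placeholders, not part of the course materials), here is a minimal properties file for the FileStreamSource connector bundled with Kafka, suitable for standalone mode:

  name=local-file-source
  connector.class=org.apache.kafka.connect.file.FileStreamSourceConnector
  tasks.max=1
  file=/var/log/app/events.log
  topic=file-events

In standalone mode a single worker loads a file like this at startup; in distributed mode the equivalent configuration is submitted as JSON to the Connect REST API.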

Stream Processing with Kafka Streams

  • An Introduction to the Kafka Streams API
  • Kafka Streams Concepts
  • Creating a Kafka Streams Application
  • Kafka Streams by Example
  • Managing Kafka Streams Processing
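
A minimal word-count topology, the canonical Kafka Streams example, gives a flavor of the API covered here (the application id and topic names are placeholders, and this sketch is not the official lab code):

  import org.apache.kafka.common.serialization.Serdes;
  import org.apache.kafka.streams.KafkaStreams;
  import org.apache.kafka.streams.StreamsBuilder;
  import org.apache.kafka.streams.StreamsConfig;
  import org.apache.kafka.streams.kstream.KStream;
  import org.apache.kafka.streams.kstream.KTable;
  import org.apache.kafka.streams.kstream.Produced;
  import java.util.Arrays;
  import java.util.Properties;

  public class WordCountSketch {
      public static void main(String[] args) {
          Properties props = new Properties();
          props.put(StreamsConfig.APPLICATION_ID_CONFIG, "wordcount-sketch");
          props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
          props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
          props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

          StreamsBuilder builder = new StreamsBuilder();
          KStream<String, String> lines = builder.stream("text-input");
          KTable<String, Long> counts = lines
              .flatMapValues(line -> Arrays.asList(line.toLowerCase().split("\\s+")))
              .groupBy((key, word) -> word)
              .count();
          counts.toStream().to("word-counts", Produced.with(Serdes.String(), Serdes.Long()));

          KafkaStreams streams = new KafkaStreams(builder.build(), props);
          streams.start();
          Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
      }
  }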

Stream Processing with Confluent KSQL

  • KSQL for Apache Kafka
  • Writing KSQL Queries
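
The statements below are an illustrative sketch of KSQL syntax over a hypothetical pageviews topic (exact syntax varies slightly between KSQL releases and the newer ksqlDB, which requires EMIT CHANGES on continuous queries):

  -- Declare a stream over an existing Kafka topic (names are illustrative)
  CREATE STREAM pageviews (userid VARCHAR, pageid VARCHAR)
    WITH (KAFKA_TOPIC='pageviews', VALUE_FORMAT='JSON');

  -- Continuously count views per user in one-minute windows
  SELECT userid, COUNT(*)
  FROM pageviews
  WINDOW TUMBLING (SIZE 1 MINUTE)
  GROUP BY userid;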

Event Driven Architecture

  • Event Driven Platform
  • From CQRS to Event Sourcing
  • Microservices

Confluent Cloud

  • Confluent Cloud Overview
  • Using the Cloud CLI and Web UI
  • Configuring Kafka Clients
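
As a sketch of the client settings Confluent Cloud typically requires (TLS plus SASL/PLAIN with an API key and secret), the helper below builds producer properties; the bootstrap endpoint and credentials are placeholders obtained in practice from the Confluent Cloud web UI or CLI:

  import org.apache.kafka.clients.CommonClientConfigs;
  import org.apache.kafka.clients.producer.ProducerConfig;
  import org.apache.kafka.common.config.SaslConfigs;
  import org.apache.kafka.common.serialization.StringSerializer;
  import java.util.Properties;

  public class CloudClientConfig {
      public static Properties cloudProps(String bootstrapServers, String apiKey, String apiSecret) {
          Properties props = new Properties();
          props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, bootstrapServers);
          props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
          props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
          // Confluent Cloud clusters are reached over TLS with SASL/PLAIN authentication
          props.put(CommonClientConfigs.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
          props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
          props.put(SaslConfigs.SASL_JAAS_CONFIG,
              "org.apache.kafka.common.security.plain.PlainLoginModule required"
            + " username=\"" + apiKey + "\" password=\"" + apiSecret + "\";");
          return props;
      }

      public static void main(String[] args) {
          // Placeholder endpoint and credentials
          Properties props = cloudProps("pkc-xxxxx.europe-west1.gcp.confluent.cloud:9092",
                                        "API_KEY", "API_SECRET");
          System.out.println(props);
      }
  }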

Open calls