The Confluent Kafka Platform is a data streaming environment that allows organizations to organize and manage large amounts of data, including, second by second, the entry points of organizations into social networks. With Confluent, this growing flow of data (often unstructured, but incredibly valuable) is organized in a publish/subscribe model, and Confluent Kafka becomes a unified, easily accessible data platform that is always available for many uses throughout the organization. These uses range from batch Big Data analysis with Hadoop and the feeding of real-time monitoring systems to more traditional high-volume data integration tasks that require a high-performance extract, transform, and load (ETL) backbone. Confluent offers customers different training classes, in particular for administrators (operations), for developers, and for modern data processing with KSQL.
In this three-day Apache Kafka training workshop, you will learn how to build and manage Kafka clusters using industry best practices developed by the world's leading Apache Kafka experts. You will learn how Kafka and the Confluent Platform work: their main subsystems, their functions, how they interact, and how to configure, manage, and tune your cluster.
PUE is an official Training Partner of Confluent and is authorized by the company to deliver official training in Confluent technologies.
PUE is accredited and recognized to deliver consulting and mentoring services on implementing Confluent solutions in business environments, with the added value of the practical, business-centered focus of the knowledge transferred in the official courses.
Audience and prerequisites
This training is designed for engineers, system administrators, and operations staff responsible for building, managing, monitoring, and tuning Kafka clusters.
Students should be familiar with Linux/Unix and understand basic TCP/IP networking concepts. Familiarity with the Java Virtual Machine (JVM) is helpful.
Prior knowledge of Kafka, or completion of the course Confluent Fundamentals of Apache Kafka, is recommended but not required. To evaluate your Kafka knowledge before this course, you can complete an anonymous self-assessment here:
By the end of the training, students will have acquired skills in:
- Using Kafka’s command-line tools.
- Automating configuration.
- Using Kafka’s administrative tools.
- Tuning Producer and Consumer performance.
- Securing the cluster.
- Building data pipelines with Kafka Connect.
Fundamentals of Apache Kafka
- Kafka as a Distributed Streaming Platform.
- The Distributed Log.
- Producer and Consumer Basics.
- Kafka’s Commit Log.
- Replication for High Availability.
- Partitions and Consumer Groups for Scalability.
- Security Overview.
- Data Replication.
- Failure Recovery.
- Log Files and Offset Management.
- Exactly-Once Semantics (EOS).
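The link between partitions and per-key ordering, one of the core ideas in this module, can be illustrated with a short sketch. Note this is not Kafka's actual default partitioner (which uses murmur2 hashing); the CRC32-based `assign_partition` helper below is a simplified stand-in that shows the key property: records with the same key always land in the same partition, which is what preserves per-key ordering while allowing consumer groups to scale out.

```python
import zlib

def assign_partition(key: bytes, num_partitions: int) -> int:
    """Deterministically map a record key to a partition.
    Simplified model: Kafka's default partitioner uses murmur2,
    not CRC32, but the hash-modulo principle is the same."""
    return zlib.crc32(key) % num_partitions

NUM_PARTITIONS = 6
# Records with the same key always go to the same partition:
p1 = assign_partition(b"order-42", NUM_PARTITIONS)
p2 = assign_partition(b"order-42", NUM_PARTITIONS)
assert p1 == p2
print(f"key order-42 -> partition {p1}")
```

Because each partition is consumed by exactly one member of a consumer group, this deterministic mapping is what lets Kafka scale consumption while keeping events for a given key in order.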
Managing a Cluster
- Installing the Confluent Platform.
- Configuration Management.
- Log Retention and Compaction.
- Commissioning and Decommissioning Brokers.
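The retention contract of log compaction can be sketched in a few lines. The `compact` function below is a simplified model (Kafka's log cleaner works segment by segment and never compacts the active segment; this sketch only captures the guarantee that the latest value per key survives, at its original offset):

```python
def compact(log):
    """Simplified model of log compaction: for each key, only the
    record with the highest offset survives; offsets are preserved."""
    latest = {}
    for offset, (key, value) in enumerate(log):
        latest[key] = (offset, value)  # later records overwrite earlier ones
    # Return the surviving records ordered by their original offsets.
    return sorted((off, key, val) for key, (off, val) in latest.items())

log = [("user1", "a"), ("user2", "b"), ("user1", "c")]
print(compact(log))  # -> [(1, 'user2', 'b'), (2, 'user1', 'c')]
```

This is why compacted topics suit changelog-style data such as database snapshots: the topic retains at least the last known value for every key, rather than a time-bounded window of all records.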
Optimizing Kafka’s Performance
- Monitoring, Testing, and Tuning Brokers and Kafka Clients.
- The Consumer Group Protocol.
- Transport Encryption.
- Securing Apache Kafka and the Complete Confluent Platform.
- Migrating to a Secure Cluster.
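One piece of the consumer group protocol covered above, partition assignment, can also be sketched. The function below imitates the idea behind Kafka's range-style assignor (an approximation for illustration, not the broker's actual implementation): partitions are split as evenly as possible, with earlier consumers receiving the extras.

```python
def range_assign(partitions, consumers):
    """Sketch of range-style partition assignment: divide partitions
    evenly among sorted consumers; the first consumers get one extra
    partition each when the division is not exact."""
    consumers = sorted(consumers)
    per, extra = divmod(len(partitions), len(consumers))
    assignment, start = {}, 0
    for i, consumer in enumerate(consumers):
        count = per + (1 if i < extra else 0)
        assignment[consumer] = partitions[start:start + count]
        start += count
    return assignment

print(range_assign(list(range(7)), ["c1", "c2", "c3"]))
# -> {'c1': [0, 1, 2], 'c2': [3, 4], 'c3': [5, 6]}
```

A rebalance is essentially this computation rerun whenever membership changes, which is why adding or removing a consumer briefly pauses consumption for the group.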
Data Pipelines with Kafka Connect
- The Motivation for Kafka Connect.
- Types of Connectors.
- Kafka Connect Implementation.
- Standalone and Distributed Modes.
- Configuring the Connectors.
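As an illustration of connector configuration, here is what a minimal source-connector definition might look like. The `FileStreamSourceConnector` class ships with Apache Kafka; the connector name, file path, and topic name below are hypothetical examples:

```json
{
  "name": "local-file-source",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
    "tasks.max": "1",
    "file": "/var/log/app/events.log",
    "topic": "file-events"
  }
}
```

In standalone mode the same settings would typically live in a `.properties` file passed on the command line; in distributed mode a JSON body like this is submitted to the Connect REST API.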
Kafka in Production
- Reference Architecture for Apache Kafka and the Complete Confluent Platform.
- Capacity Planning.
- Multi Data Center Deployments.
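Capacity planning starts with back-of-the-envelope arithmetic. The sketch below is a simplified model (real sizing must also account for headroom, compression, and index overhead) that estimates raw disk need from ingest rate, retention window, and replication factor:

```python
def required_storage_gb(msgs_per_sec, avg_msg_bytes, retention_days, replication_factor):
    """Back-of-the-envelope disk sizing for a topic:
    ingest rate x retention window x replication factor.
    Ignores compression, headroom, and index overhead."""
    total_bytes = (msgs_per_sec * avg_msg_bytes
                   * retention_days * 86_400
                   * replication_factor)
    return total_bytes / 1e9

# e.g. 10,000 msgs/s of 1 KB each, 7-day retention, replication factor 3:
print(f"{required_storage_gb(10_000, 1_000, 7, 3):,.0f} GB")  # -> 18,144 GB
```

Even this crude estimate makes the operational point: retention and replication multiply storage needs, so they are cluster-sizing decisions, not just configuration details.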