In this three-day hands-on course you will learn how to build an application that can publish data to and subscribe to data from an Apache Kafka cluster. You will learn the role of Kafka in the modern data distribution pipeline, discuss core Kafka architectural concepts and components, and review the Kafka developer APIs. In addition to Kafka, Kafka Connect, and Kafka Streams, the course also covers other components in the broader Confluent Platform, such as the Schema Registry, the REST Proxy, and KSQL.
Who Should Attend?
This course is designed for application developers, ETL (extract, transform, and load) developers, and data scientists who need to interact with Kafka clusters as a source of, or destination for, data.
Attendees should be familiar with developing professional apps in Java (preferred), C#/.NET, or Python. Additionally, a working knowledge of the Apache Kafka architecture is required for this course, either through prior experience or by taking the recommended prerequisite, Confluent Fundamentals for Apache Kafka.
To evaluate your Kafka knowledge for this course, please complete the self-assessment:
The free, On Demand course Confluent Fundamentals for Apache Kafka is available at https://cnfl.io/freefundamentals.
Participants are required to provide a laptop computer with unobstructed internet access to fully participate in the class.
This is a three-day training course.
- The Motivation for Apache Kafka
- Kafka Fundamentals
- Kafka’s Architecture
- Developing With Kafka
- More Advanced Kafka Development
- Schema Management in Kafka
- Kafka Connect for Data Movement
- Basic Kafka Installation and Administration
- Kafka Stream Processing
You may attend this public training course without registering for the conference.