Job Summary:
Designing and implementing data streaming solutions using Kafka and Change Data Capture (CDC) tools
Building and managing data pipelines that capture real-time changes
Setting up, monitoring, and managing Kafka clusters for high availability, scalability, and security
Integrating Kafka with other data platforms and implementing stream processing applications
Ensuring data consistency and quality across CDC and Kafka-based systems
Tuning Kafka configurations for optimal performance and minimal latency
Writing detailed documentation for data workflows and providing support to data engineers and analysts
Having experience with Apache Kafka, CDC tools, and stream processing, along with knowledge of data systems
Familiarity with programming languages such as Java, Python, or Scala for stream processing, and with cloud-based data platforms
Possessing data engineering or Kafka-specific certifications
Having strong technical expertise, attention to detail, and the collaboration skills to work effectively across infrastructure and data teams.
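To illustrate the kind of CDC stream processing this role involves, below is a minimal sketch in Python (one of the languages named above) that replays Debezium-style change events against an in-memory table. The event envelope fields ("op", "before", "after") follow Debezium's common convention, but the payloads, the `apply_change` helper, and the `id` key are illustrative assumptions, not part of any specific system described here.

```python
import json

def apply_change(table: dict, event: dict) -> None:
    """Apply a single CDC change event to an in-memory table keyed by 'id'.

    Assumes a Debezium-style envelope: "op" is "c" (create), "u" (update),
    or "d" (delete); "after" holds the new row, "before" the old one.
    """
    op = event["op"]
    if op in ("c", "u"):
        # Creates and updates both upsert the "after" image of the row.
        row = event["after"]
        table[row["id"]] = row
    elif op == "d":
        # Deletes remove the row identified by the "before" image.
        table.pop(event["before"]["id"], None)

# Hypothetical change stream: a row is created, renamed, then deleted.
raw_events = [
    '{"op": "c", "before": null, "after": {"id": 1, "name": "alice"}}',
    '{"op": "u", "before": {"id": 1, "name": "alice"}, '
    '"after": {"id": 1, "name": "alicia"}}',
    '{"op": "d", "before": {"id": 1, "name": "alicia"}, "after": null}',
]

state: dict = {}
for raw in raw_events:
    apply_change(state, json.loads(raw))

print(state)  # final row set after replaying all changes: empty, row was deleted
```

In a production pipeline these events would arrive from a Kafka topic populated by a CDC connector rather than from a hard-coded list, and the target "table" would be a downstream store, but the replay logic is the same.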