TQUSI0531_5336 - Kafka EDF

Job Type: C2H

Work Mode: Hybrid (3 days from office)

Role: Kafka EDF

Experience: 8+ Years

Location: Chennai / Hyderabad / Bangalore


Role Overview

We are looking for an experienced Kafka EDF (Event-Driven Framework) Engineer to design, develop, and support high-throughput, real-time streaming platforms. The role focuses on building scalable event-driven architectures using Apache Kafka and integrating enterprise systems for business-critical use cases at HCLTech.


Key Responsibilities

  • Design and implement event-driven architectures (EDA) using Kafka
  • Develop and manage Kafka producers, consumers, topics, and schemas
  • Build real-time data streaming pipelines and integrations
  • Ensure high availability, fault tolerance, and scalability of Kafka platforms
  • Perform monitoring, tuning, and performance optimization
  • Handle production issues (L3 support) and conduct root cause analysis
  • Work closely with microservices, data engineering, and cloud teams
  • Implement security, governance, and data compliance standards
  • Support CI/CD, release, and deployment processes


Required Skills

  • Strong experience with Apache Kafka and Event-Driven Frameworks (EDF)
  • Expertise in Kafka architecture (brokers, partitions, replication, offsets)
  • Hands-on experience with Kafka Connect and Kafka Streams
  • Experience with Schema Registry (Avro/JSON/Protobuf)
  • Proficiency in Java / Scala / Python
  • Experience integrating Kafka with microservices
  • Understanding of distributed systems and messaging patterns



Good to Have

  • Experience with Confluent Platform
  • Exposure to cloud-based Kafka (GCP / AWS / Azure)
  • Knowledge of Docker, Kubernetes
  • Experience with monitoring tools (Prometheus, Grafana)
  • BFSI / Telecom / Retail domain experience

