The Kafka Engineer is responsible for designing, developing, and optimizing high-performance streaming solutions on modern data and application platforms. This role involves building resilient microservices, creating and maintaining streaming pipelines, and ensuring reliable, secure, and scalable data delivery across distributed systems. The engineer will also contribute to technical documentation, debugging, and ongoing maintenance of services.
Responsibilities:
- Develop and optimize Kafka producers, consumers, and Kafka Streams applications for high-volume, low-latency data pipelines.
- Build and maintain Spring Boot microservices for streaming, integration, and API functionality.
- Design and implement RESTful APIs, including versioning, validation, pagination, and structured error handling.
- Ensure data quality, consistency, and reliability across distributed streaming systems.
- Implement observability, including metrics, structured logging, and lag monitoring.
- Manage deployments and operations on OpenShift (OCP), including Helm charts, ConfigMaps/Secrets, resource tuning, readiness/liveness probes, and rollout strategies.
- Troubleshoot services running in containerized environments (OpenShift/Kubernetes).
- Collaborate with architecture teams on schema evolution, integration patterns, and event-driven design.
Qualifications:
- 5+ years of Java development with strong Spring Boot expertise.
- Proven experience building secure, scalable REST APIs (OAuth2/JWT, validation, standardized error responses).
- Hands-on Kafka experience: Kafka Streams, Avro/JSON, Schema Registry.
- Experience deploying and supporting workloads on OpenShift/OCP.
- Solid understanding of distributed systems, event-driven architectures, and streaming patterns.
- Familiarity with CI/CD pipelines and container orchestration tools.
- Bachelor’s Degree in Computer Science or related field.
Preferred Qualifications:
- Experience with API gateways, OAuth2 security, and enterprise API standards.
- Exposure to master data management concepts and advanced streaming best practices.