Job title: Kafka Platform Engineer
Location: Austin, TX (Onsite)
Experience: 10+ Years
Visa: H1B only
Mandatory Skills:
- Experience monitoring and troubleshooting large-scale data platforms, data streaming pipelines, and complex backend services.
- Able to communicate incidents effectively, coordinate incident response, document and manage runbooks, and set up processes to support other software engineering teams and data analysts.
- Experience in managing large datasets.
- Experience creating Kubernetes configurations, managing services in Kubernetes, and building Docker images.
- Good understanding of Kafka jobs and the ability to troubleshoot them.
- Kafka platform design, installation, operation, and best practices for brokers, ZooKeeper, Kafka Connect/connectors, security settings, and JMX for Kafka monitoring and performance tuning.
- Setting up new Kafka clusters and onboarding Kafka APIs.
- Topic management (creating, deleting, enabling, disabling, monitoring).
- Kafka upgrades, installation, patching, and deployment.
- Experience developing and maintaining automation tools in programming/scripting languages such as Python.
- Good understanding of big data ecosystems such as HDFS, Kafka, and SQL.
- Experience using IT automation tools such as Argo CD, and job orchestration systems such as Airflow and Jenkins.
- Experience in a cloud-based environment (AWS, GCP, etc.) is a big plus.
- Experience with monitoring tools such as Splunk and Prometheus/Grafana is a big plus.
- Experience with full-stack web development (React, Django, etc.) is a big plus.
Responsibilities:
- Support large-scale data pipelines and backend services by monitoring, troubleshooting, and recovering from incidents.
- Deploy and support Kafka clusters.
- Build and operate data management infrastructure services.
- Automate builds, deployments, and monitoring.
- Participate in the on-call and release rotation.
- Take on-call shifts, including weekend coverage.