Kafka Administrator
About the Job
We are seeking an experienced Confluent Kafka Administrator/Developer to manage and optimize our Kafka ecosystem. The ideal candidate will be responsible for ensuring the reliability, performance, and scalability of our Kafka infrastructure, and will work with development teams to implement effective data streaming solutions and enhance our data processing capabilities. This role requires strong experience with both Confluent Kafka administration and the development of low-latency data streaming solutions.
Key Responsibilities:
- Install, configure, and maintain Confluent Kafka and its ecosystem components (e.g., Schema Registry, KSQL, Kafka Connect).
- Monitor and troubleshoot Kafka clusters to ensure maximum uptime and performance.
- Optimize Kafka configurations for throughput, latency, and resource utilization (a sample client configuration is sketched after this list).
- Implement security best practices for Kafka installations, including authentication and authorization.
- Collaborate with development teams to design data streaming solutions that meet business requirements.
- Develop frameworks and patterns for data streaming solutions using technologies such as Python, Spark, and Flink.
- Conduct regular performance tuning and capacity planning for Kafka clusters.
- Manage data retention policies and strategies to ensure efficient use of storage resources.
- Document Kafka architecture, best practices, and standard operating procedures.
- Provide training and support to team members on Kafka and related technologies.
- Stay updated with the latest Kafka and Confluent technologies and trends.
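To give candidates a sense of the hands-on depth expected, here is a minimal sketch of the kind of client configuration work described above, assuming the confluent-kafka Python package; the broker addresses, credentials, and topic name are placeholders for illustration, not details of our environment.

```python
# A minimal sketch of securing and tuning a Kafka producer, assuming the
# confluent-kafka Python package. Brokers, credentials, and the topic
# name below are placeholders.
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "broker1:9092,broker2:9092",  # placeholder brokers
    # Authentication/authorization: SASL over TLS, a common Confluent setup.
    "security.protocol": "SASL_SSL",
    "sasl.mechanisms": "PLAIN",
    "sasl.username": "svc-streaming",  # placeholder service account
    "sasl.password": "********",
    # Throughput vs. latency: batch up to 20 ms of records and compress them.
    "linger.ms": 20,
    "batch.size": 131072,
    "compression.type": "lz4",
    # Durability: wait for acknowledgment from all in-sync replicas.
    "acks": "all",
}

producer = Producer(conf)

def on_delivery(err, msg):
    # Per-message delivery report; surfaces broker-side errors.
    if err is not None:
        print(f"Delivery failed for {msg.topic()}: {err}")

producer.produce("events", key=b"user-42", value=b'{"action": "login"}',
                 on_delivery=on_delivery)
producer.flush()  # block until every queued message is delivered or fails
```

Settings such as linger.ms, batch.size, and compression.type are typical starting points when trading latency against throughput; final values come from load testing against the actual cluster.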
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in Kafka administration, with a focus on the Confluent platform.
- Strong understanding of distributed systems, messaging systems, and data processing principles.
- Proficient in Kafka tooling, command-line utilities, and monitoring frameworks (a scripted retention example follows this list).
- Experience with relevant programming languages (e.g., Java, Python) and scripting.
- Experience with Spark, Spark Streaming, and Flink is required.
- Knowledge of cloud platforms (AWS, Azure, GCP) and containerization technologies (Docker, Kubernetes) is a plus.
- Experience with Infrastructure as Code (IaC) tools such as Terraform, including Terraform scripting for automation, is required.
- Experience with data integration and ETL processes, and the ability to apply it to designing and developing data streaming solutions.
- Excellent problem-solving skills and attention to detail.
- Strong communication and team collaboration skills.
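As one example of the scripted administration mentioned above, routine retention management can be automated rather than done by hand. A minimal sketch, again assuming the confluent-kafka Python package, with a placeholder broker address, topic name, and retention values chosen purely for illustration:

```python
# A minimal sketch of applying a topic retention policy programmatically,
# assuming the confluent-kafka Python package; broker address, topic name,
# and retention values are placeholders.
from confluent_kafka.admin import AdminClient, ConfigResource

admin = AdminClient({"bootstrap.servers": "broker1:9092"})

# Request 7-day, size-bounded retention on a hypothetical "orders" topic.
resource = ConfigResource(ConfigResource.Type.TOPIC, "orders")
resource.set_config("retention.ms", str(7 * 24 * 60 * 60 * 1000))  # 7 days
resource.set_config("retention.bytes", str(50 * 1024 ** 3))        # ~50 GiB per partition

# Note: alter_configs is non-incremental, so include every dynamic config
# the topic should keep. It returns one future per resource; result()
# raises on failure.
for res, future in admin.alter_configs([resource]).items():
    future.result()
    print(f"Updated retention for {res}")
```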
Preferred Qualifications:
- Kafka certification (Confluent Certified Administrator and Developer) is a plus.
- Experience with other streaming technologies (e.g., Apache Pulsar, RabbitMQ) is advantageous.
- Understanding of data governance and compliance regulations.
About the company
Industry
Staffing and Recruiting
Company Size
51-200 Employees
Headquarters
Malaysia