Job Opening -- Kafka (Confluent) Admin Consultant -- (Remote / Work from Home)

Saudi Arabia

Greetings of the day,

 

We are looking to fill the position below for a project in Saudi Arabia.

 

Duration of Contract: 12 months (extendable until project completion)

Job Location: Remote (Work from Home)

 

Open position:

 

Kafka (Confluent) Admin Consultant

 

Key Responsibilities:

·        Kafka Administration: Manage the installation, configuration, and administration of Confluent Kafka clusters in both cloud and on-premise environments.

·        Monitoring and Alerts: Set up monitoring and alerting using tools such as Prometheus, Grafana, and Confluent Control Center to track Kafka cluster performance and health metrics.

·        Cluster Scaling: Plan and implement the horizontal and vertical scaling of Kafka clusters to handle increased data throughput and storage requirements.

·        Security Management: Implement and maintain security protocols for Kafka, including SSL/TLS encryption, Kerberos, and role-based access control (RBAC).

·        Backup and Recovery: Develop and manage Kafka backup, disaster recovery, and failover strategies to ensure data integrity and high availability.

·        Performance Tuning: Optimize Kafka brokers, ZooKeeper, producers, consumers, and connectors to improve performance, reduce latency, and manage data retention policies.

·        Kafka Connect and Stream Processing: Manage Kafka Connect for integrating data from various sources, and optimize Kafka Streams applications for real-time data processing.

·        Cluster Upgrades: Plan and execute cluster upgrades and patching of Kafka brokers, ZooKeeper, and Confluent components, ensuring minimal downtime.

·        Automation and Scripting: Develop automation scripts using Bash, Python, or Ansible to streamline Kafka operations such as cluster deployment, scaling, and monitoring.

·        Kafka Topics Management: Create, manage, and optimize Kafka topics, partitions, and replication settings, ensuring efficient use of cluster resources (see the command sketch after this list).

·        Troubleshooting: Diagnose and resolve issues related to Kafka performance, ZooKeeper, consumer groups, broker failures, and message processing delays.

·        Data Governance and Auditing: Implement data governance, audit logging, and compliance monitoring for Kafka topics and streams.

·        Collaboration: Work closely with development and DevOps teams to support Kafka-based applications, ensure smooth integration, and provide guidance on best practices.

·        Documentation: Maintain up-to-date documentation for Kafka environments, processes, and procedures, including incident response plans and operational guidelines.
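
For context, here is a minimal sketch of the day-to-day commands behind the topic management and troubleshooting duties above, using the standard CLI tools that ship with the Confluent Platform. The broker address, topic name, and consumer-group name are placeholders for illustration, not project specifics:

    #!/usr/bin/env bash
    # Hypothetical bootstrap endpoint; substitute the real broker list.
    BOOTSTRAP=broker1.example.internal:9092

    # Create a topic with explicit partition/replication settings and a
    # 7-day retention policy (604800000 ms).
    kafka-topics --bootstrap-server "$BOOTSTRAP" --create \
      --topic orders.events --partitions 6 --replication-factor 3 \
      --config retention.ms=604800000

    # Inspect consumer-group lag when diagnosing processing delays.
    kafka-consumer-groups --bootstrap-server "$BOOTSTRAP" \
      --describe --group payments-consumer

    # List under-replicated partitions after a broker failure.
    kafka-topics --bootstrap-server "$BOOTSTRAP" --describe \
      --under-replicated-partitions

On plain Apache Kafka installations the same tools carry a .sh suffix (kafka-topics.sh, and so on); the flags are identical.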

Required Skills:

·        Kafka Expertise: In-depth experience with administering Apache Kafka and the Confluent Platform, including Kafka Streams, Kafka Connect, and Schema Registry.

·        Confluent Tools: Strong experience with Confluent-specific tools such as Confluent Control Center, Confluent Schema Registry, and Confluent REST Proxy.

·        ZooKeeper Administration: Solid understanding of ZooKeeper and its role in Kafka cluster management, including tuning and maintaining ZooKeeper ensembles.

·        Performance Optimization: Expertise in optimizing Kafka brokers, topics, partitions, and producers/consumers for high-throughput, low-latency messaging.

·        Scripting and Automation: Proficiency in scripting languages such as Bash, Python, and automation tools like Ansible for automating Kafka cluster tasks.

·        Security: Strong understanding of Kafka security configurations, including encryption (SSL/TLS), authentication (SASL/Kerberos), and authorization (ACLs, RBAC); an illustrative ACL command follows this list.

·        Cloud Deployments: Experience with deploying and managing Kafka clusters in cloud environments like AWS, Azure, or GCP, including leveraging Kubernetes for Kafka on containers.

·        Troubleshooting: Proven ability to troubleshoot Kafka performance, consumer-lag issues, replication lags, and ZooKeeper synchronization issues.

·        Data Integration: Experience with Kafka Connect, integrating various data sources (e.g., databases, message queues) and sinks with Kafka clusters.

·        Monitoring and Logging: Experience with Kafka monitoring tools such as Prometheus, Grafana, and ELK stack (Elasticsearch, Logstash, Kibana) for Kafka log aggregation and analysis.
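
As an illustration of the ACL side of that security skill set (the principal, topic, and group names below are hypothetical), granting a consumer application least-privilege access might look like this:

    # Allow a hypothetical application principal to read one topic and
    # use one consumer group; admin.properties holds the SASL/SSL
    # credentials that authenticate this admin command itself.
    kafka-acls --bootstrap-server broker1.example.internal:9092 \
      --command-config admin.properties \
      --add --allow-principal User:payments-app \
      --operation Read --operation Describe \
      --topic orders.events --group payments-consumer

Confluent RBAC layers role bindings on top of (or in place of) ACLs via the Confluent CLI, but the intent is the same: scoped, per-principal access.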

 

If interested, please send your profile to hyd@goaheadconsulting.co.uk with the details below:

 

1. Notice period?

2. Can you join within 2 weeks?

3. Are you open to a contract role?

4. Current salary?

5. Expected salary?

6. Can you work full time according to Saudi Standard Time?

 

 

If you are not interested, referrals would be much appreciated.

 

Best Regards,

Mohammed Masood  


Publication date: 22 Rabi' al-Awwal 1446 (today)
Publisher: Bayt