Job Details

JPC-232450 - Kafka Admin
Experience:
7 - 10 years
Qualification:
Job Location:
Pune
Job Type:
Contract
Skills:
Kafka Admin
Vacancies:
0
Job Posted: May 15, 2024

Job Description:

  • Job Responsibility:
    The primary responsibility of the Confluent Platform Administrator is to build, test and maintain the Kafka cluster and its ecosystem, which is deployed to run data streaming use cases for various business units.

  • Experience:
    Overall 5+ years of experience, of which 2+ years in Confluent Platform administration.

  • Mandatory Job Requirements:
    · Manage single and multi-node Kafka clusters deployed on VM, Docker and Kubernetes container platforms. Experience with Confluent Platform running on-prem.
    · Perform Kafka cluster builds, including design, infrastructure planning, high availability and disaster recovery.
    · Implement wire encryption using SSL, authentication using SASL/LDAP and authorization using Kafka ACLs across Zookeeper, Broker/Client, Connect clusters/connectors, Schema Registry, REST API, Producers/Consumers and ksql.
    · Perform high-level, day-to-day administration and support functions.
    · Perform upgrades across the Kafka cluster landscape comprising Development, Test, Staging and Production/DR systems.
    · Create key performance metrics measuring the utilization, performance and overall health of the cluster.
    · Carry out capacity planning and implementation of new/upgraded hardware and software releases, as well as storage infrastructure.
    · Research and recommend innovative ways to maintain the environment and, where possible, automate key administration tasks.
    · Work with various infrastructure, administration and development teams across business units.
    · Document and share design, build, upgrade and standard operating procedures. Conduct knowledge transfer sessions and workshops for other members of the team. Provide technical expertise and guidance to new and junior team members.
    · Create topics, set up Apache Kafka MirrorMaker 2 and Confluent Replicator to replicate topics, create Connect clusters, and create schemas for topics using Confluent Schema Registry (a minimal command-line sketch of these routine tasks follows this list).
    · Configure various open-source and licensed Kafka source/sink connectors such as Kafka Connect for SAP HANA, Debezium Oracle and MySQL connectors, Confluent JDBC source/sink, the Confluent ADLS2 sink connector and the Confluent Oracle CDC source connector, among others.
    · Develop and maintain Unix scripts to perform day-to-day Kafka admin and security functions using the Confluent REST Proxy server.
    · Set up monitoring tools such as Prometheus and Grafana to scrape metrics from Kafka cluster components (Broker, Zookeeper, Connect, REST Proxy, MirrorMaker, Schema Registry, etc.) and other endpoints such as web servers, databases and logs, and configure alerts for the Kafka cluster and supporting infrastructure to measure availability and performance SLAs.
    · Experience with Confluent ksql to query and process Kafka streams.
    · Knowledge of the Kafka Producer and Consumer APIs, Kafka stream processing and Confluent ksql.
    · Availability to work in shifts and extended hours and to provide on-call support as required. There will be weekend work at times depending on project needs.
    · Excellent communication and interpersonal skills.

  • Preferred but Optional Skills:
    · Linux (SLES or RHEL) system administration (basic or advanced), creating shell scripts.
    · Working experience with Docker and Kubernetes clusters (open source, Rancher, RedHat OCP, VMware Tanzu), involving administration of containers (Operator-level skills), deployments, updates and integration with products running outside the cluster.
    · Working knowledge of container registries such as Harbor, Quay, Nexus, etc. Exposure to container/artifact scanners such as Trivy, Clair, etc.
    · Security-related configuration for the software listed above or other tools: SSL for wire encryption, integration with AD for authentication, and RBAC for authorization.
    · Implemented and supported enterprise products such as well-known ERP products, data warehouses, middleware, etc.
    · Database administration skills in Oracle, MSSQL, SAP HANA, DB2, Aerospike, Postgres, etc.
    · Exposure to SaaS-based observability platforms such as New Relic.
    · Deployment of container images and pods using CI/CD pipelines with Jenkins or comparable tools.
    · Experience building Kafka deployment pipelines using Terraform, Ansible, CloudFormation templates, shell scripts, etc.
    · Experience in a public cloud environment such as Azure, AWS or GCP, preferably Azure.

  • Note: The resource needs to be ready for a face-to-face interview at an IBM location, based on account request, and for Day 1 reporting post onboarding.
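  • Illustration: a minimal, hedged sketch of the routine topic, ACL and connector administration described above, using the standard Confluent Platform CLI tools and the Kafka Connect REST API. Host names, ports, file paths, the "orders" topic, the "app-orders" principal and the JDBC connection details are placeholder assumptions, not details from this posting.

        #!/usr/bin/env bash
        # Illustrative Kafka admin tasks; all endpoints, names and credentials below are placeholders.
        set -euo pipefail

        BOOTSTRAP="broker1.example.internal:9093"         # assumed SASL_SSL broker listener
        ADMIN_CFG="/etc/kafka/admin-client.properties"    # assumed client security settings (SSL/SASL)
        CONNECT_URL="http://connect1.example.internal:8083"

        # 1. Create a topic with explicit partition count and replication factor.
        kafka-topics --bootstrap-server "$BOOTSTRAP" --command-config "$ADMIN_CFG" \
          --create --topic orders --partitions 6 --replication-factor 3

        # 2. Grant a consumer application read access via Kafka ACLs.
        kafka-acls --bootstrap-server "$BOOTSTRAP" --command-config "$ADMIN_CFG" \
          --add --allow-principal "User:app-orders" \
          --operation Read --operation Describe \
          --topic orders --group orders-consumer

        # 3. Register a JDBC source connector through the Kafka Connect REST API.
        curl -sS -X POST -H "Content-Type: application/json" "$CONNECT_URL/connectors" -d '{
          "name": "orders-jdbc-source",
          "config": {
            "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
            "connection.url": "jdbc:oracle:thin:@db1.example.internal:1521/ORCLPDB1",
            "connection.user": "kafka_connect",
            "connection.password": "********",
            "mode": "incrementing",
            "incrementing.column.name": "ID",
            "table.whitelist": "ORDERS",
            "topic.prefix": "jdbc-"
          }
        }'

    In practice, commands like these would typically live in the Unix scripts the role calls for, with credentials externalized and the same pattern repeated per environment (Development, Test, Staging, Production/DR).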

     


About Company:
Purview is a leading Digital Cloud & Data Engineering company headquartered in Edinburgh, United Kingdom, with a presence in 14 countries: India (Hyderabad, Bangalore, Chennai and Pune), Poland, Germany, Finland, Netherlands, Ireland, USA, UAE, Oman, Singapore, Hong Kong, Malaysia and Australia.

We have a strong presence in the UK, Europe and APAC, providing services to captive clients (HSBC, NatWest, Northern Trust, IDFC First Bank, Nordea Bank, etc.) in fully managed solutions and co-managed capacity models. We also support various top tier-1 IT organisations (Capgemini, Deloitte, Wipro, Virtusa, L&T, Coforge, TechM and more) in delivering solutions and workforce/resources.

Company Info:
IN:
3rd Floor, Sonthalia Mind Space
Near Westin Hotel, Gafoor Nagar
Hitechcity, Hyderabad
Phone: +91 40 48549120 / +91 8790177967

UK:
Gyleview House, 3 Redheughs Rigg,
South Gyle, Edinburgh, EH12 9DQ.
Phone: +44 7590230910
Email: careers@purviewservices.com