
Data Engineer - South Africa
- Johannesburg, Gauteng
- Contract
- Full-time
- Design, develop, and maintain data pipelines for ingesting and transforming data from various sources (including APIs, flat files, DBs).
- Integrate messaging systems like Kafka and RabbitMQ into the pipeline for event streaming and system decoupling.
- Work closely with Business Analysts and BPEs to translate client requirements into technical data flows.
- Implement and manage connectors for REST APIs and file-based batch processes.
- Ensure data lineage, auditability, and performance tuning of NiFi flows.
- Deploy and monitor data flows in distributed environments using ZooKeeper, Kafka, and Elasticsearch for logging and observability.
- Strong experience in designing and implementing data integration solutions for real-time streaming data.
- 3–5+ years of experience in data integration/engineering roles.
- Experience with Apache NiFi (flow design, templates, version control, custom processors).
- Proficient in Kafka, RabbitMQ, and event-driven architectures.
- Experience integrating with legacy systems (BAPIs, IDocs, OData).
- Scripting and transformation in Python, Groovy, or Java.
- Familiarity with distributed systems (ZooKeeper, Elasticsearch, etc.).
- Experience with CI/CD for NiFi or container-based deployments (Docker, Kubernetes).
- Knowledge of security practices (TLS, role-based access in NiFi).
- Hands-on with monitoring tools (Prometheus, Grafana).
- Familiarity with cloud platforms (AWS/GCP/Azure) for data storage or processing.
- Eligibility for Top-Level Security Clearance:
- Candidates must be eligible to obtain and maintain security clearance at the highest level, in accordance with applicable national security regulations.
- On-Site Work Requirement:
- This role requires full-time, on-site presence at the client’s premises in Pretoria. Remote or hybrid work arrangements are not available.