Senior ETL Data Developer at Circana – Apply Now!


Job Details

๐Ÿ“ Location: Cape Town, Western Cape, South Africa
๐Ÿข Company: Circana
๐Ÿ’ผ Job Type: Full-Time
๐Ÿ†” Job ID: ETLDE017672
๐ŸŒ Department: Software Engineering
๐Ÿ•’ Work Model: Hybrid / Flexible


Company Overview

Circana is a global organisation focused on data, analytics, and technology solutions. The company promotes a culture of continuous learning, innovation, and inclusivity, empowering employees to contribute meaningfully while maintaining flexibility and work-life balance. Recognised as a Great Place to Work®, Circana values diverse perspectives and encourages collaboration across global teams.


Job Overview

Circana is seeking a Senior ETL Data Developer to join its Global Professional Services team. This role focuses on designing, building, and optimising large-scale data pipelines and cloud-based data infrastructure.

You will act as a bridge between client requirements and technical solutions, leveraging modern data engineering tools to deliver scalable, efficient, and high-performance data systems. The position requires both strong technical expertise and the ability to collaborate across international teams.


Key Responsibilities

ETL / ELT Development

  • Design and optimise scalable ETL/ELT pipelines using Python, PySpark, and Apache Airflow
  • Implement batch and real-time data processing solutions using Apache Spark
  • Ensure data quality, governance, and security throughout the data lifecycle

Cloud Data Engineering

  • Manage and optimise Azure-based data infrastructure
  • Implement CI/CD pipelines for reliable deployment of data workflows
  • Ensure cost-effective and efficient cloud operations

Big Data Processing & Analytics

  • Build and optimise large-scale data pipelines using Apache Spark and PySpark
  • Apply partitioning, caching, and performance tuning techniques
  • Work with structured and unstructured datasets for analytics and ML use cases

Workflow Orchestration

  • Develop and maintain Airflow DAGs for automated workflows
  • Monitor, troubleshoot, and optimise job execution

Leadership & Collaboration

  • Provide mentorship and technical guidance to data engineering teams
  • Collaborate with global stakeholders and cross-functional teams
  • Maintain best practices in coding, documentation, and version control

Qualifications

Education

  • Degree in Computer Science, Engineering, or a related field

Experience

  • Minimum of 5 years' experience in Data Engineering
  • Proven experience with Azure Data Services, PySpark, Apache Spark, and Airflow
  • Experience in client-facing or consulting environments

Technical Skills

  • Strong Python and SQL programming
  • Deep understanding of Spark architecture (RDDs, DataFrames, DAG execution)
  • Experience with CI/CD pipelines, Docker, and Kubernetes
  • Knowledge of data governance, security, and performance optimisation

CV Tips

  • Highlight hands-on experience with Azure, PySpark, and Airflow projects
  • Showcase measurable impact (e.g., performance improvements, pipeline efficiency)
  • Include experience with large-scale or real-time data systems
  • Mention any leadership, mentoring, or stakeholder engagement experience

Benefits of Joining Circana

  • Work on cutting-edge data and AI-driven projects
  • Flexible and hybrid work environment
  • Global collaboration opportunities
  • Strong culture of learning, innovation, and inclusion
  • Career growth within a high-performing international team

Company Policy

Circana is an equal opportunity employer committed to diversity, equity, and inclusion. The company fosters an environment where all employees can bring their authentic selves to work and contribute to shared success.

To apply for this job, please visit jobs.circana.com.