**About PickTwo**
We help African organisations modernise their data estates so AI initiatives can operate on trustworthy, well-governed foundations.
**Role Snapshot**
– Design resilient data pipelines that feed analytics, ML, and real-time use cases
– Optimise data infrastructure for performance, governance, and cost discipline
– Serve as a technical partner to AI engineers and product teams
**What You Will Do**
– Build batch and streaming pipelines using tools such as dbt, Spark, Airflow, and Kafka
– Implement data quality frameworks, observability, and lineage tracking
– Automate infrastructure with Terraform or Pulumi across AWS/Azure/GCP
– Collaborate with stakeholders to define schemas, SLAs, and documentation
**What You Bring**
– 4+ years building production data platforms in Python, SQL, and cloud-native services
– Strong command of data warehousing (Snowflake, BigQuery, Redshift) and lakehouse patterns
– Experience with event-driven architectures and API integrations
– Security-first approach to data governance and compliance
**Why This Role Matters**
Your work ensures AI teams have reliable, well-curated data—positioning PickTwo clients for sustained machine learning success.
To apply for this job, email your details to support@picktwo.africa.