Data Engineer-2
Job details
Let's Write Africa's Story Together! Old Mutual is a firm believer in the African opportunity and our diverse talent reflects this.

Job Description

ROLE OVERVIEW
Develop data products and data warehouse solutions in on-premises and cloud environments using cloud-based services, platforms, and technologies. As an AWS Data Engineer, you will design and maintain data analytics roadmaps and data structures that support business and technology objectives. We are looking for a dynamic and results-driven Data Engineer with extensive experience designing, developing, and deploying high-performance, reusable data engineering pipelines and data products using Python, dbt, and Apache Airflow; proven expertise in building scalable data pipelines and real-time processing systems that enhance operational efficiency and drive business insights; and a strong background in microservices architecture, cloud technologies, and agile methodologies.

KEY RESULT AREAS
Operational Delivery
- Build the data lake using AWS technologies such as S3 and QuickSight
- Build data APIs and data delivery services to support critical operational and analytical applications
- Participate in the development of workflow, coding, testing, and deployment solutions
- Implement unit testing for all assigned deliverables to ensure deployment success
- Support and maintain solutions
- Design and develop data models using dimensional modelling and data vault techniques
- Work from high-level requirements through to detailed specifications, prototypes, software deployment, and administration
- Deliver incremental business value per delivery phase (sprint/cycle)
- Deliver iteratively throughout the cycle
- Conduct peer reviews within and across squads
- Profile and analyze data sets
- Participate in the engineering and other disciplines' communities of practice
- Share AWS knowledge and practical experience with the community
- Challenge and contribute to the development of architectural principles and patterns
- Ensure solutions adhere to Olympus patterns, guidelines, and standards
- Operate within project environments and participate in continuous improvement efforts
- Follow and participate in defined ways of work including, but not limited to, sprint planning, backlog grooming, retrospectives, demos, and PI planning
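The "profile and analyze data sets" responsibility above can be sketched in plain Python. This is a minimal, hypothetical example (the data, column names, and the `profile_rows` helper are all illustrative, not part of the role's actual toolchain) showing the kind of per-column profiling a data engineer might run before modelling a new source:

```python
import csv
import io
from collections import Counter

def profile_rows(rows, columns):
    """Compute simple per-column stats: null percentage, distinct count, top value."""
    stats = {c: {"non_null": 0, "values": Counter()} for c in columns}
    total = 0
    for row in rows:
        total += 1
        for c in columns:
            v = row.get(c)
            if v not in (None, ""):
                stats[c]["non_null"] += 1
                stats[c]["values"][v] += 1
    return {
        c: {
            "null_pct": round(100 * (total - s["non_null"]) / total, 1) if total else 0.0,
            "distinct": len(s["values"]),
            "top": s["values"].most_common(1)[0][0] if s["values"] else None,
        }
        for c, s in stats.items()
    }

# Hypothetical sample source file, inlined as CSV:
sample = io.StringIO("policy_id,product\nP1,life\nP2,life\nP3,\n")
reader = csv.DictReader(sample)
report = profile_rows(list(reader), ["policy_id", "product"])
```

In practice this kind of check would run inside an orchestrated pipeline (e.g. an Airflow task) against the landed data, feeding data-quality gates before transformation.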
QUALIFICATIONS, SKILLS & EXPERIENCE
- Experience developing solutions in the cloud
- At least 2-3 years' experience designing and developing data pipelines for data ingestion or transformation using AWS technologies
- Experience in developing data warehouses and data marts
- Experience in Data Vault and Dimensional modelling techniques
- Experience working in a high availability DataOps environment
- Experience working with automated data warehousing solutions would be advantageous
- Orchestration with Apache Airflow
- CI/CD
- GitHub
- dbt Core
- Bachelor's Degree in Computer Science or a related field such as Information Systems or Big Data
- AWS Data Engineer Certification would be advantageous
- Related Technical certifications
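The dimensional modelling experience listed above centres on one core move: splitting flat source records into deduplicated dimension tables with surrogate keys and a fact table that references them. A minimal sketch, in plain Python with hypothetical insurance data (the `build_star_schema` helper and all field names are illustrative assumptions, not the employer's actual schema):

```python
from collections import OrderedDict

def build_star_schema(records, dim_keys, measure_keys):
    """Split flat records into a deduplicated dimension table (with surrogate
    keys) and a fact table referencing it - the core of a star schema."""
    dim_rows = OrderedDict()  # dimension attribute tuple -> surrogate key
    facts = []
    for rec in records:
        dim_tuple = tuple(rec[k] for k in dim_keys)
        sk = dim_rows.setdefault(dim_tuple, len(dim_rows) + 1)
        facts.append({"dim_sk": sk, **{m: rec[m] for m in measure_keys}})
    dim_table = [
        {"sk": sk, **dict(zip(dim_keys, attrs))} for attrs, sk in dim_rows.items()
    ]
    return dim_table, facts

# Hypothetical premium transactions:
records = [
    {"product": "life", "region": "ZA", "premium": 100.0},
    {"product": "life", "region": "ZA", "premium": 250.0},
    {"product": "auto", "region": "KE", "premium": 80.0},
]
dim, fact = build_star_schema(records, ["product", "region"], ["premium"])
```

In a production setting this split would typically be expressed as dbt models over the warehouse rather than hand-rolled Python, but the table shapes are the same.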