Job details
Role Overview: As an Airflow Engineer, you will be responsible for designing, implementing, and maintaining data workflows and pipelines, leveraging Apache Airflow to ensure efficient data processing and orchestration. Your role will involve managing ETL processes, optimizing data flow, ensuring data quality, and collaborating with cross-functional teams to support business-critical operations.

Experience:
- 2-5 years of experience in managing workflows and data pipelines using Apache Airflow.
- Proficiency in Python for scripting and data manipulation.
- Strong experience with SQL for database design and querying.
- Familiarity with containerization and orchestration tools such as Docker and Kubernetes.
- Experience with Infrastructure as Code (IaC) tools like Terraform and Ansible.
- Knowledge of monitoring tools like Prometheus and Grafana.
Responsibilities:
- Design, implement, and maintain workflows and data pipelines using Apache Airflow.
- Oversee and optimize data extraction, transformation, and loading operations to support reporting.
- Build and manage ETL processes to support business intelligence and analytics.
- Monitor, maintain, and troubleshoot workflows to ensure reliability and performance.
- Develop and implement data validation processes to ensure data quality and accuracy.
- Optimize data flow between systems to improve efficiency and performance.
- Collaborate with data scientists and analysts to provide seamless access to data for analytical use.
- Establish and enforce data governance policies to ensure compliance with organizational and regulatory standards.
- Provide technical support to resolve data-related issues, including pipeline failures and performance bottlenecks.
- Write Python scripts to parse and process data from various sources.
- Design and maintain SQL-based database solutions to support data operations.
- Utilize containerization tools to manage workflow deployment and scalability.
- Leverage Infrastructure as Code to automate and manage data infrastructure.
- Implement monitoring and alerting mechanisms using tools like Prometheus and Grafana to ensure system reliability.
- Develop and maintain detailed documentation for workflows, processes, and system configurations.
- Ensure compliance with data security policies and relevant regulations.
Skills and Qualifications:
- Strong analytical and problem-solving abilities.
- Effective communication and collaboration skills for working with cross-functional teams.
- Adaptability, with the ability to meet critical deadlines and manage multiple priorities.
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Certification in relevant technologies is desirable.
- Experience or willingness to work in a support role for data pipelines and workflows.