Principal Azure Data Engineer
Job details
We are looking for an experienced data engineer to expand and optimize our big data pipelines. You will work alongside other engineers and developers on different layers of the infrastructure, so a commitment to collaborative problem solving, sophisticated design, and the creation of quality products is essential.
Responsibilities:
- Collaborate with Big Data Solution Architects to design, prototype, implement, and optimize data ingestion pipelines so that data is shared effectively across various business systems.
- Build ETL/ELT ingestion pipelines and analytics solutions using cloud and on-premises technologies.
- Ensure the design, code, and procedural aspects of the solution are production-ready in terms of operational, security, and compliance standards.
- Participate in day-to-day project and production delivery status meetings, and provide technical support for faster resolution of issues.
Requirements:
- Bachelor's degree from an accredited college/university, or equivalent work experience.
- 5+ years of experience performing data engineering, warehousing, publishing and visualization throughout the full data lifecycle.
- At least two (2) years of experience as an Azure Data Engineer, or in a similar role in a technology or IT consulting environment.
- 3+ years of experience in data profiling, cataloguing, and mapping to enable the design and build of technical data flows.
- 3 years of experience defining ELT/ETL architecture and process design.
- 2+ years of experience performing end-to-end implementation of data warehousing and analytics solutions built on Microsoft or Azure platforms.
- 2 years of experience working with Azure.
- Experience creating data pipelines with ETL tools such as Apache Airflow and Azure Data Factory (ADF).
- Experience with distributed computing technologies such as Apache Spark and Databricks.
- Experience with stream-processing systems such as Spark Streaming and Azure Event Hubs.
- Proficiency in an object-oriented or functional language such as Python, Java, or Scala.
- Experience writing effective, maintainable unit and integration tests for ingestion pipelines.
- Experience using static analysis and code quality tools.
- Experience with version control systems such as Git.
- Experience with security best practices, including encryption of sensitive data both at rest and in transit.
- Experience building CI/CD pipelines with DevOps orchestration tools such as Jenkins.
- Familiarity with the Data Lakehouse pattern, Delta Lake, and Apache Hudi.