Data Engineer- DWH
Job details
Job Overview: We are seeking an experienced Data Engineer to join our dynamic team. The ideal candidate will have a strong background in Data Warehousing, T-SQL, ETL processes, and SSIS, with additional expertise in Python and Azure Synapse. This role involves designing, implementing, and maintaining data pipelines, as well as supporting our data warehousing and analytics initiatives.
Key Responsibilities:
- Data Warehousing: Design, develop, and maintain data warehouses to support business intelligence and analytics needs. Optimize existing data warehouse structures for performance, scalability, and efficiency.
- T-SQL Development: Write complex T-SQL queries, stored procedures, and scripts for data extraction, transformation, and loading (ETL) processes. Optimize and tune T-SQL queries to improve database performance.
- ETL & SSIS: Design and develop ETL workflows using SSIS to extract, transform, and load data from various sources into the data warehouse. Manage and monitor SSIS packages, ensuring reliable and timely data processing.
- Azure Synapse Analytics: Implement and maintain data integration solutions using Azure Synapse Analytics. Optimize Synapse SQL pools and pipelines for high-performance data processing.
- Python Development: Develop data processing scripts and automation tools using Python. Integrate Python scripts with ETL processes and data pipelines to enhance functionality and efficiency.
- Data Integration & Management: Work with various data sources, including relational databases, flat files, and cloud-based systems. Ensure data quality, consistency, and security throughout the data lifecycle.
- Data Quality & Governance: Implement and enforce data quality standards, validation rules, and governance practices to ensure data accuracy and reliability. Perform data profiling, cleansing, and validation to maintain high data quality across all data pipelines. Collaborate with data governance teams to ensure compliance with data security and privacy regulations.
- Collaboration & Documentation: Collaborate with cross-functional teams, including data scientists, MIS, and business stakeholders, to understand data requirements. Create and maintain detailed technical documentation for all data engineering processes and workflows.
Qualifications:
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in Data Warehousing, T-SQL, ETL, and SSIS.
- 2+ years of experience with Python and Azure Synapse Analytics.
- Strong understanding of database design principles, data modelling, and SQL optimization.
- Proficiency in T-SQL for writing complex queries and scripts.
- Experience with ETL tools, particularly SSIS, for building and managing data pipelines.
- Hands-on experience with Azure Synapse Analytics, including Synapse SQL pools and data integration pipelines.
- Strong programming skills in Python, with experience in data processing and automation.
- Familiarity with cloud platforms, particularly Microsoft Azure.
- Strong problem-solving skills and the ability to troubleshoot data-related issues.
- Excellent communication skills, with the ability to convey technical information to non-technical stakeholders.
- Detail-oriented with a focus on data accuracy and quality.
- Experience with other Azure data services such as Azure Data Factory, Azure Data Lake, or Databricks.
- Knowledge of data governance, data security, and compliance best practices.
- Experience with big data technologies and distributed data processing.