
Data Warehouse Architect

Full time at ZEISS India in India
Posted on December 26, 2024

Job details

About ZEISS

ZEISS is an internationally leading technology enterprise operating in the fields of optics and optoelectronics. For its customers, ZEISS develops, produces and distributes highly innovative solutions for industrial metrology and quality assurance, microscopy solutions for the life sciences and materials research, and medical technology solutions for diagnostics and treatment in ophthalmology and microsurgery. With around 43,000 employees, ZEISS is active globally in almost 50 countries, with around 30 production sites, 60 sales and service companies and 27 research and development facilities. Founded in 1846 in Jena, the company is headquartered in Oberkochen, Germany.

ZEISS in India is headquartered in Bengaluru and present in the fields of Industrial Quality Solutions, Research Microscopy Solutions, Medical Technology, Vision Care and Sports & Cine Optics. ZEISS India has 3 production facilities, an R&D center, Global IT services and about 40 Sales & Service offices in almost all Tier I and Tier II cities in India. With 2,200+ employees and continued investment over 25 years in India, ZEISS’ success story in India continues at a rapid pace.

Purpose of the Job

The global Performance Management and Analytics team is looking for a motivated Data Architect who will be responsible for designing, developing, and maintaining the data architecture and infrastructure necessary for effective data analysis and reporting. The Data Architect ensures the accuracy, integrity, and availability of data so that the SCM Analytics team can derive meaningful insights and support decision-making processes. The key objectives of the Data Architect role include:

  • Data Modeling: Design and implement data models that align with the requirements of the Analytics team. This involves understanding the team's data needs, identifying relevant data sources, and creating logical and physical data models that enable efficient data storage and retrieval.
  • Data Integration and ETL: Design and implement data integration processes, including Extract, Transform, Load (ETL) workflows. Ensure that data from various sources is collected, cleansed, transformed, and loaded into the appropriate data repositories for analysis.
  • Database Management: Oversee the management and administration of the databases used by the Analytics team, including optimizing database performance, ensuring data security and access controls, and monitoring data quality and consistency.
  • Data Governance: Establish and enforce data governance policies and procedures within the Analytics team. Define data standards, data lineage, and data documentation practices to ensure data consistency, reliability, and compliance with regulatory requirements.
  • Collaboration and Communication: Collaborate closely with other members of the Analytics team, as well as stakeholders from functional areas such as supply chain, operations, and IT. Communicate effectively to understand data requirements, address technical challenges, and present data insights in a clear and understandable manner.
  • External API Integration: Integrate and leverage data from external sources, such as third-party vendors, suppliers, or industry-specific APIs, to enrich the data ecosystem of the SCM Analytics team. Identify relevant external APIs that can provide valuable data for analysis and decision-making.
Education

Degree in Computer Science, Engineering, Mathematics, or a comparable field. A Master's degree is a plus.

Work Experience

Minimum of 8 to 10 years of professional experience.

Specific Knowledge/Skills
  • Ability to create data warehousing solutions by translating business requirements
  • Familiarity with ETL processes and tools.
  • Familiarity with scripting languages such as Python and/or PySpark.
  • Advanced knowledge of SQL for complex data transformation and aggregation.
  • Understanding of data formats – JSON, CSV, XML
  • Proficiency working with data warehouses and NoSQL databases.
  • Knowledge of cloud-based platforms such as Microsoft Azure.
  • Real-time data processing with tools such as Apache Kafka and Apache Flink.
  • Proficiency in performance optimization techniques.
  • Ability to design data infrastructure with security best practices in mind.
PREFERRED QUALIFICATIONS:
  • Experience in a large corporation with complex supply chain/operations processes and multiple inventory locations
  • Excellent communication skills
  • Proficiency in Power BI will be a plus
  • Ability to lead and foster a culture of best-practice implementation when creating data models and schemas

Apply safely

To stay safe in your job search, information on common scams and to get free expert advice, we recommend that you visit SAFERjobs, a non-profit, joint industry and law enforcement organization working to combat job scams.
