GCP Data Architect
Full time
at Tata Consultancy Services
in India
Posted on December 17, 2024
Job details
Role: GCP Data Architect
Experience range: 10 - 20 years
Location: PAN India
Responsibilities:
- As a member of the Data on Cloud CoE (Centre of Excellence), provide technical pre-sales enablement covering data architecture, data engineering, data modelling, data consumption, data platform migration, and data governance, with a focus on the GCP data platform.
- Expert-level knowledge of the GCP data platform across data engineering, performance, consumption, security, governance, and administration.
- Work with cross-functional teams in an onsite/offshore setup to discuss and solve technical problems with various stakeholders, including customer teams.
- Create technical proposals and respond to large-scale RFPs.
- Discuss existing solutions, design and optimize the solution, and prepare execution plans for development, deployment, and enabling end users to utilize the data platform.
- The role demands excellent oral and written communication skills to organize workshops and meetings with account teams, account leadership, and senior client stakeholders, up to CXO level.
- Adept at creating points of view (PoVs) and conducting proofs of concept (PoCs).
- Liaise with technology partners such as Google Cloud and Databricks.
Requirements:
- 10-15 years of total experience, with at least 3 years of expertise in cloud data warehouse technologies on the GCP data platform, covering Cloud Storage, Dataflow, Dataproc, BigQuery, etc.
- At least one end-to-end GCP data platform implementation is a must, covering all aspects including architecture, design, data engineering, data visualization, and data governance (specifically data quality and lineage).
- Significant experience with data migrations and with the development of Operational Data Stores, Enterprise Data Warehouses, Data Lakes, and Data Marts.
- Good hands-on knowledge of SQL and the data warehousing life cycle is an absolute requirement.
- Experience with cloud ETL and ELT in tools such as Dataflow, Dataproc, or Matillion (or any other ELT tool), and exposure to the big data ecosystem (Hadoop).
- Expertise with at least one traditional data warehouse solution such as Oracle, Teradata, or Oracle Exadata.
- Excellent communication skills to liaise with Business & IT stakeholders.
- Expertise in planning project execution and estimating effort.
- Understanding of Data Vault, data mesh, and data fabric architecture patterns.
- Exposure to Agile ways of working.
- Experience with coding in Python and PySpark would be an added advantage.
- Experience with DevOps, CI/CD, and GitHub is a big plus.