Principal GCP Data Engineer
Job details
What will you be responsible for?
● Oversee the end-to-end data pipeline stack, including design, development, testing, documentation and deployment, following defined and proven data engineering practices in a cloud-based environment (GCP).
● As Principal, provide subject matter expertise in pipeline architecture, standards and practices for designing, developing, implementing and monitoring enterprise-wide, secure and scalable solutions.
● Contribute to architecture and design to create a robust and secure data platform.
● Define integration solutions, create technology roadmaps and build end-to-end plans for enterprise-level projects, initiatives and migrations.
● Mentor the team to stay on top of evolving technologies such as Delta Lake, Iceberg, Data Fabric, DataOps, Data Mesh, streaming data processing and data governance.
● Hire, build, recognize and retain the right talent to create a professional culture aligned with IKS's defined values within the team.
● Help establish SLAs and KPIs, and recommend how processes should be measured and improved.
● Create a strong monitoring platform for all live interfaces with proactive alerting models, and administer the interface engine to assure availability.
● Provide and implement continuous improvements and recommendations on new processes and policies for software development and management.
● Deliver complex, data-driven solutions with a customer-first mindset, working with multicultural, global teams across time zones.
● Create and manage a sustainable, self-organized, cross-functional and collaborative team that follows Agile best practices.
What would your day look like?
● Understand the domain, IKS feature clusters, and business, functional and non-functional feature requirements.
● Create, support and own artifacts: architecture, solution design, interface deployment, interface mapping and transformation logic, business context, data flow diagrams and others.
● Influence the culture of delivering high value to internal and external customers using Agile methodologies.
● Take proactive measures to address potential risks in projects and programs.
● Keep the leadership team informed with business and technical updates and progress.
Who are we looking for?
● The ideal candidate has 10+ years of experience building and deploying data pipelines.
● Master's or Bachelor's degree in Computer Science, Software Engineering, or a related field.
● Minimum of 5 years' experience working with data components such as Apache Beam, PySpark, serverless cloud offerings (e.g. Lambda, Cloud Functions), data warehouses, and ETL orchestration tools (e.g. Apache Airflow, Control-M).
● Experience with any of the major cloud platforms (preferably Google Cloud).
● Proficient in SQL and NoSQL databases (e.g. MySQL, PostgreSQL, MongoDB, Redis).
● Familiarity with CI/CD pipelines, Docker and data-as-a-service concepts.
● Strong understanding of data modeling concepts for analytics and transactional systems.
● Strong problem-solving and analytical abilities with excellent written and verbal communication skills.
● Experience managing tactical operations to achieve successful delivery with a customer-first mindset.
● Has led complex technical development and migration projects from concept to completion.
● Has designed scalable systems capable of handling high volumes of data and traffic, both batch and streaming.
● Experience working with multicultural, global teams in different time zones.
● Nice to have: exposure to the healthcare industry, with an understanding of its lifecycle and terminologies.
Experience: 12-18 yrs
Location: Remote