Mastech InfoTrellis | GCP Data Engineer | Chennai
Job details
Position: GCP Data Engineer
Location: Chennai, Tamil Nadu (3 days/week on-site) / (Remote with travel 1 week every 3 months)
Duration: Full Time
Service-based Company: Mastech InfoTrellis https://g.co/kgs/91rBXWY and Mastech Digital https://www.mastechdigital.com/
Notice Period: 15-45 days (Non-negotiable)

Job Description:
We are seeking a talented and motivated Data Engineer with expertise in Google Cloud Platform (GCP) and PySpark to join our team. As a Data Engineer, you will be responsible for designing, implementing, and maintaining scalable data pipelines and solutions on GCP, using PySpark for processing and analysis.

Responsibilities:
• Design, develop, and maintain data pipelines on GCP using PySpark for data processing, transformation, and analysis.
• Work closely with cross-functional teams to understand data requirements and translate them into technical solutions.
• Optimize and tune data pipelines for performance, scalability, and reliability.
• Implement best practices for data governance, security, and compliance.
• Collaborate with Data Scientists and Analysts to support their data needs and ensure data availability and quality.
• Troubleshoot and debug data pipeline issues in production environments.
• Stay up to date with the latest trends and technologies in data engineering and cloud computing.
• Contribute to the continuous improvement of our data engineering practices and processes.

Required Skills:
• Proven experience as a Data Engineer familiar with Databricks and GCP services.
• Strong proficiency in Google Cloud Platform (GCP) services, particularly BigQuery, Dataflow, Pub/Sub, and Cloud Storage.
• Strong experience designing, implementing, and optimizing data pipelines using Databricks on GCP.
• Hands-on experience with PySpark for data processing and analysis.
• Solid understanding of distributed computing principles and experience with large-scale data processing frameworks.
• Experience with SQL and NoSQL databases.
• Familiarity with data modelling, ETL, and data warehousing concepts.
• Excellent problem-solving and communication skills.
• Ability to work effectively in a fast-paced, dynamic environment.

Preferred Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science & Engineering, or a related field.
• Google Cloud Platform (GCP) certification.
• Experience with other big data technologies such as Hadoop, Spark, or Apache Beam.
• Knowledge of machine learning concepts and tools.
• Experience with containerization technologies like Docker and Kubernetes.
Apply safely
To stay safe in your job search, learn about common scams, and get free expert advice, we recommend that you visit SAFERjobs, a non-profit, joint industry and law enforcement organization working to combat job scams.