Data Engineer - Google Cloud Platform
Job details
Work Mode: Onsite, 5 days a week

Must-Have Skills: BigQuery, Cloud Storage, any one of Dataflow / Dataproc / Cloud Composer, SQL or Python

Job Description:
- Good hands-on experience with GCP services, including BigQuery, Cloud Storage, Dataflow, Cloud Dataproc, Cloud Composer/Airflow, and IAM.
- Proficient experience with GCP databases: Bigtable, Spanner, Cloud SQL, and AlloyDB.
- Proficiency in SQL, Python, Java, or Scala for data processing and scripting.
- Experience with development and test automation through CI/CD pipelines (Git, Jenkins, SonarQube, Artifactory, Docker containers).
- Experience orchestrating data processing tasks using tools like Cloud Composer or Apache Airflow (see the sketch after this list).
- Strong understanding of data modeling, data warehousing, and big data processing concepts.
- Solid understanding of and experience with relational database concepts and technologies such as SQL, MySQL, PostgreSQL, or Oracle.
- Design and implement data migration strategies for various database types (PostgreSQL, Oracle, AlloyDB, etc.).
- Deep understanding of at least one database type, with the ability to write complex SQL.
- Experience with NoSQL databases such as MongoDB, ScyllaDB, Cassandra, or DynamoDB is a plus.
- Optimize data pipelines for performance and cost efficiency, adhering to GCP best practices.
- Implement data quality checks, data validation, and monitoring mechanisms to ensure data accuracy and integrity.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and translate them into technical solutions.
- Ability to work independently and manage multiple priorities effectively.
- Expertise in end-to-end data warehouse implementation is preferred.

(ref:hirist.tech)
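For illustration only, here is a minimal sketch of the kind of Cloud Composer / Apache Airflow orchestration described above: a Cloud Storage-to-BigQuery load followed by a SQL transform. The project, bucket, dataset, and table names (and the DAG schedule) are hypothetical, and the sketch assumes the Airflow Google provider package is installed; a real pipeline would depend on the team's environment.

```python
# Sketch only: hypothetical names throughout (example-project, example-raw-bucket, etc.).
from datetime import datetime

from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import BigQueryInsertJobOperator
from airflow.providers.google.cloud.transfers.gcs_to_bigquery import GCSToBigQueryOperator

with DAG(
    dag_id="daily_sales_load",          # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    # Load the day's raw CSV files from Cloud Storage into a staging table.
    load_raw = GCSToBigQueryOperator(
        task_id="load_raw_sales",
        bucket="example-raw-bucket",
        source_objects=["sales/{{ ds }}/*.csv"],
        destination_project_dataset_table="example-project.staging.sales",
        source_format="CSV",
        skip_leading_rows=1,
        write_disposition="WRITE_TRUNCATE",
    )

    # Aggregate the staging data into a reporting table with a BigQuery SQL job.
    transform = BigQueryInsertJobOperator(
        task_id="transform_sales",
        configuration={
            "query": {
                "query": (
                    "SELECT order_date, SUM(amount) AS total_amount "
                    "FROM `example-project.staging.sales` "
                    "GROUP BY order_date"
                ),
                "destinationTable": {
                    "projectId": "example-project",
                    "datasetId": "reporting",
                    "tableId": "daily_sales",
                },
                "writeDisposition": "WRITE_TRUNCATE",
                "useLegacySql": False,
            }
        },
    )

    load_raw >> transform
```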