Quantiphi - Practice Leader - Data Engineering/AWS
Job details
Role: Practice Lead - Data & Analytics
Experience Level: 15+ years
Work Location: Mumbai, Bangalore & Trivandrum

What you'll do:

We are seeking a highly skilled and experienced technical leader for our AWS Data Engineering practice. The ideal candidate will be responsible for architecting scalable data solutions and driving the implementation of data projects on AWS. This role requires a deep understanding of AWS services and data engineering best practices.

In this role, you will be responsible for establishing and enhancing the company's Data Engineering Services practice. You will work closely with senior stakeholders to understand business needs and deliver technical solutions. The role is well suited to a technically proficient individual looking to thrive in a dynamic, fast-paced environment.

Role & Responsibilities:

Technical Leadership:
- Act as a visionary leader capable of steering, motivating, and driving exceptional performance in data engagements.
- Conduct proof-of-concept projects to explore strategic opportunities and future-oriented data processing and integration capabilities, recommending scalable, flexible, and sustainable solutions that offer a high return on investment.
- Make informed architectural decisions with the customer's needs and priorities at the forefront.
- Guide and mentor engineers, actively participating in code reviews to ensure high standards of code quality.
- Collaborate closely with the sales, pre-sales, and solution teams to develop proposals and strategies that align with the company's performance objectives.
- Partner with the marketing team to create collateral, and assist recruitment teams in identifying and attracting the right talent to expand the practice.
- Design and implement data lakes, cloud data warehouses, master data management solutions, data models, and data quality assessments as part of the data engineering scope.
- Lead the development and management of data infrastructure, including tools, dashboards, queries, reports, and scripts, ensuring automation of recurring tasks while maintaining data quality and integrity.

Architecture and Design:
- Design and architect scalable, robust data solutions using AWS services.
- Ensure data architecture aligns with business requirements and best practices.
- Evaluate and select appropriate AWS services for data storage, processing, and analytics.

Project Implementation:
- Oversee the implementation of data engineering projects from inception to completion.
- Engage in strategic discussions with customers and offer thought leadership to guide their decisions.
- Ensure data quality, integrity, and security throughout the data lifecycle.

Technical Innovation:
- Stay current with the latest trends and advancements in data engineering and AWS technologies.
- Drive continuous improvement initiatives to enhance data engineering practices and processes.
- Experiment with new tools and technologies to improve data processing efficiency and effectiveness.

Skills Required:
- Bachelor's or Master's degree in Engineering or Technology (B.E. / M.E. / B.Tech / M.Tech).
- 15+ years of hands-on technical experience in the data space.
- At least four end-to-end implementations of large-scale data projects.
- Experience working on projects across multiple geographic regions.
- Extensive experience with a variety of projects, including on-premises-to-AWS migrations, modernization, greenfield implementations, and cloud-to-cloud migrations.
- Proficiency with AWS data services such as AWS Glue, Redshift, S3, Athena, EMR, Lambda, and RDS.
- Strong understanding of AWS architecture and best practices for data engineering.
- Proficiency in managing AWS IAM roles, policies, and permissions.
- Proficiency in SQL and Python for data processing and transformation.
- Strong understanding of data warehousing concepts, ETL/ELT processes, and data modeling.
- Experience with data integration from various sources, including batch and real-time data streams.
- Familiarity with data serialization formats such as Avro, Parquet, and ORC.
- Expertise in optimizing data pipelines and query performance.
- Experience with monitoring and troubleshooting data pipelines.
- Proficiency in performance tuning and optimization of distributed computing environments.
- Experience with data governance frameworks and practices.
- Understanding of data lifecycle management and data retention policies.
- Ability to implement and manage data quality frameworks and processes.
- Hands-on experience with big data processing frameworks such as Apache Spark, Hadoop, and Kafka.
- Knowledge of stream processing technologies and frameworks.
- Experience with data visualization tools such as Power BI or Tableau.