Big Data Developer

Full-time at Impetus in India
Posted on January 21, 2025

Job details

Experience: 3 to 12 years
Locations: Indore, Noida, Hyderabad, Bangalore, Gurgaon

Job Description

We are looking for a Big Data Developer to design and oversee the implementation of large-scale data solutions, lead a talented team, and ensure the efficient processing and analysis of massive datasets.

Roles & Responsibilities

Key Responsibilities

  • Architect Data Solutions: Design and implement scalable, high-performance big data architectures using industry-leading tools and technologies.
  • Lead Development Efforts: Oversee the end-to-end development of big data workflows, pipelines, and applications.
  • Mentor Team Members: Provide technical guidance, conduct code reviews, and foster a culture of continuous improvement within the team.
  • Optimize Performance: Identify bottlenecks in data processing workflows and implement optimization strategies (see the sketch after this list).
  • Collaborate: Work closely with data scientists, engineers, analysts, and business stakeholders to understand requirements and deliver solutions.
  • Data Governance: Ensure compliance with data privacy regulations and best practices in data management.
  • Innovate and Stay Current: Research emerging big data technologies and incorporate them into the organization's tech stack as appropriate.
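
The performance bullet above is the most hands-on of these responsibilities, so here is a minimal sketch of one common Spark optimization: broadcasting a small dimension table to avoid a shuffle-heavy join. This is an illustrative example only, assuming PySpark; the paths, table names, and columns are hypothetical and not part of the job description.

    # Minimal sketch of a common Spark optimization: a broadcast join.
    # All paths, table names, and columns here are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import broadcast

    spark = SparkSession.builder.appName("broadcast-join-sketch").getOrCreate()

    facts = spark.read.parquet("s3://example-bucket/curated/events/")    # large fact table
    dims = spark.read.parquet("s3://example-bucket/curated/countries/")  # small dimension table

    # Hinting the optimizer to broadcast the small side replaces a
    # shuffle-heavy sort-merge join with a map-side hash join.
    joined = facts.join(broadcast(dims), on="country_code", how="left")

    joined.explain()  # the physical plan should show BroadcastHashJoin
    spark.stop()

Broadcast joins pay off when one side fits comfortably in executor memory; for two large tables, partitioning and bucketing strategies are the usual levers instead.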

Required Skills and Qualifications

  • Big Data Expertise: Experience working with big data technologies like Apache Spark, Hadoop, Hive, and HBase.
  • Programming Skills: Advanced proficiency in Python, Scala, or Java, with experience in functional programming.
  • ETL and Data Pipelines: Extensive experience in building and optimizing ETL pipelines for large-scale data ingestion and transformation (an end-to-end sketch follows this list).
  • Data Management: Expertise in handling structured and unstructured data in formats like Parquet, Avro, ORC, JSON, and CSV.
  • Performance Tuning: Proven ability to optimize distributed computing jobs for resource and cost efficiency.
  • Database Knowledge: Proficiency in SQL and NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
  • DevOps Integration: Experience with CI/CD pipelines, containerization (Docker), and orchestration (Kubernetes).
  • Leadership: Demonstrated ability to lead teams, manage projects, and mentor junior developers.
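
To make the ETL and data-format expectations above concrete, below is a minimal PySpark sketch of a typical pipeline: ingest raw CSV, drop malformed rows, derive a partition column, and write columnar Parquet. Again, all paths, columns, and the storage layout are hypothetical assumptions for illustration.

    # Minimal PySpark ETL sketch: ingest CSV, clean, write partitioned Parquet.
    # All paths and column names here are hypothetical.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("etl-sketch").getOrCreate()

    # Ingest: read raw CSV with a header row, letting Spark infer column types.
    raw = spark.read.csv("s3://example-bucket/raw/events/", header=True, inferSchema=True)

    # Transform: drop rows missing required fields, derive a date partition column.
    clean = (
        raw.dropna(subset=["event_id", "event_ts"])
           .withColumn("event_date", F.to_date("event_ts"))
    )

    # Load: write columnar Parquet partitioned by date for efficient downstream reads.
    clean.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-bucket/curated/events/"
    )

    spark.stop()

Writing Parquet partitioned by date is a common design choice because downstream queries that filter on date can prune entire partitions instead of scanning the full dataset.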
