Job details
The key responsibility for this person will be to help design and develop the core data platform for Fusion data. They will formalize the data strategy and architecture and roll them out to the various Fusion teams, develop low-level design documents, and build pipeline automation and CI/CD processes. They should be proactive in introducing scalable pipeline improvements and own a small part of the roadmap. They will also design and develop pipeline monitoring, patching and upgrade flows, and build a comprehensive ER diagram explaining the relationships between the various objects being developed. We work in a distributed team structure within a matrix organization, and with current remote working this individual should be comfortable getting the best from their team.
Requirements
- 3+ years of working experience with industry-standard data pipeline products
- 5+ years of core Java experience
- Strong experience with microservices-based architectures and technologies; experience setting up, provisioning, troubleshooting, monitoring and patching them will be a great advantage in this role
- A good understanding of ZDT (zero-downtime) pipeline architectures
- Good understanding of any cloud-based database; familiarity with cloud-native principles would be advantageous
- Hands-on experience with Docker, Kubernetes and Helm charts will be advantageous
- Good working knowledge of Oracle Database, Oracle Autonomous Database, DBMS packages, SQL, PL/SQL, Python, IntelliJ and Java