Data Engineer - Senior Consultant - AWS
Job details
Deloitte is a leading global provider of audit and assurance, consulting, financial advisory, risk advisory, tax and related services. Our global network of member firms and related entities in more than 150 countries and territories (collectively, the “Deloitte organisation”) serves four out of five Fortune Global 500® companies. Learn how Deloitte’s approximately 457,000 people make an impact that matters.

Deloitte Consulting — Our Culture
Innovation, transformation and leadership occur in many ways. At Deloitte, our ability to help solve clients’ most complex issues is distinct. We deliver strategy and implementation, from a business and technology view, to help lead in the markets where our clients compete. Are you a game changer? Do you believe in adding advantage at every level in everything you do? You may be one of us. Deloitte Consulting is growing, with a focus on developing our already powerful teams across our portfolio of offerings. We are looking for smart, accountable, innovative professionals with technical expertise and deep industry insights. The combination of our six areas of expertise, our well-developed industry structure and our integrated signature solutions is a unique offering never seen before on the African continent.

Deloitte Consulting — AWS
Be at the forefront of the revolution. AI-enabled technologies are shaking business foundations. Some find this daunting. We see opportunity—for clients, societies, and people. Deloitte’s AI & Data Specialists partner with clients to leverage AI and reach new levels of organisational excellence. We turn data into insights, into action—at an industrial scale. Join us as we enable clients to grasp the future and reach new heights. Learn from the best in the field to create solutions blending data science, data engineering, and process engineering with our industry-leading expertise.
Job Description
Working with and supporting the Technical Lead in establishing new patterns, standards, processes and procedures for the client’s solution and data community. Specialise in data integration and data warehousing concepts: extract data from disparate sources, transform it per business requirements, and load the target tables for downstream consumption. Help design and build solutions, communicating with both technical and business teams at a client and conveying how solutions meet business requirements.

Delivery Leadership:
- Define high-level solution design options based on client requirements
- Creation of design standards and patterns reusable in a client’s solution
- Experience in rapid prototyping of potential solutions for design trade-off discussions
- Mentoring and training of junior members of the team
- Completing code reviews of team members
- Accurate breakdown and estimations of tasks for solution
- Ability to pick up and learn new technology quickly
- Able to define a structured approach to problem-solving
- Completion of data models and designs within client’s architecture and standards
- Understanding complex business environments and requirements and designing a solution based on leading practices
- Ability to document designs and solutions clearly for client product owners
- Completion of deliverables for gaining architectural approval at client
- Understanding of DataOps approach to solution architecture.
- Solid experience in data and SQL is required
- Databases:
- SAP HANA
- Teradata
- SQL Server
- NoSQL (HBase, Cassandra or MongoDB)
- Cloud-based databases (Hive, Cosmos DB, DynamoDB)
- Experience with views, functions, stored procedures, query optimisation, index building, and OLAP / MDX
- AWS
- ETL tools:
- SSIS
- IBM DataStage
- SAP Data Services
- Informatica or similar
- Languages:
- SQL (T-SQL / HQL etc.)
- Java
- Python
- Spark / Kafka / RabbitMQ
- UNIX & shell commands (Python / shell / Perl) is a plus
- Data modelling:
- Data Vault (preferred)
- Kimball (preferred)
- 3rd Normal Form / OLAP / MDX
- Big Data:
- Hadoop platform (Cloudera / cloud equivalent)
- HiveQL / Spark / Oozie / Impala / Pig
- Optimising Big Data
- Streaming (NiFi / Kafka)
- Methodologies:
- Agile
- PMBOK
- DataOps / DevOps
- Pipeline creation, automation and data delivery
- Once-off, CDC (change data capture), streaming
Qualifications
Minimum: Bachelor’s Degree in Data Science, Engineering or a related degree
Preferred: Postgraduate Degree in Data Science, Engineering or a related degree; data-related cloud certifications
Experience: 3 - 5 years of working experience, including client-facing experience
Additional Information
Behavioural:
- Excellent communication skills, both written and verbal
- Ability to develop & grow technical teams
- Objective-oriented with a strong client delivery focus
- Client-focused, building strong, trusting relationships with clients
- Focus on quality and risk
- Sound problem solving ability
- Ability to comprehend complex environments and systems
- Inquisitive by nature and keen to figure out how things work