Senior Data Consultants
Our why
Datacom works with organisations and communities across Australia and New Zealand to make a difference in people’s lives and help organisations use the power of tech to innovate and grow.

Our team
The Data and Analytics team is a progressive group of diverse and creative individuals with a deep love for all things data-related. They build leading-edge platforms and solutions at the forefront of the technology curve. It is a close-knit, fun team embedded within our rapidly growing Professional Services division and part of the wider Datacom ecosystem. This role can be based anywhere, so you can choose the lifestyle that suits you.

About the role (your why)
We’re looking for Senior Data Engineers to join our growing analytics team and work on a broad range of customer projects with one common denominator: data. You’ll work closely with our software developers, data scientists, data warehouse engineers and data storytellers, and you’ll be responsible for creating and maintaining data architectures that enable data-driven insights and innovation for our customers. In short, you’ll build modern data platforms and get to work with cutting-edge data technologies in an Agile, fast-paced environment.

What you’ll do
- Design data platforms, distributed systems, data lakes & data stores
- Create software solutions for data ingest & integration
- Develop and operationalise reliable data pipelines & ETL patterns
- Build analytics tools to provide actionable insights and solve business problems
- Develop and maintain supporting infrastructure
- Wrangle and integrate data from multiple sources
- Identify ways to improve data reliability, efficiency and quality
What you’ll bring
- Experience with Snowflake and dbt
- Strong programming skills in languages such as Python, Java, Scala, C++ or C#
- Cloud platforms such as Azure, AWS & GCP
- Relational database and data warehouse systems such as SQL Server, Redshift, etc.
- Distributed processing technologies such as Apache Spark
- Working knowledge of message queuing, stream processing, and highly scalable data stores
- Experience building CI/CD pipelines with tools such as GitHub Actions, Azure DevOps, etc.
- Degree in Computer Science, Data Science, Statistics or related field
- Experience developing and deploying data pipelines into live environments
- A passion for lean, clean and maintainable code
- Strong analytic skills related to working with datasets, both structured and unstructured
- Curious and enthusiastic mindset
- The desire to grow and to share insights with others
It would be a bonus if you also have experience with:
- Machine Learning frameworks and theory
- Analytic platforms & tools such as Databricks, Alteryx, SAS, KNIME or DataRobot
- Data Vault / Kimball modelling methodologies
- DevOps / DataOps