Principal DevOps Engineer Data Engineering - Cortex XDR
Job details
Your Career
The Cortex team builds and delivers the industry's most advanced SecOps platform, consisting of XDR, XSIAM, XSOAR, and XPANSE. As a member of the Cortex DevOps team, your role involves operating and maintaining a large-scale GCP environment. We are looking for a motivated Principal Data / DevOps Engineer to join our Cortex XDR DevOps group in our India center. We are responsible for reliability, scalability, and operational excellence, while keeping an eye on the latency, performance, and capacity of GCP-hosted data stores, primarily BigQuery. As part of this role, you will collaborate closely with our engineering teams to develop innovative solutions that provide clear and actionable insights into our systems' performance and health.
- Contribute to the success of the Data platform team
- Develop expertise in new technologies
- Work with developers researchers data scientists and security experts
- Ensure that applications are scalable and reliable
- Develop tools and automation frameworks
- Automate the deployment of robust services
- Orchestrate end-to-end monitoring and alerting
- Participate in design reviews
- Work closely with development teams to manage and optimize data storage and processing in BigQuery on GCP
- Optimize query performance and reduce costs for faster, more efficient data analysis
- Ensure the scalability of the data infrastructure to support product growth
- Support feature optimization and enhance the user experience with accurate data analysis
- Manage data-related costs by optimizing storage and query strategies
- Participate in the on-call rotation supporting the applications and infrastructure
- Find new ways to improve existing processes, and create new ones, to improve the developer experience
- Own, maintain, and continuously improve all systems provided as a service, such as monitoring and datastores
- Engage in service capacity planning and demand forecasting, anticipating performance bottlenecks
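The BigQuery partitioning and clustering work described above can be sketched in DDL (the dataset, table, and column names here are illustrative assumptions, not part of the role description):

```sql
-- Hypothetical events table: partitioning by event date limits scanned
-- bytes, and clustering by tenant/event_type speeds up common filters.
CREATE TABLE IF NOT EXISTS analytics.events (
  event_ts   TIMESTAMP,
  tenant_id  STRING,
  event_type STRING,
  payload    JSON
)
PARTITION BY DATE(event_ts)
CLUSTER BY tenant_id, event_type
OPTIONS (partition_expiration_days = 90);

-- Queries that filter on the partitioning column prune partitions,
-- reducing both latency and on-demand query cost:
SELECT event_type, COUNT(*) AS n
FROM analytics.events
WHERE DATE(event_ts) >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
  AND tenant_id = 'acme'
GROUP BY event_type;
```

The partition expiration option is one way the storage-cost management mentioned above is typically handled, since expired partitions are dropped automatically.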
- 9 years in data engineering / DevOps, with extensive experience working on data warehousing solutions
- Experience managing and optimizing GCP BigQuery, with expertise in SQL query tuning, partitioning, and clustering for performance and cost efficiency
- Experience in schema design, table management, and query execution
- Proficiency with databases such as Cassandra, ScyllaDB, MemSQL, and MySQL
- Proficiency in a programming language (Python or Go preferred)
- Proficiency in the Google Cloud Platform
- High proficiency with virtualized and containerized environments (Kubernetes and Docker)
- Expertise in designing, analyzing, and troubleshooting large-scale distributed systems
- Experience in ETL/ELT Development
- Systematic problemsolving approach
- Bachelor's in Computer Science / Engineering or equivalent, or equivalent military experience, required
- Passionate about technology with a strong focus on ensuring high reliability and service levels
- Vaccine requirements and disclosure obligations vary by country.
- Unless applicable law requires otherwise, you must be vaccinated for COVID or qualify for a reasonable accommodation if:
- The job requires accessing a company worksite
- The job requires in-person customer contact and the customer has implemented such requirements
- You choose to access a Palo Alto Networks worksite
- If you have questions about the vaccine requirements of this particular position based on your location or job requirements, please inquire with the recruiter.