Overview
We are looking for a proactive and detail-oriented DevOps Engineer with strong experience in cloud infrastructure and automation, particularly within Google Cloud Platform (GCP). The ideal candidate is passionate about building scalable, secure and reliable systems, and is comfortable working in a fast-paced, collaborative environment. A strong ownership mindset, the ability to streamline operations through automation and the judgment to balance speed with stability are essential.
Responsibilities
- Design, implement and manage cloud infrastructure on GCP using Infrastructure as Code (IaC) principles
- Develop and maintain Terraform modules for environment provisioning and standardization
- Use Ansible for configuration management and system automation
- Build, optimize and maintain CI/CD pipelines using Jenkins to support efficient software delivery
- Manage and secure GCP services, including IAM roles, networking configurations and access controls
- Administer BigQuery environments, ensuring performance, cost optimization and data governance
- Oversee Google Cloud Storage (GCS) buckets, including lifecycle policies, access control and security best practices
- Deploy and operate data processing workloads using GCP Dataproc (Spark jobs)
- Collaborate with engineering, data and product teams to support reliable and scalable platform operations
- Monitor system performance, troubleshoot issues and implement improvements to enhance reliability and efficiency
- Contribute to best practices, documentation and continuous improvement of DevOps processes
Requirements
- 2+ years of experience in DevOps, Cloud Engineering or related roles
- Proficiency in Terraform for infrastructure provisioning and Ansible for configuration management and automation
- Background in building and maintaining CI/CD pipelines using Jenkins
- Expertise in Google Cloud Platform (GCP), including IAM and networking
- Skills in administering BigQuery and managing large-scale data environments
- Familiarity with GCS bucket management, including lifecycle and security configurations
- Competency in GCP Dataproc and Spark job deployment/operations
- Understanding of cloud security, scalability and reliability principles
- Ability to solve problems and work independently with general direction
- Strong communication and collaboration skills
- English proficiency at B2 level or higher
Nice to have
- Familiarity with Looker or similar BI/reporting tools
- Exposure to data engineering workflows or analytics platforms
Benefits
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn