Overview

We are seeking a Senior Data DevOps Engineer to design, build, and operate robust cloud-based data platforms, supporting data engineering and machine learning teams with scalable, production-ready infrastructure.

Technologies we are interested in:

Cloud Platforms

• AWS: EC2, S3, IAM, VPC; Glue, Redshift, EMR, Kinesis; SageMaker

• Azure: VMs, Storage, RBAC, Networking; Synapse, Data Factory, Event Hubs; Azure ML

• GCP (basic): BigQuery, Pub/Sub, Dataflow

Infrastructure, Containers & Automation

• Infrastructure as Code: Terraform (AWS & Azure), Terragrunt, ARM / Bicep

• Containers & orchestration: Docker, Kubernetes (EKS / AKS), Helm

• Configuration management: Ansible, Chef, Puppet, or similar

• CI/CD: Jenkins, TeamCity, Bamboo, GoCD, or similar

Data, Streaming & ML Platforms

• Apache Spark, Apache Kafka

• Delta Lake / Lakehouse architectures

• Databricks

• Workflow orchestration: Airflow or Prefect

• MLOps & ML platforms: MLflow, feature stores (Feast), model serving (KFServing, TorchServe)

Big Data, Monitoring & Security

• Hadoop: HDFS, YARN; Cloudera or Hortonworks; Ambari, Cloudera Manager; Hive, HUE, Spark; Hadoop security (intermediate)

• Monitoring & observability: Prometheus, Grafana, Datadog, Zabbix, Nagios

• Search & analytics: Elasticsearch (intermediate)

• Security best practices and authentication/authorization: LDAP, Kerberos, SAML

Programming

• Python (at least 2 years of hands-on experience)

• At least one scripting language: Bash, Perl, or Groovy

Responsibilities

  • Design, build, and operate highly available, scalable cloud-based data platforms
  • Support data engineering and machine learning teams with production-ready infrastructure
  • Automate build, deployment, and operational processes
  • Ensure reliability, performance, security, and observability of systems
  • Troubleshoot production issues and collaborate with development teams
  • Create and maintain technical documentation, procedures, and knowledge transfers

Requirements

  • Bachelor’s degree in Computer Science or equivalent practical experience
  • Minimum 4 years of professional experience in DevOps / Cloud / Data Platform roles
  • Strong Linux/Unix administration, networking, troubleshooting, performance tuning, and optimization skills
  • Experience designing and operating production-grade systems
  • Strong analytical, problem-solving, and communication skills
  • Ability to visualize and explain architectures
  • English B2 or higher

What We Offer (Hungary)

  • Dynamic, entrepreneurial corporate environment
  • Diverse multicultural, multi-functional, and multilingual work environment
  • Opportunities for personal and career growth in a progressive industry
  • Global scope, international projects
  • Widespread training and development opportunities
  • Unlimited access to LinkedIn learning solutions
  • Competitive salary and various benefits
  • Advanced wellbeing and CSR programs, recreation area

About EPAM

EPAM is a leading global provider of digital platform engineering and development services. We are committed to having a positive impact on our customers, our employees, and our communities. We embrace a dynamic and inclusive culture. Here you will collaborate with multi-national teams, contribute to a myriad of innovative projects that deliver the most creative and cutting-edge solutions, and have an opportunity to continuously learn and grow. No matter where you are located, you will join a dedicated, creative, and diverse community that will help you discover your fullest potential.

Campus Programs

Do you know someone interested in starting a career in IT? Share our EPAM Campus programs with them, where they can enhance their knowledge in various fields online, free of charge.