Overview
We are seeking a Senior Data Software Engineer to join our team and build scalable data solutions. In this role, you will collaborate directly with the client, lead technical delivery, and design robust data pipelines for batch and streaming workloads.
Responsibilities
- Develop and maintain the code base for ETL and ELT pipelines, large-batch/micro-batch processing and streaming systems
- Build out the infrastructure required for optimal extraction, transformation and loading of data from a wide variety of sources using ADF, Spark, Kafka or similar technologies
- Identify, design and implement internal process improvements such as automating manual processes, optimizing data delivery and re-designing infrastructure for greater scalability
- Design and implement innovative data services solutions using Spring Boot, ReactJS, NoSQL or other UI and API related technologies
- Ensure process governance in delivery management and production, in line with the selected delivery model
- Act as the single point of responsibility for top management and stakeholders on all delivery-related matters, including escalations, upsells and ramp-downs
- Provide technical leadership for the delivery, ensuring a sound, future-proof architecture is planned and that the implementation meets technical quality standards
- Write user stories and associated acceptance criteria for the agile/Scrum workflow
- Coordinate across multiple disciplines and stakeholders
- Ensure projects are delivered in line with processes and methodologies, with a focus on agile approaches
- Establish a continuous delivery risk management strategy that enables proactive decisions and actions throughout the delivery life cycle
- Measure and improve delivery productivity
- Serve as a consultant to Data Engineers
- Perform production support and deployment activities
Requirements
- 3+ years of experience in data software engineering
- Proficiency in SQL, Spark and Scala
- Expertise in Databricks
- Ability to communicate directly with clients
- Understanding of ETL/ELT pipelines, batch and streaming data processing
- Familiarity with delivery management, agile methodologies and technical leadership
- English proficiency at B2 level or higher
Nice to have
- Knowledge of Spark Streaming
- Familiarity with Kafka
- Background in Azure
Benefits
- International projects with top brands
- Work with global teams of highly skilled, diverse peers
- Healthcare benefits
- Employee financial programs
- Paid time off and sick leave
- Upskilling, reskilling and certification courses
- Unlimited access to the LinkedIn Learning library and 22,000+ courses
- Global career opportunities
- Volunteer and community involvement opportunities
- EPAM Employee Groups
- Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn