Overview

We are looking for a highly skilled and experienced Senior Python Developer to become an integral part of our innovative team.

In this role, you will design and maintain high-quality solutions across every stage of the Software Development Life Cycle (SDLC). This opportunity calls for a resourceful engineer who excels at solving complex challenges and is eager to learn and adapt to emerging technologies.

Responsibilities

  • Create and manage cloud resources in AWS
  • Implement data ingestion processes for a variety of data sources, including technologies like RDBMS, REST HTTP APIs, flat files, streams, and proprietary time-series data
  • Develop data ingestion and processing pipelines using Big Data technologies
  • Process and transform data with Spark and cloud-based services while embedding crucial business logic
  • Build automated data quality validations to maintain data processing integrity and accuracy
  • Develop infrastructure to collect, transform, integrate, and distribute customer data effectively
  • Optimize and streamline processes for enhanced data collection, analytics, and visualization
  • Ensure scalability, reliability, and flexibility of data pipelines and associated analyses
  • Identify, analyze, and interpret complex data patterns and trends
  • Create frameworks with data visualization tools to provide actionable insights for stakeholders
  • Actively participate in Agile Scrum ceremonies as part of development teams
  • Generate well-structured reports and develop effective queries to share findings
  • Mentor junior colleagues and uphold best practices across the team

Requirements

  • 3+ years of data engineering experience within domains such as consumer finance, loans, collections, or similar fields
  • Background in math, statistics, computer science, data science, or related disciplines
  • Advanced knowledge of Python or Snowflake
  • Proficiency in using tools like HDFS, YARN, Hive, Spark, Kafka, Oozie/Airflow, AWS, Docker/Kubernetes, and Snowflake
  • Skills in programming languages (e.g., SAS, SQL, R, Python), database technologies (e.g., PostgreSQL, Redshift, Snowflake, Greenplum), and data visualization platforms (e.g., Tableau, Looker, MicroStrategy)
  • Demonstrated flexibility in adopting and applying new technologies and tools effectively
  • Strong organizational and multitasking skills, with a consistent record of meeting deadlines
  • Expertise in business intelligence, analytics techniques, and modern methodologies
  • Proficiency in written and verbal communication in English (minimum B2 level), ensuring clarity when addressing non-technical stakeholders

Nice to have

  • AWS certification demonstrating relevant expertise
  • Competency in Spark Streaming
  • Understanding of ELK Stack
  • Familiarity with Cassandra or MongoDB technologies
  • Proficiency with CI/CD tools such as Jenkins or GitLab, and with collaboration tools such as Jira/Confluence
  • Experience with the Go programming language

Benefits

  • International projects with top brands
  • Work with global teams of highly skilled, diverse peers
  • Healthcare benefits
  • Employee financial programs
  • Paid time off and sick leave
  • Upskilling, reskilling and certification courses
  • Unlimited access to the LinkedIn Learning library and 22,000+ courses
  • Global career opportunities
  • Volunteer and community involvement opportunities
  • EPAM Employee Groups
  • Award-winning culture recognized by Glassdoor, Newsweek and LinkedIn