Overview
We are searching for a Lead Data Software Engineer to join our team and oversee the development of our cutting-edge supply-chain data analytics platform.
This platform delivers a comprehensive view of suppliers, products, material categories, shipments, and compliance activities while employing artificial intelligence to generate insights and recommendations tailored to our clients’ needs.
If this role excites you, apply!
Responsibilities
- Design and develop data architectures and ETL pipelines utilizing Databricks and external orchestrators such as Airflow
- Collaborate with machine learning teams to incorporate AI-driven insights into the platform
- Develop system-level improvements and components for data engineering workflows
- Optimize the performance and scalability of data-intensive applications
- Adhere to software engineering standards, including containerization, unit testing, linting, and code style reviews
- Maintain and improve Databricks workflows, Delta Lake, and Delta Live Tables
- Provide technical mentorship and leadership to the data engineering team
- Engage with clients and stakeholders to understand their requirements and offer customized solutions
- Tackle challenges independently and ensure successful project delivery
Requirements
- Experience leading data engineering efforts on projects utilizing Databricks
- Solid understanding of data architectures and skills in data modeling
- Background in designing and building ETL pipelines in Databricks with external orchestrators such as Airflow
- Hands-on expertise in Databricks (Delta Lake, workflows, Delta Live Tables, deployment, and versioning)
- Proficiency in Python and cloud-native technologies
- Competency in Spark/PySpark
- Engineering experience with either AWS or Azure
- Knowledge of big data and techniques for optimizing data-intensive applications
- Strong communication skills with a proactive attitude and client-facing experience
- Ability to navigate ambiguity and complete projects independently
- Comfort working in a dynamic and transparent startup environment
- Fluent English communication skills at a C1 level
Nice to have
- Proficiency in configuring and managing CI/CD pipelines in Azure DevOps
- Familiarity with Data Observability and Data Quality Monitoring practices
- Capability to integrate data quality checks into data pipelines
Greece
For your comfort:
- Remote and hybrid work opportunities
- Option to work from our centrally located office in Athens
- Corporate laptop provided
- Private health insurance
- Meal vouchers / restaurant tickets
- My Benefit card (264€/year)
- Monthly public transportation card
For your growth:
- Global and diverse client portfolio, large-scale projects, and trendy technologies
- Diverse multicultural, multi-functional, and multilingual work environment
- Opportunity to contribute to internal and open-source products
- Outstanding career development opportunities with a transparent career path and a roadmap to accelerate your journey
- Possibility to create a Personal Development Plan from the first day in the company
- Numerous opportunities for self-development: internal training courses in hard and soft skills, mentoring programs, and unlimited access to LinkedIn Learning and external e-libraries
- Certification opportunities
- Knowledge-sharing with colleagues from EPAM's global tech and non-technical communities
- Language courses
Greece (Hybrid)
The remote option applies only to candidates working from a location in Greece.
About EPAM
EPAM strives to provide its global team of over 61,700 professionals in more than 55 countries with opportunities for professional growth from day one of collaboration. Our colleagues are the source of EPAM's success, so we value cooperation, strive to always understand our clients' business and aim for the highest quality standards. No matter where you are, you will join a dedicated, diverse community that will help you realize your potential to the fullest.