We are seeking a candidate to join an international media-broadcasting company with experience in large-scale data processing, integration, and data quality. The company operates in a cloud environment, using modern data engineering technologies to build and maintain analytical platforms that support key business processes.
Requirements
- At least 5 years of experience in IT, including at least 3.5 years working with data in the cloud (AWS)
- Commercial experience with GCP, including BigQuery and Dataproc
- Experience working with Databricks
- Commercial experience with PySpark
- Experience working with Terraform
- Advanced SQL skills and experience applying SQL in production solutions
- Working knowledge of Git and CI/CD methodologies and tooling
- Experience creating and optimizing data processing solutions (ETL, ELT, etc.)
- Programming skills in Python and use of Pandas and/or NumPy libraries
- Experience with agile practices and knowledge of tools used in the software development process, including Azure DevOps
- Knowledge of secure data storage and processing in the cloud
- Experience applying Generative AI in the software development process and in architectural decisions
Benefits
- A lot of freedom of action and a real impact on the direction of projects
- Projects for clients in Europe and the USA
- Cloud certifications and training
- AI tools for everyone in the team from day one
- Community of experts
- Option to work 100% remotely