Senior Data Engineer, Madrid


Company
EPAM
Province
Madrid
City
Madrid
Contract Type
Full Time
Description
Senior Data Engineer
Are you a forward-thinking professional with a strong background in Data Engineering and an interest in financial services? Join us as a Senior Data Engineer and help build and evolve enterprise data ecosystems for our customers.

You will design scalable data pipelines, ensure high-quality data flows and integrate complex APIs, collaborating with cross-functional teams to deliver robust and reliable data products.

This is a hybrid role based in Madrid's city center, ideal for those eager to thrive in a dynamic environment and make a significant impact on private banking technology.

Join EPAM and contribute to shaping the future of financial services in Spain!

Responsibilities
- Design, build and optimize scalable Python-based ETL/ELT pipelines using frameworks such as pandas or Polars
- Orchestrate data pipelines using Dagster, Airflow or similar tools
- Develop efficient data ingestion pipelines for batch, incremental and streaming scenarios
- Integrate with internal and external APIs ensuring robust authentication, error handling and data quality
- Implement best-in-class data models including dimensional modelling and domain-driven structures
- Manage application lifecycle and reduce architecture debt
- Deploy, operate and monitor data pipelines using Docker, Kubernetes and GitLab
- Develop and maintain CI/CD pipelines for data workflows and infrastructure components
- Collaborate with data scientists, software engineers and platform teams to enhance data services and deployment processes
- Support troubleshooting, incident response and participate in architecture discussions

Requirements
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science or a related field
- Minimum 5 years of experience in data engineering or backend engineering with a strong data focus
- Expert-level Python skills with frameworks such as pandas or Polars
- Experience with data orchestration frameworks (Dagster, Airflow or similar)
- Strong understanding of data modelling, ETL patterns and performance considerations
- Proficiency with Docker, Kubernetes and GitLab CI/CD
- Hands-on experience with SQL and relational databases (MS SQL preferred)
- Practical experience with Azure data and compute services or comparable AWS/GCP experience
- Passion for building robust, maintainable and well-tested data systems
- Strong focus on scalability, reliability, data quality and monitoring

Nice to have
- Experience in regulated environments such as finance, insurance or healthcare
- Certifications in cloud, Kubernetes or DevOps (CKA, AWS/GCP/Azure)
- Familiarity with distributed systems, microservices and API-driven architectures
- Commitment to automation, reproducibility and DevOps practices
- Creativity and new ideas to enhance the platform
- Ability to work under pressure and tough deadlines
- Strong communication skills and product-thinking mindset
- Comfortable working in agile environments
- Fluency in Spanish

Python, pandas, Polars, Dagster, Airflow, Docker, ETL