Data Engineer (Airflow), Barcelona
Company
WIZELINE
Province
Barcelona
City
Barcelona
Contract Type
Full-Time
Description
Data Engineer (Airflow)
We are:
Wizeline, a global AI-native technology solutions provider, develops cutting-edge, AI-powered digital products and platforms. We partner with clients to leverage data and AI, accelerating market entry and driving business transformation. As a global community of innovators, we foster a culture of growth, collaboration, and impact. With the right people and the right ideas, there's no limit to what we can achieve.
Are you a fit?
Sounds awesome, right? Now, let's make sure you're a good fit for the role:
Key Responsibilities
- Data Migration and Pipeline Development
- Data Modeling and Transformation
- Troubleshooting and Optimization
- Collaboration and Documentation
Must-have Skills:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related quantitative field.
- 3+ years of experience in data engineering, with a focus on building and maintaining scalable data pipelines.
- Solid experience with data migration projects and working with large datasets.
- Strong hands-on experience with Snowflake, including data loading, querying, and performance optimization.
- Proficiency in dbt (data build tool) for data transformation and modeling.
- Proven experience with Apache Airflow for scheduling and orchestrating data workflows.
- Expert-level SQL skills, including complex joins, window functions, and performance tuning.
- Proficiency in Python for data manipulation, scripting, and automation of edge cases.
- Familiarity with PySpark, AWS Athena, and Google BigQuery (source systems).
- Understanding of data warehousing concepts, dimensional modeling, and ELT principles.
- Knowledge of building CI/CD pipelines for code deployment.
- Experience with version control systems (e.g., GitHub).
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a collaborative team in an agile environment.
- Fluent spoken and written English; an effective communicator.
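As a rough gauge of the SQL window-function skill listed above, here is a minimal, self-contained sketch using Python's built-in sqlite3 module (the table, column names, and data are illustrative only, not taken from this posting):

```python
import sqlite3

# In-memory database with a small illustrative orders table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL, ordered_at TEXT);
INSERT INTO orders VALUES
  (1, 'a', 10.0, '2024-01-01'),
  (2, 'a', 20.0, '2024-01-02'),
  (3, 'b',  5.0, '2024-01-01');
""")

# Running total per customer, ordered by date: a classic
# window-function pattern (SUM() OVER with PARTITION BY / ORDER BY).
rows = conn.execute("""
SELECT customer, ordered_at, amount,
       SUM(amount) OVER (
           PARTITION BY customer ORDER BY ordered_at
       ) AS running_total
FROM orders
ORDER BY customer, ordered_at
""").fetchall()

for row in rows:
    print(row)
```

The same pattern transfers directly to Snowflake, which uses standard SQL window-function syntax; only the data loading differs.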
Nice-to-have:
- AI Tooling Proficiency: Leverage one or more AI tools to optimize and augment day-to-day work, including drafting, analysis, research, or process automation. Provide recommendations on effective AI use and identify opportunities to streamline workflows.
What we offer:
- A High-Impact Environment
- Commitment to Professional Development
- Flexible and Collaborative Culture
- Global Opportunities
- Vibrant Community
- Total Rewards
Specific benefits are determined by the employment type and location.
Airflow, SQL, Python