Data Engineer – Build Scalable Data Pipelines
The Chefz · Amman
Job description
About the role
We are looking for a Data Engineer to design, build, and maintain scalable data pipelines that support both batch and real-time processing. You will work closely with product, engineering, and analytics teams to ensure data reliability and enable data-driven decision-making.
Key responsibilities
- Design, develop, and operate ETL/ELT pipelines on AWS.
- Build and optimise data lakes, warehouses and data models (Redshift, PostgreSQL/Aurora, MySQL/MariaDB).
- Ensure data quality, reliability and integrity across all systems.
- Collaborate with cross-functional teams to define data requirements and deliver solutions.
- Implement real-time streaming solutions and batch processing workflows using tools such as Kafka and Airflow.
- Monitor and improve database performance and query efficiency.
- Apply data governance, security and best-practice standards.
- Mentor junior engineers and promote engineering excellence.
Required profile
- 5+ years of experience in data engineering or a related field.
- Strong problem-solving skills in a fast-moving product environment.
- Ability to design reusable, tested pipeline code.
Required skills
- Python
- SQL
- AWS services: Redshift, S3, Athena, ECS, EventBridge
- Workflow orchestration: Airflow (or equivalent)
- Relational databases: PostgreSQL/Aurora, MySQL/MariaDB
- Data modelling, dimensional design and schema evolution
- Kafka (streaming) – a plus
Posted 2 hours ago
Expires in 1 month