Job Description
At Hostelworld (Permanent), in Porto, Portugal
Expires at: 2026-03-26
Remote policy: Full remote
WHO YOU'LL WORK WITH
You will play a key role in a diverse, highly talented team managing cloud-native data engineering systems, ensuring the timely, accurate, and secure production and delivery of data from the Hostelworld platform.
You will work closely with other technology groups and business owners to refine quality standards and processes for data processing and pipeline design, development, and deployment. The underlying purpose is to ensure we have an efficient, robust, secure, and performant data service to support our business growth.
WHAT YOU'LL DO
We’re seeking a Senior Data Engineer to lead the integration and modernization of a recently acquired company’s data ecosystem. You’ll collaborate closely with our new colleagues to learn, document, and understand their existing data product — currently based on Python and Airtable — and work to evolve it into our modern Google Cloud Platform (GCP) stack.
This is a hands‑on, strategic role that combines architecture, delivery excellence, and collaboration. You’ll bring strong software engineering discipline to our data workflows, ensuring all pipelines are built with testability, version control, and CI/CD best practices in mind.
Bridge and Modernize Systems
Learn and document the acquired company’s current data ecosystem, including Airtable structures, Python scripts, and API integrations (a sketch of a typical Airtable pull follows this list).
Translate that understanding into clear technical documentation and communicate findings back to our internal data and engineering teams.
Design and lead a modernization roadmap to integrate their systems into our GCP‑based medallion architecture.
Champion pragmatic migration strategies that balance business continuity with long‑term scalability.
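For a flavour of the integration work, here is a minimal Python sketch of pulling records from an Airtable table through its REST API. The base ID, table name, and token variable are placeholders for illustration, not details of the acquired system:

```python
import os
import requests

# Placeholder identifiers for illustration only; the real base and table
# IDs would come from the acquired system's documentation.
AIRTABLE_BASE_ID = "appXXXXXXXXXXXXXX"
AIRTABLE_TABLE = "Bookings"
API_URL = f"https://api.airtable.com/v0/{AIRTABLE_BASE_ID}/{AIRTABLE_TABLE}"


def fetch_all_records(token: str) -> list[dict]:
    """Page through an Airtable table via its REST API and return all records."""
    headers = {"Authorization": f"Bearer {token}"}
    records, params = [], {}
    while True:
        resp = requests.get(API_URL, headers=headers, params=params, timeout=30)
        resp.raise_for_status()
        payload = resp.json()
        records.extend(payload["records"])
        offset = payload.get("offset")  # Airtable paginates with an offset cursor
        if not offset:
            return records
        params = {"offset": offset}


if __name__ == "__main__":
    rows = fetch_all_records(os.environ["AIRTABLE_API_TOKEN"])
    print(f"Fetched {len(rows)} records")
```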
Design and Deliver Robust Data Pipelines
Build, orchestrate, and maintain scalable ELT pipelines using Astro (Airflow), DBT, and BigQuery; see the DAG sketch after this list.
Ingest data from APIs and third‑party systems into our unified data model.
Embed data validation, testing, and observability into every stage of the pipeline.
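As a sketch of how such a pipeline might be wired together (assuming Airflow 2.x as run on Astro; the task commands and project paths are illustrative assumptions, not our actual setup):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

# Illustrative DAG: an ingest step followed by dbt transformations and tests
# targeting BigQuery. Commands and paths are assumptions for the sketch.
with DAG(
    dag_id="elt_bookings",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest = BashOperator(
        task_id="ingest_airtable",
        bash_command="python /opt/pipelines/ingest_airtable.py",
    )
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt",
    )
    test = BashOperator(
        task_id="dbt_test",
        bash_command="dbt test --project-dir /opt/dbt",
    )
    ingest >> transform >> test
```

Running dbt test as a downstream task keeps validation inside the orchestration layer, so a failed expectation blocks the run instead of surfacing later.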
Apply Engineering Best Practices
Drive excellence in the software development lifecycle (SDLC) — from design and peer review to automated testing, deployment, and monitoring.
Implement and improve CI/CD pipelines for data workflows (e.g., Astro Cloud, GitHub Actions, Terraform); a sketch of one such check follows this list.
Promote a “data‑as‑code” mindset, ensuring reproducibility, versioning, and auditability across environments.
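One common way to put data workflows under CI is a pytest suite, run on every pull request (for example via GitHub Actions), that fails the build if any DAG is broken. A minimal sketch, assuming Airflow 2.x and a hypothetical team convention that every DAG declares an owner:

```python
from airflow.models import DagBag


def test_dags_import_cleanly():
    # import_errors maps DAG file paths to the exceptions they raised.
    dag_bag = DagBag(include_examples=False)
    assert not dag_bag.import_errors, f"DAG import failures: {dag_bag.import_errors}"


def test_every_dag_has_an_owner():
    # Hypothetical convention used here purely for illustration.
    dag_bag = DagBag(include_examples=False)
    for dag_id, dag in dag_bag.dags.items():
        assert dag.default_args.get("owner"), f"{dag_id} has no owner set"
```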
Collaborate and Lead in an Agile Environment
Act as a senior contributor in the team’s Scrum ceremonies (planning, review, retrospectives), driving transparency and continuous improvement.
Communicate findings, technical designs, and migration recommendations clearly and proactively to internal stakeholders.
Partner with global data and software teams to align on design standards, delivery milestones, and business priorities.
WHAT WE’RE LOOKING FOR
5+ years of experience in data engineering, designing, building, and maintaining robust, cloud‑native data pipelines.
3+ years of experience as a Python developer, building data-processing scripts, automation, and API integrations.
2+ years of experience in Google Cloud Platform (GCP), including deploying and managing scalable ETL/ELT pipelines.
Proven track record with orchestration and transformation tools, including Astro (Airflow) and DBT.
Strong understanding of data quality, testing, and observability frameworks (e.g., Great Expectations, dbt tests); an illustrative check appears after this list.
Experience implementing CI/CD pipelines for data workflows using tools such as GitHub Actions, Terraform, and Astro Cloud.
Familiarity with Agile and Scrum practices, with experience collaborating in cross‑functional teams and leading technical discussions.
Demonstrated ability to learn, document, and modernize legacy data systems while communicating technical findings clearly to internal stakeholders.
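To illustrate the kind of checks meant by data quality and testing above, here is a hand-rolled validation step in plain pandas; frameworks such as Great Expectations or dbt tests encode the same expectations declaratively. The column names and bounds are hypothetical:

```python
import pandas as pd


def validate_bookings(df: pd.DataFrame) -> list[str]:
    """Return a list of failed expectations (empty means the data passed)."""
    failures = []
    if df["booking_id"].isna().any():
        failures.append("booking_id contains nulls")
    if df["booking_id"].duplicated().any():
        failures.append("booking_id is not unique")
    if not df["amount"].between(0, 1_000_000).all():
        failures.append("amount outside expected range")
    return failures


if __name__ == "__main__":
    sample = pd.DataFrame({"booking_id": [1, 2, 2], "amount": [10.0, -5.0, 20.0]})
    print(validate_bookings(sample) or "all checks passed")
```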
Nice to have
Experience working with Airtable and integrating its API into larger data systems
Hands‑on experience with BigQuery for data storage, analytics, and transformation.
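For context, a minimal BigQuery client sketch in Python (the project, dataset, and table names are placeholders; credentials are taken from the environment):

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client()

query = """
    SELECT country, COUNT(*) AS bookings
    FROM `my-project.analytics.bookings`  -- placeholder table
    GROUP BY country
    ORDER BY bookings DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row["country"], row["bookings"])
```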
WHAT WE OFFER
Competitive salary & benefits
Enhanced annual leave plus 3 Wellbeing Days per year
Paid family leave (maternity, paternity, surrogacy & adoption)
Agile working (plus a Working from Abroad Policy!)
Support for your ongoing growth & development
Inclusive people policies (sickness, menopause, compassionate and fertility leave)
A chance to give back to your local community with 5 paid volunteering days
OUR BEHAVIOURS
Grow others - We fundamentally believe that investing in growing others benefits everyone, whether it’s helping them develop hard or soft skills. We want learning and growing to be part of our DNA to help make us a better team, together.
Master it - We are obsessed with our area of expertise and enjoy developing our skills. We rarely take things at face value; we investigate, interrogate, and always look for ‘the why,’ and wherever possible, we use data to find the best solution.
Collaborate - We are in it together, for the tough stuff and the celebrations too. To achieve the best results, we need expertise from all areas of the organisation, and we wholeheartedly welcome diverse thinking.
Adapt - We work fluidly, adapting to new information and the evolving environment while staying committed to our goals. Innovation and experimentation fuel our projects and we’re never afraid to pivot.
Deliver - Our focus is always on the end result; we value outcomes over activity. We collaborate to deliver work at speed without dropping any of our other behaviours.
We believe in talented and diverse teams that reflect the diversity of our customers and the communities in which we operate. Everyone brings different perspectives and experiences. We lay out the above requirements to point you to the experience we believe will allow you to be successful in the role. If you don’t meet them all, please still consider applying if you think you can perform the role as described.