Data Engineer - AWS, Databricks & PySpark

Posted: 25 July 2025
Salary: Negotiable
Location: City of London
Job type: Contract
Discipline: Data & Analytics
Reference: BBBH39523

Job description


Contract Role - Data Engineer
Location: Hybrid (1 day per month onsite in Harrow, London)
Rate: £350 per day (Outside IR35)
Duration: 6 months

A client of mine is looking for a Data Engineer to help maintain and enhance their existing cloud-based data platform. The core migration to a Databricks Delta Lakehouse on AWS has already been completed, so the focus will be on improving pipeline performance, supporting analytics, and contributing to ongoing platform development.

Key Responsibilities:
- Maintain and optimise existing ETL pipelines to support reporting and analytics

- Assist with improvements to performance, scalability, and cost-efficiency across the platform

- Work within the existing Databricks environment to develop new data solutions as required

- Collaborate with analysts, data scientists, and business stakeholders to deliver clean, usable datasets

- Contribute to good data governance, CI/CD workflows, and engineering standards

- Continue developing your skills in PySpark, Databricks, and AWS-based tools

Tech Stack Includes:
- Databricks (Delta Lake, PySpark)
- AWS
- CI/CD tooling (Git, DevOps pipelines)
- Cloud-based data warehousing and analytics tools

If you're a mid- to senior-level Data Engineer, feel free to apply or send your CV.
