Job description
Freelance Lead Databricks Engineer (Azure) - Zurich (Remote/Hybrid) - €700-800/day
I am currently hiring for a Freelance Lead Databricks Engineer to join a key project within the finance IT space for one of our insurance clients in Zurich, with the flexibility of remote/hybrid work across Europe. This is a freelance/contract role only, so I am specifically looking to speak with experienced contractors.
About the Role:
As part of a large-scale data transformation initiative, you will design and optimise end-to-end data pipelines to support the client's move towards data-driven operations. You will work with Azure-based tooling, including Databricks, Delta Lake, and Azure Data Factory (ADF), and use PySpark for financial data integration and transformation tasks.
This project is critical to the client's operations, and we're looking for someone who can hit the ground running and deliver high-quality solutions from day one. You'll be working with the client's finance team to improve data accessibility, drive decision-making, and shape the future of their data architecture.
Key Responsibilities:
Design, build, and optimise end-to-end data pipelines in Azure (ADF, Databricks, Delta Lake).
Lead financial data integration and transformation projects.
Analyse existing process logic and introduce improvements.
Ensure best practices around governance, security, and performance.
Collaborate with analytics teams to improve data models that feed business intelligence tools.
Support the finance function with scalable, clean, and secure data architecture.
Optimise data storage and management across Delta Lake.
What We're Looking For:
5+ years of experience in data engineering roles (contract/freelance experience is preferred).
Strong hands-on experience with Azure Databricks, Delta Lake, Python, and PySpark.
Proficient in designing, optimising, and tuning pipelines using Azure Data Factory and Databricks Notebooks.
Excellent SQL skills and experience working with relational databases (e.g., MSSQL, PostgreSQL, Oracle DB).
Solid understanding of modern data architectures, including data lake, data warehouse (DWH), and data mesh concepts.
Comfortable working in agile environments and able to adapt to changes quickly.
Fluent in English (both written and spoken).
Nice to Have:
Azure/Cloud certifications (e.g., Azure Data Engineer Associate).
Background in the insurance or financial services sectors.
Experience with BI tools like Power BI.
What We Offer:
Daily rate: €700-800, depending on experience.
Remote/Hybrid work - based in Zurich, with flexibility to work remotely within Europe.
Long-term project with the chance to make a significant impact on the client's data transformation efforts.
If you're a contractor who has worked in complex Azure environments, is highly skilled in Databricks and Azure Data Factory, and is eager to contribute to a high-stakes data project, I'd love to hear from you.
Apply today or reach out to me directly for more details.
