Data Engineer
Are you a talented Data Engineer with Databricks expertise?
Would you like to work for a company that is making a difference and tackling some of the world's toughest problems?
In this role, you’ll design, build, and optimise scalable data pipelines on the Databricks platform, enabling high‑impact analytics and machine learning that improve the performance of renewable assets globally.
If you’re passionate about cloud data engineering, big data technologies, and working in a collaborative environment, this is the perfect opportunity.
It's hybrid working: 3 days in the office and 2 from home. This is a 24-month Fixed Term Contract.
Please only apply if you have the right to work in the UK without Visa sponsorship.
Key Responsibilities:
- Design, develop, and maintain robust data pipelines using Databricks DLT.
- Collaborate with software engineers, data scientists, and platform engineers to define and deliver high‑quality data solutions.
- Build scalable ETL/ELT workflows for structured and unstructured data sources.
- Optimise Databricks workflows for performance and cost efficiency.
- Ensure strong data quality, governance, and documentation practices.
- Create reusable components and frameworks to accelerate engineering delivery.
- Support CI/CD automation and modern deployment practices.
- Stay up to date with the latest Databricks features and champion best practices.
Knowledge:
- Strong grasp of data modelling, warehousing, and distributed computing.
- Experience with Delta Lake, Unity Catalog, and data governance and compliance frameworks (e.g., GDPR, HIPAA).
Skills:
- Proficiency in Python and SQL.
- Experience with Git, CI/CD tooling, and cloud‑native development workflows.
- Excellent analytical, communication, and problem‑solving skills.
Experience & Qualifications:
- Proven experience delivering end‑to‑end data engineering solutions on Databricks, including DLT.
- Hands‑on experience with cloud data platforms (preferably Azure; AWS or GCP also valuable).
- Exposure to machine‑learning workflows and integration with ML models.
- Experience working within distributed, cross‑functional teams.
- Databricks certifications (Associate or Professional).