OVO is a highly data-driven organisation, with an ever-growing demand for high-quality, reliable and accurate data to power decision-making in various parts of the business, for example:
- Enabling our product teams to work out the best way to serve our members in a highly regulated and competitive market;
- Allowing our Data Science teams to build tools and models to help members reduce their carbon impact by analysing their 10-second smart meter data;
- Helping our Leadership team prioritise how we integrate our Group companies and rationalise our organisational and technical architectures.
To support these use cases, and many more, we need to empower the organisation with a cutting-edge Data Platform. You will work as part of a team focused on building this platform, providing the rest of the business with the tools, services and data they need, at the speed they need it. Your responsibilities will include:
- Designing, building and maintaining a cutting-edge Data Platform based on streaming technologies.
- Developing tooling that helps the rest of the business use the Data Platform: tools that help users find and access the data they need and understand its lineage, as well as self-serve tooling that allows users to publish data to the platform.
- Communicating the Data Platform vision to technical and non-technical stakeholders and driving the successful adoption of the platform within the organisation.
- Influencing the team’s technology selection and architectural direction.
- Acting as a subject matter expert within the team, providing technical advice, guiding solutions, and helping to establish good engineering practices.
We’re looking for someone with:
- Experience working on streaming ETL solutions utilising streaming data processing tools (e.g. Kafka Streams, Amazon Kinesis or similar).
- Proven technical leadership: guiding product architecture, solution design, technology selection, design-pattern usage and implementation activities, and coaching and mentoring technical team members.
- This is very much a hands-on role, so an excellent working knowledge of Scala is essential.
- Experience developing cloud-based solutions on GCP, AWS or Azure using Infrastructure as Code tools such as Terraform.
- A proven track record of designing, building, monitoring and managing large-scale data products, pipelines, tooling and platforms.
- A solid knowledge of data engineering principles such as Data Modelling, Data Warehouses, Data Lakes, Data Marts and ETL.
- Comfort working in an agile development environment with regular release cycles, and experience of pair programming, BDD/TDD, CI/CD and deployment strategies.
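To give a flavour of the hands-on work, the kind of streaming ETL step described above can be sketched in plain Scala. This is a minimal illustration only: the `MeterReading` model and `MeterEtl` object are hypothetical names, and in production such logic would sit inside a streaming framework (e.g. Kafka Streams) rather than over an in-memory iterator.

```scala
// Hypothetical sketch: fold 10-second smart meter readings into
// per-meter consumption totals. Plain Scala collections stand in
// for the stream here; a real pipeline would use Kafka Streams
// (or similar) for the same transform-and-aggregate shape.
final case class MeterReading(meterId: String, timestamp: Long, kWh: Double)

object MeterEtl {
  // Transform step: drop malformed readings (negative consumption).
  def clean(readings: Iterator[MeterReading]): Iterator[MeterReading] =
    readings.filter(_.kWh >= 0)

  // Aggregate step: running total of consumption per meter.
  def totals(readings: Iterator[MeterReading]): Map[String, Double] =
    clean(readings).foldLeft(Map.empty[String, Double]) { (acc, r) =>
      acc.updated(r.meterId, acc.getOrElse(r.meterId, 0.0) + r.kWh)
    }
}
```

The same clean-then-aggregate structure maps directly onto a streaming DSL's `filter` and windowed `aggregate` operations.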