Head of Data Pipeline Enablement
- Edinburgh, UK
- Verisk Business: Wood Mackenzie
About Wood Mackenzie and Verisk
Verisk is a global analytics provider serving customers in insurance, energy, and finance. For more than 50 years, Verisk has used proprietary data assets to deliver analytics and predictive modelling that help clients make better decisions. We offer solutions in rating, underwriting, claims, weather and catastrophe modelling, global risk management, energy and natural resources, and retail finance.
Wood Mackenzie, the flagship company in Verisk’s Energy & Specialized Markets division, provides expert research and analysis in the energy domain to clients worldwide. Our oil, gas, power and renewables, chemicals, and metals and mining sector teams are located around the world to support our global client base, working with strategy and policy makers, business developers and market analysts, corporate finance, risk teams, and investors. Our analysts deliver research and consulting projects based on the assessment and valuation of thousands of individual assets, companies, and economic indicators, such as market supply, demand, and price trends.
The Head of Data Pipeline Enablement is responsible for leading our data modelling, data pipeline oversight, and data pipeline R&D teams. These teams guide and improve how our data engineers process data through our pipeline.
The data modelling team is responsible for creating and managing the data models that represent Wood Mackenzie’s view of the industry. This intellectual property is a fundamental component of our data management system and enables us to provide high-quality data and analytics to our clients. The pipeline oversight team works with our research and data engineering teams to determine how to identify and onboard data into our systems. This includes finding sources and processes to ingest, defining what “high quality” means for each data asset along with metrics for tracking and monitoring it, and working with the data engineering team to implement those processes. The R&D team assists data engineers with the ingestion and structuring of unstructured data. For example, data sources may include unstructured presentations, press releases, news reports, and even images that need to be processed and structured for ingestion by our data pipeline.
The Head of Data Pipeline Enablement is a key role in developing and advancing our data management capability. At Wood Mackenzie, data is the foundation on which our high-value analytics and research are based. Our competitive advantage is the quality of our data – its completeness, coverage, and accuracy. Our high data quality is critical to our success, and this newly created role is part of a transformation we are undergoing to create a new data organisation that is better equipped to produce the high-quality data we need to provide more differentiated, more compelling services for our clients in the energy and natural resources markets.
As a member of the Central Data Organisation, you will hold a key role in defining how we manage and process our data. We aim to become more scalable and highly automated, producing higher-quality data that further differentiates us from our competitors. We are looking for a leader who understands the value of data and how to process and ingest it, and who can define processes as well as adjust them as the team and our needs evolve. Experience with high levels of automation and process control, turning raw, often unstructured and dirty data into structured, quality-controlled, analytics-ready data, is important. This role requires an understanding of the technology and processes for managing data, and the ability to interface with senior and executive leaders within the data organisation and across the company.
As a senior leader in our Central Data organisation, you will play a pivotal role in building out and transforming our data operations within Wood Mackenzie.
Strategy & Vision
- Hire and manage, as needed, teams of data modellers and a pipeline oversight team, working with Research to define the data sources, ingestion processes, and target data models for our data assets.
- Collaborate with senior stakeholders across the business to understand the evolving operational data needs of the organisation and our clients.
- Collaborate with members of the Research Team (data analysts and research analysts) to meet new data needs.
- Work closely with the Heads of Data in Research and the CDO to ensure strong communication and alignment with Research Leadership Team (RLT) priorities.
- Once a new data source has been identified, estimate the work required to onboard and validate new data sets.
- Hire and manage teams located in Edinburgh, London, and Houston, and possibly additional locations, as determined to be optimal by our executive leadership team.
- Lead a team of data scientists to develop new algorithms for structuring unstructured data. Ensure those algorithms align and integrate well with the data platform we use to process our data, with a focus on modular, efficient, easy-to-operate algorithms that can be used by a globally distributed team of data engineers.
- Work with the CDO to agree key workforce planning and resource priorities and track progress against these, specifically with respect to special projects, timing, prioritisation, and budget.
- Work with teams that develop proof-of-concept capabilities and find ways to integrate them into a production data pipeline.
- Lead teams to produce reports and dashboards that show the status of data in our pipeline:
- Show key metrics on the types and volume of data processed
- Highlight areas of concern that require follow-up
- Show trends in our processing capability over time
- Help find gaps in coverage, completeness, and quality, segmented to identify data strengths and weaknesses
- Foster an agile, high-performance, collaborative culture that is creative, open, supportive, and dynamic, with a strong continuous-improvement mindset.
Operational Excellence & Quality
- Ensure the team’s ways of working are aligned and compliant with the organisation’s defined methodologies and approaches.
- Work with stakeholders across Wood Mackenzie to ensure effective, timely delivery of key deliverables and business-cycle processes.
- Work with research teams to understand how our clients view data quality, and implement validation processes to enforce that quality.
- Ensure strong team emphasis on automation and scalability
- Define and monitor metrics that measure the scalability of the Operations team’s proofs of concept.
- Track these metrics over time to improve performance
- Define technical requirements that increase the level of automation in the Data Operations division.
- Maintain a strong emphasis on, and understanding of, data quality.
- Ensure that the right data is available to research analysts and end-users to produce high quality, timely data-driven analysis, insight and content.
- Define metrics to measure data coverage, data completeness, and data accuracy
- Lead the team to develop dashboards and reports for measuring these metrics
- Ensure that Lean principles and philosophy are at the forefront of WM thinking and ways of working.
- Monitor, produce and distribute as required all metrics used by the team to drive performance
- Learn and understand our data pipeline and the tools used to manage data through that pipeline
Knowledge & Experience Required
- Ideally you will already be a senior leader in a subscription data analytics business, but we will consider someone ready to move into a senior data leadership role.
- You will have a strong understanding of data management fundamentals, including concepts such as data dictionaries, data models, validation, and reporting.
- Strong background in processing data, especially unstructured data, for use in structured environments.
- Able to lead teams in building dynamic dashboards, reports, and presentations on data quality and data status
- Experience in the Oil, Gas, and Power industries would be an advantage but is not essential
- Able to manage complex relationships in a matrix environment and strong stakeholder management skills.
- Experience managing data through its lifecycle (especially sourcing, validating, and deploying)
- Experience with Python or another general-purpose programming language
- Experience with BI tools such as Power BI, Tableau, or Spotfire
- Experience with ETL and data-manipulation tools such as Pentaho or Alteryx is helpful
- Experience with different types of databases, handling both structured and unstructured data
- Experience querying and manipulating data stored in databases to build reports or dashboards
- Experience managing automated systems
- Experience troubleshooting data issues related to any part of the data lifecycle: sourcing, validation, data-model alignment, and publication
- Results-oriented, proactive, and takes initiative.
- Strong organizational and project management skills
- Able to work across regions and cultures
- Willingness to travel to locations where operational teams may be located (Edinburgh, London, Houston, Krakow)
All your information will be kept confidential according to EEO guidelines.