The high cost of developing offshore wind is often cited as one of the main drawbacks to further adoption of the technology. There are several strong arguments that challenge this notion: it is an emerging sector still in an R&D-heavy phase; once a project is capitalized, the long-term variable fuel costs are zero; and that free fuel creates a price suppression effect on legacy generation (an effect now being seen in Europe, where renewable production is reaching critical mass). Even so, there are areas in which better overall project planning and management should result in consistent quality and price improvements.
In the UK a major cost reduction effort launched by the Offshore Wind Cost Reduction Pathways Study is well underway. Just this month DNV KEMA released its subsea cable guidelines, a years-long joint industry project that does an excellent job of beginning to establish best practices for installing power cables offshore – an area in which the industry has seen a great deal of turmoil to date.
One area that is somewhat less glamorous, but that has been shown in other industries to transform and improve cost models over time, is data management. In January an £850,000 effort called the Offshore Renewable Energy (ORE) Catapult was launched by The Crown Estate and several offshore wind farm owners and operators. The plan is to build a database of anonymised data that will improve safety, reliability and availability, helping companies identify operational improvements and cost reduction opportunities internally and for the wider sector.
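The kind of anonymisation such a shared database relies on can be sketched in a few lines. The sketch below is purely illustrative – the field names, the salted-hash approach and the `anonymise_record` helper are assumptions for the example, not the ORE Catapult's actual scheme – but it shows the basic idea: identifying fields (operator, turbine, site) are replaced with opaque tokens while the operational metrics that carry sector-wide value are kept intact.

```python
import hashlib

def anonymise_record(record, salt="example-salt"):
    """Replace identifying fields with salted hash tokens; keep operational metrics.

    Hypothetical sketch: field names and salt handling are illustrative only.
    """
    anonymised = dict(record)
    for field in ("operator", "turbine_id", "site"):
        if field in anonymised:
            digest = hashlib.sha256((salt + str(anonymised[field])).encode()).hexdigest()
            anonymised[field] = digest[:12]  # short opaque token replaces the real name
    return anonymised

# Example operational record (all values invented for illustration)
record = {
    "operator": "ExampleCo",
    "turbine_id": "T-042",
    "site": "Example Bank",
    "availability_pct": 96.4,
    "downtime_hours": 31,
}

shared = anonymise_record(record)
```

Because the hashing is deterministic for a given salt, records from the same (anonymised) turbine can still be linked and trended over time – which is what makes the pooled data useful for reliability analysis without exposing whose assets are underperforming.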
This is a huge step in the right direction. Offshore wind project developers spend tens of millions gathering the data needed to map out turbine siting and the corresponding balance of plant, including the cables, substations and the necessary shoreside facilities. This ranges from weather and sea conditions throughout the year, to water column information, seabed conditions and existing seabed users such as fishing interests, oil & gas or telecommunications – the list goes on.
As can easily be imagined, managing this wide and deep range of data comes with significant complexity. As in all natural systems, interdependency is the norm – add hundreds of engineered structures into the mix and the data analysis requirements grow exponentially. A good example of data management tools being developed in this area is Uni Research's work on site planning.
Another example of proactive data management can be found at TÜV SÜD PMSS, with its multi-contracting project management systems and its effort to de-risk the construction process. That is what it all comes down to – project risk, operating risk, financial risk. If risk cannot be parsed and quantified throughout all aspects of a project, it is assumed to be high; and if the risk is high, the costs of building, insuring and operating will be high as well. These installations are designed to operate for decades, so the data gathered and organized today will provide the foundation for cost-effectively solving the problems that arise tomorrow.
To that end, data management and analysis will play a major role in de-risking, and ultimately reducing, the overall costs of installing and maintaining offshore wind farms. Consider that by managing data effectively, companies such as Google and Amazon know a remarkable amount about us. If data on our personal habits has such value, then gathering and managing data on offshore wind farms for decades to come will surely have value as well.