Just over a decade ago, Sandia National Laboratories founded the PV Performance Modeling Collaborative (PVPMC). According to Sandia, PVPMC increases transparency and accuracy in PV system performance modeling, helping to bring stakeholders together to improve modeling practices. Joshua Stein, senior scientist at Sandia, thinks it’s time for a similar effort to shed light on data analytics for monitoring PV plant performance.
“There is a lack of standardization and harmonization in the way people talk about fault analysis and monitoring,” he told pv magazine. Some companies currently offer services that include data analysis and fault identification and classification based on PV power plant monitoring data. However, each of these companies uses its own set of fault definitions, making it very difficult to compare and contrast the value of these services.
“Imagine if every doctor or hospital you visit uses its own diagnoses and clinical practices,” said Stein. “It would be very difficult and confusing to get a second opinion. Most mature industries standardize their definitions of faults and mitigation procedures. Solar PV is not quite there yet.”
Meanwhile, the introduction of machine learning techniques to automate fault detection in PV systems has further clouded the picture. Stein made it clear that he is not against machine learning. “Automated, ML-based fault detection is useful because it can lead to highly scalable analytics,” Stein said.
Rethink KPIs
Technical Key Performance Indicators (KPIs) are essential for evaluating PV power plants, from the development phase to contractual agreements between asset owners and O&M suppliers.
“Although the IEC and ASTM standards define a number of KPIs, such as performance ratio (PR) and capacity testing, their calculation methods often vary or rely on user interpretation, leading to uncertain results,” said Marios Theristis, a colleague of Stein’s at Sandia.
“For example, there is significant flexibility in the processing of data, which can introduce biases that impact contracts and financial decisions. Without clear KPI definitions and harmonized calculations, contracts can be unfairly influenced – not due to actual under- or over-performance, but due to biased calculations,” he added.
Bad data, meaning data that was collected improperly, contains errors, or has not been cleaned, is of little use. It creates problems for the PV stakeholders who rely on it to tell them the true performance of their systems.
Standard industry practice is for asset owners to contract an O&M vendor to handle operations and maintenance under an agreement stipulating that the power plant is properly maintained and that certain KPIs are met.
“If you have poor data quality, there will be some bias in the KPI estimate. This bias can be positive, but also negative. In a hypothetical scenario where the PR bias is negative, the O&M provider would unfairly have to pay penalties caused by data quality, rather than underperformance,” Theristis said.
This also works in reverse: an O&M supplier could benefit from bias caused by data quality rather than by actual plant overperformance. Theristis continued: “In some cases, the operational PR is compared to the pre-construction PR generated by PV design software, leading to an apples-to-oranges comparison: is the PR difference due to an overly optimistic pre-construction simulation or actual underperformance?”
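As a rough illustration of the kind of data-quality bias Theristis describes, the sketch below computes a performance ratio in the common final-yield-over-reference-yield form and shows how an irradiance sensor that under-reads by just 5% inflates the PR even though plant output is unchanged. All numbers and function names here are invented for the example; this is not Sandia's methodology.

```python
# Minimal PR sketch (hypothetical values, not Sandia's method).
# PR = final yield / reference yield, where reference yield is
# plane-of-array insolation divided by irradiance at standard
# test conditions (1,000 W/m^2).

G_STC_KW_M2 = 1.0  # irradiance at standard test conditions, kW/m^2

def performance_ratio(energy_kwh, p_stc_kw, insolation_kwh_m2):
    """Performance ratio from delivered energy, nameplate capacity,
    and measured plane-of-array insolation."""
    final_yield = energy_kwh / p_stc_kw               # kWh per kWp
    reference_yield = insolation_kwh_m2 / G_STC_KW_M2  # equivalent full-sun hours
    return final_yield / reference_yield

# A hypothetical 1 MWp plant over a month: 120 MWh delivered,
# true plane-of-array insolation of 150 kWh/m^2.
true_pr = performance_ratio(120_000, 1_000, 150.0)

# The same month, but a soiled pyranometer under-reads irradiance
# by 5%. Output is unchanged, yet the computed PR rises.
biased_pr = performance_ratio(120_000, 1_000, 150.0 * 0.95)

print(f"true PR:   {true_pr:.3f}")   # 0.800
print(f"biased PR: {biased_pr:.3f}")  # 0.842
```

The same mechanism works in the other direction: an over-reading sensor would depress the computed PR and could trigger penalties without any real underperformance.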
A program for transparency
Theristis has started a new program to address these issues, the PV O&M Analytics Collaborative (PVMAC). He introduced the project at the EU PVSEC conference in September 2024 in Vienna.
“Our goal is to create a collaborative network to improve transparency in PV analytics software and services, to collaborate with monitoring/analytics companies, O&M providers, asset owners and insurers, and to help these stakeholders work together to agree on what standards are needed and to ensure overall market transparency,” he said.
The team is also using Sandia’s supercomputers to run simulations across the United States to reduce uncertainties in KPI estimates.
PVMAC is still very new, but in the coming months the researchers will meet with industry to conduct interviews and tests, such as blind analytical comparisons, in which stakeholders are given real or synthetic performance data and asked to identify and classify faults and calculate KPIs.
“Blind comparisons are a great way to evaluate the state of the industry and the consistency between different providers.
“We expect these comparisons will reveal discrepancies in how different companies define faults and calculate KPIs,” Stein said.
“We plan to organize special sessions on O&M analysis, failures and KPI harmonization during our upcoming PVPMC workshops,” said Theristis.
Standardize performance
The two scientists hope that by shining a light on these inconsistencies, it will be easier to reach consensus on the steps needed for standardization. For example, an O&M supplier that offers a performance guarantee could be incentivized not to properly clean irradiance sensors, because a dirty pyranometer makes PV performance appear better than it actually is. Stein and Theristis’ goal is to make it possible to quickly determine why a PV installation is underperforming and to propose strategies to address the problems.
The stakes are high; even a small percentage of underperformance for a large PV installation results in significant financial losses. Stein thinks that measuring the performance of a PV system is not that different from measuring a person’s health.
“You collect data about your patient, for example vital signs and medical history. If these indicate a more serious problem, more tests are ordered. In medicine, at least, the standards are pretty much international, so no matter where in the world you go, you’ll be assessed in a similar way.
“I hope we can have a similar system with PV. For example, the oil and gas industry is standardized. If you go to an oil rig or a gas turbine anywhere in the world, the standards are similar,” he said.
Tangled web
Poor management of power plants can also cause problems. Theristis cautioned that while there are many O&M products and services available, asset owners may not have “quantitative knowledge” of the potential benefits of investing in data quality, O&M and analytical capabilities. “These solutions have not achieved transparency and lack independent validation,” he said.
Using supervised machine learning methods for classification is preferable to unsupervised methods, according to Stein, because the former allows you to tell the algorithm what the categories are.
“But many people opt for ‘unsupervised,’ which basically means you detect that there’s a problem without necessarily specifying what the problem is,” Stein said. “You can get caught up in clustering algorithms that say all these problems are somehow similar, and then if you use the same practice on multiple data sets you can end up with clustered data, but it’s clustered for different reasons.
“That’s one of the dangers of machine learning; it works for one dataset, but it’s hard to extend it to another dataset you haven’t trained it on.”
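Stein’s contrast can be sketched in a few lines of Python. In the supervised case below, a nearest-centroid classifier is trained on labeled examples, so its outputs are named fault categories; the unsupervised k-means run on the same pooled data produces only anonymous clusters. All fault names and feature values are invented for illustration.

```python
# Hypothetical sketch (synthetic data, invented fault names) of supervised
# classification vs. unsupervised clustering for PV fault analysis.
import math
import random

random.seed(1)

def dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def centroid(points):
    n = len(points)
    return (sum(p[0] for p in points) / n, sum(p[1] for p in points) / n)

# Synthetic daily features per fault: (fractional power loss, loss variability).
soiling = [(random.gauss(0.05, 0.01), random.gauss(0.02, 0.005)) for _ in range(30)]
string_outage = [(random.gauss(0.12, 0.01), random.gauss(0.02, 0.005)) for _ in range(30)]

# Supervised: the labels define the categories up front.
labeled_centroids = {
    "soiling": centroid(soiling),
    "string_outage": centroid(string_outage),
}

def classify(sample):
    """Nearest-centroid classifier built from labeled training data."""
    return min(labeled_centroids, key=lambda lbl: dist(sample, labeled_centroids[lbl]))

# Unsupervised: k-means finds two clusters, but they carry no fault names,
# and nothing guarantees the split follows the true fault categories.
def kmeans(points, k=2, iters=20):
    centers = random.sample(points, k)
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for p in points:
            groups[min(range(k), key=lambda i: dist(p, centers[i]))].append(p)
        centers = [centroid(g) if g else centers[i] for i, g in enumerate(groups)]
    return centers

anon_centers = kmeans(soiling + string_outage)

new_day = (0.11, 0.02)
print(classify(new_day))  # a named fault: 'string_outage'
# The unsupervised result is just an anonymous cluster index (0 or 1):
print(min(range(2), key=lambda i: dist(new_day, anon_centers[i])))
```

The supervised classifier carries its category names to any new data set; the cluster indices from k-means mean nothing outside the data they were fitted on, which is the portability problem Stein describes.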
If machine learning is to be used effectively for fault detection, the industry must first agree on how these faults are defined. Not all companies that sell and use AI or machine learning solutions to identify faults apply these methods consistently and transparently, Stein said.
“What we have seen is that the problem is that not all companies do the same thing. Some companies are doing really good work and have very valuable products, but there’s no one actually coming up with the standards and validation so that those companies can be valued appropriately, right?”
The scientist warned that the ‘AI hype’ could be to blame. “Many companies advertise that their software uses AI technology, but few are willing to describe exactly how it is used. And so each company has its own kind of ‘diagnostic guide’ for what can go wrong with the PV system,” he said.
“Artificial intelligence is sometimes just a buzzword.”