MeteoGroup scores high on verification: why aren’t we shouting about it?
MeteoGroup strives to deliver accurate forecasts. This mission requires a deep understanding of the problems and challenges customers face. Knowing the precise limits and margins for their specific activities means knowing what forecast parameters to deliver and how to measure quality.
Before you buy a lamp, you probably want to know how it looks, if it’s energy efficient, and whether it gives enough light. It’s a quick quality check to see if the features meet your needs.
Likewise, when you purchase a weather forecast, you want the same quality assurance. Will the promised features help your business? It’s an important decision, because an accurate forecast is not just reliable information; it helps you make confident business decisions.
Evelyn Müller, Verification Manager at MeteoGroup, explains why weather is so important to businesses: “Farmers might have a crop model running based on weather input. Winter road managers must make decisions about salting the roads. Offshore workers don’t want unnecessary time lost because of high waves.”
But how do you know that your weather forecast will deliver the insight you need? The key to quality checking lies in verification, which plays an important role in delivering accuracy. As Evelyn reveals, “I am very mathematical, and I like all the discussions we have here about numbers.”
Verifying in a thousand ways
Yes, it is all about math. Some weather parameters, such as the temperature forecast, are easy to verify. You might quantify the number of days when the temperature forecast for a certain location is right within plus or minus 2 degrees - a very acceptable range for a general forecast. We can then take a certain period and say, for example, that for 93% of the nights and days our forecast for this location was correct. Of course, you can do this for many locations and with different margins. It becomes more complex for elements that lack decent observations: you can only verify when you have reliable, professional observations. All big clients are obliged to have good on-site measurements, not least from a legal perspective, and MeteoGroup will use those. “And we profit from the really big pot full of professional observation data that we have as a weather company,” says Evelyn.
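The kind of hit-rate calculation described above can be sketched in a few lines of code. This is purely an illustrative example, not MeteoGroup's actual verification tooling, and the forecast and observation values are made up:

```python
# Illustrative sketch: the hit rate of a temperature forecast, counting a
# forecast as "correct" when it falls within a chosen margin (e.g. +/- 2
# degrees) of the observed value. Data below is invented for the example.

def hit_rate(forecasts, observations, margin=2.0):
    """Fraction of forecasts within +/- margin of the observation."""
    if len(forecasts) != len(observations):
        raise ValueError("forecasts and observations must align")
    hits = sum(1 for f, o in zip(forecasts, observations) if abs(f - o) <= margin)
    return hits / len(forecasts)

# Hypothetical minimum temperatures (deg C) for one location over a week.
forecast = [4.1, 3.0, -0.5, 2.2, 5.8, 6.0, 1.1]
observed = [3.5, 4.8, -1.0, 2.0, 8.2, 5.2, 1.3]

print(f"Hit rate at +/-2.0 deg: {hit_rate(forecast, observed):.0%}")
print(f"Hit rate at +/-2.5 deg: {hit_rate(forecast, observed, margin=2.5):.0%}")
```

Note how the score depends on the chosen margin: the same week of forecasts scores higher at plus or minus 2.5 degrees than at 2.0, which is exactly why raw percentages from different providers cannot be compared at face value.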
But how does one verify probabilities? Like ‘a 70 percent risk of a shower’ in a certain region? “Probabilistic forecast verification methods have already been developed by the meteorological community working under the umbrella of the World Meteorological Organization,” says Evelyn. “So, we don’t have to invent anything. We use what is scientifically recommended. But to keep it understandable, we try to translate it into user language.” Dennis Schulze, Chief Meteorological Officer at MeteoGroup, adds, “In a probabilistic forecast you cannot take one case and say the forecast is right or wrong. You will have to take many, many forecasts and then compare.”
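One widely used probabilistic verification measure from that scientific community is the Brier score. The sketch below is illustrative only (it is not MeteoGroup's method, and the data is invented): each case pairs a forecast probability, such as 0.7 for "70 percent risk of a shower", with the outcome, 1 if the shower occurred and 0 if not.

```python
# Illustrative Brier score sketch for probabilistic forecasts.
# 0 is a perfect score; 1 is the worst possible.

def brier_score(probabilities, outcomes):
    """Mean squared difference between forecast probability and outcome."""
    n = len(probabilities)
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / n

# As Dennis notes, a single case proves nothing -- the score only becomes
# meaningful over many forecast/outcome pairs. Invented example data:
probs    = [0.7, 0.1, 0.9, 0.3, 0.5, 0.8, 0.2, 0.6]
outcomes = [1,   0,   1,   0,   1,   1,   0,   1]

print(f"Brier score: {brier_score(probs, outcomes):.3f}")
```

A forecaster who always says 50 percent gets a mediocre score; sharp, well-calibrated probabilities score close to zero, which is what makes the measure useful across many cases rather than one.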
Black box – secret box?
“I think verification is meant to be the opposite of a black box, to create transparency of accuracy and quality towards ourselves and our customers. You need to explain in a lot of detail what kind of forecast you issue, what kind of observation you take and then the methodology to compare the two. A black box? Maybe because verification is mainly in a footnote,” says Dennis.
It is almost impossible to compare your own verifications with competitors’. Everyone keeps their analyses to themselves - understandably, because nothing is as easily misinterpreted as numbers. Dennis gives an example: “Suppose 92 percent of the minimum temperatures we forecasted were correct and our competitor says: ‘we had 95 percent correct’. It is not that plain and simple. Maybe they took a boundary of 2.5 degrees, where we took 2.0 degrees. Well, in that case, you cannot compare the results. We are very open with customers when we are in direct contact with them. However, if we just put numbers on our website, they could be misinterpreted and misused.” Evelyn adds, “It is very easy to tweak or manipulate the outcomes. There are so many important details and explanations of how it is done precisely. That is one of the reasons that we don’t publish the results. Because others can take out what they like. The devil is in the details.”
Verification process starts at the front end
MeteoGroup’s goal is to deliver excellent forecasts. This objective requires a good understanding of the problems and challenges our (potential) customers face. It is important to have extensive talks with the client to find out the exact limits and margins for their specific activities. From that moment onwards, it is possible to make a serious proposal about which forecasts to deliver and how, and to agree upon the verification process. All of this depends on which elements are crucial for the customer. “Our willingness to sign up to a specific Service Level Agreement is also a confirmation that we really trust the forecasts that we produce,” says Dennis. “We commit to a certain quality level. Like what we do for the BBC: as verifiers, we were at the table from the start of the negotiations.”
An enthusiastic Evelyn continues: “Our clients do comparisons between weather providers from time to time and share the results with us. We often win these comparative studies. This gives us a lot of confidence; we are one of the top players, and we win a lot of contracts because of these kinds of results. Usually, the client asks us not to shout it out, meaning we cannot publicly put the spotlight on ourselves.”
It turns out that verification is very often something just between MeteoGroup and the customer. But Dennis mentions another crucial point: “Evelyn and other team members have developed some very nice in-company tools to visualize verification of the database and the adjustments made by meteorologists. I dare to say that we are one of the few companies doing this: giving daily feedback to the whole company, and especially the meteorologists, on how their forecasts performed the previous day, so that they can think about what went well or badly. And then it’s all about improving, learning from that.”
This results in a continuous stream of discussion between forecasters, researchers, developers, verifiers, and all others concerned. As Dennis says, “The visualization might make people a bit uncomfortable, but we see ourselves as a big team. We all want improvements; we all want to be the best weather company in the world.”
Interested in learning more?
We work together with an international group of people dedicated to forecast verification, so we can easily check that what we do is scientifically sound.