Meet the Crunch Bunch

September 5, 2012

By Stephen Pritchard

Despite being intangible, and often invisible, business data is increasingly referred to as ‘the new oil’.

One of the first companies to appreciate its value was Tesco, which has been tracking the shopping habits of more than 15 million British families for almost two decades thanks to its innovative loyalty card scheme. Launched in 1995, the Clubcard helped Tesco catapult past Sainsbury’s to become the UK’s biggest supermarket chain and the third-largest retailer in the world.

Today there is hardly a business sector on the planet that does not mine data in the hope of unlocking new revenues. Facebook’s valuation, when it listed, was largely down to the mass of data it collects from its users.

But simply gathering and storing endless information does little to help a business: in fact, it will just add to its IT costs. A US study by business software company Oracle found that managers felt they were being deluged by data: the average volume they handle has increased by 86% in just two years.

As well as facing rising costs from managing more data, businesses are also failing to make proper use of it. As Peter Sondergaard, head of research at analysts Gartner, puts it: “Information is the oil of the 21st century; analytics is the combustion engine.”

In other words, the real value of business information comes not from collecting or storing it, but from the tools that enable businesses to examine their data and look for trends. They can then use this to come up with new ideas or develop new products.

But only a few companies are doing that today. Gartner, for example, says that 85% of business information is so-called unstructured data, including text, video and audio. Only one company in 10, however, has a formal role in the business for someone to manage that data and put it to work.

This comes with a cost. Again, according to Oracle’s survey, private-sector companies believe they are losing 13% of potential revenues by failing to make the most of the information explosion. And this is in part because most things to do with data are still cloaked in geek-speak.

This, says Lora Cecere, CEO of industry researchers Supply Chain Insights, is a hurdle when it comes to turning data into profit. As a respondent to one of her surveys pointed out recently: “Most business people do not understand this, it is still IT-speak for now.” But it is not a trend that companies can afford to ignore.

“What is driving change is the availability of a lot of data,” says Andy Fano, who leads the data scientist team within the analytics practice at Accenture, the consulting firm. “There are technologies available that make processing those data possible. But what it is about is understanding the signals that are present in the data. And that is not just traditional data, coming from transactions, but information from sensors, from social media, or unstructured information such as text, images, audio and video.”

Tesco doesn’t just use its Clubcard to see who is buying a particular brand of toothpaste. It has used the data it collects to develop a raft of entirely new businesses, such as financial services.

There are many other examples of data being used to hone businesses. Virgin Atlantic has used data gathered from years of flying passengers to the US to change the timings of some of its less busy flights, making them more convenient for travellers with onward connections. This has transformed some of its less popular routes into some of its best revenue earners.

Other technology is even more advanced. Accenture, for instance, has used cameras in supermarkets to help retailers better understand where to place goods on the shelves. Yet the quest for information is hardly over. “In retail, if you look at transactions over the last 10 years, it feels like you have a lot of data,” says Accenture’s Fano. But when it comes to predicting what, say, an individual shopper might do, there is no such thing as too much data.

“If a customer visits a department store six times in 10 years, and visits the menswear department, all the information you might have could be that they bought an item at this price. If you’re lucky, you would know it was a shirt.” That information on its own says little about that customer, or his or her spending patterns or desires. “That is where the volume of data is deceptive,” warns Fano.

All told, experts who can crunch the numbers, unlock the secrets of companies’ vast databases and explain the data’s meaning to senior managers are highly prized.

And helping managers make sense of data is a lucrative business. By no means all the companies mining and analysing data are new: some have done so, out of the limelight, for years. If data is indeed the new oil, here are some of the explorers.

Andres Reiner

CEO

PROS

New York-listed software company PROS specialises in price optimisation. It started in the airline business more than 30 years ago, but has since moved into areas such as hotels, business-to-business services and retailing: anywhere goods or services are perishable or have a fixed shelf life. Andres Reiner, its CEO, says businesses often have plenty of data, but struggle to translate that into improved profitability.

“Companies have invested a lot in data and creating ERP (enterprise resource planning) and CRM (customer-relationship management) solutions to leverage that data,” he says. “Technology allows them to use transactional data with other external sources to drive their strategy. But a lot of companies have lost their ability to understand how raw-material price changes or currency fluctuations impact their business. They need better tools to guide them on where and when to make changes.”

Companies should be looking for patterns in the numbers: changes in the market that might affect pricing. But this means more than just looking at their own sales or reservations systems – they must look at competitor data too. And they need to do more to bring marketing and sales, and purchasing, together. “We’re trying to bring them the data they need to make the right decisions,” he says.

Sometimes, looking at the data produces startling results. One firm found that the difference between the highest and lowest prices paid for its products was 70%. And sales teams do not always tie pricing to profitability. “How do you ensure a customer who is driving more business for you pays a better price than one who drives less business?” asks Reiner. “When any sales rep can change a price, you may be setting a price that means you are losing money on that business – but you will never know. Businesses must leverage the technology they have to understand how the market is changing and how their costs are changing, so they can make real-time decisions on where you need to price to win business profitably.”
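
As a rough illustration of the kind of pattern-hunting Reiner describes, the Python sketch below measures the spread between the highest and lowest prices realised for a product and flags deals where price and volume look out of step. It is a simplified sketch built on invented transaction records, not PROS’s software.

```python
# Illustrative only: a toy price-dispersion check, not PROS's software.
from collections import defaultdict

# Hypothetical transaction records: (product, customer, unit_price, units_sold)
transactions = [
    ("valve-A", "Acme",  10.0, 500),
    ("valve-A", "Birch", 17.0,  40),
    ("valve-A", "Cobb",  16.5, 450),
    ("valve-A", "Drax",  11.0,  60),
]

by_product = defaultdict(list)
for product, customer, price, units in transactions:
    by_product[product].append((customer, price, units))

for product, rows in by_product.items():
    prices = [price for _, price, _ in rows]
    spread = (max(prices) - min(prices)) / min(prices) * 100
    print(f"{product}: {spread:.0f}% spread between highest and lowest price paid")

    # Flag deals where price and volume look out of step.
    median_units = sorted(units for _, _, units in rows)[len(rows) // 2]
    for customer, price, units in rows:
        if units >= median_units and price >= max(prices) * 0.95:
            print(f"  {customer}: high volume but near-top price - review terms")
        elif units < median_units and price <= min(prices) * 1.05:
            print(f"  {customer}: low volume but near-bottom price - margin risk")
```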

Industrial customers, which have traditionally updated price lists once a year, if that, are being hit hard by more volatile raw-material pricing.

They could, Reiner suggests, learn from fast-moving consumer goods (FMCG) retailers or airlines and price more dynamically. “The consumer markets are much more advanced when it comes to using this data, because they’ve needed to be. Business-to-business companies may not always have seen the volatility we’ve seen over the last few years.” Regular, small price increases might be better for both a business and its customers, as the costs are easier to absorb. “Understanding how your profitability changes over time is very important,” says Reiner. “And businesses need to be more surgical: you are not changing all of your price lists, but changing them where it matters.”

Jim Goodnight

CEO

SAS Institute

Dr Jim Goodnight co-founded data analytics firm SAS Institute in 1976.

Still privately held, the company counts governments, global banks and big-name retailers among its customers.

For three decades, analytics remained a largely specialist branch of IT, used in areas such as banking or intelligence analysis. But that has changed. “Execs realise that analytics can be a key factor in improving their bottom line,” says Goodnight.

“A bank needs to make decisions on who to lend money to. By building predictive models, we can compute the probability that someone will repay their loan. With that information, the bank can decide whether it is worth taking the risk or not. We score credit card usage, so when you use a credit card we are at the other end working out whether the card is being used fraudulently.

“We’ve been in the high-end analytical space for 35 years. But more and more companies are using analytics, and we are helping to create new solutions for companies that maybe don’t have the analytical talent themselves – we will host the data and do the work.” HSBC is one such customer: SAS Institute develops algorithms for the bank to use in its analytics.
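
Goodnight’s description of credit scoring comes down to a probability model. The sketch below is a deliberately simplified, hypothetical version: a logistic scorecard with invented coefficients that converts a few applicant attributes into an estimated repayment probability. Real scorecards are fitted to millions of historical loans; this only shows the shape of the calculation, not SAS Institute’s methods.

```python
import math

# Hypothetical scorecard coefficients; a real model would be fitted
# to millions of historical loan outcomes.
INTERCEPT = -1.0
WEIGHTS = {
    "income_to_debt": 1.8,     # more income relative to debt -> more likely to repay
    "years_at_address": 0.15,  # stability proxy
    "missed_payments": -0.9,   # past arrears -> less likely to repay
}

def repayment_probability(applicant):
    """Logistic model: estimated probability the applicant repays the loan."""
    score = INTERCEPT + sum(WEIGHTS[k] * applicant[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-score))

applicant = {"income_to_debt": 2.1, "years_at_address": 6, "missed_payments": 1}
p = repayment_probability(applicant)
print(f"Estimated repayment probability: {p:.2f}")
print("Approve" if p > 0.8 else "Refer for manual review")
```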

The current trend for businesses to use ‘big data’ is not, though, a guarantee of improved profits. “Often the problem is not so much the analytics but getting it all together and getting the data cleaned up,” he says. “There are always bugs in data: someone has put down an address incorrectly or misspelled a name. But with high-performance analytics, once the data is cleaned up, we can compute models in a matter of minutes and address huge problems we would not have been able to dream of solving just a few years ago.”
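
The clean-up Goodnight mentions is mundane but essential. The hypothetical snippet below shows the flavour of it: normalising names and postcodes, then flagging records that are probably the same customer entered twice. It illustrates the idea rather than any vendor’s actual tooling.

```python
from difflib import SequenceMatcher

# Hypothetical customer records with typical bugs: stray spaces,
# inconsistent case, a misspelled name.
records = [
    {"name": "Jane Smith ", "postcode": "SW1A 1AA"},
    {"name": "JANE SMYTH",  "postcode": "sw1a 1aa"},
    {"name": "Tom Jones",   "postcode": "CF10 1EP"},
]

def normalise(rec):
    """Tidy up the obvious formatting problems before any analysis."""
    return {
        "name": " ".join(rec["name"].split()).title(),
        "postcode": rec["postcode"].upper().replace(" ", ""),
    }

clean = [normalise(r) for r in records]

# Flag likely duplicates: same postcode and very similar names.
for i in range(len(clean)):
    for j in range(i + 1, len(clean)):
        a, b = clean[i], clean[j]
        similar = SequenceMatcher(None, a["name"], b["name"]).ratio() > 0.8
        if a["postcode"] == b["postcode"] and similar:
            print(f"Possible duplicate: {a['name']!r} / {b['name']!r}")
```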

Demands such as credit card processing have prompted SAS Institute to focus on performance: the smartest algorithms are of no use if a shop customer has to wait several minutes for approval. Those lessons, Goodnight says, are now being applied to other businesses.

“We have been successful at uncovering really fast ways to do things,” he adds, explaining that this in turn allows the software to run on more modest equipment. As a result, a wider range of companies can use the service.

“Retail is one of the last spaces to turn to analytics in a big way,” he says. “We are working with [US department-store chain] Macy’s to forecast the shelf life of each item in their stores – if it looks like it will be on the shelf for more than a season, we will suggest markdowns. We do that for 270 million items each Monday morning.”
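
A toy version of that markdown logic makes the idea concrete. The sketch below, which uses invented stock figures and is not SAS Institute’s forecasting system, estimates how long an item will sit on the shelf from its recent sales rate and suggests a markdown if it looks set to outlast the season.

```python
# Illustrative only: a toy markdown rule, not SAS Institute's forecasting model.
SEASON_WEEKS_REMAINING = 8  # hypothetical: weeks left in the selling season

# Hypothetical items: units on hand and average units sold per week recently.
items = [
    {"sku": "coat-123",  "on_hand": 400, "weekly_sales": 20},
    {"sku": "scarf-456", "on_hand": 90,  "weekly_sales": 30},
]

for item in items:
    # Naive forecast: assume the recent weekly sales rate continues unchanged.
    weeks_to_sell_out = item["on_hand"] / max(item["weekly_sales"], 1)
    if weeks_to_sell_out > SEASON_WEEKS_REMAINING:
        # Rule of thumb: the bigger the overhang, the deeper the markdown (capped at 50%).
        overhang = weeks_to_sell_out / SEASON_WEEKS_REMAINING
        markdown = min(0.5, 0.1 * overhang)
        print(f"{item['sku']}: ~{weeks_to_sell_out:.0f} weeks of stock, "
              f"suggest a {markdown:.0%} markdown")
    else:
        print(f"{item['sku']}: on track to sell through this season")
```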

Gordon Rugg

Founder

Search Visualizer

Academic Dr Gordon Rugg, a psychologist and computer-science lecturer at the UK’s Keele University, set up Search Visualizer to help researchers and businesses make sense of large amounts of information. This might be results from a search engine, where the problem is sifting through thousands of hits, or looking for specific terms within a document. In layman’s terms, it helps users find needles in a haystack of data.

“Most people tackle big data with one of two approaches. One is to make software replicate how humans operate – the ‘semantic web’ approach. The other is to do what humans are bad at: statistical cluster analysis,” says Rugg.

“The problem with the semantic approach is that the hardware isn’t yet up to it. Nor is the software. What we are doing is representing data in a way that plays to the strength of the human mind’s ability to process it: making sense of huge amounts of natural language text but also quantitative information.”

The technology works by turning data into a pattern of coloured dots. The company says it can search through an entire Shakespeare play, and put the results for just a single word on one page. But there is a more business-focused application.
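
A rough approximation shows how such a view can work. The sketch below, which is not Search Visualizer’s own code, compresses a document into a grid of characters: one symbol per chunk of ten words, marked where the chunk contains the search term, so even a long text fits on a single screen.

```python
import re

def dot_map(text, keyword, words_per_dot=10, width=40):
    """Render a document as a grid: '#' where a chunk of words contains
    the keyword, '.' where it does not."""
    words = re.findall(r"[a-z']+", text.lower())
    dots = []
    for i in range(0, len(words), words_per_dot):
        chunk = words[i:i + words_per_dot]
        dots.append("#" if keyword.lower() in chunk else ".")
    rows = ["".join(dots[i:i + width]) for i in range(0, len(dots), width)]
    return "\n".join(rows)

# Hypothetical input: a short passage repeated to stand in for a long document.
sample = (
    "To be, or not to be, that is the question: whether 'tis nobler in the mind "
    "to suffer the slings and arrows of outrageous fortune, or to take arms "
    "against a sea of troubles and by opposing end them. "
) * 20

print(dot_map(sample, "fortune"))
```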

“The companies we are talking to are drowning in data. They want to make sense of it. One customer is looking at patent searches, and searching for patents in languages they don’t speak. The current technology is too slow. But we can identify which records are likely to be relevant and draw up a shortlist of documents to be translated.”

Another application is monitoring social media for comments. “Say you want to know what’s being said about a company. You don’t need a lot of training to do that, and you can scroll through hundreds of records in a few minutes to get a general feeling for whether sentiment is positive or negative. Then you can drill down into what people are saying – the words that they are using – whilst you are at your desk. You just reframe the question with a greater degree of precision.

“When we show it to people, their eyes widen. Within minutes they can be up and running and using it at a ‘power user’ level. There is a lot you can do with this that you can’t do with conventional search technology: you can see a large amount of information at a glance. We see this being used by medium to large, nimble companies that are small enough to react quickly and flexible enough to adopt new ideas.”

Neil Thomson

CTO

Microgen

Dr Neil Thomson carried out academic research on memory and reasoning at Cambridge before moving into information technology.

He founded his own company, developing business-rules software, which he sold to Microgen. There, he built the firm’s Aptitude technology, which deals with large volumes of business data.

Businesses cannot derive insights from data if they cannot process it in a joined-up manner, he says. “In classical statistics, you are not just looking randomly for patterns. It is always going to be difficult to look through large amounts of data without an a-priori hypothesis.”
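
Thomson’s point can be illustrated with a small, invented example: decide in advance that a promotion should have lifted average order values, then use a permutation test to check whether the observed lift could plausibly be chance.

```python
import random

random.seed(1)

# Hypothesis, decided in advance: the spring promotion lifted average order value.
# Hypothetical order values, in pounds, before and during the promotion.
before = [24, 31, 19, 27, 22, 30, 25, 28, 21, 26]
during = [29, 35, 27, 33, 31, 38, 30, 34, 28, 32]

observed = sum(during) / len(during) - sum(before) / len(before)

# Permutation test: how often would a lift this large arise by chance alone?
pooled = before + during
trials = 10_000
extreme = 0
for _ in range(trials):
    random.shuffle(pooled)
    diff = sum(pooled[10:]) / 10 - sum(pooled[:10]) / 10
    if diff >= observed:
        extreme += 1

print(f"Observed lift: {observed:.1f} pounds; p-value ~ {extreme / trials:.3f}")
```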

The challenge is to design a system that is efficient and fast enough to cope with huge volumes of data, but simple enough to be configured by business users rather than IT people or specialist analysts.

“I can write something that goes fast in a low-level language, but that doesn’t correspond to any business language,” says Thomson. “We try to create something people can use that isn’t opaque code.”

The Aptitude software aims to help companies make more of their data by being as intuitive as possible but still powerful. The idea is that by making data analytics more accessible, companies can run more searches and queries, and make better use of the information they hold. After all, data, Thomson concedes, is worth little on its own.

“More data doesn’t necessarily lead to better decisions,” he says. “Ideas come from creativity; data lets you check those ideas. If you look at most standard business processes, ‘big data’ is simply more data.” And ever-larger databases aren’t the answer. “The issue is not having the data, it’s having the questions. Business users need to be able to ask the right questions.”
