
“Little Data” Matters, Too

Big data is an invaluable source of insight, but companies with limited access to analytics can still find information to improve their business.

(originally published by Booz & Company)

Back when I was working at the advertising agency JWT, one of our clients—a U.S. Marine Corps colonel—said something that has stuck with me ever since. “Look,” he said, “if I’m on a battlefield trying to defend a hill and I get a piece of intelligence, even if I’m not 100 percent sure that it’s accurate, I will make decisions based on that intelligence.” His point was that it’s better to have some information than none—and that you’d be a fool to disregard it just because it fell short of being definitive.

There is, of course, a great deal of discussion about the opportunities big data gives companies for greater customer insight and operational efficiency. But many companies, if not most of them, work in relatively sparse data environments: in emerging markets, B-to-B industries, highly specialized or concentrated markets, and the like. These companies have to be content with what I would call “little data.” For them, the colonel’s words will resonate more than the latest data-mining algorithm or social-listening platform.

In my last blog post, I made the point that the implications of big data go beyond new data sources, analytical techniques, and technology. Rather, a paradigm shift—away from management based on gut feelings and toward data-driven decision making—is already underway, and accelerating. For companies operating without complete or clean market data, this means making a concerted effort to get more out of the data that is available to them, imperfect as it may be, or finding low-cost ways to create new data.


Here are several good examples of companies that have done this successfully:

• A maker of industrial coating products had limited data on pricing by customer and region, so it couldn’t build robust price elasticity models using classical regression analysis. By using other analytical techniques, however, the company was able to identify concrete areas to improve pricing and service policies. It moved to a value-based pricing approach to ensure its most profitable customers were receiving the highest service levels. Implementation in one business unit in one region alone yielded a 4 percent increase in sales.

• A large beverage manufacturer wanted to improve its sales to bars, restaurants, and entertainment venues. The available syndicated data was built on a standard segmentation scheme that did not provide enough insight into how to serve different segments. The company used observational research to define more actionable segments, but it needed a way to quantify the segmentation. So it developed a classification algorithm based on observable characteristics and, in a classic little-data move, asked its sales professionals to classify every bar and restaurant in their territories against it (a simple sketch of this kind of rule appears after this list). Tailored product assortments, pricing, and marketing programs were then developed for each major segment. Pilot projects in two large cities have shown significant lifts in total sales and share penetration, and the initiative is being rolled out nationally.

• A regional health insurance company trying to differentiate itself through outstanding customer experience realized that its call center was a potential source of data about customer pain points and potential solutions. The company took full transcripts of the calls, not just the summaries entered by service representatives, and applied readily available text-mining algorithms (a simple sketch of this kind of analysis also follows this list). As a result, it was able to improve the format and language of its written communications and streamline the call center process. In addition, the company uncovered an opportunity to open storefront locations in certain neighborhoods to facilitate customer interaction and improve retention rates.

• The Chinese large-appliance giant Haier uses information gathered by service technicians to drive innovation. Some technicians, for example, had found that rural customers were using their washing machines to wash vegetables, leading to clogs. Haier used this information to develop a new type of washer, which the company says is “mainly for washing clothes, sweet potatoes, and peanuts.”
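
To make the beverage manufacturer's segmentation example more concrete, here is a minimal sketch, in Python, of the kind of rule-based classifier a field sales force could apply. The venue attributes, thresholds, and segment names are illustrative assumptions, not the company's actual scheme.

```python
# Illustrative sketch of a rule-based venue classifier built from a few
# observable characteristics. Attributes, cutoffs, and segment names are
# assumptions for demonstration only.
from dataclasses import dataclass

@dataclass
class Venue:
    name: str
    seats: int            # approximate seating capacity, observed by the rep
    food_share: float     # rough share of revenue from food vs. drinks (0 to 1)
    premium_priced: bool  # are drinks priced above the local average?

def classify(venue: Venue) -> str:
    """Assign a venue to an illustrative segment using simple, observable rules."""
    if venue.premium_priced and venue.food_share < 0.3:
        return "upscale bar"
    if venue.food_share >= 0.5:
        return "casual dining"
    if venue.seats > 200:
        return "entertainment venue"
    return "neighborhood bar"

# A sales rep walking a territory records a handful of observations per venue
# and gets a consistent segment label back.
territory = [
    Venue("The Tap Room", seats=60, food_share=0.2, premium_priced=True),
    Venue("Family Grill", seats=120, food_share=0.7, premium_priced=False),
    Venue("Arena Lounge", seats=350, food_share=0.3, premium_priced=False),
]
for v in territory:
    print(v.name, "->", classify(v))
```

The point of keeping the rules this simple is that a sales professional can apply them consistently in the field, which is what turns observational research into quantified, usable data.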

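Similarly, for the call-center example, the sketch below shows one low-cost way to surface recurring themes in call transcripts: counting frequent terms after removing common words. The sample transcripts and stopword list are invented for illustration; the insurer's actual text-mining approach is not described in the article.

```python
# Illustrative sketch of mining call-center transcripts for recurring themes.
# Transcripts and stopwords are invented; a real project would likely use an
# off-the-shelf text-mining toolkit rather than raw term counts.
from collections import Counter
import re

transcripts = [
    "I don't understand the explanation of benefits letter you sent me",
    "The letter about my claim denial was confusing and I had to call twice",
    "I was on hold for twenty minutes before anyone could explain my bill",
]

STOPWORDS = {"i", "the", "a", "an", "and", "of", "to", "was", "my", "me",
             "you", "for", "had", "don't", "about", "before"}

def term_counts(texts):
    """Count non-stopword terms across transcripts to surface frequent themes."""
    counts = Counter()
    for text in texts:
        for word in re.findall(r"[a-z']+", text.lower()):
            if word not in STOPWORDS:
                counts[word] += 1
    return counts

# Frequent terms such as "letter", "explain", and "confusing" would point
# toward written communications and hold times as candidate pain points.
print(term_counts(transcripts).most_common(5))
```
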
None of these examples involved expensive hardware, software, or technology infrastructure. Data acquisition costs were low, and in some cases nonexistent.

What is required to take advantage of such little data is a bit of creativity and a willingness to learn by doing. Pick a product, a region, and a problem that needs attention, and run a pilot project. In this way, you can demonstrate to yourself and the rest of your organization that the return justifies the effort and cost. My observation has been that once companies start investing in analytics, they almost never stop, because what they learn drives improvements in the business that more than pay for the analytics. It becomes self-funding. Executives gain insight into what they can do to improve their competitive position, or—to put it in terms that a Marine Corps colonel might appreciate—identify what might be coming up the hill to surprise them. It’s hard to put a price tag on that.

David Meer

David Meer is a thought leader on consumer insights and marketing analytics, with a special focus on the retail and consumer sectors at Strategy&, PwC’s strategy consulting group. Based in New York, he is a principal with PwC US.

 