When Big Data Isn’t an Option
Companies that only have access to “little data” can still use that information to improve their business.
An advertising agency met with a client—who happened to be a U.S. Marine Corps colonel—and the conversation turned to the topic of reliable data. “Look,” said the colonel, “if I’m on a battlefield trying to defend a hill and I get a piece of intelligence, even if I’m not 100 percent sure that it’s accurate, I will make decisions based on that intelligence.” He strongly believed that it’s better to have some information than none—and that you’d be a fool to disregard it just because it falls short of being definitive. One could say that the colonel was a proponent of “little data.”
There is, of course, a great deal of discussion about the potential of “big data,” the high-volume, high-velocity, high-variety information assets that require new forms of data processing to enable companies to make better decisions and operate more efficiently. Giant data sets are being created by aggregates of individuals’ behavior (on social media sites such as Twitter and Instagram, for example), by transaction logs, and by automated information-sensing devices. Companies are increasingly mining these data sources to understand more about their customers’ behavior and preferences, and even to anticipate stock market movements. Early successes by a few companies have caused others to start investing in the infrastructure, software, and talent required to mine big data.
There is, however, one important caveat. Many companies—probably most—work in relatively sparse data environments, without access to the abundant information needed for advanced analytics and data mining. For instance, point-of-sale register data is not standard in emerging markets. In most B2B industries, companies have access to their own sales and shipment data but have little visibility into overall market volumes or what their competitors are selling. In highly specialized or concentrated markets, such as parts supply to automakers, companies have only a handful of potential customers. These companies have to be content with what might be called little data—readily available information that companies can use to generate insights, even if it is sparse or of uneven quality. For such companies, the Marine Corps colonel's words will resonate more than the latest data-mining algorithm or social listening platform.
Several commentators have made the point that the implications of big data go beyond new data sources, analytical techniques, and technology. Rather, a paradigm shift—away from management based on gut feelings and toward data-driven decision making—is already under way, and accelerating. The shift is so profound that companies lacking complete or clean market data can no longer use this deficit as an excuse to rely on the status quo. They must make a concerted effort to use the data that is available to them (imperfect as it may be) or to explore innovative, low-cost ways to create new data.
In one example, a large beverage manufacturer wanted to improve its sales to bars, restaurants, and entertainment venues. For years, this company had been buying syndicated data from an established source, which covered more than 100,000 establishments. Unfortunately, the data was collected and structured to serve a broad set of clients and featured a standard segmentation scheme that did not give the beverage company enough insight into how to serve different segments. So the company decided to adopt a series of little data techniques to come up with a solution customized to its needs.
It started with observational research, visiting bars and restaurants and qualitatively cataloging the clientele and their consumption patterns. Synthesizing this information resulted in more actionable segment definitions. The next step was to quantify the segmentation—determining how many establishments fell into each segment. The beverage manufacturer developed a classification algorithm based on observable characteristics, then asked its sales professionals to use it to classify all the bars and restaurants in their territories. (This is a classic little data technique: filling in the data gaps internally.) Finally, for each major segment, the company designed tailored product assortments, pricing, and marketing programs. Pilot projects in two large cities have shown significant lifts in total sales and share penetration, and the company is now rolling out the initiative nationwide.
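To make the approach concrete, the sketch below shows what such a rules-based classification might look like in Python. The observable attributes, thresholds, and segment names are hypothetical illustrations; the article does not disclose the beverage company's actual criteria.

```python
# A minimal sketch of the kind of rules-based classification the beverage
# company asked its sales force to apply. The attributes, thresholds, and
# segment names below are hypothetical, not the company's actual criteria.
from dataclasses import dataclass

@dataclass
class Venue:
    seats: int
    has_live_music: bool
    avg_drink_price: float

def classify(venue: Venue) -> str:
    """Assign a venue to a segment from a few observable characteristics."""
    if venue.has_live_music and venue.seats > 150:
        return "high-energy entertainment"
    if venue.avg_drink_price >= 12.0:
        return "premium cocktail bar"
    if venue.seats <= 50:
        return "neighborhood bar"
    return "casual dining"

# Example: a sales rep records what can be observed on a quick walk-through.
print(classify(Venue(seats=40, has_live_music=False, avg_drink_price=6.5)))
```

The point of such a scheme is not statistical sophistication but consistency: any rep can apply the same observable criteria, so the internally gathered classifications can be aggregated across territories.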
Other companies have used little data successfully as well. In one case, a maker of industrial coating products had limited data on pricing broken down by customer and region. As a result, it couldn’t build robust price elasticity models using classical regression analysis. By using other analytical techniques, however, the company was able to identify specific areas in which it could improve pricing and service policies. It moved to a value-based pricing approach to ensure its most profitable customers were receiving the highest service levels. Implementation in one business unit in one region alone yielded a 4 percent increase in sales.
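For reference, the sketch below illustrates the classical log-log elasticity regression that the coatings maker lacked the data to run reliably. The figures are invented purely for illustration, and this is not the alternative technique the company ultimately used.

```python
# Minimal sketch of a classical log-log price elasticity regression -- the
# approach the coatings maker's sparse pricing data could not support.
# The price and volume observations below are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.DataFrame({
    "price":  [9.5, 10.0, 10.5, 11.0, 11.5, 12.0],   # illustrative data only
    "volume": [520, 500, 470, 455, 430, 410],
})

X = sm.add_constant(np.log(df["price"]))   # log(price) as the regressor
y = np.log(df["volume"])                   # log(volume) as the response

model = sm.OLS(y, X).fit()
elasticity = model.params["price"]         # slope of a log-log model is the elasticity
print(f"Estimated price elasticity: {elasticity:.2f}")
```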
In another instance, a regional health insurance company trying to differentiate itself through outstanding customer experience realized that its call center was a potential source of data about customer pain points and potential solutions. The company took full transcripts of the calls—not just the summaries entered by service representatives—and applied available text-mining algorithms. From this data, the company was able to improve the format and language of its written communications, and streamline the call-center process. In addition, it uncovered an opportunity to introduce storefront locations in certain neighborhoods in order to improve its customer interactions and increase customer retention rates.
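As a rough illustration of the kind of off-the-shelf text mining the insurer applied, the sketch below pulls recurring topics out of a handful of made-up transcript snippets using TF-IDF features and non-negative matrix factorization. The article does not specify which algorithms the company actually used.

```python
# A minimal sketch of off-the-shelf text mining on call-center transcripts,
# in the spirit of the insurer example. The transcripts and topic count are
# hypothetical; the article does not name the algorithms the company used.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import NMF

transcripts = [
    "I could not understand the explanation of benefits letter you sent me",
    "The letter about my claim was confusing and I had to call twice",
    "I waited a long time on hold before reaching a representative",
    "Is there an office near me where I can talk to someone in person",
]

# Convert transcripts to TF-IDF features, dropping common English stop words.
vectorizer = TfidfVectorizer(stop_words="english")
tfidf = vectorizer.fit_transform(transcripts)

# Factor the document-term matrix into a small number of recurring "topics"
# (clusters of co-occurring words) that flag common customer pain points.
nmf = NMF(n_components=2, random_state=0)
nmf.fit(tfidf)

terms = vectorizer.get_feature_names_out()
for i, component in enumerate(nmf.components_):
    top_terms = [terms[j] for j in component.argsort()[-5:][::-1]]
    print(f"Topic {i}: {', '.join(top_terms)}")
```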
Even large companies are able to make use of little data techniques. The Chinese large-appliance giant Haier uses information gathered by service technicians to drive innovation. In the late 1990s, some technicians, for example, found that rural customers were using their washing machines to wash vegetables, leading to clogs. Haier used this information to develop a new type of washer, which the company says is “mainly for washing clothes, sweet potatoes, and peanuts.”
With the right mind-set, virtually all sources of information can be exploited to improve products, the customer experience, or a company’s profits. Little data techniques, therefore, can include just about any method that gives a company more insight into its customers without breaking the bank. As the examples above illustrate, mining little data doesn’t mean investing in expensive data acquisition, hardware, software, or technology infrastructure. Rather, companies need three things:
• The commitment to become more fact-based in their decision making. This commitment is often spurred by a sense that competition is heating up or that the company is falling behind changing customer habits and preferences. But fact-based decision making can be an important source of competitive advantage even for market-leading companies.
• The willingness to learn by doing. Since little data applications are not commercially available via third parties, companies have to use trial and error. However, once a few priorities have surfaced, a series of pilot projects will give the company useful experience and, with a little luck, some early successes that can inspire the rest of the organization.
• A bit of creativity. To generate richer data, companies need to get creative, in part by tapping into the customer interactions that take place naturally. For instance, retailers can intercept shoppers in store locations for quick iPad-assisted surveys. Any website with a registration form can add questions that reveal preferences beyond the basic data usually collected. Call-center conversations are another opportunity to gather data on a particular topic, and the text can be mined for greater insight into the customer. Some companies create advanced user panels of savvy customers to get input during the R&D process for new products. Others rely on their sales representatives to report trends in customer preferences and competitors’ activities. The bottom line: Companies have to put in the extra effort required to capture and interpret data that is already being generated.
Companies often start the journey by picking a product, a region, and a problem that needs attention and running one or more pilot projects. This allows executives to demonstrate to themselves and the rest of the organization that the return on effort and cost is justified. Once companies start investing in analytics, they almost never stop, because the things they learn drive improvements in the business that more than pay for the analysis. The activity becomes self-funding. In some cases, companies that start with little data end up recognizing the value of the resulting insights and expanding their investment to incorporate larger data sets and more advanced analytics. For others, little data is all that’s needed. In either case, the benefits are clear: Executives get insight into what they can do to improve their competitive position, or—to put it in terms that a Marine Corps colonel might appreciate—identify what might be charging up the hill to surprise them. It’s hard to put a price tag on that.
Reprint No. 00250
Author profile:
- David Meer is a partner with Strategy&’s consumer and retail practice, and is based in New York.