

Data trust gap confronts analytics -- time to open the 'black box'

C-suite honchos sign off on analytics tool purchases and then wonder what they've wrought. But data trust can build if practitioners make the analytics process more transparent, consultant Bill Nowacki says.

Sometimes big data analytics seems stuck at the starting gate, blocked by a data trust gap. Business leaders see new data analytics as transformational, but they aren't sure how trustworthy it is. A recent report from consulting and professional services company KPMG shed light on the dilemma.

According to a survey of 2,165 data and analytics decision makers conducted for KPMG by Forrester Research in July 2016, data and analytics tools are widely used to analyze existing customers (50%) and to find new ones (48%). However, only about 34% of business leaders are "very confident" about the insights the tools reveal about business operations, according to the survey.

Investment in data and analytics tools is increasing, so confidence in their results ought to be higher than it is, according to Bill Nowacki, a data analytics veteran who is KPMG's managing director for decision science in the U.S. In an interview with SearchDataManagement, Nowacki said more transparency is needed. That means showing executives what's behind the recommendations that analytics tools are making.

The research seems to highlight a data trust gap. What's been the trend in confidence in data quality?

Bill Nowacki: There is a core belief that has emerged that analytics are essential -- but, at the same time, we still haven't gotten the full onboarding from top brass, and the industry needs to take certain steps to shore that up. Part of that is being more transparent. Let's step back and look at the evolution.

The whole acceptance of analytics, if you were to look at it graphically, is shaped like a U. If you go back a few years, it was really the Wild West, and people were building analytical models that were brought online quickly without much scrutiny. They were incorporated into routine decision making. And that worked for a while.

But then we saw that certain decisions were not made optimally. The veracity of the models was challenged. Now, we're at the bottom of the U and we're seeing an awakening -- one that says it is really now time to take a more serious view on this. That's in terms of which data we're using, where they're coming from, whether they're curated correctly and so on.


The industry saw predictive models being used, especially in terms of marketing and pricing. People were experimenting with optimization or asset management. We saw tons of this. But as they [became] mission-critical to companies, it became time to make sure they were built to last. We're starting to see things pop up on the other side of the U, but with a lot more structure, governance, discipline and, I think, a call for transparency.

The transparency doesn't get easier with the emerging use of machine learning for predictive analytics. It isn't easy to see into those kinds of 'black boxes' to help boost data trust.

Sure, but there are ways in which that is being chipped away. Think about the credit industry. If you look back at FICO, there was a period in the 1980s and 1990s when they were using neural networks and getting good, highly predictive credit scores. But what was lost in that was explainability.

Regulators came in and told them, 'If you're turning people down for credit, you have to be able to explain why.' It was the Wild West; the models were born and then people said we need more transparency. We need to understand why the engines are saying what they're saying.

If it's a black box, it doesn't engender trust. So, you make a choice when designing, saying, 'I'm going to use methods that reduce opacity.' That begins to foster some modicum of trust.
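Nowacki doesn't name a specific technique, but one common way to "reduce opacity" is to favor additive models whose per-feature contributions can be read off directly -- the property credit regulators demanded of FICO-style scores. The sketch below is illustrative only, with hypothetical, hand-set coefficients; it shows how a linear scoring model yields both a decision and the "reason codes" behind it.

```python
import math

# Hypothetical, hand-set coefficients for a toy credit-scoring model.
# In an additive model, each input's contribution to the score is
# simply coefficient * value, so the decision is directly explainable.
COEFFS = {"utilization": -2.0, "late_payments": -1.5, "years_history": 0.3}
INTERCEPT = 1.0

def score(applicant):
    """Return approval probability and per-feature contributions."""
    contributions = {f: COEFFS[f] * applicant[f] for f in COEFFS}
    z = INTERCEPT + sum(contributions.values())
    prob = 1.0 / (1.0 + math.exp(-z))  # logistic link
    return prob, contributions

def reason_codes(contributions, top_n=2):
    """The most negative contributions become the reasons for denial."""
    ranked = sorted(contributions.items(), key=lambda kv: kv[1])
    return [feature for feature, c in ranked[:top_n] if c < 0]

applicant = {"utilization": 0.9, "late_payments": 2, "years_history": 4}
prob, contribs = score(applicant)
reasons = reason_codes(contribs)
```

For this applicant the model can report not just a low approval probability but *which* inputs drove it down -- exactly the kind of answer a regulator, or a skeptical executive, can act on.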

What we're seeing today is much more deliberate design -- time spent on the exam questions, figuring out what 'good' looks like. After that comes much more deliberative development. For example, people look at their retail real estate portfolio: you find the signs, or signatures, for all your stores so that you have really good [customer] cohorts; you take a genuine sample of each; you experiment with price within those; you see whether you get the outcome you anticipated; and, once that is done, you deploy a little bit more, and then a little bit more. People are starting in a small way, but in a way that can be easily generalized to the rest.
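The "experiment, validate, then widen the rollout" loop Nowacki describes amounts to a controlled test between cohorts. As a minimal sketch -- the counts and the conversion framing are assumptions, not from the interview -- a two-proportion z-test can gate whether a price experiment's result is strong enough to justify the next, larger deployment:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z-statistic for the difference between two conversion rates,
    using the pooled-proportion standard error."""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical results: one store cohort at the test price,
# a matched cohort at the current price.
z = two_proportion_z(success_a=230, n_a=2000, success_b=180, n_b=2000)
scale_up = abs(z) > 1.96  # roughly 95% confidence before widening the rollout
```

Only when the test cohort clears the confidence bar does the rollout expand "a little bit more, and then a little bit more" -- each step generating fresh evidence rather than a single big-bang bet.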

That recalls a basic tenet for projects, which is that it's good to be able to show success.

Yes, senior executives are already well on their way to accepting analytics in everything they do, but nothing engenders trust more than success. And going to market in a very deliberate way, where you have set up experiments and you're validating the predictions in the market before you go 'big bang,' also engenders trust. 


This was last published in November 2016
