Archive for August 24th, 2012

Some people hate buzzwords like Big Data. I’m ok with this one, because unlike many buzzwords it actually describes pretty much exactly what it should. It’s a world increasingly dependent on algorithmic decision making, data trails, profiles, digital fingerprints, anomaly tracking… Not everything we do is tracked, but enough is that it definitely exceeds our ability to process it and do genuinely useful things with it.

Now, is it the tools and technology that make Big Data so challenging for businesses? Somewhat, I suppose. But I think it’s more behavioral than anything. Humans are very good at intuitive pattern recognition. We’re taking in Big Data every second – through our senses, through our neural systems and so on. We do this without being “aware.” With explicit Data Collection and explicit Analysis, like we do in business, we betray our intuitions – or rather, our intuition betrays us.

How so?

We often go spelunking through big data, intuiting things that aren’t real. We’re collecting so much data that it’s pretty easy to find patterns, whether they matter or not. We’re so convinced there’s something to find that we often Invent A Pattern.
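A minimal sketch of why Inventing A Pattern is so easy (the numbers here are made up – it’s just generated noise): create a pile of completely unrelated series, then search every pair for a correlation. With enough comparisons, a “strong” one almost always turns up.

```python
import random

def correlation(xs, ys):
    """Pearson correlation, computed by hand to keep this self-contained."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(0)
# 50 series of pure Gaussian noise -- no real relationships anywhere.
series = [[random.gauss(0, 1) for _ in range(20)] for _ in range(50)]

# Search all ~1,200 pairs for the strongest "pattern."
best = max(
    abs(correlation(a, b))
    for i, a in enumerate(series)
    for b in series[i + 1:]
)
print(f"strongest pairwise correlation found in pure noise: {best:.2f}")
```

With this many pairs, the strongest correlation in pure noise reliably comes out well above 0.5 – exactly the kind of number that looks like a finding if you weren’t tracking how many places you looked.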

With the ability to collect so much data, our intuition tells us that if we collect more data we’ll find more patterns. Just Keep Collecting.

And then! We have another problem. We’re somewhat limited by our explicit training.

We’re so accustomed to certain interfaces for explicitly collected data – Spreadsheets, Relational Database GUIs, Stats programs – that we find it hard to imagine working with data in any other way. We’re not very good at transcoding data into more useful forms, and our tools weren’t really built to make that easier. We’re now running into “A Picture Is Worth a Thousand Words,” or some version of Computational Irreducibility. Our training has taught us to go looking for shortcuts and formulas that compress Big Data into a Little Formula (you know, take a dataset of 18 variables and reduce it to a 2-axis chart with an up-and-to-the-right linear regression line).

Fact is, that’s just not how it works. Sometimes Big Data needs a Big Picture, because it’s a really complicated network of interactions. Or it needs a full simulation, and so on.

Another way to put this… businesses are so accustomed to the idea of Explainability. Businesses thrive on Business Plans, Forecasts, etc., so they force an overly simplistic, reductionist analysis of the business and drive everything against that type of plan. Driving against that type of plan ends up shaping internal tools and products to be equally reductionist.

To get the most out of Big Data we have to retrain ourselves against our deepest built-in approaches to data collection and analysis. First, don’t get caught up in specific toolsets. Re-imagine what it means to analyze data. How can we transcode data into a different picture that illuminates real, useful patterns without reducing it to patterns we can explain?

Sometimes, the best way to do this is to give the data away to hordes and hordes of humans and see what crafty things they do with it. Then step back and see how it all fits together.

I believe this is what Facebook has done. Rather than analyzing the graph endlessly for their own product development efforts, they gave the graph out to others and saw what those developers created with it. That has been a far more efficient, parallel processing of that data.

It’s almost like flipping the idea of data analysis and business planning on its head.   You figure out what the data “means” by seeing how people put it to use in whatever ways they like.
