It’s very evident to me that businesses, organizations, and individuals who don’t handle data well (I’ll define that shortly) don’t end up making any difference (traffic, profit, buzz…).
Yeah, that’s probably not news to anyone. Really, though, how many people actually handle data well?
Here are some common examples of bad analysis, bad data, bad labeling, and bad process:
- VCs seriously consider 3-year pro formas for businesses that have yet to produce or sell a single unit
- Ad Agencies blatantly ignore sources of traffic when reporting to their clients
- The whole media world pays attention to comScore and Nielsen (and some even Alexa!)
- Product managers never track down baselines and expectations
- Ad sales teams routinely ignore inventory levels
- Marketers talk about “brand value”
- dotcoms install 5 or 6 tracking mechanisms and never sync them
- analysts/BI people start analyses with false assumptions or no assumptions
- home buyers don’t calculate property taxes or the relative market value of their home
- employees generally don’t consider all the implications of FSA and 401k contributions when figuring real take-home pay (a rough sketch follows this list)
- employers evaluate employees on qualities and skills, not results
- traditional resumes feature dates and objectives, not results and plans
- the general public treats the Dow as if it were the whole market
- “subprime” is the word of the year
- “backing into” a model is a well-honed practice in most executive offices
- Music labels pay attention to “money lost to piracy”
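To make the FSA/401k item above concrete, here’s a minimal sketch of the math. It assumes a single flat marginal tax rate and made-up salary numbers; a real paycheck also involves FICA, state taxes, phase-outs, and employer matches.

```python
# Rough sketch of the take-home math people skip. Pre-tax 401k and FSA
# contributions reduce taxable income, so a dollar contributed does not
# cost a full dollar of take-home pay. All numbers are made up.

def take_home(salary, k401=0.0, fsa=0.0, marginal_rate=0.25):
    """Annual cash take-home under a single flat tax rate (a big simplification)."""
    taxable = salary - k401 - fsa
    return taxable * (1 - marginal_rate)

base = take_home(60_000)
with_contributions = take_home(60_000, k401=6_000, fsa=2_000)

print(f"No contributions:   ${base:,.0f}")                 # $45,000
print(f"With contributions: ${with_contributions:,.0f}")   # $39,000
print(f"Take-home drops ${base - with_contributions:,.0f}, "
      f"not the full $8,000 contributed")
```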
There are an infinite number of anecdotes on fishy data analysis.
For those who want actual facts, here’s how I know data analysis is a problem in industry and society:
- according to the 2006 PISA report, “only 57% [of students] said that science was very relevant to them personally”
- Again, according to PISA, only 29% of students worldwide (in the US it’s less than 19%!) can “work effectively with situations and issues that may involve explicit phenomena requiring them to make inferences about the role of science or technology. They can select and integrate explanations from different disciplines of science or technology and link those explanations directly to aspects of life situations. Students at this level can reflect on their actions and they can communicate decisions using scientific knowledge and evidence.”
- Undergraduate degrees issued by major institutions show a very low percentage of students in statistics, mathematics, behavioral science, and other data/experimental disciplines. Don’t even try to pass off business and management as analytical. (http://facts.ucdavis.edu/largest_undergraduate_majors_by_degrees_conferred.lasso, http://intranet.northcarolina.edu/docs/assessment/Abstract/2006-07/Deg%20Con/F._1107.pdf, http://www.google.com/search?q=report+degrees+conferred)
- Dial the news back one year (http://www.msnbc.msn.com/id/17369494/). Yeah, nice work, ECONOMISTS OF THE WORLD. A 1-in-5 chance of recession… hahahaha. (Read them all: http://www.google.com/search?q=predicting+recession)
- Economic indicator reports totally confound, conflict, and contradict each other… (here’s a nice summary of how it all goes down: http://www.ftpress.com/articles/article.aspx?p=775678&rl=1)
- Billions are gambled (via ads and content deals) on a terrible TV ratings system: http://en.wikipedia.org/wiki/Nielsen_Ratings#Criticism_of_ratings_systems (don’t even read about how magazine subs are counted; that will really freak you out)
- Think gas is really that expensive (more expensive than ever?)? The media and politicians tell us it is… but read the calculations at the bottom of http://www.measuringworth.com/uscompare/ (a rough sketch of that adjustment follows this list)
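To show what that measuringworth.com comparison is doing, here’s a minimal sketch of deflating a past price into today’s dollars with a price-index ratio. The prices and index values are rough placeholders, not figures pulled from that site.

```python
# Rough sketch: compare a past gas price to today's in "real" (inflation-
# adjusted) terms. The CPI index values and prices are rough placeholders.

def in_todays_dollars(nominal_price, cpi_then, cpi_now):
    """Scale a past nominal price by the ratio of price indexes."""
    return nominal_price * (cpi_now / cpi_then)

price_1981 = 1.35   # placeholder $/gallon around the early-80s peak
cpi_1981 = 91.0     # placeholder index value
cpi_now = 210.0     # placeholder index value
price_now = 3.10    # placeholder $/gallon

adjusted = in_todays_dollars(price_1981, cpi_1981, cpi_now)
print(f"1981 price in today's dollars: ${adjusted:.2f}")
print(f"Today's price:                 ${price_now:.2f}")
# Once you adjust for inflation, "more expensive than ever" stops being obvious.
```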
Ok, ok. I’ve done a good job of pointing out horrible data analysis and plenty of fun factoids, but I haven’t demonstrated why poor analysis diminishes opportunities.
First, let me lay out my criteria for “good analysis”:
- data should be collected and analyzed in an appropriate timeframe (don’t take 10 years to graduate!)
- A clear statement of analytic objective and methods is a must
- The accuracy and depth of data and analysis should be proportional to the importance of the subject matter
- predicting human behavior precisely is impossible; avoid absolutist statements
- explain relationships between variables and avoid overbearing causation arguments (see the toy example after this list)
- check and recheck (one set of eyes is not enough)
- qualitative research should always accompany quantitative, and vice versa
- ask more questions
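On the relationships-vs-causation point, here’s a toy simulation (entirely made-up data, no real metrics) showing how a lurking third variable can make two unrelated metrics look tightly correlated:

```python
# Toy illustration: a lurking third variable (site traffic) makes two
# otherwise unrelated metrics (ad clicks and support tickets) look
# strongly correlated. The data is simulated; the point is that the
# correlation says nothing about one causing the other.
import random

random.seed(1)

traffic = [random.gauss(10_000, 2_000) for _ in range(200)]
ad_clicks = [0.02 * t + random.gauss(0, 20) for t in traffic]
support_tickets = [0.001 * t + random.gauss(0, 2) for t in traffic]

def correlation(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

print(f"corr(ad clicks, support tickets) = {correlation(ad_clicks, support_tickets):.2f}")
# Strongly positive -- yet neither metric causes the other; both are driven by traffic.
```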
With some of those key statements established, I can now draw out why people and orgs miss out or flat-out make huge mistakes so often.
[I will do so in a forthcoming post!]
Either people think it’s too hard…
Or, they don’t want concrete numbers because they aren’t confident they can hit projections made by other people (see the pro formas sold to VCs… who is held accountable for hitting the numbers? The start-up CEOs? No… the suckers who took the job attached to numbers someone else came up with).
Or, there are competing groups in a company that muddy the data. Who gets credit for the numbers on the sale of a laptop at Dell.com? The product team? The marketing team? The web team? The SEM team? The promotions team? Plenty of eyes checking data, but all stacking the deck in their favor to come closest to meeting the projections set by their bosses (projections which also weren’t synced up to begin with). A sketch of that mess follows.
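To illustrate the Dell.com example (the team names and touchpoints here are hypothetical, just to show the mechanics): when every team runs its own “last touch by us” report, the same sale gets booked several times over.

```python
# Minimal sketch of the attribution mess: one laptop sale, five teams,
# and each team's own report claims full credit. Team names and
# touchpoints are hypothetical.

sale_value = 1_200  # one laptop
touchpoints = ["SEM", "promotions", "marketing email", "product page", "web checkout"]
team_for = {"SEM": "SEM team", "promotions": "promotions team",
            "marketing email": "marketing team", "product page": "product team",
            "web checkout": "web team"}

# Each team filters the funnel to its own touchpoint and books the whole sale.
claimed = {team_for[t]: sale_value for t in touchpoints}

print(f"Actual revenue:  ${sale_value:,}")
print(f"Claimed revenue: ${sum(claimed.values()):,}  <- same sale counted {len(claimed)} times")

# One simple alternative: split credit evenly across touchpoints so totals reconcile.
even_split = {team_for[t]: sale_value / len(touchpoints) for t in touchpoints}
print(f"Even-split total: ${sum(even_split.values()):,.0f}")
```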