
Posts Tagged ‘journalism’

In the last couple of months we’ve had several high-profile events (reporter escape, #iranelection, swine flu) that demonstrate the direct influence the media has on events. As much as journalists and media personnel attempt to be impartial reporters, they never are and never can be. It’s simply impossible to report on an event without impacting it, especially now that everything is ever more digitally connected to everything else. This is not necessarily bad or misguided. What is problematic, though, is operating media properties without carefully navigating the fine line between influence and observation, and consuming media without judging its impact.

I’ve recently read Dave Cullen’s book, COLUMBINE. On top of its literary merits, this book does an excellent job of picking apart the media coverage’s direct influence on the events as they unfolded, and on our analysis (and current thinking!) about the events, the people, the causes. People died as a result of a fundamental misunderstanding of the media’s impact on events. People’s lives continue to be out of sync with what really happened, and why it happened, because of the media’s impact on the events and investigations.

I suspect we’ll look back on the Iran election in a similar light.  Perhaps, in this case, media will be a more positive influence.

The recent NYTimes+Wikipedia strategy is another example of a potentially grave misunderstanding. In this case the potential influence of media was recognized beforehand, but now that it is public knowledge how the media and the Internet population can be manipulated, there’s another problem looming. Are we opening a can of worms by allowing the media to be used strategically in political and military efforts?

I recently had a mini-debate on Facebook about whether it was such a good idea to encourage folks to confuse and hide identities behind false settings and proxy servers on Twitter during the Iran election. Though the intentions behind these activities seem worthwhile – helping citizens fight for political freedom – this is a slippery precedent to be setting. Where do we draw the line on using the shifty nature of online media as a strategy? How can we legally hold criminals accountable for these same actions? How can we identify suspicious behaviors when we’ve encouraged this use of media by everyone? Is it OK for journalists to use this tactic when pursuing a story?

Trying to understand the world is difficult enough. The Internet and new approaches to media are great for their openness, DIY spirit and general “we’ll figure it out as we go” utility. However, left unchecked by the very people creating and consuming it, as the situation is now, we’re only creating more confusion and muting the considerable utility of this platform. What I am directly saying is that all of us in media (reporting and tool building) need to spend a little more time reflecting and strategizing and a little less time trying to be the first on the scene, the one with the most pageviews, the one with the exclusive. This change won’t come about without some direct actions on our part, and lives depend on figuring this out.


It’s a safe assumption that every newspaper and publisher has access to the basic data created and shared by the world.  Newspapers and publishers can use Freebase or Wikipedia or some other aggregator/syndication service to fill their websites and their papers full of content.  The race for coverage and exclusives is over – every outlet can cover everything simply by mashing it all up.

This availability of all data to all outlets has ruined the differentiation between local papers, category-specific publications and most hobbyist content providers. It used to matter who the editor in chief knew, which city a publisher was in, how fast the vans could get to the scene or whether the product makers liked a publisher’s reviewers. The ability to get and cover content competitors couldn’t drove audience. Now, search engine optimization, clever traffic arbitrage schemes and integration with portals are the drivers of audience for most traditional publishers. This is a short-lived game, though, for a variety of reasons. The main reason is that it’s too hard to compete – everyone is competing for the top spot in Google for the most important searches. The portals can only feature a small set of links every day. The arbitrage schemes are played out by a great many players, and money talks more than anything else.

The winning strategy involves Synthesis. “Just the facts, ma’am” doesn’t cut it. Interpretation, analysis, synthesis of the causal webs, related ideas, the context, the history. All of that done via interactive visuals like timelines, charts, trends, projections, simulations. Every which way to expose fresh takes.

Synthesis still takes unique perspective and skilled people.  Synthesis is what creates shared understanding – i.e. knowledge.  Publishers need to get out of the data business and into the knowledge business.   Knowledge is defensible, data is not.  Knowledge is worth paying for, data is not.  Advertising tangled with knowledge performs, advertising with data distracts.

Are there examples of good synthesis?

It’s rare.

I don’t think cable TV does a great job of synthesis.  Most talking heads are just creating noisy data.  Good synthesis provides insight, demands more questions, finds connections.  Yahoo News doesn’t. Google News doesn’t.

Edge.org does alright. The Economist, usually. The WSJ still does a good job of synthesizing. CNN, occasionally. Go to these websites: they aren’t just regurgitating the data. And they are rewarded with traffic.

This isn’t limited to news. Science information, entertainment, business intelligence, weather, and more all need synthesis. Consider the spread of Compete.com and Quantcast.com versus Alexa. Alexa never really synthesized the numbers into useful trends, audience slices or any other more useful view.

You get the point.  Enough of the “information age”, let’s bring about the synthesis age.  Therein lies the value publishers can bring to consumers.


Update 2/17/09: Here’s a fun piece on CNN about MDs using Twitter from the OR. Again, this is NOT particularly useful data being generated. It is, however, an excellent BROADCAST tool. Surgeons pushing out updates is useful to families and friends. As useful information unto itself, though, this content has no reuse outside the context of that surgical operation. Perhaps an aggregation and synthesis (not real time) would be useful for trending operations, but there are other, more efficient, ways of computing and comparing data from operations.

OK, so perhaps this is why VCs, media pundits and internet geeks gush over Twitter: the idea that it represents some collective thought stream/collective brain.

The most common statement of why this collective stream of drivel has value comes in this excerpt from the TechCrunch post:

Twitter may just be a collection of inane thoughts, but in aggregate that is a valuable thing. In aggregate, what you get is a direct view into consumer sentiment, political sentiment, any kind of sentiment. For companies trying to figure out what people are thinking about their brands, searching Twitter is a good place to start. To get a sense of what I’m talking about, try searching for “iPhone,” “Zune,” or “Volvo wagon”.

Viewing the proposed examples SEEMS to validate the claim. However, online discussion and online “tweets” are NOT the same as the behavior you’re actually trying to gain insight into. Whether people are into a brand is not accurately assessed by viewing what they SAY about it — it’s what they DO about it. Do people BUY the brand? Do they SHOW the brand/products to others? Do they consume the brand?

The above examples are not predictive in any way. They are reflective. Twitter can’t do much better than Google, blogs, and news outlets at ferreting out important events, people, products, and places before they become important. Twitter, in some respects, gets in its own way, because the amount of “tweet” activity is not always a great indicator of importance. In fact, some of the most mundane events, people and places get a ton of Twitter activity compared to really important stuff.

Twitter is also highly biased. It is predominantly used by the technical/digital elite. Yes, it’s growing quickly, but it still doesn’t reflect more than perhaps 1-2% of the US population. Heck, even Google traffic is highly biased, as only 50% of the US population uses search every day. You say, so what, it will get there! No, it won’t. Consider the following examples.

Twitter can’t tell you ANYTHING about the real stuff of life like baby food, the peanut recall, or your local hospital. (I leave it as an exercise for the reader to try these searches on Google and compare the results.) With more usage, it only gets harder to find the real information. New tools to parse and organize tweets must be created. This implies you’ll need computational time to parse it all, thus destroying the “real time” part the TechCrunch authors and the quoted blogger adore. Beyond just filtering and categorizing, an engine needs some method of finding the “accurate” and “authoritative” data streams. Twitter provides no mechanism for this, and adding one would destroy its general user value (you don’t want to have to compete with more authoritative twitterers, do you?). Twitter search would need to become more “Googly” to matter at all in some bigger world or commerce sense.
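To make the filtering-and-authority argument concrete, here’s a minimal sketch of the kind of pass any tweet search engine would need. The tweet structure and the follower-count-as-authority proxy are my own illustrative assumptions, not anything Twitter provides — which is exactly the point: even this trivial per-tweet computation is the latency that erodes the “real time” promise.

```python
# Hypothetical tweet records; Twitter exposes no authority signal,
# so follower count stands in here purely for illustration.
def filter_and_rank(tweets, query):
    """Keep tweets matching the query, then order by a crude
    authority proxy (follower count, illustrative only)."""
    matches = [t for t in tweets if query.lower() in t["text"].lower()]
    return sorted(matches, key=lambda t: t["followers"], reverse=True)

stream = [
    {"text": "Peanut recall expands", "followers": 120_000},
    {"text": "eating peanut butter lol", "followers": 40},
    {"text": "new Zune firmware", "followers": 900},
]

results = filter_and_rank(stream, "peanut")
# The high-follower account surfaces first; the mundane tweet sinks.
```

Every tweet in the stream must be scanned and scored before anything can be returned, and a real engine would add language parsing, spam filtering and deduplication on top — each step adding more of the delay the “real time” pitch pretends away.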

TechCrunch correctly identifies this problem:

An undifferentiated thought stream of the masses at some point becomes unwieldy. In order to truly mine that data, Twitter needs to figure out how to extract the common sentiments from the noise (something which Summize was originally designed to do, by the way, but it was putting the cart before the horse—you need to be able to do simple searches before you start looking for patterns).

So where does Twitter really sit, and does it have value? It is a replacement for the newsgroup, the chatroom and some IM functions. It has value, obviously, because people use it. Users replace their other forms of conversation with twittering. Broadcasters and publishers are also replacing other forms of broadcasting/pushing messages with Twitter. This, too, has value, in that Twitter better fits the toolsets more and more of us sit in front of all day long. It’s a somewhat “natural” evolution to find a new mechanism of broadcasting when a medium (terminals attached to the network) reaches critical mass. The Hudson River landing is a better example of this shift in broadcasting method than it is evidence of some crack in the value of Google, or proof that “real time search” is needed. If that logic were sound, CNN would have been hailed as a “Google slayer,” as they are more real time than Twitter is (yes, they use Twitter and iReport and citizen journalism…). In fact, CNN is the human-powered analytic filter required to make sense of real-time streams of data. News journalists capture all that incoming data, find the useful and accurate information, and summarize and rebroadcast it.

If I were an operator of IM networks, or a business that relied on chatrooms and forums, I’d be worried. Google, news outlets and other portals should not be. They don’t need more contextless content to sift through; they do just fine without yet another source that is 99% throw-away thoughts.

I, myself, am not a Twitter-hater. It is a great media success. It probably can make money. However, it doesn’t represent some shift in social networking, high tech, or communications, much less in how we interact. Anyone who claims that must be delusional or hoping to make a buck or two, which is fine too.

TechCrunch concludes with the real question here:

But what is the best way to rank real-time search results—by number of followers, retweets, some other variable? It is not exactly clear. But if Twitter doesn’t solve this problem, someone else will and they will make a lot of money if they do it right.

Is there a possibility of generating a collective thought stream? A big Internet brain? Sure, in some loose sense, that’s already happened. Twitter (and other tools) is just a piece of the puzzle. The human brain doesn’t have just one piece you can claim as the main part – the CPU that makes sense of everything. Why should we think something less complicated (the Internet has far fewer nodes and interconnections, and far higher energy demands, than just one human brain!) would have a central core (service) providing some dominant executive function? There are several reasons this physically can’t happen. The main one, mentioned earlier, is that making sense of random streams of data requires computational time. The more inputs a system takes in, the more computation it requires to make sense of them (or to filter them in the first place). New information, or new types of information, must first be identified as potentially useful before it can even be included for summarization. And so on. The more useful you need to make entropic (random) data, the more energy you must expend. Raw data streams trend toward entropy, in both an informatic and a thermodynamic sense.

In other words, no one company is going to figure out how to rank real-time search results – it can’t be done. Perhaps more damning: it doesn’t need to be done. There’s no actual value in searching real time. The idea of searching is that there is some order (filter) to be applied. When something happens, John Borthwick correctly claims, “relevancy is driven mostly by time”. So Twitter already has the main ordinal, time, as its organizing principle. Perhaps TC and John Borthwick desire an “authority” metric on tweet search; however, you can’t physically add this without destroying the value of real time. No algorithm accounting for authority will be completely accurate; there’s a trade-off between real time and authority. (PageRank has a similar problem with authority and raw relevancy: no-name authors and pages often have EXACTLY what you want, but you can’t find them. This is a more damaging problem in “real time” scenarios, where you want the RIGHT data at the RIGHT TIME.)
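The recency-versus-authority trade-off can be sketched with a toy scoring function. Everything here is an assumption of mine for illustration — the field names, the half-life constant, the linear blend — not any real system’s design; the point is only that any fixed blend of the two signals sacrifices one for the other.

```python
import math

def score(tweet, now, authority_weight=0.5, half_life=300.0):
    """Blend a recency signal (exponential decay, halving every
    half_life seconds) with an assumed per-author authority score."""
    age = now - tweet["time"]  # seconds since posted
    recency = math.exp(-age * math.log(2) / half_life)
    return (1 - authority_weight) * recency + authority_weight * tweet["authority"]

tweets = [
    {"text": "eyewitness, just now", "time": 995, "authority": 0.1},
    {"text": "wire service, 10 min ago", "time": 400, "authority": 0.9},
]

now = 1000
by_time = sorted(tweets, key=lambda t: score(t, now, authority_weight=0.0), reverse=True)
by_auth = sorted(tweets, key=lambda t: score(t, now, authority_weight=1.0), reverse=True)
# Pure recency puts the eyewitness first; pure authority puts the
# wire service first. No single weight gets both orderings right.
```

Whatever weight you pick, you either bury the fresh eyewitness under established accounts or let unvetted noise outrank them — which is the trade-off between real time and authority in miniature.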

If Twitter could plant an authoritative twitterer at every important event and place, real time twitter search might become real.

Oh wait, that’s called Journalism – we already have 1000s of sources of that.
