
Posts Tagged ‘theory’

I have a hypothesis that the key human ability to have evolved over the millennia is learning. Broadly speaking, this means the awareness, recognition, and synthesis of patterns. This doesn’t refer just to academic learning or book knowledge, but to the more generalized activity of pattern recognition. We are learners.

Everything we do is about foraging for new patterns or confirming previously learned ones. Of course, we forage for patterns to survive – to eat, sleep, find water, mate, and avoid predation. In the modern world it can be hard to see how everything we do ties back to these basic survival needs, so far removed is our daily experience from pulling food out of the ground and running from sabre-toothed tigers; yet our homes, transportation, logistics networks, and so forth have all been built up to provide those essentials for growing populations. The one common behavioral thread between how our prehistoric ancestors likely lived and how we live today is learning. It is the biological strategy that developed from our interaction with the world over time.

Compared to the rest of the animal kingdom, our physiology favors using our large brains and capable senses to forage for and use patterns rather than terrifying strength or built-in camouflage. A baby can do very little physically for a very long time while it is learning. Our period of growth to adolescence and self-sufficiency is very long. There is a lot to learn to become a surviving human.

Certainly all living things learn to some degree. Even the simplest creatures have some sort of biological memory that helps them find food and avoid destruction – though that memory is often quite different from ours and may not be anything we’d recognize. The difference is the sophistication and complexity of learning made possible by our complex nervous system. As individuals we learn a great deal. As a species we learn a great deal. Over time we are able to store and retrieve an increasing amount of learning that we pass on to our descendants through culture, written records, and now the internet and digital technology. There is simply no other animal we’ve found that does this at the scale we do.

Plenty of literature suggests it’s language or consciousness or art that makes us “human” or “different from the rest of the animals.” Rather than saying learning makes us a superior life form or different from other animals, I’m merely suggesting that this is the evolutionary strategy we developed, and that all of those other things people mention flow from this ability and need to learn. We are constantly in search of more efficient ways to discover and transmit patterns that help us survive. Music, language, art, writing, sport – all of these are varied, efficient, and robust ways to teach each other patterns. Yes, they often have more pragmatic and immediately practical effects, like making us attractive to mates, but they are also transmissions of patterns we’ve found interesting or useful, or they help unearth other patterns.

Now, this being pure speculation, as so much of evolutionary-biological thinking is, it’s quite possible that everything that allows us to learn was simply evolving in response to things other than learning. Perhaps that’s true, but the emergent effect is that we happen to be extremely powerful learners, and we have yet to devise anything that can learn more effectively. It’s unsurprising to me that our key enterprise is developing non-human machinery that helps us learn and that might learn better than us. This is literally what we must do; it’s all we do.

Why is any of this important? It is a perspective that might put various aspects of organizing our lives, societies, countries, world, and technology in a new, more resilient light. If learning is the key ability we have for survival, should we not organize around it and NOT do things that reduce learning? Should we not amplify our ability to learn and the scope of what we learn?

Maybe that’s too directional a way to think about it. Perhaps it doesn’t matter whether there’s something we OUGHT to do; rather, we Do What We Do and that’s the whole lot of it. Either way, as an individual looking for better ways to survive and thrive, I find it useful to think through and understand what might be underlying it all. You know, it seems like I should learn.

Read Full Post »

Chris Anderson is at it again… stirring the pot with big claims that are hard to falsify but that seem to generate a huge amount of discussion among smart people. Check out some of the discussion. Or maybe read the article first.

Here’s an excerpt:

But faced with massive data, this approach to science — hypothesize, model, test — is becoming obsolete. Consider physics: Newtonian models were crude approximations of the truth (wrong at the atomic level, but still useful). A hundred years ago, statistically based quantum mechanics offered a better picture — but quantum mechanics is yet another model, and as such it, too, is flawed, no doubt a caricature of a more complex underlying reality. The reason physics has drifted into theoretical speculation about n-dimensional grand unified models over the past few decades (the “beautiful story” phase of a discipline starved of data) is that we don’t know how to run the experiments that would falsify the hypotheses — the energies are too high, the accelerators too expensive, and so on.

Now biology is heading in the same direction. The models we were taught in school about “dominant” and “recessive” genes steering a strictly Mendelian process have turned out to be an even greater simplification of reality than Newton’s laws. The discovery of gene-protein interactions and other aspects of epigenetics has challenged the view of DNA as destiny and even introduced evidence that environment can influence inheritable traits, something once considered a genetic impossibility. In short, the more we learn about biology, the further we find ourselves from a model that can explain it.

There is now a better way. Petabytes allow us to say: “Correlation is enough.” We can stop looking for models. We can analyze the data without hypotheses about what it might show. We can throw the numbers into the biggest computing clusters the world has ever seen and let statistical algorithms find patterns where science cannot.

And where does Anderson suppose all these statistical algorithms come from? Think about it. The statistical algorithms came from “old science”. We came up with statistics as a way to model things – to compress our data. If we simply use these algorithms without ever obtaining understanding and testing models, how can we validate that our statistical models/algorithms are any good at finding correlations? We can’t! This point alone is enough to dismiss Anderson’s “non-theory” (or is it a theory?). Read on if you want more commentary.
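To make that concrete, here is a minimal sketch (my own illustration in Python with numpy, not anything Anderson proposes) of what “letting the algorithms find patterns” does on pure noise: mine enough variable pairs and some correlation will always look impressive, and it is precisely the statistical theory behind the algorithm – null distributions, multiple comparisons – that tells us how little it means.

```python
# A minimal sketch: "mining" correlations in pure noise.
# Every column is independent random data, so any correlation found is chance;
# only a model of what chance looks like lets us recognize that.
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_vars = 50, 200

data = rng.standard_normal((n_samples, n_vars))  # nothing here to discover

# Correlate every other variable against the first and keep the strongest hit.
corrs = [np.corrcoef(data[:, 0], data[:, j])[0, 1] for j in range(1, n_vars)]
best = max(corrs, key=abs)

print(f"strongest correlation found in noise: r = {best:+.2f}")
# Typically lands around +/-0.4 with these sizes -- a "pattern" produced by chance alone.
```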

Certainly there are some tidbits of useful insight; however, his call for the end of science as we know it hardly withstands much thought.

a) Google doesn’t know as much as everyone claims

b) Correlation is not enough for understanding. If all we are going to do after the end of theory is act on correlations between intervening variables (i.e., variables/metaphors that aren’t at the root of a phenomenon but are merely associated with it), we will get further and further from understanding “the thing” – see the sketch after this list. That’s OK in business and some technical situations where you want to cut corners and understanding isn’t important, but it would be horribly catastrophic in medical procedures, genetic work, rocketry, etc.

c) Models are useful. In fact, Anderson employs Google as a model to communicate his ideas. Models aren’t the thing, and most serious thinkers never claim they are. Models help to organize thinking and direct research, but they do not substitute for the phenomenon. Yes, in new investigations our models are somewhat off, but in countless situations our models are highly accurate, useful, and consistently employed. I leave it as an exercise for the reader to think about the many models of the world we all use every day to great effect.

d) No doubt the computational ability we have at our fingertips will help to uncover things we never saw before. That’s always been the case with new technology. The better the technology, the further we can see, the smaller we can dissect, the more we can crunch… how is the advance of the computer any different? It’s not! Think about it.

e) Exhaustive search efforts (massive data mining) like the ones he cites from Venter and others have been going on for decades. There’s no big shift coming. The more we can compute, the bigger the datasets we’ll work on, and we’ll still see things just out of our computational reach. This is a proven fact. The universe has been computing and generating data for a very long time, and we are not going to catch up to it, seeing as how we’re PART OF IT.

f) I suspect Anderson woke up one morning to his own realization about the usefulness of data mining. Meanwhile, the rest of us have been taking advantage of new technologies and ever-increasing data storage for a very long time (in fact, pretty much since the inception of science…).
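To illustrate point (b), here is a small simulated sketch (again my own, in Python with numpy; the variable names are hypothetical): a hidden factor drives both a measurable proxy and the outcome we care about, so the two correlate strongly, yet forcing the proxy to a new value changes nothing. A correlation-only analysis would happily recommend the useless intervention; only a model of the mechanism reveals why it fails.

```python
# Sketch for point (b): a proxy ("intervening variable") and an outcome share a
# hidden cause, so they correlate strongly -- but setting the proxy by force does
# nothing to the outcome. Correlation alone cannot tell us that.
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

def simulate(force_proxy_to=None):
    """Generate (proxy, outcome) pairs; optionally intervene on the proxy."""
    hidden_cause = rng.standard_normal(n)                  # the real driver (unmeasured)
    proxy = hidden_cause + 0.3 * rng.standard_normal(n)    # the intervening variable we measure
    if force_proxy_to is not None:
        proxy = np.full(n, float(force_proxy_to))          # "act on the correlation"
    outcome = hidden_cause + 0.3 * rng.standard_normal(n)  # what we actually care about
    return proxy, outcome

proxy, outcome = simulate()
print(f"correlation(proxy, outcome) = {np.corrcoef(proxy, outcome)[0, 1]:.2f}")  # ~0.9

_, outcome_after = simulate(force_proxy_to=3.0)
print(f"mean outcome without intervention:   {outcome.mean():+.3f}")
print(f"mean outcome with proxy forced high: {outcome_after.mean():+.3f}  (essentially unchanged)")
```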

Anderson is a good writer and a bad scientist.  Oh well, life and science and journalism carry on….

Read Full Post »