CHREST Computational Model

Chunking theory is a theory of how we learn and remember things: information gets stored in long-term memory as combinatorial “chunks.”  Recent work on receptor webs and in other areas of cognitive science and biology is following similar concepts.

If this sounds like I’m not explaining it well… I’m not.  I’m still absorbing it myself.  Read the source material for more info.

What I’m most excited by is how well chunking seems to mesh with other “network” computational models.  And it should, since I believe it’s the same researchers who have branched into both areas.

There’s actually a code base called CHREST modeling the theory.
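To give a feel for the idea, here is a toy sketch of chunking in Python. This is my own illustration under loose assumptions, not CHREST’s actual algorithm or API: familiar patterns are stored as chunks in a stand-in “long-term memory,” and a new sequence is then encoded by covering it with the largest known chunks, so a well-learned sequence costs one chunk instead of many raw items.

```python
# Toy illustration of chunking (hypothetical; NOT CHREST's real algorithm or API).
# Familiar patterns become "chunks" in long-term memory; recall cost is then
# measured in chunks rather than in raw items.

def learn_chunks(sequences):
    """Store each observed sequence as a chunk in a stand-in long-term memory."""
    memory = set()
    for seq in sequences:
        memory.add(tuple(seq))
    return memory

def chunk_encode(items, memory):
    """Greedily cover a sequence with the largest known chunks.

    Unknown items fall back to single-element chunks, so an unfamiliar
    sequence costs one chunk per item.
    """
    encoded, i = [], 0
    while i < len(items):
        # Try the longest candidate chunk starting at position i.
        for size in range(len(items) - i, 0, -1):
            candidate = tuple(items[i:i + size])
            if size == 1 or candidate in memory:
                encoded.append(candidate)
                i += size
                break
    return encoded

memory = learn_chunks([list("knight")])
print(len(chunk_encode(list("knight"), memory)))  # a learned word costs 1 chunk
print(len(chunk_encode(list("zq"), memory)))      # unfamiliar letters cost 1 each
```

The point of the sketch is just the asymmetry: once a pattern has been learned, it is retrieved as a single unit, which is the core intuition behind chess masters recalling whole board configurations as chunks.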

Here’s a great overview of CHREST that leads to a variety of other resources.

CHREST homepage

Some of the original work by Chase and Simon.
