Saturday, 19 May 2012

Mind: Some problems with the Classical Computer Metaphor - Part Two

A Tenuous link to Extended Mind
In part one I briefly explained what the computer metaphor entails and discussed the first of three intertwined problems that I feel the amodal paradigm faces:

1) Conceptual and philosophical issues with amodal abstract symbol manipulation

In this post I focus on the second problem:

2) Strengths of competing paradigms, such as embodied cognition


Embodied Cognition


In addition to the conceptual and philosophical issues surrounding the amodal approach, there is an increasing body of evidence supporting an alternative paradigm - embodied cognition. Embodiment is a rather general umbrella term, influenced by phenomenology, ecological psychology, AI and philosophy, and it overlaps heavily with grounded cognition, extended-mind theories and embedded cognition - though there are subtle differences.


Essentially, all these theories share the underlying assumption that conceptual knowledge must be grounded in the sensory-motor modalities in which it originates, rather than consisting of amodal representations of abstract symbols. This means that the perceptual, motor and emotional modalities are recruited during conceptual processing and are actually essential to it - something that would not be predicted a priori by amodal theories.


Some evidence...


Perceptual simulation 
It is proposed that the conceptual system co-opts the perceptual system in order to access or simulate concepts. As such, it follows that perceptual brain areas for each modality (visual, auditory, olfactory, gustatory and tactile amongst others) should be recruited during conceptual processing. 

This has indeed been consistently demonstrated throughout the embodied literature. For example, using fMRI, Simmons et al 2007 showed that part of the left fusiform gyrus was activated both during a perceptual task (perceiving colour using a 'colour perception' functional localiser task) and during a conceptual task (verifying colour properties of objects - "a banana can be yellow"). Arguably, we would not expect the perceptual cortex for colour to be activated during a conceptual task if the concept were amodally represented. The activation of perceptual areas during conceptual verification has also been demonstrated across the other modalities (see Gonzalez et al 2006 and Goldberg et al 2006).

Switching Systems 
One problem (of many) for fMRI studies is that their temporal resolution is rather poor. The overlap in neural areas for perceptual and conceptual tasks could therefore simply be epiphenomenal - perceptual areas might light up downstream of conceptual processing, rather than doing the representational work itself. Evidence from behavioural experimentation, however, suggests this is not the case.


If the conceptual system does indeed co-opt the perceptual system for representation, then we would expect perceptual phenomena to emerge in conceptual processing. One such perceptual phenomenon, reported by Spence et al 2001, is that switching between modalities during a perceptual task incurs a processing or 'switching' cost. As such, if conceptual representation is modality specific rather than amodal, we should observe a modality switching cost during a conceptual task as well.


Pecher et al 2003 showed that this was in fact the case. Using a property verification task, they found that response times were significantly quicker when verifying a sentence preceded by the verification of a sentence in the same modality than one preceded by a sentence from a different modality. For example...


Same modality = quicker response time: [blender is loud] (auditory) --> Y or N? --> [leaves rustle] (auditory) --> Y or N?


Different modality = switching cost (slower RT): [blender is loud] (auditory) --> Y or N? --> [lemon is yellow] (visual) --> Y or N?
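
To make the logic of the design concrete, here is a minimal Python sketch of how switch and no-switch trials can be labelled - the items and code are my own illustration, not Pecher et al's materials or analysis:

    # Label each property-verification trial (after the first) as a
    # modality switch or not, based on the preceding trial.
    # Items are illustrative placeholders, not the original stimuli.
    trials = [
        ("blender is loud",  "auditory"),
        ("leaves rustle",    "auditory"),   # same modality -> no switch
        ("lemon is yellow",  "visual"),     # modality switch
        ("soap is perfumed", "olfactory"),  # modality switch
    ]

    for (_, prev_mod), (sentence, mod) in zip(trials, trials[1:]):
        condition = "no-switch" if mod == prev_mod else "switch"
        print(f"{sentence}: {condition}")

    # Modal prediction: mean RT(switch) > mean RT(no-switch); an amodal
    # symbol system gives no principled reason to expect this cost.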


Further, Connell & Lynott 2010 demonstrated that the switching cost phenomenon occurs in concept creation as well as retrieval. In an inventive and intricate experiment, they showed that response times were slower when participants generated a novel concept after creating a familiar concept in a different modality, compared with generating concepts in the same modality.


Tactile Disadvantage
A further perceptual phenomenon that emerges during conceptual processing is the tactile disadvantage. Spence et al 2001 demonstrated that when subjects are asked to respond to the arrival of a stimulus, they are generally slower to detect tactile stimuli than visual stimuli, even when told which modality to expect. (NB this was not explained by physiological differences in processing stimuli between perceptual modalities, such as latencies for transduction and transmission; in fact, of audition, vision and touch, vision is the slowest at getting the signal to the brain.)


Connell & Lynott 2010 hypothesised that if the conceptual system is modal and recruits the perceptual system during cognition, then the tactile disadvantage should occur in conceptual processing as well as in perception. In their first two experiments they displayed unimodal words from five modalities (auditory, visual, olfactory, tactile and gustatory) at increasing durations, starting at the subliminal threshold of 17 ms. Participants were significantly slower at responding to tactile concepts than to those of the other modalities.


Because the design used different words for each modality, however, it could be argued that the results arose amodally, from unknown differences across these words rather than from modality itself.


By repeating the experiment with bimodal visuo-tactile words (e.g. 'fluffy'), they found that the tactile disadvantage occurred even for the same word, depending on whether participants judged it as a tactile or a visual property. This suggests the results cannot be explained amodally, since a bimodal word should recruit the same amodal symbol either way. Whether responding to 'fluffy' as tactile or as visual, we would therefore expect the same amodal symbol to generate the same (or similar) response times.
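
A minimal sketch of why the bimodal case is diagnostic, with numbers invented purely to show the shape of the two predictions (not real data):

    # Contrast the two accounts' predictions for a bimodal word such as
    # 'fluffy', cued as a visual or a tactile property. The baseline and
    # penalty values below are hypothetical, for illustration only.
    BASE_RT = 600          # hypothetical baseline response time (ms)
    TACTILE_PENALTY = 40   # hypothetical perceptual simulation cost (ms)

    def modal_rt(cued_modality):
        # Modal account: the perceptual system is partially re-run, so
        # the perceptual tactile disadvantage carries over into conception.
        return BASE_RT + (TACTILE_PENALTY if cued_modality == "tactile" else 0)

    def amodal_rt(cued_modality):
        # Amodal account: one symbol per word, so the cue should not matter.
        return BASE_RT

    for cue in ("visual", "tactile"):
        print(f"fluffy ({cue}): modal predicts {modal_rt(cue)} ms, "
              f"amodal predicts {amodal_rt(cue)} ms")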


Object orientation & the Action-sentence Compatibility Effect (ACE)
If understanding a concept such as 'the man hammered the nail into the floor' or 'you give Andy the pizza' involves the activation of perceptual and motor representations, rather than amodal symbols, then we would expect those specific modal representations to be primed for subsequent use. In other words, our conceptual processing should facilitate further use of our perceptual and motor modalities.


Stanfield and Zwaan 2001, 2002 argue that processing sentences automatically and unconsciously activates visual imagery via the visual processing system, thus facilitating a subsequent perceptual task. They created pairs of sentences in which the implied orientation or shape of an object differed - 'the man hammered the nail into the floor' versus 'the man hammered the nail into the wall'. Following each sentence, subjects were shown an image of the manipulated object that was either compatible or incompatible with the implied orientation or shape.


In the first task, participants were slower to state whether the pictured object had been mentioned in the prior sentence when there was a mismatch between the implied orientation and the image. Importantly, in a second task, participants were slower simply to name the pictured object when there was a mismatch of implied orientation or shape. In the naming task participants were never prompted to recall visual imagery, so the effect cannot be put down to a deliberate recall strategy, which could otherwise be explained amodally.
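
As a toy illustration of the design (my own items and orientation codes, not Stanfield and Zwaan's stimuli), the critical manipulation reduces to a match/mismatch coding between the orientation a sentence implies and the orientation shown in the picture:

    # Toy coding of the implied-orientation design; sentences and the
    # orientation labels are illustrative, not the original stimuli.
    pairs = [
        ("the man hammered the nail into the floor", "vertical"),
        ("the man hammered the nail into the wall",  "horizontal"),
    ]

    for sentence, implied in pairs:
        for pictured in ("vertical", "horizontal"):
            condition = "match" if pictured == implied else "mismatch"
            print(f"{sentence} | nail pictured {pictured}: {condition}")

    # Embodied prediction: naming RT(mismatch) > RT(match), even though
    # orientation is never mentioned and never task-relevant.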

The Action-sentence Compatibility Effect occurs when motor representations are activated during a conceptual task, facilitating the actual performance of a compatible motor act. Glenberg and Kaschak 2002 showed that when asked to verify the meaningfulness of a sentence, e.g. 'you handed Andy the pizza', participants were significantly quicker when their response incorporated the action of the sentence - moving the arm away from oneself (as would occur when handing someone a pizza) to press the 'yes' button. This was also found for abstract transfer, i.e. the giving and receiving of non-physical entities, for example 'you told Liz the story'.
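
A sketch of how such trials can be scored - again with my own illustrative items and direction codes, not Glenberg and Kaschak's materials:

    # Score each sentence/response pairing for action-sentence
    # compatibility. Items and direction codes are illustrative.
    sentences = [
        ("you handed Andy the pizza", "away"),
        ("Andy handed you the pizza", "toward"),
        ("you told Liz the story",    "away"),   # abstract transfer
    ]

    for sentence, implied_direction in sentences:
        for response_direction in ("away", "toward"):
            condition = ("compatible" if implied_direction == response_direction
                         else "incompatible")
            print(f"{sentence} | respond by moving {response_direction}: {condition}")

    # ACE = mean RT(incompatible) - mean RT(compatible); the embodied
    # prediction is ACE > 0, because understanding the sentence
    # pre-activates the compatible motor plan.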

In summary, we would not expect the perceptual or motor processing systems to be primed if conceptual tasks were amodal.

Situated Cognition
In a nice introductory paper, 'Six Views of Embodied Cognition', Margaret Wilson begins by considering the premise, inspired by ecological psychology and dynamical systems, that cognition is situated. This means that cognition takes place within the context of a biological organism and its real-world, time-pressured environment. This inherently involves a continuous feedback loop: information flows from the environment to the organism, the organism acts on the environment, the changed environment feeds back to the organism, and so on.


An important implication of situated cognition is that it questions the need for representations at all, when the world can (at times) act as its 'own best model'. One such (probably over-)cited example is the outfielder problem. The standard neurocognitive explanation of how the outfielder catches a ball would invoke the subconscious calculation of the ball's trajectory from a complex representation, which is then used to predict the landing point. However, this does not appear to be what happens.


A more parsimonious explanation lies in prospective models, whereby the outfielder does not calculate or predict the flight of the ball in advance, but rather modifies his or her behaviour in response to certain available perceptual variables. One example is McBeath's Linear Optical Trajectory model, which proposes that the outfielder runs so that the ball appears to move in a straight line in their visual field (though see the Optical Acceleration Cancellation model or Marken's Control of Optical Velocity model for competing accounts). The important point is that these prospective models do not require complex amodal calculations; a successful outcome simply emerges from the situated action of an organism coupled to its environment.
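
To see that such a perceptual variable really exists, here is a small Python sketch - my own toy numbers, and closer in spirit to the Optical Acceleration Cancellation model than to McBeath's 3-D LOT - checking Chapman's classic observation: for a fielder standing at the landing point, the tangent of the ball's elevation angle rises at a constant rate; it accelerates if the ball will land behind them and decelerates if it will land in front.

    import numpy as np

    # Toy check of the optical variable used by prospective-control
    # models of catching. All numbers below are hypothetical.
    g = 9.81
    vx, vy = 20.0, 20.0            # launch velocity components (m/s)
    t_flight = 2 * vy / g          # time aloft for a parabola
    landing_x = vx * t_flight      # ~81.5 m from launch

    # Sample the first ~70% of the flight (before the ball could pass
    # directly overhead, where the elevation angle blows up).
    t = np.linspace(0.1, 0.7 * t_flight, 300)
    ball_x = vx * t
    ball_y = vy * t - 0.5 * g * t**2

    for label, fielder_x in [("lands in front of fielder (run in)", landing_x + 15),
                             ("lands exactly at fielder          ", landing_x),
                             ("lands behind fielder (run back)   ", landing_x - 15)]:
        tan_elev = ball_y / np.abs(fielder_x - ball_x)   # the optical variable
        optical_accel = np.gradient(np.gradient(tan_elev, t), t)
        print(f"{label}: mean optical acceleration = {optical_accel.mean():+.4f} s^-2")

Running so as to null this quantity is a simple control law defined over an online perceptual variable - no inverse ballistics and no stored trajectory representation are needed.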


An immediately apparent problem with situated cognition, as recognised by Wilson, is how it could possibly account for seemingly 'offline' cognitive tasks such as imagining catching a fly ball. It is our ability to imagine, plan and create that arguably distinguishes us from other lifeforms, yet such tasks appear 'decoupled' from the environment.


One potential solution lies in the grounding of perceptual representations, which are partially reactivated during a conceptual task, as discussed above. If we were to take the more radical approach of denying representations full stop, it might be argued that we are never truly offline or decoupled from our environment, and that many tasks which appear to be purely 'in our heads' are in fact offloaded onto the environment - using fingers to count, for example.


These findings, taken together, provide a fraction of the available data suggesting that abstract amodal representations, as posited by the computational paradigm, are neither utilised by nor necessary for cognition.


[A couple of caveats...]
 
1) Much of the evidence discussed comes from a weaker form of embodied cognition, proposed by Barsalou in his Perceptual Symbol Systems account. For the more radical form of embodiment, which questions the very need for representations at all, see Andrew and Sabrina's excellent blog, Notes from Two Scientific Psychologists.


2) It is worth noting that embodied research is not without criticism from classical cognitivists, who essentially argue that much or all of these findings can be explained within an amodal paradigm, as mentioned in part one (see Mahon & Caramazza 2008, and Connell, Lynott & Dreyer 2012 for a counter-argument).


I was hoping to cover neural networks and robotics in part two, but it seemed a bit much, so part three's a-comin'!

2 comments:

  1. Waiting for part III!

  2. Excellent! I'll have it up in a few weeks, need to revise neural networks first :-)
