How we “Grasp” Things is Related to “How we grasp things”

The functional interrelations of motor control and language, especially their neurophysiological basis, are not yet fully understood.

The human hand is an important tool for interacting with our surroundings. Hands can also serve social, communicative and even linguistic functions (e.g., waving, gesturing or sign language), and these functions may have co-evolved. That is, the human hand can be an end effector of both the language system and the motor control system.

It is no coincidence that the term “comprehension” derives from “prehension”, and that when we “grasp” a concept we can do so in more ways than one… as this study shows.

Voluntary grasping movements are determined by object features and by action plans and goals, i.e., cognitive processes and anticipated future states. Human voluntary actions are often guided by verbal processes, as in instructions or requests. That is, some form of interactive processing between the motor control system and the language system seems to be necessary, but the precise nature of the underlying neurophysiological interaction is not fully understood.

While there is an ongoing discussion about the different subfunctions of the neural basis of grasping and their relations, a consensus is emerging that a parieto-frontal network is the neural underpinning of grasping, with evidence from different populations and methodologies.

Event-related brain potentials (ERPs) are well suited for, and have recently been applied successfully to, the examination of overt movements. Thus, ERPs, with their high temporal resolution, can be used to investigate the fast neurocognitive processes of language comprehension and grasp planning/execution.
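To make this concrete, here is a minimal sketch (not the authors’ actual pipeline) of how word-onset ERPs can be computed with the open-source MNE-Python library; the file name, stimulus channel and event codes are placeholder assumptions.

```python
# Minimal ERP sketch using MNE-Python (https://mne.tools).
# The file name, stim channel and event codes are hypothetical.
import mne

raw = mne.io.read_raw_fif("subject01_raw.fif", preload=True)
raw.filter(l_freq=0.1, h_freq=30.0)  # typical ERP band-pass

# Triggers time-locked to word onset (codes are assumptions).
events = mne.find_events(raw, stim_channel="STI 014")
event_id = {"word/precision": 1, "word/power": 2, "pseudoword": 3}

# Epoch from 200 ms before to 800 ms after word onset,
# baseline-corrected on the pre-stimulus interval.
epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8, baseline=(None, 0),
                    preload=True)

# Averaging across trials per condition yields the ERPs.
erp_precision = epochs["word/precision"].average()
erp_power = epochs["word/power"].average()
```

Averaging many epochs time-locked to the same event is what gives ERPs their millisecond-level temporal resolution.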

http://medicalxpress.com/news/2016-12-comprehension-grasping-actions.html

“When the study participants had to grasp an object while reading, their brain processed parts of the meaning of the words earlier than in previous studies in which words were evaluated without something being gripped.”

This study investigated whether specific motor representations for grip types interact neurophysiologically with conceptual information, that is, when reading nouns. Participants performed lexical decisions and, for words, executed a grasp-and-lift task on objects of different sizes involving precision or power grips while the electroencephalogram (EEG) was recorded.

In a control block, participants pointed at the objects instead of grasping them. The behavioural data show that action-related conceptual information affects response times (RTs) in the grasping block and also in the pointing block.
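As an illustration of how such a behavioural effect could be tested (a sketch under assumed file and column names, not the published analysis), a repeated-measures ANOVA over task, grip type and conceptual congruency might look like this:

```python
# Hypothetical RT analysis sketch (not the authors' code).
# Assumes a long-format table with one row per trial.
import pandas as pd
from statsmodels.stats.anova import AnovaRM

trials = pd.read_csv("rt_trials.csv")  # placeholder file name

# Average trials within each subject x condition cell first,
# since AnovaRM expects one observation per cell.
cells = (trials
         .groupby(["subject", "task", "grip", "congruency"], as_index=False)
         ["rt"].mean())

# Repeated-measures ANOVA: task (grasp/point) x grip
# (precision/power) x congruency (conceptual match/mismatch).
anova = AnovaRM(cells, depvar="rt", subject="subject",
                within=["task", "grip", "congruency"]).fit()
print(anova)
```

A task × congruency interaction term here is what would distinguish a grasping-specific effect from the task-independent RT effect reported above.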

It therefore seems possible that a neurophysiological interaction of language and motor control processes occurs as early as 100 to 200 ms after word onset, which would imply that language- and action-relevant information is already available by then.
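Continuing the hypothetical `epochs` object from the sketch above, an early effect like this is often quantified as the mean amplitude within the window of interest, here 100 to 200 ms after word onset:

```python
# Mean amplitude in the 100-200 ms window, per condition
# (continues the hypothetical `epochs` object sketched above).
window = epochs.copy().crop(tmin=0.1, tmax=0.2)

for condition in ("word/precision", "word/power"):
    data = window[condition].get_data()  # (trials, channels, times), in volts
    print(condition, f"{data.mean() * 1e6:.2f} µV")  # grand mean in microvolts
```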

As demonstrated in previous studies, it takes the brain a third of a second to process a word. “In our study, however, we were able to show that comprehension can already begin much earlier, after just a tenth of a second – if a grasping action is required,” explains Koester.

This study not only provides evidence that the brain has a common control center for language and movement, but “it also shows that our brain’s processing steps shift very quickly and adjust to current tasks – in this case, the task of grasping something while reading.”

Interestingly, the RT effect of this conceptual information was independent of the task, i.e., it was found for grasping and for pointing. This result shows that word reading can not only pre-activate further perceptual processes, as often investigated in priming paradigms (e.g., a picture primes the perception of another picture), but can also influence subsequent motor processes.

The main result, however, revealed an event-related potential (ERP) interaction of grip type and conceptual information which was not present for pointing. Power grips were executed faster than precision grips, presumably reflecting their lower precision demands. Grip type and conceptual information are thus functionally related when planning a grasping action, but such an interaction could not be detected for pointing.

It seems that conceptual information and motor commands are processed together neurophysiologically only if a complex (manual) response is required and reading is directly related to the motor requirements (object manipulation).

http://journals.plos.org/plosone/article?id=10.1371/journal.pone.0165882

“Similar to our experiment, patients could practice words they cannot access by indicating not only verbally but also with grip movements that they recognize a word. In short, motor training,” explains Koester. As such, one’s knowledge of words would be strengthened through the ‘back door’ of motor control.

“The latest theories in cognitive science research hypothesize that our memory also records physical sensations as part of the words stored.”

Various frameworks have been proposed on the relationship between language and action. On the one hand, there are strictly symbolic accounts, which assume an amodal, central processing system for language that is functionally independent from other cognitive domains.

On the other hand, the embodiment framework assumes that sensory object properties and action features pertaining to the same object share some representational aspects with the abstract, symbolic representations for objects and actions, specifically nouns and verbs, respectively.

Accordingly, functional interactions between the domains of motor control and symbolic word representations can be expected, for example, in similarities between the structures of sentence representations and action sequences.

“Similar to an entry in a reference book, the brain records a word like ‘whisk’, associating it with concepts such as ‘inanimate’ and ‘kitchen device.’ In addition to this, the brain connects the word to one’s own experience – how a whisk feels, for instance, and that a spinning motion is related to it.” In their new study conducted with 28 participants, Koester and his colleagues lend support to the thesis of the embodiment of knowledge.

The occurrence of this interaction (and the congruency effect) strongly suggests an integrated processing of symbolically coded information (nouns) and concrete motor commands (grip types). To the extent that the interaction reflects integrative processing of symbolic information and concrete motor commands, i.e., a functional relation in processing, this finding supports embodiment views, which argue for integrative processing and against a strict separation of symbolic and sensory/motor information processing.
