
posted by LaminatorX on Sunday October 26 2014, @11:19PM   Printer-friendly
from the talk-to-the-hands dept.

Science Daily has an article about intonations in sign language.

Like the intonation of individual spoken languages, sign languages also have their own unique "sound," and, as with spoken languages, the intonation of one community's language is different from that of another community, according to a new study at the University of Haifa. "Our discovery that sign languages also have unique intonation patterns once again demonstrates that sign languages share many central properties with spoken languages. It turns out that intonation is an essential component of any human language, including languages without sound," explained Prof. Wendy Sandler, who led the study.

On a personal note, I find it interesting to compare this to emoticons and acronyms like LOL or ROFL in an email setting as well.

  • (Score: 1) by dmbasso on Monday October 27 2014, @10:50PM

    by dmbasso (3237) on Monday October 27 2014, @10:50PM (#110686)

    [...] it's quite clear that in many cases, damage to Broca or Wernicke impairs both perception and production. Imaging studies also consistently show activity in both areas.

    Of course, and the reason is analogous to the relation of mirror neurons and action production & recognition.

    [...] is that Broca does syntactical and Wernicke lexical work [...]

    That doesn't make much sense, and it is easily falsified by clinical cases where language recognition or (especially) production remains intact while the complementary function is damaged.
    Also, just from an architectural point of view, Broca's proximity to the motor cortices and Wernicke's proximity to the sensory cortices are consistent with the idea of them performing production and recognition, respectively.

    [...] fMRI and MEG experiments often show activation outside these areas as well: in the right hemisphere, [...]

    That's the non-verbal processing, prosody.

    and also e.g. in the basal ganglia.

    Without which there wouldn't be the actual motor activation (among other things).

    So it's much more messy than the handbooks say

    That's a given, isn't it? ;)
    But there are some well-established data, such as the separation of verbal (left hemisphere) and non-verbal (right h.) communication. Perhaps you were not aware of it because your focus on NLP limited the kind of research papers you read.

    About the future, I'll probably follow the same path you took... right now I have a research contract to build some psychophysical experiments on a CAVE setup, but after it ends I guess I'll have to find something more stable / long term...

    --
    `echo $[0x853204FA81]|tr 0-9 ionbsdeaml`@gmail.com
  • (Score: 2) by TGV on Tuesday October 28 2014, @06:11AM

    by TGV (2838) on Tuesday October 28 2014, @06:11AM (#110769)

    > Of course, and the reason is analogous to the relation of mirror neurons and action production & recognition.

    Actually, large groups of researchers are of the opinion that mirror neurons (as such) do not exist. There can be brain functions that relate perception and production, but the normal learning paradigm is sufficient for this. There is no need for specialized neurons, and it's hard to imagine how they could fit into processes as complex as language, unless you assume that the human brain was designed as such. The idea of mirror neurons comes from, IMO, a "limited" understanding of the role of a neuron, the kind of understanding that would lead us to believe that everyone is born with a Jennifer Aniston neuron.

    > and it is easily falsifiable

    It is not easily falsifiable. There are enough patients with a lesion in those regions who have lost more than one function. Furthermore, we cannot assume that everybody's brain is identical. Scans show differences on many high-level tasks, so it's likely that localized damage cannot inform us fully.

    > That's the non-verbal processing, prosody.

    In reading tasks? Why would prosody be non-verbal? It needs to integrate with the rest of the linguistic processing, otherwise it's useless; you cannot communicate by prosody (alone).

    > But there there are some well-established data, such as the separation of verbal (left hemisphere) and non-verbal (right h.) communication.

    There is no such thing, from what I've read. Where do you get that from? And what is "non-verbal" communication? There is no reason to believe that the right hemisphere is not involved in linguistic tasks.