
posted by martyb on Wednesday November 04 2015, @05:34PM   Printer-friendly
from the festival dept.

Jessica Jones over at The Local continues reporting on an embarrassing gaffe in promoting a vegetable celebration in a town in northwestern Spain. The town, As Pontes, publicized its annual rapini festival on the town hall's website.

From the article:

A town hall in northwestern Spain was left red-faced after a Google Translate error led to it advertising its local leaf vegetable celebration as a much more X-rated affair.

One of the highlights of the year in the town of As Pontes in Galicia, northwestern Spain, is its annual rapini festival, when townsfolk celebrate the town's speciality, the leafy green vegetable similar to spinach.

[...] But when residents clicked on the Castilian Spanish version of the town's website - provided by Google Translate - to check the dates for next year's fest, they were shocked at the new turn the festival had apparently taken.

"The clitoris is one of the typical products of Galician cuisine," read the description of the festival on the Castillian Spanish version of the town hall's website, whose original version is written in Galician.

"Google translate recognized our Galician word grelo as Portuguese and translated into the Spanish clítoris," town hall spokeswoman Monserrat García, explained to The Local.

Google Translate changed Feira do grelo (Rapini Festival) into Feria Clítoris (Clitoris Festival), leading to some embarrassment when staff at the town hall discovered their error on Thursday.

While this is embarrassing for the folks in As Pontes, it raises some interesting questions about how far automated translation software (such as Google Translate) can be trusted.

Have any Soylentils run into issues like this while using automated translation software? Did anyone see the mistranslation and make travel plans based on it?


Original Submission

 
  • (Score: 1) by isj on Wednesday November 04 2015, @09:32PM

    by isj (5249) on Wednesday November 04 2015, @09:32PM (#258529) Homepage

    Anyone assuming that automatic translation is relatively error-free is naïve.

    Context, culture, argot, and idioms all make automatic translation difficult, and because there is often no 1:1 mapping between words and their semantics, detail and precision are usually lost not only in automatic translation but in manual translation as well. It is easy to find examples where automatic translation will fail, e.g.:
    "Yes, it is blue." Translating that into Italian requires you to know what "it" is. If "it" is the sky, then "blue" must be translated as "azzurro".
    "Could you handle that?" Translating that accurately into another language requires you to know whether the originator is British or not.

    Translation requires understanding and understanding requires knowledge of the world, which computers don't have.
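
    A tiny Python sketch of the context problem above; the referent argument, the hard-coded rule, and the translate_it_is_blue() helper are illustrative assumptions, not any real translator's API:

        # Made-up sketch: the same English sentence cannot be rendered in Italian
        # until the referent of "it" is known.
        def translate_it_is_blue(referent: str) -> str:
            """Translate 'Yes, it is blue' once we know what 'it' refers to."""
            colour = "azzurro" if referent == "sky" else "blu"  # the sky is idiomatically "azzurro"
            return f"Sì, è {colour}."                           # Italian normally drops the pronoun

        print(translate_it_is_blue("sky"))  # Sì, è azzurro.
        print(translate_it_is_blue("car"))  # Sì, è blu.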

  • (Score: 0) by Anonymous Coward on Wednesday November 04 2015, @10:03PM

    by Anonymous Coward on Wednesday November 04 2015, @10:03PM (#258545)

    [...] requires you to know what "it" is.

    It's raining. But what is the 'it' that rains?

    • (Score: 2) by maxwell demon on Wednesday November 04 2015, @11:50PM

      by maxwell demon (1608) on Wednesday November 04 2015, @11:50PM (#258579) Journal

      The cloud, of course. Rain is only available as cloud service. ☺

      --
      The Tao of math: The numbers you can count are not the real numbers.
    • (Score: 2) by Yog-Yogguth on Wednesday November 11 2015, @08:17AM

      by Yog-Yogguth (1862) Subscriber Badge on Wednesday November 11 2015, @08:17AM (#261654) Journal

      Cats and dogs of course! :D

      --
      Bite harder Ouroboros, bite! tails.boum.org/ linux USB CD secure desktop IRC *crypt tor (not endorsements (XKeyScore))
  • (Score: 2) by jelizondo on Wednesday November 04 2015, @11:22PM

    by jelizondo (653) Subscriber Badge on Wednesday November 04 2015, @11:22PM (#258568) Journal

    And it gets worse in some languages... I took classes in Yucatec Maya, and the word for a number varies with the type of thing being counted, for example:

    • junp'eel is one of something not-alive, junp'eel míis (one broom)
    • juntúul is one of something alive (but not plants), juntùul miis (one cat; the diacritical mark makes a difference in the pronunciation)
    • junkúul is one of plants, junkùul cho'ob (one tree)
    • juntsiit is one of something thin, like juntsiit ju'un (one piece of paper)

    And so on, depending on the characteristics of what is being described; I count about 13 different categories (living things, non-living things, plants, flat things, thin things, slices, sips, etc.).

    Bing does a somewhat decent job of translating Yucatec Maya, but you have to watch it, because a diacritical mark can turn one word into another, like míis and miis; spoken, you would not confuse the two, but in writing it is much easier to slip up.
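
    A toy lookup built only from the four examples listed above; the English category names and the one_of() helper are illustrative assumptions, not a real model of Yucatec Maya morphology:

        # Classifier table taken from the examples in the comment above.
        YUCATEC_CLASSIFIERS = {
            "inanimate": "junp'eel",  # junp'eel míis (one broom)
            "animate":   "juntúul",   # one cat
            "plant":     "junkúul",   # junkúul cho'ob (one tree)
            "thin":      "juntsiit",  # juntsiit ju'un (one piece of paper)
        }

        def one_of(category: str, noun: str) -> str:
            """Build the phrase 'one <noun>' by picking the classifier for the noun's category."""
            return f"{YUCATEC_CLASSIFIERS[category]} {noun}"

        print(one_of("animate", "miis"))    # juntúul miis (one cat)
        print(one_of("inanimate", "míis"))  # junp'eel míis (one broom)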

  • (Score: 2) by KritonK on Monday November 09 2015, @01:13PM

    by KritonK (465) on Monday November 09 2015, @01:13PM (#260746)

    It gets even worse if neither the source language nor the destination language is English. From some of the ludicrous results I've seen, it is pretty obvious that Google translates from the source language into English and then translates the intermediate English into the destination language. I don't remember any examples off-hand, but I've seen words that have only one meaning translated completely incorrectly. In every such case the English equivalent was ambiguous, so the incorrect translation could only have occurred if Google translated first into English, then flipped a coin and used the wrong meaning of the English word for the final translation.
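
    A hand-made Python illustration of that pivot: the word pair (Spanish uña going to French), the tiny dictionaries, and pivot_translate() are all made up for the sketch and say nothing about how Google Translate is actually implemented:

        # Spanish "uña" (fingernail) has no "metal nail" sense, but its English
        # equivalent "nail" does, and French distinguishes the two senses again.
        ES_TO_EN = {"uña": "nail"}              # unambiguous -> ambiguous
        EN_TO_FR = {"nail": ["clou", "ongle"]}  # metal nail vs. fingernail

        def pivot_translate(word: str) -> str:
            """Spanish -> English -> French, with a naive pick at the English step."""
            english = ES_TO_EN.get(word, word)
            candidates = EN_TO_FR.get(english, [english])
            return candidates[0]  # the coin flip: here it lands on "clou" (metal nail)

        print(pivot_translate("uña"))  # -> "clou", i.e. the wrong sense survives the pivot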

    I guess the accuracy of the translation also depends on the language being translated. If the language is a close relative of English, the translation might be passable. For other languages, e.g., Greek, Google is little more than a glorified dictionary: it is very good at suggesting translations for individual words, but when it comes to complete sentences the result often makes very little sense. It's a good thing, too, because many spammers have started sending spam in languages other than English using automatic translation, and Google's poor results make such spam very easy to spot.

    One could argue that you can get a rough translation from Google and then edit the result, but I have found that this is more work than translating from scratch.

    On a funny note, I recently made a blog post that I knew some people would try to read using automatic translation, so I inserted a reference to a hovercraft full of eels!

    • (Score: 1) by isj on Monday November 09 2015, @08:37PM

      by isj (5249) on Monday November 09 2015, @08:37PM (#260911) Homepage

      Google Translate even falls flat when translating from common Germanic languages, e.g. German, where, due to case and emphasis, the word order can differ. Example:
      "Die Katze hat der Hund gefressen"
      (the dog ate the cat), where Google Translate reverses the meaning even though all the grammatical clues are there.
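
      A hand-made sketch of that grammatical clue: German marks the subject by case, not by position. The tiny article table and the subject_of() helper are illustrative assumptions, not a real parser.

          # "der" unambiguously marks a masculine noun phrase as nominative
          # (ignoring its other uses, such as feminine dative, for this toy);
          # "die" can be either nominative or accusative.
          NOMINATIVE_ONLY = {"der"}

          def subject_of(first_np, second_np):
              """Pick the subject of a two-noun-phrase clause from its case marking."""
              (_, noun1), (art2, noun2) = first_np, second_np
              if art2 in NOMINATIVE_ONLY:  # "der Hund" can only be the subject...
                  return noun2             # ...even though it comes second
              return noun1

          # "Die Katze hat der Hund gefressen": the dog is the eater, the cat the eaten.
          print(subject_of(("die", "Katze"), ("der", "Hund")))  # Hund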

      I do find Google Translate useful when one of my acquaintances in Brazil writes in Portuguese. The rough translation plus my knowledge of Italian make it possible to decipher most messages, though not to catch the nuances.

      Reminds me of a wonderful response one of my Italian colleagues wrote to an upset customer: "We are hardly working on the problem". (Although no automatic translation was involved in that gaffe.)