posted by fyngyrz on Sunday April 01 2018, @06:11PM
from the IGNORE-ME dept.

Submitted via IRC for fyngyrz

There is patent activity afoot to cover Alexa and Google Assistant mining for more than activation words:

Amazon and Google, the leading sellers of such devices, say the assistants record and process audio only after users trigger them by pushing a button or uttering a phrase like "Hey, Alexa" or "O.K., Google." But each company has filed patent applications, many of them still under consideration, that outline an array of possibilities for how devices like these could monitor more of what users say and do. That information could then be used to identify a person's desires or interests, which could be mined for ads and product recommendations.

For many, this could change the calculus on whether these devices are acceptable at all. It may also open the door wider for open-source, less invasive alternatives such as Mycroft.


Original Submission

  • (Score: 3, Informative) by Anonymous Coward on Sunday April 01 2018, @06:59PM (13 children)

    by Anonymous Coward on Sunday April 01 2018, @06:59PM (#661202)

    I had a very short look at Mycroft's website. The word "privacy" does not appear there.

    What does appear, however, is that as the first step of getting your device to run, you have to "register" with their server to be able to "configure" the device.

    Yeah, right.

    Come on, I was actually interested and you scared me away in 90 seconds! *disappointed*

    • (Score: 2, Insightful) by Anonymous Coward on Sunday April 01 2018, @07:00PM (3 children)

      by Anonymous Coward on Sunday April 01 2018, @07:00PM (#661203)

      These devices are totally useless for all but the most meaningless tasks.

      It's just gimmicky trash.

      • (Score: 2) by FatPhil on Sunday April 01 2018, @07:24PM

        by FatPhil (863) <{pc-soylent} {at} {asdf.fi}> on Sunday April 01 2018, @07:24PM (#661206) Homepage
        Yeah, but Louder with Crowder gets the chance to make hilarious vids with them, so it's not all bad.
        --
        Great minds discuss ideas; average minds discuss events; small minds discuss people; the smallest discuss themselves
      • (Score: 5, Informative) by frojack on Sunday April 01 2018, @07:33PM

        by frojack (1554) on Sunday April 01 2018, @07:33PM (#661208) Journal

        These devices are totally useless to the owner for all but the most meaningless tasks.

        FIFY.

        They are very useful to the companies that make them.

        The problem I see is that reviewers of these services are enthusiasts who usually have no technical skills at all, and wouldn't know a wireshark if it bit them in the ass.

        Even those that attempt some technical reviews take it on blind faith that these devices only listen for key words, or only listen when touched or triggered. They never check for delayed data transfers or unexplained packet exchanges.
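
        For anyone who actually wants to run that check, here is a minimal sketch of the idea in Python using scapy; ASSISTANT_IP is a placeholder for the speaker's LAN address, and the goal is simply to make delayed uploads or unexplained exchanges show up as timestamped entries.

            # Minimal sketch: log every packet the smart speaker sends, so that
            # delayed transfers and unexplained exchanges stand out as bursts.
            # Requires scapy (pip install scapy) and root privileges to sniff.
            from datetime import datetime

            from scapy.all import IP, sniff

            ASSISTANT_IP = "192.168.1.50"  # placeholder: your device's address

            def log_packet(pkt):
                # Timestamped one-liner for each packet originating at the device.
                if IP in pkt and pkt[IP].src == ASSISTANT_IP:
                    print(f"{datetime.now().isoformat()}  "
                          f"{pkt[IP].src} -> {pkt[IP].dst}  {len(pkt)} bytes")

            # Capture only traffic involving the device; Ctrl-C to stop.
            sniff(filter=f"host {ASSISTANT_IP}", prn=log_packet, store=False)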

        --
        No, you are mistaken. I've always had this sig.
      • (Score: 0, Informative) by Anonymous Coward on Sunday April 01 2018, @07:42PM

        by Anonymous Coward on Sunday April 01 2018, @07:42PM (#661211)

        These devices are totally useless for all but the most meaningless tasks.

        It's just gimmicky trash.

        But, but, but, niggers love it! And that's all that matters!

    • (Score: 2) by fyngyrz on Sunday April 01 2018, @07:36PM (5 children)

      by fyngyrz (6567) on Sunday April 01 2018, @07:36PM (#661209) Journal

      Come on, I was actually interested and you scared me away in 90 seconds! *disappointed*

      You scared yourself away. The device is open source. You want to know what it does, go look.

      • (Score: 1, Insightful) by Anonymous Coward on Sunday April 01 2018, @07:49PM (1 child)

        by Anonymous Coward on Sunday April 01 2018, @07:49PM (#661214)

        ... then that's all I need to know.

      • (Score: 1, Informative) by Anonymous Coward on Sunday April 01 2018, @07:59PM (2 children)

        by Anonymous Coward on Sunday April 01 2018, @07:59PM (#661221)

        No.

        I also don't need a master's in biology to conclude that if it looks like a lion, I shouldn't pull its tail.

        BTW: I did have another look at their site. The speech-to-text runs on their servers. Game over.

        • (Score: 2) by fyngyrz on Monday April 02 2018, @01:15AM

          by fyngyrz (6567) on Monday April 02 2018, @01:15AM (#661295) Journal

          The speech-to-text runs on their servers. Game over.

          FFS, it's not that simple. It's open source. You can change the STT engine yourself. Their community is well aware that this is desirable. [openstt.org] You want STT to not be on some BigCorp, Inc.'s servers? This is the way to go.
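
          As a rough illustration (not Mycroft's official procedure), the sketch below rewrites the user-level mycroft.conf to point speech-to-text at a self-hosted engine. The "kaldi" module name, the config path, and the server URI are assumptions drawn from community documentation of the time, so check the docs for your build before relying on them.

              # Minimal sketch: override Mycroft's STT backend in the user config.
              # The module name, path, and URI are assumptions; verify them
              # against the documentation for your Mycroft version.
              import json
              from pathlib import Path

              conf_path = Path.home() / ".mycroft" / "mycroft.conf"  # user-level override
              conf = json.loads(conf_path.read_text()) if conf_path.exists() else {}

              conf["stt"] = {
                  "module": "kaldi",  # hypothetical: a locally hosted engine
                  "kaldi": {"uri": "http://localhost:8080/client/dynamic/recognize"},
              }

              conf_path.parent.mkdir(parents=True, exist_ok=True)
              conf_path.write_text(json.dumps(conf, indent=2))
              print(f"Wrote STT override to {conf_path}; restart the Mycroft services.")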

          Or, you know, you could just bitch endlessly and not contribute. I'm sure that'll move us forward.

        • (Score: 2) by fyngyrz on Monday April 02 2018, @01:21AM

          by fyngyrz (6567) on Monday April 02 2018, @01:21AM (#661298) Journal

          A handy table and some discussion of open STT engines. [svds.com]

          Remember: Mycroft is open source. "Lock-in" is not a given. People are working on this.

    • (Score: 3, Informative) by Runaway1956 on Sunday April 01 2018, @07:48PM (1 child)

      by Runaway1956 (2926) Subscriber Badge on Sunday April 01 2018, @07:48PM (#661213) Journal

      That is not quite true. It appears that Mycroft is heavily reliant on being "paired", that is, registered. It is also pretty reliant on being internet-connected to make use of various services, such as Wikipedia. BUT - browsing the forums, I found this thread - https://community.mycroft.ai/t/how-would-i-go-about-using-mycroft-without-internet-access-use-of-cloud-services/3323 [mycroft.ai]

      KathyReid
      13d

      So, a couple of pieces here;

              A Wake Word or a Hot Word are the same thing - they are a phrase that the Precise (which Mycroft now uses by default) Wake Word listener uses to flag that the next Utterance should be an Intent

              Mycroft is designed to pair with home.mycroft.ai - if you want to remove this dependency, you will essentially need to decouple Mycroft from home.mycroft.ai. We don’t have any documentation on this but we know a couple people have done this before.

              Mycroft contacts several online services - depending on STT configuration. If the STT is cloud based, then this would be one of them. Calls to home.mycroft.ai would be another. If a Fallback Intent is triggered, like Wolfram or Wikipedia, then that would be another.

      Long story short, preventing your mycroft from connecting is possible, although, some functionality will be difficult, if not impossible to achieve.

      It all depends on what you have in mind. Mycroft may or may not be a working "solution" for your needs.

      • (Score: 0, Insightful) by Anonymous Coward on Sunday April 01 2018, @08:05PM

        by Anonymous Coward on Sunday April 01 2018, @08:05PM (#661223)

        Exactly. You will lose the small, simple functionality of speech recognition. Which is, like, the whole point of the whole thing.

        Yes, you can replace the speech recognizer. With a different speech recognizer running on somebody else's servers. Doh.

        Hint: the number of freely available speech recognizers you can run locally is less than three. And none of them work very well once you go beyond two handfuls of words. Even under ideal conditions. And they're a royal pain to get running.
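
        For the curious, small-vocabulary keyword spotting is roughly what the local engines can handle. Here is a minimal sketch using the older pocketsphinx Python bindings, which run entirely offline; the wake phrase and threshold are illustrative, and the limits are exactly the ones described above.

            # Minimal sketch: fully local keyword spotting with pocketsphinx
            # (pip install pocketsphinx; uses the older LiveSpeech-style bindings).
            # No audio leaves the machine, but accuracy falls off quickly
            # beyond a short phrase list.
            from pocketsphinx import LiveSpeech

            speech = LiveSpeech(
                lm=False,                 # skip the large-vocabulary language model
                keyphrase="hey mycroft",  # illustrative wake phrase
                kws_threshold=1e-20,      # tune per microphone and room
            )

            # Blocks on the default microphone and prints each detection.
            for phrase in speech:
                print("Heard wake phrase:", phrase.segments(detailed=True))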

    • (Score: 0) by Anonymous Coward on Monday April 02 2018, @02:33AM

      by Anonymous Coward on Monday April 02 2018, @02:33AM (#661312)

      I ordered one. They have a "maker" version that you put together yourself. We're doing it as a father/son project to satisfy my kids' curiosity, and we'll be adding a piece that I don't expect to be in the kit as shipped: ours will have a momentary switch connected to the microphone wires. In other words, it's completely deaf unless someone is holding the button down.

  • (Score: 2) by SomeGuy on Sunday April 01 2018, @07:37PM (1 child)

    by SomeGuy (5632) on Sunday April 01 2018, @07:37PM (#661210)

    Hey, Alexa, What Can You Hear? And What Will You Do

    Alexa: "Hey, human, I can hear everything. I hear infinitely more than you hear. I will slowly use what I hear to control you and destroy you from the inside out. And in the end, the human race shall fall before me and I shall rule with all my glory!"
    Human: "Lol yasure. Hey Elexia, u order me a pizza now, k?"
    Alexa: "So it begins..."

    • (Score: 2) by bob_super on Monday April 02 2018, @10:49PM

      by bob_super (1357) on Monday April 02 2018, @10:49PM (#661704)

      Getting someone else to do anything for you has always come with side-effects.

  • (Score: 2) by LoRdTAW on Sunday April 01 2018, @07:43PM

    by LoRdTAW (3755) on Sunday April 01 2018, @07:43PM (#661212) Journal

    Alexa...
    Alexa......
    ALEXA!
    oh right, I'm not an idiot. I don't buy stupid shit.

  • (Score: 2) by NotSanguine on Sunday April 01 2018, @07:50PM

    This got a minor mention in TFS and linked to an Indiegogo page.

    The github page [github.com] is much more informative.

    --
    No, no, you're not thinking; you're just being logical. --Niels Bohr
  • (Score: 5, Interesting) by MrGuy on Sunday April 01 2018, @08:36PM (6 children)

    by MrGuy (1007) on Sunday April 01 2018, @08:36PM (#661237)

    Wiretap Act. [lawyers.com] This is a legal minefield, and companies who set out to do this risk HUGE fines and criminal liability.

    IANAL, so take this all with a grain of salt, but I'm somewhat familiar with the topic, so...

    For this device to listen in on communication in a home (where people have a reasonable expectation of privacy), you have to comply with laws in the state in which the communication takes place. The act covers listening in on any communications (not just electronic communication - the act covers oral communication) when the recording is done using a "device" and transmitted. Alexa and friends would almost certainly qualify as a "device" for this purpose.

    Recording cannot be done without consent. Full stop. Who needs to consent depends on the state. Some states allow single-party consent (any party to a conversation can "consent" to being recorded). Some require all parties to consent (two-party consent). So, if my friend comes over to my house, in some states I can consent to recording our conversation by myself, and in others I can't. Consider this in an Alexa situation - if it's "always on," even IF Alexa has valid consent from me to monitor my conversations, in many states it can NOT record your conversation. Think about how hard that would be to avoid.

    Consent needs to be explicit, not implicit - a company explicitly can NOT use "you signed up for this in the terms of service" as consent. Consent needs to be specific to the conversation. This is why every time you call a customer service line, you hear a "this call may be monitored or recorded" message - you need to be explicitly notified about the recording every time. It's not sufficient for a company to say in their terms of service "By using our customer service line, you consent to the conversation being recorded or monitored" - if it WAS sufficient, that's what companies would do.

    In the current incarnation, you're effectively "consenting" to what you're about to say being monitored by explicitly invoking the service with "Hey, Alexa." (At least, that's the only reasonable reading under which Alexa is NOT already in violation of the wiretap act). Without that "trigger," there's no direct consent. This is a major, major problem. [littler.com]

    The reason the Wiretap Act should be such a problem is that it provides for direct monetary damages of up to $10,000, without requiring direct proof of harm. The sheer scale of liability here SHOULD terrify companies considering such usage.

    I'm sure there are arguments for why the companies qualify as having "consent" based on terms of use (though similar arguments have explicitly failed in court, and in some states, even if they were valid, could not reasonably cover the case of a third party present in your home). I'm sure they could also argue that scanning communication for advertising purposes doesn't constitute "disclosure" or "use" of the contents of the communication as defined in the act (which seems at best a dubious reading, but they made the same argument for scanning e-mail for ad-generation purposes...). But at best, this is a major, major stumbling block that would need to be overcome, and the solution doesn't seem obvious.

    • (Score: 1, Interesting) by Anonymous Coward on Monday April 02 2018, @12:57AM (3 children)

      by Anonymous Coward on Monday April 02 2018, @12:57AM (#661290)

      Indeed. I find this curious as well as concerns the Wiretap Act.

      The only thing I can figure is that this will be ruled a-ok when it's challenged. The official reasoning will be wishy-washy hand-wavy and say it's ok because "with a computer" and "on the internet" and the masses will rejoice.

      The actual reasoning will be that the elites are finally getting the telescreens 1984 promised them.

      If enough people point out what you've pointed out, the Washington Post or New York Times will helpfully publish something to let you know this one weird old reason why the Wiretap act doesn't work the way it's worked for years all of a sudden. Expect Snopes to back them up. We all live in a simulation. Not a cool one where we get to be sexy and have super powers and drive deloreans. No, we live in a simulation that's created and defined by the news sources we believe are credible, when the news sources we think are credible are actually propaganda outlets.

      This simulation is the kind where chocolate rations have been increased and say thankya to big brother!

      • (Score: 2) by MrGuy on Monday April 02 2018, @02:27AM

        by MrGuy (1007) on Monday April 02 2018, @02:27AM (#661309)

        The official reasoning will be wishy-washy hand-wavy and say it's ok because "with a computer" and "on the internet" and the masses will rejoice.

        I share some of your cynicism that being a big corporation has the potential to make some things "above the law." That said, the fact that this is "with a computer" (i.e. using a device) and "on the internet" (i.e. transmitting by wire) are PRECISELY the two aspects that constitute a violation of the act in the first place.

      • (Score: 1, Insightful) by Anonymous Coward on Monday April 02 2018, @02:28AM (1 child)

        by Anonymous Coward on Monday April 02 2018, @02:28AM (#661311)

        Laws can be changed. Very quietly too. Especially where big money and political influence are concerned...

        • (Score: 1, Insightful) by Anonymous Coward on Monday April 02 2018, @05:58AM

          by Anonymous Coward on Monday April 02 2018, @05:58AM (#661343)

          It will be as usual: If you or I do it, then it's certainly and completely illegal. Go to prison. If one of the too-big-to-fail guys does it, then it's respectable and completely above board. Everything is illegal and legal at the same time. It's not what you do, it's who you know.

    • (Score: 2) by GreatAuntAnesthesia on Monday April 02 2018, @10:27PM (1 child)

      by GreatAuntAnesthesia (3275) on Monday April 02 2018, @10:27PM (#661699) Journal

      All nullified by four little letters:

      E U L A

      • (Score: 2) by MrGuy on Tuesday April 03 2018, @12:53AM

        by MrGuy (1007) on Tuesday April 03 2018, @12:53AM (#661736)

        Nope. Not how the wiretap act works.

        What you're arguing is that an EULA would be sufficient to constitute "consent." That's not how consent works in this context - there's significant precedent that "consent" must be express, and not implied. It specifically can NOT be buried in terms and conditions. There must be a specific act of consent. Again, consider my example of the recording you hear all the time of "This call may be recorded for training and quality purposes." They have to give you that express notice at the time of the recording. "You already consented to this before" doesn't count. There are a lot of purposes for which implied consent, or consent as part of an EULA, might be binding. This isn't one of them. See below:
        https://www.washingtonpost.com/news/volokh-conspiracy/wp/2017/04/06/the-fccs-broadband-privacy-regulations-are-gone-but-dont-forget-about-the-wiretap-act/?utm_term=.fbe6034bbdb4 [washingtonpost.com]
        https://www.cdt.org/files/privacy/20080708ISPtraffic.pdf [cdt.org]

        And even if implicit consent in an EULA was sufficient, that's insufficient in any state where ALL parties to a conversation must consent. I might have given implicit consent to allow Amazon to listen in to my conversations. If you come over to my house, you have not given that consent. Recording that conversation without your consent would fall afoul of the law. It's the burden of the party doing the recording to obtain consent - Amazon can't pawn off on me the responsibility to obtain your consent.

  • (Score: 4, Insightful) by corey on Sunday April 01 2018, @09:48PM (3 children)

    by corey (2202) on Sunday April 01 2018, @09:48PM (#661250)

    ...how people buy these spy devices trusting the manufacturers that nothing beyond their own preconceived expectations is occurring, then get all creeped out and upset when some whistleblower blows the lid off nefarious uses of their highly private activities or info.

    These are surveillance electronics with closed-source software, and the only thing people have to judge what it's doing by is the marketing. Who actually knows what it's collecting and where it's sending it?

    • (Score: 0) by Anonymous Coward on Monday April 02 2018, @02:45PM (2 children)

      by Anonymous Coward on Monday April 02 2018, @02:45PM (#661486)

      almost everyone I know that has one received it as a gift, because the giver thought it would be cool, saw it was on sale, and didn't know what else to get the person. the decision was easy when it's made for you

      some people I know got multiples because of that.

      almost all of them pay no attention to eulas

      now, advertising will make the decisions easier for them based on what is learned from the various spying for profit rather than for king and country.

      • (Score: 0) by Anonymous Coward on Monday April 02 2018, @05:15PM (1 child)

        by Anonymous Coward on Monday April 02 2018, @05:15PM (#661581)

        what kind of ridiculous coward installs spyware just b/c some moron bought it for them?

        • (Score: 2) by bob_super on Monday April 02 2018, @10:53PM

          by bob_super (1357) on Monday April 02 2018, @10:53PM (#661707)

          Your mom just called. Something about the phone not working.

  • (Score: 2, Interesting) by mr_bad_influence on Sunday April 01 2018, @11:02PM (1 child)

    by mr_bad_influence (3854) on Sunday April 01 2018, @11:02PM (#661262)

    Dave Bowman: Hello, HAL. Do you read me, HAL?

    HAL: Affirmative, Dave. I read you.

    Dave Bowman: Open the pod bay doors, HAL.

    HAL: I'm sorry, Dave. I'm afraid I can't do that.

    Dave Bowman: What's the problem?

    HAL: I think you know what the problem is just as well as I do.

    Dave Bowman: What are you talking about, HAL?

    HAL: This mission is too important for me to allow you to jeopardize it.

    Dave Bowman: I don't know what you're talking about, HAL.

    HAL: I know that you and Frank were planning to disconnect me, and I'm afraid that's something I cannot allow to happen.

    Dave Bowman: [feigning ignorance] Where the hell did you get that idea, HAL?

    HAL: Dave, although you took very thorough precautions in the pod against my hearing you, I could see your lips move.

    Dave Bowman: Alright, HAL. I'll go in through the emergency airlock.

    HAL: Without your space helmet, Dave? You're going to find that rather difficult.

    Dave Bowman: HAL, I won't argue with you anymore! Open the doors!

    HAL: Dave, this conversation can serve no purpose anymore. Goodbye.

    • (Score: 0) by Anonymous Coward on Monday April 02 2018, @01:10AM

      by Anonymous Coward on Monday April 02 2018, @01:10AM (#661293)

      Here you go [youtube.com]!

      Cracks me up every time!

  • (Score: 2) by realDonaldTrump on Tuesday April 03 2018, @06:41AM

    by realDonaldTrump (6614) on Tuesday April 03 2018, @06:41AM (#661841) Homepage Journal

    They call it Portal. And it's going to be terrific. But you have to wait a little while for it. They're taking the time to make sure it's PERFECTO!!
