
posted by CoolHand on Tuesday July 03 2018, @07:19PM
from the just-a-lil-bit-of-spyin' dept.

Submitted via IRC for SoyGuest52256

According to the patent, spotted by Metro, the system would use 'a non-human hearable digital sound' to activate your phone's microphone.

This noise, which could be a sound so high-pitched that humans cannot hear it, would contain a 'machine recognisable' set of Morse code-style beeps.

Once your phone hears the trigger, it would begin to record 'ambient noise' in your home, such as the sound of your air conditioning unit, plumbing noises from your pipes and even your movements from one room to another.

Your phone would even listen in on 'distant human speech' and 'creaks from thermal contraction', according to the patent.

TV advertisers would use this data to determine whether you had muted your TV or moved to a different room when their promotional clip played.

Source: http://www.dailymail.co.uk/sciencetech/article-5882587/Facebook-wants-hide-secret-inaudible-messages-TV-ads-force-phone-record-audio.html


Original Submission

 
  • (Score: 3, Insightful) by Anonymous Coward on Tuesday July 03 2018, @07:52PM (19 children)

    by Anonymous Coward on Tuesday July 03 2018, @07:52PM (#702136)

    You were sending crash dumps without asking the user? That's terrible.

  • (Score: 2) by VLM on Tuesday July 03 2018, @08:39PM (18 children)

    by VLM (445) on Tuesday July 03 2018, @08:39PM (#702156)

    That's terrible.

    Eh, whatevs.

    I can empathize with users who have no idea what the company is doing with the data. I mean, from the inside, I know all that was happening with the data was that bugs were found and fixed faster; nobody does anything else with the microscopic amount of data reported, it all gets erased, and none of it that I'm aware of ever included PII, so I really don't care. I'm sure someone with paranoia problems would absolutely assume that crash dump allows space aliens to control their minds and lets the Russians steal their precious bodily fluids. Individual paranoia can be damaging to the population at large, not just to the sufferer. To some extent opting out is exactly like anti-vaxxer people.

    It's like demanding car accident reports be kept secret for privacy reasons no matter how anonymized the data is or how much societal damage results if crashes are kept secret. Would you be more, or less, safe WRT a Tesla self-driving car, for example, if you knew that every reported self-driving accident needed permission from the manufacturer to be reported? Or if, every time there was a possibly fatal accident, a victim who was a total antisocial jerk could "keep it private" and not let the manufacturer know about it, such that some other poor bastard gets into an identical (possibly fatal?) accident? Or WRT traffic engineering design of intersections: if enough accidents were kept private, a lot of people would die because nobody would know which intersections need fixing.

    Also there's a lot of individual history involved... multi-user unix admins laugh at the idea that they need permission from every user to fix a bug involving a core dump, or an error message logged to /var/log/syslog, or a kernel panic. Or a public utility background: like, seriously, a 5ESS switch crashes or a DEXCS shuts down, you fix it, you don't obtain permission from every telephone line in the city... On the other hand, people brought up on apps and solely apps seem horrified at the idea that someone somewhere is "broadcasting an ip address", like the scammy banner ads from twenty years ago used to fearmonger about. Ah, the webserver kernel panicked for no apparent reason, sig 11 or whatever? Well, I can't fix that or even look at /var/log/syslog until I have documented proof that everyone listed in the access_log gives permission to troubleshoot the issue, sorry.

    GDPR has its heart in the right place, but it's definitely written without any technical input. It would be exactly like having a bunch of inexperienced political science students invent their own replacement for the existing ASME steam boiler codes and then forcing every engineer to implement their pipe dream. The idea of a near legally binding design code for steam boilers is a great idea; it's just that the whole planet would be better off with a steam boiler code written by engineers rather than well-meaning art history majors.

    • (Score: 5, Insightful) by Anonymous Coward on Tuesday July 03 2018, @09:01PM (11 children)

      by Anonymous Coward on Tuesday July 03 2018, @09:01PM (#702169)

      To some extent opting out is exactly like anti-vaxxer people.

      I know you're pretty nuts, but I would have thought you'd have more respect for privacy concerns. Maybe your company isn't doing shady shit, but others are, and there is no way for joe random to know. Additionally, even if you're not selling the data, your company might become a target and the data could be stolen. Not having it in the first place because someone opted out seems like the safest bet.

      Maybe you'll change your tune if we can get legislation that lets people easily sue companies that violate privacy knowingly or not. Like the "do not call" list.

      I guess I'll be happy you're not calling GDPR a subversive communist attempt to destroy capitalism and democracy.

      • (Score: 2) by VLM on Tuesday July 03 2018, @09:19PM (10 children)

        by VLM (445) on Tuesday July 03 2018, @09:19PM (#702182)

        there is no way for joe random to know

        Exactly; a better solution than the GDPR would implement that, somehow. Sort of a SOX-style declaration that nothing shady is going on and the auditors agree, rather than the GDPR we got stuck with.

        Trust me dude, as part of the cleanup I got to see the crash logs, and there is Nothing in there, absolutely nothing, worth selling or stealing, so... I mean, on that specific cleanup project, if you had root on the app and could take anything you wanted, the app itself had nothing worth stealing to begin with... I know that, and they know that, and that only leaves the roughly 324,999,900 other people in the USA who have no way of knowing nothing interesting was going on and who assume instead that what was going on was the Facebook thing from the article: recording audio from their microphone for whatever big-brother insanity they plan.

        I guess I'll be happy you're not calling GDPR a subversive communist attempt to destroy capitalism and democracy.

        It's too crazy. In the future it's going to be looked back upon like those crazy laws on clickbait sites, the "In Waynesboro, Virginia It's Illegal For A Woman To Drive Up Main Street" type of stuff. Kids are going to read about the bad old days, GDPR will probably get a LOL paragraph, and the kids will ask questions in class like "so... this was a result of legalizing weed, right?" and stuff like that. It's such a bad law it's almost orthogonal to reality, not merely oppositional to capitalism or democracy.

        The GDPR is like: you'd better treat all the normal residential garbage in a trash can as if it's entirely made of written copies of the nuclear missile launch codes and CA root certs and lists of SS numbers, unless you're a giant corporation with a lot of money and lawyers, or a crook, in which case you can go right on screwing everyone over without any interference from the government at all. And the above-the-law people are ironically the only source of trouble to begin with. In that way GDPR is kinda like gun control; the people most likely to obey the gun control laws (like, say, the cleanup BS project I got catapulted into) are exactly the self-selected sub-population least likely to be involved in gun-related crimes, and vice versa.

        • (Score: 2) by bitstream on Tuesday July 03 2018, @09:38PM (9 children)

          by bitstream (6144) on Tuesday July 03 2018, @09:38PM (#702196) Journal

          Auditors will become lazy and bribed. Human nature thing.

          And crash dumps, logs, etc. may contain sensitive data inadvertently. But a combination of some automatic scrubbing and checking a user setting of "send debug logs to author" should provide a suitable solution.

          Regardless. Laws are usually written by law people and possibly politicians, who would outlaw gravity if it were deemed problematic and more "just" .... ;-)
          Neither facts nor practicality need apply.

          • (Score: 2) by VLM on Tuesday July 03 2018, @10:23PM (6 children)

            by VLM (445) on Tuesday July 03 2018, @10:23PM (#702211)

            And crash dumps, logs, etc. may contain sensitive data inadvertently.

            No dude, it's not inadvertent. Not at all. This late in the holiday I'm still sober enough to log into the Firebase console on another monitor while typing this, WRT that GDPR cleanup project, and there are three NPEs and one SQL constraint violation in the hopper. At least with Crashlytics you have to explicitly and intentionally enable custom logging, set custom key-value pairs, or intentionally and explicitly set user identifiers. If you don't, you don't get "sensitive data" logged in crash dumps. It doesn't just "magically appear".
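
            Roughly what those opt-in calls look like, as a sketch from memory of the Fabric-era Crashlytics API (the log line, key name, and user id here are made up for illustration); none of this leaves the device unless a dev writes something like it on purpose:

                import com.crashlytics.android.Crashlytics;

                class CrashReportingExtras {
                    static void addOptInDetail() {
                        Crashlytics.log("user tapped save on the edit screen"); // custom breadcrumb, opt-in only
                        Crashlytics.setString("sync_state", "dirty");           // custom key-value pair, opt-in only
                        Crashlytics.setUserIdentifier("user-12345");            // explicit user identifier, opt-in only
                    }
                }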

            I'm looking at it right now, and there is no user data logged BY DEFAULT other than the call stack and the exact line of code where the crash happened.

            Yes, crazy as it sounds, even an "android.database.sqlite.SQLiteConstraintException" only logs "Unique constraint failed", the name of the SQLite column defined as unique in the schema, and the exact Java class name and line number in that class that blew it up (63, for whatever it matters), and that's it. No actual data, no row data, no variable contents, nothing nothing nothing. Certainly no PII, HIPAA data, nuclear missile launch codes, recorded mic audio, sexting pics, or SN unames and passwords; nothing but those things: the error being a SQLite issue, the specific issue being a schema constraint violation, the column in the schema that was violated, and the exact line number of source code causing it. Oh, and it happened at 3:40:00pm yesterday, plus or minus five minutes; all the crashes seem rounded to five-minute resolution. Sorry if I just violated some poor bastard's privacy, but I simply have no cares if they're that clinically paranoid..... You can feel pity for someone insane, and be nice to them, but you can't take their delusions seriously. This is simply not a problem for non-insane people.

            Now, a crooked company could intentionally, with malice aforethought, explicitly add custom logging to send in your login password or the sweet nothings you texted your spouse, but... not here. Like I implied elsewhere, it's precisely the crooked companies that are not going to even attempt to follow GDPR, because they're crooks, and non-crook companies like the project I got catapulted into are precisely the kind of people you DON'T need to worry about but who are going through the wringer.

            The NPEs list the timestamp, obviously, "attempt to read from field blah on a null object reference", and the exact class name and source code line number (125 in this case). I am too lazy to check the reported version against the bug database to tell, but "usually" this is poor handling of views, where a view goes away while someone still has a reference to it (like rotating the screen while something is updating or whatnot). You're supposed to use LiveData to "fix" all that foolishness and do all your biz logic in the viewmodel, but WTF. Who knows, maybe hardware failure madness, I dunno. No, I do not get any more data. Hard to describe a null as violating someone's privacy (that's a joke, kinda). Obviously it's more a source-code bug than a data/privacy issue. But in summary, no, NPEs don't leak private data accidentally or any of that BS.

            My guess is a new "feature" of Crashlytics will be to remove all that possibly privacy-violating stuff. Bare basic Crashlytics gives you the class name, maybe a variable name, a generic error message, and the line of source code. Not your social security number or Facebook password.

            There are some details I left out that are irrelevant. I know the Android version of the crashing device. I know the app version they were running (recent, at least). When the NPE happened it was in portrait mode... The device that crashed had 821.64 MB free, so it wasn't an OOM event. That's about it. I have no UUID, no device ID, no MAC, no IP address, no phone number, no WiFi ESSID; basically, nothing.

            Anyone with a specific question about Crashlytics could set up a free account, write a five-line "app" with an activity that calculates 1/0, and see for themselves, or ask me if I'm still sober enough to respond...
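
            If anyone wants to try it, that throwaway "app" is roughly the following (a sketch; it assumes Crashlytics is already wired up in the Application class, and the class name is made up):

                import android.app.Activity;
                import android.os.Bundle;

                // Crashes on launch so a report shows up in the Crashlytics console.
                public class CrashOnLaunchActivity extends Activity {
                    @Override
                    protected void onCreate(Bundle savedInstanceState) {
                        super.onCreate(savedInstanceState);
                        int zero = 0;
                        int boom = 1 / zero;   // ArithmeticException: divide by zero
                    }
                }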

            Firebase Performance is interesting. Unless you enable extra traces, all you get to see is that the worldwide average app_start time for this app over the last 30 days is 708 ms, which is supposedly not bad. If a new version altered the average startup time for good or bad, that would be interesting. This app does not hit the internet, but Performance can be used to analyze HTTP response latency and codes, although obviously I have no data because this app doesn't do that: the ratio of 200 vs 404 codes, and response times below 2 s, 2-6 s, and above 6 s, by geographic area.
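
            An "extra trace" is just an explicit start/stop around whatever you want timed; a sketch (the trace name here is made up for illustration):

                import com.google.firebase.perf.FirebasePerformance;
                import com.google.firebase.perf.metrics.Trace;

                class StartupTiming {
                    // Opt-in custom trace; what shows up in the Performance console is the timing.
                    static void timedLoad(Runnable work) {
                        Trace trace = FirebasePerformance.getInstance().newTrace("load_main_screen");
                        trace.start();
                        work.run();     // the code being timed
                        trace.stop();
                    }
                }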

            Analytics is pretty empty for this app. They're logging every time any fragment is hit, so I can see that X% of users hit fragment (screen) A vs Y% who looked at screen B, over various time intervals up to the previous month. This particular company does NOT add additional logging and only added logging to show that a fragment (screen) was loaded. There's some fuzzy-looking math about total audience numbers, which is interesting. YES, you could explicitly and intentionally, with malice, add "user properties" to log your passwords, dog's name, and sexting pics in each upload, but this company, not being crooks... doesn't.
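
            The sum total of the logging I'm describing is on the order of this (a sketch, not this company's actual code; the event and parameter names are made up):

                import android.content.Context;
                import android.os.Bundle;
                import com.google.firebase.analytics.FirebaseAnalytics;

                class ScreenLogger {
                    // Records only that a named screen (fragment) was loaded, nothing else.
                    static void logScreenLoaded(Context context, String screenName) {
                        Bundle params = new Bundle();
                        params.putString("screen_name", screenName);
                        FirebaseAnalytics.getInstance(context).logEvent("screen_loaded", params);
                    }
                }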

            In summary, no dude, "sensitive data" is NOT uploaded unless the programmers explicitly and intentionally, with malice, are dirtbags. By default nothing sensitive is uploaded or could possibly appear even by accident. That's why I'm pretty cavalier about it; I KNOW it's not a privacy violation. Although I sympathize that the general public has no way of knowing.

            • (Score: 2) by VLM on Tuesday July 03 2018, @10:47PM (5 children)

              by VLM (445) on Tuesday July 03 2018, @10:47PM (#702218)

              Looking more closely at the version numbers, I am not working the afternoon before the 4th, but clearly someone IS working ... One of the NPEs in the hopper crashed this afternoon on a phone described as "Android SDK built for x86" so ...

              As a cultural phenomenon, the crash hopper at a multi-person company ends up about like you'd expect, with orphaned junk in it that doesn't age out until it's auto-deleted in 30 days; so naturally some dev didn't delete the other NPEs, and no one else will, because it's kind of sabotage if someone actually needed that data (who?), and there's no theoretical way to reverse the data and de-anonymize who crashed their emulator, so it's impossible to figure out who done it. With more motivation and less alcohol I could log into the VPN, then rdesktop to the cluster to get into the GitLab, to see who was making commits to that Java class around the time of the NPE, which doesn't exactly prove who caused the problem but does implicate someone who fixed it, but... I'm lazy. Which makes the point that it's hard to violate someone's privacy if even the guys working there with source code access still can't effectively de-anonymize a crash report.

              Firebase and Crashlytics and all that are like a network test tool; the fact that a crook could pingflood someone with it doesn't prove everyone with that binary is in fact a criminal requiring extensive regulation; in fact, most people are not criminals. Again, the fairly obvious gun-control analogy: the only people following the draconian rules are by definition exactly the people you don't need to worry about and who shouldn't be subjected to the draconian rules.

              • (Score: 3, Informative) by bitstream on Wednesday July 04 2018, @12:19AM (1 child)

                by bitstream (6144) on Wednesday July 04 2018, @12:19AM (#702264) Journal

                I think our perspectives may differ. I'm thinking primarily of "core dumps". They may contain memory regions holding all sorts of data.
                Your logging stuff may be way less prone to inadvertent data leakage.

                • (Score: 2) by VLM on Thursday July 05 2018, @03:21PM

                  by VLM (445) on Thursday July 05 2018, @03:21PM (#703012)

                  Ehh... Android Studio is free, Google Firebase with Crashlytics and Analytics is free; if you don't believe me, you can log in and try this stuff yourself for nothing but a few hours of your time.

                  The computing world being very big, there could be some competitor product you're talking about that I'm unaware of which uploads your complete photo gallery and stored website passwords with every NPE; but I can assure you, that's not the case with anything I have experience with.

                  I'm not kidding: you get a crash in Crashlytics, it doesn't have memory dumps or variable contents, which sometimes makes debugging weird, yet merely knowing what crashed, and when, is often enough data to fix stuff. I suppose a real jerk could embed single bits of data at a time in the backtrace by having string-to-binary functions recursively call each other and then do a 1/0 to drop a crash where the backtrace is a binary representation of "secret data", but it's impossible to prevent active intentional malice.
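
                  To be concrete about that jerk move, it would look roughly like this (purely illustrative sketch; nothing remotely like it exists in any code I've worked on). The sequence of stack frames in the resulting crash report spells out the bits:

                      // Encode a secret's bits as the sequence of stack frames, then crash on
                      // purpose so the backtrace carries the data. Don't do this.
                      class BacktraceSmuggler {
                          static void leak(String bits, int i) {
                              if (i == bits.length()) {
                                  int z = 0;
                                  int boom = 1 / z;   // deliberate crash; the frames above encode the bits
                                  return;
                              }
                              if (bits.charAt(i) == '1') one(bits, i + 1); else zero(bits, i + 1);
                          }
                          static void one(String bits, int i)  { leak(bits, i); }
                          static void zero(String bits, int i) { leak(bits, i); }
                      }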

                  Maybe Apple dev tools contain curious stuff in their crash reports; again, I've only worked on Android using standard google services with one non-scummy company.

              • (Score: 0) by Anonymous Coward on Wednesday July 04 2018, @03:45AM (2 children)

                by Anonymous Coward on Wednesday July 04 2018, @03:45AM (#702360)

                What happens when you are not in that company and an asshole psychopath replaces you?

                • (Score: 2, Funny) by Anonymous Coward on Wednesday July 04 2018, @06:52AM

                  by Anonymous Coward on Wednesday July 04 2018, @06:52AM (#702404)

                  What happens when you are not in that company and an asshole psychopath replaces you?

                  You mean, how will they tell the difference?

                • (Score: 1, Funny) by Anonymous Coward on Wednesday July 04 2018, @03:14PM

                  by Anonymous Coward on Wednesday July 04 2018, @03:14PM (#702561)

                  What happens when you are not in that company and an asshole psychopath replaces you?

                  Are they looking for one? I'm available.

          • (Score: 0) by Anonymous Coward on Wednesday July 04 2018, @02:41AM (1 child)

            by Anonymous Coward on Wednesday July 04 2018, @02:41AM (#702321)

            "Auditors will become lazy and bribed. Human nature thing."
            Oh how I wish. However, the reality is somewhat depressing.
            Auditors will make themselves necessary. It's their reason for existence.
            I do SOX. Been there, done that. Auditors will not go away, they will weasel their way into everything they possibly can.
            I loathe most auditors because they have no understanding of what they're auditing. Or the language of the people they're auditing.
            Deloitte is pure evil, they should hire native speakers to conduct their audits.

            • (Score: 3, Interesting) by bitstream on Wednesday July 04 2018, @07:11AM

              by bitstream (6144) on Wednesday July 04 2018, @07:11AM (#702413) Journal

              SOX = Sarbanes–Oxley Act of 2002?

    • (Score: 0) by Anonymous Coward on Tuesday July 03 2018, @09:11PM (1 child)

      by Anonymous Coward on Tuesday July 03 2018, @09:11PM (#702175)

      I would think that if your app merely compiled a report and then prompted the user before sending it, as is common in some Linux UI software I use, that would keep you in the clear for GDPR, since the user is sending it, and it would satisfy users who'd rather not send it - either because they're paranoid, or savvy enough to know it was their own damn fault and don't want to waste your time.

      • (Score: 2) by VLM on Tuesday July 03 2018, @09:38PM

        by VLM (445) on Tuesday July 03 2018, @09:38PM (#702197)

        savvy enough to know it was their own damn fault and don't want to waste your time.

        Trying not to get into the weeds, but one guy's lack of input sanitization might be someone else's SQL injection attack or sneaky buffer-overflow attack, thinking back on the saga of Little Bobby Tables. Fixing input sanitization mistakes should be a developer decision, not a user decision.

        Also, Crashlytics isn't really that kind of "system level" thing; it's a linked-in, user-level library. You can crash an app on startup before you get situated enough to show that little upload dialogue, if you screw up a database migration badly enough, perhaps. A system-level crash reporter built into Android, operating at a level above the app, would be a great idea, but it's not here today. Maybe that'll be the end result, and the app dev will have no involvement in crash reporting at all other than as a possible recipient of reports.

        Another long-term outcome is likely removal of features from Crashlytics such that, however helpful it can be, you can no longer include arbitrary data from the app. Oh well. Paranoid people think all crash reports use the feature, including the variable containing the user's social security number and mom's maiden name and all their sexting pics, and are furiously mad that such things might be recorded somewhere temporarily to fix some bug. More realistically, the cleanup project I was on had no access to anything interesting like that, didn't log anything additional that I can recall, and most crash dumps were dumb stuff like calculating the statistical average of a table that "can never be empty" but nonetheless is empty, by adding up a subtotal (ok, result zero...) then dividing it by the length of the empty table (that being zero): oh snap, divide by zero. The longer I program the more I expect the "impossible" to be not merely possible but common, so I wouldn't do something that dumb, but things happen, and that's really what Crashlytics is for, not sneakily collecting every Sekret Squirrel's top-secret-nuts.
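
        That "can never be empty" average bug is roughly the following (a sketch reconstructing it from memory; the class and method names are made up):

            import java.util.List;

            class Stats {
                // Average of a table that "can never be empty"... except when it is.
                static int average(List<Integer> rows) {
                    int subtotal = 0;
                    for (int r : rows) subtotal += r;   // empty table: subtotal stays 0
                    return subtotal / rows.size();      // rows.size() == 0 -> ArithmeticException
                }
            }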

    • (Score: 5, Informative) by edIII on Tuesday July 03 2018, @11:17PM (3 children)

      by edIII (791) on Tuesday July 03 2018, @11:17PM (#702225)

      Go fuck yourself with a cactus sideways. If your response to exfiltrating data from a customer without their knowledge, or consent, is "whatevs", then you have no business doing those things.

      You are a shitty fucking sysadmin who does not deserve the trust of the users, especially when you're so fucking flippant about it. Seriously, go fuck yourself.

      --
      Technically, lunchtime is at any moment. It's just a wave function.
      • (Score: 2) by VLM on Tuesday July 03 2018, @11:52PM (2 children)

        by VLM (445) on Tuesday July 03 2018, @11:52PM (#702249)

        exfiltrating data from a customer without their knowledge

        I guess that's the fundamental problem, isn't it?

        I know what that data is in great detail: it's a timestamp rounded to the nearest (or next?) five minutes, the line of code where the app crashed, maybe the one-liner error message (like the famous NPE), and a stack trace. No variable contents, no SQLite DB contents, no UI data, none of that. There is no anonymized identification data for me to even attempt to de-anonymize; maybe in some API I don't understand or use. Someone who's a crook could add that to the source code, but this being a legit company, no one has. Needless to say, I don't work at a shithole like Facebook.

        Despite how little data is contained in a crash report, it's surprisingly effective at finding and fixing bugs. Of course, a onesie-twosie here and there is probably an end-user hardware failure issue, not actual code. If you get 100 crashes on the same line immediately after rollout, that's a software bug.

        The, uh, extremely enthusiastic response I get assumes I'm getting IP addrs, account names, passwords, phone numbers, SS numbers, pics, audio (like the linked Facebook article). Nah. No personal data at all, just a crash report.

        There's just a slight mismatch between reality and today's "two minutes hate".

        I sleep very well accepting a check from a company "exfiltrating data" to the level they exfiltrate. Now, there is a spectrum of acceptability, such that how anyone with any moral or ethical sense works for Facebook is a mystery to me, but whatevs.

        Let me troll you a bit by releasing the total sum of some user's "personal private" (LOL) information in an NPE crash report. Around 1pm yesterday, some anonymous poor bastard got an NPE when source code line 111 tried to set an onClickListener on a null object reference, where that null object reference wanted to be a widget.Button but was a null instead. Based on the class name I know it was a "C" CRUD fragment, so it's virtually certainly a button labeled "save" or whatever in the user's locale (I can't de-anonymize the user, so I don't even know their locale or language; hopefully that doesn't matter). That's all the data I got, officially. Unofficially, I think one of the devs was Fing around with that part of the app and that's probably his personal testing device, but it's impossible to de-anonymize; it could be you, or my next door neighbor, for all I know.

        My guess, based on experience doing Android development, is that my fellow software dev changed the name of the button or otherwise messed up a findViewById, where you make a Button object in your fragment and try to link it to the UI; if you screw that up just right, Android will be chill until you try to set an OnClickListener on the object to respond to the button being clicked, at which point it promptly crashes. A simple typo could've done it. Or maybe some other reason. I didn't do it, but I could probably debug it given the "exfiltrated data", so it's business useful. Possibly I'm completely wrong and the other dev was trying to create a new button labeled something like "cancel" and THAT had failed. Hmm. Not much data, personal or otherwise, in these crash reports. Oh, I feel so exhibitionistic dropping all that private data, LOL. Hope you can handle it, LOL. Oh no, the lost trust, LOL. It's hard to take an accusation like that seriously in a story like this about Facebook, LOL.
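
        For the curious, my guess about that crash looks roughly like this (a fragment-side sketch, not the actual code; R.id.save_button and saveRecord() are made-up names):

            import android.view.View;
            import android.widget.Button;

            // If R.id.save_button is no longer present in this fragment's layout,
            // findViewById() quietly returns null, and Android stays "chill" right
            // up until you try to hang a listener on it.
            void wireUpSaveButton(View root) {
                Button save = (Button) root.findViewById(R.id.save_button);  // returns null here
                save.setOnClickListener(new View.OnClickListener() {         // NPE on this line
                    @Override
                    public void onClick(View v) {
                        saveRecord();   // hypothetical helper
                    }
                });
            }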

        Part of getting stuck on the GDPR project was that the point of the project was all the meaningless permission opt-in BS to "get permission" from losers stuck living in the EU to exfiltrate such valuable and personal data as listed in uncensored totality above, so even if you don't like how it WAS being done, you'd probably be quite happy now. I was kinda air-dropped into the middle of the project and will likely exfiltrate myself from the project shortly, so my impression of past vs future might be fuzzy, plus or minus alcohol consumption on this holiday, of course. I guess, given the timestamp yesterday, that incredibly detailed personal data, the complete sum of which is provided above, was freely and legally given, so that's cool.

        It's all good, though; have a happy 4th.

        • (Score: 3, Insightful) by anubi on Wednesday July 04 2018, @01:13AM (1 child)

          by anubi (2828) on Wednesday July 04 2018, @01:13AM (#702283) Journal

          I believe what's behind all this fear is a complete loss of trust.

          Companies have fostered this lack of trust by making it difficult, if not impossible, to verify just what is being snooped on.

          Most companies can be trusted. A few can't. And few ( if any ) of us know which is which.

          What we do know is that many companies pride themselves on "thinking outside of the box" when it comes to things like acquiring anything they can out of someone's machine if they will let them in. They figure it didn't cost them anything to get the data, and it's a monetizable commodity. Carpe Diem!

          I feel toward many web pages much the way a merchant might feel when somebody enters his business with dozens of kids in tow, each wearing a little "javascript" shirt. The kids are getting into everything: going through his books, counting the cash in his cash register, going into doors marked PRIVATE, everywhere, and he can't lay a hand on 'em... they are kids, protected by law. The most he can do is block them from entering his store in the first place. But that often means turning away the adult that came with them... an adult that might do business with him. He has to consider whether it's worth it to him to have all those kids in his store getting into everything. You want to find someone you can trust, and having some people dress kids up in little javascript shirts and send them rifling through your machine does not do much for trust.

          Little "business phrases" like "we will only share your information as permitted by law" sure send my trust level on a downward spiral... laws can be bought. Nor do we know what information is being shared. Life is too much like a poker game, and if your competitor/opponent/customer/vendor/employee/employer knows certain things, they may seize opportunities when they know they have you over a barrel.

          --
          "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
          • (Score: 3, Insightful) by VLM on Thursday July 05 2018, @03:14PM

            by VLM (445) on Thursday July 05 2018, @03:14PM (#703006)

            I believe what's behind all this fear is a complete loss of trust.

            Agree completely, the app biz right now is a dark alley at 2am. When I'm walking down the alley, well, duh, I know I'm not a threat at all, so WTF, but everyone assumes everyone in that dark alley is a mugger.

            The infrastructure does not help. Thank you, Google, for making us all look like assholes. Bare, unaltered Firebase/Fabric.io Crashlytics is as I describe, with utterly no personal data and nothing any privacy advocate could be offended by, but those Google assholes added "features" such that crooks will add creepy-as-hell personal data to the upload "to help with debugging", which sometimes might be the honest truth but at least sometimes is scammy marketing.

            Likewise the Analytics feature; bare Analytics would make a privacy advocate pretty happy, as I've seen it used from the inside, but asshole Google is like "let me help you out" and next thing you know crooked devs are doing full-on identity theft.