
posted by LaminatorX on Friday August 08 2014, @07:41PM   Printer-friendly
from the How-To-Take-People's-Secrets dept.

Google has announced that HTTPS will be used as a lightweight ranking signal for search results; sites that use it will appear slightly higher (although Google makes clear that high-quality content remains a much stronger signal).

Related Stories

Less than Half of Google Searches Now Result in a Click 39 comments

A few months ago, back in August, the Web passed a milestone: less than half of Google searches now result in even a single click onwards. In other words, the majority of searchers never leave Google after seeing the results. That could be a warning that Google is transitioning from a search engine into more of a walled garden. Or it could mean that the results aren't good any more and people move on to other engines after only a quick glance. If it's the former, where searches no longer result in click-throughs, then what should be the proper response from the Web at large?

From: Less than Half of Google Searches Now Result in a Click:

On desktop, things haven’t changed all that much in the last three years. Organic is down a few percent, paid and zero-click are up a bit, but June of 2019 isn’t far off January of 2016.

On mobile, where more than half of all searches take place, it’s a different story. Organic has fallen by almost 20%, while paid has nearly tripled and zero-click searches are up significantly. Even way back in January 2016, more than half of mobile searches ended without a click. Today, it’s almost two-thirds.

Three trends are made clear by these numbers:

  1. The percent of searches available as organic traffic from Google is steadily declining, especially on mobile.
  2. Paid clicks tend to increase whenever Google makes changes to how those results are displayed, then slowly decline as searchers get more familiar with spotting and avoiding them.
  3. Google’s ongoing attempts to answer more searches without a click to any results OR a click to Google’s own properties are both proving successful. As a result, zero-click searches, and clicks that bring searchers to a Google-owned site keep rising.
  • (Score: 5, Interesting) by Chillgamesh on Friday August 08 2014, @07:53PM

    by Chillgamesh (4619) on Friday August 08 2014, @07:53PM (#79072)

    This is a genius move by Google.

    Anyone with an important website will have a strong incentive to use https to eke out slightly better placement from Google.

    The real winners are the end users, who gain increased protection from snooping by any and all of the powers that be.

    Google and the NSA have proven that even traditionally "non-sensitive" data can have huge value when enough of it is compiled about a single person or group of people.

    It looks like Google is really leading a push to limit surveillance for its own sake. Sure, the NSA can probably still make raw queries to big corporations' databases. But the amount of free-flowing information that they can trivially collect will definitely be decreased by this move.

    • (Score: 2) by strattitarius on Friday August 08 2014, @09:05PM

      by strattitarius (3191) on Friday August 08 2014, @09:05PM (#79096) Journal
      Interesting take on it. My first thought was "great, now small guys that don't put SSL on their site will be harder to find and things like bigresource will move up".

      But I see your point and think it outweighs mine. Besides, it's not that hard to implement.
      --
      Slashdot Beta Sucks. Soylent Alpha Rules. News at 11.
      • (Score: 2) by elf on Saturday August 09 2014, @08:01AM

        by elf (64) on Saturday August 09 2014, @08:01AM (#79256)

        As a small guy what is the best / cheapest way to do this?

        • (Score: 2) by stderr on Saturday August 09 2014, @05:50PM

          by stderr (11) on Saturday August 09 2014, @05:50PM (#79389) Journal

          As a small guy what is the best / cheapest way to do this?

          StartSSL [startssl.com] and CACert [cacert.org] are two popular options.

          --
          alias sudo="echo make it yourself #" # ... and get off my lawn!
    • (Score: 3, Insightful) by kaszz on Friday August 08 2014, @09:26PM

      by kaszz (4211) on Friday August 08 2014, @09:26PM (#79107) Journal

      And the search optimization slime will fall over themselves to implement https ;)

    • (Score: 3, Insightful) by Bot on Saturday August 09 2014, @09:44AM

      by Bot (3902) on Saturday August 09 2014, @09:44AM (#79267) Journal
      Excuse me, but does it matter to Google that the site is served over https when it features some Google Analytics javascript? Or some Google-hosted fonts and JS libraries? No, I think it doesn't. OTOH, https connections make it quite hard for the ISP to collect data. So Google makes its own data collection more valuable by interfering with third-party collection. The fact that users may be advantaged, at the moment, might be entirely coincidental. In the long term, all-https traffic may mean that certificate authorities become the gatekeepers of the web. So it's not all rosy, even if it's not bad news.
      --
      Account abandoned.
  • (Score: 3, Interesting) by cafebabe on Friday August 08 2014, @08:11PM

    by cafebabe (894) on Friday August 08 2014, @08:11PM (#79080) Journal

    Does this move create extra pressure on IPv4 space?

    --
    1702845791×2
    • (Score: 0) by Anonymous Coward on Friday August 08 2014, @08:43PM

      by Anonymous Coward on Friday August 08 2014, @08:43PM (#79086)

      No

      But it does create pressure on the squid cache that I use for lowering my bandwidth to the outside world, given the upcoming caps my ISP wants to put in place.

      • (Score: 1) by datapharmer on Friday August 08 2014, @09:13PM

        by datapharmer (2702) on Friday August 08 2014, @09:13PM (#79101)

        That's what DPI is for. Self-sign a certificate and exclude anything you don't want inspected (like your bank). It cuts down on the viruses trying to bypass proxy a/v checks too!

        • (Score: 0) by Anonymous Coward on Saturday August 09 2014, @01:15AM

          by Anonymous Coward on Saturday August 09 2014, @01:15AM (#79190)

          Didn't really want to MITM attack myself. I've been meaning to do exactly what you said for a while, though. Just a PITA.

    • (Score: 2) by cykros on Friday August 08 2014, @11:56PM

      by cykros (989) on Friday August 08 2014, @11:56PM (#79165)

      Not at all. The same machine can be running an http server on port 80 as is running an https server on port 443.

      If load's what you're worried about getting to be too much (https does put a bit more load on a server, though not as much as some have made it out to be), it's worth remembering that you can have 10 servers all sitting behind the same public IP if you're so inclined, with methods in place to handle load balancing.

      • (Score: 1) by Mr. Slippery on Saturday August 09 2014, @05:29AM

        by Mr. Slippery (2812) on Saturday August 09 2014, @05:29AM (#79236) Homepage

        The same machine can be running an http server on port 80 as is running an https server on port 443.

        I don't think that's what the poster was referring to. Before the SNI extension to SSL was widespread, you had to have a separate IP address for each HTTPS site.
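        The reason SNI lifts that one-IP-per-site restriction is that the client sends the desired hostname in the clear during the TLS handshake, so a single listener can pick the matching certificate before encryption starts. A rough sketch of the server side in Python's stdlib `ssl` module (the hostnames and file paths below are made-up placeholders, not real sites):

```python
import ssl

# Hypothetical vhost table: SNI server name -> (certfile, keyfile).
# These names and paths are illustrative only.
VHOSTS = {
    "a.example.org": ("/etc/ssl/a.pem", "/etc/ssl/a.key"),
    "b.example.org": ("/etc/ssl/b.pem", "/etc/ssl/b.key"),
}

def route(server_name):
    """Pick the cert/key pair for the hostname the client sent via SNI."""
    return VHOSTS.get(server_name)

def sni_callback(ssl_obj, server_name, initial_context):
    """Swap in a per-site context; returning a TLS alert code rejects."""
    pair = route(server_name)
    if pair is None:
        return ssl.ALERT_DESCRIPTION_UNRECOGNIZED_NAME
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(*pair)   # would raise here if the files don't exist
    ssl_obj.context = ctx
    return None

# A real server would then do:
#   server_ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
#   server_ctx.sni_callback = sni_callback
```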

  • (Score: 3, Interesting) by kaszz on Friday August 08 2014, @09:44PM

    by kaszz (4211) on Friday August 08 2014, @09:44PM (#79120) Journal

    The current implementation depends on a list of trusted Certificate Authorities (CAs) that the developers of the web browser decide upon. The catch is that ANY compromised CA can make the browser indicate a secure connection. And are you sure the NSA hasn't compromised any of these 50 or so CAs, the majority of which are within US jurisdiction? (Or anyone else...)

    The current system has two major weak points. The browser team solely decides which CAs to trust. And the CA decides who you should trust. A system with a cross linked network of trust and revocation updates is less likely to have these single points of failure and may reduce the barrier to entry.

    This is just the mathematical dependencies. Secure implementation and patch updates also matters.
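    The size of that trust surface is easy to inspect: Python's stdlib can load the same platform trust store that browsers roughly approximate. A quick sketch (the exact counts vary by OS and distribution):

```python
import ssl

# Load the platform's default trust store: the pool of root CAs,
# any one of which can vouch for any HTTPS site.
ctx = ssl.create_default_context()
ctx.load_default_certs(ssl.Purpose.SERVER_AUTH)

# e.g. {'x509': N, 'crl': 0, 'x509_ca': M} -- M is the number of roots
stats = ctx.cert_store_stats()
print("trusted root CAs:", stats["x509_ca"])

# Each entry is a full parsed certificate; the TLS stack treats every
# one of these issuers as equally authoritative.
for ca in ctx.get_ca_certs()[:3]:
    print(ca.get("issuer"))
```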

    • (Score: 3, Interesting) by Ethanol-fueled on Friday August 08 2014, @10:33PM

      by Ethanol-fueled (2792) on Friday August 08 2014, @10:33PM (#79137) Homepage

      Yeah, what you said. Google and Yahoo and others had their nice logos in the PRISM presentation. And now they magically pretend to care about security in some idiotically superficial manner instead of using those ga-jillions of assets of leverage to sue the fucking NSA and government on behalf of the common American. But of course they're not doing that, because they themselves have mutated into arms of the fucking intelligence services.

      Yawn. No thanks, assholes. Ask the CIA to front you more of that heroin money they're making, or go to China and convince its 12 year olds that you're still gonna be relevant in 5 years. And ask Marissa Mayer to show her armpits [vogue.com] again, without the extensive photoshopping this time around. They probably look like brown strips of dirt that don't even match her hair.

      • (Score: 3, Interesting) by SlimmPickens on Friday August 08 2014, @11:01PM

        by SlimmPickens (1056) on Friday August 08 2014, @11:01PM (#79153)

        I shouldn't really respond since you've invoked armpits, but whatever.

        Google and Yahoo and others had their nice logos in the PRISM presentation. And now they magically pretend to care about security

        You can weigh this up however you like, but I won't be blaming them for complying with a court order. I don't believe that "sue the fucking NSA and government" is realistic. In fact, those very NSA slides show that Yahoo alone made that futile fight. That right there is eye-opening.

        The thing is, Google is fucking paranoid about losing their users. They don't think they've got everyone by the balls. This is far more than something PR dreamed up.

        • (Score: 3, Interesting) by cykros on Saturday August 09 2014, @12:10AM

          by cykros (989) on Saturday August 09 2014, @12:10AM (#79170)

          That's because Google has by now been around long enough to see what happens to web companies that sit back, content that they have their users by the balls.

          Anyone remember Lycos? Altavista? Excite? Hell, Digg?

          While Google might not be the paragon of privacy defenders, it's in their and everyone's best interest (other than government espionage machines) to keep the data they mine to themselves. I don't have any delusions that they rushed to volunteer their services to the NSA, and don't think that most people take that stance either. Doesn't stop their users from still being right to worry about how secure their data will be if they continue to use Google, however, and so Google is absolutely right to do everything they can to bolster up their credibility when it comes to privacy and security.

          And while I have issues with the core of their business model to begin with (a data-mining advertising firm), I think it's worth giving credit where credit is due. Time and again, they've been an industry leader in implementing security methods such as universal HTTPS across their services and two-factor authentication. While what they can't change is definitely still worth worrying about, I'd say that the security and privacy decisions within their dominion are usually handled more responsibly than I've come to expect from big web companies.

          All that said, my own reasons for keeping Google at at least arm's length haven't changed, and unless something drastic occurs with their business model, I doubt it will. It is nice to at least see them holding the rest of the web industry to a higher standard than it seems it would otherwise settle out to.

        • (Score: 0) by Anonymous Coward on Saturday August 09 2014, @02:45AM

          by Anonymous Coward on Saturday August 09 2014, @02:45AM (#79211)

          Also, if you search using Google or Yahoo, they have information on you (what you searched, what your IP/tracking signature/canvas/etc is), which gives them something to sell. Anyone who sniffs the connection gets that information too, meaning Google/Yahoo's data decreases in value. SSL protects the value of search engines' data collected on users, because no one else can easily watch that data.

    • (Score: 4, Insightful) by cykros on Saturday August 09 2014, @12:26AM

      by cykros (989) on Saturday August 09 2014, @12:26AM (#79172)

      Relying on HTTPS (especially with auth handled by CAs) to stop the NSA is like relying on a slingshot to stop a horde of mounted Mongolian archers. Perhaps with carefully handled keys and MANUAL authentication you'd get somewhere, but even then, it's hard to say how long your SSL encryption would keep your data safe. I don't think this is what they're going after (but they probably don't mind it being a common enough misconception).

      Relying on HTTPS to stop casual sniffing attacks, however, will get you somewhere. Making the red team have to give up on using ettercap in favor of SSLSniff (or Dsniff) greatly decreases the amount of damage ultimately done (especially as, with either of those SSL MITM tools, there are some pretty obvious tells that your connection is compromised, for those who bother looking).

      As usual, the perfect is the enemy of the good. If it's the NSA you're trying to avoid, I think outside of stepping away from the computer, there's still no silver bullet in terms of technological fixes. Encrypting data offline using sufficiently random one time pads would do it, but besides being cumbersome...well, watch out for that $5 wrench. And anyone thinking their "NSA-Proof" vpn is going to keep their location hidden from the NSA is woefully misinformed. At best, being PAINFULLY paranoid about how you use tor or i2p with end to end manually authenticated encryption (if your browser supports javascript, it's not paranoid enough) might be an acceptable gamble. Cook up some plausible deniability software a la Truecrypt Hidden Containers that actually passes muster and you're cooking with fire...until their computing power becomes sufficient to brute force it (which naturally, you won't know until it's too late). With all those considerations, how again is Google supposed to NSA-proof the average user's Internet experience?
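      The one-time pad mentioned above is simple enough to sketch. Its security rests entirely on the pad being truly random, at least as long as the message, and never reused, which is exactly what makes it so cumbersome in practice:

```python
import secrets

def otp_xor(data: bytes, pad: bytes) -> bytes:
    """XOR data against a pad of at least equal length.
    Applying the same pad twice returns the original bytes."""
    if len(pad) < len(data):
        raise ValueError("pad must be at least as long as the data")
    return bytes(d ^ p for d, p in zip(data, pad))

message = b"meet at dawn"
pad = secrets.token_bytes(len(message))   # must never be reused

ciphertext = otp_xor(message, pad)
assert otp_xor(ciphertext, pad) == message   # decryption is the same XOR
```

Getting the pad to the other party securely (and in sufficient quantity) is the part that doesn't scale, which is the point being made above.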

      • (Score: 3, Insightful) by kaszz on Saturday August 09 2014, @12:53AM

        by kaszz (4211) on Saturday August 09 2014, @12:53AM (#79182) Journal

        HTTPS is more like better-than-nothing, authentication-wise. The encryption math is likely to hold up. It's more of a system that will throw sand in the listening cogs. In the end, the secrets worth knowing perhaps ain't where they look.

    • (Score: 3, Informative) by No.Limit on Saturday August 09 2014, @09:10AM

      by No.Limit (1965) on Saturday August 09 2014, @09:10AM (#79261)

      There are revocation lists [wikipedia.org] and also certificate status protocols [wikipedia.org].

      However, they rely on the browser using them and being able to connect to them. (The current default setting in Firefox is to check certificates with the certificate status protocol, but if the connection fails it's treated as a success; you can see this under edit -> preferences -> advanced -> certificates -> validation.)

      You also don't rely solely on the browser developers' CA list; in Firefox you can add/remove CAs with edit -> preferences -> advanced -> certificates -> view certificates.

      But of course, unless you're an expert you'll probably never touch any of these settings. The current https implementation (it's a public key infrastructure, or PKI for short) suffers from many issues, and both the revocation lists and the online certificate status protocol are really just patchwork.

      And we've seen horrible CA compromises [wikipedia.org] (there are many more) in the past.

      PKI is part of active research and there are several projects trying to overcome some of the issues with the current implementation:
      - Google's Certificate Transparency [google.com]
      - Accountable Key Infrastructure (AKI) [cmu.edu]
      and others

    • (Score: 2) by maxwell demon on Saturday August 09 2014, @01:35PM

      by maxwell demon (1608) on Saturday August 09 2014, @01:35PM (#79308) Journal

      There's also a third weak point: say your bank uses CA X for its certificate, and assume a malicious attacker manages to get a certificate from CA Y, which is also in your browser's trusted list. Then the browser will happily accept the certificate from CA Y without even warning about that suspicious change of CA. And that even in cases where any human would immediately get suspicious if he saw it, e.g. if the site of an American bank suddenly presented a certificate from a Russian CA.

      There should be a way to declare that for a certain web site, only certificates from a certain CA are to be accepted, with the default being the CA from first visit.
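      That "default from first visit" idea is essentially certificate pinning, or trust-on-first-use. A minimal sketch of the client side, comparing a SHA-256 fingerprint of the certificate's DER encoding against a remembered value (the hostname below is hypothetical):

```python
import hashlib
import ssl

def fingerprint(der_cert: bytes) -> str:
    """SHA-256 fingerprint of a certificate's DER encoding."""
    return hashlib.sha256(der_cert).hexdigest()

# Pins recorded on first visit: hostname -> fingerprint.
PINS = {}

def check_pin(host: str, der_cert: bytes) -> bool:
    """Trust-on-first-use: remember the cert on first sight,
    and reject any later mismatch, whichever CA signed it."""
    fp = fingerprint(der_cert)
    if host not in PINS:
        PINS[host] = fp          # first visit: pin it
        return True
    return PINS[host] == fp      # later visits: must match

# In real use the DER bytes would come from the live connection, e.g.:
#   pem = ssl.get_server_certificate(("bank.example", 443))
#   der = ssl.PEM_cert_to_DER_cert(pem)
```

Note that pinning trades one problem for another: a legitimate certificate rotation looks exactly like an attack, which is why browsers never shipped trust-on-first-use as a default.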

      --
      The Tao of math: The numbers you can count are not the real numbers.
      • (Score: 2) by kaszz on Saturday August 09 2014, @05:07PM

        by kaszz (4211) on Saturday August 09 2014, @05:07PM (#79373) Journal

        I include that in my second point. The compromise of any CA makes most https-security flawed. Whether it's the CA you use or any other that can produce the same authentication doesn't matter in practice. Although a Russian CA for an American bank seems kind of fishy. ;)

        • (Score: 2) by maxwell demon on Saturday August 09 2014, @06:05PM

          by maxwell demon (1608) on Saturday August 09 2014, @06:05PM (#79394) Journal

          Well, the point is that naively one might assume that to impersonate a site, you'd have to get a fake certificate from the very CA the bank uses. Which, if one trusts the bank to choose its CA well (as some people probably would), might be considered a minor issue. But the actual fact is that you can use a certificate from any of the CAs your browser trusts to impersonate the bank's site. Which is a vastly bigger attack surface.

          --
          The Tao of math: The numbers you can count are not the real numbers.
  • (Score: 0) by Anonymous Coward on Sunday August 10 2014, @11:47PM

    by Anonymous Coward on Sunday August 10 2014, @11:47PM (#79834)

    Encryption takes longer to make readable/processable and increases the heat output of the PC chips, causing more wear and tear in the long run.

    But it is a good move for privacy.... :)