
SoylentNews is people

posted by martyb on Wednesday October 12 2016, @01:42PM   Printer-friendly
from the just-run-your-OWN-facebook-at-home dept.

The original purpose of the web and internet, if you recall, was to build a common neutral network which everyone can participate in equally for the betterment of humanity. Fortunately, there is an emerging movement to bring the web back to this vision and it even involves some of the key figures from the birth of the web. It's called the Decentralised Web or Web 3.0, and it describes an emerging trend to build services on the internet which do not depend on any single "central" organisation to function.

So what happened to the initial dream of the web? Much of the altruism faded during the first dot-com bubble, as people realised that an easy way to create value on top of this neutral fabric was to build centralised services which gather, trap and monetise information.

[...] There are three fundamental areas that the Decentralised Web necessarily champions: privacy, data portability and security.

Privacy: Decentralisation forces an increased focus on data privacy. Data is distributed across the network and end-to-end encryption technologies are critical for ensuring that only authorized users can read and write. Access to the data itself is entirely controlled algorithmically by the network as opposed to more centralized networks where typically the owner of that network has full access to data, facilitating customer profiling and ad targeting.
Data Portability: In a decentralized environment, users own their data and choose with whom they share this data. Moreover they retain control of it when they leave a given service provider (assuming the service even has the concept of service providers). This is important. If I want to move from General Motors to BMW today, why should I not be able to take my driving records with me? The same applies to chat platform history or health records.
Security: Finally, we live in a world of increased security threats. In a centralized environment, the bigger the silo, the bigger the honeypot is to attract bad actors. Decentralized environments are safer by their general nature against being hacked, infiltrated, acquired, bankrupted or otherwise compromised as they have been built to exist under public scrutiny from the outset.

In the Web 3.0 I want a markup tag that delivers a nasty shock to cyber-spies...


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2, Insightful) by Francis on Wednesday October 12 2016, @02:46PM

    by Francis (5544) on Wednesday October 12 2016, @02:46PM (#413458)

    Depends on whom you ask. It was either about social networking or about massive amounts of ads. And because the social networking sites were mostly all free, I'm not sure the distinction is particularly important. Either way, Web 2.0 was about handing your personal information to all sorts of strangers.

  • (Score: 0) by Anonymous Coward on Wednesday October 12 2016, @02:52PM

    by Anonymous Coward on Wednesday October 12 2016, @02:52PM (#413461)

    I thought Web 2.0 was about sites built with tons of JavaScript, XMLHttpRequest (the one thing that survived from ActiveX) and Ajax.

    • (Score: 4, Insightful) by lizardloop on Wednesday October 12 2016, @03:14PM

      by lizardloop (4716) on Wednesday October 12 2016, @03:14PM (#413479) Journal

      That was my understanding of it. Web 1.0 implied that pages were static and could not accept new content without the involvement of the original author.
      Web 2.0 seemed to imply the idea of "anyone" being able to upload content to websites.
      I guess Web 3.0 implies that the hosting is then distributed.

      I've theorised for a while now that our relatively open web will be rapidly narrowed until it resembles cable TV: ISPs will offer the internet as a series of whitelisted URLs, and anything outside that list will be prohibitively expensive. Facebook have already attempted this with their "Free Basics" service.

      This narrowing of content to pre-approved URLs would seem to suit the powers that be. I'm just amazed it hasn't happened yet; I'm guessing that's only because of their sheer incompetence. But they'll get around to it eventually.

      If you want a truly decentralised web you also need to get rid of ISPs. They are choke points that are easily targeted by corporate and political interests. But then you're looking at meshnets and there are a whole bunch of hurdles to getting everyone using those.

      • (Score: 3, Insightful) by bob_super on Wednesday October 12 2016, @05:26PM

        by bob_super (1357) on Wednesday October 12 2016, @05:26PM (#413548)

        Web 1.0 : read this page.
        Web 2.0 : interact with this page.
        Web 3.0 : surrender yourself to this page.

      • (Score: 3, Interesting) by Phoenix666 on Wednesday October 12 2016, @05:38PM

        by Phoenix666 (552) on Wednesday October 12 2016, @05:38PM (#413557) Journal

        But then you're looking at meshnets and there are a whole bunch of hurdles to getting everyone using those.

        Maybe, maybe not. Build it into an OS by default and that takes care of both the number of nodes in the mesh and getting people to use them. That is, for the people using them the mesh will be invisible: all they'll perceive is that they can now surf anytime, anywhere without having to pay for a data plan, as long as point A can connect to point B somehow.

        As for the other hurdles with mesh networks, perhaps we can fashion something akin to Folding@home or SETI@home in router firmware to mitigate throughput and latency issues.

        --
        Washington DC delenda est.
        • (Score: 2) by julian on Thursday October 13 2016, @05:14AM

          by julian (6003) Subscriber Badge on Thursday October 13 2016, @05:14AM (#413773)

          The things people are used to, and by people I mean the masses of normies, are media-rich, scripting-heavy pages that look engaging (the way a casino slot machine is engaging to some people) and provoke a positive emotional response. Web pages like Facebook, Twitter, and YouTube require huge amounts of reliable, fast bandwidth. Mesh networks can't provide that without a huge number of users, and huge numbers of users won't leave the traditional ISPs unless greener pastures already exist.

          For people who appreciate simpler, information-heavy pages, as opposed to media-heavy or merely "content"-heavy ones, a mesh network could work at a lower critical mass, but there would be many isolated clusters of nodes. You might have every large metro area as an island meshnet, with neighborhoods going dark and coming back as people move around. This gets better if the mesh network is an adjunct to the internet that bridges the gaps, but then someone is still paying for (or misappropriating) a traditional ISP connection.

          A meshnet would be the way to go if you were redesigning the Internet today as a "drop-in" replacement. It's out of reach if that goal has to be reached through contiguous, functional evolutionary steps.

          • (Score: 0) by Anonymous Coward on Thursday October 13 2016, @04:21PM

            by Anonymous Coward on Thursday October 13 2016, @04:21PM (#413949)

            I like the way you guys think regarding the issues with mesh networks. Part of the problem is also the complexity of the tools available to start offering mesh networks, and bridges from which a mesh network can grow in a neighborhood; it's complicated, even for somebody not completely incompetent. For instance, I have a Buffalo router running DD-WRT with a good cable connection, and I've always thought that I would agree to provide a public access point and/or bridge if:

            a) I can use a separate unencrypted SSID for users or other nodes to connect to my router, distinct from my own home SSID;
            b) I can limit the speed and monthly bandwidth of that SSID, to avoid retaliation from my ISP if it uses too much bandwidth;
            c) I can use my proxy service to tunnel everything somewhere else, again to avoid retaliation from my ISP (or other harassing bureaucrat types) if somebody uses my public access point's IP address in ways those who wield power frown upon.

            I really would love to see a tutorial along these lines; I'm moderately competent with my equipment, but not enough to figure all of that out in any reasonable time. I'm sure under these conditions, many like me could agree to start offering public access, which could be extended with other nodes more easily.

            Is there anybody here who can confirm whether this is technically feasible under DD-WRT? Any hints?
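
            A rough sketch of what points a) through c) would involve at the Linux level, assuming a guest virtual AP (here called wl0.1) has already been created through the router's web UI. The interface name, the proxy port, and the LAN subnet are all assumptions, and DD-WRT builds vary in which of these tools they ship:

```shell
# Sketch only: intended as a DD-WRT-style startup/firewall script.
# Assumes guest virtual AP wl0.1 and a 192.168.1.0/24 home LAN.
GUEST_IF=wl0.1
HOME_LAN=192.168.1.0/24

# b) Cap guest throughput at 1 Mbit/s with a token-bucket filter.
#    (A monthly byte quota would need separate accounting, e.g. an
#    iptables byte counter checked periodically from a cron job.)
tc qdisc add dev $GUEST_IF root tbf rate 1mbit burst 32kbit latency 400ms

# a) Isolate guests from the home LAN: they get the internet, nothing else.
iptables -I FORWARD -i $GUEST_IF -d $HOME_LAN -j DROP

# c) Transparently push guest TCP traffic into a local tunnel/proxy
#    (assumed to be listening on port 3128), so the IP address seen
#    upstream is the tunnel endpoint's rather than yours.
iptables -t nat -A PREROUTING -i $GUEST_IF -p tcp -j REDIRECT --to-ports 3128
```

            Nothing here is DD-WRT-specific beyond where the script gets pasted; any Linux-based router firmware that ships tc and iptables should accept the same commands.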

            -Very tin foil hat

  • (Score: 2) by julian on Wednesday October 12 2016, @10:01PM

    by julian (6003) Subscriber Badge on Wednesday October 12 2016, @10:01PM (#413665)

    It was either about social networking or about massive amounts of ads. And because the social networking sites were mostly all free, I'm not sure that the distinction is particularly important.

    This is why I believe blocking ads isn't just a good idea for sanity and security; it's a moral obligation to starve the beast.

    This business model, ad-supported content, is simply too perverse to be tolerated. It creates an adversarial relationship between website operators and website visitors, and it incentivizes owners to actively attack their users' privacy and security. It fundamentally cannot be done ethically, even in principle.

    • (Score: 1) by Francis on Thursday October 13 2016, @02:04PM

      by Francis (5544) on Thursday October 13 2016, @02:04PM (#413899)

      I generally block everything I can find, but I've found that the few sites I value enough to support by disabling ads tend to run like molasses because of the myriad poorly designed scripts that take ridiculous amounts of resources to run.

      A simple text or image ad is enough; the only JavaScript involved should be the bit that writes the link or the ad itself. That IntelliTXT thing that runs over the whole page and highlights words with ad links is a particularly egregious example: it doesn't just slow the browser to a crawl, it also interferes with the use of the page.