
posted by martyb on Wednesday October 12 2016, @01:42PM
from the just-run-your-OWN-facebook-at-home dept.

The original purpose of the web and internet, if you recall, was to build a common neutral network which everyone can participate in equally for the betterment of humanity. Fortunately, there is an emerging movement to bring the web back to this vision and it even involves some of the key figures from the birth of the web. It's called the Decentralised Web or Web 3.0, and it describes an emerging trend to build services on the internet which do not depend on any single "central" organisation to function.

So what happened to the initial dream of the web? Much of the altruism faded during the first dot-com bubble, as people realised that an easy way to create value on top of this neutral fabric was to build centralised services which gather, trap and monetise information.

[...] There are three fundamental areas that the Decentralised Web necessarily champions: privacy, data portability and security.

Privacy: Decentralisation forces an increased focus on data privacy. Data is distributed across the network and end-to-end encryption technologies are critical for ensuring that only authorized users can read and write. Access to the data itself is entirely controlled algorithmically by the network as opposed to more centralized networks where typically the owner of that network has full access to data, facilitating customer profiling and ad targeting.
Data Portability: In a decentralized environment, users own their data and choose with whom they share this data. Moreover they retain control of it when they leave a given service provider (assuming the service even has the concept of service providers). This is important. If I want to move from General Motors to BMW today, why should I not be able to take my driving records with me? The same applies to chat platform history or health records.
Security: Finally, we live in a world of increased security threats. In a centralized environment, the bigger the silo, the bigger the honeypot it presents to bad actors. Decentralized environments are by nature more resistant to being hacked, infiltrated, acquired, bankrupted or otherwise compromised, as they have been built to exist under public scrutiny from the outset.

In the Web 3.0 I want a markup tag that delivers a nasty shock to cyber-spies...

Original Submission

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2, Interesting) by Anonymous Coward on Wednesday October 12 2016, @05:29PM

    by Anonymous Coward on Wednesday October 12 2016, @05:29PM (#413551)

    What about an open standard for "messages"? People could send and republish others' messages as they see fit, kind of like Twitter, but with a service of their choosing and longer messages. One would still typically "rent" services for such broadcasting, but no single company would own or house the messages. Such a service would support "Standard Message Protocol X" (SMPX), as a working name for now.

    A checksum and originating source (URL) would be included with each message so that re-broadcasters could verify its authenticity* (if desired). The verification service would be a standard feature of SMPX. One is not forced to re-broadcast only verified messages, but you'll have more cred if you do. Checking other sites' cred by back-tracing their re-published content would also be a standard feature of SMPX.

    Also, a re-broadcaster doesn't have to re-publish the entire message: they can optionally republish only the title, and perhaps a synopsis, with a hyperlink to the original or an alternative source.

    Draft message data structure:

    - Title (required)
    - Synopsis
    - Body text
    - Keywords (and/or categories)
    - Image URL (such as for thumbnails)
    - Author actual name
    - Author handle (pseudo-name)
    - Author contact (email, Twitter account, etc.)
    - Date/time created
    - Revision date/time (required, same as created if new)
    - Checksum (about 20 characters)
    - Source URL
    (A max size for some of these fields would be part of the standard.)
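The draft structure above could be sketched as a simple data type. This is one possible illustration, not part of the proposal: the field names, the choice of SHA-256, and truncating to 20 hex characters (the draft only says "about 20 characters") are all assumptions here.

```python
import hashlib
import json
from dataclasses import dataclass, field, asdict

@dataclass
class SmpxMessage:
    # Required fields per the draft structure above
    title: str
    created: str       # ISO 8601 date/time
    revised: str       # required; same as `created` if the message is new
    source_url: str
    # Optional fields
    synopsis: str = ""
    body: str = ""
    keywords: list = field(default_factory=list)
    image_url: str = ""
    author_name: str = ""
    author_handle: str = ""
    author_contact: str = ""

    def checksum(self) -> str:
        """Checksum over a canonical serialization of the fields.
        Computed on demand rather than stored, so the checksum never
        covers itself. Assumes truncated SHA-256 (~20 hex chars)."""
        canonical = json.dumps(asdict(self), sort_keys=True)
        return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:20]
```

A real standard would also pin down maximum field sizes and the exact canonical serialization, since re-broadcasters must byte-for-byte agree on what gets hashed.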

    * Checksums are not perfect for integrity checking, but they can serve as a quick-and-dirty check supplemented with spot-checking. For example, if your SMPX service wanted to quickly verify a given site, it could check most or all checksums and also do a random spot-check of the full text (structure) of selected messages. One could optionally verify the full (public) data of any SMPX source, but that could overwhelm both the sending and receiving SMPX services. The combination described above is a decent compromise versus full scanning. A typical scan result might resemble:

    SMPX site examined: fredsmith-smpx-site.smpx
    Total checksums listed: 1301 (as shared between services)
    Total checksums inspected: 1301
    Total checksums passed: 1301
    Total messages full-data spot-checked: 50
    Total messages full-data spot-check passed: 50
    Total integrity problems found: 0
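The combined checksum-plus-spot-check scan described above might be sketched like this. Everything here is illustrative: messages are plain dicts, `fetch_original` stands in for an HTTP fetch of a message from its Source URL, and the checksum is assumed to be truncated SHA-256 over a canonical JSON serialization.

```python
import hashlib
import json
import random

def checksum(message: dict) -> str:
    """Truncated SHA-256 over a canonical serialization (~20 hex chars)."""
    canonical = json.dumps(message, sort_keys=True)
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()[:20]

def scan_site(messages, claimed_checksums, fetch_original, spot_checks=50):
    """Verify every listed checksum, then spot-check a random sample of
    messages against their originating source (full-data comparison)."""
    passed = sum(1 for m, c in zip(messages, claimed_checksums)
                 if checksum(m) == c)
    sample = random.sample(messages, min(spot_checks, len(messages)))
    spot_passed = sum(1 for m in sample
                      if fetch_original(m["source_url"]) == m)
    return {
        "checksums_inspected": len(messages),
        "checksums_passed": passed,
        "spot_checked": len(sample),
        "spot_check_passed": spot_passed,
        "problems": (len(messages) - passed) + (len(sample) - spot_passed),
    }
```

The point of the random sample is that a site can't predict which messages will be fetched in full, so faking checksums while tampering with bodies carries a real chance of detection without the cost of scanning everything.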

  • (Score: 2) by Phoenix666 on Wednesday October 12 2016, @11:22PM

    by Phoenix666 (552) on Wednesday October 12 2016, @11:22PM (#413691) Journal

    I think that's the right way to go, and in the spirit of what the article is proposing.

    Washington DC delenda est.