posted by LaminatorX on Tuesday January 20 2015, @11:48PM   Printer-friendly
from the cloud-above-the-clouds dept.

Ars Technica reported on Sunday that Elon Musk (of SpaceX and Tesla fame) and Sir Richard Branson (Virgin Galactic, etc.) are each preparing to launch constellations of low Earth orbit (LEO) satellites to provide worldwide internet coverage:

It was an interesting week for ideas about the future of the Internet. On Wednesday, satellite industry notable Greg Wyler announced that his company OneWeb, which wants to build a micro-satellite network to bring Internet to all corners of the globe, secured investments from Richard Branson's Virgin Group and Qualcomm. Then in a separate announcement on Friday, Elon Musk said that he would also be devoting his new Seattle office to creating "advanced micro-satellites" to deliver Internet.

[...] OneWeb, formerly WorldVu Satellites Ltd, aims to target rural markets, emerging markets, and in-flight Internet services on airlines, the Wall Street Journal reported. Both Branson and Qualcomm Executive Chairman Paul Jacobs will sit on the company's board, but Wyler did not say how much Virgin and Qualcomm invested in his company.

Wyler said that his company's goal is to create a network of 648 small satellites that would weigh in at around 285 pounds each. The satellites would be put in orbit 750 miles above the Earth and ideally cost about $350,000 each to build using an assembly line approach. Wyler also said that Virgin, which has its own space segment, would be launching the satellites into orbit. “As an airline and mobile operator, Virgin might also be a candidate to resell OneWeb’s service,” the Journal noted. Wyler has said that he projects it to take $1.5 billion to $2 billion to launch the service, and he plans to launch in 2018.

[...] On the other hand there's Musk, a seasoned space-business launcher who is starting fresh in the world of satellite Internet services. The Tesla and SpaceX founder announced in November his plans to launch 700 satellites weighing less than 250 pounds each.

His satellites would also orbit the Earth at 750 miles above. Musk spoke to Bloomberg on Friday evening explaining that 750 miles above the Earth is much closer than the tens of thousands of miles above the Earth at which traditional telecommunications satellites operate.

Then it got even more interesting.

Ars is now reporting that Google, which really wants satellite internet, might pour money into SpaceX:

The Information reported on Monday that, according to “several people familiar with the talks,” Google is considering investing in SpaceX to support its plan to deliver hundreds or thousands of micro satellites into a low (750 mile) orbit around the globe to serve Internet to rural and developing areas of the world. The Information's sources indicated that Google was in the “final stages” of investing in SpaceX and valued the company at “north of $10 billion.” SpaceX is apparently courting other investors as well.

[...] The Information added another interesting tidbit that was not widely reported in previous discussions of SpaceX's plans for global Internet service: “Mr. Musk appears to be trying to get around his lack of spectrum rights by relying, in part, on optical lasers.”

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 2) by tibman (134) Subscriber Badge on Thursday January 22 2015, @01:00AM (#136850)

    Every time you click a comment header (to read the comment, since you cannot expand it) you discard the entire page and make a request to SN. If you are logged in then you probably won't be getting cached content. SN will build you a page dynamically from templates and data and send it to you; you parse the entire page and finally render it (around 40kB of html, i think?). Pretty wasteful if all you wanted was another 300 bytes of content or so. Or, you could allow javascript to send a request for only the comment you wanted to read. It would return only the data required to display that comment.
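
    A rough sketch of what that comment-only request could look like (written in TypeScript just to illustrate; the /ajax/comment endpoint, the element id, and the response shape are all made up, since SN doesn't actually expose an API like this):

    // Hypothetical sketch: fetch one comment by id instead of reloading the whole page.
    // Endpoint, element id, and response shape are invented for illustration only.
    function loadComment(cid: number): void {
      const xhr = new XMLHttpRequest();
      xhr.open("GET", "/ajax/comment/" + cid);
      xhr.setRequestHeader("Accept", "application/json");
      xhr.onload = () => {
        if (xhr.status !== 200) {
          return; // leave the page untouched if the request fails
        }
        // Only a few hundred bytes of comment data crossed the wire.
        const data = JSON.parse(xhr.responseText) as { subject: string; body: string };
        const target = document.getElementById("comment_body_" + cid);
        if (target) {
          target.textContent = data.body; // textContent avoids injecting markup
        }
      };
      xhr.send();
    }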

    It really all depends on what someone considers junk or wonderful. While i do like a lot of the benefits that javascript brings, i also dislike a lot of the ways it is often twisted. I wouldn't throw out the baby with the bathwater though. There is dynamic stuff happening no matter what. It's either happening on their server back-end or your browser (likely both!). AJAX is a wonderful way to grab the raw data needed to update only the portions of a page that a user wanted to change. In my opinion, obviously : )

    --
    SN won't survive on lurkers alone. Write comments.
  • (Score: 2) by Arik (4543) on Thursday January 22 2015, @08:18AM (#136893) Journal
    Approximately 40 kB including graphics and css. That's cached, however. The actual page comes to 24 kB (testing on this comment). And no updates are needed. You posted the comment, I come along later and load it. I compose a reply and post mine. There are two page loads involved, ~25 kB or so each, i.e. negligible even by the standards of a challenged old network, and so there would appear to be absolutely nothing to gain with ajax here. You turn it into, what, ~400 kB on the initial load? I'm just guessing but it seems likely to be on the right order at least. And the updates are still going to be tens of kB to accomplish the same task without a page reload. What's the point?

    Even if I did see an advantage to using ajax here though (and I'll certainly concede for the sake of argument there must be some applications where it's handy), what you are ignoring is the price. And it does come with a price: chronically insecure computer systems. That is a very high price to pay to avoid a 25 kB reload.

    --
    If laughter is the best medicine, who are the best doctors?
    • (Score: 2) by tibman (134) Subscriber Badge on Thursday January 22 2015, @03:51PM (#136973)

      I think the price is the center of the discussion. Do you trade features for safety? When is that trade worth it and when is it a terrible idea? For you, it is never (or very rarely) acceptable to sacrifice security for responsiveness. Some people, on the other hand, run their browsers in self-destructing containers and care very little about what active content runs inside the browser. To each their own : )

      --
      SN won't survive on lurkers alone. Write comments.
      • (Score: 2) by Arik (4543) on Thursday January 22 2015, @04:53PM (#136988) Journal
        "acceptable to sacrifice security for responsiveness."

        I don't think that's an accurate formulation. The handful of ajax monstrosities that I am forced to work with certainly are not 'responsive.' The one that comes to mind immediately just got an ajax makeover and as a result it is FAR from responsive - it has a maddeningly awful UI and even if it were "responsive" it would still suck. The new version costs me at least 5 minutes every day versus the old page (which was pretty awful to begin with, actually.)
        --
        If laughter is the best medicine, who are the best doctors?
        • (Score: 2) by tibman (134) Subscriber Badge on Thursday January 22 2015, @07:05PM (#137023)

          You can make terrible implementations of any technology. Including straight html pages. I've used plenty of straight html pages that had overlapping divs that hid scrollbars or buttons. UI has nothing to do with AJAX. AJAX is just a method to fetch data (usually raw JSON). UI is html and the javascript that manipulates it. Responsiveness could be something like form validation before a user submits. Say you had a registration form, you can give feedback on password strength, that all required fields are completed, and so on. Without that responsiveness you'll have to post to the server and regenerate the entire page (hopefully with the user entered data dynamically inserted into the fields) with a server-side validation message. That's costly. All of that responsiveness has zero to do with AJAX and everything to do with javascript. There were terrible and slow websites long before AJAX existed (and even before javascript). Bad UI and poor implementations cannot be pinned on any technology. That blame goes to developers/designers.
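
          To make that concrete, a small hypothetical sketch (TypeScript; the element ids and the strength heuristic are invented for illustration) of the kind of pre-submit feedback described above:

          // Hypothetical registration-form feedback, no server round trip required.
          // Element ids (#register, #password, #password-hint) are made up for the example.
          const form = document.querySelector<HTMLFormElement>("#register");
          const password = document.querySelector<HTMLInputElement>("#password");
          const hint = document.querySelector<HTMLElement>("#password-hint");

          function passwordStrength(pw: string): string {
            // Crude strength heuristic, purely for illustration.
            if (pw.length < 8) return "too short";
            const classes =
              Number(/[a-z]/.test(pw)) + Number(/[A-Z]/.test(pw)) +
              Number(/[0-9]/.test(pw)) + Number(/[^A-Za-z0-9]/.test(pw));
            return classes >= 3 ? "strong" : "weak";
          }

          if (password && hint) {
            // Feedback on every keystroke, without posting anything to the server.
            password.addEventListener("input", () => {
              hint.textContent = passwordStrength(password.value);
            });
          }

          if (form) {
            form.addEventListener("submit", (event) => {
              // Block the submit locally if a required field is empty. The server
              // still has to validate again, since client-side checks can be bypassed.
              const required = form.querySelectorAll<HTMLInputElement>("[required]");
              const missing = Array.from(required).filter((f) => f.value.trim() === "");
              if (missing.length > 0) {
                event.preventDefault();
                missing.forEach((f) => f.classList.add("invalid"));
              }
            });
          }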

          --
          SN won't survive on lurkers alone. Write comments.
          • (Score: 2) by Arik (4543) on Thursday January 22 2015, @10:15PM (#137067) Journal
            "You can make terrible implementations of any technology. Including straight html pages. I've used plenty of straight html pages that had overlapping divs that hid scrollbars or buttons."

            That's not really proper html either.

            "Say you had a registration form, you can give feedback on password strength, that all required fields are completed, and so on. Without that responsiveness you'll have to post to the server and regenerate the entire page (hopefully with the user entered data dynamically inserted into the fields) with a server-side validation message. That's costly."

            Again, not nearly as costly as loading your 'webapp'.

            "Bad UI and poor implementations cannot be pinned on any technology. That blame goes to developers/designers."

            To a degree that's true, but AJAX seems to be a particularly effective way to get a bad UI, without a single example to show that it's capable of being used to produce a good one.
            --
            If laughter is the best medicine, who are the best doctors?
            • (Score: 2) by tibman (134) Subscriber Badge on Thursday January 22 2015, @11:05PM (#137074)

              AJAX is not UI related at all. It is as related to UI as SQL is related to UI. It is just a way of making a request for data, that is all. I also don't think form validation qualifies as a webapp. My test for webapp qualification is: "does the page still function if javascript is disabled." In the case of form validation, the page still functions, it is just far less responsive because it requires round trips to the server.
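
              As a hypothetical sketch of passing that test (TypeScript, with a made-up #reply-form id): the form posts normally when javascript is off, and the script only intercepts the submit when it happens to be available:

              // Enhancement that passes the "still works without javascript" test.
              // With scripts off, the form posts normally to its action URL; with
              // scripts on, the same data is sent asynchronously instead.
              // The #reply-form id is hypothetical.
              const replyForm = document.querySelector<HTMLFormElement>("#reply-form");

              if (replyForm) {
                replyForm.addEventListener("submit", (event) => {
                  event.preventDefault(); // only reached when scripts are enabled
                  const xhr = new XMLHttpRequest();
                  xhr.open("POST", replyForm.action);
                  xhr.onload = () => {
                    if (xhr.status < 200 || xhr.status >= 300) {
                      replyForm.submit(); // fall back to the ordinary full-page submit
                    }
                  };
                  xhr.send(new FormData(replyForm));
                });
              }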

              I'm not trying to convince you to use javascript. Maybe trying to convince you that javascript isn't evil though : ) It sounds like you really hate it. I think it's only going to get worse. More and more sites are completely unusable with javascript disabled. I browse with noscript on in FF and often have to switch to Chrome to use some sites. Especially if trying to buy something. I've had too many credit card transactions freeze because some javascript was required and whitelisting the payment provider caused a page refresh, which then started a chain of uh-ohs.

              --
              SN won't survive on lurkers alone. Write comments.
              • (Score: 1) by Arik (4543) on Friday January 23 2015, @12:22AM (#137083) Journal
                You're trying to shift the definition of AJAX to not include ecmascript but I believe that is what the 'J' stands for, no?

                "I also don't think form validation qualifies as a webapp."

                I used to agree with you, a few years back. Ecmascript is harmless in and of itself; a little extra flash here, save a pageload there, it can be used right!

                Over time I have seen I was wrong, wrong, wrong.

                OK, it *can* be used right, but it almost never has been historically, and as time goes on that just gets worse. And even if it was always used right (instead of almost never used right) it *still* wouldn't be worth the security nightmare it unleashes.

                It is *in principle* impossible to secure any system that executes 'webapps' handed out by random web pages. And in practice that is the single security hole through which virtually every mass exploit of the past decade and more has been delivered.

                "I browse with noscript on in FF and often have to switch to Chrome to use some sites. Especially if trying to buy something."

                I switch to a different browser for those websites in certain cases, but NEVER to buy something. I only go to those websites if it is officially required for me to get paid (intranet, bleh.) I've switched suppliers more than once rather than give them control of my computer in order for ME to give THEM money. That crap will not fly.
                --
                If laughter is the best medicine, who are the best doctors?