
SoylentNews is people

posted by azrael on Monday October 27 2014, @04:20PM
from the fibre-good-for-movements dept.

Researchers at Eindhoven University of Technology (TU/e) in the Netherlands and the University of Central Florida (CREOL) in the USA report in the journal Nature Photonics the successful transmission of a record 255 Terabits/s over a new type of fibre allowing 21 times more bandwidth than is currently available in communication networks (Abstract). This new type of fibre could help mitigate the impending optical transmission capacity crunch caused by the increasing bandwidth demand.

[Car Analogy]: The new fibre has seven different cores through which the light can travel, instead of one in current state-of-the-art fibres. That's like going from a one-way road to a seven-lane highway. They also introduce two additional orthogonal dimensions for data transport, as if three cars could drive stacked on top of each other in the same lane. Combining the two methods, they achieve a gross transmission throughput of 255 Terabits/s over the fibre link. This is more than 20 times the current standard of 4-8 Terabits/s.
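The headline figures hang together if you read the "21 times" as the number of spatial channels: 7 cores times 3 stacked modes per core. That reading is an inference from the analogy, not stated outright in the summary. A quick back-of-envelope check:

```python
# Sanity check of the figures in the summary (all taken from the text;
# the "7 cores x 3 modes = 21" reading is an inference, not stated there).
gross_tbps = 255           # reported gross throughput, Tb/s
cores = 7                  # spatial cores in the new fibre
modes = 3                  # "three cars on top of each other" per lane
standard_tbps = (4, 8)     # quoted range for current links, Tb/s

spatial_channels = cores * modes            # parallel spatial channels
per_core = gross_tbps / cores               # throughput per core

print(f"spatial channels: {spatial_channels}")
print(f"per-core throughput: {per_core:.1f} Tb/s")
print(f"vs current standard: {gross_tbps / standard_tbps[1]:.0f}x "
      f"to {gross_tbps / standard_tbps[0]:.0f}x")
```

Even against the high end of the 4-8 Tb/s range the link is roughly 32 times faster, so "more than 20 times" is conservative.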

This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 3) by VLM on Monday October 27 2014, @04:36PM

    by VLM (445) on Monday October 27 2014, @04:36PM (#110577)

    the increasing bandwidth demand.

    For....

    for doubling the number of people with internet access every quarter? Naah that flatlined in the 90s.

    for increased end user demands (to do what? Streaming video is OLD stuff now.)

    I donno, why is demand going up?

    • (Score: 0) by Anonymous Coward on Monday October 27 2014, @04:47PM

      by Anonymous Coward on Monday October 27 2014, @04:47PM (#110582)

      Downloading all those Pirate Bay movies before the MPAA notices and blocks them? ;-)

    • (Score: 2) by WizardFusion on Monday October 27 2014, @04:48PM

      by WizardFusion (498) on Monday October 27 2014, @04:48PM (#110583) Journal

      Netflix, et al.
      But then at those speeds, all the US data caps will be used up in a day. We Europeans don't have any data caps - well I don't :)

      • (Score: 2) by VLM on Monday October 27 2014, @05:13PM

        by VLM (445) on Monday October 27 2014, @05:13PM (#110593)

But isn't netflix old news? I'm looked at like some kind of extreme luddite by my son's friends because I don't have it, and it's almost as old as they are (well, not quite that bad, but close).

        I just have my amazon prime for the free shipping which comes with instant video so I watch that on my streaming boxes and there's always youtube etc.

You could play some games with the math: an HD stream can't be more than 10 megabits or so, and there are about 115 million households in America (averaging about 2.6 people each). So if every household streamed HD TV simultaneously, to one sig fig that's about 1000 terabits/sec, or only about 4 times as much as this fibre can pull.

So if the NSA wanted to monitor every stream of every human in the USA simultaneously, they'd need about 8 of these things, 4 running in and 4 running out, and quite an array of packet sniffers. As if there were one pipe the entire internet runs through.
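The estimate above can be replayed quickly (assumed figures from the comment: 115 million US households, roughly 10 Mb/s per HD stream, one stream per household):

```python
# Back-of-envelope check of the "everyone streams HD at once" estimate.
households = 115e6        # approximate number of US households
stream_bps = 10e6         # one HD stream, ~10 megabits per second
link_bps = 255e12         # the new 255 Tb/s fibre link

total_bps = households * stream_bps     # one stream per household
print(f"total demand: {total_bps / 1e12:.0f} Tb/s")
print(f"links needed one way: {total_bps / link_bps:.1f}")
```

To one significant figure that is indeed about 1000 Tb/s, or four to five of these links in each direction.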

        • (Score: 0) by Anonymous Coward on Monday October 27 2014, @05:37PM

          by Anonymous Coward on Monday October 27 2014, @05:37PM (#110601)

          But isn't netflix old news?

          In the U.S., maybe. In Germany they just started operation.

    • (Score: 2, Interesting) by VitalMoss on Monday October 27 2014, @05:09PM

      by VitalMoss (3789) on Monday October 27 2014, @05:09PM (#110590)

      Well you have to keep in mind a couple things:
      - Competitive Gaming requires low latency, be it Counter Strike or League.
      - Video quality is either already very high or still climbing; the faster it buffers, the better, honestly.
      - Piracy, large file size downloads, etc...
      - Cloud Storage, Cloud Computing, Cloud Clouding...

    • (Score: 2) by Marand on Monday October 27 2014, @05:42PM

      by Marand (1081) on Monday October 27 2014, @05:42PM (#110602) Journal

      I donno, why is demand going up?

Higher quality video, for one. Larger and more numerous images in webpages, for another. The recent push for higher-resolution displays affects both. It also makes app sizes larger, since UI elements ship as bigger pixmaps (pixmaps are still king and probably will be for a while). Games are another source of bandwidth use, with modern ones clocking in at 20+ gigs; one game from this year was over 40 gigs. If you uninstall a game when you get bored and reinstall it later, that turns into repeat downloads at those massive sizes, too.

      On a more speculative note, you've also got things like Adobe testing the ability to run photoshop on a Chromebook with their servers doing the bulk of the work. That sort of use needs a fast connection, because file sizes can get quite large. RAW photos can be massive, as can complex Photoshop documents. Adobe isn't the only one working in this area, either.

      And, of course, you also have existing "cloud" solutions being pushed more aggressively than ever. Limited local storage on devices and a "suggestion" to just put all your music on Google or Apple's servers and stream them all day long instead, etc.

      None of this may appeal to you, but "doesn't matter to me" doesn't invalidate a general desire for more bandwidth. There's always a use for a faster connection, even if you don't see how it could be useful right now. Same way people argued that 40GB hard disks were impossible to fill when they became available. It didn't take long for those drives to fill up, and now Windows alone can take half of that without anything useful installed.

      • (Score: 2) by kaszz on Monday October 27 2014, @05:58PM

        by kaszz (4211) on Monday October 27 2014, @05:58PM (#110609) Journal

        In other words there's always a way to waste any new technology..
        Those kinds of users should be penalized in some way..

        • (Score: 2) by Marand on Monday October 27 2014, @06:29PM

          by Marand (1081) on Monday October 27 2014, @06:29PM (#110625) Journal

          In other words there's always a way to waste any new technology..
          Those kinds of users should be penalized in some way..

          What a ridiculous and ignorant thing to suggest.

          Penalised for what? Preferring high-res displays and UIs tailored to those displays? For watching high-res videos instead of 240p realplayer videos from the 1990s? You might be fine with displays never improving -- maybe you're still rocking a CRT at 640x480 or something -- but some of us are happy that we're finally seeing some momentum in moving past 1920x1080 as "good enough" after a far-too-long period of stagnation in display resolution.

Or is it their fault that modern games use high-res textures, complex models, and high-quality audio that takes more space? I guess you could blame the user for uninstalling a game when bored and reinstalling it later, but disk space isn't infinite, especially if you're using SSDs, and gamers using SSDs isn't really unimaginable, considering how much of an improvement SSDs make to load times.

          I suppose it is, arguably, the user's fault for choosing to use an app remotely in the speculative photoshop/chromebook example, but is it really something the user should be penalised for? Being able to use certain software like Photoshop without having to deal with Windows, wine, or Apple would be pretty awesome.

          Even the cloud storage use case isn't necessarily a waste. It's not my thing, but it works for some people. I might not like it but it's not my place to declare that doing it is "wrong" and that the user should be punished for liking it. Nor is it yours.

          So, again, that was a ridiculous, ignorant, and short-sighted comment. It would have been more honest of you to complain that "they aren't doing what I do so they're wrong and stupid" instead of trying to maintain a pretense of superiority.

          • (Score: 3, Interesting) by kaszz on Monday October 27 2014, @10:11PM

            by kaszz (4211) on Monday October 27 2014, @10:11PM (#110678) Journal

Increasing video and audio resolution adds value to the connection. Using inefficient codecs because transmission capacity can be hogged is a destruction of the commons. Same for leaving out any cache function, or for using transmission capacity as a replacement for storage when cheap storage is available.

Loading the network with software that could easily be run locally also increases demand, which drives up costs for everybody. And it spurs an incentive for corporations to adopt a business model where access to your software can be cut at will.

Cloud storage increases infrastructure demands for many tasks that could be handled locally.

            When the average demand for infrastructure capacity is driven up, it forces suppliers to invest and get the cost back from every user. And thus increases cost for everyone and the minimum fee for access.

        • (Score: 2) by frojack on Monday October 27 2014, @06:29PM

          by frojack (1554) on Monday October 27 2014, @06:29PM (#110626) Journal

          Because we have better uses for this technology?

          Those kinds of users PAY for development of such technology.

          --
          No, you are mistaken. I've always had this sig.
    • (Score: 2) by kaszz on Monday October 27 2014, @05:50PM

      by kaszz (4211) on Monday October 27 2014, @05:50PM (#110606) Journal

Wasted bandwidth? The classic is a console session where every byte typed results in a packet of 40-odd bytes; IPv6 will make this even worse. Another case is streaming sessions (like Netflix) that use parallel unicasts instead of multicast; the same effect happens when the same movie (or file) is downloaded separately by different users. Likewise for backup over the network, or traffic that needs low latency.

So in some cases more transmission capacity makes sense. In many other cases it's just a symptom of capacity wasted by badly designed usage patterns.
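The 40-byte figure matches the minimum TCP/IPv4 headers carried by every one-character packet in an interactive session, and IPv6's fixed 40-byte header pushes it higher. A minimal sketch, ignoring link-layer framing and TCP options:

```python
# Per-keystroke overhead in an interactive (telnet/ssh-style) session,
# where each typed character is sent in its own TCP segment.
TCP_HEADER = 20                          # minimum TCP header, bytes
IP_HEADERS = {"IPv4": 20, "IPv6": 40}    # minimum/fixed IP header, bytes
payload = 1                              # one typed character

for version, ip_header in IP_HEADERS.items():
    total = ip_header + TCP_HEADER + payload
    overhead = (total - payload) / total
    print(f"{version}: {total} bytes on the wire per byte typed "
          f"({overhead:.0%} overhead)")
```

So a keystroke costs at least 41 bytes over IPv4 and 61 over IPv6, before link-layer framing is even counted.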

      • (Score: 2) by frojack on Monday October 27 2014, @06:41PM

        by frojack (1554) on Monday October 27 2014, @06:41PM (#110631) Journal

How does one "waste" capacity?

Wasting capacity is leaving it underutilized on the off chance that in the future something might come along and use that fiber in a way that just happens to meet with the approval of kaszz. Every minute you are not pushing everything you can through the fiber, it is being wasted.

Want to get rid of that 40 bytes of overhead? String point-to-point cables from every origin to every destination. Single use, zero contention, and each cable wasted 99.9999% of the time when nothing happens to be traveling on it.

Packaging and transport of data does not "waste" fiber. It fulfills its very raison d'être.

        --
        No, you are mistaken. I've always had this sig.
        • (Score: 2) by kaszz on Monday October 27 2014, @09:58PM

          by kaszz (4211) on Monday October 27 2014, @09:58PM (#110674) Journal

Waste is when applications force, or invite, usage that could be done locally without huge penalty. This leads to more expensive investment in infrastructure, which has to be amortized by all users in the end. It usually also increases complexity and thus decreases reliability.

          • (Score: 2) by frojack on Monday October 27 2014, @10:33PM

            by frojack (1554) on Monday October 27 2014, @10:33PM (#110682) Journal

I don't believe you will find any justification for that definition.

Even in the present discussion: every time something like Netflix eliminates hopping in the car to go get a movie, or getting a movie in the mail on use-once plastic, delivered by a vehicle with an internal combustion engine, which has to be returned only to be discarded, it eliminates systemic waste.

            Spending a small amount of electricity to transmit that electronically is the perfect antithesis of waste.

            --
            No, you are mistaken. I've always had this sig.
            • (Score: 2) by kaszz on Tuesday October 28 2014, @04:28AM

              by kaszz (4211) on Tuesday October 28 2014, @04:28AM (#110751) Journal

Make streaming services coordinate transmissions so resource usage isn't 1:1 per viewer. Divide transmissions into video segments that many receivers share and cache locally. The waste comes from using a packet network as a broadcast setup.
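The unicast-versus-shared arithmetic behind that point is stark. A sketch with purely illustrative figures (one million concurrent viewers of the same stream at 10 Mb/s):

```python
# Cost of serving one popular stream as parallel unicasts vs one
# shared (multicast or cached) copy per link. Figures are illustrative.
viewers = 1_000_000        # concurrent viewers of the same stream
stream_bps = 10e6          # 10 Mb/s per stream

unicast_bps = viewers * stream_bps    # one copy per viewer
shared_bps = stream_bps               # one copy on the shared link

print(f"unicast: {unicast_bps / 1e12:.0f} Tb/s")
print(f"shared:  {shared_bps / 1e6:.0f} Mb/s")
print(f"ratio:   {unicast_bps / shared_bps:,.0f}x")
```

On the link nearest the source, the pure-unicast model needs a million times the capacity; real CDNs land somewhere in between by caching near the edge.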

  • (Score: 2) by frojack on Monday October 27 2014, @05:27PM

    by frojack (1554) on Monday October 27 2014, @05:27PM (#110598) Journal

    Wait, I thought the required analogy for data transmission was the number of Olympic swimming pools that you could fit in the Library of Congress or something...?

    --
    No, you are mistaken. I've always had this sig.
    • (Score: 2) by black6host on Monday October 27 2014, @08:57PM

      by black6host (3827) on Monday October 27 2014, @08:57PM (#110661) Journal

No, I believe in this case it's the amount of data that can be carried on hard disk drives in a station wagon from point A to point B, typically East Coast to West Coast of the USA. That method used to give the highest "throughput" of data. I'm pretty sure that's not the case any longer. Even with bigger and bigger hard drives. Be interesting to know when we crossed that threshold but I'm not up for the research right now. I'm too busy streaming movies to all the devices in my house at the same time :)

      • (Score: 2) by urza9814 on Friday October 31 2014, @02:02PM

        by urza9814 (3954) on Friday October 31 2014, @02:02PM (#111906) Journal

        I'm pretty sure that's not the case any longer. Even with bigger and bigger hard drives. Be interesting to know when we crossed that threshold but I'm not up for the research right now.

        There's an XKCD "What if" on this topic:
        https://what-if.xkcd.com/31/ [xkcd.com]

The transport capacity of the internet as a whole would surpass that of FedEx as a whole around 2040, assuming drives stop getting bigger (so... probably not). But back to an individual link: the link in the summary transfers about 0.115 exabytes per hour (255 terabits/s is roughly 32 terabytes/s). According to that XKCD, microSD cards hold 160TB per kg, giving about 6,250 kg per exabyte. According to Wikipedia, an Antonov 225 cargo jet can transport 250,000kg, or 40 exabytes, which is about 350 hours of this link's traffic, so for any distance you can fly in under two weeks, sneakernet wins. For a truck, a "Turnpike double" trailer can carry up to 67,000kg, giving ~10.7 exabytes; if you can drive there within about 90 hours, the truck would be faster. A station wagon probably won't beat this link over any significant distance, though.

        Of course, this is comparing an experimental network link to widely available consumer storage, so it's not really a fair comparison. MicroSD cards will probably increase by orders of magnitude before this networking tech is actually being deployed...
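The break-even figures can be reproduced with the comment's own assumptions (160 TB/kg microSD density from the XKCD piece, published payload limits; the 255 Tb/s link moves about 0.115 EB per hour):

```python
# Sneakernet break-even times against the 255 Tb/s link.
link_tbps = 255
link_eb_per_hour = link_tbps / 8 * 3600 / 1e6   # Tb/s -> TB/s -> TB/h -> EB/h

density_tb_per_kg = 160                          # microSD cards, per XKCD
payloads_kg = {"Antonov An-225": 250_000, "Turnpike double": 67_000}

for vehicle, kg in payloads_kg.items():
    capacity_eb = kg * density_tb_per_kg / 1e6   # exabytes aboard
    breakeven_h = capacity_eb / link_eb_per_hour
    print(f"{vehicle}: {capacity_eb:.1f} EB aboard, beats the link "
          f"for trips under {breakeven_h:.0f} hours")
```

So the jet wins for flights under roughly two weeks, and the truck for drives under about four days.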

    • (Score: 1) by WillR on Monday October 27 2014, @09:08PM

      by WillR (2012) on Monday October 27 2014, @09:08PM (#110665)
      We can do swimming pools. "Fast enough to transfer the contents of an Olympic swimming pool full of 128GB SD cards every 10 seconds".
      (Unless I've misplaced a decimal somewhere, which is pretty likely.)
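Here is the requested decimal check, under a couple of assumptions (a full-size 24 x 32 x 2.1 mm SD card, and the 2,500 m^3 FINA minimum Olympic pool): the sustained rate works out to about four litres of 128 GB cards every 10 seconds, and a full pool of them roughly every 72 days.

```python
# How many 128 GB SD cards the 255 Tb/s link "fills" every 10 seconds,
# and how long an Olympic pool's worth of cards would take.
LINK_BPS = 255e12
CARD_BYTES = 128e9                    # 128 GB card
CARD_M3 = 0.024 * 0.032 * 0.0021      # full-size SD card, ~1.6 cm^3
POOL_M3 = 2500                        # FINA minimum Olympic pool volume

cards_per_10s = LINK_BPS * 10 / 8 / CARD_BYTES
litres_per_10s = cards_per_10s * CARD_M3 * 1000
pool_days = POOL_M3 / CARD_M3 * CARD_BYTES * 8 / LINK_BPS / 86400

print(f"{cards_per_10s:.0f} cards every 10 s (~{litres_per_10s:.1f} litres)")
print(f"one Olympic pool of cards every {pool_days:.0f} days")
```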
    • (Score: 2) by Bot on Monday October 27 2014, @11:02PM

      by Bot (3902) on Monday October 27 2014, @11:02PM (#110689) Journal

      Imagine a beowulf cluster of soda cans. How much should you drink to emit 255 teraburps? It would kill you. This explains the awesome bandwidth reached in this new experiment.

      (sorry this is the only "can analogy" that occurred to me, doing the best I can)

      --
      Account abandoned.
  • (Score: 2) by hoochiecoochieman on Monday October 27 2014, @06:26PM

    by hoochiecoochieman (4158) on Monday October 27 2014, @06:26PM (#110622)

    If you send the data on one end of the fiber and then glue both ends together before the light reaches the other end you get one fucking big storage array.

    • (Score: 4, Informative) by kaganar on Monday October 27 2014, @08:38PM

      by kaganar (605) on Monday October 27 2014, @08:38PM (#110656)
      Supposing this did work, there are some very practical issues to work out. Suppose I want to store a terabyte (the decimal kind, 1000^4 bytes) of data:
      • The fiber loop would be almost a quarter of the way around the earth.
      • It would weigh more than a blue whale.
      • If compacted, the fiber would be about the size of a somewhat small family swimming pool.

      ... think I'll stick with what I've got. ;-)

      Maths and terrible found-it-on-the-internet assumptions:

If I transferred over fiber at 255 terabits per second, I'd end up needing a burst of 8/255th of a second (8 bits in a byte and all). The speed of light is 299,792,458 m/s. The distance travelled would be 299,792,458 m/s * 8/255 s = 9,405,253.6 meters, or about 9,405 km. For reference, the earth is about 40,075 km around. Every 50 meters of cable weighs approximately one kilogram, so that's about 188,105 kilograms, or about 207 short tons. A blue whale weighs about 190 short tons. Perhaps less impressively, let's consider volume. Supposing the radius is 1 mm (a little thick, I admit), 3.141592 * (0.001 m)^2 * 9,405,253.6 m = 29.55 m^3 -- about the size of a small-ish home swimming pool.
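The arithmetic can be replayed in a few lines (same assumptions as the comment: a decimal terabyte in flight, 1 kg per 50 m of cable, a 1 mm cable radius; like the comment, this ignores the fibre's refractive index, which would shorten the loop by about a third since light in glass travels at roughly c/1.47):

```python
import math

# The "glue the fibre into a loop" storage scheme: how much fibre keeps
# 1 TB in flight at 255 Tb/s?
C = 299_792_458          # speed of light in vacuum, m/s
LINK_BPS = 255e12        # link throughput, bits/s
DATA_BITS = 8e12         # 1 decimal TB = 8e12 bits
KG_PER_M = 1 / 50        # assumed ~1 kg per 50 m of cable
RADIUS_M = 0.001         # assumed 1 mm cable radius

length_m = C * DATA_BITS / LINK_BPS     # time in flight x propagation speed
mass_kg = length_m * KG_PER_M
volume_m3 = math.pi * RADIUS_M**2 * length_m

print(f"loop length: {length_m / 1000:,.0f} km")
print(f"mass: {mass_kg:,.0f} kg (~{mass_kg / 907.18:.0f} short tons)")
print(f"compacted volume: {volume_m3:.1f} m^3")
```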

  • (Score: 1) by anubi on Tuesday October 28 2014, @10:40AM

    by anubi (2828) on Tuesday October 28 2014, @10:40AM (#110802) Journal

    The bandwidth of an ejaculation is 1687.5 TeraBytes/Sec.... [blogspot.com]

    Never underestimate the bandwidth of a guy with a trunk full of usb sticks....

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
    • (Score: 0) by Anonymous Coward on Tuesday October 28 2014, @05:17PM

      by Anonymous Coward on Tuesday October 28 2014, @05:17PM (#110905)

      However there's extremely high redundancy in that data transmission. Indeed, after removing that redundancy, the remaining data is less than a gigabyte.