
SoylentNews is people

posted by Fnord666 on Sunday December 03 2017, @08:06AM   Printer-friendly
from the stand-on-your-head dept.

So that's why:

The USB paradox is one of the most familiar experiences of the digital age. Every time you try to plug in a USB cord, it seems like you always get it wrong on the first try. It doesn't matter how much attention you pay to the plug or the cord or the icons on the cord. It's always wrong.

And there's a good reason for that! In an interview published Thursday by DesignNews, Intel's Ajay Bhatt spoke at length about why the ubiquitous technology has been so infuriating for so long. Bhatt was a member of the team that developed USB technology. Even at the start of development, they knew that making the connector flippable would be a better user experience in the long run. But doing so would require twice the wiring and more circuitry, which would increase costs.

"If you have a lot of cost up front for an unproven technology it might not take off. So that was our fear. You have to be really cost conscious when you start out," Bhatt said.


Original Submission

This discussion has been archived. No new comments can be posted.
  • (Score: 4, Informative) by Rich on Monday December 04 2017, @12:46AM

    by Rich (945) on Monday December 04 2017, @12:46AM (#604865) Journal

    What I don't understand about USB is why it started out so slow.

    The article distorts history. USB only gets compared to FireWire, which was only specified in 1995 and didn't reach a wide audience until 2000, when the "PowerBook G3 (FireWire)", aka "Pismo", was introduced. By then USB 1 was already widespread in hardware, as can be seen on the legendary Asus XP55T2P4 boards. No one bothered to connect the headers to the outside, though, and it was only W95OSR2 that delivered the first support, iirc. I had one of these Asus boards and only bought and connected a slot panel with two USB plugs when I saw them on fire sale.

    USB initially had to compete with ADB, an almost-good solution designed by none other than Woz himself. That's probably where the "Low Speed" specs come from: coincidentally, about the maximum speed you'll get over a short-run single-ended cable connection with TTL-level drivers. Later on, someone at Intel probably put up a requirement that they had to do stereo CD audio, and for good measure hit 1 MB/s net. Coincidentally, about the maximum speed one will reliably get over a PCB trace at TTL levels. I would assume that the differential signaling, the point-to-point layout, and isochronous endpoints slipped into the feature set by that time.
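A quick sanity check of those figures, using the public USB 1.x signaling rates and the Red Book CD-audio rate (the "1 MB/s net" claim is the commenter's; net throughput after protocol overhead is lower than the raw rates computed here):

```python
# Back-of-the-envelope check of the bandwidth figures mentioned above.
# Constants are the standard CD-DA and USB 1.x numbers, not measurements.

CD_SAMPLE_RATE = 44_100   # Hz, Red Book CD audio
CD_CHANNELS = 2           # stereo
CD_SAMPLE_BYTES = 2       # 16-bit samples

cd_audio_bytes_per_sec = CD_SAMPLE_RATE * CD_CHANNELS * CD_SAMPLE_BYTES
print(f"CD audio stream: {cd_audio_bytes_per_sec:,} B/s")  # 176,400 B/s

USB_LOW_SPEED_BPS = 1_500_000    # 1.5 Mbit/s raw signaling rate
USB_FULL_SPEED_BPS = 12_000_000  # 12 Mbit/s raw signaling rate

print(f"Low speed raw:   {USB_LOW_SPEED_BPS // 8:,} B/s")   # 187,500 B/s
print(f"Full speed raw:  {USB_FULL_SPEED_BPS // 8:,} B/s")  # 1,500,000 B/s
```

Stereo CD audio (176,400 B/s) only barely fits under the low-speed raw rate, so with protocol overhead it effectively forces full speed, whose 1.5 MB/s raw rate is consistent with roughly 1 MB/s net.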

    I also assume that USB was always meant to be dirt cheap. That is, they a) had to interface with endpoints running on very primitive microcontrollers inside mice and their ilk, and b) absolutely wanted to get away without requiring a PHY chip like Ethernet needed (mind you, this was when AUI/AAUI was on the computer, requiring an external PHY, because no one knew whether Thickwire, Thinwire, or Twisted Pair was to be used). If you compare the USB connectors with other connectors from back then, you'll find them a bit weird, and that is probably because the A plugs could be built with PCB technology, right-angle sheet folding, and injection moulding, whereas other connectors needed milled pins, plastic carriers, and stamp-formed shells.

    So, the hardware makes a lot of sense for those times. The protocol and software, however, have a few strange and complicated idiosyncrasies, which I can only explain by Intel trying to get some patentable weirdness into the standard. It also seems that Intel tried to play foul after getting adopters on board, which led some of those adopters to develop OHCI versus Intel's own UHCI controller.

    The high-speed variants only came later, as an afterthought, to counter (and probably with the intent to destroy) FireWire, and later with the intent to leverage the installed base instead of doing a clean-slate high-speed interconnect. Those are all pretty badly kludged on (e.g. you need EHCI plus either UHCI or OHCI for high speed). In 1995 a lot of people were probably assuming that serious transfer rates to local I/O devices were the firm domain of SCSI and its descendants.

    And then, not going for the technical limit is of course natural if you own a monopoly, because you can cash in two or three times with smaller steps instead. (The SD card limits, and I suspect photo sensors too, exhibit that pretty well.)
