
posted by Fnord666 on Friday March 20 2020, @10:33AM
from the any-port-in-a-storm dept.

Arthur T Knackerbracket has found the following story:

The Xbox Series X will be missing the optical S/PDIF audio output that was present on the Xbox One and Xbox 360 hardware lines.

[...] The removal will mainly impact players who use a small subset of high-end gaming headsets and audio systems that rely on the optical audio connection instead of audio sent over HDMI or Microsoft's wireless standard. Some users will be able to use S/PDIF passthrough output from their TV set as a replacement, though. And Windows Central reports that wireless headset makers like Astro are already working on solutions to make existing Xbox One-compatible S/PDIF products work on the Series X.

Microsoft has also confirmed that the Series X will be missing the IR extension port that was present on the back of the Xbox One and the IR blaster that was present on the Xbox One S. Those features were only really useful in extremely limited circumstances, such as for Xbox users who wanted to use the system's TV remote control functions without plugging in a Kinect sensor.


  • (Score: 2) by Immerman on Friday March 20 2020, @03:45PM (6 children)

    by Immerman (3985) on Friday March 20 2020, @03:45PM (#973530)

    >A digital signal carried by a copper wire works just as well as one carried by fiber.

    Agreed. I never really understood the point of optical S/PDIF when copper wires are so much more robust and easy to work with. More "high tech" I guess. Fiber optics have an advantage that they're pretty much completely immune to radio noise, which is often a big problem around powerful AV equipment. But a decently shielded or even twisted-pair copper cable can reduce that noise to acceptable levels for digital transmission easily enough, especially around relatively low-power home AV equipment.

    >Neither are bothered by noise on the transmission line.
    Well, not strictly true - it just takes a lot more noise to disrupt a digital signal.

    Analog degrades "gracefully", as could be seen in watching an old TV broadcast from too far away - you could still make out the voices and images through the "snow", if you really wanted to. Digital doesn't degrade at all... until it does, and then it immediately gets so bad you're unlikely to get any useful data at all - as seen trying to watch a modern TV broadcast from too far away. (Yeah, yeah, I know technically that's dealing with signal strength while holding the noise level constant, but the end result is much the same)
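
    To put a number on that "hill vs. cliff" shape, here's a minimal toy simulation (illustrative only, not from the comment): an analog level degrades in proportion to the noise, while hard-sliced digital bits stay essentially error-free until the noise approaches the decision threshold and then fail quickly.

        import random

        random.seed(1)
        N = 100_000
        bits = [random.randint(0, 1) for _ in range(N)]

        for noise_rms in (0.1, 0.3, 0.5, 0.7, 0.9):
            # "analog": average distance of the received level from what was sent
            analog_err = sum(abs(random.gauss(0, noise_rms)) for _ in range(N)) / N
            # "digital": send +/-1 V, slice at 0 V, count wrong decisions
            errors = 0
            for b in bits:
                rx = (1.0 if b else -1.0) + random.gauss(0, noise_rms)
                if (rx > 0) != bool(b):
                    errors += 1
            print(f"noise={noise_rms:.1f}  analog error={analog_err:.2f}  "
                  f"BER={errors/N:.5f}")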

  • (Score: 2) by aiwarrior on Friday March 20 2020, @05:22PM (1 child)

    by aiwarrior (1812) on Friday March 20 2020, @05:22PM (#973563) Journal

    Your statement that digital signals degrade ungracefully is a bit unfair. If you say an analog signal inherently degrades gracefully, I will give you that. On the other hand, nowadays hardly any signal is passed without some form of modulation, which in turn degrades gracefully, or without the data stream being encoded with some form of FEC or BEC scheme.

    • (Score: 2) by Immerman on Friday March 20 2020, @06:17PM

      by Immerman (3985) on Friday March 20 2020, @06:17PM (#973581)

      I'm not sure I understand your argument. Obviously modulation is required - that's the only way you can send a signal wirelessly - and analog signals are all modulated as well: AM radio = Amplitude Modulation, FM = Frequency Modulation. Even modem = MODulator/DEModulator. There's nothing special about modulation. Even smoke signals rely on modulation for signal transmission.

      Digital signals do often include error correction as well, and that does help raise the bar some, but 50% data loss (or even 20%) is still likely to mean nothing useful gets through, especially if any form of compression is involved. With 50% loss on an analog signal, the snow will be getting pretty annoying, but it probably won't interfere with comprehension, since the human brain is an absolutely astounding signal processor. In practice, that mostly means that an analog signal will be useful much further beyond its "design range" than a digital one.

      Really though, my point was just that digital signals aren't completely immune to interference, they just raise the floor - minor interference may as well not exist. But as the signal-to-noise ratio falls, analog signals degrade like going down a hill, while digital signals remain pristine until they suddenly fall off a cliff.
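
      A toy sketch of how FEC moves the cliff without removing it (illustrative only, not from the thread): a 3x repetition code with majority voting fixes rare bit flips, but once raw errors become common the corrected stream collapses too.

          import random

          random.seed(2)
          N = 100_000

          for raw_ber in (0.001, 0.01, 0.05, 0.2, 0.5):
              uncoded = corrected = 0
              for _ in range(N):
                  bit = random.randint(0, 1)
                  # each of three copies flips with probability raw_ber
                  rx = [bit ^ (random.random() < raw_ber) for _ in range(3)]
                  uncoded += (rx[0] != bit)              # no coding: use the first copy
                  corrected += ((sum(rx) >= 2) != bit)   # majority vote of the three
              print(f"raw BER={raw_ber:<6}  uncoded={uncoded/N:.4f}  "
                    f"3x majority={corrected/N:.4f}")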

  • (Score: 3, Informative) by toddestan on Friday March 20 2020, @09:52PM (2 children)

    by toddestan (4982) on Friday March 20 2020, @09:52PM (#973646)

    One other advantage of optical S/PDIF is that it eliminates ground loops. I had an annoying 60 Hz hum hooking up my PC to my receiver. Analog was bad, a digital copper link was better, but optical S/PDIF completely eliminated the hum, as there is no electrical connection whatsoever.

    With that said, I would assume this isn't such a big deal with purpose-built audio equipment designed with proper isolation. However, I've found PCs in general aren't so good about it.

    • (Score: 2) by Immerman on Friday March 20 2020, @10:44PM

      by Immerman (3985) on Friday March 20 2020, @10:44PM (#973663)

      Okay, now that actually makes sense, thanks. Ditching copper doesn't reduce noise in the signal itself, but it eliminates noise introduced into the electronics as a side effect of the electrical interconnect provided by the copper.

      Of course you could theoretically electrically isolate the signal carrier from the rest of the system, but doing so effectively might be even more expensive than TOSLINK.

    • (Score: 0) by Anonymous Coward on Friday March 20 2020, @10:48PM

      by Anonymous Coward on Friday March 20 2020, @10:48PM (#973664)

      If you picked up a 60 Hz hum in your sound while using a copper digital connection, I can tell you that it is impossible for that 60 Hz interference to be due to the digital transmission. Clearly the analog part of that circuitry was getting the hum. As you say, it could be a ground loop. That would mean the cables weren't designed correctly or the amplifiers receiving the digital signal (digital signals are all analog of course until they are converted to digital ;-)) were also of crappy design. Done properly, there would be no detectable noise. Optical transmission is more "idiot proof" that way...
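
      As a speculative illustration of why the hum can't ride in through the decoded data itself (my own toy numbers, not the poster's): a 60 Hz voltage superimposed on a copper S/PDIF-style line doesn't change the recovered bits, because the receiver only cares whether the level is above or below its threshold.

          import math, random

          random.seed(3)
          data_rate = 3_072_000   # approx. S/PDIF data rate for 48 kHz stereo, bits/s
          hum = 0.3               # volts of 60 Hz hum riding on a nominal 0/1 V line

          bits = [random.randint(0, 1) for _ in range(10_000)]
          recovered = []
          for i, b in enumerate(bits):
              t = i / data_rate
              level = float(b) + hum * math.sin(2 * math.pi * 60 * t)
              recovered.append(1 if level > 0.5 else 0)   # simple threshold receiver

          print("bits corrupted by hum:",
                sum(a != b for a, b in zip(bits, recovered)))
          # -> 0, as long as the hum stays inside the noise margin; the audible hum
          #    has to enter through the analog stages (e.g. a ground loop) instead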

  • (Score: 1) by VacuumTube on Saturday March 21 2020, @07:32PM

    by VacuumTube (7693) on Saturday March 21 2020, @07:32PM (#973902) Journal

    "Well, not strictly true - it just takes a lot more noise to disrupt a digital signal."

    I was speaking in the context of a digital audio signal used in consumer equipment to connect a peripheral to an amplifier. At audio frequencies, fiber optic equipment makes no sense, as even antiquated technology is capable of carrying the signal without error for the few feet required. If the digital signal can be deciphered, it transmits the information perfectly, within the limits of the technology used to encode it in the first place. So it either works or it doesn't. There's no noise added on account of the transmission medium, whatever it may be.
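
    For a rough sense of just how undemanding that link is (my own arithmetic for the common 48 kHz case, not from the comment):

        # Each S/PDIF subframe is 32 bits and carries one sample for one channel.
        sample_rate = 48_000     # Hz
        channels = 2
        subframe_bits = 32

        data_rate = sample_rate * channels * subframe_bits
        print(f"S/PDIF data rate: {data_rate/1e6:.3f} Mbit/s")   # ~3.1 Mbit/s
        # Biphase-mark coding roughly doubles the transition rate on the wire,
        # so the physical link only needs bandwidth on the order of ~6 MHz --
        # trivial for cheap coax or plastic optical fibre over a few feet.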