Arthur T Knackerbracket has found the following story:
The Xbox Series X will be missing the optical S/PDIF audio output that was present on the Xbox One and Xbox 360 hardware lines.
[...] The removal will mainly impact players who use a small subset of high-end gaming headsets and audio systems that rely on the optical audio connection instead of audio sent over HDMI or Microsoft's wireless standard. Some users will be able to use S/PDIF passthrough output from their TV set as a replacement, though. And Windows Central reports that wireless headset makers like Astro are already working on solutions to make existing Xbox One-compatible S/PDIF products work on the Series X.
Microsoft has also confirmed that the Series X will be missing the IR extension port that was present on the back of the Xbox One and the IR blaster that was present on the Xbox One S. Those features were only really useful in extremely limited circumstances, such as for Xbox users who wanted to use the system's TV remote control functions without plugging in a Kinect sensor.
(Score: 4, Interesting) by acid andy on Friday March 20 2020, @10:34AM (14 children)
If you really want hi-fidelity gaming, why aren't you using a desktop PC?
Master of the science of the art of the science of art.
(Score: 0) by Anonymous Coward on Friday March 20 2020, @10:51AM (12 children)
If you need it, you have your HDMI from your console to your receiver, and the receiver breaks it out to S/PDIF (assuming it's not HDCP-encoded and the receiver doesn't refuse to output it via S/PDIF).
The primary reason for S/PDIF back in the day was that the alternative was analog copper wires, which did cause signal degradation from a variety of noise sources. With modern digital audio built to run over copper at the same cost as an S/PDIF cable or less, it's pretty silly to bother with it, especially given the components required to support it and the maximum frequency and resolution it supports.
(Score: 0) by Anonymous Coward on Friday March 20 2020, @01:39PM (9 children)
You didn't need optical even back then to have noise free audio carried over your cable.
Copper wires carrying an analog differential signal with possibly a braided shield over both wires would've worked just fine.
For a long wire run, slightly higher signal voltages would improve noise immunity.
Digital, done properly, of course offers the highest noise immunity of all.
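As an illustrative aside, here's a minimal numeric sketch (Python with NumPy; all the numbers are made up for illustration) of why a differential pair kills that noise: hum and pickup couple into both conductors roughly equally, so they cancel when the receiver subtracts one wire from the other.

```python
import numpy as np

# Sketch: common-mode noise rejection on a balanced (differential) pair.
# The pair carries +signal on one wire and -signal on the other; noise
# coupled into the cable run lands on both wires about equally.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 1000)

signal = np.sin(2 * np.pi * 5 * t)           # the audio we actually want
hum = 0.5 * np.sin(2 * np.pi * 60 * t)       # 60 Hz mains hum coupling in
pickup = 0.1 * rng.standard_normal(t.size)   # broadband common-mode pickup

wire_pos = +signal + hum + pickup            # hot conductor
wire_neg = -signal + hum + pickup            # cold conductor

single_ended = wire_pos                      # a lone wire delivers signal + noise
differential = (wire_pos - wire_neg) / 2     # the receiver subtracts the pair

print("worst-case error, single-ended:", np.abs(single_ended - signal).max())
print("worst-case error, differential:", np.abs(differential - signal).max())
```

The differential error comes out at essentially zero because the hum and pickup terms are identical on both wires; in a real cable the coupling is only approximately equal, which is what the shield and twisting are for.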
(Score: 1) by VacuumTube on Friday March 20 2020, @02:47PM (8 children)
A digital signal carried by a copper wire works just as well as one carried by fiber. Neither is bothered by noise on the transmission line.
(Score: 2) by Immerman on Friday March 20 2020, @03:45PM (6 children)
>A digital signal carried by a copper wire works just as well as one carried by fiber.
Agreed. I never really understood the point of optical S/PDIF when copper wires are so much more robust and easier to work with. More "high tech", I guess. Fiber optics do have the advantage of being pretty much completely immune to radio noise, which is often a big problem around powerful AV equipment. But a decently shielded or even twisted-pair copper cable can reduce that noise to acceptable levels for digital transmission easily enough, especially around relatively low-power home AV equipment.
>Neither are bothered by noise on the transmission line.
Well, not strictly true - it just takes a lot more noise to disrupt a digital signal.
Analog degrades "gracefully", as you could see watching an old analog TV broadcast from too far away - you could still make out the voices and images through the "snow" if you really wanted to. Digital doesn't degrade at all... until it does, and then it immediately gets so bad that you're unlikely to get any useful data at all - as seen when trying to watch a modern digital TV broadcast from too far away. (Yeah, yeah, I know that's technically about signal strength while holding the noise level constant, but the end result is much the same.)
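For anyone who wants to see that cliff in numbers, here's a toy simulation (Python with NumPy; the voltages and noise levels are invented for illustration). A thresholded digital receiver decodes essentially error-free until the noise approaches the signal swing, then the bit error rate shoots up; the same noise added to an analog signal would simply appear in the output in direct proportion.

```python
import numpy as np

# Toy model of the digital "cliff": bits sent as +/-1 V symbols through
# additive Gaussian noise, decoded with a simple threshold at 0 V.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 100_000)
symbols = 2.0 * bits - 1.0                   # map 0/1 to -1 V / +1 V

for noise_rms in (0.1, 0.3, 0.5, 0.8, 1.2):
    received = symbols + noise_rms * rng.standard_normal(bits.size)
    decoded = (received > 0).astype(int)     # threshold detector
    ber = np.mean(decoded != bits)           # fraction of bits flipped
    print(f"noise RMS {noise_rms:.1f} V -> bit error rate {ber:.5f}")
```

At 0.1 V of noise the error rate is effectively zero; by the time the noise RMS rivals the 1 V symbol swing, a fifth of the bits are flipping and the link is useless - there's no gentle middle ground.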
(Score: 2) by aiwarrior on Friday March 20 2020, @05:22PM (1 child)
Your statement that digital signals degrade ungracefully is a bit unfair. If you say an analog signal inherently degrades gracefully, I will give you that. On the other hand, nowadays no signal is passed without some form of modulation, and those schemes in turn degrade gracefully; nor is the datastream ever left unencoded by some form of FEC or BEC system.
(Score: 2) by Immerman on Friday March 20 2020, @06:17PM
I'm not sure I understand your argument. Obviously modulation is required - that's the only way you can send a signal wirelessly - and analog signals are all modulated as well: AM radio = Amplitude Modulated, FM = Frequency Modulated. Even "modem" = MODulator/DEModulator. Nothing special about modulation. Even smoke signals rely on modulation for signal transmission.
Digital signals do often include error correction as well, and that does help raise the bar some, but 50% data loss (or even 20%) is still likely to mean nothing useful gets through, especially if any form of compression is involved. At 50% loss on an analog signal, the snow will be getting pretty annoying, but it probably won't interfere with comprehension, since the human brain is an absolutely astounding signal processor. In practice, that mostly means an analog signal will stay useful much further beyond its "design range" than a digital one.
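To put a rough number on "raises the bar", here's a sketch (Python with NumPy, illustrative only - real links use far better codes than this) of the simplest possible FEC, a 3x repetition code with majority voting. It crushes low error rates but collapses as the raw channel error rate climbs toward 50%.

```python
import numpy as np

# Sketch of FEC raising the floor: send every bit three times, decode
# by majority vote. Isolated flips get corrected; once the channel is
# bad enough that two of three copies flip, the vote itself goes wrong.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 100_000)

for p in (0.01, 0.05, 0.10, 0.20, 0.35, 0.50):
    coded = np.repeat(bits, 3)                  # 3x repetition code
    flips = rng.random(coded.size) < p          # channel flips each bit w.p. p
    received = coded ^ flips
    votes = received.reshape(-1, 3).sum(axis=1) # count 1s among the 3 copies
    decoded = (votes >= 2).astype(int)          # majority vote
    print(f"raw error {p:.2f} -> after FEC {np.mean(decoded != bits):.4f}")
```

A 1% raw error rate comes out around 0.03% after voting, but a 35% channel is still ruinous - the cliff again, just pushed a bit further out.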
Really though, my point was just that digital signals aren't completely immune to interference, they just raise the floor - minor interference may as well not exist. But as the signal-to-noise ratio falls, analog signals degrade like going down a hill, while digital signals remain pristine until they suddenly fall off a cliff.
(Score: 3, Informative) by toddestan on Friday March 20 2020, @09:52PM (2 children)
One other advantage of S/PDIF is that it eliminates ground loops. I had an annoying 60 Hz hum when hooking up my PC to my receiver. Analog was bad, a digital copper link was better, but S/PDIF completely eliminated the hum, as there is no electrical connection whatsoever.
With that said, I would assume this isn't such a big deal with purpose-built audio equipment designed with proper isolation. However, I've found PCs in general aren't so good about it.
(Score: 2) by Immerman on Friday March 20 2020, @10:44PM
Okay, now that actually makes sense, thanks. Ditching copper doesn't reduce noise in the signal itself, but it eliminates noise introduced into the electronics as a side effect of the electrical interconnect provided by the copper.
Of course you could theoretically electrically isolate the signal carrier from the rest of the system, but doing so effectively might be even more expensive than TOSLINK.
(Score: 0) by Anonymous Coward on Friday March 20 2020, @10:48PM
If you picked up a 60 Hz hum in your sound while using a copper digital connection, I can tell you that it is impossible for that 60 Hz interference to be due to the digital transmission. Clearly the analog part of that circuitry was getting the hum. As you say, it could be a ground loop. That would mean the cables weren't designed correctly or the amplifiers receiving the digital signal (digital signals are all analog of course until they are converted to digital ;-)) were also of crappy design. Done properly, there would be no detectable noise. Optical transmission is more "idiot proof" that way...
(Score: 1) by VacuumTube on Saturday March 21 2020, @07:32PM
"Well, not strictly true - it just takes a lot more noise to disrupt a digital signal."
I was speaking in the context of a digital audio signal used in consumer equipment to connect a peripheral to an amplifier. At audio frequencies fiber optic equipment makes no sense as even antiquated technology is capable of carrying the signal without error for the few feet required. If the digital signal can be deciphered it transmits the information perfectly, within the limits of the technology used to encode it in the first place. So it either works or it doesn't. There's no noise added on account of the transmission medium, whatever it may be.
(Score: 0) by Anonymous Coward on Friday March 20 2020, @05:07PM
I never said digital signals couldn't run over copper wire. Hell, that's exactly what they do over common Ethernet cable.
(Score: 0) by Anonymous Coward on Friday March 20 2020, @03:41PM
Normally (although it is technically possible in the protocol) the audio packets are not encrypted when HDCP is in use. So you can get audio splitters with whatever connectivity you want and they will work... although some of the cheapass splitter devices are likely fine at ~150 MHz FHD but will fail horribly at higher clock rates. If you need it, good stuff that works reliably at high rates is available but can be somewhat expensive (e.g., look for Extron gear on eBay).
(Score: 2) by Immerman on Friday March 20 2020, @06:24PM
Heck, S/PDIF actually explicitly includes digital transmission over copper wire with RCA jacks as an option in the spec. Why the expensive, finicky optical TOSLINK variant is the one that became popular is beyond me. The only thing I can think of is that "optical" sounds high tech. I could also see consumers using audio-grade RCA cables possibly seeing degraded signals - though I don't think that's much of a problem with only a single signal wire - it's not like USB or HDMI, where timing issues from different wire lengths can prevent old cables from operating at new speeds.
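For the curious, the reason the same bitstream runs over either medium is the line code: S/PDIF uses biphase-mark coding, which is self-clocking and doesn't care about polarity. A toy encoder in Python (illustrative only - this is just the line code, not the full S/PDIF frame format with its preambles, status bits, and parity):

```python
# Biphase-mark coding (BMC), the line code S/PDIF uses on both its
# coaxial and optical variants. Every bit cell starts with a level
# transition; a '1' adds a second transition mid-cell. The decoder
# recovers the clock from the guaranteed transitions and doesn't care
# whether the whole stream is inverted.

def bmc_encode(bits, level=0):
    """Return two half-cells per input bit; `level` is the line state."""
    out = []
    for bit in bits:
        level ^= 1           # mandatory transition at the cell boundary
        out.append(level)
        if bit:
            level ^= 1       # extra mid-cell transition encodes a '1'
        out.append(level)
    return out

print(bmc_encode([1, 0, 1, 1, 0]))
# -> [1, 0, 1, 1, 0, 1, 0, 1, 0, 0]
```

Since the information is entirely in the transitions, a coax run, an RCA cable, or a TOSLINK LED all carry it equally well - which rather supports the point that the optical variant's popularity was marketing more than engineering.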
(Score: 0) by Anonymous Coward on Friday March 20 2020, @08:15PM
If you want a semi-serious answer to that question: because some of us got fed up with the ever-changing hardware specs required to play games on PCs. It seemed like a hardware upgrade cycle was required every time another PC game developer decided to push the envelope with their latest release (looks at the box containing all the graphics cards I've upgraded from over the past five years... and several sound cards). Consoles, on the whole, just bloody worked.
This is only semi-serious, as I gave up gaming in general about two years ago. I will admit to firing up the Xbox 360 to play the old Xbox game Black when I get really frustrated with a coding issue and need to 'work it through' by not thinking about it (an old, bad habit... which seems to work), but that's about it these days.
(Score: 4, Funny) by kazzie on Friday March 20 2020, @01:07PM
Here we are, with S/PDIF being tossed into the graveyard of obsolescence, long before I ever had the need to use it.
(Score: 2) by crafoo on Friday March 20 2020, @01:37PM (1 child)
S/PDIF uses light beams and that's really cool, so I'll miss it. HDMI is a 100% shitshow that frequently overrides the interests of the user to enforce DRM, in various broken & incompatible ways. Signs of things to come.
(Score: 0) by Anonymous Coward on Friday March 20 2020, @07:01PM
I used S/PDIF only once, but found out in the settings that HDMI would have better audio bandwidth. It had something to do with which device was better at decoding the light.
(Score: 3, Interesting) by takyon on Friday March 20 2020, @02:08PM (2 children)
Xbox Series X Will Feature Audio Ray Tracing, Director of Program Management Reveals [wccftech.com]
Sony’s Next-gen PS5 3D Audio is Taking Steps as Big as Graphics and is a Dream Come True, Composer and Dev Say [wccftech.com]
https://en.wikipedia.org/wiki/PlayStation_5#Hardware [wikipedia.org]
Microsoft: Xbox Series X Performance Is 25+ TFLOPs when Ray Tracing; I/O Rate Equal to 13 Zen 2 Cores [wccftech.com]
Both the XSX and PS5 will have some kind of dedicated audio chip. That is potentially more interesting than the raytraced graphics.
It could take at least a 12-core CPU to effectively match the consoles' capabilities, when dedicated chips or background cores (ARM in PS5?) are taken into account.
Total RAM (shared with GPU) is at 16 GB, which is at the lower end of the 16-32 GB that was expected.
Both consoles will ship with relatively fast SSDs instead of HDDs. That could have an impact on games beyond just reducing loading times, since before this console generation, game developers had to cater to a large number of console and PC HDD users.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by Freeman on Friday March 20 2020, @05:50PM
Consoles with standard SSDs would have seen a significant user experience improvement; the nice and fast SSDs they're shipping with will have an even greater positive impact. And not just on game load times: OS booting, listing/accessing videos, audio, etc. The next-gen consoles seem like they're on the right track for significantly improving the user experience.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 2) by Freeman on Friday March 20 2020, @05:55PM
The 16 GB of shared RAM is kind of sad to see, as a typical gaming desktop would have a dedicated GPU with 8 GB of RAM plus 16 GB+ of system RAM, for a total of 24 GB+. Still, it's a significant upgrade from last gen.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"