
SoylentNews is people

posted by cmn32480 on Friday March 04 2016, @02:47PM
from the how-much-is-not-enough dept.

Submitted via IRC for Bytram

It's been almost a year now since Oculus announced that the consumer version of the Rift virtual reality headset would only support Windows PCs at launch—a turnaround from development kits that worked fine on Mac and Linux boxes. Now, according to Oculus co-founder Palmer Luckey, it "is up to Apple" to change that state of affairs. Specifically, "if they ever release a good computer, we will do it," he told Shacknews recently.

Basically, Luckey continued, even the highest-end Mac you can buy would not provide an enjoyable experience on the final Rift hardware, which is significantly more powerful than early development kits. "It just boils down to the fact that Apple doesn't prioritize high-end GPUs," he said. "You can buy a $6,000 Mac Pro with the top-of-the-line AMD FirePro D700, and it still doesn't match our recommended specs."

"So if they prioritize higher-end GPUs like they used to for a while back in the day, we'd love to support Mac. But right now, there's just not a single machine out there that supports it," he added. "Even if we can support on the software side, there's just no audience that could run the vast majority of software on it."

Source: http://arstechnica.com/gaming/2016/03/oculus-founder-rift-will-come-to-mac-if-apple-ever-release-a-good-computer/.
See also: Shacknews blog.


Original Submission

 
This discussion has been archived. No new comments can be posted.
The Fine Print: The following comments are owned by whoever posted them. We are not responsible for them in any way.
  • (Score: 5, Informative) by Immerman on Friday March 04 2016, @03:20PM

    by Immerman (3985) on Friday March 04 2016, @03:20PM (#313694)

    As I recall, the FirePro line consists of extremely high-end GPUs; they just don't offer the raw pixel-pushing performance of a sloppy gaming-oriented video card, instead prioritizing the extreme accuracy needed for professional 3D modeling and CAD applications. Or has that changed?

  • (Score: 2) by Geotti on Friday March 04 2016, @04:23PM

    by Geotti (1146) on Friday March 04 2016, @04:23PM (#313752) Journal

    No, but that's exactly the point: it's a workstation, not a gaming PC.
    I'll continue playing around with the DK1 and maybe upgrade the internal display at some point, but I'm sure as hell not going back to Winblows. Well, we'll see how quickly open drivers appear, anyway.

  • (Score: 2, Informative) by mobydisk on Friday March 04 2016, @07:42PM

    by mobydisk (5472) on Friday March 04 2016, @07:42PM (#313875)

    I keep hearing that same thing, but it doesn't add up. I'm not sure about today, but ~5 years ago the FireGL was actually the same hardware as the gaming GPUs, just with slightly different drivers and about twice the cost.

    What "accuracy" are they referring to? The video card is just rendering, right? It's not doing any of the engineering work.

    Right now I can see the mechanical engineers a few cubes over from me using SolidWorks on their expensive FireGL cards. I see non-anti-aliased lines and simple solid-filled polygons. Now, if you told me the driver needs to be optimized for the raw number of lines and polys, I might believe you. But if so, you'd think they could at least get something better than badly aliased lines. It looks like the software-only 3D I got on my 386 years ago. Actually worse - the lines aren't even solid; they get holes and dashes in them at high angles. It's really quite poor. What gives?

    • (Score: 2) by Immerman on Saturday March 05 2016, @03:40AM

      by Immerman (3985) on Saturday March 05 2016, @03:40AM (#314058)

      As I recall, one of the big issues is depth-buffer accuracy - that weirdness/sparkling you get when two faces intersect, or when two parallel faces are at *almost* the same distance and you end up seeing the far one instead of the near one. That's unacceptable for professional graphics, but a common result of major performance-boosting compromises.
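      A rough numerical sketch of that depth-precision falloff (Python; the 0.1/1000 near/far planes and the 24-bit buffer depth are assumed values for illustration, not anything from the comment above):

      ```python
      def ndc_depth(z, near=0.1, far=1000.0):
          """Standard perspective mapping of view-space distance z to [0, 1] depth."""
          a = far / (far - near)
          b = -far * near / (far - near)
          return a + b / z

      def smallest_resolvable_gap(z, bits=24, near=0.1, far=1000.0):
          """Approximate world-space separation two parallel surfaces need at
          distance z before a `bits`-deep buffer can tell them apart.
          Anything closer together than this can z-fight."""
          step = 1.0 / (2 ** bits - 1)  # one depth-buffer quantum
          d = ndc_depth(z, near, far)
          gap = 1e-7
          # double the gap until the depth difference exceeds one quantum
          while ndc_depth(z + gap, near, far) - d < step:
              gap *= 2
          return gap

      for z in (1.0, 10.0, 100.0, 500.0):
          print(f"at distance {z:6.1f}: resolvable gap ~ {smallest_resolvable_gap(z):.6f}")
      ```

      Because the perspective mapping packs most of the depth range near the camera, the resolvable gap grows roughly with the square of the distance, which is why the sparkle shows up on distant near-coplanar faces first.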

  • (Score: 2) by gman003 on Saturday March 05 2016, @01:46AM

    by gman003 (4155) on Saturday March 05 2016, @01:46AM (#314017)

    They're the same hardware, but the drivers are different. That's not the issue in this case, though.

    The Mac Pro is the only Mac with GPU options that even come close to the Rift requirements. You can get it with two D300s, two D500s, or two D700s. Those models all seem to be exclusive to the Mac Pro, but they're based on desktop chips.

    The D300 looks like an underclocked Radeon R9 270 (256-bit memory bus, 1280 shader cores). The D500 looks like a heavily cut-down version of the Radeon R9 280 - 1536 shader cores instead of the 1792 in the 280 or the 2048 in the 280X, all three on a 384-bit memory bus. And the D700 looks like a match for the 280X, with maybe some minor clockspeed differences.

    Official system requirements are for a 290 or higher (512-bit memory bus, 2560 shader cores). I'm actually not sure how the dual-card setup goes in this case - VR is supposed to scale well to dual-card, better than most things, so I'd think the D500s ought to scrape by. But maybe it's bottlenecked on ROPs or something - memory access might be a big part of it, and dual-card doesn't help much with that. And I'm not sure what the software stack looks like on the Mac Pro.
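    To see why memory access could be the bottleneck even when shader counts look close, a back-of-the-envelope sketch (Python; the per-pin memory data rates below are assumed round figures, not taken from the comment):

    ```python
    def bandwidth_gbs(bus_bits, gbps_per_pin):
        """Peak memory bandwidth in GB/s from bus width and effective
        per-pin GDDR5 data rate (bits/8 = bytes transferred per cycle)."""
        return bus_bits / 8 * gbps_per_pin

    # 290-class card: 512-bit bus at an assumed 5.0 Gbps per pin
    print(bandwidth_gbs(512, 5.0))   # → 320.0 GB/s

    # D700-class card: 384-bit bus at an assumed ~5.5 Gbps per pin
    print(bandwidth_gbs(384, 5.5))   # → 264.0 GB/s
    ```

    On those assumed figures the narrower 384-bit bus gives up roughly a sixth of the 290's bandwidth, and since each card in a dual-GPU setup renders from its own memory, adding the second card doesn't close that per-frame bandwidth gap.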