Submitted via IRC for Bytram
It's been almost a year now since Oculus announced that the consumer version of the Rift virtual reality headset would only support Windows PCs at launch—a turnaround from development kits that worked fine on Mac and Linux boxes. Now, according to Oculus co-founder Palmer Luckey, it "is up to Apple" to change that state of affairs. Specifically, "if they ever release a good computer, we will do it," he told Shacknews recently.
Basically, Luckey continued, even the highest-end Mac you can buy would not provide an enjoyable experience on the final Rift hardware, which is significantly more powerful than early development kits. "It just boils down to the fact that Apple doesn't prioritize high-end GPUs," he said. "You can buy a $6,000 Mac Pro with the top-of-the-line AMD FirePro D700, and it still doesn't match our recommended specs."
"So if they prioritize higher-end GPUs like they used to for a while back in the day, we'd love to support Mac. But right now, there's just not a single machine out there that supports it," he added. "Even if we can support on the software side, there's just no audience that could run the vast majority of software on it."
Source: http://arstechnica.com/gaming/2016/03/oculus-founder-rift-will-come-to-mac-if-apple-ever-release-a-good-computer/.
See also: Shacknews blog.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @03:05PM
I find it hard to believe that the Rift requires such high-end hardware to run. What is stopping them from giving lower-end hardware a 'degraded' experience instead of locking them out completely?
(Score: 2, Insightful) by Anonymous Coward on Friday March 04 2016, @03:11PM
As much as I dislike this whole idea of VR, I can support their decision not to degrade the experience. Doing so would compromise their product and make it less likely people would actually want it.
Or in other words, they go against this particular behavior: "When talking to customers, we tell them we only sell the best, nothing but the best of the best for the customer. When we purchase our materials, we look for the cheapest possible thing... so 'we can pass that saving on to the customer'..."
(Score: 0) by Anonymous Coward on Friday March 04 2016, @03:19PM
It's more likely that people will go "you mean I have to buy a new, several thousand dollar computer, just to play Rift games? Screw VR!"
I mean, that's what I would say, anyway. But then again, hardcore gamers seem perfectly fine with spending a few grand on a new 'rig' every now and then.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @04:07PM
I just bought a fast i7 w/ 16GB memory and put a good but cheap GTX 750 Ti card in it. It plays every game at full detail settings, but it doesn't pass the VR test. Oh well.
(Score: 0) by Anonymous Coward on Saturday March 05 2016, @12:54AM
Buy a better video card.
Your bottleneck is the cheap-ass card. But you knew that.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @03:21PM
If there's an experience I would _not_ degrade, it would be a virtual reality experience: even with the best realism it can make people nauseous.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @03:27PM
Even with the best realism it can make people nauseous.
Isn't that a susceptibility thing, though? Some people get nauseous with even the best VR, and some don't have that problem at all even with crappy VR? I agree that a degraded experience would make it more likely, but some games (such as the release title Lucky's Tale) take place in a cartoony-style world, not realistic at all. Wouldn't it be acceptable to have fewer polys and keep the high FPS?
(Score: 1, Informative) by Anonymous Coward on Friday March 04 2016, @03:33PM
You need to be streaming upwards of 90fps to your eyeballs if you don't want to be vomiting after an hour.
(Score: 4, Insightful) by ledow on Friday March 04 2016, @04:05PM
Unless you want it to look like something from the '90s VRML days, you need high-res, dual-screen rendering (both screens showing technically different scenes, so twice the rendering), at very high frame rates.
And given that you're not just rendering a 2D desktop, that actually requires quite some hardware behind it. Apple stuff is basically business-class hardware in those terms. It's like saying that Intel HD graphics should be fine for gaming. Sure, if you're playing Minesweeper...
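A back-of-the-envelope sketch of what those frame rates imply (the numbers here are mine, not from the thread): at 90 Hz the renderer gets roughly 11 ms per frame, and that budget has to cover both eye views.

```python
# Frame budget implied by a 90 Hz VR target: the renderer must finish
# both eye views inside one refresh interval.
refresh_hz = 90
frame_budget_ms = 1000 / refresh_hz   # total time available per frame
per_eye_ms = frame_budget_ms / 2      # naive even split between the two eyes

print(round(frame_budget_ms, 1))  # 11.1
print(round(per_eye_ms, 2))       # 5.56
```

Real engines share a lot of work between the two views, so the even split is pessimistic, but it shows why "business-class" GPUs struggle here.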
(Score: 2) by Geotti on Friday March 04 2016, @04:17PM
Iris certainly not, but an R9 M370X should be enough.
(Score: 5, Informative) by ledow on Friday March 04 2016, @04:24PM
You're kidding right?
http://gpuboss.com/gpus/Radeon-R9-M370X-Mac-vs-GeForce-GTX-970M [gpuboss.com]
Apparently my several-year-old laptop GPU outperforms it.
(Score: 2) by Geotti on Friday March 04 2016, @05:02PM
No, I'm not kidding. My point is that it should be enough to power a rift, if my 320M 256MB could power the DK1.
Hardware sucks, but not *that* badly. Let's see if something changes for the better with the upcoming Skylake revision, though I've already been planning to get a hackbook pro for a while now, if just for the additional SATA space.
(Score: 0) by Anonymous Coward on Saturday March 05 2016, @12:57AM
Your point is wrong though.
Try this on for size:
I should be able to get 35mpg in my 95 Explorer. All the new vehicles get that much and they do the same thing.
What would you think? I know you would think that dude is an idiot. What do you think everyone else is thinking about you right now...
(Score: 2) by Geotti on Monday March 07 2016, @11:39AM
I should be able to get 35mpg in my 95 Explorer
I'm using the metric system, jackass.
You're telling me that a 2 GB gfx card is incapable of rendering 2 hi-res image streams at 60-90 fps? Think again:
The recommended spec, "for the full Rift experience [emphasis added]" is a 970GTX or an R9 260 (see https://www.oculus.com/en-us/blog/powering-the-rift/). [oculus.com] The M390X is just ~30% slower than the 260 [gpuboss.com]. This ought to be enough, especially if you turn the details down. Not everyone needs to play with all settings set to ultra. Your comparison lacks a wheel, especially because with some tinkering and careful driving it would probably be possible for your "idiot" to get that mileage, but that's a different story.
(Score: 2, Insightful) by Anonymous Coward on Friday March 04 2016, @04:25PM
> What is stopping them from giving lower-end hardware a 'degraded' experience instead of locking them out completely?
The same reason OS X has a better reputation than Windows. Once you allow that, the Rift will be "that awful thing that gives you headaches and a horrible experience".
(Almost) Nobody cares that people brought this upon themselves by using unsupported, low-end hardware. $2000 Apple and good experience, vs $200 HP and bad experience for many people reduces to "Apple good, else bad".
HP, Dell and Lenovo are actually a very good example of this: Their cheap stuff is awful, their expensive stuff is good, look at their reputation.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @05:29PM
What is stopping them from giving lower-end hardware a 'degraded' experience instead of locking them out completely?
Price! Who's going to pay $1,500 for the "full experience" version (which includes PC & headset) when they can test the waters for half that?
(Score: 2) by takyon on Friday March 04 2016, @11:16PM
I think they want a certain gaming experience to run at the somewhat high resolution and definitively high (and more importantly, stable) framerate. The metric they like to use is pixels per second (width * height * framerate). Insert your resolution and frame rate target to get a comparison. For example, 233,280,000 pixels per second for consumer launched Oculus Rift (2160x1200x90) vs. 221,184,000 pixels per second for 1440p gaming at 60 FPS. This PPS requirement could go way up in 2018-19 when the next Oculus Rift comes out. Bump it up to widescreen 4K at a higher frame rate? 4096x2160x120 = 1,061,683,200 pixels per second, 4.5 times more.
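The pixels-per-second arithmetic above checks out; a quick sketch using the resolutions and rates quoted in the comment:

```python
def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Raw pixel throughput: width * height * framerate."""
    return width * height * fps

rift_cv1 = pixels_per_second(2160, 1200, 90)   # consumer Oculus Rift, both eyes
qhd_60   = pixels_per_second(2560, 1440, 60)   # 1440p gaming at 60 FPS
uhd_120  = pixels_per_second(4096, 2160, 120)  # speculative future target

print(rift_cv1)                      # 233280000
print(qhd_60)                        # 221184000
print(round(uhd_120 / rift_cv1, 2))  # 4.55
```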
I feel that some VR will work just fine on Oculus Rift or other VR headsets with lower powered GPUs. For one, VR video. It's much easier to display pre-rendered video, even if it is in 360°, than it is to render a game in real time, with shadows, ray tracing, blah blah blah. Also, I would assume that simpler demos with lower levels of detail and less heavy GPU work, like kaleidoscopes or other cool stuff without billions of polygons, would run just fine, even at the 90 Hz frame rate.
To get back to your question, they shouldn't lock out anybody from attempting to run something with Oculus Rift and weaker hardware. I have no idea what DRM or other restrictions will be involved, but the device will be hacked very soon after release. I expect it to be seen working with Linux, BSD, 5 year old GPUs, crappy framerate, whatever.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 1, Insightful) by Anonymous Coward on Friday March 04 2016, @03:09PM
Mac could be so much more. I just wish the Woz had more involvement. Jobs and Cook just don't get it and have mangled Apple into a lifestyle brand with no substance.
(Score: 4, Touché) by isostatic on Friday March 04 2016, @03:21PM
Mac could be so much more. I just wish the Woz had more involvement. Jobs and Cook just don't get it and have mangled Apple into a lifestyle brand with no substance.
Yup
http://bgr.com/2015/02/11/apple-vs-google-microsoft-market-cap/ [bgr.com]
Apple is now worth more than Microsoft and Google combined
They just don't get it.
(Score: 5, Insightful) by Nerdfest on Friday March 04 2016, @03:27PM
It amazes me that people think Apple is such a great company because they take such an obscene amount of profit from their customers.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @03:38PM
Welcome to the alternate 1985 from BTTF.
(Score: 4, Insightful) by isostatic on Friday March 04 2016, @03:40PM
No, what it means is Jobs knew what people wanted, and delivered it. To say he "didn't have a clue" is a rather strange statement. Sure, he had no clue about treating cancer, but he did about building good products that people liked to use.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @04:47PM
He knew how to manipulate people into wanting something.
(Score: 5, Insightful) by Tramii on Friday March 04 2016, @06:14PM
He knew how to manipulate people into wanting something.
I hear this a lot and I just don't understand.
If I talk someone into buying something, and they go ahead and buy it and don't regret the purchase later, we don't call that "manipulating". We call that making a good recommendation. It's only if the buyer later regrets their decision that they might claim they were manipulated.
People are not being "manipulated" into buying Apple products. People in general *like* Apple products. They go back and buy them again and again. They are not being manipulated. Just because *you* don't like them doesn't make it manipulation. Apple makes high quality products that generally "just work" and are generally easy to use. They are good at taking complicated technology and simplifying it down to where a grandma can use it. Their products also last longer than most others and retain their value longer.
It's NOT manipulation. It's giving people what they want. If you want something different, that's fine. If you don't like Apple's "walled garden", well, you don't have to buy in. A lot more people do. If you think you can get hardware for cheaper, then go for it. Most people don't have the time or energy to research and then build custom stuff. They want to be handed a near-perfectly working solution that works the moment they hit the power button.
(Score: 4, Insightful) by frojack on Friday March 04 2016, @06:55PM
While I'm no particular fan of Apple, I still think you've hit the nail on the head.
There exists a sizable contingent here on SN that objects to any form of advertising, and accuses any use of advertising of being some form of manipulation.
Apparently you can't do any research and find out what people want, then use that to create a product and advertise that you do indeed supply what people were looking for. You have to somehow set up a retail operation, either in bricks and mortar or on the web, BUT never once mention what you are selling or why.
To do anything else is somehow manipulation.
Probably this is in response to the advertising abuse on the net, triggering the simple-minded knee-jerk reaction to hate all advertising.
No, you are mistaken. I've always had this sig.
(Score: 2) by BasilBrush on Friday March 04 2016, @11:07PM
Another bizarre thing is that a lot of the people who say Apple is all marketing buy Samsung phones. Yet Samsung's marketing budget is many times that of Apple.
Hurrah! Quoting works now!
(Score: 0) by Anonymous Coward on Saturday March 05 2016, @01:01AM
Hmm, Samsung sells more different things than Apple, so it might have to advertise for more different products.
If you really want an apples-to-apples comparison, compare the marketing budgets for the iPhone vs. the Galaxy line of phones.
My WAG is that Apple spends more.
(Score: 2) by Nerdfest on Sunday March 06 2016, @02:11AM
Samsung has to pay for their advertising. Apple doesn't.
(Score: 2) by tangomargarine on Friday March 04 2016, @09:00PM
The best manipulation is the one where you don't realize you're being manipulated.
"Is that really true?" "I just spent the last hour telling you to think for yourself! Didn't you hear anything I said?"
(Score: 2) by Tramii on Saturday March 05 2016, @04:20AM
If there is a company that can manipulate people into buying products in a way that no one realizes they are being manipulated, then I'm just gonna be happy that they limit their evil powers to just making money. Better that than something more serious, like say, using it to take over the world.
(Score: 4, Insightful) by archfeld on Friday March 04 2016, @03:39PM
You sound like one of those women in the wrinkle cream commercials. I just paid $900.00 for a .25 oz jar of made-up fsck'n crud; of course I could admit I got taken, but I won't. Instead I'll say how amazing it is. Why do you think snake oil salesmen get away with their con game so often and for so long?
"Apple is now worth more than Microsoft and Google combined"
"They just don't get it."
So we know Apple hit their goal: to make mass quantities of money. What was your goal? Giving Apple large quantities of money, or getting hardware that will allow you to have an UNDEGRADED VR experience? You make the choice. What's more important, performance or image?
For the NSA : Explosives, guns, assassination, conspiracy, primers, detonators, initiators, main charge, nuclear charge
(Score: 2) by Geotti on Friday March 04 2016, @04:27PM
what was your goal?
To get a superior OS. Sadly, the hardware sucks balls these days on the CPU and GPU side of things; that's one of the reasons why you'd build a hackintosh or get a hackbook (pro).
(Score: 3, Insightful) by isostatic on Friday March 04 2016, @04:40PM
So we know Apple hit their goal, to make mass quantities of money, what was your goal? Giving Apple large quantities of money, or to get hardware that will allow you to have an UNDEGRADED VR experience ? You make the choice. What's more important, performance or image ?
Functionality. What gives me the maximum gain for the minimum effort. On my laptop, that's ubuntu on a thinkpad. On my phone it's an iphone 5.
(Score: 3, Interesting) by isostatic on Friday March 04 2016, @04:42PM
So we know Apple hit their goal, to make mass quantities of money, what was your goal?
To point out that Jobs and Cook did "get it", as evidenced by your admission that Apple made tons of money.
The relative quality and costs of apple kit is neither here nor there.
I have two laptops on my desk at the moment, one cost about £900, the other about £2k. One is a PC running linux, one is a mac. The expensive one is the quality one which I use all the time, the other one I use for specific purposes (it's got a longer battery life for example).
There's a mac mini too, which as it's used for business purposes was exempt from tax and therefore actually cost about £250 5 years ago. I'm sure I could have built a £150 linux PC instead, unlikely to be as nice hardware though.
(Score: 3, Insightful) by ledow on Friday March 04 2016, @03:56PM
Apple got that money - and most of it is just money sitting around not doing very much - from their customers.
By the difference between what their hardware / software actually COSTS to make, and what they charge for it.
I'm sure DeBeers make an awful lot of money. It costs next-to-nothing to get some slave labour to dig around in dangerous mines, most of which you happen to own worldwide. And you can charge a fortune because people think it's somehow "rare". It does not mean that you're "good". It just means you know how to sell a polished turd.
A skill in itself, no doubt, but some people hold different criteria for success than the amount of cash you have in the bank doing nothing.
(Score: 2) by isostatic on Friday March 04 2016, @04:17PM
But it means that DeBeers gets it. They've met their goal.
(Score: 0, Offtopic) by Francis on Friday March 04 2016, @04:19PM
I refuse to buy diamonds because of that. Well, between DeBeers and the conflict diamonds, I refuse to buy. Personally, I'd rather have a manufactured diamond anyway. They come in an array of colors and are usually cheaper than the mined ones.
(Score: 2) by Thexalon on Friday March 04 2016, @05:14PM
Also, the primary reason normal everyday diamonds are something that people think are all that valuable is due to marketing campaigns by, you guessed it, DeBeers. Save yourself some money and buy a different gem if all you want is a pretty rock.
Oh, and if it's for an engagement or something, the right partner will be overjoyed with a simple brass band ring, and the wrong partner will be complaining if you get them the most expensive ring you can find at your jeweler (jewelers of course make a lot of money from people who fail to understand this).
The only thing that stops a bad guy with a compiler is a good guy with a compiler.
(Score: 1) by Francis on Friday March 04 2016, @09:01PM
More likely I'll be buying just a gold ring like this one: https://www.etsy.com/listing/202786130/double-knot-ring-silver-and-rose-or?ref=market [etsy.com]
Not because it's cheap, but because it's something you don't see much and a great conversation starter. I've never understood the attitude that people are willing to spend a ridiculous amount of money on engagement rings and on the wedding/honeymoon. One of the most common reasons for divorce is financial problems, so why would anybody dig themselves that deep over such superficial choices?
(Score: 1) by Spamalope on Friday March 04 2016, @04:59PM
Don't forget murdering anyone producing diamonds who doesn't play ball...
(Score: 2) by tibman on Friday March 04 2016, @06:04PM
If you want to build/buy a gaming box then you can't even approach apple. Same for google, they make nothing usable by the gaming market. Microsoft doesn't make gaming hardware (other than peripherals maybe?) but their OS supports the best gaming gear. Linux is still behind Microsoft but only because most graphics drivers are closed-source and compiled for windows.
SN won't survive on lurkers alone. Write comments.
(Score: 3, Insightful) by isostatic on Friday March 04 2016, @06:10PM
Funny that, my phone has lots of games on.
If you want to build a very specific type of gaming box for a very specific type of user then you go for a very specific hardware and software combination.
However 90% of people couldn't give a stuff about that.
Apple know this, and have decided they don't want to serve that market. That doesn't mean they "don't get it".
(Score: 3, Informative) by tibman on Friday March 04 2016, @06:57PM
Just because you can play games by scratching X's and O's into the sand/dirt with a stick that doesn't mean you have a gaming (sand?) box. That's fine if apple doesn't want to serve it, i have no argument with you there. Just acknowledging they currently have zero offerings for gamers and so apple is a non-starter for those interested in gaming machines. That's what this whole article is about. Though perhaps Luckey should have said Rift will come to Mac if Apple “ever releases a good [gaming] computer”.
(Score: 5, Informative) by Immerman on Friday March 04 2016, @03:20PM
As I recall the FirePro line are extremely high-end GPUs, they just don't offer the raw pixel-pushing performance of a sloppy gaming-oriented videocard, instead prioritizing the extreme accuracy needed for professional 3D modeling and CAD applications. Or has that changed?
(Score: 2) by Geotti on Friday March 04 2016, @04:23PM
No, but that's exactly the point. That's a workstation and not a gaming PC.
I'll continue playing around with the DK1 and maybe upgrade the internal display at some point, but I'm sure as hell not going back to Winblows. Well, we'll see, anyway, how fast open drivers will appear.
(Score: 2, Informative) by mobydisk on Friday March 04 2016, @07:42PM
I keep hearing that same thing, but it doesn't add-up. Not sure about today, but ~5 years ago, the FireGL was actually the same hardware as the gaming GPUs, but with slightly different drivers and about twice the cost.
What "accuracy" are they referring to? The video card is just rendering, right? It's not doing any of the engineering work.
Right now I can see the mechanical engineers a few cubes over from me using SolidWorks on their expensive FireGL cards. I see non-anti-aliased lines and simple solid-filled polygons. Now, if you told me the driver needs to be optimized for the raw number of lines and polys, I might believe you. But if so, you'd think they could at least get something other than badly aliased lines. It looks like the software-only 3D I got on my 386 years ago. Actually worse: the lines aren't even solid; they have holes and dashes in them when they go to high angles. It's really quite poor. What gives?
(Score: 2) by Immerman on Saturday March 05 2016, @03:40AM
As I recall one of the big issues is depth buffer accuracy - that weirdness/sparkling you get when two faces intersect, or when two parallel faces are at *almost* the same distance and you end up seeing the far one instead of the near one. That's unacceptable for professional graphics, but a common result of major performance-boosting compromises.
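The depth-buffer point can be made concrete. A rough sketch (the near/far plane values and the step-size formula for a standard perspective-projection depth buffer are illustrative assumptions, not anything from the thread):

```python
def depth_step(z: float, near: float, far: float, bits: int = 24) -> float:
    """Approximate world-space gap between adjacent representable depths
    at distance z, for a standard perspective depth buffer."""
    steps = 2 ** bits
    return z * z * (far - near) / (far * near * steps)

near, far = 0.1, 10_000.0
close = depth_step(1.0, near, far)        # sub-micrometre precision up close
distant = depth_step(5_000.0, near, far)  # metres of slop far away

# Two faces separated by less than the local step can land in the same
# depth value and "sparkle" (z-fighting).
print(close < 1e-6, distant > 1.0)  # True True
```

The nonuniform precision is why workstation drivers can't get away with the depth shortcuts gaming drivers take.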
(Score: 2) by gman003 on Saturday March 05 2016, @01:46AM
They're the same hardware, but the drivers are different. That's not the issue in this case, though.
The Mac Pro is the only one with options that even come close to the Rift requirements. You can get it with two D300s, two D500s, or two D700s. Those models all seem to be exclusive to the Mac Pro, but they're based on desktop chips.
The D300 looks like an underclocked Radeon R9 270 (256-bit memory bus, 1280 shader cores). The D500 looks like a heavily cut-down version of the Radeon R9 280 - 1536 shader cores instead of the 1792 in the 280 or the 2048 in the 280X, all three on a 384-bit memory bus. And the D700 looks like a match for the 280X, with maybe some minor clockspeed differences.
Official system requirements are for a 290 or higher (512-bit memory bus, 2560 shader cores). I'm actually not sure how the dual-card setup goes in this case - VR is supposed to scale well to dual-card, better than most things, so I'd think the D500s ought to scrape by. But maybe it's bottlenecked on ROPs or something - memory access might be a big part of it, and dual-card doesn't help much with that. And I'm not sure what the software stack looks like on the Mac Pro.
(Score: 2, Interesting) by Anonymous Coward on Friday March 04 2016, @03:43PM
ha, this is funny.
this is like driving TWO monitors. and considering that the GPU manufacturers have been fleecing pockets with tiny performance improvements per new generation of GPU cards (probably calculated and optimized for profit by a GPU-powered ..uhmmm... supercomputer), this new development is going to hurt them $$$.
now ANY Oculus-ready GPU card will run ANY GAME easily on any single plain old ONE monitor hahahaha!
(Score: 0) by Anonymous Coward on Friday March 04 2016, @05:16PM
Which is why I always stay a generation or two behind whenever it's time to upgrade my video card.
(Score: 2) by takyon on Friday March 04 2016, @06:15PM
We're about to get a big performance increase anyway, since both NVIDIA and AMD will skip 20nm, moving from 28nm to 14nm with their Pascal and Polaris GPUs respectively.
(Score: 3, Interesting) by Gravis on Friday March 04 2016, @03:54PM
there is no excuse for dropping Linux support unless, "Linux isn't worth our time" is considered valid. that said, i'm glad i avoided investing in Oculus Rift from the very start.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @04:50PM
Economically speaking that is a very valid reason. But on Linux you're free to make your own.
(Score: 3, Interesting) by opinionated_science on Friday March 04 2016, @06:35PM
I agree totally. I suspect the lack of Linux support was probably an "encouraged" feature, because it stopped diluting the market for Micro$oft.
Is it a surprise they are messing with their Xbox line?
Can't use Oculus anywhere other than WinVidia Win10 PC? Mac too slow?
That's a shame.../s
(Score: 3, Insightful) by RamiK on Friday March 04 2016, @06:51PM
It's not up to them. VR goggles require very low latency or you'll physically feel nauseous. To get that, you'd need kernel-land drivers. However, since all the GPU manufacturers violate each other's patents, they can't release optimized, low-latency, open-source kernel drivers. So they write kernel shims and closed-source user-land drivers with obfuscated binaries. This means high latency, which means no VR for Linux in the foreseeable future.
In a few years time, when GPU hardware becomes 20-30% faster (not in throughput, but in latency) than what's required for minimal VR needs, user-land drivers might be enough.
compiling...
(Score: 1, Insightful) by Anonymous Coward on Saturday March 05 2016, @01:36AM
VR goggles require very low latency or you'll physically feel nauseous.
This is bullshit. The nauseous effect is caused by the disparity between your visual input and that of your inner ear's orientation to the ground. The "Muh Latency" lingo is just sales propaganda to convince you they fixed motion sickness, but it is 100% bullshit unless the machine comes with a graviton emitter.
(Score: 2) by RamiK on Sunday March 06 2016, @12:45PM
No. Read:
https://en.wikipedia.org/wiki/Virtual_reality_sickness [wikipedia.org]
http://oculusrift-blog.com/john-carmacks-message-of-latency/682/ [oculusrift-blog.com]
The research leading up to helmet-mounted displays started out attempting to replace the canopies in helicopters with cameras. It goes back to the 70s and 80s and made similar observations. At the time, they didn't have the compute or quality displays to pull it off so they ended up with HMDs. Now, the money goes into drones.
(Score: 2) by Gravis on Saturday March 05 2016, @08:05PM
It's not up to them. VR goggles require very low latency or you'll physically feel nauseous. In order to have that, you'd need to use the kernel-land. However, since all the GPU manufacturers violate each-others patents, they can't release optimized, low latency, open-source kernel drivers. So, they write kernel shims and closed source user-land drivers with obfuscated binaries. This means high-latency;
wow, that's another load of shit! fun fact, you get better performance from graphics cards on Linux than you do on any version of Windows.
(Score: 2) by jasassin on Sunday March 06 2016, @02:44AM
No. No. Fuck no.
http://www.phoronix.com/scan.php?page=article&item=intel-skylake-windows&num=1 [phoronix.com]
How about this... http://www.phoronix.com/scan.php?page=article&item=mordor-win10-linux&num=1 [phoronix.com]
It's common knowledge that the Windows graphics drivers are better than the Linux drivers (proprietary or open source). Hell, the Linux drivers don't even support OpenGL 4! Not sure why you said something so patently wrong. I love Linux also, but come on man!
jasassin@gmail.com GPG Key ID: 0x663EB663D1E7F223
(Score: 1) by bitstream on Saturday March 05 2016, @05:44AM
Don't be glad you didn't invest in a particular project. Instead, only put your resources into projects that are open source, or where you have the rights to the "IP", so that when (not if) the management goes bad, you can always cooperate with others to reach *your* goals.
(Score: 2, Insightful) by CHK6 on Friday March 04 2016, @03:57PM
I think Luckey has a serious problem on his hands. If a $6K Mac Pro with a $3K AMD FirePro D700 video card cannot power his device to a usable state, then I think Luckey's engineering team is doing it wrong. It's a poor excuse to blame it on Apple. Maybe, just maybe his engineers aren't up to the challenge.
(Score: 4, Insightful) by theluggage on Friday March 04 2016, @05:04PM
If a $6K Mac Pro with a $3K AMD FirePro D700 video card cannot power his device to a usable state
Whether or not you think it's "a good computer", the $6K Mac Pro was never intended to be a gaming machine; it's a video editing/pro graphics workstation. The dual AMD FirePro D700 video cards aren't 'dual' in the sense of the SLI or Crossfire setup you'll find on gaming PCs; they're optimised for workstation use, not games, and they're there as much for GPU-based computation (with OpenCL) as for actual graphics. The money goes on things like Xeon CPUs, ECC RAM, loadsa multi-threading and high-tech cooling for reliability on long render jobs rather than raw grunt.
Plug a FirePro or NVidia Quadro workstation card into your PC and, whatever its gaming performance, a 'consumer' gaming-oriented GPU will smoke it in terms of bang-per-buck.
"Macs aren't the best computers for serious gaming" is hardly a revelation. It's never been a priority. Apple has a gaming platform: it's called the iPhone.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @06:14PM
"Apple has a gaming platform: its called the iPhone."
Well played! You owe me a bottle of Windex to get the coffee I just spat on my monitor cleaned up.
(Score: 3, Interesting) by damnbunni on Friday March 04 2016, @05:09PM
The D700 simply isn't a good card for gaming. That's not what it's meant for; its strengths lie elsewhere.
It doesn't help that OSX handles 'oh, I have two GPUs!' by dedicating one for compute and one for display. It's possible to override that, but it has to be done on a per-game basis, in the game engine.
(Score: 0) by Anonymous Coward on Friday March 04 2016, @05:41PM
Do you know the difference between the following apparently similar looking words?
*priceless
*worthless
There's a bridge I'd like to sell ya...
(Score: 2) by GreatAuntAnesthesia on Friday March 04 2016, @04:49PM
I remember when VR was the next big thing. Nothing came of it. However I do believe it's worth another shot, now that the hardware is so much better.
The real question is, will the convergence of a new interest in VR and the Hollywood trend of remaking old films lead to a remake of the Lawnmower Man?
And would that be a good thing or a bad thing? Or flip it on its head: how about a VR game that immerses the player in the psychedelic VR environments of the original film?
(Score: 2) by julian on Friday March 04 2016, @05:12PM
It's not really a hardware problem. About 10% of the population are biologically incapable of enjoying 3D movies/VR, and for the same reason: it causes eye strain and nausea to focus your eyes in the way required when two separate screens that close to your face show slightly different images. I'm in that group, unfortunately. Until we have real VR, Matrix-style brain plugs, I'm not going to be able to use this stuff.
I'm not terribly disappointed, tbh. It seems gimmicky and stupid.
(Score: 2) by ilPapa on Friday March 04 2016, @05:40PM
I can enjoy it, but I need to hold a vomit bucket between my knees. VR also gives me crushing migraine headaches, but for that, there's always heroin.
Of course, my experience is with the DK1, and I understand they've improved quite a bit. So, we'll see.
You are still welcome on my lawn.
(Score: 2) by Post-Nihilist on Saturday March 05 2016, @01:19AM
psilocybin is better against migraine than heroin.
psilocybin is also better for VR than heroin, you do not need the headset!!!
Be like us, be different, be a nihilist!!!
(Score: 3, Informative) by takyon on Friday March 04 2016, @06:25PM
I don't think that's necessarily true given an increase in frame rates and resolution.
Consumer Oculus Rift is 1080×1200 per eye @ 90 Hz.
Developer Kit 2 was 960×1080 per eye @ 75 Hz.
Developer Kit 1 was 640×800 per eye @ 60 Hz.
Most smartphones inserted into cardboard are probably limited at 60 Hz.
AMD and others have talked about an initial target of 120 Hz frame rate, with AMD specifying 240 Hz in the far future. 90 Hz is described as the minimum rate necessary to avoid eye strain. They also mentioned "16K per eye" resolution [soylentnews.org]. They want to keep selling GPUs, so it's no surprise they want the requirements to skyrocket.
Let the early adopters describe whether they are killing their eyes at these higher resolutions and framerates. Wait 5 years and the hardware will be much better and there will be much more content available, and you can just borrow a friend's headset to test it for yourself.
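Using the same width × height × framerate metric from elsewhere in the thread, the generations listed above (per-eye figures doubled for the two eyes) work out to:

```python
# Total pixel throughput per headset generation: two eyes, each with
# its own per-eye resolution, at the headset's refresh rate.
headsets = {
    "DK1": (640, 800, 60),
    "DK2": (960, 1080, 75),
    "CV1": (1080, 1200, 90),
}

for name, (w, h, fps) in headsets.items():
    total = 2 * w * h * fps
    print(f"{name}: {total:,} pixels/s")
# DK1: 61,440,000; DK2: 155,520,000; CV1: 233,280,000
```

So the consumer Rift pushes roughly 3.8x the pixels of the DK1, before any of the higher-resolution future targets mentioned above.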
(Score: 2) by Username on Friday March 04 2016, @10:30PM
I do not think you gentlemen understand. Oculus has moved far beyond Apple’s capabilities. Soon no hardware will be able to contain Oculus.
(Score: 1) by bitstream on Saturday March 05 2016, @06:01AM
Apple decides what hardware they offer, so there might actually be a reason to drop support. However, machines that run Microsoft Windows can also run most free, open source Unixes. And graphics makers ought to be interested in getting into this market, so they can fix drivers for their own hardware. Thus hardware is not a reason to drop support for the more popular free Unix variants. One can suspect the reason is a perverse incentive regarding the software... After all, there's no need to fix drivers if the source is available. The free developer kits seem just like "the first one is free", and once tester feedback has been collected those users could be unmercifully dropped.
If you don't have the source. Someone else has you!
If you take a look at the situation, however, one could reverse-engineer existing hardware like the Oculus Rift. Another possibility is to make an open source virtual reality headset hardware design, with accompanying source code. The hard part of the headset is in essence two synchronized screens, for which standardized hardware already exists.