
posted by martyb on Monday March 30 2015, @03:57AM   Printer-friendly
from the chips-ahoy! dept.

The rumor spreading around the net is that Samsung is in talks to buy AMD so that they can be in a better position to compete with Intel and Qualcomm. The question, of course, is what happens to AMD's x86 line, but whatever the outcome, the rumor already seems to be helping AMD's stock price in the short term. Could this be a good thing, or the beginning of the end for team red?

Also covered at: Tom's Hardware, The Register, and XDA Developers.

  • (Score: 4, Informative) by takyon on Monday March 30 2015, @04:07AM

    by takyon (881) <reversethis-{gro ... s} {ta} {noykat}> on Monday March 30 2015, @04:07AM (#164058) Journal
    --
    [SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
    • (Score: 3, Interesting) by aristarchus on Monday March 30 2015, @05:05AM

      by aristarchus (2645) on Monday March 30 2015, @05:05AM (#164072) Journal

      I want somebody to buy AMD, and kick them into high gear. Playing second fiddle for half a century is not a mark of greatness. Time to blow them out of the water, cut ties and produce new processor technology? Maybe? NO? OK. Zambonis, on ICE.

      • (Score: 0) by Anonymous Coward on Monday March 30 2015, @05:13AM

        by Anonymous Coward on Monday March 30 2015, @05:13AM (#164074)

        You can produce new processor technology, but nobody will use it, unless it's x86 compatible. Proven by AMD.

        • (Score: 1, Insightful) by Anonymous Coward on Monday March 30 2015, @06:07AM

          by Anonymous Coward on Monday March 30 2015, @06:07AM (#164083)
          Disproven by ARM.

          The new technology has to be actually superior for what it's supposed to do, otherwise few will want it. As proven by Intel with their Itanic and, similarly, the i860 (in comparison the i960 was fairly successful)... ;)

          And proven by AMD with their AMD64 (they gained a lot of share when they introduced it - but then they sat on their laurels).
          • (Score: 1, Interesting) by Anonymous Coward on Monday March 30 2015, @06:27AM

            by Anonymous Coward on Monday March 30 2015, @06:27AM (#164086)

            ARM is for phones, toys, and fondleslabs. You don't see Apple tossing x86 to the Trash and switching OSX to ARM exclusively, and Apple has a long history of switching to the new hotness. ARM, not even AArch64, is nowhere near superior enough to replace x86.

            • (Score: 3, Interesting) by Gravis on Monday March 30 2015, @07:16AM

              by Gravis (4596) on Monday March 30 2015, @07:16AM (#164096)

              ARM, not even AArch64, is nowhere near superior enough to replace x86.

              funny you say that because AMD is soon (2016) to start eating the server market with their ARM K12 chip. https://en.wikipedia.org/wiki/AMD_K12 [wikipedia.org]

              i will happily replace my x86 system with an ARMv8-A chip.

              • (Score: -1, Troll) by Anonymous Coward on Monday March 30 2015, @08:01AM

                by Anonymous Coward on Monday March 30 2015, @08:01AM (#164101)

                By "server" do you mean a low-end web server that runs a blog, and which spends most of its time idling?
                By "my x86 system" do you mean a low-end laptop that you use to type comments on message boards, and which spends most of its time idling?

                Please provide benchmarks clearly showing the AMD K12 outperforming a similarly configured Intel Xeon. It'll need to be a whole lot better than this embarrassment for ARM:

                http://www.cnx-software.com/2014/10/26/applied-micro-x-gene-64-bit-arm-vs-intel-xeon-64-bit-x86-performance-and-power-usage/ [cnx-software.com]

                • (Score: 2, Informative) by Anonymous Coward on Monday March 30 2015, @09:17AM

                  by Anonymous Coward on Monday March 30 2015, @09:17AM (#164133)

                  From the web page you linked:

                  Applied X-Gene 1 (40nm process) was used instead of X-Gene 2 built on 28-nm process which was not available at the time.

                  You don't think that would make a difference?

                  Anyway, server processes tend to scale well (many independent requests), and therefore for servers the absolute performance of a processor is less important than the performance per watt: if the processor is half as fast but has higher performance per watt, you can just put two of them in your server room and still be better off.
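                  (A quick back-of-the-envelope sketch of that trade-off in Python, with made-up throughput and wattage figures purely for illustration:)

                  xeon = {"requests_per_sec": 1000, "watts": 100}   # hypothetical fast, power-hungry chip
                  arm  = {"requests_per_sec": 500,  "watts": 40}    # hypothetical half-speed, efficient chip

                  target_rps = 10_000  # total request load to serve

                  def sockets_needed(chip, rps):
                      # Independent requests spread evenly, so sockets scale linearly (ceiling division).
                      return -(-rps // chip["requests_per_sec"])

                  for name, chip in (("xeon", xeon), ("arm", arm)):
                      n = sockets_needed(chip, target_rps)
                      print(f"{name}: {n} sockets, {n * chip['watts']} W total")
                  # xeon: 10 sockets, 1000 W total
                  # arm: 20 sockets, 800 W total -- half the per-socket speed, lower total power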

                  Again from the page you linked:

                  However, when it comes to performance-per-watt, APM X-Gene 1 is clearly ahead of Intel Xeon E5-2650 and there’s no comparison against Xeon Phi systems.

                  • (Score: -1, Redundant) by Anonymous Coward on Monday March 30 2015, @09:51AM

                    by Anonymous Coward on Monday March 30 2015, @09:51AM (#164142)

                    If the processor is half as fast but has a higher performance per watt, you can just put two of them in your server room and still be better off.

                    Uh, no. If you're doing any kind of lengthy serial computation, it will simply run at half the speed. If you're doing any kind of computation that can't be run in parallel between two servers, you just wasted a server.

                    server processes tend to scale well (many independent requests),

                    Sure, requests for your blog might scale well. Can you comprehend that not every server is a blog server? Of course you can't.

            • (Score: 3, Insightful) by wantkitteh on Monday March 30 2015, @02:35PM

              by wantkitteh (3362) on Monday March 30 2015, @02:35PM (#164254) Homepage Journal

              You also don't see phone/tablet manufacturers rushing to x86 parts in significant numbers. There may be the odd Atom-powered tablet or phone here or there, I even own one, but face it - x86 may have won on the desktop, but it lost in your pocket.

      • (Score: 2) by SlimmPickens on Monday March 30 2015, @05:19AM

        by SlimmPickens (1056) on Monday March 30 2015, @05:19AM (#164078)

        Time to blow them out of the water, cut ties and produce new processor technology?

        Atanium? AMD128? ArMD?

        • (Score: 0) by Anonymous Coward on Monday March 30 2015, @05:28AM

          by Anonymous Coward on Monday March 30 2015, @05:28AM (#164079)

          POWERAMD

      • (Score: 3, Insightful) by Hairyfeet on Monday March 30 2015, @08:30AM

        by Hairyfeet (75) <{bassbeast1968} {at} {gmail.com}> on Monday March 30 2015, @08:30AM (#164113) Journal

        You'd be playing "second fiddle" too if the other guy was paying off the ref [youtube.com] and, as the link shows, doesn't even give a single fuck if anybody knows they are rigging the game. If you try placing AMD chips against Intel in actual PROGRAMS, like you'd really use your PC? You'll find the sub-$150 AMD trading blows with Intel chips costing as much as triple the price [youtube.com] and in many cases actually BEATING Intel in several tests. Which, just FYI, is what I've been saying for years, as I get to put most Intel and AMD chips through their paces at the shop, and without benchmark rigging? They are neck and neck until you get into the $650+ CPUs.

        BTW if anybody wants to know why both links are Tek Syndicate? Because surprise surprise they are one of the ONLY sites that don't take $$$$ from either chip maker. Compare this to Tom's, where their "best gaming CPUs for the money" even admits that many games REQUIRE a quad core to even run...and then recommends an Intel Pentium dual core for a gaming PC LOL. That is what happens when the ones in charge of reviews are for sale: they will go against their own advice to shill chips they are getting paid to shill.

        Oh and for the record? I'm impressed enough by AMD performance that my entire family is on AMD, from dad's Phenom I X4 to the youngest boy's FX8300. Hell, the only thing I'd argue hurts AMD is that with so many cores and chips so well built, they last for ages; I play the latest games on my Phenom II X6, as does the oldest boy, and the wife's Phenom II X4 has actually gone through 4 systems so far, from mine to the oldest to the youngest to the new wife, and it still purrs like a kitten no matter what she throws at it. Take my word for it, you want a bad ass PC that will take anything you throw at it and not break a sweat? Any of the FX X6s and X8s are just monsters, and for your ULV HTPC builds there is nothing that comes close to the AM1s, nothing.

        --
        ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
        • (Score: 3, Troll) by engblom on Monday March 30 2015, @09:03AM

          by engblom (556) on Monday March 30 2015, @09:03AM (#164127)

          I fully agree with your statement about performance relative to what the processor cost to begin with. However, you do not take one thing into account: power efficiency.

          AMD processors are not as power hungry as they were in the past, but still today, if you want both good CPU performance and silence, you need Intel. Intel i3 - i7 surely are more expensive, but they are better when it comes to performance/watt. The fans will not wear out as fast and the laptop will not fill up with dust as quickly.

          Working for an IT shop, I have seen a great many AMD laptops broken because of overheating. The Intel ones just work most of the time.

          • (Score: -1, Redundant) by Anonymous Coward on Monday March 30 2015, @09:13AM

            by Anonymous Coward on Monday March 30 2015, @09:13AM (#164131)

            power efficiency

            Our competitors' processors are too fast. We just can't keep up. How can we use social engineering to fix this problem and still stay profitable? Let's convince trendy people that speed doesn't matter! We'll get them obsessed with power efficiency instead. Yeah, that's the ticket. Soon the trendiest assholes will have the most power efficient processors available.... and we will sell them those processors!

          • (Score: 3, Informative) by Hairyfeet on Monday March 30 2015, @11:46AM

            by Hairyfeet (75) <{bassbeast1968} {at} {gmail.com}> on Monday March 30 2015, @11:46AM (#164181) Journal

            Sorry, but unless you are running a server farm it'll take 18 years [youtube.com] for that Intel to actually save you a single penny because of the price premium you pay. It would be like saying "If I sell my $6k SUV and buy a $100K Tesla I will come out ahead because of the money I save on gas!" when in reality the car will have long since worn out and been replaced before you even saved a single dollar.
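            (To make that payback argument concrete, here's a rough break-even calculation in Python; the price premium, wattage delta, usage hours and electricity rate are assumptions for illustration, not measured figures:)

            price_premium_usd = 150.0   # assumed extra cost of the more efficient chip
            watts_saved       = 30.0    # assumed average power difference under typical load
            hours_per_day     = 8.0     # desktop usage, not a 24/7 server
            usd_per_kwh       = 0.12    # typical residential electricity rate

            kwh_saved_per_year  = watts_saved / 1000.0 * hours_per_day * 365
            usd_saved_per_year  = kwh_saved_per_year * usd_per_kwh
            years_to_break_even = price_premium_usd / usd_saved_per_year

            print(f"Savings: ~${usd_saved_per_year:.2f}/year, break-even after ~{years_to_break_even:.1f} years")
            # Savings: ~$10.51/year, break-even after ~14.3 years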

            Oh and if all you care about is power consumption? You can have the APUs (the same basic design as in the PS4 and XB-One) for $53 for the quad [newegg.com]; it only uses 25w, and you can play new games like Battlefield 4 [youtube.com] and Titanfall [youtube.com] on it. This is why I use these for HTPC builds; they'll do 1080P without even breaking a sweat.

            So look at the numbers, the REAL numbers that Tek Syndicate and JayTwoCents have posted, and you'll see what I have been saying for years is true: unless you are willing to spend at least $500 on the CPU, the AMD will either trade wins or be so close as to be within the margin of error. On a final note, remember that AMD uses actual measurements taken while running automated tests of programs and games to come up with their max TDP ratings, while Intel uses a theoretical load. Since most AMD chips are OC friendly, this causes AMD to list a higher number than what you see in the field; for example my Phenom II X6 is listed as a 95w CPU, but running at standard speeds I have yet to hit 60w. With most day to day tasks like web surfing or watching vids my motherboard is reading between 12w and 25w and it stays VERY cool; the max temp I ever got was 138F in a shitty mATX case I had to use while waiting for my case to get in from backorder, and once I got my ATX (which is certainly not top o' the line [newegg.com], in fact it was just $89 on sale) I have yet to hit 115F under full load, and that is with the fans turned all the way down.

            --
            ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
            • (Score: 2, Informative) by engblom on Monday March 30 2015, @12:11PM

              by engblom (556) on Monday March 30 2015, @12:11PM (#164191)

              Apparently you did not read my posting. I never said you get the money back in a lower electricity bill.

              I was talking about fans running at higher speed in laptops, sucking them full of dust and dying from overheating. I have even had several cases where the heat caused components to come off the main board. I have lost count of how many AMD laptops were destroyed because of heat. During the same time span I have not seen even one Intel laptop die because of overheating. The IT shop I work for sells mostly HP laptops, so it has been the same manufacturer for both the AMD and Intel laptops.

              It has got to the point that if a customer asks me for advice on buying a laptop, I tell them to get an Intel i3 .. i7. That means less trouble for both them and me. However, when it comes to desktops, it does not matter much if they get Intel or AMD.

              And to clarify one thing: I am not a big fan of Intel. During my life I have bought more AMD computers than Intel. But still I cannot deny that Intel has the lead at this point, and I cannot recommend anything else for laptops.

              • (Score: 3, Informative) by Hairyfeet on Monday March 30 2015, @03:39PM

                by Hairyfeet (75) <{bassbeast1968} {at} {gmail.com}> on Monday March 30 2015, @03:39PM (#164308) Journal

                Uhhh, the laptops run a max of 15w-25w, as do several Intel laptops...so what EXACTLY is your point? The first hit for AMD laptops is this Lenovo [tigerdirect.com]; look up its CPU and...15w, and that INCLUDES a Radeon R5 GPU [cpu-world.com]. So please show me an Intel APU that has a GPU which BEATS an R5 Radeon AND gets less than 15w. And they have the E series that go sub-10w for dual cores, so I really don't see what you are getting at.

                BTW since you seem to be not up on current laptop designs, allow me to enlighten you as to a little fact: modern designs use heatpipes, so the fans have a MUCH shorter distance to move air than they did in the old days. So unless you simply want something passive (which just FYI they have several models of, all of which beat the Atoms that Intel has been pushing in the fanless sector) or are literally playing in the dirt with your laptop? I seriously doubt you are gonna plug up the fans with dust. AAMOF the last laptop I saw that had fan issues was a new Pentium dual; I opened it up and found a wad of long white cat fur sucked up into it. Turned out the woman let Fluffy lie against the back of the screen, so the fan was sucking up every hair the cat shed.

                I have AMD laptops running in the field in construction trailers, in warehouses, got one in a lumber yard, some seriously nasty places, and the most I've had to do is spritz them once or twice a year for a whole 4 seconds with a can of air. And again, I can cover this page with Intel laptop CPUs of every price range that have the same TDP of 15w-25w, from low-end Celerons to high-end i7s...so what EXACTLY is your point?

                --
                ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
            • (Score: 2) by wantkitteh on Monday March 30 2015, @04:03PM

              by wantkitteh (3362) on Monday March 30 2015, @04:03PM (#164323) Homepage Journal

              Who said anything about server farms? The poster you're replying to was talking about laptops, and performance/watt definitely has a huge effect there - battery life. The APUs built into the PS4 and XB1 may just about be able to play those games, but *really* badly. In Dying Light, the draw distance on consoles is set lower than the lowest possible setting in the PC version. Your assertion regarding HTPC performance means nothing - a single-core Atom Z520 with a GMA 500 also does 1080p without breaking a sweat. [youtube.com] And WTF is all the "Intel uses a theoretical load" stuff about? It doesn't mean anything in the context you're using it. Don't take this the wrong way, but you're typing like a foaming fanboi.

              For anyone who can't be bothered to watch that crappy video (seriously, what's wrong with showing all the results at once for easier comparison?), here are the results mentioned. NB: This video is from Jan 2013 and the Intel CPUs mentioned are all out of production in favour of newer models.

              Crysis 2 - 1080p: 8350=29.84fps, 3570k=39.52fps, 3770k=39.52fps, 3820=39.64fps
              Crysis 2 - 1440p: 8350=20.96fps, 3570k=22.76fps, 3770k=22.76fps, 3820=23.76fps

              Crysis Warhead - 1080p: 8350=35.64fps, 3570k=26.84fps, 3770k=38.44fps, 3820=26.84fps

              Black Mesa Source - 1080p: 8350=262.60fps, 3570k=196.32fps, 3770k=197.44fps, 3820=196.32fps
              Black Mesa Source - 1440p: 8350=188.80fps, 3570k=121.12fps, 3770k=111.92fps, 3820=no score

              Metro 2033 - 1080p: 8350=36.44fps, 3570k=21.20fps, 3770k=27.48fps, 3820=21.32fps
              Metro 2033 - 1440p: 8350=20.44fps, 3570k=12.80fps, 3770k=12.96fps, 3820=no score

              Trine 2 - 1080p: 8350=58.00fps, 3570k=38.80fps, 3770k=47.28fps, 3820=31.96fps
              Trine 2 - 1440p: 8350=38.64fps, 3570k=23.60fps, 3770k=27.84fps, 3820=no score

              Result: AMD 8350 outperforms all Intel processors tested in 3 out of 5 tests.
              Notes: 3820 results at 1440p missing in many cases. Xsplit results ignored because Shadowplay and VCE.
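              (For anyone who wants to check the tally, here's the 1080p data above run through a quick Python script; the figures are copied straight from the results listed:)

              results_1080p = {
                  "Crysis 2":          {"8350": 29.84,  "3570k": 39.52,  "3770k": 39.52,  "3820": 39.64},
                  "Crysis Warhead":    {"8350": 35.64,  "3570k": 26.84,  "3770k": 38.44,  "3820": 26.84},
                  "Black Mesa Source": {"8350": 262.60, "3570k": 196.32, "3770k": 197.44, "3820": 196.32},
                  "Metro 2033":        {"8350": 36.44,  "3570k": 21.20,  "3770k": 27.48,  "3820": 21.32},
                  "Trine 2":           {"8350": 58.00,  "3570k": 38.80,  "3770k": 47.28,  "3820": 31.96},
              }

              amd_wins = 0
              for game, fps in results_1080p.items():
                  winner = max(fps, key=fps.get)   # CPU with the highest framerate in this test
                  amd_wins += (winner == "8350")
                  print(f"{game}: {winner} leads at {fps[winner]} fps")

              print(f"8350 tops {amd_wins} of {len(results_1080p)} tests")   # -> 3 of 5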

              5 gaming tests from a single source are nowhere near enough to draw a conclusion. Let's try some more:

              8350 loses against Intel 3xxx series CPUs in Skyrim and Shogun 2: Total War tests [bit-tech.net]
              8350 against Intel 3xxx series CPUs, loses badly in Far Cry 2, starts equal and falls away as resolution rises in Crysis 2 [guru3d.com]
              8350 against Intel 3770k, manages parity in x264 encoding test, loses in Cinebench R15 and R11.5, Batman Arkham City and Shogun 2 [techradar.com]
              8350 & other similar AMD chips do very badly against Intel 3xxx series CPUs in Skyrim, Diablo 3, Dragon Age Origins, Dawn of War II, World of Warcraft and Starcraft 2 [anandtech.com]

              NB: Not specifically selected to find benchmarks in which the 8350 loses, I just grabbed some at random after Googling "8350 gaming benchmarks".

              Conclusion: Intel chips are worth paying extra for if you want the additional performance they generally provide.
              Advice: There's plenty of information out there on how any piece of hardware will perform; check it out before you buy it. That is all.

        • (Score: 3, Informative) by sudo rm -rf on Monday March 30 2015, @09:47AM

          by sudo rm -rf (2357) on Monday March 30 2015, @09:47AM (#164140) Journal

          I had an AMD Athlon X2 running for years without problems. The only reason I eventually replaced it 1.5 years ago was that it couldn't run X-Rebirth.

          Anyway, I went with an FX-8350, which was immediately burnt by the new mainboard (ASUS, if I remember correctly) because of outdated firmware. Because the AM3+ socket is backward compatible with AM3, I plugged in my good old Athlon to check the BIOS version etc., only to find it frying too. To cut a long story short: dear fellow soylentils, check your BIOS revision before installing relatively freshly available CPUs.

        • (Score: 3, Interesting) by wantkitteh on Monday March 30 2015, @11:39AM

          by wantkitteh (3362) on Monday March 30 2015, @11:39AM (#164177) Homepage Journal

          A friend actually showed me that video yesterday. Couple of major issues with it:

          1) Very limited selection of games presented. AMD CPUs only beat Intel CPUs in a very short list of games, although that'll get longer now the XB1 and PS4 have 8-core CPUs.
          2) The Xsplit tests don't state whether they're using CPU or GPU encoding - with GPU encoders being very common today, if they were running CPU encoding, the tests are pointless.

          I'm not against AMD at all - I have a Phenom II X6 myself, but I can't deny that single-thread bottlenecked applications, like quite a few video games, run better on my friend's Core i3, even though he has a less powerful graphics card and uses higher detail settings. You want to bring your Phenom II to its knees? Install World of Tanks and run it in improved graphics mode - an i3 will easily manage 100fps+ almost all the time, while a Phenom II will be lucky to hit 60fps. Sad but true. I wouldn't want to run Handbrake on an i3 over my Phenom, but that perfectly illustrates the issue - AMD bet on applications becoming more multithreaded and that's taking much longer to translate into reality than they hoped, so YMMV is extremely true in the AMD vs Intel battle. Your word means nothing - our application usage profiles are likely to be very different.

          I've been going over all the benchmarks I can find from every source I can get my hands on, and they all show that the G3258 marginally beats the 860K in 9 out of 10 game benchmarks. This is why Toms Hardware recommend the Pentium Dual Core over the Athlon 860K in their budget build. Oh yeah, that and the Kabini-based Athlons can't hold a candle to the old Piledriver ones performance-wise, despite vastly improved TDP and power consumption.

          Bottom line - AMD do not sell a desktop processor that can consistently match the performance of the i7-4770. In specific use cases and applications, sure - you'll get better value for money, better performance/$£, but not better raw performance figures across a wide enough range of tests to make it "better".

        • (Score: 3, Interesting) by Jaruzel on Monday March 30 2015, @11:40AM

          by Jaruzel (812) on Monday March 30 2015, @11:40AM (#164179) Homepage Journal

          I WANT to agree with you, I really do. I can't though.

          Mid last year I built a new PC to replace my main desktop in the house. Being a little bit strapped for cash, and not really gaming at the time, I opted for an AMD A10-7850K - all the online benchmarks said it was a nice fast chip for desktop use plus light gaming.

          It's shit. Under Windows 8.1* it regularly maxes out all its cores on tasks that in my opinion shouldn't bring it to its knees, e.g. I run Handbrake to encode a DVD to MKV = all 4 cores running at 100%, a stuttering mouse, and trying to do anything else takes an age. It seems to me that the Kaveri range of APUs are very good at doing ONE thing at a time. If you want them to multi-task properly, then forget it.

          Before anyone asks, yes latest drivers for chipset, yes fully patched windows, yes decent motherboard (MSI), yes fast HDD.

          So... next time, it's gonna be Intel all the way. I should have never strayed.

          -Jar

          * No, I'm _not_ installing Linux - all you *nix fanbois can save yourselves a bit of time by not telling me I should.

          --
          This is my opinion, there are many others, but this one is mine.
          • (Score: 3, Interesting) by Hairyfeet on Monday March 30 2015, @11:55AM

            by Hairyfeet (75) <{bassbeast1968} {at} {gmail.com}> on Monday March 30 2015, @11:55AM (#164187) Journal

            I can fix your problem....install Win 7 Pirate Edition and use that until Win 10 is out. The reason why is simple...Win 8 is shit, it really is. The issue is that Metro shit running in the background (even if you are using the desktop, try switching to Metro and you'll see it pop up instantly; that's because it's running 24/7 in the background) along with all the live tiles and crapstore garbage that is constantly running. It's not an AMD thing; I've had i3s and i5s brought into the shop, and the performance once Win Mist8ke is blown away? It's like night and day.

            At the end of the day, running Win 8? It's like running 3 OSes at the same time: 1. Desktop, 2. Metro, 3. Appstore and live tile crap. I don't care what CPU you run, that much extra crap is gonna seriously strain the system, which is why all the gamers and tests you see on the net are running Win 7. I'm running Win 10 on a teeny tiny E350 netbook at the shop, and while they had a bad build a month or so back (I'm running fast track), the latest builds? As fast as Win 7, as long as you kill Cortana and Live Tiles.

            So give it a try, I bet you'll be impressed. Hell JayTwoCents has a vid on YouTube running BF4 on your chip with no discrete and it plays great and stays above 30FPS....on Win 7 of course.

            --
            ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
          • (Score: 3, Informative) by wantkitteh on Monday March 30 2015, @12:11PM

            by wantkitteh (3362) on Monday March 30 2015, @12:11PM (#164190) Homepage Journal

            Firstly, video encoding should max all cores on your CPU - it's a very intensive task and Handbrake uses x264, a nicely multithreaded encoder that will use all the cores it can get its hands on to get the job done ASAP if the job it's been given allows it to. The stuttering mouse... well, you shouldn't expect anything to perform properly while doing a video encode job, but it shouldn't be that bad unless something's badly wrong somewhere. Two ideas:

            1) What TDP do you have your A10 configured to? You can set it to 45W, 65W or 95W in the BIOS and it'll throttle the operating clock speed automatically depending on load and heat generation on the fly - and nothing will load the CPU as heavily as x264. Make sure it's at 95W (and your HSF is up to the job).
            2) How much RAM do you have and how fast is it? x264 isn't hugely memory hungry, but it will eat bandwidth and cause contention and swapping if you don't have any free and try to do something else concurrently.
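            (Side note: if you want to confirm the encode really is loading every core the way x264 should, here's a quick per-core check in Python, assuming the third-party psutil package is installed; run it in another window while Handbrake is encoding:)

            import psutil

            for _ in range(10):                                   # sample for ~10 seconds
                per_core = psutil.cpu_percent(interval=1.0, percpu=True)
                print(" ".join(f"core{i}: {pct:5.1f}%" for i, pct in enumerate(per_core)))
                # With x264 given free rein, every core should sit near 100%.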

            Video encoding is actually one of the few use cases where AMD processors consistently beat Intel ones on price/performance. My Phenom II X6 1045t still eats my mate's Core i3 4330 for breakfast when it comes to video encoding. Sucks for gaming, but hey, you can't have everything. My next build is a budget mITX Steam Machine; I considered an Athlon X4 860K (basically your APU without the graphics) but I'm going with a Pentium G3258 with a 7850 GFX card instead for its marginally better gaming performance at stock speeds - its lower TDP and power consumption make it a better candidate for overclocking than the Athlon in a tiny mITX case.

            • (Score: 1, Redundant) by Hairyfeet on Monday March 30 2015, @03:55PM

              by Hairyfeet (75) <{bassbeast1968} {at} {gmail.com}> on Monday March 30 2015, @03:55PM (#164316) Journal

              Do NOT buy the Pentium, it's a dual core and most of the new games coming down the pipe want a quad core minimum. As for single-core performance? That isn't gonna matter in less than 6 months, as Win 10 ships with DirectX 12, which will spread the load over as many cores as it can, up to 8 cores if you have 'em.

              If you wanna build a mini on a budget, get the FX6300 [amazon.com], as it's in the same ballpark as the Pentium, will curbstomp it when it comes to multitasking (which you are gonna want if you do more than just game), and most of the new boards let you underclock as well as overclock, so you can have it run as low as you wish when you aren't gaming. If you prefer using an APU there is the A8-6600K [amazon.com], which, as you can see, runs games like Watchdogs fine [youtube.com], and if something THAT unoptimized for PC runs decently? Most games will. Oh and a plus for the APUs is if you pair one with an AMD GPU? You can take advantage of AMD Zerotech, which will shut off the discrete card and just use the APU for tasks like watching videos; that way you only fire up the discrete when you are gaming and the rest of the time it idles, thus lowering power and heat.

              --
              ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
              • (Score: 2) by wantkitteh on Monday March 30 2015, @04:18PM

                by wantkitteh (3362) on Monday March 30 2015, @04:18PM (#164330) Homepage Journal

                Which part of "I'm building an mITX Steam Machine" did you not understand? There are no mITX AM3+ motherboards. The TDP is too high. I have an AMD 7850 available, I don't want an APU. Your recommendation is verging on troll-ish considering you've ignored my stated requirements. I'm basing my selection on several different recommendations, including this one [youtube.com] that shows better gaming performance from the G3258 than from the Athlon X4 860K (which would need a more expensive CPU, HSF and motherboard in the build).

              • (Score: 1) by wantkitteh on Monday March 30 2015, @06:24PM

                by wantkitteh (3362) on Monday March 30 2015, @06:24PM (#164405) Homepage Journal
                • (Score: 2) by Hairyfeet on Monday March 30 2015, @07:39PM

                  by Hairyfeet (75) <{bassbeast1968} {at} {gmail.com}> on Monday March 30 2015, @07:39PM (#164442) Journal

                  Uhhh, they look identical and score within 3 FPS. It's also ignoring that more than 30 of the games recently released or about to be released require a quad core minimum. GTA V, Witcher 3, Far Cry 4, MGS V, would you like me to continue? You wanna play ANY of those games? The PENTIUM WILL NOT WORK, you will get "unsupported platform" and be sent back to the desktop. FX6, hell FX4? All those games WILL run. This is not counting games where you WILL see increased performance if you have a quad: GTA IV, Supreme Commander/FA/2, BFBC2, L4D 1&2, I could go on all day. Also note that DX12, which will be released with Windows 10 this summer, makes it a LOT easier for game devs to support up to 8 cores; once it's released? That list is gonna shoot up like a rocket.

                  So if you wanna use a dual core for gaming? Enjoy having a short list of games that run well, with the list getting shorter by the month. It would be about as wise as building a new C2D box. It's 2015, time to leave dual cores to grandma and the kids.

                  --
                  ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
                  • (Score: 2) by wantkitteh on Monday March 30 2015, @08:36PM

                    by wantkitteh (3362) on Monday March 30 2015, @08:36PM (#164461) Homepage Journal

                    Any of those incompatible games available on SteamOS? Oh look, no....

                    • (Score: 2, Disagree) by Hairyfeet on Wednesday April 01 2015, @09:35AM

                      by Hairyfeet (75) <{bassbeast1968} {at} {gmail.com}> on Wednesday April 01 2015, @09:35AM (#165325) Journal

                      I hate to burst your bubble, but SteamOS is a crippled cousin that will end up abandoned. Have you even bothered to look at DX12 yet? Yeah, prepare for the curbstomping; nobody is gonna give a shit about SteamOS when you are talking over 150% gains over the already stomping DX11 [pcworld.com]. That means devs can create huge worlds filled with awesome graphics in DX12, and the odds that after doing that they will bother with SteamOS? Pretty much zip. I mean for fuck's sake, I listed just about every major hit of the last couple of years and you yourself admitted SteamOS can't play a single one...what more do you need? Can you even name 5 games on SteamOS that aren't made by Valve and that you can't just fucking download straight from the devs and cut out the middleman?

                      See what Gaben said during the launch of SteamOS: it was made because he was crapping his shorts over Win 8 and the crapstore...well, that threat is long gone; Steam integrates with Win 10 just like Win 7, and the crapstore is just that, crap. Mark my words, it'll be half-ass supported for a year or so, and when Gaben sees it doesn't even have a shot at beating the Wii U? They will quietly pull the plug. You wanna play games? Buy Windows, hell, pirate Windows 7, or buy a console. All you're gonna get on SteamOS is mobile titles and casual crap you can get anywhere. If you wanna play casual crap, just install Mint and hit Humble Bundle or GOG; both will end up getting you more titles than SteamOS.

                      --
                      ACs are never seen so don't bother. Always ready to show SJWs for the racists they are.
                  • (Score: 0) by Anonymous Coward on Monday March 30 2015, @08:40PM

                    by Anonymous Coward on Monday March 30 2015, @08:40PM (#164463)

                    What, can't you answer any of the threads that have actually made a point?

        • (Score: 0) by Anonymous Coward on Monday March 30 2015, @12:30PM

          by Anonymous Coward on Monday March 30 2015, @12:30PM (#164197)

          Reeks of fanboi

    • (Score: 2) by wantkitteh on Monday March 30 2015, @02:38PM

      by wantkitteh (3362) on Monday March 30 2015, @02:38PM (#164258) Homepage Journal

      Ashraf Eassa owns shares of Intel. The Motley Fool recommends Intel. Try any of our Foolish newsletter services free for 30 days. We Fools may not all hold the same opinions, but we all believe that considering a diverse range of insights makes us better investors. The Motley Fool has a disclosure policy.

  • (Score: 0) by Anonymous Coward on Monday March 30 2015, @04:16AM

    by Anonymous Coward on Monday March 30 2015, @04:16AM (#164061)

    China will stomp both your asses with x86 clones.

    • (Score: 2) by dyingtolive on Monday March 30 2015, @05:13AM

      by dyingtolive (952) on Monday March 30 2015, @05:13AM (#164073)

      You mean that no one in the know will buy for fear of cooked in badness?

      --
      Don't blame me, I voted for moose wang!
      • (Score: 3, Touché) by Anonymous Coward on Monday March 30 2015, @05:15AM

        by Anonymous Coward on Monday March 30 2015, @05:15AM (#164076)

        You mean after no one will buy American because of actual cooked in badness?

        • (Score: 2) by dyingtolive on Monday March 30 2015, @06:48AM

          by dyingtolive (952) on Monday March 30 2015, @06:48AM (#164092)

          Yeah, basically.

          --
          Don't blame me, I voted for moose wang!
  • (Score: 4, Funny) by aristarchus on Monday March 30 2015, @05:19AM

    by aristarchus (2645) on Monday March 30 2015, @05:19AM (#164077) Journal

    Oh My God! To spell it out: Hairyfeet has submitted an article! And it has been accepted! This is like an article of clothing for a house-elf! I am now free! Thank you, Hairy Potter! Thank you!!!

    • (Score: 2) by c0lo on Monday March 30 2015, @06:42AM

      by c0lo (156) Subscriber Badge on Monday March 30 2015, @06:42AM (#164090) Journal

      Wait! Hairyfoot submitted?

      Yeap. Even more, both of them did.
      (and not the first time either [soylentnews.org]).

      --
      https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
      • (Score: 2) by aristarchus on Monday March 30 2015, @08:21AM

        by aristarchus (2645) on Monday March 30 2015, @08:21AM (#164105) Journal

        Well, excuse me for not noticing!

        • (Score: 1, Funny) by Anonymous Coward on Monday March 30 2015, @08:43AM

          by Anonymous Coward on Monday March 30 2015, @08:43AM (#164124)

          Hairyfeet might get noticed more often if he would shave once in a while.

    • (Score: 0) by Anonymous Coward on Monday March 30 2015, @02:02PM

      by Anonymous Coward on Monday March 30 2015, @02:02PM (#164232)

      Phew... did someone take their shoes off?