A Facebook job posting indicates the company is looking to design its own system-on-a-chip or Application Specific Integrated Circuit (ASIC):
Facebook Inc. is building a team to design its own semiconductors, adding to a trend among technology companies to supply themselves and lower their dependence on chipmakers such as Intel Corp. and Qualcomm Inc., according to job listings and people familiar with the matter.
The social media company is seeking to hire a manager to build an "end-to-end SoC/ASIC, firmware and driver development organization," according to a job listing on its corporate website, indicating the effort is still in its early stages.
The Menlo Park, California-based company would join other technology giants tackling the massive effort to develop chips. In 2010, Apple Inc. started shipping its own chips and now uses them across many of its major product lines. Alphabet Inc.'s Google has developed its own artificial intelligence chip as well.
Also at TechCrunch and The Verge.
(Score: 4, Interesting) by idiot_king on Thursday April 19 2018, @01:54PM (14 children)
Google with its AI chips, Apple rejecting Intel, now FB with theirs. They're increasingly going their separate ways in order to corner their own markets.
Soon your life will be defined by which logo you happen to wear, because nothing will be interoperable. So you get the FB phone, along with the FB self-driving car, with the FB refrigerator, and the FB diapers that report when your babies need to be changed.
Well, you know, I do have one good thing to say about capitalism: at least you get a choice of which logo you want your house plastered in, haha! (not funny btw)
(Score: 4, Insightful) by Snotnose on Thursday April 19 2018, @02:11PM
Doubtful. More like the 80s and 90s, when there were several companies making CPUs. More brains trying to solve very similar problems yields more good solutions; pretty soon company A will grab the best of companies B and C for its chips. Companies B and C are of course doing the same. I predict in 10-15 years we'll have a couple of big players (e.g. x86, ARM) and a handful of niche players (SH-4, MIPS, PIC).
Should be interesting.
Why shouldn't we judge a book by its cover? It's got the author, title, and a summary of what the book's about.
(Score: 1, Insightful) by Anonymous Coward on Thursday April 19 2018, @02:16PM
There will be an even easier way to avoid people; don't wear their logos.
(Score: 2) by takyon on Thursday April 19 2018, @02:19PM (1 child)
You can't buy any of Google's AI TPU chips, and probably won't be able to buy Facebook's either - they'll be using them for machine learning and in their servers. If Apple does switch from Intel, it will be using ARM chips, not really what you'd consider not interoperable.
Sure you are.
[SIG] 10/28/2017: Soylent Upgrade v14 [soylentnews.org]
(Score: 2) by Freeman on Thursday April 19 2018, @03:35PM
ARM is the definition of not interoperable when compared with x86 and x86-64. Though that's increasingly becoming untrue, thanks to the desire to compile one codebase for multiple architectures.
Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"
(Score: 2) by carguy on Thursday April 19 2018, @04:33PM
> Soon your life will be defined by which logo you happen to wear,...
(car analogy)
We've seen this behavior pattern before--in the USA there used to be Chevy families that would have nothing to do with Mopar (Chrysler) lovers. And the Ford guys/gals were off in their own corner too.
(Score: 3, Insightful) by DannyB on Thursday April 19 2018, @05:09PM (3 children)
I'm all for more architectures. Intel's architecture is decades old, starting with a design built around segment registers to maintain assembly source compatibility with the 8080. Geez. And the baggage it still carries.
How about if, instead of each company designing its own chips, they got together as a group, sort of like Open Compute? That project designed open-standard racks, power supplies, and basically everything in a data center right down to the motherboards. In theory, if you buy from Open Compute vendors, your equipment should be interchangeable. Why not do this with processors, or at least motherboard/processor combinations?
I agree that trying out more new architectures is a good idea. Let's introduce some evolution and survival of the fittest and see what benefits it leads to.
As a software guy: I vote for more and more cores! Developers need to learn to use them. We have plenty of frameworks; it's not that hard with a little learning. It's mostly a change in thinking: always be looking for opportunities to split things into parallel work, but be aware of the overhead cost per unit of work so that you do it efficiently.
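A minimal sketch of that overhead trade-off in Python (the workload, worker count, and chunk size here are made-up illustrations, not a recommendation):

```python
from concurrent.futures import ThreadPoolExecutor

def work(x):
    return x * x  # stand-in for a real unit of work

def parallel_map(items, workers=4, chunksize=256):
    items = list(items)
    # Group items into chunks so the per-task scheduling overhead is
    # paid once per chunk instead of once per item.
    chunks = [items[i:i + chunksize] for i in range(0, len(items), chunksize)]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        # pool.map preserves submission order, so results come back
        # in the same order as the input.
        chunked = list(pool.map(lambda chunk: [work(x) for x in chunk], chunks))
    return [y for chunk in chunked for y in chunk]
```

If the chunks are too small, scheduling overhead eats the gains; too large, and some workers sit idle near the end of the run. (Threads are used here for simplicity; for CPU-bound work in CPython you'd reach for ProcessPoolExecutor instead.)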
Oh, and hey, how about an OPEN architecture while you're at it. Let's try to have MORE manufacturers not fewer.
People today are educated enough to repeat what they are taught but not to question what they are taught.
(Score: 3, Insightful) by fustakrakich on Thursday April 19 2018, @05:44PM (1 child)
You seem to be under the mistaken impression that these people want open markets. Nothing could be further from the truth. Each is seeking out more exclusivity for themselves, for instance, with the abuse of copyright/patent law. Amongst all these companies the customer is the common enemy, to be conquered and held captive, with harsh regulations on how they can use the tech (DRM, DMCA, etc...)
Politics and criminals are the same thing.
(Score: 2) by DannyB on Thursday April 19 2018, @08:00PM
Sadly, you're probably right.
That, despite the benefits they could mutually have by cooperating.
People today are educated enough to repeat what they are taught but not to question what they are taught.
(Score: 1, Informative) by Anonymous Coward on Thursday April 19 2018, @06:15PM
Facebook is involved in the Open Compute initiative. Maybe this will be too?
(Score: 3, Insightful) by MichaelDavidCrawford on Thursday April 19 2018, @07:46PM (4 children)
It will also be more resistant to malware
Yes I Have No Bananas. [gofundme.com]
(Score: 0) by Anonymous Coward on Friday April 20 2018, @08:17AM
This may be the only way they can know for sure they don't have malware embedded in the silicon itself, like Intel's Management Engine.
If I had secrets - secrets that I was accountable for - and I'm not talking "business-grade" secrets of the if-they-leaked-oh-well kind... I could not sleep at night if they were on any modern system. Way too many backdoors. I would be like a bank president trying to sleep while knowing the safe where I keep everyone's money is nothing more than some drywall and 2x4s. And knowing my depositors are going to come at me, personally, if someone broke in and took their life savings.
I mean, there is *security*, and there is "business-grade" security. One *has* to work... the other is convenient, mostly theater, and marketed to the executive class - who are above accountability and more impressed by presentation than substance.
(Score: 1, Insightful) by Anonymous Coward on Friday April 20 2018, @09:20AM
Facebook is malware.
(Score: 2) by DannyB on Friday April 20 2018, @04:45PM
It won't be more resistant to malware. A microprocessor just executes instructions; it doesn't know whether they are malware.
What it will do is fragment the teams that analyze machine-code malware across more architectures.
People today are educated enough to repeat what they are taught but not to question what they are taught.
(Score: 2) by MichaelDavidCrawford on Friday April 20 2018, @10:07PM
Malware writers have to be especially good at assembly code and with machine debuggers.
There was a time that there was very, very little malware for the PowerPC MacOS, for the simple reason that all the malware coders focused on x86.
If we had a hundred different instruction set architectures it would be damn near impossible to attack them all.
I understand that each of the root name servers runs a different OS from all the others.
Yes I Have No Bananas. [gofundme.com]