
posted by takyon on Saturday September 23 2017, @04:13AM   Printer-friendly
from the wishful-thinking dept.

From the lowRISC blog:

We are looking for a talented hardware engineer to join the lowRISC team and help make our vision for an open source, secure, and flexible SoC a reality. Apply now!

lowRISC C.I.C. is a not-for-profit company that aims to demonstrate, promote and support the use of open-source hardware. The lowRISC project was established in 2014 with the aim of bringing the benefits of open source to the hardware world. It is working to do this by producing a high quality, secure, open, and flexible System-on-Chip (SoC) platform. lowRISC C.I.C. also provides hardware and software services to support the growing RISC-V ecosystem. Our expertise includes the LLVM compiler, hardware security extensions, RISC-V tools, and hardware and processor design.

[...] lowRISC is an ambitious project with a small core team, so you will be heavily involved in the project's development direction. This role will involve frequent work with external contributors and collaborators. While much of the work will be at the hardware level, the post will offer experience of the full hardware/software stack, higher-level simulation tools, and architectural design issues.

Some practical experience of hardware design with an HDL such as Verilog/SystemVerilog is essential, as is a good knowledge of the HW/SW stack. Ideally, candidates will also have experience or a demonstrated interest in some of: SoC design, large-scale open-source development, hardware or software security, technical documentation, board support package development, and driver development. Industrial experience and higher degrees are valued, but we would be happy to consider an enthusiastic recent graduate with a strong academic record.

Informal enquiries should be made to Alex Bradbury, asb@lowrisc.org.

takyon (thanks to an AC): lowRISC is a project to create a "fully open-sourced, Linux-capable, system-on-a-chip"; it is based around RISC-V, the "Free and Open RISC Instruction Set Architecture", which is meant to provide an extensible platform that scales from low-level microcontrollers up to highly parallel, high-bandwidth general-purpose supercomputers.

Reduced instruction set computer (RISC).

Previously: RISC-V Projects to Collaborate
LowRISC Announces its 0.4 Milestone Release
SiFive and UltraSoC Partner to Accelerate RISC-V Development Through DesignShare


Original Submission

 
  • (Score: 5, Interesting) by bzipitidoo (4388) on Saturday September 23 2017, @01:38PM (#572092) Journal (7 children)

    Despite its age and issues, x86 is still alive and strong. Intel and AMD have been able to apply many advances to the x86 architecture. For example, since the Pentium there has been an additional layer of microcode, and the underlying core of an x86 system is actually RISC-like. All the inefficient, crufty 1970s-era ideas, such as packed-decimal arithmetic, stack manipulation, and special-purpose dedicated registers combined with a lack of general-purpose registers, have been rather neatly sidelined. Sure, a modern x86 CPU can still execute all those antiquated instructions, but they are unimportant.

    Most of all, the architecture could be expanded. It has grown from 8-bit to 16, 32, and now 64-bit. More general-purpose registers have been added. It has been further extended with MMX and SSE, which support much more parallelism. Another important capability, the atomic compare-and-exchange instruction (CMPXCHG), was added in the 486.
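
    To illustrate what those atomics buy you, here's a minimal test-and-set spinlock in C11 (a sketch, not tied to any particular compiler or OS; on x86 the flag operation compiles down to an implicitly LOCKed XCHG):

        #include <stdatomic.h>

        /* Minimal test-and-set spinlock; init with { ATOMIC_FLAG_INIT }. */
        typedef struct { atomic_flag locked; } spinlock;

        static void spin_lock(spinlock *l)
        {
            /* atomic_flag_test_and_set atomically sets the flag and
               returns its previous value: loop until it was clear. */
            while (atomic_flag_test_and_set(&l->locked))
                ;  /* busy-wait */
        }

        static void spin_unlock(spinlock *l)
        {
            atomic_flag_clear(&l->locked);
        }

    Protecting a shared counter is then just spin_lock(&l); counter++; spin_unlock(&l); exactly the kind of thing that is clumsy to build without hardware support.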

    What seems the biggest lack is some way to completely remove the cruft. The easiest way to do that is a reboot: start from scratch with a new hardware design. That's been done many times, but somehow x86 is still popular. It seems the efficiency gains from dropping useless instructions aren't significant enough to provide a compelling advantage over the x86 architecture.

    So this project is attacking the lack of openness as well as the excessive complexity. I wish them luck. But one thing I wonder: is the GPU eclipsing the CPU? I hope that thought has their attention, and that their system-on-a-chip can do graphics. If it can't do even the little that Intel's still rather feeble integrated HD graphics can manage, then I'd say they're on the wrong path.

  • (Score: 3, Informative) by Anonymous Coward on Saturday September 23 2017, @02:27PM (#572108) (2 children)

    It seems the efficiency gains from dropping useless instructions aren't significant enough to provide a compelling advantage over the x86 architecture.

    Arm has positioned itself as a replacement for x86 in the mobile environment. Desktop environments, of course, had the disadvantage of requiring Windows support, and Microsoft until recently supported only x86. Mac OS X was originally also on PowerPC, but it left that for x86 as well.

    Another architecture would be possible, but you'd need someone willing to invest quite a lot of money up front to get it going. The Raspberry Pi has done this, to be honest, quite successfully. You can run one as a small-scale computer, but power is severely lacking there (among a few other things).

    • (Score: 2, Disagree) by anotherblackhat (4722) on Saturday September 23 2017, @08:28PM (#572161) (1 child)

      Arm has positioned itself as a replacement for x86 in the mobile environment.

      I know - it seems like all those x86 smartphones have been completely replaced by Arm, almost as if x86 was never dominant in the first place.

      Seriously, Arm has always been faster, cheaper, and lower-power than the equivalent x86 CPU.
      The only thing x86 has going for it is inertia.

      • (Score: 1, Insightful) by Anonymous Coward on Saturday September 23 2017, @11:08PM (#572185)

        The Atom processors are faster, but they generally draw more power.

        When it comes to anything beyond tablets and phones, x86 does better on everything, and it includes a fairly open-source-friendly GPU, unlike ARM.

  • (Score: 0) by Anonymous Coward on Saturday September 23 2017, @11:35PM (#572194)

    To pick a nit, the 8008 was 8-bit.
    The 8086 was 16-bit.
    The 8088 had a 16-bit internal architecture identical to the 8086, but it talked to the world via an 8-bit bus.

    The Motorola 68008 was another necked-down uP, and it might have given us an industry-dominant flat memory model, but that company was slow getting the chip to market, and IBM used Intel's 8088 in their PC.

    -- OriginalOwner_ [soylentnews.org]

  • (Score: 0) by Anonymous Coward on Saturday September 23 2017, @11:47PM (#572195)

    Even the 80286 [uaf.edu] had microcode.

  • (Score: 1) by anubi (2828) on Monday September 25 2017, @10:43AM (#572613) Journal (1 child)

    I can't quite figure out why anything between the 8051 and 386SX survived.

    To me, the 8051 series is ideal for embedded stuff - when cost per device is important, but the devices aren't very complex.

    I thought the 8086 was barely OK; I hated the 80286 and its segment registers, and I sneered at the architecture until the 386SX finally arrived with a nice contiguous memory-addressing scheme again, which was what I had all along with the 68HC000.

    Personally, I am highly into Arduino, as most of my stuff is cost-sensitive and not terribly complex. I like the Parallax Propeller series, with its eight-core chips, as I/O, but I am afraid to design it into industrial equipment, as I fear Parallax may one day stop making them; as neat as they are, they are not second-sourced, and they haven't caught on as much as I would like to see.

    Those Propeller chips are extremely powerful if one has real-time processes to manage... for instance HobbyTronix has some VGA controllers built with them, and I am interested to see if I can convert them to run from the I2C bus instead of the serial bus. I also see it should be possible to run three independent VGA displays from one chip... that would be quite useful to me in making online diagnostic and reporting tools to let me observe a plant controller without bringing in a mess of diagnostic equipment... rather the propeller hung on the I2C would spew out info like an OBD-II reader does for a car.

    And not tie up any more of my precious I/O lines...

    --
    "Prove all things; hold fast that which is good." [KJV: I Thessalonians 5:21]
    • (Score: 2) by bzipitidoo (4388) on Monday September 25 2017, @03:51PM (#572694) Journal

      Yes, you're right, but I'd skip the 386 as well. If you ever target x86, go for the 486 as the base, or just use the 8086. The big lack in the 386, as I learned from others, is that it has no atomic compare-and-exchange instruction (CMPXCHG arrived with the 486). It can be worked around, but it's a lot more painful to implement semaphores and other such parallel and OS functionality on a 386. It's no accident that the Linux kernel maintainers dropped support for the 386 just a few years ago. The 286's support for multitasking OSes is even worse. It took Intel far too long to get that right-- they should've got it right in the 286, instead of bungling it in that iteration and again in the 386.
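
      To make that concrete, here's roughly what compare-and-exchange gives you (a minimal C11 sketch; the names are mine, nothing here is from any kernel): a lock-free "try down" for a counting semaphore, decrementing the count only if it is still positive. On a 486 or later the compare-exchange maps onto LOCK CMPXCHG; on a 386 the whole read-modify-write would have to be guarded some other way, e.g. by disabling interrupts.

          #include <stdatomic.h>

          /* Decrement *count only if it is still positive.
             Returns 1 on success, 0 if the count was zero. */
          static int sem_try_down(atomic_int *count)
          {
              int c = atomic_load(count);
              while (c > 0) {
                  /* If *count still equals c, atomically store c - 1;
                     on failure, c is reloaded with the current value. */
                  if (atomic_compare_exchange_weak(count, &c, c - 1))
                      return 1;
              }
              return 0;
          }

      The retry loop is the whole trick: another CPU may change the count between the load and the update, and the compare-exchange detects that and tries again. Without such an instruction, that window has to be closed by other, costlier means.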