
posted by FatPhil on Thursday June 22 2017, @01:33PM   Printer-friendly
from the Pi-in-the-sky dept.

Submitted via IRC for TheMightyBuzzard

The results are in: The Raspberry Pi 3 is the most desired maker SBC by a 4-to-1 margin. In other trends: x86 SBCs and Linux/Arduino hybrids get a boost.

More than ever, it's a Raspberry Pi world, and other Linux hacker boards are just living in it. Our 2017 hacker board survey gives the Raspberry Pi 3 a total of 2,583 votes — four times the number of the second-ranked board, the Raspberry Pi Zero W.

[...] Note that by "votes" we are referring to Borda rankings that combine 1st, 2nd, and 3rd choice rankings [...]

So, which (if any) credit-card-sized computers are you lot playing around with?

Source: http://linuxgizmos.com/2017-hacker-board-survey-raspberry-pi-still-rules-but-x86-sbcs-make-gains/


Original Submission

 
  • (Score: 2) by kaszz on Friday June 23 2017, @02:02AM (7 children)

    by kaszz (4211) on Friday June 23 2017, @02:02AM (#529763) Journal

    What is your opinion on the return on effort of programming an 8-bit microcontroller, for the benefit of small code size and definitely lower power usage, versus using a 32-bit microcontroller that nowadays seems to cost about the same? Something on the order of $2 vs $5. The benefit being that you can keep the same knowledge and development environment for both small and larger projects: same source code tricks, compiler, loader, cables, architecture, device setup, etc.

    Candidate MCU architectures could be ATmega, ARM, 8051, MIPS, etc.

    So if you want to build a network where some nodes will measure temperature or turn lights on/off, and others will collect small video clips: would it be better to go full 32-bit, or to split into 8-bit for the simple nodes and 32-bit for the nodes needing some capacity?

    Starting Score:    1  point
    Karma-Bonus Modifier   +1  

    Total Score:   2  
  • (Score: 2) by TheRaven on Friday June 23 2017, @09:22AM (3 children)

    by TheRaven (270) on Friday June 23 2017, @09:22AM (#529927) Journal
    The ARM mBed systems are great for prototyping (though expensive per unit), and there's little cost difference between a Cortex M0 and an 8-bit microcontroller: at this size, basically all of the cost is in the packaging, not the silicon.
    --
    sudo mod me up
    • (Score: 2) by kaszz on Friday June 23 2017, @03:26PM (2 children)

      by kaszz (4211) on Friday June 23 2017, @03:26PM (#530044) Journal

      I noticed the cost... about US$45 ;(
      It would almost be profitable to DIY...

      • (Score: 2) by TheRaven on Saturday June 24 2017, @11:55AM (1 child)

        by TheRaven (270) on Saturday June 24 2017, @11:55AM (#530534) Journal
        Yup, it's a shame. The price difference between an M0 and an M5 mBed system is very small because the boards also carry another M-profile core that emulates a FAT filesystem and programs the main core via JTAG. The intent is that you use them to prototype something that you'll deploy on M-profile cores, but then ship on ones without all of the mBed programming support.
        --
        sudo mod me up
        • (Score: 2) by kaszz on Sunday June 25 2017, @02:12AM

          by kaszz (4211) on Sunday June 25 2017, @02:12AM (#530750) Journal

          Considering the price, it's almost a no-brainer to compete with these DIP ARM prototypes on price and function.

  • (Score: 2) by cafebabe on Friday June 23 2017, @04:28PM (2 children)

    by cafebabe (894) on Friday June 23 2017, @04:28PM (#530067) Journal

    An ARM micro-controller which only executes 16-bit Thumb instructions is a relatively good choice in typical circumstances. However, there are conditions where ARM should be avoided. What follows is mostly a generic consideration of 8-bit, 16-bit, 32-bit and larger instruction sets.

    An 8-bit instruction set may be close to optimal code density. In a general-purpose computer, this minimises the RAM (or virtual memory) required for a program. However, in a general-purpose computer, an 8-bit instruction set also provides the least defense against executable code passing through ASCII filters or suchlike. In a device with finite ROM, an 8-bit instruction set provides maximum functionality. A 16-bit instruction set with the same size ROM may attain slightly more than 90% of that code density: two-address instructions, such as MOV, are generally of similar size, but 16-bit loop and jump instructions may be larger.

    GCC uses a legacy algorithm for register allocation, which is a middling choice suited to a desktop environment. With -fira-region=one, GCC uses all available registers and correctly counts the cost of instruction prefixes when accessing extended registers. [soylentnews.org]

    If your code has an unusual quantity of case statements or subroutine calls, an 8 bit instruction set may be the best choice. If your code has sections of heavy computation, a micro-controller which can switch between 16 bit instructions and 32 bit instructions may be the best choice. In typical cases, a 16 bit instruction set is good enough for new projects but may be problematic for legacy projects which already target an 8 bit instruction set.

    --
    1702845791×2
    • (Score: 2) by kaszz on Friday June 23 2017, @04:51PM (1 child)

      by kaszz (4211) on Friday June 23 2017, @04:51PM (#530077) Journal

      I'm just thinking that when the minimum clock frequency for an ARM is 48 MHz and memory sizes are 128 kB or more, will it really matter much that code density goes down? I.e., any advantages are nullified by the sheer capacity in speed and memory.

      • (Score: 2) by cafebabe on Friday June 23 2017, @06:11PM

        by cafebabe (894) on Friday June 23 2017, @06:11PM (#530128) Journal

        For a micro-controller with 4KB of ROM, code density is very important. For a micro-controller with 64KB ROM, it is probably acceptable to be within 10% of optimal code density. However, as the size of the program increases, processor complexity acts as a magnifier for size and bandwidth. Where energy consumption is of minimal concern, there comes a point where all the crazy 100 million transistor stuff becomes worthwhile. Stuff like speculative execution, object-oriented branch prediction, 144 virtual registers and the associated out-of-order reconciliation hardware. For example, a 4GHz Xeon core with a peak instruction decode of 5 bytes per cycle has a peak execution rate of 20GB/s per core - and many parties with high density legacy code want that pushed further. Back in the land of sanity, PIC, AVR, SuperH, MIPS and ARM are all useful.

        --
        1702845791×2