
posted by cmn32480 on Wednesday May 11 2016, @04:54AM   Printer-friendly
from the feeding-the-world dept.

http://www.nextplatform.com/2016/05/10/shared-memory-pushes-wheat-genomics-boost-crop-yields/

Wheat has been an important part of the human diet for the past 9,000 years or so, and depending on the geography it can comprise 40 to 50 percent of the diet in certain regions today. But there is a problem. Pathogens and a changing climate are adversely affecting wheat yields just as Earth's population is growing, and The Genome Analysis Centre (TGAC) is front and center in sequencing and assembling the wheat genome, a multi-year effort that is going to be substantially accelerated by new hardware and updated software.

[...] TGAC, which gets its funding from the Biotechnology and Biological Sciences Research Council (BBSRC) in England, provided the computing power that let researchers deliver a first draft of the wheat genome last November, a milestone in the project. That effort delivered a genome assembly with 98,974 genes, which TGAC reckons is about 91 percent of the total genome for the plant; it weighs in at 13.4 GB, which is pretty fat for a text file. But work still needs to be done to fill in gaps in the wheat genome, which is not expected to be fully completed until around 2018 or so. (BBSRC invested over £509 million in various life sciences projects in 2014 and 2015 and has spent over £100 million on wheat research alone in the past decade.)


Original Submission

  • (Score: 0) by Anonymous Coward on Wednesday May 11 2016, @05:51AM

    by Anonymous Coward on Wednesday May 11 2016, @05:51AM (#344521)

    > Pathogens and changing climate are adversely affecting wheat yields just as Earth's population is growing,

    But really, when has the Earth's population ever not been growing?

    • (Score: 2) by stormwyrm on Wednesday May 11 2016, @07:23AM

      by stormwyrm (717) on Wednesday May 11 2016, @07:23AM (#344526) Journal
Well, there was the time of the Black Death, it seems. There was an estimated decline in world population from 450 million down to 350–375 million in the 14th century, and world population levels did not recover to their pre-plague levels until the 16th century or so. There was also the Great Famine just before that. There is the Toba catastrophe theory, which postulates that the eruption of the Toba volcano around 70,000 BC reduced the population of surviving humans down to only 3,000 to 10,000 or so individuals. But yes, for the vast majority of prehistory and history, human population has been on the rise.
      --
      Numquam ponenda est pluralitas sine necessitate.
  • (Score: 0) by Anonymous Coward on Wednesday May 11 2016, @06:40AM

    by Anonymous Coward on Wednesday May 11 2016, @06:40AM (#344523)

    > Biotechnology and Biological Science Research Council (BBSRC) in England

    Note that the main research lab is geographically situated in England, but this is one of the UK's research councils.

  • (Score: 1, Funny) by Anonymous Coward on Wednesday May 11 2016, @08:03AM

    by Anonymous Coward on Wednesday May 11 2016, @08:03AM (#344528)

    I want a wheat plant that is so completely Glutenous that it infects the entire biosphere, and thus kills off all those pansies who THINK they have celiac disease, when really they are just in remission from Lyme Disease that they caught from a mosquito carrying Chronic Fatigue Because I Could Not Give A Fuck Disease. (CFBICNAFD, a difficult acronym). But it will soon all be over, because once you post this on the internet, the power of positive thinking of thousands of other sentient beings will help you recover from your imaginary ailment. Unless, of course, you were foolish enough to post somewhere like SoylentNews! Oh, merde! You did? Wow, no hope for you, then. I suggest you just get used to gluten. It actually makes bread much more tasty. If you don't DIE from it. Gluten, it's what's for dinner. Gluten, it is already calling from inside your digestive tract! Gluten, not really a cause of anyone's medical condition. Could I interest you in some Cooties? North Carolina Trans Cooties? Very best stuff, never once actually exposed in a bathroom! See, back to wheat.

    • (Score: 2, Interesting) by Anonymous Coward on Wednesday May 11 2016, @11:49AM

      by Anonymous Coward on Wednesday May 11 2016, @11:49AM (#344553)

      "I want a wheat plant that is so completely Glutenous that it infects the entire biosphere, and thus kills off all those pansys who THINK they have celiac disease, ..."

      Most of the people who want to go gluten-free are following a fad. The ones with celiac disease know it because they had severe reactions. Some went to the emergency room.

      There are some who have been able to reduce blood glucose and other measurable pre-diabetic indicators enough that they could stop taking meds. For these people, it is not imaginary.

  • (Score: 0) by Anonymous Coward on Wednesday May 11 2016, @09:51AM

    by Anonymous Coward on Wednesday May 11 2016, @09:51AM (#344532)
    biff@tenderloin:~$ free -m

                  total        used        free       wheat  buff/cache   available
    Mem:          32145         909       28222          45        3013       31052
    Swap:         32735           0       32735
  • (Score: 2) by LoRdTAW on Wednesday May 11 2016, @12:42PM

    by LoRdTAW (3755) on Wednesday May 11 2016, @12:42PM (#344561) Journal

    When I think of shared memory I think of shm_open() & mmap(), meaning sharing a piece of memory between applications on the same computer. What they are talking about here is NUMA, or non-uniform memory access. The heading should read "unified memory", not shared memory.
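    A minimal sketch of that same-machine kind of shared memory, using POSIX shm_open() and mmap() with a fork()ed child writing into a segment the parent reads (the segment name /wheat_demo is just an illustration):

    ```c
    #include <fcntl.h>
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>
    #include <sys/wait.h>
    #include <unistd.h>

    int main(void) {
        const char *name = "/wheat_demo";   /* illustrative segment name */
        shm_unlink(name);                   /* drop any stale segment */
        int fd = shm_open(name, O_CREAT | O_RDWR, 0600);
        if (fd < 0) { perror("shm_open"); return 1; }
        if (ftruncate(fd, 4096) < 0) { perror("ftruncate"); return 1; }

        /* MAP_SHARED makes writes visible across processes */
        char *buf = mmap(NULL, 4096, PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
        if (buf == MAP_FAILED) { perror("mmap"); return 1; }

        pid_t pid = fork();
        if (pid == 0) {                     /* child: write into the segment */
            strcpy(buf, "hello from child");
            _exit(0);
        }
        waitpid(pid, NULL, 0);              /* parent: read after child exits */
        printf("%s\n", buf);

        munmap(buf, 4096);
        close(fd);
        shm_unlink(name);
        return 0;
    }
    ```

    (On older glibc you may need to link with -lrt for shm_open.)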

    In large supercomputers, you have two designs: unified memory (NUMA) or distributed memory (cluster). NUMA basically means you have multiple CPU-memory nodes connected together, which allows all CPUs and all memory to act as one giant machine. A good example is the Opteron and newer Xeons: they have multiple CPU cores and a memory controller per socket, and a ccNUMA bus between sockets creating a single cache-coherent multiprocessor system. But what if you want to unify the memory between multiple physical nodes? SGI was well known for this with their old Origin and Onyx systems using NUMAlink. They still use their NUMAlink technology for linking multiple CPU-memory chassis together, though they now use Intel Xeons instead of MIPS or Itanium. These systems tend to be very specialized in terms of hardware, using proprietary interconnects and chipsets on the motherboards tied directly to the CPU.

    In a distributed memory or cluster system you have individual nodes networked together and software to divvy up a compute load. This is where software like MPI comes in, which is the basis for your classic Beowulf cluster. It allows a compute load to be broken up into pieces and sent out to individual nodes. Each node is an island, and its memory and CPU are local to the running compute process, which is working on a piece of data. They also tend to lean toward commodity hardware, as they don't require any highly specialized hardware on the motherboard aside from PCI slots.
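    The divvying-up that MPI does can be sketched in a few lines: each rank works on its own slice of the data in its own private memory, and results only meet through explicit messages like MPI_Reduce. This assumes an MPI implementation (e.g. Open MPI) is installed, compiled with mpicc and launched with mpirun:

    ```c
    #include <mpi.h>
    #include <stdio.h>

    int main(int argc, char **argv) {
        MPI_Init(&argc, &argv);
        int rank, size;
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        /* each rank sums its own strided slice of 0..99 -- no shared memory,
           every rank sees only its local variables */
        long local = 0, total = 0;
        for (int i = rank; i < 100; i += size)
            local += i;

        /* partial results travel as messages and combine on rank 0 */
        MPI_Reduce(&local, &total, 1, MPI_LONG, MPI_SUM, 0, MPI_COMM_WORLD);
        if (rank == 0)
            printf("sum = %ld\n", total);   /* 4950 for any rank count */

        MPI_Finalize();
        return 0;
    }
    ```

    The same answer comes out with -np 1 or -np 16, which is the whole point: the decomposition is explicit, so the program scales across nodes without any unified address space.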

    Each system has its strengths and weaknesses. In a NUMA system, the Achilles heel is bandwidth between nodes, which limits scaling up. You need a lot of bandwidth and low latencies to keep everything in sync and keep data moving. But they are the system to use if you work with very large in-memory datasets that are terabytes in size. Cluster systems can scale up very well and excel at number crunching. They are great for simulating or rendering things where the datasets aren't very big but the compute time is large. So in summary: editing a 6TB genome file - NUMA. Rendering Toy Story or simulating a black hole - cluster.

    • (Score: 0) by Anonymous Coward on Wednesday May 11 2016, @01:22PM

      by Anonymous Coward on Wednesday May 11 2016, @01:22PM (#344581)

      > Heading should read "unified memory", not shared memory.

      Eh. That's a useless nitpick. I've been working in HPC for over 20 years. My first employer in the field was one of the earliest to do commercial NUMA systems, and it's always been clear that when talking about hardware designs, "shared" meant shared between CPUs. Whether it was UMA or NUMA was another distinction, but no one would ever hear "shared memory" and wonder if it might mean distributed memory.