

posted by on Friday April 10 2015, @01:21AM   Printer-friendly
from the stay-on-my-lawn-for-a-long-long-time dept.

From the phys.org article:

As modern software systems continue inexorably to increase in complexity and capability, users have become accustomed to periodic cycles of updating and upgrading to avoid obsolescence—if at some cost in terms of frustration. In the case of the U.S. military, having access to well-functioning software systems and underlying content is critical to national security, but updates are no less problematic than among civilian users and often demand considerable time and expense. That is why today DARPA announced it will launch an ambitious four-year research project to investigate the fundamental computational and algorithmic requirements necessary for software systems and data to remain robust and functional in excess of 100 years.

The Building Resource Adaptive Software Systems, or BRASS, program seeks to realize foundational advances in the design and implementation of long-lived software systems that can dynamically adapt to changes in the resources they depend upon and environments in which they operate. Such advances will necessitate the development of new linguistic abstractions, formal methods, and resource-aware program analyses to discover and specify program transformations, as well as systems designed to monitor changes in the surrounding digital ecosystem. The program is expected to lead to significant improvements in software resilience, reliability and maintainability.

DARPA's press release and call for research proposals.

  • (Score: 5, Interesting) by bzipitidoo on Friday April 10 2015, @05:07AM

    by bzipitidoo (4388) on Friday April 10 2015, @05:07AM (#168632) Journal

    Not a chance. In the past 30 years, we've moved from 8-bit to 16-, 32-, and 64-bit systems. Every one of those moves required a lot of reworking. You might think that after moving from 16 to 32 bits we'd have it down, and the shift to 64-bit would be easy, but no. Many programs have an implicit limit on the amount of data they can handle, often restricted to what 32-bit addresses allow, and must be extensively rewritten, not just recompiled, to expand their capacity. Systems have changed so much in so many other ways. Hard drives took a big jump from 40 MB to 500 MB in the mid '90s, and that killed much of the interest in compressed file systems. The 80486 introduced new operations that are key to running a multitasking OS. Graphics computation has shifted hugely, from CPUs driving primitive VGA graphics without any GPU at all to dedicated, massively parallel GPUs. It took a massive rewrite of software to properly utilize that change, and we're still working on it. That's why the code for something like the original Doom game engine is no longer practical or particularly interesting: it just isn't relevant to current graphics. It's also why X Windows so badly needs a redesign, and why projects like Wayland have sprung up. The Xlib part of X Windows is full of 1980s cruft for having the CPU draw lines and other such primitive operations that GPUs do now.
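    The "must be rewritten, not just recompiled" point deserves a concrete illustration. A minimal sketch (the scenario is invented for illustration): a program that stores file offsets in a 32-bit integer silently truncates anything past 4 GiB, and rebuilding it for a 64-bit CPU changes nothing, because the limit lives in the data type, not the instruction set.

    ```c
    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* A 5 GiB file offset, as a 64-bit value. */
        uint64_t real_offset = 5ULL * 1024 * 1024 * 1024;

        /* An old program that declared its offsets as 32-bit integers
           truncates this on assignment -- recompiling on a 64-bit
           system does not help; the type itself must change. */
        uint32_t stored = (uint32_t)real_offset;

        printf("real offset: %llu\n", (unsigned long long)real_offset);
        printf("stored in 32 bits: %u\n", stored);  /* wraps to 1 GiB */
        return 0;
    }
    ```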

    OSes have also changed massively. In the days of DOS, everyone provided their own graphics drivers, and programs were quite free to just take over the system and ignore DOS. Protected mode was another huge advance that empowered a massive shift in OS technology, which then drove a big rewrite of a great deal of software to make apps more aware of system facilities. For instance, no DOS program had code to handle the clipboard, and without help none of them can participate in the copying and pasting between apps that is easy and routine now. Socket programming also used to be a niche; now, with the Internet everywhere, networking libraries are much more important. Early Linux used the "a.out" executable file format and libc5. Changing to ELF and libc6 was another big move that required much reworking; a simple recompile was often not enough. Relatively new in hardware support is the No-Execute bit for virtual pages. There could still be programs that deliberately modify their own machine code, and none of those will work on a system that enforces the No-Execute bit until they are modified. Who knows what the future will bring in the way of advances? Virtual machine support is still new, and still difficult to do cleanly on a PC.
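    To make the No-Execute point concrete: code that patches or generates its own instructions now has to ask the OS to flip page permissions explicitly, where a DOS-era program could simply write over itself. A minimal sketch, assuming Linux on x86-64 (the machine-code bytes and the mmap/mprotect dance are specific to that platform):

    ```c
    #define _DEFAULT_SOURCE
    #include <stdio.h>
    #include <string.h>
    #include <sys/mman.h>

    int main(void) {
        size_t len = 4096;
        /* Fresh pages are writable but NOT executable -- the NX default. */
        unsigned char *buf = mmap(NULL, len, PROT_READ | PROT_WRITE,
                                  MAP_PRIVATE | MAP_ANONYMOUS, -1, 0);
        if (buf == MAP_FAILED) { perror("mmap"); return 1; }

        /* x86-64 machine code for: mov eax, 42; ret */
        unsigned char code[] = { 0xB8, 0x2A, 0x00, 0x00, 0x00, 0xC3 };
        memcpy(buf, code, sizeof code);

        /* This explicit permission flip is the step NX forces on us;
           without it, jumping into the buffer would fault. */
        if (mprotect(buf, len, PROT_READ | PROT_EXEC) != 0) {
            perror("mprotect"); return 1;
        }

        int (*fn)(void) = (int (*)(void))(void *)buf;
        printf("patched code returned %d\n", fn());
        munmap(buf, len);
        return 0;
    }
    ```

    This is essentially what every JIT compiler does today; an old self-modifying program, which knows nothing of mprotect, simply crashes instead.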

    I don't think computing is settled enough yet to think of 100-year lifetimes. Programming languages are more numerous and divergent than ever, with only a broad consensus that structured programming, OOP, and functional programming are all good, but no agreement on the details.

    We're still stuck with a lot of legacy PC design. Shifting away from the antiquated PC platform to finally get rid of that will require much work.

  • (Score: 3, Interesting) by tftp on Friday April 10 2015, @06:38AM

    by tftp (806) on Friday April 10 2015, @06:38AM (#168654) Homepage

    You are describing existing execution environments. They are all unsuitable, of course; that's why DARPA is asking for a solution.

    I would think that the desired solution will come with its own sufficiently abstract language and I/O, and all of that can run on any hardware that can execute the language (interpreted, compiled to an IL, or whatever). This might work for tasks that are simple and abstract, like calculating the digits of Math.PI. However, any software that operates hardware probably cannot be portable enough to do the job with acceptable efficiency. Sure, you can render a modern FPS with a mere setPixel() API, but that would not be such a great idea, especially if future monitors have not only (X, Y) but Z as well.
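    The setPixel() extreme can be sketched in a few lines. Everything here (the Framebuffer type, set_pixel itself) is invented for illustration: a maximally portable drawing API whose only primitive is writing one pixel, which is exactly why it cannot be efficient.

    ```c
    #include <stdio.h>
    #include <stdlib.h>

    /* The "portable but slow" extreme: an abstract framebuffer whose
       only drawing primitive is set_pixel(). */
    typedef struct {
        int w, h;
        unsigned char *pix;  /* one byte per pixel, grayscale */
    } Framebuffer;

    static void set_pixel(Framebuffer *fb, int x, int y, unsigned char v) {
        if (x >= 0 && x < fb->w && y >= 0 && y < fb->h)
            fb->pix[y * fb->w + x] = v;
    }

    int main(void) {
        Framebuffer fb = { 8, 8, calloc(64, 1) };
        if (!fb.pix) return 1;

        /* Filling even a tiny 8x8 buffer costs w*h separate calls; a
           modern game frame at this granularity would mean millions of
           round trips per frame through the abstraction layer. */
        for (int y = 0; y < fb.h; y++)
            for (int x = 0; x < fb.w; x++)
                set_pixel(&fb, x, y, 255);

        printf("pixels written: %d\n", fb.w * fb.h);
        free(fb.pix);
        return 0;
    }
    ```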

    To rephrase a classic joke: you can write software that will remain usable for 100 years, but nobody will want to use it, except for a few very special applications like control circuits. You can run Windows 3.1 today, in a VM if you must; but why would you want to, if the only external connections that OS knows about are a CD-ROM and a floppy? It's pretty hard to design software that is not only functional so far in the future but also useful. Most software today is made for a specific purpose, be it to control a TV set or to decode a compressed audio file and play the samples via some audio hardware. It has no value outside of that compression format and that audio wave API.

    This DARPA contract will probably end up taking several years and several million dollars, and will deliver a souped-up VM capable of running a well-defined execution environment. Perhaps it will have some abstraction capabilities in the hardware. For example, if it has video cameras, you can enumerate them and find out their orientation, resolution, and day/night settings; you can poll for LIDARs, propulsion, energy sources: all the stuff you could find in, say, a robot. You could extend this introspection to batteries, RAM, and thermal management. You would then be able to write software that runs in that environment, inspects the available functions, and makes use of those that are relevant. Does it appear practical? Hard to say. But it surely won't be immediately profitable. It will also be very hard to be certain that the product works correctly with every combination of peripherals that come online and go offline as they please.
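    The introspection idea above can be sketched as a capability registry: the program enumerates whatever peripherals the platform reports and binds only to the kinds it understands, ignoring the rest. All names here (Capability, platform_capabilities, the device list) are invented for illustration, not part of any real DARPA or BRASS API.

    ```c
    #include <stdio.h>
    #include <string.h>

    typedef struct {
        const char *kind;   /* e.g. "camera", "lidar", "battery" */
        int         version;
    } Capability;

    /* Stand-in for a platform query; a real system would discover
       these at runtime as peripherals come and go. */
    static const Capability platform_capabilities[] = {
        { "camera",  3 },
        { "lidar",   1 },
        { "battery", 2 },
    };

    int main(void) {
        /* The application declares which capability kinds it can drive... */
        const char *understood[] = { "camera", "battery" };
        size_t n_caps = sizeof platform_capabilities / sizeof *platform_capabilities;
        size_t n_known = sizeof understood / sizeof *understood;

        /* ...and binds only to matching entries, ignoring everything else. */
        for (size_t i = 0; i < n_caps; i++) {
            const Capability *c = &platform_capabilities[i];
            int usable = 0;
            for (size_t j = 0; j < n_known; j++)
                if (strcmp(c->kind, understood[j]) == 0)
                    usable = 1;
            printf("%s v%d: %s\n", c->kind, c->version,
                   usable ? "bound" : "ignored");
        }
        return 0;
    }
    ```

    The hard part the comment identifies is not this lookup but the dynamics: proving the program stays correct as entries appear and vanish while it runs.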