

posted by CoolHand on Tuesday August 30 2016, @11:17PM   Printer-friendly
from the looking-for-little-green-men dept.

The BBC reports that the year-long simulation of near isolation in preparation for a trip to Mars has ended successfully.

A team of six people have completed a Mars simulation in Hawaii, where they lived in near isolation for a year.

Since 29 August 2015, the group lived in close quarters in a dome, without fresh air, fresh food or privacy.

[...] The Nasa-funded study run by the University of Hawaii is the longest of its kind since a Russian mission that lasted 520 days.

Having survived their year in isolation, the crew members said they were confident a mission to Mars could succeed.

"I can give you my personal impression which is that a mission to Mars in the close future is realistic," Cyprien Verseux, a crew member from France, told journalists. "I think the technological and psychological obstacles can be overcome."


Original Submission

posted by CoolHand on Tuesday August 30 2016, @09:43PM   Printer-friendly
from the stickin-it-to-the-man dept.

Kim Dotcom has found a way to attract more attention and potentially embarrass the U.S.... live streaming video of his court battle:

Internet entrepreneur Kim Dotcom will be allowed to livestream his legal bid to halt his extradition to the United States, a New Zealand judge ruled Tuesday. Dotcom and three of his colleagues are appealing a December lower-court decision which allows them to be extradited to the U.S. to face conspiracy, racketeering and money-laundering charges. If found guilty, they could face decades in jail.

Dotcom's lawyer Ira Rothken told The Associated Press he was pleased with the decision. "It provides everybody in the world with a seat in the gallery of the New Zealand courtroom," Rothken said. "It's democracy at its finest." Rothken said the livestreaming would begin Wednesday on YouTube. He said there would be a 20-minute delay to prevent any evidence that was protected by the court from becoming public. The appeal is expected to last six weeks.

The case could still take years to be resolved:

Arguments for Mathias Ortmann and Bram van der Kolk are expected to take around eight days but the whole process is forecast to be a drawn-out affair. In the District Court the extradition hearing was supposed to take four weeks but actually took ten. This time around the actions of the District Court will be picked over in fine detail, concentrating closely on numerous matters of law. The United States Department of Justice isn't expected to begin its arguments for another three weeks or so. The hearing continues tomorrow but it's unlikely that any final decision will arrive even this year. Dotcom and his rivals in the US both seem prepared to take this battle all the way to the Supreme Court in New Zealand if necessary. That could take years.


Original Submission

posted by janrinok on Tuesday August 30 2016, @08:01PM   Printer-friendly
from the perhaps-they-ARE-out-there dept.

http://observer.com/2016/08/not-a-drill-seti-is-investigating-a-possible-extraterrestrial-signal-from-deep-space/

An international team of scientists from the Search for Extraterrestrial Intelligence (SETI) is investigating mysterious signal spikes emitting from a 6.3-billion-year-old star in the constellation Hercules—95 light years away from Earth. The implications are extraordinary and point to the possibility of a civilization far more advanced than our own.

The unusual signal was originally detected on May 15, 2015, by the Russian Academy of Science-operated RATAN-600 radio telescope in Zelenchukskaya, Russia, but was kept secret from the international community. Interstellar space reporter Paul Gilster broke the story after the researchers quietly circulated a paper announcing the detection of "a strong signal in the direction of HD164595."

The mysterious star's designation is HD164595, and it's considered to be sun-like in nature with a nearly identical metallic composition to our own star. So far, a single Neptune-like (but warmer) planet has been discovered in its orbit—HD 164595 b. But as Gilster explained, "There could, of course, be other planets still undetected in this system."


Original Submission

posted by janrinok on Tuesday August 30 2016, @06:32PM   Printer-friendly
from the not-a-good-day-for-Apple dept.

TechCrunch reports on a lawsuit being brought against Apple by owners of the iPhone 6 and iPhone 6 Plus who say a design defect causes the touchscreens to become unresponsive. The loss of responsiveness is often preceded by a flickering gray bar appearing across the top of the screen. iFixit discusses a few possible sources of the "Touch Disease" problem, with the most popular theory being that the Touch IC chips lose contact with the logic board due to the phone bending.

The complaint [PDF], filed in California's Northern District federal court, alleges that Apple is aware of the design flaw and has concealed it from consumers by refusing to acknowledge or repair it. It also suggests that the 5s and 5c were protected against this problem in various ways, so it's not as if Apple didn't know it was a possibility. The 6s and 6s Plus were stiffened to prevent bending, as well.


Original Submission

posted by janrinok on Tuesday August 30 2016, @04:57PM   Printer-friendly
from the going-up dept.

Researchers at Hokkaido University describe a novel method of making high quality vertical nanowires with full control over their size, density and distribution over a semi-conducting substrate. The findings are reported in the Japanese Journal of Applied Physics.

Nanowires hold interesting properties that are not found in bulk materials, making them useful in components for novel electronic and photonic devices. There is much interest in the development of vertical, free-standing nanowires, as their versatility shows great promise. However, most current designs use bottom-up fabrication techniques that result in vertical nanowires being randomly distributed on semi-conducting substrates, limiting their usability.

Now, Ryutaro Kodaira, Shinjiro Hara and co-workers at Hokkaido University have demonstrated a novel method of making high quality vertical nanowires with full control over their size, density and distribution over a semi-conducting substrate.

The team created an indium arsenide (InAs) nanowire template from which to grow the desired heterojunction nanowires, which were composed of ferromagnetic manganese arsenide (MnAs) and semiconducting InAs. In the fabrication process, they first produced the InAs nanowire template by precisely patterning circular openings in silicon dioxide thin films, which were deposited by plasma sputtering onto wafers. Next the researchers grew single InAs nanowires in each circular hole. The MnAs nanowires formed either inside (in the middle) or on top of the InAs nanowires, by a process known as 'endotaxy' – orientated crystal growth inside another crystal.


Original Submission

posted by janrinok on Tuesday August 30 2016, @03:25PM   Printer-friendly
from the hear-the-silence dept.

CNET reports:

They've been a fixture of the computing industry for 60 years, but in 2018, hard drives will be pushed aside by storage systems using memory chips in PCs, an analyst firm predicts. [...] SSDs no longer are exotic. This year, 33 percent of PCs sold will come with SSDs, but that should grow to 56 percent in 2018, analyst firm TrendForce forecast Monday.

The firm predicted 44% adoption in 2017. SSD prices are expected to drop to $0.17/GB in 2017, a direct result of new generations of 3D/vertical NAND.

As for those 3D XPoint post-NAND devices coming from Intel and Micron, the initial capacities could be closer to 140 GB than the 16-32 GB I originally expected.


Original Submission

posted by cmn32480 on Tuesday August 30 2016, @01:59PM   Printer-friendly
from the drill-baby-drill dept.

Water contaminated with some of the chemicals found in drinking water and fracking wastewater has been shown to affect hormone levels in mice:

More than 15 million Americans live within a one-mile radius of unconventional oil and gas (UOG) operations. UOGs combine directional drilling and hydraulic fracturing, or "fracking," to release natural gas from underground rock. Scientific studies, while ongoing, are still inconclusive on the potential long-term effects fracturing has on human development. Today, researchers at the University of Missouri released a study that is the first of its kind to link exposure to chemicals released during hydraulic fracturing to adverse reproductive and developmental outcomes in mice. Scientists believe that exposure to these chemicals also could pose a threat to human development.

"Researchers have previously found that endocrine-disrupting chemicals (EDCs) mimic or block hormones — the chemical messengers that regulate respiration, reproduction, metabolism, growth and other biological functions," said Susan C. Nagel, Nagel, an associate professor of obstetrics, gynecology and women's health in the School of Medicine. "Evidence from this study indicates that developmental exposure to fracking and drilling chemicals may pose a threat to fertility in animals and potentially people. Negative outcomes were observed even in mice exposed to the lowest dose of chemicals, which was lower than the concentrations found in groundwater at some locations with past oil and gas wastewater spills."

Researchers mixed 23 oil and gas chemicals in four different concentrations to reflect concentrations ranging from those found in drinking water and groundwater to concentrations found in industry wastewater. The mixtures were added to drinking water given to pregnant mice in the laboratory until they gave birth. The female offspring of the mice that drank the chemical mixtures were compared to female offspring of mice in a control group that were not exposed. Mice exposed to drilling chemicals had lower levels of key hormones related to reproductive health compared to the control group.

Adverse Reproductive and Developmental Health Outcomes Following Prenatal Exposure to a Hydraulic Fracturing Chemical Mixture in Female C57Bl/6 Mice (open, DOI: 10.1210/en.2016-1242) (DX)


Original Submission

posted by NCommander on Tuesday August 30 2016, @12:14PM   Printer-friendly
from the int-21h-is-how-cool-kids-did-it dept.

I've made no secret that I'd like to bring original content to SoylentNews, and recently polled the community on their feelings about crowdfunding articles. The overall response was somewhat lukewarm, mostly due to disagreement over where the money would go and how authors would be paid. Taking that into account, I decided to write a series of articles for SN in an attempt to drive more subscriptions and readers to the site, and to scratch a personal itch on doing a retro-computing project. The question then became: what to write?

As part of a conversation on IRC, part of me wondered what a modern-day keylogger would have looked like running on DOS. In the world of 2016, it's no secret that various three-letter agencies engage in mass surveillance and cyberwarfare, and a keylogger would be part of any basic set of attack tools. The question is what such an attack tool would have looked like if it had been written during the 1980s. Back in the 1980s, the world was a very different place from both a networking and a programming perspective.

For example, in 1988 (the year I was born), the IBM PC/XT and AT would have been relatively common fixtures, and the PS/2 had only recently been released. Most of the personal computing market ran some version of DOS, and networking (which was rare) frequently took the form of Token Ring or ARCNet equipment. Further up the stack, TCP/IP competed with IPX, NetBIOS, and several other protocols for dominance. From the programming side, coding for DOS is very different from coding for any modern platform, as you had to deal with Intel's segmented architecture and interact directly with both the BIOS and the hardware. As such, it's an interesting look at how technology has evolved since.

Now obviously, I don't want to release a ready-made attack tool to be abused by the masses, especially since DOS is still frequently used in embedded and industrial roles. As such, I'm going to target a non-IP-based protocol for logging, both to explore these technologies and to simultaneously make the tool as useless as possible. To the extent possible, I will try to keep everything accessible to non-programmers, but this isn't intended as a tutorial for real-mode programming. As such, I'm not going to go super in-depth in places, but I will try to link relevant information. If anyone is confused, post a comment, and I'll answer questions or edit these articles as they go live.

More past the break ...

Looking At Our Target

Back in 1984, IBM released the Personal Computer/AT, which can be seen as the common ancestor of all modern PCs. Clone manufacturers copied the basic hardware and software interfaces that made up the AT, creating the concept of PC-compatible software. Due to the sheer proliferation of both the AT and its clones, these interfaces became a de facto standard which continues to this very day. As such, well-written software for the AT can generally be run on modern PCs with a minimum of hassle, and it is completely possible to run ancient versions of DOS and OS/2 on modern hardware thanks to backwards compatibility.

A typical business PC of the era likely looked something like this:

  • An Intel 8086 or 80286 processor running at 4-6 MHz
  • 256 kilobytes to 1 megabyte of RAM
  • 5-20 MiB HDD + 5.25" floppy disk drive
  • Operating System: DOS 3.x or OS/2 1.x
  • Network: Token Ring connected to a NetWare server, or OS/2 LAN Manager
  • Cost: ~$6000 USD in 1987

To put that in perspective, many of today's microcontrollers have specifications on par with or better than the original PC/AT. From a programming perspective, even taking the resource limitations into account, coding for the PC/AT is drastically different from coding for many modern systems due to the segmented memory model used by the 8086 and 80286. Before we dive into the nitty-gritty of a basic 'Hello World' program, we need to take a closer look at the programming model and memory architecture used by the 8086, which was a 16-bit processor.

Real Mode Programming

If the AT is the common ancestor of all PC-compatibles, then the Intel 8086 is its processor equivalent. The 8086 was a 16-bit processor that operated at a top clock speed of 10 MHz, had a 20-bit address bus that supported up to 1 megabyte of RAM, and provided fourteen registers. Registers are essentially very fast storage locations physically located within the processor that are used to perform various operations. Four registers (AX, BX, CX, and DX) are general purpose, meaning they can be used for any operation. Eight (described below) are dedicated to working with segments and offsets, and the final two are the processor's current instruction pointer (IP) and state (FLAGS).

An important point in understanding the differences between modern programming environments and those used by early PCs is the difference between 16-bit and 32/64-bit programming. At the most fundamental level, the number of bits a processor has refers to the size of the numbers (or integers) it works with internally. As such, the largest possible unsigned number a 16-bit processor can directly work with is 2 to the power of 16, minus 1, or 65,535. As the name suggests, 32-bit processors work with larger numbers, the maximum being 4,294,967,295. Thus, a 16-bit processor can only reference up to 64 KiB of memory at a given time, while a 32-bit processor can reference up to 4 GiB, and a 64-bit processor can reference up to 16 exbibytes of memory directly.
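
As a quick illustration of that 16-bit ceiling (this fragment is mine, not part of the original article), incrementing a register that already holds 65,535 simply wraps back around to zero:

; AX is a 16-bit register, so it can only hold values from 0 to 65,535
 mov ax, 0xFFFF   ; AX = 65,535, the largest unsigned 16-bit value
 inc ax           ; AX wraps around to 0; the bit carried out of the top is lost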

At this point, you may be asking yourselves, "if a 16-bit processor could only work with 64 KiB of RAM directly, how did the 8086 support up to 1 megabyte?" The answer comes from the segmented memory model. Instead of directly referencing a location in RAM, addresses were divided into two 16-bit parts: the selector and the offset. Segments are 64-kilobyte regions of RAM; they can generally be considered the computing equivalent of a postal code, telling the processor roughly where to look for data. The offset then tells the processor where exactly within that segment the data it wants is located. On the 8086, the selector was shifted left by four bits and the offset added to it, producing 20 bits (or 1 megabyte) of addressable memory. Segments and offsets are referenced by the processor in special registers; in short, you had the following:

  • Segments
    • CS: Code segment - Application code
    • DS: Data segment - Application data
    • SS: Stack segment - Stack (or working space) location
    • ES: Extra segment - Programmer defined 'spare' segment
  • Offsets
    • SI - Source Index
    • DI - Destination Index
    • BP - Base pointer
    • SP - Stack pointer

As such, memory addresses on the 8086 were written in the form segment:offset. For example, the memory address 0x000FFFFF could be written as F000:FFFF. As a consequence, multiple segment:offset pairs can refer to the same location: the addresses F555:AAAF, F000:FFFF, and F800:7FFF all resolve to the same byte of memory. The segmentation model also had important performance and operational characteristics to consider.
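
To make the arithmetic concrete, the physical address is (segment × 16) + offset, so F000:FFFF is 0xF0000 + 0xFFFF = 0xFFFFF and F800:7FFF is 0xF8000 + 0x7FFF = 0xFFFFF. A minimal fragment of my own (not from the article) that touches that same byte through two different pairs:

; physical address = (segment * 16) + offset
 mov ax, 0xF000
 mov es, ax              ; ES:FFFF -> 0xF0000 + 0xFFFF = 0xFFFFF
 mov al, [es:0xFFFF]     ; read the byte at the very top of the first megabyte
 mov ax, 0xF800
 mov es, ax              ; ES:7FFF -> 0xF8000 + 0x7FFF = 0xFFFFF
 mov bl, [es:0x7FFF]     ; the same physical byte through a different pair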

The most important was that since data could be located either within the same segment or in a different segment, you had two different types of pointers to work with it. Near pointers (which are just the 16-bit offset) deal with data within the same segment, and are very fast as no state information has to be changed to reference them. Far pointers point to data in a different segment and required multiple operations to work with: you not only had to load and store the two 16-bit components, you also had to change the segment registers to the correct values. In practice, that meant far pointers were extremely costly in terms of execution time. The performance hit was bad enough that it eventually led to one of the greatest (or worst) backward-compatibility hacks of all time: the A20 gate, something I could write a whole article on.
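
As a rough sketch of why far accesses cost more (my own illustration, not the author's code, using arbitrary offsets and segment values purely for demonstration), compare a near load with a far load:

; near access: the 16-bit offset alone is enough, DS already points at our data
 mov ax, [0x0010]        ; load a word from DS:0010 - a single instruction

; far access: a segment register has to be reloaded before the data can be touched
 mov bx, 0x8000          ; an arbitrary segment value, purely for illustration
 mov es, bx              ; ...placed into ES
 mov ax, [es:0x0010]     ; only now can the word at 8000:0010 be read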

The segmented memory model also meant that high-level programming languages had to incorporate these lower-level details. For example, while C compilers were available for the 8086 (in the form of Microsoft C), the C programming language had to be modified to work with the memory model. Instead of just having the standard C pointer types, you had to deal with near and far pointers, and with the layout of data and code within segments, to make the whole thing work. This meant that coding for pre-80386 processors required code specifically written for the 8086 and the 80286.

Furthermore, most of the functionality provided by the BIOS and DOS was only available in the form of interrupts. Interrupts are special signals that tell the processor something needs immediate attention; for example, typing a key on the keyboard generates an IRQ 1 interrupt to let DOS and applications know something happened. Interrupts can be generated in software (via the 'int' instruction) or in hardware. As interrupt handling can generally only be done in raw assembly, many DOS applications of the era were written (in whole or in part) in Intel assembly. This brings us to our next topic: the DOS programming model.

Disassembling 'Hello World'

Before digging further into the subject, let's look at the traditional 'Hello World' program written for DOS. All code posted here is assembled with NASM.

; Hello.asm - Hello World

section .text
org 0x100

_entry:
 mov ah, 9
 mov dx, str_hello
 int 0x21
 ret

section .data
str_hello: db "Hello World",'$'

Pretty, right? Even those familiar with 32-bit x86 assembly programming may not be able to tell at first glance what this does. To keep this from getting too long, I'm going to gloss over the specifics of how DOS loads programs and simply describe what this does. For non-programmers, this may be confusing, but I'll try to explain it below.

The first part of the file holds the code segment (marked 'section .text' in NASM) and our program's entry point. With COM files such as this, execution begins at the top of the file, so _entry is where we enter the program. We immediately execute two 'mov' instructions to load a value into the top half of AX (AH) and a near pointer to our string into DX. Ignore the 9 for now; we'll get to it in a moment. Afterwards, we trip an interrupt, with the number in hex (0x21) after it being the interrupt we want to trip. DOS's functions are exposed as interrupts 0x20 to 0x2F; 0x21 is roughly equivalent to stdio in C. Interrupt 0x21 uses the value in AH to determine which subfunction we want — in this case, 9, write string to console. DOS expects the string pointed to by DX to be terminated with a '$'; it does not use null-terminated strings like you may expect. After we return from the interrupt, we simply exit the program by calling ret.

Under DOS, there is no standard library with nicely named functions to help you out of the box (though many compilers, such as Watcom C, did ship with one). Instead, you have to load values into registers and call the correct interrupt to make anything happen. Fortunately, lists of known interrupts are available to make the process less painful. Furthermore, DOS only provides filesystem and network operations; for anything else, you need to talk to the BIOS or the hardware directly. The best way to think of DOS from a programming perspective is as an extension of the basic input/output functionality that IBM provided in ROM, rather than as a full operating system.
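
For instance, here is a small fragment of my own (not from the article) showing two other commonly documented int 0x21 services — reading a single keystroke and then terminating with a status code:

; wait for a single keypress, then exit with status 0
 mov ah, 0x01    ; DOS function 01h: read one character from standard input (with echo)
 int 0x21        ; the character is returned in AL
 mov ah, 0x4C    ; DOS function 4Ch: terminate program with return code
 mov al, 0       ; AL = exit code
 int 0x21        ; does not return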

We'll dig more into the specifics in future articles, but the takeaway here is that if you want to do anything in DOS, interrupts and reference tables are the only way to do it.

Conclusion

As an introductory article, this looked at the basics of how 16-bit real-mode programming works and at the DOS programming model. While something of a dry read, it's a necessary foundation for understanding the building blocks of what is to come. In the next article, we'll look more at the DOS API and terminate-and-stay-resident programs, as well as hooking interrupts.

posted by cmn32480 on Tuesday August 30 2016, @10:29AM   Printer-friendly
from the happy-birthday-NPS dept.

The US National Park Service (NPS) has opened a new park in the vast central interior of Maine. Last Wednesday President Obama designated the Katahdin Woods and Waters National Monument on 87,000 acres of land (by comparison, Acadia National Park, located on an island off the coast of Maine, is 49,000 acres). The park land consists of what appears to be three discontiguous pieces, the largest of which borders Baxter State Park (home of Mt. Katahdin, the northern terminus of the Appalachian Trail) on its western side, and the upper reaches of the Penobscot River on the eastern side.

The park is already open to the public.

The land was donated to the US government by Roxanne Quimby, co-founder of Burt's Bees personal care products company. Quimby, a conservationist, spent decades using the proceeds from her business fortune to buy up Maine forest land; her work was controversial because she placed them off limits to loggers, snowmobilers and hunters. Quimby sold her stake in Burt's Bees to Clorox in 2007.

The new park is controversial in central Maine as well. It is a monument rather than a national park, chiefly because creating a National Park requires an act of Congress, while a national monument can be created by executive order. Obama noted, however, that Acadia National Park was originally established as a national monument as well (in 1916; it became a national park three years later).

There was, and remains, substantial local opposition to the bestowing of the land to the NPS, for a mixture of economic and emotional reasons; in particular, the land is now permanently unavailable for commercial logging, and perhaps for rights-of-way by loggers. Prices of nearby real estate may increase, making the economics more difficult for timber companies. Quimby, the donor, was controversial, as already mentioned, as was the unilateral action by Obama in designating the monument. Some fear the imposition of new air pollution controls on local paper mills. There is distrust of the NPS and fear of the emergence of a bureaucracy that will clash with local values.

But the initial harsh reaction seems to have scaled back a bit. Promises have been made to allow access to hunters, snowmobiles, and all terrain vehicles; logging access is probably another long discussion.


Original Submission

posted by cmn32480 on Tuesday August 30 2016, @08:47AM   Printer-friendly
from the turn-it-up-to-11 dept.

Arthur T Knackerbracket has found the following story:

A music service exclusive to owners of the Amazon Echo at half the price of competitors?

Amazon wants to launch a music subscription service that would be half the cost of rival services and run only on its Echo smart speaker, according to Recode.

The service would offer unlimited, ad-free music for $4 to $5 a month, half what Spotify and Apple Music charge, sources told the site. The internet retail giant would like to launch the service as early as next month but has yet to finalize deals with the major recording labels, sources told the site.

[...]

Amazon already offers an Amazon Music service as part of its $99 Amazon Prime annual subscription package, but subscribers have access to a limited catalog of music.


Original Submission

posted by cmn32480 on Tuesday August 30 2016, @05:26AM   Printer-friendly
from the this-is-heavy dept.

Hurricane Katrina and the cleanup and rebuilding of the city in the aftermath have improved soil quality in New Orleans by reducing levels of lead:

Mielke says, prior to Hurricane Katrina, 64 percent of the children living in neighborhoods identified as high-lead areas had blood lead levels equal to or above five micrograms per deciliter. According to the Centers for Disease Control, even a low level of lead in blood has been shown to affect IQ, academic achievement and behavior. Ten years after Katrina, Mielke says the number of children with blood lead levels five and above in high-lead areas dropped to 19 percent. The median amount of lead in the soil dropped from 280 milligrams per kilogram (i.e. ppm) pre-Katrina to 132 mg/kg after the storm.

The reasons for the decrease are threefold, says Mielke. The hurricane and levee failures flooded nearly 80 percent of the city, depositing varying depths of low lead sediment from the coastal environment. Mielke says the massive cleanup that followed also helped reduce the amount of lead dust in the air and soil, as housing interiors were cleaned out and materials covered in lead-based paint were removed or repainted. Lastly, uncontaminated soil was brought in from outside the city for new construction projects.

Spatiotemporal dynamic transformations of soil lead and children's blood lead ten years after Hurricane Katrina: New grounds for primary prevention (DOI: 10.1016/j.envint.2016.06.017) (DX)


Original Submission

posted by cmn32480 on Tuesday August 30 2016, @03:42AM   Printer-friendly
from the moar-pixels dept.

Google's VP9 codec can (sometimes) outperform H.265/HEVC at higher resolutions:

Netflix, being one of the biggest video streaming services in the world, tested how efficient various video codecs are for a given level of quality. The company discovered that the royalty-free VP9 codec developed primarily by Google is almost as efficient as HEVC, and can sometimes be even better at resolutions of 1080p and higher.

[...] Both HEVC and VP9 promise about 50% bitrate savings for the same quality compared to h.264, but Netflix wanted to test for itself to see if this is true. Netflix sampled 5,000 12-second clips from its catalog, which includes a wide range of genres and signal characteristics. With three codecs, two configurations, three resolutions (480p, 720p and 1080p), and eight quality levels per configuration-resolution pair, the company had more than 200 million encoded frames. Netflix applied six quality metrics: PSNR, PSNRMSE, SSIM, MS-SSIM, VIF and VMAF. This resulted in more than half a million bitrate-quality curves. Netflix's unused cloud-based encoding infrastructure allowed the company to complete this large test in only a few weeks.

The company learned that previous research showing up to 50% bitrate savings for both HEVC and VP9 compared to h.264 turned out to be true. HEVC's x265 implementation outperformed VP9's libvpx for most resolutions and quality metrics. However, at the 1080p resolution, the difference was either much smaller (in HEVC's favor), or, in some cases, VP9 even beat HEVC in bitrate savings. The fact that VP9 performs better at 1080p or higher is not a major surprise, considering VP9 was optimized for resolutions beyond HD. Google is currently using it for YouTube, where all videos are encoded in VP9.

As the article notes, new codecs are coming. Here's a little more about VP10.


Original Submission

posted by takyon on Tuesday August 30 2016, @02:05AM   Printer-friendly
from the too-cheap-to-meter dept.

Common Dreams reports:

The public cost of cleaning up the 2011 Fukushima Daiichi nuclear plant disaster topped ¥4.2 trillion (roughly [$41] billion) as of March, and is expected to keep climbing, the Japan Times reported [August 28].

That includes costs for radioactive decontamination and compensation payments. Tokyo Electric Power Company (TEPCO) will sell off its shares to eventually pay back the cost of decontamination and waste disposal, but the Environment Ministry expects that the overall price of those activities could exceed what TEPCO would get for its shares.

Meanwhile, the taxpayer burden is expected to increase and TEPCO is asking for additional help from the government.

[...] Problems still persist at the nuclear plant, most notably with the highly contaminated water being stored in tanks at the site. [...] "The situation with contaminated water at the site is a ticking time bomb and they don't seem to know what they can do--other than to construct more tanks", [said Aileen Mioko-Smith, an anti-nuclear activist with the group Green Action Japan].

takyon: ¥4.2 trillion is approximately $41 billion at today's exchange rates, not $628 billion. You can reach the author of the Common Dreams article, Nadia Prupis, by the email or Twitter account listed on this page.


Original Submission

posted by takyon on Tuesday August 30 2016, @12:42AM   Printer-friendly
from the sun-is-waiting dept.

The Price of Solar Is Declining to Unprecedented Lows: Despite already low costs, the installed price of solar fell by 5 to 12 percent in 2015

The installed price of solar energy has declined significantly in recent years as policy and market forces have driven more and more solar installations.

Now, the latest data show that the continued decrease in solar prices is unlikely to slow down anytime soon, with total installed prices dropping by 5 percent for rooftop residential systems, and 12 percent for larger utility-scale solar farms. With solar already achieving record-low prices, the cost decline observed in 2015 indicates that the coming years will likely see utility-scale solar become cost competitive with conventional forms of electricity generation.  

A full analysis of the ongoing decline in solar prices can be found in two separate Lawrence Berkeley National Laboratory Reports: Tracking the Sun IX focuses on installed pricing trends in the distributed rooftop solar market while Utility-Scale Solar 2015 focuses on large-scale solar farms that sell bulk power to the grid.

[...] The installed cost includes everything needed to get a solar power system up and running: the panels, the power electronics, the mounting hardware, and the installation itself. The continued decline in total installed cost is noteworthy considering the fact that the price of the solar panels (or modules) themselves has remained relatively flat since 2012. This means that the decline in installed cost observed since 2012 was largely caused by a decline in the cost of the inverters that convert the DC power produced by solar panels to AC power for the grid and other "soft" costs such as customer acquisition, system design, installation, and permitting.

[...] Going forward, the declining price of solar across all categories demonstrated by the latest Lawrence Berkeley National Laboratory reports coupled with the extension of the federal renewable energy investment tax credit through 2019 should drive a continued expansion of the U.S. solar market and even more favorable economics in the next few years. It will certainly be interesting to see what kind of market dynamic develops as solar approaches the tipping point where it becomes more economical than conventional forms of electricity generation.


Original Submission

posted by takyon on Monday August 29 2016, @11:15PM   Printer-friendly
from the can't-go-on-forever dept.

Even though it's not tech related, one of my favorite actors has passed...

US actor Gene Wilder, remembered by many for his lead role in Willy Wonka & the Chocolate Factory, has died at the age of 83, his family confirmed. The comic actor also starred in classic films such as The Producers, Blazing Saddles and Young Frankenstein. Mr Wilder frequently collaborated with writer and director Mel Brooks as well as stand-up comedian Richard Pryor. The two-time Oscar-nominated actor was diagnosed with non-Hodgkins lymphoma in 1989. Mr Wilder's nephew confirmed the actor died on Sunday in Stamford, Connecticut, due to complications from Alzheimer's disease.

Actor, director, screenwriter, and author Gene Wilder is no more:

Actor and writer Gene Wilder, who brought his signature manic energy to films such as The Producers, Blazing Saddles, Young Frankenstein and the role that forever ensconced him in the collective memory of a generation of children, Willy Wonka & the Chocolate Factory, has died. He was 83. Wilder died Monday at his home in Stamford, Conn., of complications from Alzheimer's disease, according to a statement from his nephew Jordan Walker-Pearlman.

"The decision to wait until this time to disclose his condition wasn't vanity, but more so that the countless young children that would smile or call out to him 'there's Willy Wonka,' would not have to be then exposed to an adult referencing illness or trouble and causing delight to travel to worry, disappointment or confusion," the statement read. "He simply couldn't bear the idea of one less smile in the world."

IMDB and Wikipedia.


Original Submission #1 | Original Submission #2