NASA says the Webb Space Telescope's Near Infrared Imager and Slitless Spectrograph is currently unavailable for science operations following a software glitch earlier this month.
In a release published yesterday, the agency stated that the issue started on January 15, when a communications delay within the instrument caused its flight software to time out. Flight software is a crucial aspect of any instrument operating in space, as it manages a whole suite of operations on a given spacecraft, including its orientation, communications, data collection, and thermal control.
[...] There have also been some software hiccups. In August, the telescope's Mid-Infrared Instrument (or MIRI) had a software glitch that paused its operations through November. And in December, there was an issue with the telescope's attitude control, which manages where the telescope is pointing. The glitch put the telescope into safe mode multiple times last month.
[...] Webb has done some tremendous work so far and will continue to illuminate the most ancient and murky regions of the cosmos. You can check out some of what's on the docket, along with other astronomy plans for the year, here.
The project, in concert with US government agency DARPA, aims to develop a pioneering propulsion system for space travel as soon as 2027:
The project is intended to develop a pioneering propulsion system for space travel far different from the chemical systems prevalent since the modern era of rocketry dawned almost a century ago.
[...] Using current technology, Nasa says, the 300m-mile journey to Mars would take about seven months. Engineers do not yet know how much time could be shaved off using nuclear technology, but Bill Nelson, the Nasa administrator, said it would allow spacecraft, and humans, to travel in deep space at record speed.
[...] Using low thrust efficiently, nuclear electric propulsion systems accelerate spacecraft for extended periods and can propel a Mars mission for a fraction of the propellant of high-thrust systems.
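The propellant savings follow directly from the Tsiolkovsky rocket equation. As a rough illustration (the delta-v and specific-impulse figures below are assumed round numbers for the sketch, not NASA's mission values), a higher-Isp nuclear electric system needs a much smaller fraction of its mass as propellant for the same velocity change:

```python
import math

G0 = 9.80665  # standard gravity, m/s^2

def propellant_fraction(delta_v_ms: float, isp_s: float) -> float:
    """Propellant mass as a fraction of initial mass, from the
    Tsiolkovsky rocket equation: dv = Isp * g0 * ln(m0 / m_dry)."""
    return 1.0 - math.exp(-delta_v_ms / (isp_s * G0))

# Illustrative delta-v budget for a Mars transfer (assumed round figure).
dv = 6_000.0  # m/s

chem = propellant_fraction(dv, 450.0)   # chemical engine, Isp ~450 s
nep = propellant_fraction(dv, 2500.0)   # nuclear electric, Isp ~2500 s

print(f"chemical:         {chem:.0%} of initial mass is propellant")
print(f"nuclear electric: {nep:.0%} of initial mass is propellant")
```

With these assumed numbers the chemical stage is roughly three-quarters propellant while the nuclear electric stage is closer to one-fifth, which is the "fraction of the propellant" trade the excerpt describes; the cost is that the thrust is low, so the acceleration must be sustained for a long time.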
The contractors working on the Federal Aviation Administration's NOTAM system apparently deleted files by accident, leading to the delays and cancellations of thousands of US flights. If you'll recall, the FAA paused all domestic departures in the US on the morning of January 11th, because its NOTAM or Notice to Air Missions system had failed. NOTAMs typically contain important information for pilots, including warnings for potential hazards along a flight's route, flight restrictions and runway closures.
[...] The agency later reported that the system failed after "personnel who failed to follow procedures" damaged certain files. Now, it has shared more details as part of the preliminary findings of an ongoing investigation. Apparently, its contractors were synchronizing a main and a back-up database when they "unintentionally deleted files" that turned out to be necessary to keep the alert system running. It also reiterated what it said in the past that it has "so far found no evidence of a cyberattack or malicious intent."
The U.S. has just given the green light to its first-ever small modular nuclear design, a promising step forward for a power source that remains controversial among some climate advocates but is experiencing a popular renaissance.
The Nuclear Regulatory Commission approved the design, which was published Thursday in the Federal Register, from NuScale, an Oregon-based reactor company. The publication of the design in the Register allows utilities to select this type of reactor when applying for a license to build a new nuclear facility. The approved design is for a reactor about one-third the size of a conventional one, with each module able to produce around 50 megawatts of power.
[...] Just because a design is on the books doesn't mean that it's smooth sailing for the industry or that all our grids are going to be powered by carbon-free nuclear electricity in a few years. NuScale is currently working on a six-module demonstration plant in Idaho that will be fully operational by 2030; the company said this month that its estimates for the price per megawatt hour of the demo plant had jumped by more than 50% since its last estimates, in an uncomfortable echo of ballooning costs associated with other traditional nuclear projects. Small modular reactors still produce nuclear waste, which some environmentalists say is a concern that can't be overlooked as the industry develops.
US Regulators Certify First Small Nuclear Reactor Design
First Major Modular Nuclear Project Having Difficulty Retaining Backers
US Gives First-Ever OK for Small Commercial Nuclear Reactor
The US Government Just Invested Big in Small-Scale Nuclear Power
Safer Nuclear Reactors on the Horizon
A new study suggests that the Earth's inner core recently stopped spinning relative to the planet's surface and may be reversing its direction. The changing spin may be behind slight fluctuations in the length of a day from year to year.
In the January 2023 issue of the science journal Nature Geoscience, researchers Xiaodong Song and Yi Yang of China's Peking University claimed the planet's inner core stopped spinning relative to the other layers around 2009. The Earth's innermost layer, about 3,100 miles below our feet, made of hot iron and about the size of Pluto, can rotate independently of the mantle and crust because of a liquid outer core that surrounds it.
The researchers said the inner core started reversing its spin after stopping and that this process repeats about every 35 years. The reversal last occurred in the early 1970s; the next could be in the mid-2040s.
[...] University of Southern California seismologist John Vidale disagrees. He thinks the inner core oscillates every six years based on data from nuclear explosions from the late 1960s to the early 1970s. Other geophysicists have numerous theories, but Vidale doesn't believe any models adequately explain all the data.
[...] However, he also doubts the accuracy of all the proposed theories. Seismic data only provides limited information about what's happening inside the Earth. Other theories postulate that the inner core may have another core inside it. So scientists have yet to reach a consensus on what happens in the inner Earth.
Yang, Y., Song, X. Multidecadal variation of the Earth's inner-core rotation. Nat. Geosci. (2023). https://doi.org/10.1038/s41561-022-01112-z
The US National Security Agency (NSA) has published a guidance document for system administrators to help them mitigate potential security issues as their organizations transition to Internet Protocol version 6 (IPv6).
The prosaically named "IPv6 Security Guidance" [PDF] was compiled for admins inside the Department of Defense (DoD), but is likely to prove useful as a quick reference for anyone managing the transition from IPv4 to IPv6, which could turn out to be a more drawn-out experience than was originally anticipated.
"The Department of Defense will incrementally transition from IPv4 to IPv6 over the next few years and many DoD networks will be dual-stacked," NSA Cybersecurity Technical Director Neal Ziring said in a statement accompanying the publication of the document.
"It's important that DoD system admins use this guidance to identify and mitigate potential security issues as they roll out IPv6 support in their networks."
One of the recommendations is pretty basic: education. Successfully securing an IPv6 network requires, at a minimum, a fundamental knowledge of the differences between the IPv4 and IPv6 protocols and how they operate, the NSA says, so all network administrators should receive proper training.
It advises that security methods used in IPv4 networks will largely also be used with IPv6, but with adaptations to address where there are differences.
Security issues associated with an IPv6 implementation will generally surface in networks that are either new to IPv6 or in early phases of the transition. This is because such networks will lack mature IPv6 configurations, and their administrators are likely to lack IPv6 experience.
Organizations running both IPv4 and IPv6 simultaneously will have additional security risks, with further countermeasures needed to mitigate these due to the increased attack surface of having both IPv4 and IPv6, the document warns.
There are no massive revelations from the NSA, just advice that many admins will already be familiar with, such as the recommendation to assign IP addresses on the network via a DHCPv6 server instead of relying on stateless address auto-configuration (SLAAC).
The latter uses a self-assigned IPv6 address that incorporates the fixed MAC address from the NIC, leading to concerns that data traffic could be linked to a specific device and potentially an individual associated with that equipment. Whether this is a major concern to anyone outside of defense or government is another matter, of course.
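The linkability concern is easy to demonstrate. Below is a minimal sketch of the modified EUI-64 scheme (RFC 4291) that classic SLAAC uses to build an interface identifier from a MAC address, using the IPv6 documentation prefix and a made-up MAC; note that many modern stacks now default to randomized privacy addresses (RFC 4941) precisely to avoid this:

```python
import ipaddress

def slaac_eui64(prefix: str, mac: str) -> ipaddress.IPv6Address:
    """Derive the modified-EUI-64 interface identifier that classic
    SLAAC embeds in an IPv6 address: flip the universal/local bit of
    the first MAC octet and splice 0xFFFE into the middle."""
    octets = [int(b, 16) for b in mac.split(":")]
    octets[0] ^= 0x02                       # toggle the universal/local bit
    eui64 = octets[:3] + [0xFF, 0xFE] + octets[3:]
    iid = int.from_bytes(bytes(eui64), "big")
    net = ipaddress.IPv6Network(prefix)
    return net[iid]                         # prefix + interface identifier

# Documentation prefix plus an example MAC address.
addr = slaac_eui64("2001:db8::/64", "00:1a:2b:3c:4d:5e")
print(addr)  # 2001:db8::21a:2bff:fe3c:4d5e
```

The MAC octets `1a:2b ... 3c:4d:5e` are plainly visible in the resulting address, which is why traffic from such an address can be tied to one NIC no matter which network it roams to.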
The NSA also recommends avoiding the use of IPv6 tunneling, often used to transport IPv6 packets within IPv4 packets across existing network infrastructure, again to reduce the potential attack surface and lessen complexity. It advises that tunneling protocols may be allowed if they are required during a transition, but they should be limited to approved systems where their usage is well understood and where they are explicitly configured.
Developers across government and industry should commit to using memory safe languages for new products and tools, and identify the most critical libraries and packages to shift to memory safe languages, according to a study from Consumer Reports.
The US nonprofit, which is known for testing consumer products, asked what steps can be taken to help usher in "memory safe" languages, like Rust, over options such as C and C++. Consumer Reports said it wanted to address "industry-wide threats that cannot be solved through user behavior or even consumer choice" and it identified "memory unsafety" as one such issue.
The report, Future of Memory Safety, looks at a range of issues, including challenges in building memory safe language adoption within universities, levels of distrust for memory safe languages, introducing memory safe languages to code bases written in other languages, and also incentives and public accountability.
During the past two years, more and more projects have started gradually adopting Rust for codebases written in C and C++ to make code more memory safe. Among them are initiatives from Meta, Google's Android Open Source Project, the C++-dominated Chromium project (sort of), and the Linux kernel.
In 2019, Microsoft revealed that 70% of security bugs it had fixed during the past 12 years were memory safety issues. The figure was high because Windows was written mostly in C and C++. Since then, the National Security Agency (NSA) has recommended developers make a strategic shift away from C++ in favor of C#, Java, Ruby, Rust, and Swift.
The shift towards memory safe languages -- most notably, but not only, to Rust -- has even prompted C++ creator Bjarne Stroustrup and his peers to devise a plan for the "Safety of C++". Developers like C++ for its performance and it still dominates embedded systems. C++ remains far more widely used than Rust, but both are popular languages for systems programming.
[...] The report highlights that computer science professors have a "golden opportunity here to explain the dangers" and could, for example, increase the weight of memory safety mistakes in assessing grades. But it adds that teaching parts of some courses in Rust could add "inessential complexity" and that there's a perception Rust is harder to learn, while C seems a safe bet for employability in future for many students.
[...] To overcome programmers' belief that memory safe languages are more difficult, someone could explain that these languages "force programmers to think through important concepts that ultimately improve the safety and performance of their code," the report notes.
Are you or your employer using or considering memory safe languages, and if so what is your opinion of them in your particular sphere?
The worst procrastinators probably won't be able to read this story. It'll remind them of what they're trying to avoid, psychologist Piers Steel says.
[...] In a study of thousands of university students, scientists linked procrastination to a panoply of poor outcomes, including depression, anxiety and even disabling arm pain. "I was surprised when I saw that one," says Fred Johansson, a clinical psychologist at Sophiahemmet University in Stockholm. His team reported the results January 4 in JAMA Network Open.
The study is one of the largest yet to tackle procrastination's ties to health. Its results echo findings from earlier studies that have gone largely ignored, says Fuschia Sirois, a behavioral scientist at Durham University in England, who was not involved with the new research.
For years, scientists didn't seem to view procrastination as something serious, she says. The new study could change that. "It's that kind of big splash that's ... going to get attention," Sirois says. "I'm hoping that it will raise awareness of the physical health consequences of procrastination."
It can be hard to tell if certain health problems make people more likely to procrastinate — or the other way around, Johansson says. (It may be a bit of both.) And controlled experiments on procrastination aren't easy to do: You can't just tell a study participant to become a procrastinator and wait and see if their health changes, he says.
In a new study, researchers have tied procrastination to a range of potential health issues and other negative outcomes, including:
- Disabling arm pain
- Poor sleep quality
- Physical inactivity
- Economic difficulties
[...] The study was observational, so the team can't say for sure that procrastination causes poor health. But results from other researchers also seem to point in this direction. A 2021 study tied procrastinating at bedtime to depression. And a 2015 study from Sirois' lab linked procrastinating to poor heart health.
Johansson F, Rozental A, Edlund K, et al. Associations Between Procrastination and Subsequent Health Outcomes Among University Students in Sweden. JAMA Netw Open. 2023;6(1):e2249346. doi:10.1001/jamanetworkopen.2022.49346
On Tuesday, the US Department of Justice announced that it would be joining eight states in filing a civil antitrust suit against Google over its monopoly on digital advertising. The lawsuit claims that Google abuses its power to put website publishers and advertisers at a disadvantage if they "dare to use" competing advertising technology products.
"Google has used anticompetitive, exclusionary, and unlawful conduct to eliminate or severely diminish any threat to its dominance over digital advertising technologies," said Attorney General Merrick B. Garland in a statement. "No matter the industry and no matter the company, the Justice Department will vigorously enforce our antitrust laws to protect consumers, safeguard competition, and ensure economic fairness and opportunity for all."
The suit alleges that Google has been engaging in anticompetitive behavior for years. Some of that alleged anticompetitive conduct includes acquiring competitors to obtain their digital ad tech, forcing publishers to adopt its tools, distorting auction competition by limiting real-time bidding on publisher inventory, and manipulating auction mechanics.
"The complaint filed today alleges a pervasive and systemic pattern of misconduct through which Google sought to consolidate market power and stave off free-market competition," said Deputy Attorney General Lisa O. Monaco.
According to the DOJ and the eight Attorneys General of California, Colorado, Connecticut, New Jersey, New York, Rhode Island, Tennessee, and Virginia, Google violated Sections 1 and 2 of the Sherman Act, which concern contracts in restraint of trade and monopolization.
Rocket Lab has completed its maiden mission from its new launch site in the U.S., marking a big step forward for the company as it seeks to better compete with the likes of SpaceX.
[...] The spaceflight company used its trusty Electron rocket to deploy three satellites for Hawkeye 360, a radio frequency geospatial analytics company, to an orbit of 342 miles (550 kilometers) above Earth.
It means Rocket Lab has now launched 33 Electron missions from three different pads in two countries — the U.S. and New Zealand — deploying a total of 155 satellites to orbit.
Rocket Lab livestreamed the mission, which showed the early stages of the Electron's flight. You can watch the launch below. There was, however, a longer than usual — and rather tense — wait for confirmation of the mission's success. The delay was put down to a ground station malfunction that temporarily prevented communications between the satellite and the team on the ground. Thankfully, around 90 minutes after launch, a relieved team was able to confirm that everything had gone to plan.
Now with two launch complexes in two countries, the SpaceX rival says it will be able to support more than 130 launches annually for government and commercial satellite operators.
Besides expanding its satellite-launch service using the Electron, Rocket Lab is also building its next-generation rocket, the Neutron, which will also launch from the Mid-Atlantic Regional Spaceport, with its first test launch targeted for 2024.
Earlier this month, there was a widely reported story regarding a large batch of AMD Radeon RX 6000-series graphics cards (all Navi 21 models) that had mysteriously but catastrophically died. These were some of the best graphics cards, up until the latest generation parts started launching. German electronics repair shop KrisFix.de received 61 broken or malfunctioning RX 6900 / 6800 family graphics cards and found 48 of them suffered from physically cracked GPU silicon. The mystery regarding these ruined GPUs may now have been solved, with the likely culprits being the terrible twosome of crypto mining and high humidity storage.
[...] According to KrisFix, these cards were likely stored for a few weeks or months since GPU-based cryptomining became uneconomical. The problem is that they seem to have been stored in an environment with inappropriate temperatures / humidity levels. The experienced electronics repairer says he has seen this exact symptom of chips cracking and popping up from the PCB after being used in the wake of this kind of inappropriate storage.
[...] Readers need to be wary of the used GPU market: the post-crypto world has been a source of both great bargains and ticking time bombs with regard to product durability. Miners will go to extraordinary lengths to clean up and sell on their old GPUs, but thankfully we haven't heard too many tales like this one from KrisFix.
Infamous ex-pharmaceutical executive Martin Shkreli is yet again in trouble with the Federal Trade Commission, which announced today that the convicted fraudster has failed to cooperate with the commission's investigation into whether he violated his lifetime ban from the pharmaceutical industry by starting a company last year called "Druglike, Inc."
At the center of the dispute is whether Shkreli's co-founding of Druglike runs afoul of his lifetime ban from the pharmaceutical industry, which was in response to Shkreli's infamous move to raise the price of the cheap, life-saving anti-parasitic drug, Daraprim, from $17.50 a pill to $750 a pill in 2015.
The FTC also noted in its court filing that Shkreli has so far failed to pay any of the $64.6 million in disgorgement he was ordered to pay alongside his lifetime ban.
Martin Shkreli Launches Blockchain-Based Drug Discovery Platform
Martin Shkreli Accused of Running Business From Prison With a Smuggled Smartphone
Sobbing Martin Shkreli Sentenced to 7 Years in Prison for Defrauding Investors
Martin Shkreli's $5 Million Bail Revoked for Facebook Post Seeking Hillary Clinton's Hair
Martin Shkreli Lists Unreleased Wu-Tang Clan Album on eBay
Martin Shkreli Convicted of Securities Fraud Charges, Optimistic About Sentencing
Martin Shkreli Points Fingers at Other Pharmaceutical Companies
U.S. Hospitals Band Together to Form Civica Rx, a Non-Profit Pharmaceutical Company
FDA Has Named Names of Pharma Companies Blocking Cheaper Generics [Updated]
EpiPen Maker is Facing Shareholder Backlash
Mylan Overcharged U.S. Government on EpiPens
Drug Firm Offers $1 Version of $750 Turing Pharmaceuticals Pill
Scientists have made progress in discovering how to use ripples in space-time known as gravitational waves to peer back to the beginning of everything we know. The researchers say they can better understand the state of the cosmos shortly after the Big Bang by learning how these ripples in the fabric of the universe flow through planets and the gas between the galaxies.
"We can't see the early universe directly, but maybe we can see it indirectly if we look at how gravitational waves from that time have affected matter and radiation that we can observe today," said Deepen Garg, lead author of a paper reporting the results in the Journal of Cosmology and Astroparticle Physics. Garg is a graduate student in the Princeton Program in Plasma Physics, which is based at the U.S. Department of Energy's (DOE) Princeton Plasma Physics Laboratory (PPPL).
Garg and his advisor Ilya Dodin, who is affiliated with both Princeton University and PPPL, adapted this technique from their research into fusion energy, the process powering the sun and stars that scientists are developing to create electricity on Earth without emitting greenhouse gases or producing long-lived radioactive waste. Fusion scientists calculate how electromagnetic waves move through plasma, the soup of electrons and atomic nuclei that fuels fusion facilities known as tokamaks and stellarators.
It turns out that this process resembles the movement of gravitational waves through matter. "We basically put plasma wave machinery to work on a gravitational wave problem," Garg said.
[...] The scientists now plan to use the technique to analyze data in the near future. "We have some formulas now, but getting meaningful results will take more work," Garg said.
Deepen Garg and I.Y. Dodin, Gravitational wave modes in matter, J Cosmol Astropart P, 10 August 2022. (DOI: 10.1088/1475-7516/2022/08/017)
The lower Niagara River (below/north of Niagara Falls and the Rapids) has been a favored location for trade and smuggling for centuries. It seems to have taken a new turn recently, per this story, https://www.forbes.com/sites/thomasbrewster/2023/01/17/drones-carry-mdma-into-america-via-niagara-falls/?sh=7b39ea602c83 also covered by BuffaloNews.com and other outlets.
The neighbors thought it odd that no one seemed to live in the capacious, quintessential American family house in Lewiston, New York, a small town that sits on the Niagara River, just east of Ontario, Canada. Whoever owned the $650,000 property appeared neither to reside there nor care about upkeep, its lawn unmowed to the point of being "unmanageable," locals later told police. Even odder, they told the cops, were the monthly arrivals of individuals driving expensive-looking cars, only for the visitors to leave a few days later.
Then in the early hours of September 21 last year, the house became the subject of a police raid, according to a recently unsealed search warrant obtained by Forbes. In the middle of the night, using a surveillance tool that could "recognize drone signatures, map their flight path and identify starting and stopping points via GPS," border patrol watched an unmanned aerial vehicle flying over the Niagara River and into the house's garden, according to the warrant. When the cops arrived, the pilot and two other individuals tried to flee, but were caught and taken in for questioning. The police found that attached to the drone was a package of MDMA with an approximate street value of $110,000. A subsequent search of the house recovered multiple webcams watching over entrances and exits, a number of commercial drones and paracord, a kind of rope originally designed for parachutes.
The case reveals the government's investment in drone surveillance, in particular at the border. "The border entities are much better at it," said Mary-Lou Smulders, chief marketing officer at drone detection contractor Dedrone, a provider to various U.S. federal government agencies.
The investigators in the Niagara River probe likely used radio frequency signals to tag and track the unmanned flying vehicle, Smulders said. That involves setting up sensors across a given area and triangulating the drone's signals to get a relatively precise location. There are other ways to monitor drones, however, from radar to listening for the machines' noises using arrays of microphones.
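As a toy illustration of the triangulation principle (real systems such as Dedrone's combine many sensors with time-difference and signal-strength measurements and least-squares fits; the geometry below is invented), three sensors with range estimates are enough to pin down a 2-D position:

```python
import math

def trilaterate(p1, p2, p3, r1, r2, r3):
    """Locate a transmitter in 2-D from three sensors at known
    positions p_i with estimated ranges r_i. Subtracting the circle
    equations pairwise cancels the quadratic terms, leaving a 2x2
    linear system in (x, y)."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("sensors are collinear; position is ambiguous")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y

# Three sensors and a drone at (3, 4); ranges taken from exact geometry.
sensors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
drone = (3.0, 4.0)
ranges = [math.dist(s, drone) for s in sensors]
print(trilaterate(*sensors, *ranges))  # ~(3.0, 4.0)
```

In practice the range estimates are noisy, so deployed systems solve an overdetermined version of this with more sensors and a least-squares fit rather than an exact 2x2 solve.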
A Google search for: smuggling lewiston ny
turns up human trafficking, un-taxed cigarettes, and a variety of drugs -- all recently, by boat. Going back even further to US Prohibition, plenty of alcohol came in by this route. Civil War era? Last stop on the Underground Railroad for slaves escaping to Canada. Pre-US Revolution? Fur traders -- some history here, https://historiclewiston.org/history/
Intel introduced the 8086 microprocessor in 1978 and it had a huge influence on computing. I'm reverse-engineering the 8086 by examining the circuitry on its silicon die and in this blog post I take a look at how conditional jumps are implemented. Conditional jumps are an important part of any instruction set, changing the flow of execution based on a condition. Although this instruction may seem simple, it involves many parts of the CPU: the 8086 uses microcode along with special-purpose condition logic.
Most people think of machine instructions as the basic steps that a computer performs. However, many processors (including the 8086) have another layer of software underneath: microcode. One of the hardest parts of computer design is creating the control logic that directs the processor for each step of an instruction. The straightforward approach is to build a circuit from flip-flops and gates that moves through the various steps and generates the control signals. However, this circuitry is complicated, error-prone, and hard to design.
The alternative is microcode: instead of building the control circuitry from complex logic gates, the control logic is largely replaced with code. To execute a machine instruction, the computer internally executes several simpler micro-instructions, specified by the microcode. In other words, microcode forms another layer between the machine instructions and the hardware. The main advantage of microcode is that it turns design of control circuitry into a programming task instead of a difficult logic design task.
The 8086 uses a hybrid approach: although the 8086 uses microcode, much of the instruction functionality is implemented with gate logic. This approach removed duplication from the microcode and kept the microcode small enough for 1978 technology. In a sense, the microcode is parameterized. For instance, the microcode can specify a generic Arithmetic/Logic Unit (ALU) operation, and the gate logic determines from the instruction which ALU (Arithmetic/Logic Unit) operation to perform. More relevant to this blog post, the microcode can specify a generic conditional test and the gate logic determines which condition to use. Although this made the 8086's gate logic more complicated, the tradeoff was worthwhile.
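The parameterized scheme can be sketched in a few lines of Python. On the 8086, the conditional-jump opcodes occupy 0x70-0x7F: bits 3-1 select one of eight base flag tests, and the low bit inverts the result, so sixteen jump instructions share eight pieces of condition logic:

```python
def jcc_taken(opcode: int, cf: bool, zf: bool, sf: bool, of: bool, pf: bool) -> bool:
    """Model the 8086's conditional-jump condition logic: bits 3-1 of
    the opcode select one of eight base conditions, and bit 0 inverts
    the result, so 16 jumps need only 8 flag tests."""
    assert 0x70 <= opcode <= 0x7F
    base = {
        0: of,                 # JO  / JNO
        1: cf,                 # JB  / JNB (unsigned below)
        2: zf,                 # JE  / JNE
        3: cf or zf,           # JBE / JA
        4: sf,                 # JS  / JNS
        5: pf,                 # JP  / JNP
        6: sf != of,           # JL  / JGE (signed less)
        7: (sf != of) or zf,   # JLE / JG
    }[(opcode >> 1) & 0x7]
    return base != bool(opcode & 1)  # the low opcode bit inverts the test

# JE (0x74) is taken when ZF is set; JNE (0x75) is the same test inverted.
print(jcc_taken(0x74, cf=False, zf=True, sf=False, of=False, pf=False))  # True
print(jcc_taken(0x75, cf=False, zf=True, sf=False, of=False, pf=False))  # False
```

The hardware does the same thing with gates rather than a lookup: the microcode asks "is the condition true?" generically, and dedicated logic decodes the condition field and the inverting bit from the instruction register.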