
Idiosyncratic use of punctuation - which of these annoys you the most?

  • Declarations and assignments that end with }; (C, C++, JavaScript, etc.)
  • (Parenthesis (pile-ups (at (the (end (of (Lisp (code))))))))
  • Syntactically significant whitespace (Python, Ruby, Haskell...)
  • Perl sigils: @array, $array[index], %hash, $hash{key}
  • Unnecessary sigils, like $variable in PHP
  • macro!() in Rust
  • Do you have any idea how much I spent on this Space Cadet keyboard, you insensitive clod?!
  • Something even worse...

Comments:34 | Votes:77

posted by janrinok on Tuesday September 10, @10:56PM

Core drilling is tricky. Getting a 6 GHz signal through concrete is now easier.

One issue in getting office buildings networked that you don't typically face at home is concrete—and lots of it. Concrete walls are an average of 8 inches thick inside most commercial real estate.

Keeping a network running through them is not merely a matter of pulling cable. Not everybody has the knowledge or tools to punch through that kind of wall, and even those who do can't simply put a hole in something that might be load-bearing or part of a fire control system without imaging, permits, and contractors. The frequency bands that can penetrate these walls, like those used for 3G, are being phased out, while the bands that provide enough throughput for modern systems, like 5G, can't make it through.

That's what WaveCore, from Airvine Scientific, aims to fix, and I can't help but find it fascinating after originally seeing it on The Register. The company had previously taken on lesser solid obstructions, like plaster and thick glass, with its WaveTunnel. Two WaveCore units, one on either side of a wall (or on different floors), can push through a stated 12 inches of concrete. In its in-house testing, Airvine reports pushing just under 4 Gbps through 12 inches of garage concrete, and the link can bend around corners, even at 90 degrees. Your particular cement and aggregate combinations may vary, of course.

The spec sheet shows that a 6 GHz radio is the part that, through "beam steering," blasts through concrete, with a 2.4 GHz radio for control functions. Power comes over PoE or a barrel connector, and there's RJ45 Ethernet at 1, 2.5, 5, and 10 Gbps.
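
For a rough sense of the physics involved, here is a minimal link-budget sketch in Python. Every number in it (transmit power, antenna gain, concrete attenuation, receiver sensitivity) is an illustrative assumption, not a figure from Airvine's spec sheet; published measurements of concrete attenuation at 6 GHz vary widely with mix, moisture, and rebar.

    import math

    def fspl_db(distance_m: float, freq_hz: float) -> float:
        """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
        c = 3.0e8  # speed of light, m/s
        return 20 * math.log10(4 * math.pi * distance_m * freq_hz / c)

    # Illustrative assumptions only -- not Airvine's published figures.
    TX_POWER_DBM = 20.0              # transmit power
    ANTENNA_GAIN_DB = 15.0           # combined gain, e.g. from beam steering
    CONCRETE_LOSS_DB_PER_INCH = 5.0  # assumed 6 GHz attenuation; varies with mix
    RX_SENSITIVITY_DBM = -70.0       # assumed receiver sensitivity

    for inches in (8, 12):
        loss_db = fspl_db(inches * 0.0254, 6.0e9) + inches * CONCRETE_LOSS_DB_PER_INCH
        rx_dbm = TX_POWER_DBM + ANTENNA_GAIN_DB - loss_db
        verdict = "link closes" if rx_dbm >= RX_SENSITIVITY_DBM else "link fails"
        print(f"{inches} in of concrete: rx ~ {rx_dbm:.1f} dBm ({verdict})")

Under these assumptions the 8-inch wall closes comfortably and the 12-inch wall closes with only a few dB of margin, which is exactly the regime where purpose-built radios and beam steering earn their keep.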

6 GHz concrete fidelity (Con-Fi? Crete-Fi?) is just one of the slightly uncommon connections that may or may not be making their way into office spaces soon. LiFi, standardized as 802.11bb, uses light to provide intentionally limited-reach connectivity, whether for security restrictions or radio frequency safety. And Wi-Fi 7, certified earlier this year, aims to multiply data rates by bonding connections across the various bands already in place.


Original Submission

posted by janrinok on Tuesday September 10, @06:13PM

http://www.theradiohistorian.org/fm/fm.html

Before FM, There was APEX

In the early and mid-1930s, radio communication was confined to the Low [or Long] Waves (100-500 kHz), Medium Waves (500-1500 kHz), and the Short Waves (1,500-30,000 kHz). The frequencies above that, referred to as the "ultra-high frequencies", were truly the "Wild West" of radio. It was a place for experimentation and a possible home to future radio services. Commercial broadcasting to the public took place entirely on the standard broadcast band (540-1600 kHz), but it was affected by a number of defects that annoyed the public – natural and man-made static, local and skip interference, atmospheric fading, and limited fidelity. Starting about 1932, a number of brave and daring broadcasters sought permission from the FCC to conduct experiments in the Ultra-Short Waves in an attempt to find solutions to these problems. In particular, these experimental stations wanted to transmit wideband, high fidelity audio. Amplitude modulation, the only known method of transmitting audio at the time, was the method used on these so-called "Apex" stations. Experimental licenses were issued for up to 1,000 watts on frequencies at 25-26 MHz and 42 MHz. By 1939, these Apex stations were operating in 34 U.S. cities in 22 states. They suffered less skip interference than standard AM stations, but static was still a problem.

At the same time that these Apex broadcasters were gaining a foothold in the upper frontiers of the radio spectrum, an entirely new type of radio service was also being demonstrated - one that was destined to cause Apex AM to become obsolete. That service was called Frequency Modulation, or FM.


Original Submission

posted by hubie on Tuesday September 10, @01:32PM

"We will review the data and determine the next steps for the program," says Boeing's Starliner manager:

Boeing's Starliner spacecraft sailed to a smooth landing in the New Mexico desert Friday night, an auspicious end to an otherwise disappointing three-month test flight that left the capsule's two-person crew stuck in orbit until next year.

Cushioned by airbags, the Boeing crew capsule descended under three parachutes toward an on-target landing at 10:01 pm local time Friday (12:01 am EDT Saturday) at White Sands Space Harbor, New Mexico. From the outside, the landing appeared just as it would have if the spacecraft brought home NASA astronauts Butch Wilmore and Suni Williams, who became the first people to launch on a Starliner capsule on June 5.

But Starliner's cockpit was empty as it flew back to Earth Friday night. Last month, NASA managers decided to keep Wilmore and Williams on the International Space Station (ISS) until next year after agency officials determined it was too risky for the astronauts to return to the ground on Boeing's spaceship. Instead of coming home on Starliner, Wilmore and Williams will fly back to Earth on a SpaceX Dragon spacecraft in February. NASA has incorporated the Starliner duo into the space station's long-term crew.

[...] After streaking through the atmosphere over the Pacific Ocean and Mexico, Starliner deployed three main parachutes to slow its descent, then a ring of six airbags inflated around the bottom of the spacecraft to dampen the jolt of touchdown. This was the third time a Starliner capsule has flown in space, and the second time the spacecraft fell short of achieving all of its objectives.

"I'm happy to report Starliner did really well today in the undock, deorbit, and landing sequence," said Steve Stich, manager of NASA's commercial crew program, which manages a contract worth up to $4.6 billion for Boeing to develop, test, and fly a series of Starliner crew missions to the ISS.

While officials were pleased with Starliner's landing, the celebration was tinged with disappointment.

[...] Boeing's Starliner managers insisted the ship was safe to bring the astronauts home. It might be tempting to conclude the successful landing Friday night vindicated Boeing's views on the thruster problems. However, the spacecraft's propulsion system, provided by Aerojet Rocketdyne, clearly did not work as intended during the flight. NASA had the option of bringing Wilmore and Williams back to Earth on a different, flight-proven spacecraft, so they took it.

[...] As Starliner approached the space station in June, five of 28 control thrusters on Starliner's service module failed, forcing Wilmore to take manual control as ground teams sorted out the problem. Eventually, engineers recovered four of the five thrusters, but NASA's decision makers were unable to convince themselves the same problem wouldn't reappear, or get worse, when the spacecraft departed the space station and headed for reentry and landing.

Engineers later determined the control jets lost thrust due to overheating, which can cause Teflon seals in valves to swell and deform, starving the thrusters of propellant. Telemetry data beamed back to the mission controllers from Starliner showed higher-than-expected temperatures on two of the service module thrusters during the flight back to Earth Friday night, but they continued working.

[...] The overheating thrusters are located inside four doghouse-shaped propulsion pods around the perimeter of Starliner's service module. It turns out the doghouses retain heat like a thermos—something NASA and Boeing didn't fully appreciate before this mission—and the thrusters don't have time to cool down when the spacecraft fires its control jets in rapid pulses. It might help if Boeing removes some of the insulating thermal blankets from the doghouses, Stich said.

The easiest method of resolving the problem of Starliner's overheating thrusters would be to change the rate and duration of thruster firings.

"What we would like to do is try not to change the thruster. I think that is the best path," Stich said. "There thrusters have shown resilience and have shown that they perform well, as long as we keep their temperatures down and don't fire them in a manner that causes the temperatures to go up."

There's one thing from this summer's test flight that might, counterintuitively, help NASA certify the Starliner spacecraft to begin operational flights with its next mission. Rather than staying at the space station for eight days, Starliner remained docked at the research lab for three months, half of the duration of a full-up crew rotation flight. Despite the setbacks, Stich estimated the test flight achieved about 85 to 90 percent of its objectives.

"There's a lot of learning that happens in that three months that is invaluable for an increment mission," Stich said. "So, in some ways, the mission overachieved some objectives, in terms of being there for extra time. Not having the crew onboard, obviously, there are some things that we lack in terms of Butch and Suni's test pilot expertise, and how the vehicle performed, what they saw in the cockpit. We won't have that data, but we still have the wealth of data from the spacecraft itself, so that will go toward the mission objectives and the certification."


Original Submission

posted by hubie on Tuesday September 10, @08:50AM

Arthur T Knackerbracket has processed the following story:

The multiple constellations of broadband-beaming satellites planned by Chinese companies could conceivably run the nation's "Great Firewall" content censorship system, according to think tank The Australian Strategic Policy Institute. And if they do, using the services outside China will be dangerous.

A Monday note by the Institute's senior fellow Mercedes Page notes that Chinese entities plan to launch and operate three low-Earth-orbit satellite constellations to provide terrestrial internet services. As The Register has reported, the first of 15,000-plus planned satellites launched earlier in August.

Page thinks the satellites show "China is not only securing its position in the satellite internet market but laying the groundwork for expanding its digital governance model far beyond its borders."

"Central to China's ambition is the concept of cyber sovereignty – the notion that each nation has the right to govern its digital domain," she wrote, adding that "China has used this principle to build a heavily censored surveillance system supporting the Chinese Communist Party's power, widely condemned for violating human rights."

Page also notes that satellite broadband services rely on a small number of ground stations, or gateways, and that those facilities are ideal locations to run systems that monitor, block and filter content.

[...] Page also warned "The centralized nature of satellite internet may also make countries more vulnerable to cyber espionage by the Chinese government or malicious actors." Another security risk comes from Chinese laws that require companies to store data within China and make it accessible to the Chinese government. "As China's satellite projects are intended to provide global coverage, the data of international users – spanning communication, location, and internet activity – would be subject to Chinese data laws." And that could mean "Chinese authorities could potentially access any data transmitted through Chinese satellite internet services."

If Chinese satellite broadband services are widely adopted, Page thinks "the world may witness the rise of a new digital Iron Curtain extending from space, dividing the free flow of information and imposing state control on a global scale."

Which sounds terrifying. However, many nations are already wary of satellite broadband, and are attempting to regulate it like any other telco. China's telcos and networking equipment providers have already been banned in many nations, while Beijing's various diplomatic efforts are increasingly regarded with scepticism after they left countries like Sri Lanka and Zambia in deep debt.

With US-based satellite broadband providers like Starlink and Amazon's Kuiper likely to offer service that matches the performance of Chinese providers, nations will have an easy way to route around Beijing's network controls. That is, if they are applied to satellite internet – a circumstance about which Page speculates, but which is not certain to eventuate.


Original Submission

posted by hubie on Tuesday September 10, @04:04AM
from the regulating-design dept.

Editor's note: This TechCrunch piece quotes extensively from the source. For brevity, the quoted pieces were removed, but can be seen if one clicks through to TFA.

Arthur T Knackerbracket has processed the following story:

Last month, we shared the details of a really good “Dear Colleague” letter that Senator Rand Paul sent around urging other Senators not to vote for KOSA [Kids Online Safety Act]. While the letter did not work and the Senate overwhelmingly approved KOSA (only to now have it stuck in the House), Paul has now expanded upon that letter in an article at Reason.

It’s well worth the read, though the title makes the point clear: Censoring the Internet Won’t Protect Kids.

It starts out by pointing out how much good the internet can be for families:

[...] He correctly admits that the internet can also be misused, and that not all of it is appropriate for kids, but that’s no reason to overreact:

[...] He points out that the law empowers the FTC to police content that could impact the mental health of children, but does not clearly define mental health disorders, and those could change drastically with no input from Congress.

What he doesn’t mention is that we’re living in a time when some are trying to classify normal behavior as a mental health disorder, and thus this law could be weaponized.

From there, he talks about the “duty of care.” That’s a key part of both KOSA and other similar bills and says that websites have a “duty of care” to make efforts to block their sites from causing various problems. As we’ve explained for the better part of a decade, a “duty of care” turns itself into a demand for censorship, as it’s the only way for companies to avoid costly litigation over whether or not they were careful enough.

Just last week, I got into a debate with a KOSA supporter on social media. They insisted that they’re not talking about content, but just about design features like “infinite scroll.” When asked about what kind of things they’re trying to solve for, I was told “eating disorders.” I pointed out that “infinite scroll” doesn’t lead to eating disorders. They’re clearly targeting the underlying content (and even that is way more complex than KOSA supporters realize).

Senator Paul makes a similar point in the other direction. Things like “infinite scroll” aren’t harmful if the underlying content isn’t harmful:

[...] As for stopping “anxiety,” Paul makes the very important point that there are legitimate and important reasons why kids may feel some anxiety today, and KOSA shouldn’t stop that information from being shared:

[...] He also points out — as he did in his original letter — that the KOSA requirements to block certain kinds of ads make no sense in a world in which kids see those same ads elsewhere:

Even as I’ve quoted a bunch here, there’s way more in the article. It is, by far, one of the best explanations of the problems of KOSA and many other bills that use false claims of “regulating design” as an attempt to “protect the kids.” He also talks about the harms of age verification, how it will harm youth activism, and how the structure of the bill will create strong incentives for websites to pull down all sorts of controversial content.

There is evidence that kids face greater mental health challenges today than in the past, though some studies suggest this owes more to society's increased openness to discussing and diagnosing mental health challenges than to any underlying change. But there remains no compelling evidence that the internet and social media are causing it. Even worse, as Paul's article makes abundantly clear, there is nothing out there suggesting that censoring the internet will magically fix those problems. Yet that's what KOSA and many other bills are designed to do.


Original Submission

posted by janrinok on Monday September 09, @11:22PM
from the party-like-it's-1975 dept.

A half century later Steve Wozniak reunited with nine members of the "Homebrew Computer Club" for a special YouTube event (hosted by John "Captain Crunch" Draper).

And Woz remembers that it was that club that inspired him to rig his own continent-spanning connection to ARPAnet. Later the club passed around a datasheet for an upcoming 8-bit microprocessor, Woz did some tinkering, and "You can hear the excitement in Wozniak's voice as he remembers what happened next..."

HP had a single computer that 40 people were sharing. But now, "I had this little tiny computer with my own TV set, sitting on my desk at Hewlett-Packard, and I could type in my own programs and come up with solutions ... I was just having the time of my life!" Wozniak, of course, would go on to build Apple's first personal computers, which helped Apple become the most profitable company on earth. But Wozniak closes by saying the Homebrew Computer Club "was the heart of it all. It's what turned me on to the fact that people were interested in things like computers we could afford."

Woz also says he even gave tens of millions of his Apple stock to early Apple employees who'd come from the Homebrew Computer Club, because "I just felt they deserved it as much as I did. Because that was really where all my inspiration came from." And he would also fly into computer clubs around the U.S., "because I wanted to tell them where Apple came from, where I came from: It was the Homebrew Computer Club."


Original Submission

posted by janrinok on Monday September 09, @06:41PM

Arthur T Knackerbracket has processed the following story:

It's not easy making green. For years, scientists have fabricated small, high-quality lasers that generate red and blue light. However, the method they typically employ—injecting electric current into semiconductors—hasn't worked as well in building tiny lasers that emit light at yellow and green wavelengths.

Researchers refer to the dearth of stable, miniature lasers in this region of the visible-light spectrum as the "green gap." Filling this gap opens new opportunities in underwater communications, medical treatments and more.

Green laser pointers have existed for 25 years, but they produce light only in a narrow spectrum of green and are not integrated in chips where they could work together with other devices to perform useful tasks.

Now scientists at the National Institute of Standards and Technology (NIST) have closed the green gap by modifying a tiny optical component: a ring-shaped microresonator, small enough to fit on a chip. The research is published in the journal Light: Science & Applications.

A miniature source of green laser light could improve underwater communication because water is nearly transparent to blue-green wavelengths in most aquatic environments. Other potential applications are in full-color laser projection displays and laser treatment of medical conditions, including diabetic retinopathy, a proliferation of blood vessels in the eye.

Compact lasers in this wavelength range are also important for applications in quantum computing and communication, as they could potentially store data in qubits, the fundamental unit of quantum information. Currently, these quantum applications depend on lasers that are larger in size, weight and power, limiting their ability to be deployed outside the laboratory.

For several years, a team led by Kartik Srinivasan of NIST and the Joint Quantum Institute (JQI), a research partnership between NIST and the University of Maryland, has used microresonators composed of silicon nitride to convert infrared laser light into other colors. When infrared light is pumped into the ring-shaped resonator, the light circles thousands of times until it reaches intensities high enough to interact strongly with the silicon nitride. That interaction, known as an optical parametric oscillation (OPO), produces two new wavelengths of light, called the idler and the signal.
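
As a rough illustration of the energy bookkeeping behind this process: in a Kerr (silicon nitride) microresonator OPO, two pump photons convert into one signal photon and one idler photon, so 2/λ_pump = 1/λ_signal + 1/λ_idler. The Python sketch below just solves that relation; the 780 nm pump and the specific signal wavelengths are illustrative assumptions, not values taken from the paper.

    # Energy conservation in a Kerr-microresonator OPO (four-wave mixing):
    # two pump photons -> one signal photon + one idler photon, so
    #   2/lambda_pump = 1/lambda_signal + 1/lambda_idler
    # Pump and signal values below are illustrative assumptions.

    def idler_wavelength_nm(pump_nm: float, signal_nm: float) -> float:
        """Solve 2/pump = 1/signal + 1/idler for the idler wavelength."""
        inv_idler = 2.0 / pump_nm - 1.0 / signal_nm
        if inv_idler <= 0:
            raise ValueError("signal photons carry more energy than the pump pair")
        return 1.0 / inv_idler

    pump_nm = 780.0  # assumed near-infrared pump
    for signal_nm in (532.0, 540.0, 560.0):  # wavelengths spanning the green gap
        idler = idler_wavelength_nm(pump_nm, signal_nm)
        print(f"signal {signal_nm:.0f} nm -> idler {idler:.0f} nm (infrared)")

With these assumed numbers, a 532 nm green signal pairs with an idler near 1461 nm, back in the infrared, which is why an infrared pump can yield visible output at all.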

In previous studies, the researchers generated a few individual colors of visible laser light. Depending on the dimensions of the microresonator, which determine the colors of light that are generated, scientists produced red, orange and yellow wavelengths, as well as a wavelength of 560 nanometers, right at the hairy edge between yellow and green light. However, the team could not generate the full complement of yellow and green colors necessary to fill the green gap.

"We didn't want to be good at hitting just a couple of wavelengths," said NIST scientist Yi Sun, a collaborator on the new study. "We wanted to access the entire range of wavelengths in the gap."

To fill the gap, the team modified the microresonator in two ways. First, the scientists slightly thickened it. By changing its dimensions, the researchers more easily generated light that penetrated deeper into the green gap, to wavelengths as short as 532 nanometers (billionths of a meter). With this extended range, the researchers covered the entire gap.

In addition, the team exposed the microresonator to more air by etching away some of the silicon dioxide layer below it. This had the effect of making the output colors less sensitive to the microring dimensions and the infrared pump wavelength. The lower sensitivity gave the researchers more control in generating slightly different green, yellow, orange and red wavelengths from their device.

As a result, the researchers found they could create more than 150 distinct wavelengths across the green gap and fine-tune them. "Previously, we could make big changes—red to orange to yellow to green—in the laser colors we could generate with OPO, but it was hard to make small adjustments within each of those color bands," Srinivasan noted.

The scientists are now working to boost the energy efficiency with which they produce the green-gap laser colors. Currently, the output power is only a few percent of that of the input laser. Better coupling between the input laser and the waveguide that channels the light into the microresonator, along with better methods of extracting the generated light, could significantly improve the efficiency.

Journal information: Light: Science & Applications


Original Submission

posted by janrinok on Monday September 09, @01:54PM

Here's what the science says in the wake of actress Danielle Fishel's diagnosis:

Stage 0 cancer is a condition where cells in the body look like cancer cells under a microscope but haven't left their original location. It's also known as carcinoma in situ or noninvasive cancer, because it hasn't invaded any of the surrounding tissues. Sometimes it's not even called cancer at all.

"A lot of people think of these as kind of precancer lesions," says Julie Nangia, an oncologist at Baylor College of Medicine in Houston.

There are many different types of Stage 0 cancer, depending on which tissue or organ the cells are from. Some cancers, like sarcomas (cancers of the bones or soft tissues), don't have a Stage 0.

Fishel's diagnosis is called ductal carcinoma in situ, or DCIS. This means some cells in the milk ducts in the breast look abnormal, but those cells haven't grown outside the milk ducts and moved into the rest of the breast tissue.

The trouble is, they could. If the abnormal cells do break through the milk duct, the severity of the ensuing cancer can range from Stage 1 to the most advanced Stage 4, depending on how big the tumor is and how far the cancer has spread throughout the body.

Before regular screening mammograms became the norm, DCIS accounted for just 5 percent of breast cancer diagnoses, says breast cancer surgeon Sara Javid of the Fred Hutch Cancer Center in Seattle (SN: 6/13/14).

Now, DCIS accounts for about 20 percent of newly diagnosed breast cancers. About 50,000 cases are diagnosed in the United States every year, and it turns up in one out of every 1,300 mammograms.

Still, because Stage 0 breast cancer doesn't really have any symptoms, it's possible to have it and never notice. "A lot of women have DCIS and don't know, especially older women, as it's typically a disease of aging," Nangia says.

For other Stage 0 cancers, the situation is different. Stage 0 cancers in other internal organs are often too small to show up on a scan. Widespread screening tests in other organs might be unsafe or take too many resources to run on a whole population.

The main exception is melanoma in situ, or Stage 0 skin cancer, which can be visible on the skin. That diagnosis is even more common than DCIS: Nearly 100,000 cases are expected in the United States in 2024.

"This is exactly why we want women to have screening mammograms," Nangia says. "We want to catch cancer at its earliest stages where it's incredibly easy to cure."

Journal References:
    • DOI: https://www.nejm.org/doi/full/10.1056/NEJMoa2214122
    • DOI: https://ascopubs.org/doi/10.1200/JCO.2019.37.15_suppl.TPS603


Original Submission

posted by hubie on Monday September 09, @09:49AM
from the preserving-the-preservers dept.

Internet Archive Responds to Appellate Opinion in Hachette v. Internet Archive:

We are disappointed in today's opinion about the Internet Archive's digital lending of books that are available electronically elsewhere. We are reviewing the court's opinion and will continue to defend the rights of libraries to own, lend, and preserve books.

Take Action
Sign the open letter to publishers, asking them to restore access to the 500,000 books removed from our library: https://change.org/LetReadersRead

The Internet Archive Loses Its Appeal of a Major Copyright Case:

The Internet Archive has lost a major legal battle—in a decision that could have a significant impact on the future of internet history. Today, the US Court of Appeals for the Second Circuit ruled against the long-running digital archive, upholding an earlier ruling in Hachette v. Internet Archive that found that one of the Internet Archive's book digitization projects violated copyright law.

Notably, the appeals court's ruling rejects the Internet Archive's argument that its lending practices were shielded by the fair use doctrine, which allows copyright infringement in certain circumstances, calling the argument "unpersuasive."

In March 2020, the Internet Archive, a San Francisco-based nonprofit, launched a program called the National Emergency Library, or NEL. Library closures caused by the pandemic had left students, researchers, and readers unable to access millions of books, and the Internet Archive has said it was responding to calls from regular people and other librarians to help those at home get access to the books they needed.

The NEL was an offshoot of an ongoing digital lending project called the Open Library, in which the Internet Archive scans physical copies of library books and lets people check out the digital copies as though they're regular reading material instead of ebooks. The Open Library lent the books to one person at a time—but the NEL removed this ratio rule, instead letting large numbers of people borrow each scanned book at once.
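
For readers unfamiliar with controlled digital lending, here is a toy Python sketch of the owned-to-loaned ratio rule the Open Library enforced and the NEL temporarily dropped. The class and method names are hypothetical, purely for illustration.

    # Toy model of controlled digital lending's owned-to-loaned ratio:
    # never circulate more digital copies than physical copies owned.
    class LendingPool:
        def __init__(self, owned_copies: int):
            self.owned = owned_copies
            self.checked_out = 0

        def checkout(self) -> bool:
            # The Open Library's rule; the NEL suspended this check.
            if self.checked_out < self.owned:
                self.checked_out += 1
                return True
            return False  # borrower waits for a copy to be returned

        def checkin(self) -> None:
            self.checked_out = max(0, self.checked_out - 1)

    book = LendingPool(owned_copies=1)
    print(book.checkout())  # True: the single copy goes out
    print(book.checkout())  # False: a second reader must wait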

The NEL was the subject of backlash soon after its launch, with some authors arguing that it was tantamount to piracy. In response, the Internet Archive within two months scuttled its emergency approach and reinstated the lending caps. But the damage was done. In June 2020, major publishing houses, including Hachette, HarperCollins, Penguin Random House, and Wiley, filed the lawsuit.

In March 2023, the district court ruled in favor of the publishers. Judge John G. Koeltl found that the Internet Archive had created "derivative works," arguing that there was "nothing transformative" about its copying and lending. After the initial ruling in Hachette v. Internet Archive, the parties negotiated terms—the details of which have not been disclosed—though the archive still filed an appeal.

James Grimmelmann, a professor of digital and internet law at Cornell University, says the verdict is "not terribly surprising" in the context of how courts have recently interpreted fair use.

[...] The Internet Archive's legal woes are not over. In 2023, a group of music labels, including Universal Music Group and Sony, sued the archive in a copyright infringement case over a music digitization project. That case is still making its way through the courts. The damages could be up to $400 million, an amount that could pose an existential threat to the nonprofit.

The new verdict arrives at an especially tumultuous time for copyright law. In the past two years there have been dozens of copyright infringement cases filed against major AI companies that offer generative AI tools, and many of the defendants in these cases argue that the fair use doctrine shields their usage of copyrighted data in AI training. Any major lawsuit in which judges reject fair use claims is thus closely watched.

It also arrives at a moment when the Internet Archive's outsize importance in digital preservation is keenly felt. The archive's Wayback Machine, which catalogs copies of websites, has become a vital tool for journalists, researchers, lawyers, and anyone with an interest in internet history. While there are other digital preservation projects, including national efforts from the US Library of Congress, there's nothing like it available to the public.


Original Submission

posted by hubie on Monday September 09, @05:02AM

What were previously thought to be pristine archaeological deposits, ripe for investigation, could be contaminated with plastics:

Scientists in the UK have found evidence that microplastics are contaminating archaeological soil samples. The discovery has the potential to upend the way historical remains are preserved.

Tiny particles of microplastics were discovered seven metres underground in samples dating from the first or early second century. They were first excavated in the 1980s.

"This feels like an important moment, confirming what we should have expected: that what were previously thought to be pristine archaeological deposits, ripe for investigation, are in fact contaminated with plastics, and that this includes deposits sampled and stored in the late 1980s," says Professor John Schofield from the University of York's Department of Archaeology.

[...] "We think of microplastics as a very modern phenomenon as we have only really been hearing about them for the last 20 years," says David Jennings, chief executive of York Archaeology.

But, he adds, research from 2004 revealed that they have been prevalent in our seas since the 1960s due to the post-Second World War boom in plastic production.

"This new study shows that the particles have infiltrated archaeological deposits and, like the oceans, this is likely to have been happening for a similar period, with particles found in soil samples taken and archived in 1988 at Wellington Row in York," Jenning explains.

[...] The team says that the concern for archaeologists is whether microplastics compromise the scientific value of preserved remains. Preserving archaeology in the place where it was found has been the preferred approach to conservation for a number of years. But these new findings could change that.

"Our best-preserved remains - for example, the Viking finds at Coppergate - were in a consistent anaerobic waterlogged environment for over 1,000 years, which preserved organic materials incredibly well," Jennings says.

"The presence of microplastics can and will change the chemistry of the soil, potentially introducing elements which will cause the organic remains to decay. If that is the case, preserving archaeology in situ may no longer be appropriate."

The team says further research into the impact of microplastics will be a priority for archaeologists given their potential impact on historical sites.


Original Submission

posted by hubie on Monday September 09, @12:19AM

Arthur T Knackerbracket has processed the following story:

This vulnerability, tracked as CVE-2024-39717, is being abused to plant custom, credential-harvesting web shells on customers' networks, according to Black Lotus Labs. Lumen Technologies' security researchers have attributed "with moderate confidence" both the new malware, dubbed VersaMem, and the exploitation to Volt Typhoon, warning that these attacks are "likely ongoing against unpatched Versa Director systems."

Volt Typhoon is the Beijing-backed cyberspy crew that the feds have accused of burrowing into US critical infrastructure networks while readying "disruptive or destructive cyberattacks" against these vital systems.

Versa Director is a software tool that allows for the central management and monitoring of Versa SD-WAN software. It's generally used by internet service providers (ISPs) and managed service providers (MSPs) to maintain their customers' network configurations — and this makes it an attractive target for cybercriminals because it gives them access to the service providers' downstream customers.

That appears to be the case with this CVE, as Versa notes the attacks target MSPs for privilege escalation. 

[...] Versa has since released a patch, and encourages all customers to upgrade to Versa Director version 22.1.4 or later and apply the hardening guidelines. But the advice comes too late for some, as we're told: "This vulnerability has been exploited in at least one known instance by an Advanced Persistent Threat actor."

[...] "Analysis of our global telemetry identified actor-controlled small-office/home-office (SOHO) devices exploiting this zero-day vulnerability at four U.S. victims and one non-U.S. victim in the Internet service provider (ISP), managed service provider (MSP) and information technology (IT) sectors as early as June 12, 2024," the threat hunters noted.

After gaining access to the victims' networks via the exposed Versa management port, the attackers deployed the VersaMem web shell, which steals credentials and then allows Volt Typhoon to access the service providers' customers' networks as authenticated users. 

"VersaMem is also modular in nature and enables the threat actors to load additional Java code to run exclusively in-memory," the security shop added.

[...] Plus, for anyone not yet convinced that software should be secure by design — with the onus for managing security risks falling on technology manufacturers, not the end users — this latest vulnerability should be more proof that CISA is on to something.

"The Versa blog on the topic subtly chastises affected users for failing to implement recommended security guidance," Britton said. "CISA's whole point in Secure by Default is that vendors need to find ways to guarantee that the out of the box system is as secure as possible, minimizing the possibility that overworked operators make these types of errors."

It also highlights the need for vendors to find a way to future-proof their products against unknown flaws, he added. "Commercially available technologies exist that can allow product and software manufacturers the ability to neutralize entire classes of vulns (known and unknown), without devolving into the whack-a-mole game of bug chasing."


Original Submission

posted by hubie on Sunday September 08, @07:37PM
from the ..with-a-whammy-bar... dept.

Smithsonian Magazine has a retrospective on the 70th anniversary of the Fender Stratocaster. That very popular model of electric guitar has been manufactured since 1954.

The Stratocaster also had timing on its side. It came out amid two other transformative innovations: television and rock 'n' roll. Sales picked up once Buddy Holly showcased a Strat on the "Ed Sullivan Show" in 1957. For the American and English kids who came of age in the '60s—Hendrix, Eric Clapton, David Gilmour, the electrified Bob Dylan at the Newport Folk Festival—the guitar's look was as groundbreaking as its sound. "Cool" depends more on appearance than on wiring, and the Strat's double-cut profile (producing two "wings" at the top) and sensuous body lines were mind-blowing. It looked, one early '60s British pop musician recalled, like "the equivalent of a bullet-finned '59 Cadillac."

Not to be confused with air guitar.

Previously:
(2016) Don't Give Up on the Guitar. Fender is Begging You
(2015) Our Musical Instruments May Become Obsolete


Original Submission

posted by janrinok on Sunday September 08, @02:38PM

(FAI - Fully Automatic Installation)
https://fai-project.org/FAIme/live/

-= Custom Live Media, also for Newer Hardware
-= A web service for building your own customized Debian live image

"At this years Debian conference in South Korea I've presented[1] the new feature of the FAIme web service. You can now build your own Debian live media/ISO.

The web interface provides various settings, e.g. adding a user name and password, selecting the Debian release (stable or testing), the desktop environment, and the language. Additionally, you can add your own list of packages that will be installed into the live environment. It's possible to define a custom script that gets executed during the boot process. For remote access to the live system, you can easily specify a GitHub, GitLab or Salsa account, whose public ssh key will be used for passwordless root access. If your hardware needs special GRUB settings, you may also add those. I'm thinking about adding an autologin checkbox, so the live media could be used for a kiosk system.

And finally, newer hardware is supported with the help of the backports kernel for the Debian stable release (aka bookworm). This combination is not available from the official Debian live images or the netinst media, because the latter has some complicated dependencies which are not that easy to resolve[2]. At DebConf24 I talked to Alper, who has some ideas[3] about how to improve the Debian installer environment so that it may support a backports kernel."

- Thomas Lange,
- https://blog.fai-project.org/
- https://blog.fai-project.org/posts/faime-live/

[Editor's Comment: OK, I've downloaded a Debian build featuring software that I have chosen. It boots OK and looks fine. Do I trust it? No, not yet. I have no idea who FAI are although I can see who they claim to be. Nor do I know if they are using the correct packages. But I will run it on a spare machine and wireshark it to death when I have some spare time. If any of you know Thomas Lange (Thomas Lange is the main author of FAI. He's a Debian Developer since 2000 and a sysadmin since 1992. He started the FAI project in 1999.) or know more about the project then please leave a comment.]


Original Submission

posted by janrinok on Sunday September 08, @09:50AM
from the here-comes-the-sun dept.

https://arstechnica.com/information-technology/2024/09/fbi-busts-musicians-elaborate-ai-powered-10m-streaming-royalty-heist/

On Wednesday, federal prosecutors charged a North Carolina musician with defrauding streaming services of $10 million through an elaborate scheme involving AI, as reported by The New York Times. Michael Smith, 52, allegedly used AI to create hundreds of thousands of fake songs by nonexistent bands, then streamed them using bots to collect royalties from platforms like Spotify, Apple Music, and Amazon Music.

While the AI-generated element of this story is novel, Smith allegedly broke the law by setting up an elaborate fake listener scheme. The US Attorney for the Southern District of New York, Damian Williams, announced the charges, which include wire fraud and money laundering conspiracy. If convicted, Smith could face up to 20 years in prison for each charge.

To avoid detection, Smith spread his streaming activity across numerous fake songs, never playing a single track too many times. He also generated unique names for the AI-created artists and songs, trying to blend in with the quirky names of legitimate musical acts. Smith used artist names like "Callous Post" and "Calorie Screams," while their songs included titles such as "Zygotic Washstands" and "Zymotechnical."
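
Some back-of-the-envelope arithmetic shows why spreading plays across a huge catalog matters. Every number below (payout rate, catalog sizes, scheme duration) is an illustrative assumption, not a figure from the indictment.

    # Rough arithmetic on spreading streams across a large catalog.
    # All numbers are illustrative assumptions, not case figures.
    ROYALTY_PER_STREAM = 0.005   # dollars per stream, assumed blended rate
    TARGET_DOLLARS = 10_000_000  # total royalties alleged
    YEARS = 5                    # assumed duration of the scheme

    total_streams = TARGET_DOLLARS / ROYALTY_PER_STREAM
    print(f"streams needed: {total_streams:,.0f}")  # 2,000,000,000

    for catalog_size in (1_000, 100_000, 500_000):
        per_track_daily = total_streams / catalog_size / (YEARS * 365)
        print(f"{catalog_size:>7,} tracks -> {per_track_daily:,.1f} plays/track/day")

A thousand-track catalog would need obviously robotic play counts (over a thousand plays per track per day); spread across hundreds of thousands of tracks, each one averages a couple of plays a day and looks unremarkable.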

[...] Initially, Smith uploaded his own original compositions to streaming platforms but found that his small catalog failed to generate significant income. In an attempt to scale up, he briefly collaborated with other musicians, reportedly offering to play their songs for royalties, though these efforts failed. This led Smith to pivot to AI-generated music in 2018, when he partnered with an as-yet-unnamed AI music company CEO and a music promoter to create a large library of computer-generated songs. The US Attorney's announcement did not specify precisely what method Smith used to generate the songs.

[...] When confronted by a music distribution company about "multiple reports of streaming abuse" in 2018, The New York Times says that Smith acted shocked and strongly denied any wrongdoing, insisting there was "absolutely no fraud going on whatsoever."


Original Submission

posted by janrinok on Sunday September 08, @05:04AM
from the censorship-shall-set-you-free dept.

City of Columbus sues man after he discloses severity of ransomware attack

Mayor said data was unusable to criminals; researcher proved otherwise.

A judge in Ohio has issued a temporary restraining order against a security researcher who presented evidence that a recent ransomware attack on the city of Columbus scooped up reams of sensitive personal information, contradicting claims made by city officials.

[....] after the city of Columbus fell victim to a ransomware attack on July 18 that siphoned 6.5 terabytes of the city's data.

[....] Columbus Mayor Andrew Ginther said on August 13 that a "breakthrough" in the city's forensic investigation of the breach found that the sensitive files Rhysida obtained were either encrypted or corrupted, making them "unusable" to the thieves.

[....] Shortly after Ginther made his remarks, security researcher David Leroy Ross contacted local news outlets and presented evidence that showed the data Rhysida published was fully intact and contained highly sensitive information regarding city employees and residents.

[....] On Thursday, the city of Columbus sued Ross for alleged damages for criminal acts, invasion of privacy, negligence, and civil conversion. The lawsuit claimed that downloading documents from a dark web site run by ransomware attackers amounted to him "interacting" with them and required special expertise and tools. The suit went on to challenge Ross alerting reporters to the information, which it claimed would not be easily obtained by others.

[....] In a press conference Thursday, Columbus City Attorney Zach Klein defended his decision to sue Ross and obtain the restraining order.

[....] As shown in the screenshot of the Rhysida dark web site on Friday morning, the sensitive data remains available to anyone who looks for it. Friday's order may bar Ross from accessing the data or disseminating it to reporters, but it has no effect on those who plan to use the data for malicious purposes.

Whew! I feel safer already!


Original Submission