Supreme Court wants US input on whether ISPs should be liable for users' piracy:
The Supreme Court signaled it may take up a case that could determine whether Internet service providers must terminate users who are accused of copyright infringement. In an order issued today, the court invited the Department of Justice's solicitor general to file a brief "expressing the views of the United States."
In Sony Music Entertainment v. Cox Communications, the major record labels argue that cable provider Cox should be held liable for failing to terminate users who were repeatedly flagged for infringement based on their IP addresses being connected to torrent downloads. The US Court of Appeals for the 4th Circuit issued a mixed ruling: it affirmed a jury's finding that Cox was guilty of willful contributory infringement but reversed a verdict on vicarious infringement "because Cox did not profit from its subscribers' acts of infringement."
That ruling vacated a $1 billion damages award and ordered a new damages trial. Cox and Sony are both seeking a Supreme Court review. Cox wants to overturn the finding of willful contributory infringement, while Sony wants to reinstate the $1 billion verdict.
The Supreme Court asking for US input on Sony v. Cox could be a precursor to the high court taking up the case. For example, the court last year asked the solicitor general to weigh in on Texas and Florida laws that restricted how social media companies can moderate their platforms. The court subsequently took up the case and vacated lower-court rulings, making it clear that content moderation is protected by the First Amendment.
Cox has said that letting the piracy ruling stand "would force ISPs to terminate Internet service to households or businesses based on unproven allegations of infringing activity, and put them in a position of having to police their networks." Cox said that ISPs "have no way of verifying whether a bot-generated notice is accurate" and that even if the notices are accurate, terminating an account would punish every user in a household where only one person may have illegally downloaded copyrighted files.
Record labels urged the court to reinstate the vicarious infringement verdict. "As the District Court explained, the jury had ample evidence that Cox profited from its subscribers' infringement, including evidence 'that when deciding whether to terminate a subscriber for repeat infringement, Cox considered the subscriber's monthly payments,' and 'Cox repeatedly declined to terminate infringing subscribers' Internet service in order to continue collecting their monthly fees,'" the record labels' petition said.
Another potentially important copyright case involves the record labels and Grande, an ISP owned by Astound Broadband. The conservative-leaning US Court of Appeals for the 5th Circuit ruled last month that Grande violated the law by failing to terminate subscribers accused of being repeat infringers. The 5th Circuit also ordered a new trial on damages because it said a $46.8 million award was too high. Grande and the record labels are both seeking en banc rehearings of the 5th Circuit panel ruling.
But will they listen? And note the phrasing: whether Internet service providers must terminate users who are accused of copyright infringement.
X says The Onion can't have Alex Jones' Infowars accounts:
X claims in court filings that it still owns Infowars' X accounts and that they can't be sold without its permission.
Well that's a bit of a spanner in the works. Previously: The Onion Buys InfoWars - No Seriously! - SoylentNews:
The satirical website The Onion purchased InfoWars on Thursday, a capstone on years of litigation and bankruptcy proceedings following InfoWars founder Alex Jones' defamation of families associated with the Sandy Hook Elementary School massacre.
Those families backed The Onion's bid to purchase InfoWars' intellectual property, including its website, customer lists and inventory, certain social media accounts and the production equipment used to put Jones on the air. The Connecticut families agreed to forgo a portion of their recovery to increase the overall value of The Onion's bid, enabling its success.
MORE: Alex Jones still must pay $1B judgment: Judge
The families said the purchase would put an end to Jones' misinformation campaign.
"We were told this outcome would be nearly impossible, but we are no strangers to impossible fights. The world needs to see that having a platform does not mean you are above accountability -- the dissolution of Alex Jones' assets and the death of Infowars is the justice we have long awaited and fought for," said Robbie Parker, whose daughter Emilie was killed in the Sandy Hook shooting.
In 2022, the families that brought the case against Jones in Connecticut secured a $1.4 billion verdict in their defamation lawsuit. A Texas bankruptcy court ruled on the liquidation of Jones' assets in June of this year, handing over control to an independent trustee tasked with selling them off to generate the greatest possible value for the families.
"From day one, these families have fought against all odds to bring true accountability to Alex Jones and his corrupt business. Our clients knew that true accountability meant an end to Infowars and an end to Jones' ability to spread lies, pain and fear at scale. After surviving unimaginable loss with courage and integrity, they rejected Jones' hollow offers for allegedly more money if they would only let him stay on the air because doing so would have put other families in harm's way," said Chris Mattei, attorney for the Connecticut plaintiffs and partner at Koskoff Koskoff & Bieder.
Jones had filed for bankruptcy last year in a bid to avoid paying the billion-dollar judgment, but a judge ruled he still had to settle with the Sandy Hook families.
Bankruptcy often staves off legal judgments, but not if they are the result of willful and malicious injury. U.S. Bankruptcy Court Judge Christopher Lopez in Houston decided that standard was satisfied in Jones' case.
"[I]n Jones's case, the language of the jury instruction confirms that the damages awarded flow from the allegation of intent to harm the Plaintiffs – not allegations of recklessness," Lopez wrote in his ruling.
Jones had claimed on his InfoWars show that the shooting at Sandy Hook Elementary School -- which killed 26 people, including 20 elementary students -- was performed by actors following a script written by government officials to bolster the push for gun control.
New filing shows electricity demand would be flat without the data center industry:
Ever since data centers started spreading across the Virginia landscape like an invasive pest, one important question has remained unanswered: How much does the industry's insatiable demand for energy impact other utility customers? Under pressure from the SCC [Virginia State Corporation Commission], this month Dominion Energy Virginia finally provided the answer we feared: Ordinary Virginia customers are subsidizing Big Tech with both their money and their health.
Dominion previously hid data centers among the rest of its customer base, making it impossible to figure out if residents were paying more than their fair share of the costs of building new generation and transmission lines. Worse, if data centers are the reason for burning more fossil fuels, then they are also responsible for residents being subjected to pollution that is supposed to be eliminated under the 2020 Virginia Clean Economy Act (VCEA). The VCEA calls for most coal plants in the state to be closed by the end of this year – which is not happening – and sets rigorous conditions before utilities can build any new fossil fuel plants.
[...] Even before the 2024 IRP was filed, though, the SCC directed the utility to file a supplement. It was obvious the IRP would project higher costs and increased use of fossil fuels. How much of that, the SCC demanded to know, is attributable to data centers?
A lot, as it turns out. Though Dominion continues to obfuscate key facts, the document it filed on November 15 shows future data center growth will drive up utility spending by about 20%. Dominion did not take the analysis further to show the effect on residential rates.
The filing also shows that but for new data centers, peak demand would actually decrease slightly over the next few years, from 17,353 MW this year to 17,280 MW in 2027, before beginning a gentle rise to 17,818 MW in 2034 and 18,608 MW in 2039.
In other words, without data centers, electricity use in Dominion territory would scarcely budge over the next decade. Indeed, the slight decrease over the next three years is especially interesting because near-term numbers tend to be the most reliable, with projections getting more speculative the further out you look.
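As a rough sanity check on those figures (assuming "this year" in the filing refers to 2024, when it was submitted), the net change and the implied annual growth rate can be computed directly from the MW values quoted above:

```python
# Peak-demand projections (MW) from Dominion's filing, excluding new data
# centers. The 2024 key assumes "this year" means 2024.
peaks = {2024: 17353, 2027: 17280, 2034: 17818, 2039: 18608}

# Net change over the full 15-year window.
total_change = peaks[2039] - peaks[2024]       # 1255 MW
pct_change = 100 * total_change / peaks[2024]  # about 7.2%

# Implied compound annual growth rate over 2024-2039.
years = 2039 - 2024
cagr = 100 * ((peaks[2039] / peaks[2024]) ** (1 / years) - 1)

print(f"Net change: {total_change} MW ({pct_change:.1f}%)")
print(f"Implied growth: {cagr:.2f}% per year")
```

Under half a percent per year, in other words, which is what "scarcely budge" looks like in numbers.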
Surprised? You're not alone. We've heard for years that electric vehicles and building electrification will drive large increases in energy demand. When Dominion talks about the challenges of load growth, it cites these factors along with data centers, suggesting that ordinary people are part of the problem. We're not.
[...] In addition to showing what the energy mix might look like without data centers, the SCC directed Dominion to identify which of its approximately 200 planned transmission projects were needed solely because of data centers. The 4-page table in Dominion's supplemental filing reveals that about half of the projects are solely data center-driven, with two or three dozen more serving a mix of customers that includes data centers. I tried to add up the numbers but lost track at a billion dollars' worth of projects needed solely for data centers – and I was still on the second page.
[...] Still, most of the data center growth lies ahead of us, as do Dominion's plans for new fossil fuel and nuclear generation. With state leaders avidly chasing more data centers in the name of economic development, ordinary Virginians are left to watch the assault on their energy supply, their water, and their environment and wonder: Is anyone going to fix this?
Teen Mathematicians Tie Knots Through a Mind-Blowing Fractal:
In the fall of 2021, Malors Espinosa set out to devise a special type of math problem. As with any good research question, it would have to be thought-provoking, its solution nontrivial — something others would want to study. But an additional constraint stumped him. Malors, then a graduate student in mathematics at the University of Toronto, wanted high school students to be able to prove it.
For years, Malors had been running summer workshops for local high schoolers, teaching them about basic ideas in mathematical research and showing them how to write proofs. But a few of his students seemed ready to do more — to find out what it means to do math when there is no answer key. They just needed the right question to guide them.
[...]
Menger's statement didn't distinguish between homeomorphic curves. His proof only guaranteed, for instance, that the circle could be found in his sponge — not that all homeomorphic knots could be, their loops and tangles still intact. Malors wanted to prove that you could find every knot within the sponge.
It seemed like the right mashup to excite young mathematicians. They'd recently had fun learning about knots in his seminar. And who doesn't love a fractal? The question was whether the problem would be approachable. "I really hoped there was an answer," Malors said.
There was. After just a few months of weekly Zoom meetings with Malors, three of his high school students — Joshua Broden, Noah Nazareth and Niko Voth — were able to show that all knots can indeed be found inside the Menger sponge. Moreover, they found that the same can likely be said of another related fractal, too.
"It's a clever way of putting things together," said Radmila Sazdanovic, a topologist at North Carolina State University who was not involved in the work. In revisiting Menger's century-old theorem, she added, Malors — who usually does research in the disparate field of number theory — had apparently asked a question that no one thought to ask before. "This is a very, very original idea," she said.
[...]
Broden, Nazareth and Voth had taken several of Malors' summer workshops over the years. When he first taught them about knots in an earlier workshop, "it blew 14-year-old me's mind," said Voth.
But the Menger problem would be their first time moving beyond school workbooks with answer keys. "It was a little bit nerve-racking, because it was the first time I was doing something where truly nobody has the answer, not even Malors," said Nazareth. Maybe there was no answer at all.
Their goal was essentially to thread a microscopic sewing needle through a cloud of dust — the material that remained of the sponge after many removals. They would have to stick the pin in the right places, tie the knotted tangles with immaculate precision, and never leave the sponge. If their thread ended up floating in the empty holes of the sponge for any knot, it was game over.
Not an easy task. But there was a way to simplify it. Knots can be depicted on a flat piece of paper as special diagrams called arc presentations. To create one, you start with information about how the strands of your knot pass in front of or behind each other. Then you apply a set of rules to translate this information into a series of points on a grid. Every row and column of the grid will contain exactly two points.
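The row-and-column property described above is easy to check mechanically. Below is a minimal, hypothetical sketch (not taken from the students' work): an arc presentation is stored as a list of (row, column) points, and a diagram is accepted only if every row and every column holds exactly two of them. The example grid is built from two permutations purely to satisfy that property; it is not claimed to represent any particular knot.

```python
from collections import Counter

def is_valid_arc_presentation(points):
    """Check the defining grid property: every row and every column
    of an n x n arc-presentation grid contains exactly two points."""
    rows = Counter(r for r, _ in points)
    cols = Counter(c for _, c in points)
    n = len(points) // 2  # a valid n x n diagram has 2n points
    return (len(points) == 2 * n
            and all(rows[r] == 2 for r in range(n))
            and all(cols[c] == 2 for c in range(n)))

# A hypothetical 5 x 5 grid: one point from each of two permutations
# (identity, and a cyclic shift by 2), so each row and column gets
# exactly two marks.
points = [(i, i) for i in range(5)] + [(i, (i + 2) % 5) for i in range(5)]
print(is_valid_arc_presentation(points))       # True

# Removing any one point breaks the count.
print(is_valid_arc_presentation(points[:-1]))  # False
```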
NASA Found An Abandoned Military Base Hidden Under Greenland's Ice:
Camp Century was built to test the feasibility of launching nukes from the Arctic.
Arthur T Knackerbracket has processed the following story:
Deep within the ice sheet of Greenland lies a US military secret that hasn't been seen since the 1960s, but a NASA flyover earlier this year has provided an unprecedented look at the buried Cold War relic.
Camp Century, constructed in 1959 by the US Army Corps of Engineers, was built directly into the ice sheet of Greenland, giving it an interior reminiscent of Echo Base on the frozen world of Hoth in The Empire Strikes Back. At the heart of the facility was the PM-2A portable nuclear reactor, which provided power for the sprawling "city under the ice" that included housing for 200 soldiers, a theater, gym, post exchange, library, and even a chapel.
The camp was ostensibly built as a scientific outpost, and work at the facility did contribute to modern climate models thanks to ice core drilling – but its true purpose was Project Iceworm, the US Army's plan to deploy hundreds of cold-hardened Minuteman nuclear missiles capable of striking the Soviet Union across Greenland's frozen tundra.
That never came to fruition, leading to Camp Century's abandonment in 1967, after which the installation was buried under accumulating ice and snow. Camp Century is now believed to be at least 30 meters, or 100 feet, below the surface.
The only real look at Camp Century over the decades has been via ground-penetrating radar that provided, at best, a two-dimensional confirmation that some of the thousands of feet of tunnels, and whatever contents were left behind, are still down there.
That all changed in April, NASA reported this week, when the space agency's Earth Observatory unexpectedly picked up an anomaly while doing an ice sheet survey using an aircraft equipped with NASA's Uninhabited Aerial Vehicle Synthetic Aperture Radar (UAVSAR).
Camp Century, shown as the green blob, in an image captured by Chad Greene during the mapping flight
UAVSAR, which in this case was being operated by scientists onboard a Gulfstream III aircraft, has the advantage of shooting radar signals directly downward and at an angle, meaning it's able to produce maps with more dimensionality than typical ground-penetrating radar that only aims straight down.
"We were looking for the bed of the ice and out pops Camp Century," said NASA JPL cryospheric scientist and project co-lead Alex Gardner. "We didn't know what it was at first."
JPL glaciologist and remote sensing specialist Chad Greene noted that the serendipitous image captured by UAVSAR shows individual structures of Camp Century that, when compared to existing layout plans, gives an unprecedented view into the state of the facility after nearly 60 years under the ice.
"Our goal was to calibrate, validate, and understand the capabilities and limitations of UAVSAR for mapping the ice sheet's internal layers and the ice-bed interface," Greene noted.
That said, while UAVSAR has granted new insights into the state of Camp Century, the images aren't perfect. NASA noted that, because of the angular nature of part of the capture, a band on the image makes it look like Century is below the ice bed, which is in reality miles below the ice sheet – far deeper than the ruins of the abandoned facility. The error is due to the angled radar picking up the ice bed further in the distance, NASA explained.
Because of the imperfections in the UAVSAR image of Camp Century, the image is "a novel curiosity" rather than a useful bit of scientific data, NASA said. Understanding more about Camp Century's status, its depth under the ice, and the status of the frozen water above it is critical, however, because it contains a lot of nuclear, biological, and chemical waste that the Army wasn't too worried about in the pre-climate change era.
If current climate change trends continue apace, NASA researchers determined in 2011, all the harmful stuff stored under the ice at Camp Century could leach into the surrounding ice well before surface melting begins to show changes.
NASA's predicted surface ice loss around Camp Century by 2090
And there's good reason to worry. There's lots of waste down there. By NASA's estimate, there are around 53,000 gallons of diesel fuel, 6.3 million gallons of waste water, including sewage from the camp's years in service, as well as an unknown quantity of radioactive waste and PCBs. The Atomic Heritage Foundation estimates the PM-2A reactor may have created more than 47,000 gallons of low-level radioactive waste over its lifetime, and that or more is likely buried beneath the ice too.
Previous 2D radar images of Century show the presence of buried waste material, so scientists know it's there, but without better imaging they can't know if anything has shifted or could begin to leak.
A 2011 radar capture of Camp Century showing what NASA scientists believe to be the buried waste at the abandoned facility
NASA estimates that, by 2090, climate change could cause Greenland's ice sheet to destabilize above Camp Century, but that doesn't account for leaching into the ice before surface changes begin.
Camp Century was closed in 1967 after Project Iceworm, which sought to build thousands of miles of tunnels to deploy 600 missiles, failed due to a determination that Greenland's ice sheet was too unstable to support long-term subterranean facilities. All that marks the facility today is a small outpost above the camp, maintained by the governments of Greenland and Denmark to keep watch on the site. The camp lies 150 miles inland from the US Space Force's Pituffik Space Base, formerly known as Thule Air Base, from which construction of Camp Century was managed.
"Without detailed knowledge of ice thickness, it is impossible to know how the ice sheets will respond to rapidly warming oceans and atmosphere, greatly limiting our ability to project rates of sea level rise," Gardner said.
Environmental damage from Camp Century may be inevitable as the climate continues to change, likely unabated, unless the world's governments take action. NASA noted that the flight that captured the new images of Camp Century would "enable the next generation of mapping campaigns in Greenland, Antarctica and beyond," though whether additional passes over Camp Century are planned to better map the facility is unknown.
"When we don't own what we buy, everything becomes disposable...":
Makers of smart devices that fail to disclose how long they will support their products with software updates may be breaking the Magnuson Moss Warranty Act, the Federal Trade Commission (FTC) warned this week.
The FTC released its statement after examining 184 smart products across 64 product categories, including soundbars, video doorbells, breast pumps, smartphones, home appliances, and garage door opener controllers. The majority of the devices researched (163, to be precise) "did not disclose the connected device support duration or end date" on their product webpage, per the FTC's report [PDF]. By contrast, 11.4 percent of the devices examined shared a software support duration or end date on their product page.
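The two figures in the report are consistent with each other, as a quick check of the arithmetic shows:

```python
total = 184          # smart products examined by the FTC
no_disclosure = 163  # no support duration or end date on the product page

disclosed = total - no_disclosure  # 21 products
pct_disclosed = 100 * disclosed / total

print(f"{disclosed} products ({pct_disclosed:.1f}%) disclosed a support window")
```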
In addition to manufacturers often neglecting to commit to software support for a specified amount of time, even when they do share this information, it is often hard to find.
For example, the FTC reported that some manufacturers made software support dates available but not on the related product's webpage. Instead, this information is sometimes buried in specs, support, FAQ pages, or footnotes.
The FTC report added:
... some used ambiguous language that only imply the level of support provided, including phrases like, "lifetime technical support," "as long as your device is fully operational," and "continuous software updates," for example. Notably, staff also had difficulty finding on the product webpages the device's release date ...
At times, the FTC found glaring inconsistencies. For example, one device's product page said that the device featured "lifetime" support, "but the search result pointing to the manufacturer's support page indicated that, while other updates may still be active, the security updates for the device had stopped in 2021," per the FTC.
Those relying on Google's AI Overviews may also be misled. In one case, AI Overviews pointed to a smart gadget getting "software support and updates for 3–6 months." But through the link that AI Overviews provided, the FTC found that the three to six months figure that Google scraped actually referred to the device's battery life. The next day, AI Overviews said that it couldn't determine the duration of software support or updates for the gadget, the FTC noted.
In its report, the FTC encouraged law enforcement and policymakers to investigate whether vendors properly disclose software support commitments. The government agency warned that not informing shoppers about how long products with warranties will be supported may go against the Magnuson Moss Warranty Act:
This law requires that written warranties on consumer products costing more than $15 be made available to prospective buyers prior to sale and that the warranties disclose a number of things, including, "a clear description and identification of products, or parts, or characteristics, or components or properties covered by and where necessary for clarification, excluded from the warranty."
The FTC also noted that vendors could be in violation of the FTC Act if omissions or misrepresentations around software support are likely to mislead shoppers.
The FTC's research follows a September letter to the agency from 17 groups, including iFixit, Public Interest Research Group, Consumer Reports, and the Electronic Frontier Foundation, imploring that the FTC provide "clear guidance" on "making functions of a device reliant on embedded software that ties the device back to a manufacturer's servers," aka software tethering.
Speaking to Ars Technica in September, Lucas Gutterman, the Designed to Last campaign director with the US PIRG Education Fund and one of the letter's signatories, expressed optimism that the FTC would get involved, as it did when it acted against Harley-Davidson in 2022 for illegally using warranty policies to limit customers' right to repair, or when it investigated the 2016 shutdown of Nest Labs' Revolv Smart Home Hub.
In response to the FTC's report this week, Gutterman pointed to initiatives like repair scores as potential remedies.
"When we don't own what we buy, everything becomes disposable, and we get stuck in a loop where products keep dying and we keep buying," he said.
As more devices join the Internet of Things, the risk of consumers falling victim to flashy marketing that promises convenient features that could be ripped away through lack of software support becomes more concerning. Whether it's dealing with bricked devices or the sudden removal of valued features, owners of smart devices—from smart bassinets and Pelotons to printers, indoor gardening systems, and toothbrushes—have all faced the harsh realities of what happens when a vendor loses interest or the ability to support products likely sold at premiums. Some are tired of waiting for vendors to commit to clear, reliable software support and are hoping that the government creates a mandatory path for disclosure.
https://physics.aps.org/articles/v17/168
Observations confirm a theoretical model explaining how—in Earth's magnetosphere—large-scale magnetic waves heat up the magnetosphere's plasma by transferring their energy to smaller-scale acoustic waves.
Ocean currents spin off huge gyres, whose kinetic energy is transferred to ever-smaller turbulent structures until viscosity has erased velocity gradients and water molecules jiggle with thermal randomness. A similar cascade plays out in space when the solar wind slams into the magnetopause, the outer boundary of Earth's magnetic field. The encounter launches large-scale magnetic, or Alfvén, waves whose energy ends up heating the plasma inside the magnetosphere. Here, however, the plasma is too thin for viscosity to mediate the cascade. Since 1971 researchers have progressively developed their understanding of how Alfvén waves in space plasmas generate heat. These studies later culminated in a specific hypothesis: Alfvén waves accelerate ion beams, which create small-scale acoustic waves, which generate heat. Now Xin An of UCLA and his collaborators have found direct evidence of that proposed mechanism [1]. What's more, the mechanism is likely at work in the solar wind and other space plasmas.
Laboratory-scale experiments struggle to capture the dynamics of rotating plasmas, and real-world observations are even more scarce. The observations that An and his collaborators analyzed were made in 2015 by the four-spacecraft Magnetospheric Multiscale (MMS) mission. Launched that year, the MMS was designed to study magnetic reconnection, a process in which the topology of magnetic-field lines is violently transformed. The field rearrangements wrought by reconnection can be large, on the scale of the huge loops that sprout from the Sun's photosphere. But the events that initiate reconnection take place in a much smaller region where neighboring field lines meet, the X-line. The four spacecraft of MMS can fly in a configuration in which all of them witness the large-scale topological transformation while one of them could happen to fly through the X-line—a place where no spacecraft had deliberately been sent before.
On September 8, 2015, the orbits of the MMS spacecraft took them through the magnetopause on the dusk side of Earth. They were far enough apart that together they could detect the passage of a large-scale Alfvén wave, while each of them could individually detect the motion of ions in the surrounding plasma. An and his collaborators later realized that these observations could be used to test the theory that ion beams and the acoustic waves that they generate mediate the conversion of Alfvén-wave energy to heat.
Data from the various instruments aboard the MMS spacecraft show signatures from all three factors that drive the energy cascade: Alfvén waves and ion beams, both of which have length scales of about 2000 km, and acoustic waves, which have length scales of 50–1500 m. Crucially, the instruments also recorded connections between the processes. The Alfvén waves' magnetic-pressure variations were in sync with fluctuations in ion density and the local electric field, while the ion beams' speeds matched those of either the local Alfvén waves or the acoustic waves.
Reference:
Xin An, Anton Artemyev, Vassilis Angelopoulos, Terry Z. Liu, Ivan Vasko, and David Malaspina, "Cross-Scale Energy Transfer from Fluid-Scale Alfvén Waves to Kinetic-Scale Ion Acoustic Waves in the Earth's Magnetopause Boundary Layer," Phys. Rev. Lett. 133, 225201 (DOI: https://doi.org/10.1103/PhysRevLett.133.225201)
"This bill seeks to set a new normative value in society that accessing social media is not the defining feature of growing up in Australia. There is wide acknowledgement that something must be done in the immediate term to help prevent young teens and children from being exposed to streams of content unfiltered and infinite."
(Michelle Rowland, Minister for Communications, Australian Parliament, Nov 21)
Australia's House of Representatives has passed a bill that would ban access to social media platforms TikTok, Facebook, Snapchat, Reddit, X and Instagram for youngsters under 16. The bill passed by 102 votes to 13.
Once the bill gets through the Senate -- expected this week -- the platforms would have a year to work out how to implement the age restriction, without using government-issued identity documents (passports, driving licenses) and without digital identification through a government system.
The leaders of all eight Australian states and mainland territories have unanimously backed the plan, although Tasmania, the smallest state, would have preferred the threshold was set at 14.
There are some counter-noises though (no, not you, Elon). More than 140 academics signed an open letter to Prime Minister Anthony Albanese condemning the 16-year age limit as "too blunt an instrument to address risks effectively."
The writers of that open letter fear that the responsibility of giving access to social media will fall on the parents, and "not all parents will be able to manage the responsibility of protection in the digital world".
Further, "Some social media 'type' services appear too integral to childhood to be banned, for example short form video streamers. But these too have safety risks like risks of dangerous algorithms promoting risky content. A ban does not function to improve the products children will be allowed to use."
The open letter pleads instead for systemic regulation, which "has the capacity to drive up safety and privacy standards on platforms for all children and eschews the issues described above. Digital platforms are just like other products, and can have safety standards imposed."
Australia's ban on social media will be a world first, with fines of up to 50 million Australian dollars for each failure to prevent youngsters from having a social media account.
From ban children under the age of 16 from accessing social media we also get the following:
Under the laws, which won't come into force for another 12 months, social media companies could be fined up to $50 million for failing to take "reasonable steps" to keep under 16s off their platforms. There are no penalties for young people or parents who flout the rules. Social media companies also won't be able to force users to provide government identification, including the Digital ID, to assess their age.
Social Media, or an "age-restricted social media platform" has been defined in the legislation as including services where:
- the "sole purpose, or a significant purpose" is to enable "online social interaction" between people
- people can "link to, or interact with" others on the service
- people can "post material", or
- it falls under other conditions as set out in the legislation.
Tracking Indoor Location, Movement and Desk Occupancy in the Workplace: A case study on technologies for behavioral monitoring and profiling using motion sensors and wireless networking infrastructure inside offices and other facilities
As offices, buildings and other corporate facilities become networked environments, there is a growing desire among employers to exploit data gathered from their existing digital infrastructure or additional sensors for various purposes. Whether intentionally or as a byproduct, this includes personal data about employees, their movements and behaviors.
Technology vendors are promoting solutions that repurpose an organization's wireless networking infrastructure as a means to monitor and analyze the indoor movements of employees and others within buildings. While GPS technology is too imprecise to track indoor location, Wi-Fi access points that provide internet connectivity for laptops, smartphones, tablets and other networked devices can be used to track the location of these devices. Bluetooth, another wireless technology, can also be used to monitor indoor location. This can involve Wi-Fi access points that track Bluetooth-enabled devices, so-called "beacons" installed throughout buildings, and Bluetooth-enabled badges carried by employees. In addition, employers can utilize badging systems, security cameras and video conferencing technology installed in meeting rooms for behavioral monitoring, or even environmental sensors that record room temperature, humidity and light intensity. Several technology vendors provide systems that use motion sensors installed under desks or in the ceilings of rooms to track room and desk attendance.
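To make the Wi-Fi localization concrete, here is a minimal sketch of how signal strength from multiple access points can place a device indoors. It assumes a textbook log-distance path-loss model with illustrative parameters (reference power `p0`, path-loss exponent `n`) and classic trilateration; the AP positions and values are hypothetical, not any vendor's actual method.

```python
import math

def rssi_to_distance(rssi, p0=-40.0, n=2.0):
    """Invert the log-distance path-loss model rssi = p0 - 10*n*log10(d).

    p0 is the received power (dBm) at 1 m and n the path-loss exponent;
    both are assumed, illustrative values here.
    """
    return 10 ** ((p0 - rssi) / (10 * n))

def trilaterate(aps, dists):
    """Estimate an (x, y) position from three AP positions and ranges.

    Subtracting the first circle equation from the other two cancels
    the x^2 + y^2 terms, leaving a 2x2 linear system solved by Cramer's
    rule.
    """
    (x0, y0), d0 = aps[0], dists[0]
    rows = []
    for (xi, yi), di in zip(aps[1:], dists[1:]):
        a, b = 2 * (xi - x0), 2 * (yi - y0)
        c = d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2
        rows.append((a, b, c))
    (a1, b1, c1), (a2, b2, c2) = rows
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Hypothetical APs at known positions (meters) hearing a device:
aps = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
rssis = [-53.98, -58.13, -56.53]          # illustrative readings (dBm)
position = trilaterate(aps, [rssi_to_distance(r) for r in rssis])
```

Real deployments average many noisy readings and typically fit the path-loss parameters per site, but the geometry is the same: each RSSI becomes a range estimate, and overlapping ranges pin down a location.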
[Source]: Cracked Labs
[Case Study]: https://crackedlabs.org/dl/CrackedLabs_Christl_IndoorTracking.pdf [PDF]
[Also Covered By]: The Register
Intel has a serious problem with Arrow Lake and memory compatibility:
We've had Intel's Arrow Lake chips on the test benches lately, and the performance we were expecting wasn't quite there. Even on some of the best LGA1851 motherboards, we've noticed various performance quirks that need more than just a quick fix. One of those odd behaviors is DDR5 RAM sticks that refuse to boot at XMP settings, regardless of what speed they are rated for.
This is separate from the latency issue caused by the slower bus speed and the move of the memory controller to the SoC tile instead of the compute tile, and Arrow Lake seems pickier with memory than any Intel platform I can remember. Intel has long been the processor to get for fast, low-latency RAM, and that streak has now ended.
RAM compatibility for Arrow Lake is abysmal; Intel used to be the gold standard for fast RAM support, but no longer
I've been around computing a long time, long enough that double data rate (DDR) RAM wasn't even a thing yet. Every new DDR revision has come with teething issues, as have most major CPU architecture changes. DDR5 still isn't a mature technology, but it's getting close, as faster speeds combined with low timings are becoming more common. Arrow Lake is the first major architecture change from Intel since 2021's Alder Lake, when both DDR4 and DDR5 were supported.
These two changes combined seem to have caused more issues, and this is one of the worst launches for memory compatibility I've had hands-on experience with. Arrow Lake feels pickier than even first-gen Ryzen, back when Samsung B-die was king for DDR4 and you could spend hours poring over spec sheets and forums to find the memory kits using the vaunted DRAM modules.
But Ryzen at least had a common fix: every Samsung B-die memory stick worked fine once the DDR and SoC voltages were raised slightly. When the Core Ultra 9 285K was on the test bench, I tried nearly a dozen different DDR5 kits, and not one would boot with XMP enabled. Those kits ranged from 5,600MT/s to 8,800MT/s and from 16GB to 32GB per DIMM. Some were early DDR5 kits with XMP support, some were recent, and two were of the new CUDIMM variety, which have an onboard clock driver to enable faster RAM speeds on Arrow Lake specifically. Some even had trouble booting at JEDEC speeds, which I've never experienced on any platform.
The only kit that did boot at higher speeds was from Kingston: 8,800MT/s CUDIMMs. Even then, none of the BIOS settings I applied from years of RAM overclocking on multiple platforms worked. I had to boot into Windows and use Gigabyte's AI Snatch program, which tested the RAM and used algorithms to decide what speed and timings the kit should use. After a reboot into the BIOS to enable those AI-generated settings and ensure the DDR voltage was set to 1.45V, it booted into Windows at 8,933MT/s.
There was one last issue, however: the RAM would only run in Gear 4. Most low-latency DDR4 memory runs in Gear 1; Gear 2 is a way to get higher speeds at a slight latency penalty, as it runs the memory at a 2:1 ratio relative to the memory controller. Most DDR5 starts in Gear 2, and to reach higher speeds Arrow Lake drops to a 4:1 ratio, aka Gear 4. That's a huge latency hit, on top of the considerable latency Arrow Lake has by design. And remember, this is one kit out of nearly a dozen that could run at or above its rated speed.
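The gear-ratio arithmetic above can be sketched in a few lines. This is a simplified model, not Intel's documentation: it assumes DDR transfers twice per clock and that "Gear N" runs the memory controller at 1/N of the DRAM clock.

```python
def controller_mhz(transfer_rate_mts: float, gear: int) -> float:
    """Memory-controller clock (MHz) under a simplified gear model."""
    dram_clock = transfer_rate_mts / 2   # DDR: two transfers per clock
    return dram_clock / gear             # Gear N = controller at 1/N

# DDR4-3200 in Gear 1: the controller keeps pace with the DRAM clock.
print(controller_mhz(3200, 1))   # 1600.0
# DDR5-8933 in Gear 4: the controller falls to a quarter of it.
print(controller_mhz(8933, 4))   # 1116.625
```

The widening gap between DRAM clock and controller clock is where the extra latency comes from: every request has to cross that slower clock domain.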
XMP speeds will get fixed for the most part, as we've seen with AMD's Ryzen and how much better it handles RAM compatibility since its initial release. Arrow Lake is Intel's Ryzen moment, with the potential to do much more and build better CPUs in the future. It just has to get there, and the ring bus that shuttles data between the CPU and L3 cache and the hop to the memory controller are two things that need improving for the next silicon release. Intel can mitigate some of the aspects of the memory hit, but it can't do much about the inherent latency of the trip between the SoC tile and the compute tile.
It will likely involve a combination of silicon fixes, Windows, and driver improvements, as AMD did with Ryzen. If the computer's software is aware of the latency, it can be rewritten to account for it somewhat and make the system snappier as a result.
[...] Intel might not be able to fix everything [...] the hardware limitations of Arrow Lake's design means a true fix is unlikely.
The good news for consumers (and for Intel) is that the company has identified a combination of tuning and optimization issues that it can fix, and an update should be coming soon. That should improve gaming and productivity performance, and improve the overall experience while using Arrow Lake chips. We're looking forward to retesting at that time to see if we have to revise our review scores, but have realistic expectations on the performance bump because there's one thing that no amount of optimization can wipe out.
That's the inherent latency in the memory pipeline of Arrow Lake chips, because it's baked into the fabric of the hardware. Moving the IMC away from the compute tile seems to have compounded any other optimization issues. Remember, when Ryzen first launched, the IMC was on the CCX, and AMD still had memory latency issues, partly because of inter-CCX data transfers. Later versions of Ryzen moved the IMC onto the I/O chiplet, but AMD was able to reduce the memory latency penalty because of how it designed the Infinity Fabric interconnect.
It already seems that Intel is returning to an integrated IMC: speculation and leaks around Panther Lake suggest the IMC will sit on the compute tile again. That might be expected anyway, as Panther Lake is a mobile chip and wouldn't have the space on the packaging substrate for an SoC tile. Nova Lake, the successor to Arrow Lake, will reportedly move the IMC off the compute tile again, with more optimizations to reduce the latency hit. Or at least that's the plan, going by the speculation.
[...] So, where does this leave Intel? The troubled chipmaker was already struggling with designs, as it canceled Meteor Lake's desktop chips so that the team could focus on Arrow Lake. One can only imagine how much worse things could have been if that hadn't happened and the engineering team had to work on two CPU lines at once. The other thing is that Arrow Lake is the first new architectural change since 2021, so Intel is already running behind on its usual tick-tock cycle of process change followed by refinement. Even with a good deal of engineering talent having left for Apple to build Apple Silicon, Intel still has plenty of talent on deck, so it's more a question of when, rather than if, it finds its groove again.
http://www.righto.com/2024/11/antenna-diodes-in-pentium-processor.html
I was studying the silicon die of the Pentium processor and noticed some puzzling structures where signal lines were connected to the silicon substrate for no apparent reason. Two examples are in the photo below, where the metal wiring (orange) connects to small square regions of doped silicon (gray), isolated from the rest of the circuitry. I did some investigation and learned that these structures are "antenna diodes," special diodes that protect the circuitry from damage during manufacturing. In this blog post, I discuss the construction of the Pentium and explain how these antenna diodes work.
Before reading the article ask yourself "What percentage of Americans do you think are [fill in the blank]?"
Now go see how you did.
Yesterday Jemele Hill recirculated a study YouGov did in 2022 about the gaps between people's perceptions and reality.
YouGov asked a series of questions on "What percentage of Americans do you think are [fill in the blank]?" with the [blank] being all sorts of qualities: black, gay, Christian, left-handed, own a passport, etc.
TLDR: there are a lot of stupid people out there.
There have been some past rumblings on the internet about a capacitor being installed backwards in Apple's Macintosh LC III. The LC III was a "pizza box" Mac model produced from early 1993 to early 1994, mainly targeted at the education market. It also manifested as various consumer Performa models: the 450, 460, 466, and 467. Clearly, Apple never initiated a huge recall of the LC III, so I think there is some skepticism in the community about this whole issue. Let's look at the situation in more detail and understand the circuit. Did Apple actually make a mistake?
I participated in the discussion thread at the first link over a decade ago, but I never had a machine to look at with my own eyes until now. I recently bought a Performa 450 complete with its original leaky capacitors, and I have several other machines in the same form factor. Let's check everything out!
Forty-four of the world's leading climate scientists have called on Nordic policymakers to address the potentially imminent and "devastating" collapse of key Atlantic Ocean currents.
In an open letter published online Monday (Oct. 21), University of Pennsylvania climatologist Michael Mann and other eminent scientists say the risks of weakening ocean circulation in the Atlantic have been greatly underestimated and warrant urgent action.
The currents in question are those forming the Atlantic Meridional Overturning Circulation (AMOC), a giant ocean conveyor belt that includes the Gulf Stream and transports vital heat to the Northern Hemisphere. Research shows the AMOC is slowing down and could soon reach a tipping point due to global warming, throwing Earth's climate into chaos.
Icy winds howl across a frozen Thames, ice floes block shipping in the Mersey docks, and crops fail across the UK. Meanwhile, the US east coast has been inundated by rising seas and there's ecological chaos in the Amazon as the wet and dry seasons have switched around... The world has been upended. What's going on?
While these scenes sound like something from a Hollywood disaster movie, a new scientific study investigating a key element of Earth's climate system – the Atlantic Meridional Overturning Circulation (AMOC) – says this could occur for real as soon as 2050.
(arxiv) Probability Estimates of a 21st Century AMOC Collapse
Abstract
There is increasing concern that the Atlantic Meridional Overturning Circulation (AMOC) may collapse this century with a disrupting societal impact on large parts of the world. Preliminary estimates of the probability of such an AMOC collapse have so far been based on conceptual models and statistical analyses of proxy data. Here, we provide observationally based estimates of such probabilities from reanalysis data. We first identify optimal observation regions of an AMOC collapse from a recent global climate model simulation. Salinity data near the southern boundary of the Atlantic turn out to be optimal to provide estimates of the time of the AMOC collapse in this model. Based on the reanalysis products, we next determine probability density functions of the AMOC collapse time. The collapse time is estimated between 2037-2064 (10-90% CI) with a mean of 2050 and the probability of an AMOC collapse before the year 2050 is estimated to be 59±17%.
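As a back-of-the-envelope check on those figures (a sketch only; the paper builds its probability density functions from reanalysis data, not from a fitted normal): if the collapse time were normally distributed, the quoted 10-90% interval of 2037-2064 would imply a standard deviation of roughly 10.5 years, and a mean of 2050 would put the probability of collapse before 2050 at exactly 50%. The paper's 59±17% therefore implies an asymmetric, left-leaning distribution.

```python
from statistics import NormalDist

z = NormalDist()                           # standard normal
lo, hi, mean = 2037, 2064, 2050            # paper's 10-90% CI and mean

# Width of a 10-90% interval in standard-normal units (~2.563 sigma):
span = z.inv_cdf(0.9) - z.inv_cdf(0.1)
sigma = (hi - lo) / span                   # ~10.5 years
p_before_2050 = NormalDist(mean, sigma).cdf(2050)   # 0.5 by symmetry
```

Since any symmetric distribution centered on 2050 gives exactly 50%, the reported 59% means the reanalysis-based PDFs put more mass on the early side of the mean.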
Here's what I'll miss about Chrome OS once it turns into Android
[...] Chrome OS has an expiration date. It's not right around the corner — it'll probably take a couple of years, but one day, all Chromebooks will run Android instead of Chrome OS so that Google can better compete with Apple.
While I can understand why Google streamlining its operating systems is probably a good thing in the long run, losing Chrome OS will come with some growing, or rather, shrinking pains. It will force Google to choose between Android and Chrome OS for the future of several useful features, and I'm nervous that some of my favorites will disappear.
[....] Right now, regular updates are one of my favorite reasons to recommend Chromebooks. You don't usually have to worry about how many years of support your light, fast laptop is promised because it will get a brand-new version of Chrome OS every four weeks.
[....] The problem with Google shifting from Chrome OS to Android is that, well, Android updates don't work in quite the same way. Rather than pushing one update to every device, each OEM has to take the time to optimize Google's latest product to work with its own Android skin. That optimization delays the update schedule, sometimes to the point where a phone will fall behind by a version or two. And, when that happens, it almost never really catches up.
[....] Right now, organizing files on a Chromebook feels like it should — it's very desktop-coded. Everything lives inside a folder like you'd find on a Windows or Mac laptop, and you can quickly sort by everything from title to file type for easy access. When you find what you need, you can then pin it to your Chromebook's taskbar, keeping it just a tap away. Want to do that on an Android phone? There's no space for more icons at the bottom of your display.
One useful feature of Chrome OS is that you can run a Linux VM within it if you have decent Chromebook hardware.