

posted by janrinok on Wednesday April 24, @04:38PM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Rotary engines (also known as Wankel engines and Wankel rotary engines) are quite different from piston or "reciprocating" engines. One of the distinguishing features is that they don't need valves to operate. That's to say, the Wankel rotary engine design doesn't have valves — but quite a few rotary engine designs have incorporated them. Both rotary engines and piston engines utilize internal combustion and share the same phases of intake, compression, power, and exhaust. But beyond this similarity, they are very different in design and operation.

The work done by a rotary engine doesn't need to be converted into rotational motive power for use by propellers or transmission, as is the case with piston engines. As a result, they are considered more efficient by some metrics and require fewer moving parts to function. The first rotary engines were primarily used for aircraft during World War I, but the design was abandoned due to flaws and inefficiencies.

In 1954, German engineer Felix Wankel invented a new design for an automobile rotary engine for the German car and bike company NSU. After prototype testing by NSU in the following years, they entered into an agreement with Japanese company Mazda to develop Wankel rotary engines for its cars. The first Mazda cars with rotary engines were launched in Japan in the 1960s before crossing the Pacific to America in 1971. Remaining one of the few companies that stuck with the rotary engine design, Mazda has, over the years, developed some of the most innovative engines of this type.

These engines create power by combusting a mixture of compressed air and fuel within a chamber or cylinder, translating the displacement of the rotor or piston into motion. Wankel rotary engines feature an equilateral triangular rotor with convex edges in an ovaloid chamber or rotor housing (the shape is an epitrochoid where the long sides of a symmetrical oval have two curves like a slight figure eight).
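The housing curve described above can be sketched numerically. A minimal Python illustration of the two-lobe epitrochoid (the radius and eccentricity values here are illustrative placeholders in the ballpark of a Mazda-sized rotor, not figures from the article):

```python
import math

def wankel_housing(R=105.0, e=15.0, n=12):
    """Sample points on a Wankel-style epitrochoid housing.

    R: generating radius (rotor center to apex), mm
    e: shaft eccentricity, mm
    The classic two-lobe housing traced by the rotor apexes:
        x = e*cos(3t) + R*cos(t),  y = e*sin(3t) + R*sin(t)
    """
    pts = []
    for i in range(n):
        t = 2 * math.pi * i / n
        x = e * math.cos(3 * t) + R * math.cos(t)
        y = e * math.sin(3 * t) + R * math.sin(t)
        pts.append((x, y))
    return pts

pts = wankel_housing()
xs = [p[0] for p in pts]
print(max(xs) - min(xs))  # widest span through both lobes, ~2*(R+e) = 240 mm
```

The small eccentricity `e` relative to `R` is what gives the oval its slight "figure eight" waist rather than a plain ellipse.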

As the triangular rotor moves through the chamber, all three of its apexes are in constant contact with the housing (thanks to rotary engine apex seals), creating three gas volumes that are isolated by the rotor. There are two ports on the same side of the figure eight housing, with the top one for intake of fuel and air and the bottom one for the exhaust of combusted gases. Intake can be assisted by a supercharger pushing air into the chamber, while exhaust can be assisted by a turbocharger pulling air out of the chamber.

As the leading edge of the rotor passes the intake port, it creates a vacuum, inducing (or pulling) air and fuel into the chamber. The rotor's motion then compresses the air-fuel mixture, which is ignited by a spark plug (two in Mazda's design), further propelling the rotor along its path and sending the combusted gases out of the exhaust port. The rotor is attached to an eccentric output shaft (containing lobes not on the center of the shaft) via a gear, and as it rotates, the shaft produces a torque that's used to turn the transmission.

So far, we've explained the most basic Wankel rotary engine, but Mazda — the company that popularized it — has developed several variations over the years. One improvement included a concave pocket on each of the three convex edges of the rotor, which increased the amount of volume within the epitrochoid housing.

Mazda also attempted to solve one of the biggest problems of the Wankel rotary engine — the incomplete combustion of the air-fuel mixture — by incorporating two spark plugs to combust both compressed air-fuel volumes created within a single cycle of the rotor. In its latest rotary engine design, used in the November 2023 Mazda MX-30 Skyactiv R-EV, the Japanese company also moved the fuel intake to the top of the housing, separating it from the air intake; this keeps fuel in the main intake area and increases atomization, mitigating the incomplete combustion of fuel.

Another Mazda rotary engine innovation is using an auxiliary intake port valve that can swivel open to increase the amount of air during the intake phase when required for more power at higher RPMs. Later designs of the Mazda rotary engine included two more valves for greater air intake when needed at higher RPMs. So, to answer the question, do rotary engines have valves? Yes, they do, including in some of the most popular rotary engines of all time — those used in the Mazda RX-7 and Mazda RX-8.

Rotary engines deliver three power pulses per rotor revolution, instead of the single power stroke per cycle of a four-stroke engine. This, along with the way the rotor is mated to the gear of the output shaft, means that for every complete rotation of the rotor, the output shaft rotates three times rather than once. All four phases occur within a single cycle of the rotor, rather than one at a time as in a four-stroke piston engine.
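That 3:1 relationship is easy to sanity-check. A small Python sketch (the 9,000 RPM input is just an illustrative redline, not a figure from the article):

```python
def wankel_kinematics(shaft_rpm):
    """The eccentric output shaft turns three times per rotor revolution,
    and each of the rotor's three faces fires once per rotor revolution,
    so there is one power pulse per shaft revolution."""
    rotor_rpm = shaft_rpm / 3
    power_pulses_per_minute = rotor_rpm * 3  # equals shaft_rpm
    return rotor_rpm, power_pulses_per_minute

print(wankel_kinematics(9000))  # (3000.0, 9000.0)
```

The slow-turning rotor relative to the fast-spinning shaft is part of why apex-seal wear, not rotor speed, tends to be the life-limiting factor.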

There is also no need for a minimum of two valves (intake and exhaust) per cylinder as seen in a piston engine. Instead, fuel enters via intake ports that don't need to open and close. In a Wankel rotary engine, air and fuel are pulled in by a partial vacuum, and combusted gases are pushed out by pressure. However, various valves have been adopted to deliver variable valve timing for better air intake at different RPMs.

In its simplest form, a Wankel rotary engine has just two moving parts: the rotor and the output shaft. Compare that to modern piston cylinder engines, which have over 40 moving parts. Of course, in many iterations of the rotary engine, Mazda used two rotors rather than one, bringing the total number of moving parts to three. You get at least three more moving parts if you include the extra air intake valves the company added in later years.

Advantages of a rotary engine include its compact size, which makes it lightweight and gives it a higher power-to-weight ratio than piston engines. Its simple design with fewer moving parts makes it easier to produce, and the parts also move slower compared to piston engines, making rotary engines more reliable in terms of wear and tear. A reciprocating engine needs its power translated to rotational motion, whereas the rotary engine produces direct rotational motion, creating much lower vibration, higher RPMs, and smoother power delivery. Rotary engines are also touted for their multi-fuel capabilities, ranging from gasoline to ethanol and natural gas.

Disadvantages of a rotary engine include lower fuel efficiency, increased emissions (thanks to the use of oil — that is combusted alongside the air-fuel mixture — within the housing to better lubricate and seal the rotor), and regular replacement of the seals (apex, face, and side). Rotary engines also have reduced thermodynamic efficiency thanks to the large rotor housing with combustion happening only in certain sections, which leads to different temperatures of the housing, causing uneven expansion and difficulty in maintaining a seal. [...]


Original Submission

posted by janrinok on Wednesday April 24, @11:54AM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Microsoft this week demoed VASA-1, a framework for creating videos of people talking from a still image, audio sample, and text script, and claims – rightly – it's too dangerous to be released to the public.

These AI-generated videos, in which people can be convincingly animated to speak scripted words in a cloned voice, are just the sort of thing the US Federal Trade Commission warned about last month, after previously proposing a rule to prevent AI technology from being used for impersonation fraud.

Microsoft's team acknowledge as much in their announcement, which explains the technology is not being released due to ethical considerations. They insist that they're presenting research for generating virtual interactive characters and not for impersonating anyone. As such, there's no product or API planned.

"Our research focuses on generating visual affective skills for virtual AI avatars, aiming for positive applications," the Redmond boffins state. "It is not intended to create content that is used to mislead or deceive.

"However, like other related content generation techniques, it could still potentially be misused for impersonating humans. We are opposed to any behavior to create misleading or harmful contents of real persons, and are interested in applying our technique for advancing forgery detection."

Kevin Surace, Chair of Token, a biometric authentication biz, and frequent speaker on generative AI, told The Register in an email that while there have been prior technology demonstrations of faces animated from a still frame and cloned voice file, Microsoft's demonstration reflects the state of the art.

"The implications for personalizing emails and other business mass communication is fabulous," he opined. "Even animating older pictures as well. To some extent this is just fun and to another it has solid business applications we will all use in the coming months and years."

The "fun" of deepfakes was 96 percent nonconsensual porn, when assessed [PDF] in 2019 by cybersecurity firm Deeptrace.

Nonetheless, Microsoft's researchers suggest that being able to create realistic looking people and put words in their mouths has positive uses.

"Such technology holds the promise of enriching digital communication, increasing accessibility for those with communicative impairments, transforming education methods with interactive AI tutoring, and providing therapeutic support and social interaction in healthcare," they propose in a research paper that does not contain the words "porn" or "misinformation."

While it's arguable AI generated video is not quite the same as a deepfake, the latter defined by digital manipulation as opposed to a generative method, the distinction becomes immaterial when a convincing fake can be conjured without cut-and-paste grafting.

[...] In prepared remarks, Rijul Gupta, CEO of DeepMedia, a deepfake detection biz, said:

[T]he most alarming aspect of deepfakes is their ability to provide bad actors with plausible deniability, allowing them to dismiss genuine content as fake. This erosion of public trust strikes at the very core of our social fabric and the foundations of our democracy. The human brain, wired to believe what it sees and hears, is particularly vulnerable to the deception of deepfakes. As these technologies become increasingly sophisticated, they threaten to undermine the shared sense of reality that underpins our society, creating a climate of uncertainty and skepticism where citizens are left questioning the veracity of every piece of information they encounter.

But think of the marketing applications.


Original Submission

posted by janrinok on Wednesday April 24, @07:07AM   Printer-friendly

The universe may be dominated by particles that break causality and move faster than light, new paper suggests:

Could the cosmos be dominated by particles that move faster than the speed of light? This model of the universe agrees surprisingly well with observations, a pair of physicists has discovered.

In a new paper that has yet to be peer-reviewed, the physicists propose that our universe is dominated by tachyons — a hypothetical kind of particle that always moves faster than light. Tachyons almost certainly don't exist; going faster than light violates everything we know about the causal flow of time from past to future. But the hypothetical particles are still interesting to physicists because of the small chance that even our most closely held notions, like causality, might be wrong.

The researchers suggest that tachyons might be the true identity of dark matter, the mysterious form of matter that makes up most of the mass of almost every single galaxy in the universe, outweighing normal matter 5 to 1. Astronomers and physicists alike currently do not know what dark matter is made of, so they are free to cook up all manner of ideas — because, after all, sometimes a far-out idea is right, and even if it's wrong, it can help us on the path to a better understanding.

The researchers calculate that an expanding universe filled with tachyons can initially slow down in its expansion before reaccelerating. Our universe is currently in an accelerating phase, driven by a phenomenon known as dark energy, so this tachyon cosmological model can potentially explain both dark energy and dark matter at the same time.

To test this idea, the physicists applied their model to observations of Type Ia supernovae, a kind of stellar explosion that allows cosmologists to build a relationship between distance and the expansion rate of the universe. It was through Type Ia supernovae that astronomers in the late 1990s first discovered that the universe's expansion rate is accelerating.

The physicists found that a tachyon cosmological model was just as good at explaining the supernova data as the standard cosmological model involving dark matter and dark energy. That itself is a surprise, given how unorthodox this idea is.

However, that's only the beginning. We now have access to a wealth of data about the large-scale universe, like the cosmic microwave background (remnant radiation released just after the Big Bang) and the arrangement of galaxies at the very largest scales. The next step is to continue testing this idea against those additional observations.

The tachyon cosmological model is unlikely to pass those rigorous experimental tests, given the unlikely nature of tachyons. But continuing to push in new, even unorthodox, directions is important in cosmology; we never know when we might get a breakthrough. Scientists have been attempting to understand dark matter for 50 years and dark energy for a quarter century, without any conclusive results. The solutions to these conundrums are likely to come from unexpected directions.

The team's research was published to the preprint database arXiv in March.


Original Submission

posted by janrinok on Wednesday April 24, @02:22AM   Printer-friendly
from the Evolution-never-stops dept.

Braarudosphaera bigelowii is a species of algae, a coastal coccolithophore with a fossil record going back 100 million years. It has recently been found to have engulfed a cyanobacterium that lets it do something that algae, and plants in general, can't normally do: "fixing" nitrogen straight from the air and combining it with other elements to create more useful compounds.

Nitrogen is a key nutrient, and normally plants and algae get theirs through symbiotic relationships with bacteria that remain separate. At first it was thought that B. bigelowii had hooked up this kind of situation with a bacterium called UCYN-A, but on closer inspection, scientists discovered that the two have gotten far more intimate.

In one recent study, a team found that the size ratio between the algae and UCYN-A stays similar across different related species of the algae. Their growth appears to be controlled by the exchange of nutrients, leading to linked metabolisms.

"That's exactly what happens with organelles," said Jonathan Zehr, an author of the studies. "If you look at the mitochondria and the chloroplast, it's the same thing: they scale with the cell."

In a follow-up study, the team and other collaborators used a powerful X-ray imaging technique to view the interior of the living algae cells. This revealed that the replication and cell division was synchronized between the host and symbiote – more evidence of primary endosymbiosis at work.

And finally, the team compared the proteins of isolated UCYN-A to those inside the algal cells. They found that the isolated bacterium can only produce about half of the proteins it needs, relying on the algal host to provide the rest.

"That's one of the hallmarks of something moving from an endosymbiont to an organelle," said Zehr. "They start throwing away pieces of DNA, and their genomes get smaller and smaller, and they start depending on the mother cell for those gene products – or the protein itself – to be transported into the cell."

Altogether, the team says this indicates UCYN-A is a full organelle, which is given the name of nitroplast. It appears that this began to evolve around 100 million years ago, which sounds like an incredibly long time but is a blink of an eye compared to mitochondria and chloroplasts.

The researchers plan to continue studying nitroplasts, to find out if they're present in other cells and what effects they may have. One possible benefit is that it could give scientists a new avenue to incorporate nitrogen-fixing into plants to grow better crops.

-----
Hopefully this behavior has been going on for tens of millions of years; if it is a newer development, it could become a significant evolutionary advantage, radically changing the biome in a short time.


Original Submission

posted by janrinok on Tuesday April 23, @09:38PM   Printer-friendly

The EU is at it again -- with unleashing a new raft of legislation upon world+donkey.

Maybe some of the colored-tape bureaucrats are avid readers of SoylentNews, as this time they've got top management in their crosshairs.

EU members need to implement the directive into national law by January 16, next year (2025). Full text of the directive here, interesting reviews here and here, and a link to the EU's wider Cybersecurity Strategy (which also involves security of hardware and software products) here.

The directive in question is the second generation of the EU's Cybersecurity Directive (NIS2). The new legislation widely extends its scope to nearly any company with more than 50 employees and €10M+ in yearly revenue. On top of that, the number of industrial sectors deemed essential in terms of critical infrastructure doubles from 6 to 12, including ICT service management, government institutions, post and courier services, manufacturing companies, the food-processing industry, waste water management, space companies, research organisations and the chemical industry as a whole. Suppliers to these companies can also fall under the new regulation.

In practice, national centers for cybersecurity will be responsible for executing cybersecurity checks through audits and/or unannounced security scans. If the target company neglects their recommendations, it risks heavy fines: at least 2 percent of worldwide revenue, up to a maximum of €10 million, for companies with more than 250 employees or more than €50 million in yearly revenue. Smaller companies risk at least 1.4 percent of yearly revenue, with a maximum of €7 million.
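As a rough illustration of the thresholds summarized above — a literal reading of this summary only, with the "at least X percent" figures modeled as caps; the directive's own text and national implementations govern, and the function name and tiering here are my own:

```python
def nis2_sketch(employees, revenue_eur):
    """Rough sketch of NIS2 scope and fine ceilings as summarized above.

    Returns (in_scope, approximate_max_fine_eur).
    """
    # Scope: more than 50 employees and over 10M EUR yearly revenue
    in_scope = employees > 50 and revenue_eur > 10_000_000
    if not in_scope:
        return in_scope, 0.0
    if employees > 250 or revenue_eur > 50_000_000:
        # larger entities: 2% of worldwide revenue, max 10M EUR
        max_fine = min(0.02 * revenue_eur, 10_000_000)
    else:
        # smaller entities: 1.4% of yearly revenue, max 7M EUR
        max_fine = min(0.014 * revenue_eur, 7_000_000)
    return in_scope, max_fine

print(nis2_sketch(300, 60_000_000))  # (True, 1200000.0)
```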

These fines cannot just be classified under company expenses, though. Under the new regulation, CEOs and board members are obligated to follow cybersecurity training, and to sign off on all cybersecurity measures. They are deemed personally responsible, and run the risk of being barred temporarily from similar roles, and -- most importantly -- of having to pay the resulting fine out of their own pocket, not through the company.


Original Submission

posted by janrinok on Tuesday April 23, @04:49PM   Printer-friendly

https://hackaday.com/2024/04/19/end-of-life-for-z80-cpu-and-peripherals-announced/

Zilog To End Standalone Sales Of The Legendary Z80 CPU

Zilog's parent company Littelfuse has notified customers and distributors that it's the end of life for the good ol' Z80. In an End of Life / Last Time Buy notification (https://www.mouser.com/PCN/Littelfuse_PCN_Z84C00.pdf) they state:

"Please be advised that our Wafer Foundry Manufacturer will be discontinuing support for the Z80 product and other product lines."

You can place a final order up until 6/14, if you need to stock up on Z80s.

Arthur T Knackerbracket has processed the following story:

End of an Era: End-Of-Life for the Venerable Zilog Z80

Production of some models of Z80 processor – the chip that helped spark the PC boom of the 1980s – will cease in June 2024 after an all-too-brief 48 years.

The Z80 debuted in 1976, using a 4-micron process. Readers will doubtless be aware that some modern silicon is made on a 4-nanometer process – meaning elements are 1,000 times smaller than those etched into a Z80.

Zilog will accept orders for the device until June 14, 2024. After that, it's the end for the eight-bit CPU – or at least the Z84C00 range. Zilog appears to still make the Z180 and eZ80 – successors that added lots of bells and whistles and are often packaged into SoCs.

The original Z80 packed just 8,500 transistors and chugged along at 2.5 MHz, but that was enough to power lots of fun stuff – helped by the fact that it was compatible with Intel's 8080 processor and sold at a cheaper price.

The Sinclair ZX range was perhaps the most famous application of the Z80, using it to power affordable and accessible machines that introduced many Register readers (and writers) to tech. The chip also found its way into arcade games such as Pac-Man, and early Roland synthesizers.

But Zilog was overtaken by Intel in the PC market, and by the 1990s decided to focus on microcontrollers instead. The Z80 was one of its key offerings, and over the years was adapted and enhanced: we even spotted a new variant of the chip in 2016!

That sort of upgrade helped the processor and its heirs to hold on in some consumer-facing applications such as graphing calculators like the TI-84 Plus CE. But it mostly disappeared into industrial kit, where it hummed along reliably and offered developers a tried-and-true target for their code.

Perhaps someone will place a giant order for Z84C00s to hoard them, so that those committed to the platform can continue to get kit – a plausible scenario given the likelihood the processor retains a hidden-but-critical role in defense or some legacy tech that will persist for decades.

Or perhaps there's one last batch of ZX Spectrums to be made!


Original Submission #1 | Original Submission #2

posted by hubie on Tuesday April 23, @12:06PM   Printer-friendly

US Air Force successfully tests AI-controlled fighter jet in dogfight against human pilots:

In what has been hailed as a milestone moment for artificial intelligence's use in the military, the US Air Force has announced that it successfully tested a modified, AI-controlled fighter jet in a dogfight against human pilots last year.

The US Air Force Test Pilot School and the Defense Advanced Research Projects Agency (DARPA) first started testing a Lockheed Martin X-62A VISTA (Variable In-flight Simulation Test Aircraft) fitted with AI software back in December 2022, as part of the Air Combat Evolution (ACE) program. Able to mimic the performance characteristics of other aircraft, the X-62A was flown by the AI for over 17 hours.

DARPA revealed on Thursday that in September 2023, the X-62A carried out the first successful AI versus human within-visual-range engagements, also known as a dogfight.

The AI dogfights paired the X-62A VISTA against manned F-16 aircraft in the skies above Edwards Air Force Base in Kern County, California. After initial flight safety was built up using defensive maneuvers, the aircraft switched to offensive high-aspect nose-to-nose engagements where they approached within 2,000 feet of each other at 1,200 miles per hour. DARPA did not reveal which side won the simulated battle.
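For a sense of scale, taking the quoted 1,200 mph figure as the closure rate (the article doesn't say whether that is per aircraft or combined), the 2,000-foot gap closes in about a second:

```python
closure_mph = 1200
gap_ft = 2000

# 1 mile = 5280 ft, 1 hour = 3600 s
closure_fps = closure_mph * 5280 / 3600   # 1760.0 ft/s
time_to_close_s = gap_ft / closure_fps
print(round(time_to_close_s, 2))  # 1.14
```

At those closure rates, reaction windows are well under what a human pilot can exploit, which is part of the motivation for testing autonomous agents in within-visual-range engagements.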

[...] In addition to its autonomous flight capabilities, the X-62A VISTA, a modified F-16, also features a high-resolution camera, a compact size, lightweight construction, and is versatile enough to be used for a wide range of applications, including scientific research, surveillance, recon, environmental monitoring, and emergency response.

Could Air Force pilots be yet another profession that is eventually threatened by AI? The 2020 AlphaDogfight Trials, a three-day competition designed to demonstrate advanced algorithms capable of dogfighting (using VR simulations), saw an experienced F-16 Air Force pilot lose 5-0 to the AI agent. DARPA said the machine performed aggressive and precise maneuvers that the human pilot could not match.

DARPA says that the X-62A VISTA will continue to serve a variety of customers for research while providing key academic lessons for the next generation of test leaders.


Original Submission

posted by hubie on Tuesday April 23, @07:21AM   Printer-friendly
from the complaints-department-5000-miles-> dept.

https://arstechnica.com/gadgets/2024/04/linus-torvalds-reiterates-his-tabs-versus-spaces-stance-with-a-kernel-trap/

Anybody can contribute to the Linux kernel, but any contributor's commit can draw the scrutiny of the kernel's master and namesake, Linus Torvalds. Torvalds is famously not overly committed to niceness, though he has been working on it since 2018. You can see glimpses of this newer, less curse-laden approach in how Torvalds recently addressed a commit with which he vehemently disagreed. It involves tabs.
[...]
By attempting to smooth over one tiny part of the kernel so that a parsing tool could see a space character as a delineating whitespace, Prasad Pandit inadvertently spurred a robust rebuttal:

It wasn't clear what tool it was, but let's make sure it gets fixed. Because if you can't parse tabs as whitespace, you should not be parsing the kernel Kconfig files.

In fact, let's make such breakage more obvious than some esoteric ftrace record size option. If you can't parse tabs, you can't have page sizes.

Yes, tab-vs-space confusion is sadly a traditional Unix thing, and 'make' is famous for being broken in this regard. But no, that does not mean that it's ok.
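The story doesn't identify the offending tool, but the underlying bug pattern is easy to show. In Python, splitting on a literal space character breaks on a tab-separated Kconfig-style line, while splitting on any whitespace does not (the sample line below is hypothetical):

```python
line = "config\tTRACE_RECORD"   # hypothetical Kconfig-style line with a tab

broken = line.split(" ")   # only the space character delimits
robust = line.split()      # any run of whitespace (tabs included) delimits

print(broken)  # ['config\tTRACE_RECORD'] -- one token; the parse fails
print(robust)  # ['config', 'TRACE_RECORD']
```

This is Torvalds's point: a parser that treats only the space character as whitespace is broken by construction, and the fix belongs in the tool, not in the kernel's Kconfig files.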


Original Submission

posted by hubie on Tuesday April 23, @02:36AM   Printer-friendly

Arthur T Knackerbracket has processed the following story:

Interstellar debuted in late 2014, and features a star-studded cast including Matthew McConaughey, Anne Hathaway, Jessica Chastain, John Lithgow, and Matt Damon. It is set in a near dystopian future where humans have more or less destroyed Earth and are on the hunt for a new home elsewhere in the cosmos.

The flick earned an impressive $731 million at the box office during its first run a decade ago opposite a budget of just $165 million, and earned a score of 73 percent on Rotten Tomatoes versus a more favorable audience score of 86 percent.

Theatrical re-releases have been commonplace in Hollywood over the past few years and can be traced back to the Covid era. During the pandemic, film production halted worldwide and the theater industry nearly sank. Attendance at AMC theaters in the US fell 96.8 percent in Q3 2020 compared to the previous year. Movies that were already complete saw on-demand releases at home.

[...] Interstellar will return to theaters on September 27, and will be shown on 70mm Imax and digital screens.

Did you see it the first time? Would you want to see it, or see it again, in the theater rather than just stream it?


Original Submission

posted by janrinok on Tuesday April 23, @12:20AM   Printer-friendly
from the RIP dept.

https://arstechnica.com/science/2024/04/philosopher-daniel-dennett-dead-at-82/

World-renowned philosopher Daniel Dennett, who championed controversial takes on consciousness and free will among other mind-bending subjects, died today at the age of 82.

"He was a towering figure in philosophy and in particular in the philosophy of AI," roboticist Rodney Brooks (MIT, emeritus) wrote on X, bemoaning that he'd never replied to Dennett's last email from 30 days ago. "Now we have only memories of him."

Dennett's many books, while dense, nonetheless sold very well and were hugely influential, and he was a distinguished speaker in great demand. His 2003 TED talk, "The Illusion of Consciousness," garnered more than 4 million views. While he gained particular prominence as a leader of the "New Atheist" movement of the early 2000s—colorfully dubbed one of the "Four Horsemen of New Atheism" alongside Richard Dawkins, Christopher Hitchens, and Sam Harris—that was never his primary focus, merely a natural extension of his more central philosophical concerns.

"Dan Dennett was the embodiment of a natural philosopher—someone who was brilliant at the careful conceptual analysis that characterizes the best philosophy, while caring deeply about what science has to teach us about the natural world," Johns Hopkins University physicist and philosopher Sean Carroll told Ars. "At the same time, he was the model of a publicly engaged academic, someone who wrote substantive books that anyone could read and who had a real impact on the wider world. People like that are incredibly rare and precious, and his passing is a real loss."

Dennett was a confirmed compatibilist on the fiercely debated subject of free will, meaning that he saw no conflict between philosophical determinism and free will. "Our only notable divergence was on the question of free will, which Dan maintained exists, in some sense of 'free,' whereas I just agreed that 'will' exists, but maintained that there is no freedom in it," Douglas Hofstadter recalled.

I initially came across Dennett's writings in The Mind's I, the anthology he compiled with Douglas Hofstadter that also includes texts from other authors. It was when I first started to dip my toes into the philosophy of mind. He brings up some fascinating ideas from a rational perspective which always provoke a lot of thought and discussion.

Due to my being in the same philosophical camp as Dennett's great rival David Chalmers, I tend to side with those who dubbed Dennett's book Consciousness Explained "Consciousness Avoided" (Dennett actually wrote an epilogue in response to this accusation). This is because, as I understand it, Dennett always strove to explain consciousness using only axioms derived directly from accepted science, which means it all starts and ends with a third-person perspective. Dennett always raised some interesting counterarguments to other philosophers who attempted to discuss the really interesting first-person phenomena of consciousness, so whether you agree with him or not, he represented a key school of thought in modern philosophy.


Original Submission

posted by hubie on Monday April 22, @09:48PM   Printer-friendly
from the oops-sorry-about-that-excuse-me-my-bad dept.

https://arstechnica.com/gadgets/2024/04/thousands-complain-about-prime-videos-wrong-titles-lost-episodes-other-errors/

Subscribers lodged thousands of complaints related to inaccuracies in Amazon's Prime Video catalog, including incorrect content and missing episodes, according to a Business Insider report this week. While Prime Video users aren't the only streaming users dealing with these problems, Insider's examination of leaked "internal documents" brings more perspective into the impact of mislabeling and similar errors on streaming platforms.

Insider didn't publish the documents but said they show that "60 percent of all content-related customer-experience complaints for Prime Video last year were about catalogue errors," such as movies or shows labeled with wrong or missing titles.
[...]
Following Insider's report, however, Quartz reported that an unnamed source it described as "familiar with the matter" said the documents were out of date, despite Insider claiming that the leaked reports included data from 2023. Quartz's source also claimed that customer engagement was not affected.

Ars Technica reached out to Amazon for comment but didn't hear back in time for publication. The company told Insider that "catalogue quality is an ongoing priority" and that Amazon takes "it seriously and work[s] relentlessly alongside our global partners and dedicated internal teams to continuously improve the overall customer experience."
[...]
Beyond Prime Video, users have underscored similar inaccuracies within the past year on rival services, like Disney+, Hulu, and Netflix. A former White Collar executive producer pointed out that the show's episodes were mislabeled and out of order on Netflix earlier this month. Inaccurate content catalogs appear more widespread if you go back two years or more. Some video streamers (like Disney and Netflix) have pages explaining how to report such problems.
[...]
Insider said it spoke with an anonymous person involved with the Prime Video library who described the inaccuracies as "extremely sloppy mistakes" that have affected Prime Video for years.
[...]
Streaming is in a cable-like rut, and subscribers are ditching their services faster than ever. That puts more pressure on streaming platforms to elevate the user experience (not just prices), including getting the basics right.

Improving behind-the-scenes tech and practices that benefit user experiences and user interfaces is something streaming companies may need to prioritize.
[...]
As the cost of streaming rises, subscribers are entitled to raise their standards, too. Accurate titles and descriptions are just some of their expectations.


Original Submission

posted by hubie on Monday April 22, @05:06PM   Printer-friendly
from the make-makemake-de-autoconfiscation-hacking dept.

Michael Larabel of Phoronix informs us:
https://www.phoronix.com/news/Autodafe-0.2-Released

Autodafe 0.2 Released For Freeing Your Project From Autotools

Eric S Raymond has released version 0.2 of Autodafe, his latest open-source project that provides "tools for freeing your project from the clammy grip of Autotools."

Autodafe works to convert an Autotools build recipe into a bare makefile that "can be read and modified by humans." Autodafe's README explains:

        "This project collects resources for converting an autotools build recipe to a bare makefile that can be read and modified by humans.
        ...
        The principal tool, makemake, reduces a generated Makefile to an equivalent form suitable for human modification and with internal automake cruft removed. It is intended to be used with ifdex(1) to enable severing a project from its autotools build recipe, leaving a bare Makefile in place. A HOWTO describing a conversion workflow for an entire project is included.

        One other tool is planned but not yet implemented."

Those wishing to learn more about ESR's Autodafe project can do so via GitLab.
https://gitlab.com/esr/autodafe
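For a sense of what "a bare makefile that can be read and modified by humans" looks like in practice, here is a minimal hand-written Makefile of the kind a de-autoconfiscated project might end up with. This is a hypothetical sketch: the project name, files, and flags are illustrative and not taken from Autodafe's actual output.

```make
# Hypothetical bare Makefile for a small C program "foo" -- the sort of
# human-editable recipe that replaces a generated automake Makefile.
CC      = cc
CFLAGS  = -O2 -Wall
PREFIX  = /usr/local

OBJS    = main.o util.o

foo: $(OBJS)
	$(CC) $(CFLAGS) -o $@ $(OBJS)

%.o: %.c foo.h
	$(CC) $(CFLAGS) -c $<

install: foo
	install -m 755 foo $(DESTDIR)$(PREFIX)/bin/foo

clean:
	rm -f foo $(OBJS)

.PHONY: install clean
```

Compare that to the thousands of lines of automake-generated boilerplate it would replace: every rule here can be read, understood, and edited directly.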

ESR additionally added on Twitter/X:

        "Release 0.2 of my autotools killer. It's ready for use on projects building binaries or static libraries. Shared libraries is a more difficult problem and will gate the 1.0 release."

This is one way to move away from Autotools for those not wishing to join the Meson bandwagon or switch to another build system outright.

This project intrigues me. It brings sanity to GNU builds. I consider Unix M4 (the core macro language of Autotools) inferior even as a macro language.
I have hand-crafted Makefiles for my own projects for decades, so having a human-readable Makefile for other people's fancier projects appeals to me.

Useful links on project's GitLab:
De-Autoconfiscation HOWTO
https://gitlab.com/esr/autodafe/-/blob/master/de-autoconfiscation.adoc

makemake manual page
https://gitlab.com/esr/autodafe/-/blob/master/makemake.adoc

Hacker's Guide to Autodafe
https://gitlab.com/esr/autodafe/-/blob/master/hacking.adoc

The project logo is the astronomical symbol for Makemake, the dwarf planet and Kuiper belt object 136472.
https://en.wikipedia.org/wiki/Makemake
https://en.wikipedia.org/wiki/Makemake#/media/File:Makemake_symbol_(bold).svg


Original Submission

posted by hubie on Monday April 22, @12:23PM   Printer-friendly
from the don't-drink-and-sysadmin dept.

The Conversation has an article about five things their team learned when researching 16th century beer making. A lot has changed since then, such as standardized grain varieties, standardized yeasts, standardized hops varieties, standardized temperatures, and so on.

As part of a major study of food and drink in early modern Ireland, funded by the European Research Council, we recreated and analysed a beer last brewed at Dublin Castle in 1574. Combining craft, microbiology, brewing science, archaeology, as well as history, this was the most comprehensive interdisciplinary study of historical beer ever undertaken. Here are five things that we discovered.

[...] To learn more about brewing a beer from 1574, visit our online exhibition. A documentary film is coming soon. Details will be on our website.

tl;dr: Historical documentation shows that the average worker consumed immense quantities of beer per day back then.

Previously:
(2024) "AI Could Make Better Beer. Here's How."
(2024) "Ransomware Halts Production At Belgian Beer Brewery Duvel"
(2023) "Long-Unknown Origins of Lager Beer Uncovered"
(2022) "Beer Hops Compounds Could Help Protect Against Alzheimer's Disease"
(2022) "Genetically Modified Yeast Yields Intense Hop Aromas in Beer"
(2022) "400-Year-Old Ecuadoran Beer Resurrected From Yeast"
... and many more.


Original Submission

posted by hubie on Monday April 22, @07:34AM   Printer-friendly

University of Queensland researchers have built a generator that absorbs carbon dioxide (CO2) to make electricity.

"This nanogenerator is made of two components: a polyamine gel that is already used by industry to absorb CO2 and a skeleton a few atoms thick of boron nitride that generates positive and negative ions," Dr Wang said.

"In nature and in the human body, ion transportation is the most efficient energy conversion – more efficient than electron transportation which is used in the power network."

"At present we can harvest around 1 per cent of the total energy carried intrinsically by gas CO2 but like other technologies, we will now work on improving efficiency and reducing cost."

"We could make a slightly bigger device that is portable to generate electricity to power a mobile phone or a laptop computer using CO2 from the atmosphere," Professor Zhang said.

"A second application on a much larger scale, would integrate this technology with an industrial CO2 capture process to harvest electricity."

https://www.uq.edu.au/news/article/2024/04/uq-turns-co2-sustainable-power
https://www.nature.com/articles/s41467-024-47040-x


Original Submission

posted by hubie on Monday April 22, @02:45AM   Printer-friendly
from the mind-is-your-business dept.

First law protecting consumers' brainwaves signed by Colorado governor:

Colorado Governor Jared Polis on Wednesday signed into law the first measure passed in the U.S. that aims to protect the data found in a person's brainwaves.

Sponsors of the bill said it was necessary as quick advances in neurotechnology make scanning, analyzing and selling mental data increasingly possible - and profitable.

State representative Cathy Kipp, a sponsor of the legislation, said in a statement that while advancements in the neurotechnology field hold great promise for improving the lives of many people, "we must provide a clear framework to protect Coloradans' personal data from being used without their consent while still allowing these new technologies to develop."

State senator Kevin Priola, another of the bill's sponsors, said that neurotechnology "is no longer confined to medical or research settings" and that when it comes to consumer products, the industry "can currently operate without regulation, data protection standards, or equivalent ethical constraints."

The Colorado law notes that neurotechnologies used in a clinical setting are already covered by medical privacy laws, so the new measure is aimed at consumer products available outside of a hospital.

[...] Elsewhere around the world, other governments have been working to increase consumer protections when it comes to neurotechnological products.


Original Submission