
posted by martyb on Sunday July 12 2020, @10:42PM   Printer-friendly
from the sweet-tooth-liver dept.

https://medicine.uiowa.edu/content/study-pinpoints-brain-cells-trigger-sugar-cravings-and-consumption:

Most people enjoy a sweet treat every now and then. But an unchecked “sweet tooth” can lead to overconsumption of sugary foods and chronic health issues like obesity and type 2 diabetes. Understanding the biological mechanisms that control sugar intake and preference for sweet taste could have important implications for managing and preventing these health problems.

The new study, led by Matthew Potthoff, PhD, associate professor of neuroscience and pharmacology in the University of Iowa Carver College of Medicine, and Matthew Gillum, PhD, at the University of Copenhagen in Denmark, focuses on actions of a hormone called fibroblast growth factor 21 (FGF21). This hormone is known to play a role in energy balance, body weight control, and insulin sensitivity.

[...] Potthoff and his colleagues previously discovered that FGF21 is made in the liver in response to increased levels of sugar, and acts in the brain to suppress sugar intake and the preference for sweet taste.

[...] Although it was known that FGF21 acted in the brain, identifying the exact cellular targets was complicated by the fact that the hormone’s receptor is expressed at very low levels and is therefore difficult to “see.” Using various techniques, the researchers were able to precisely identify which cells express the receptor for FGF21. By investigating these cells, the study shows that FGF21 targets glutamatergic neurons in the brain to lower sugar intake and sweet taste preference. The researchers also showed that FGF21’s action on specific neurons in the ventromedial hypothalamus reduces sugar intake by enhancing the neurons’ sensitivity to glucose.

Journal Reference:
FGF21 Signals to Glutamatergic Neurons in the Ventromedial Hypothalamus to Suppress Carbohydrate Intake, Cell Metabolism (DOI: 10.1016/j.cmet.2020.06.008)


Original Submission

posted by chromas on Sunday July 12 2020, @08:20PM   Printer-friendly
from the demonetized dept.

YouTube is finally letting creators know exactly how they’re making money on YouTube:

YouTube creators in the company’s Partner Program can earn money a bunch of different ways — through advertising, subscriptions, donations, live-streaming features, and YouTube Premium revenue. There are a lot of variables, and now YouTube is finally gathering all of those numbers in one place and giving that information to creators in the form of a new monetization metric called RPM.

RPM, or revenue per mille, is a take on CPM, or cost per mille (sometimes referred to as cost per thousand), the standard metric YouTube creators already use. Although the two sound similar, they measure different things. RPM is much more useful for creators who are trying to grow their channels and figure out where their monthly income is coming from.

CPM measures the cost of every 1,000 ad impressions before YouTube takes its share of revenue, but RPM shows a creator’s total revenue (both from ads and other monetization areas) after YouTube takes the cut. This doesn’t represent a change to how much creators are making. Rather, it helps creators better understand where they’re making their money and how the revenue share breaks down.

[...] Basically, if CPM is an advertiser-focused metric, RPM is tailor-made for creators. For example, RPM includes the total number of video views, including videos that weren’t monetized. This is designed to show creators how much revenue they might be missing out on from videos that generate views but aren’t eligible for monetization, and what changes they can make to ensure future videos are monetized.
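The distinction is easy to capture in a few lines. In this sketch, the dollar figures and view counts are purely hypothetical, chosen only to show how the two metrics diverge:

```python
# Both metrics are "per mille" (per 1,000), but over different bases and at
# different points in the revenue split. All figures here are hypothetical.

def cpm(ad_revenue_before_cut, monetized_ad_impressions):
    """Cost per mille: advertiser spend per 1,000 monetized ad impressions,
    measured before YouTube takes its revenue share."""
    return ad_revenue_before_cut / monetized_ad_impressions * 1000

def rpm(creator_revenue_after_cut, total_views):
    """Revenue per mille: the creator's take-home revenue from all sources
    (ads, memberships, Premium, ...) per 1,000 views, monetized or not."""
    return creator_revenue_after_cut / total_views * 1000

# A channel with 100,000 total views, 60,000 of them monetized:
print(round(cpm(600.0, 60_000), 2))   # 10.0  (CPM, before the split)
print(round(rpm(500.0, 100_000), 2))  # 5.0   (RPM, after YouTube's cut)
```

Because RPM divides by all views and is taken after the platform's cut, it is always the lower, but more honest, number for a creator's actual income.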


Original Submission

posted by chromas on Sunday July 12 2020, @05:55PM   Printer-friendly
from the ♪I-know-a-bay-where-they-don't-ask-for-ID♪-♪Yo-ho,-yo-ho,-a-pirate's-life-for-me♪ dept.

The French Parliament unanimously agreed on Thursday to introduce a nationwide age verification system for pornography websites, months after President Emmanuel Macron pledged to protect children against such content.

Macron made the protection of children against adult content online a high-profile issue well before the coronavirus crisis hit. In January, tech companies, internet services providers and the adult movies industry signed a voluntary charter, pledging to roll out tools to help ensure minors don't have access to pornographic content.

Within a broader law on domestic violence, the Senate decided in June to introduce an amendment requiring pornography websites to implement an age verification mechanism.

In order to enforce the law, the French audiovisual regulator CSA will be granted new powers to audit and sanction companies that do not comply — sanctions could go as far as blocking access to the websites in France with a court order.

The choice of verification mechanisms will be left up to the platforms. But lawmakers have suggested using credit card verification — a system first adopted by the U.K., which mulled similar plans to control access to pornography but had to drop them in late 2019 because of technical difficulties and privacy concerns. Italy also approved a similar bill in late June, which raised the same concerns over its feasibility and compliance with EU law.

[...] The Senate has already voted on the bill. Following an agreement today between senators and lawmakers from the lower house National Assembly, a final vote will be held again in the Senate where the bill is expected to pass.


Original Submission

posted by martyb on Sunday July 12 2020, @03:36PM   Printer-friendly
from the dremel dept.

Apple Warns Customers Not To Close Its Laptops With A Camera Cover Attached:

Though it might strike some people as obvious advice, Apple has published a support page that warns MacBook owners not to close their laptop with a camera cover in place. Damage like a cracked display could result, according to the company, because "the clearance between the display and keyboard is designed to very tight tolerances." MacRumors spotted the advisory, which Apple posted on July 2nd.

Those little plastic camera covers with a sliding mechanism are super common nowadays. Heck, I remember getting one as a holiday gift from The Verge's parent company, Vox Media. But they could spell disaster for your laptop screen if you shut the laptop with the cover still on, and Apple's laptop repairs are extremely costly. Even with AppleCare+ accidental coverage, the deductible isn't cheap.

Instead of using a cover, Apple says that customers can trust the green LED beside the camera on a MacBook Pro or MacBook Air that illuminates whenever the camera is active.

Apple does acknowledge that some people have no choice in the matter and might be required to use a camera cover by their employer. In those cases, the company says the cover should always be removed before closing the laptop.

Not just Apple laptops, but all of them. Have any of my fellow Soylentils suffered damage to their laptops because of this?


Original Submission

posted by martyb on Sunday July 12 2020, @01:17PM   Printer-friendly
from the cosmic-Kudzu dept.

NASA is updating its guidelines on how to prevent contamination of the Solar System:

After years of debate, NASA plans to update its guidelines for how much biological contamination of other worlds will be allowed while the agency explores the Solar System.

[...] For decades, NASA has followed fairly strict rules about how much biological contamination is considered acceptable whenever the agency sends probes — or people — to other planets. It's a concept known as planetary protection, and it has a legal basis in a treaty signed more than 50 years ago. Called the Outer Space Treaty, it challenges nations to explore other worlds "so as to avoid their harmful contamination" and to not bring back any alien microbes from other worlds that could cause harm to Earth.

A big goal of planetary protection has been to keep us from tracking microbes all over the Solar System. That way, if we were to come across some kind of life form on another world, we would know with certainty that it actually came from that world and that we didn't put it there by accident. Planetary protection also focuses on keeping humans safe: if a country does find life, we want to make sure it won't wipe us out if brought back to our planet.

[...] But now, NASA is particularly focused on sending humans into deep space once again. And whenever people go into space, we carry tons of bacteria with us, no matter how much we clean. With human exploration such a high priority, NASA now wants to rethink some of the more strict requirements for the Moon and Mars — otherwise human exploration would be too tough to pull off. Today, NASA released two new "interim directives" that lay out potential changes to the guidelines for exploring the Moon and Mars. It follows years of urging from the space community to update these rules.

"We need to relook at these policies because we can't go to Mars with humans if the principle that we're living by is that we can't have any microbial substances with us," NASA administrator Jim Bridenstine said during a webinar announcing the new proposed changes. "Because that's just not possible."


Original Submission

posted by Fnord666 on Sunday July 12 2020, @10:56AM   Printer-friendly
from the can-we-still-buck-feta? dept.

Scrabble Association Bans Racial, Ethnic Slurs From Its Official Word List:

The word "slur" has a number of meanings in English, but the one that has concerned Scrabble aficionados and Hasbro, which owns the U.S. and Canadian trademark for the popular board game, means "a derogatory or insulting term applied to a particular group of people."

On Wednesday, the North American Scrabble Players Association [(NASPA)] announced that derogatory language would be removed from the game's official word list.

The decision follows an online poll conducted by NASPA that elicited impassioned responses, the organization's CEO, John Chew, said in a statement on Wednesday.

"Some members threatened to leave the association if a single word were removed; others threatened to leave the association if any offensive words remained," he said. "There were a lot of good and bad arguments on both sides."

NASPA's word list, which is used in competitive tournaments, is distinct from the Merriam-Webster Official Scrabble Players Dictionary. Hasbro says it has worked to eliminate offensive words from the dictionary with every new printing.

[...] "One of our members asked what we were doing to reduce racial tensions in the U.S. and Canada," [Chew] said. "And then someone else asked 'what if we take the "N" word out of the lexicon, would that at least be a good start?' "

A discussion and the online poll ensued and NASPA's advisory board ultimately voted to remove 236 words from the list, Chew said. Words that are potentially offensive but are not considered slurs — such as those for parts of the body — remain, he said.

Scrabble bans racial and ethnic slurs from the board game

Hasbro, the board game giant that owns Scrabble, is banning the use of racial and ethnic slurs from its official word list. The crossword game will no longer allow derogatory language to be used while playing.

"Hasbro Gaming is rooted in community and bringing people together, and we are committed to providing an experience that is inclusive and enjoyable for all," the company wrote in a release on Wednesday. "For that reason, Hasbro is changing the official rules of its Scrabble game to make clear that slurs are not permissible in any form of the game."

[...] The crossword game uses the Merriam-Webster Scrabble Players Dictionary as its official word list in North America. Hasbro said that it first started removing offensive words in 1994 and continues to regularly review the full list.

[...] The North American Scrabble Players Association, which uses a slightly different word list for its competitive tournaments, also agreed to ban the use of racial and ethnic slurs after a poll was taken by the association's members.

The association's CEO John Chew said in a statement on Wednesday that Scrabble brings together a wide range of people and the association needs to do more to be inclusive.

"As people have said across the spectrum of responses, removing slurs is the very least that we can do to make our association more inclusive," Chew said. "I will be reaching out to the community for suggestions in coming months, and look forward to working with everyone to make our community a larger and happier one."


Original Submission

posted by Fnord666 on Sunday July 12 2020, @08:32AM   Printer-friendly
from the going-analog dept.

Imec Develops Efficient Processor In Memory Technique for GloFo

Imec and GlobalFoundries have demonstrated a processor-in-memory chip that can achieve energy efficiency up to 2900 TOPS/W, approximately two orders of magnitude above today's commercial processor-in-memory chips. The chip uses an established idea, analog computing, implemented in SRAM in GlobalFoundries' 22nm fully-depleted silicon-on-insulator (FD-SOI) process technology. Imec's analog in-memory compute (AiMC) will be available to GlobalFoundries customers as a feature that can be implemented on the company's 22FDX platform.

Since a neural network model may have tens or hundreds of millions of weights, sending data back and forth between the memory and the processor is inefficient. Analog computing uses a memory array to store the weights and also perform multiply-accumulate (MAC) operations, so there is no memory-to-processor transfer needed. Each memristor element (perhaps a ReRAM cell) has its conductance programmed to an analog level which is proportional to the required weight.

[...] Imec has built a test chip, called analog inference accelerator (AnIA), based on GlobalFoundries' 22nm FD-SOI process. AnIA's 512k array of SRAM cells, plus digital infrastructure including 1024 DACs and 512 ADCs, takes up 4 mm². It can perform around half a million computations per operation cycle based on 6-bit (plus sign bit) input activations, ternary weights (-1, 0, +1) and 6-bit outputs.

[...] Imec showed accuracy results for object recognition inference on the CIFAR 10 dataset which dropped only one percentage point compared to a similarly quantised baseline. With a supply voltage of 0.8 V, AnIA's energy efficiency is between 1050 and 1500 TOPS/W at 23.5 TOPS. For 0.6 V supply voltage, AnIA achieved 5.8 TOPS at around 1800-2900 TOPS/W.
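The core idea can be sketched numerically. The following is a toy model of an analog in-memory MAC array (dimensions and the ADC model are illustrative simplifications, not Imec's actual circuit): weights stored as conductances multiply input activations, columns sum currents in parallel, and a 6-bit ADC digitizes each result.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model of an analog in-memory MAC array: each cell stores a ternary
# weight (-1, 0, +1) as a conductance; a 6-bit input activation is applied
# to each row as a voltage, and each column's current sums the products
# (Ohm's law plus Kirchhoff's current law), i.e. one dot product per column.
n_inputs, n_outputs = 512, 8
weights = rng.integers(-1, 2, size=(n_inputs, n_outputs))  # ternary weights
activations = rng.integers(0, 64, size=n_inputs)           # 6-bit inputs

# One "operation cycle": all output columns accumulate simultaneously,
# with no weight traffic between memory and a separate processor.
column_currents = activations @ weights

# A 6-bit ADC digitizes each column's accumulated current.
def adc_6bit(x, full_scale):
    levels = 2 ** 6 - 1
    return np.round(np.clip(x / full_scale, -1, 1) * levels) / levels * full_scale

outputs = adc_6bit(column_currents, full_scale=np.abs(column_currents).max())
print(outputs.shape)  # (8,)
```

The energy win comes from the middle step: the multiply-accumulates happen where the weights live, so the only data movement is activations in and digitized sums out.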

Promising application: edge computing facial recognition cameras for the surveillance state.

Also at Wccftech.

See also: Week In Review: Auto, Security, Pervasive Computing

Previously: IBM Reduces Neural Network Energy Consumption Using Analog Memory and Non-Von Neumann Architecture

Related: "3nm" Test Chip Taped Out by Imec and Cadence
GlobalFoundries Abandons "7nm LP" Node, TSMC and Samsung to Pick Up the Slack - "The manufacturer will continue to cooperate with IMEC, which works on a broader set of technologies that will be useful for GF's upcoming specialized fabrication processes..."
Radar for Your Wrist


Original Submission

posted by Fnord666 on Sunday July 12 2020, @06:07AM   Printer-friendly
from the ORCs..Rings..it-sounds-familiar dept.

Four faint objects have been found (archive) by astronomers while mapping the sky in radio frequencies. The objects

are highly circular and brighter along their edges. And they're unlike any class of astronomical object ever seen before.

The objects, which look like distant ring-shaped islands, have been dubbed odd radio circles, or ORCs, for their shape and overall peculiarity. Astronomers don't yet know exactly how far away these ORCs are, but they could be linked to distant galaxies. All objects were found away from the Milky Way's galactic plane and are around 1 arcminute across (for comparison, the moon's diameter is 31 arcminutes).

The ORCs were discovered in the Pilot Survey of the Evolutionary Map of the Universe (EMU), which is using the Australian Square Kilometre Array Pathfinder (ASKAP) radio telescope array in Western Australia to make a census of radio sources in the sky.

Possible explanations considered in the source paper include

  - Imaging Artifacts
  - Supernova Remnants
  - Galactic Planetary Nebulae
  - Face-on Star-forming galaxy or ring galaxy
  - Lobe from a double-lobed radio galaxy, viewed side-on
  - Lobe from a double-lobed radio galaxy, viewed end-on
  - A bent-tail radio galaxy
  - Einstein Ring
  - Ring around Wolf-Rayet star
  - Cluster Halo
  - Galactic Wind Termination Shock

These are examined in detail and variously discarded.

The submitted paper concludes that the ORCs likely "represent a new type of object found in radio-astronomy images" or "a new category of a known phenomenon" (possibly both).

A paper describing the objects has been submitted to the preprint server arXiv.

Journal Reference:
Norris, Ray P., Intema, Huib T., Kapinska, Anna D., et al. Unexpected Circular Radio Objects at High Galactic Latitude (arXiv: https://arxiv.org/abs/2006.14805)


Original Submission

posted by martyb on Sunday July 12 2020, @03:48AM   Printer-friendly

Facebook code change caused outage for Spotify, Pinterest and Waze apps – TechCrunch:

If you're an iPhone user, odds are fairly good you spent a frustrating portion of the morning attempting to reopen apps. I know my morning walk was dampened by the inability to fire up Spotify. Plenty of other users reported similar issues with a number of apps, including Pinterest and Waze.

The issue has since been resolved, with Facebook noting that the problem rests firmly on its shoulders. A log page notes a sudden spike in errors stemming from Facebook's iOS SDK, dating back several hours.

[...] "Earlier today, a code change triggered crashes for some iOS apps using the Facebook SDK," the developer team writes. "We identified the issue quickly and resolved it. We apologize for any inconvenience."

[...] After the second major issue in recent memory, it's easy to imagine many reconsidering their relationship with the social network — after all, a bad experience can put people off an app entirely, as this morning's social media debates around Apple Music versus Spotify suggested. Many users will ultimately place the blame at the feet of a given app, rather than the third-party SDK that caused the crash.

A detailed timeline and very readable analysis of what happened is available at Bugsnag:

Some key takeaways

  1. The core issue is that this absolutely should not crash an application. One of the tenets of good SDK design is that SDKs SHOULD NEVER CRASH THE APP.
  2. Defensive programming, and better handling of malformed data from the server, could have meant that instead of crashing the application, the Facebook initialization could simply have been skipped, or better still, fallen back to some kind of default settings if the server responds with junk data.
  3. Additionally, by having some kind of API data validation in place, this situation could have been avoided entirely. Services like Runscope offer this.
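The defensive pattern described in the takeaways above can be sketched in a few lines. The payload shape, field names, and defaults here are hypothetical, not Facebook's actual SDK: the point is only that malformed server data degrades to safe defaults instead of a crash.

```python
import json

# Hypothetical safe defaults used when the server sends junk.
DEFAULT_SETTINGS = {"auto_log_events": False, "ad_tracking": False}

def parse_sdk_settings(raw_response: str) -> dict:
    """Return server-provided settings, or safe defaults on malformed input."""
    try:
        data = json.loads(raw_response)
        if not isinstance(data, dict):
            raise ValueError("settings payload is not a JSON object")
        # Accept only known keys, and only values of the expected type.
        settings = dict(DEFAULT_SETTINGS)
        for key, default in DEFAULT_SETTINGS.items():
            value = data.get(key, default)
            settings[key] = value if isinstance(value, type(default)) else default
        return settings
    except (ValueError, TypeError):
        # Malformed data: skip the server config entirely rather than crash.
        return dict(DEFAULT_SETTINGS)

print(parse_sdk_settings('{"auto_log_events": true}'))  # merged with defaults
print(parse_sdk_settings('not json at all'))            # falls back cleanly
```

Every failure path ends in a usable settings object, which is exactly the "skip or fall back" behavior the takeaways call for.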

See also: Robustness Principle.


Original Submission

posted by martyb on Sunday July 12 2020, @01:25AM   Printer-friendly
from the did-anyone-try-aspirin? dept.

Australian Health Protection Principal Committee (AHPPC) statement on preliminary media reports of the results of a randomised trial of the use of dexamethasone

AHPPC notes the preliminary media reports of the results of a randomised trial of the use of dexamethasone, a corticosteroid, in the management of hospitalised patients with COVID-19.

Whilst only a single trial, it appears to be a large well-conducted study. The investigators reported a significant reduction in mortality in patients on mechanical ventilation and in those requiring oxygen, but not in those with less severe illness. AHPPC notes that dexamethasone appears to reduce mortality, but mortality was still 29% in ventilated patients and 22% in patients on supplemental oxygen who were treated with dexamethasone.

Although this seems to be an exciting development, further examination of the scientific results, when published, will be required to confirm the efficacy of dexamethasone for severe COVID-19. It is likely that dexamethasone operates by reducing inflammation of the lung in severe disease, and thus would not be expected to be useful in the prevention of COVID-19.

The availability of this treatment doesn't reduce our need to prevent and control community transmission of COVID-19, as the mortality of severe COVID-19 remains high even with dexamethasone treatment.

The University of Oxford issued a news release Low-cost dexamethasone reduces death by up to one third in hospitalised patients with severe respiratory complications of COVID-19:

In March 2020, the RECOVERY (Randomised Evaluation of COVid-19 thERapY) trial was established as a randomised clinical trial to test a range of potential treatments for COVID-19, including low-dose dexamethasone (a steroid treatment). Over 11,500 patients have been enrolled from over 175 NHS hospitals in the UK.

On 8 June, recruitment to the dexamethasone arm was halted since, in the view of the trial Steering Committee, sufficient patients had been enrolled to establish whether or not the drug had a meaningful benefit.

A total of 2104 patients were randomised to receive dexamethasone 6 mg once per day (either by mouth or by intravenous injection) for ten days and were compared with 4321 patients randomised to usual care alone. Among the patients who received usual care alone, 28-day mortality was highest in those who required ventilation (41%), intermediate in those patients who required oxygen only (25%), and lowest among those who did not require any respiratory intervention (13%).

Dexamethasone reduced deaths by one-third in ventilated patients (rate ratio 0.65 [95% confidence interval 0.48 to 0.88]; p=0.0003) and by one fifth in other patients receiving oxygen only (0.80 [0.67 to 0.96]; p=0.0021). There was no benefit among those patients who did not require respiratory support (1.22 [0.86 to 1.75]; p=0.14).

Based on these results, 1 death would be prevented by treatment of around 8 ventilated patients or around 25 patients requiring oxygen alone.
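The "around 8 ventilated patients" figure follows from a standard number-needed-to-treat (NNT) calculation on the mortality numbers quoted above (41% on usual care, and the 29% treated-patient mortality cited in the AHPPC statement):

```python
# NNT = 1 / absolute risk reduction, using the figures quoted above
# for ventilated patients.
control_mortality = 0.41   # ventilated, usual care (28-day mortality)
treated_mortality = 0.29   # ventilated, dexamethasone (AHPPC figure)

absolute_risk_reduction = control_mortality - treated_mortality  # ~0.12
nnt = 1 / absolute_risk_reduction

print(round(nnt, 1))  # 8.3 -- treat ~8 ventilated patients to prevent 1 death
```

The oxygen-only NNT the release quotes comes from the fuller trial data rather than these rounded percentages, so it cannot be reproduced as cleanly.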

Journal Reference:
Peter Horby, Wei Shen Lim, Jonathan Emberson, et al. Effect of Dexamethasone in Hospitalized Patients with COVID-19: Preliminary Report [$], medRxiv (DOI: 10.1101/2020.06.22.20137273)


Original Submission

posted by martyb on Saturday July 11 2020, @11:02PM   Printer-friendly

Nvidia overtakes Intel as most valuable U.S. chipmaker

Nvidia has for the first time overtaken Intel as the most valuable U.S. chipmaker.

In a semiconductor industry milestone, Nvidia's shares rose 2.3% in afternoon trading on Wednesday to a record $404, putting the graphic component maker's market capitalization at $248 billion, just above the $246 billion value of Intel, once the world's leading chipmaker.

[...] Despite Nvidia's meteoric stock rise, its sales remain a fraction of Intel's. Analysts on average see Nvidia's revenue rising 34% in its current fiscal year to $14.6 billion, while they expect Intel's 2020 revenue to increase 2.5% to $73.8 billion, according to Refinitiv.

Reflecting investors' optimism about Nvidia's future profit growth, its shares are currently trading at 45 times expected earnings, while Intel's trade at 12 times expected earnings.
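The article's figures imply a couple of derived numbers worth a back-of-the-envelope check (the inputs are from the article; the derived values are approximations):

```python
# Figures quoted in the article.
price = 404.0          # record Nvidia share price, USD
market_cap = 248e9     # Nvidia market capitalization, USD
pe_ratio = 45          # share price as a multiple of expected earnings

# Market cap = share price x shares outstanding, so the quoted numbers
# imply roughly this share count:
shares_outstanding = market_cap / price
print(f"{shares_outstanding / 1e6:.0f}M shares")  # ~614M

# A P/E of 45 on a $248B market cap implies expected annual earnings of:
implied_earnings = market_cap / pe_ratio
print(f"${implied_earnings / 1e9:.1f}B")  # ~$5.5B
```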

TSMC and Samsung are more valuable than Nvidia.

In other news, Elon Musk is worth more than Warren Buffett.

Also at EE Times.

See also: Where did it all go wrong for Intel?


Original Submission

posted by Fnord666 on Saturday July 11 2020, @08:43PM   Printer-friendly
from the if-don't-do-audits-you-don't-have-findings-like-this dept.

Digicert will shovel some 50,000 EV HTTPS certificates into the furnace this Saturday after audit bungle:

Digicert says, come Saturday, July 11, it will revoke tens of thousands of encryption certificates issued by intermediaries that were not properly audited.

A notice emitted by the certificate biz explained that a number of its intermediate certificate authorities (ICAs) had issued EV certs to customers despite not being included in DigiCert's WebTrust audits – which goes against the rules for EV certs. To remedy this, DigiCert said it will revoke every single EV cert issued by the ICAs in question – think CertCentral, Symantec, Thawte, and GeoTrust.

"To resolve the issue, we must migrate issuance to new ICAs and revoke all certificates issued under the impacted ICAs," Digicert told its customers in an email.

"Although there is no security threat, the EV Guidelines require that we revoke EV certificates signed by the affected ICAs by July 11, 2020 at 12pm MDT (July 11, 18:00 UTC)."

[...] And, by the way, EV certs, aka Extended Validation certificates, are supposed to be the gold standard in the cert-selling industry: these are the ones that show up with the cert owner's legal name in some browsers' address bar next to the padlock. This is so that when you're visiting your bank's website, and it says My Super Bank Corp, you're reassured this really is the real deal. EV certs have their critics.

[...] "Revoking over 50,000 certificates within five days is a draconian move that is only warranted when a severe security breach has been detected," wrote Bugzilla user Hank Nussbacher. "There needs to be some common sense in determining how long to allow before the certificate is revoked. Minor typos in province or mistakes with audit reports should be given 2-4 weeks to revoke certificates."

As others point out, however, it isn't Digicert's call to only wait five days for the revocation. Rather, that is what is required by Mozilla and CAB Forum rules.


Original Submission

posted by martyb on Saturday July 11 2020, @06:18PM   Printer-friendly
from the a-proton-and-a-neutron-walk-into-a-black-hole dept.

Scientists propose plan to determine if Planet Nine is a primordial black hole:

Dr. Avi Loeb, Frank B. Baird Jr. Professor of Science at Harvard, and Amir Siraj, a Harvard undergraduate student, have developed the new method to search for black holes in the outer solar system based on flares that result from the disruption of intercepted comets. The study suggests that the LSST[*] has the capability to find black holes by observing for accretion flares resulting from the impact of small Oort cloud objects.

"In the vicinity of a black hole, small bodies that approach it will melt as a result of heating from the background accretion of gas from the interstellar medium onto the black hole," said Siraj. "Once they melt, the small bodies are subject to tidal disruption by the black hole, followed by accretion from the tidally disrupted body onto the black hole." Loeb added, "Because black holes are intrinsically dark, the radiation that matter emits on its way to the mouth of the black hole is our only way to illuminate this dark environment."

[...] The upcoming LSST is expected to have the sensitivity required to detect accretion flares, while current technology isn't able to do so without guidance. "LSST has a wide field of view, covering the entire sky again and again, and searching for transient flares," said Loeb. "Other telescopes are good at pointing at a known target, but we do not know exactly where to look for Planet Nine. We only know the broad region in which it may reside." Siraj added, "LSST's ability to survey the sky twice per week is extremely valuable. In addition, its unprecedented depth will allow for the detection of flares resulting from relatively small impactors, which are more frequent than large ones."

[*] LSST:

The Vera C. Rubin Observatory, previously referred to as the Large Synoptic Survey Telescope (LSST), is an astronomical observatory currently under construction in Chile. Its main task will be an astronomical survey, the Legacy Survey of Space and Time (LSST). The Rubin Observatory has a wide-field reflecting telescope with an 8.4-meter primary mirror that will photograph the entire available sky every few nights. The word synoptic is derived from the Greek words σύν (syn "together") and ὄψις (opsis "view"), and describes observations that give a broad view of a subject at a particular time. The observatory is named for Vera Rubin, an American astronomer who pioneered discoveries about galaxy rotation rates.

Journal Reference:
A. Siraj, A. Loeb. Searching for Black Holes in the Outer Solar System with LSST, https://arxiv.org/abs/2005.12280v2


Original Submission

posted by martyb on Saturday July 11 2020, @04:00PM   Printer-friendly
from the street-creds dept.

Cops Seize Server that Hosted BlueLeaks, DDoSecrets Says:

Authorities in Germany have seized a server used by the organization that published a trove of US police internal documents commonly known as BlueLeaks, according to the organization's founder.

On Tuesday, Emma Best, the founder of Distributed Denial of Secrets or DDoSecrets, a WikiLeaks-like website that has published the police data, said that prosecutors in the German town of Zwickau seized the organization's "primary public download server."

"We are working to obtain additional information, but presume it is [regarding] #BlueLeaks," Best added on Twitter. "The server was used ONLY to distribute data to the public. It had no contact with sources and was involved in nothing more than enlightening the public through journalistic publishing."

Best shared a screenshot of the email they received from DDoSecrets' hosting provider informing of the server seizure.

"Your server has been confiscated," the email reads. "Until now we were not allowed to inform you accordingly." The email then notes that the seizing authority was the Department of Public Prosecution Zwickau.

German authorities seized the servers that hosted BlueLeaks police files at the request of the US government:

The site that hosted hundreds of thousands of leaked police files — dubbed BlueLeaks — has been taken offline after its servers were confiscated by German authorities acting at the request of the US government.

[...] It's not clear what legal grounds the US has to take the server offline. Hacking the government is a crime, but the Supreme Court has upheld the right of journalists to publish leaked documents as long as they weren't involved in their theft. DDoSecrets maintains that it's a publisher without any ties to the hacker who first obtained the BlueLeaks files.

A spokesperson for the Zwickau prosecutor's office told the German outlet Zeit Online [in German] that they were aware DDoSecrets is a journalistic project, but declined to provide any further information.

Previously: "BlueLeaks" Exposes 269 GB of Data from Hundreds of Police Departments and "Fusion Centers"


Original Submission #1 | Original Submission #2

posted by martyb on Saturday July 11 2020, @01:35PM   Printer-friendly

Libtorrent Adds WebTorrent Support, Expanding the Reach of Browser Torrenting

Libtorrent has bridged the gap between WebTorrent and traditional torrent clients. The open-source BitTorrent library, used by clients including Deluge, qBittorrent, and Tribler, will help to widely expand the reach of browser-based WebTorrent tools and services.

[...] Over the past few years, several tools and services have been built on WebTorrent's technology. These include Instant.io, βTorrent, as well as the popular Brave browser, which comes with a built-in torrent client based on WebTorrent. These apps and services all work as advertised. However, WebTorrent-based implementations typically come with a major drawback. Since communication between WebTorrent peers relies on WebRTC, they can't share files with standard torrent clients by default.

This rift between WebTorrent and traditional torrent clients is now starting to close. Libtorrent has just created a bridge between the two 'worlds' by implementing official WebTorrent support.

[...] Right now, WebTorrent and traditional torrent clients can't talk to each other. However, the libtorrent peers will soon act as a hybrid, bridging the gap between these two ecosystems.

WebTorrent support #4123

Previously: WebTorrent, a BitTorrent Client Running Within the Web Browser


Original Submission