Submitted via IRC for SoyCow1944
A bug impacting the editors Vim and Neovim could allow trojanized code to escape sandbox mitigations.
A high-severity bug impacting two popular command-line text editing applications, Vim and Neovim, allows remote attackers to execute arbitrary OS commands. Security researcher Armin Razmjou warned that exploiting the bug is as easy as tricking a target into opening a specially crafted text file in either editor.
Razmjou outlined his research and created a proof-of-concept (PoC) attack demonstrating how an adversary can compromise a Linux system via Vim or Neowim [sic]. He said Vim versions before 8.1.1365 and Neovim before 0.3.6 are vulnerable to arbitrary code execution.
“[Outlined is] a real-life attack approach in which a reverse shell is launched once the user opens the file. To conceal the attack, the file will be immediately rewritten when opened. Also, the PoC uses terminal escape sequences to hide the modeline when the content is printed with cat. (cat -v reveals the actual content),” wrote Razmjou in a technical analysis of his research.
[...] “However, the :source! command (with the bang [!] modifier) can be used to bypass the sandbox. It reads and executes commands from a given file as if typed manually, running them after the sandbox has been left,” according to the PoC report.
Vim and Neovim have both released patches for the bug (CVE-2019-12735) that the National Institute of Standards and Technology warns, “allows remote attackers to execute arbitrary OS commands via the :source! command in a modeline.”
“Beyond patching, it’s recommended to disable modelines in the vimrc (set nomodeline), to use the securemodelines plugin, or to disable modelineexpr (since patch 8.1.1366, Vim-only) to disallow expressions in modelines,” the researcher said.
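The mitigations the researcher describes live in a user's vimrc. A minimal sketch (the option names come from the advisory; the version guard is an assumption for portability, since `modelineexpr` only exists from Vim patch 8.1.1366 onward):

```vim
" Safest mitigation: ignore modelines in opened files entirely
set nomodeline

" Alternative for Vim 8.1.1366+: keep modelines, but disallow
" expression evaluation inside them
if exists('+modelineexpr')
  set nomodelineexpr
endif
```

Disabling modelines entirely is the simpler choice if you never rely on per-file settings embedded in the files you open.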
Microsoft's tactics against GNU/Linux have not changed much in two decades; they are just framed differently. Nowadays the attacks are masqueraded as friendship, and proxies are used more than before. As a fresh look at how these established tactics are currently used to attack Free Software, a guest poster at TechRights has summarized them in a ten-chapter handbook, aptly named A Handbook for Destroying the Free Software Movement. The first two chapters cover what Microsoft is now doing through GitHub, licensing, Azure, Visual Studio, Vista10, and its other components foisted on developers. Other chapters cover manipulation of media coverage, OEM lock-in, use of attack proxies, and software patents. Most of all, these tactics have stayed true to the plans outlined over 20 years ago in the Halloween Documents.
It's written a bit tongue in cheek from Microsoft's perspective. Some material is drawn from Comes v Microsoft (aka The Iowa Case) and, as mentioned, the leaked internal memos known as the Halloween Documents.
This is the story of Dr. Norman Borlaug, who in 1945 was trying to breed wheat that could resist stem rust, a disease that ruined many crops.
In 1968, Stanford biologist Paul Ehrlich and his wife Anne (who is uncredited) published an explosive book. In The Population Bomb, they noted that in poor countries such as India and Pakistan, populations were growing more quickly than food supplies. In the 1970s, they predicted: "Hundreds of millions of people are going to starve to death".
Thankfully, Ehrlich was wrong, because he didn't know what Norman Borlaug had been doing. Borlaug would later be awarded the Nobel Peace Prize for the years he had spent shuttling between Mexico City and the Yaqui Valley, growing thousands upon thousands of kinds of wheat, and carefully noting their traits: this kind resisted one type of stem rust, but not another; this kind produced good yields, but made bad bread; and so on.
[...] Borlaug produced new kinds of "dwarf" wheat that resisted rust, yielded well, and - crucially - had short stems, so they didn't topple over in the wind. By the 1960s, Borlaug was travelling the world to spread the news. It wasn't easy.
[...] Progress has slowed, and problems are mounting: climate change, water shortages, pollution from fertilisers and pesticides. These are problems the green revolution itself has made worse. Some say it even perpetuated the poverty that keeps the population growing: fertilisers and irrigation cost money which many peasant farmers can't get. Paul Ehrlich, now in his 80s, maintains that he wasn't so much wrong as ahead of his time. Perhaps if Malthus were still alive, in his 250s, he'd say the same. But could more human ingenuity be the answer?
[...] Since genetic modification became possible, it's mostly been about resistance to diseases, insects and herbicides. While that does increase yields, it hasn't been the direct aim. That's starting to change. And agronomists are only just beginning to explore the gene editing tool CRISPR, which can do what Norman Borlaug did much more quickly. As for Borlaug, he saw that his work had caused problems that weren't handled well, but asked a simple question - would you rather have imperfect ways to grow more food, or let people starve? It's a question we may have to keep asking in the decades to come.
[Related]: An Essay on the Principle of Population
Traditionally, much of the search for extraterrestrial life has focused on what scientists call the "habitable zone," defined as the range of distances from a star warm enough that liquid water could exist on a planet's surface. That description works for basic, single-celled microbes—but not for complex creatures like animals, which include everything from simple sponges to humans.
The team's work, published today in The Astrophysical Journal, shows that accounting for predicted levels of certain toxic gases narrows the safe zone for complex life by at least half—and in some instances eliminates it altogether.
"This is the first time the physiological limits of life on Earth have been considered to predict the distribution of complex life elsewhere in the universe," said Timothy Lyons, one of the study's co-authors, a distinguished professor of biogeochemistry in UCR's Department of Earth and Planetary Sciences, and director of the Alternative Earths Astrobiology Center, which sponsored the project.
"Imagine a 'habitable zone for complex life' defined as a safe zone where it would be plausible to support rich ecosystems like we find on Earth today," Lyons explained. "Our results indicate that complex ecosystems like ours cannot exist in most regions of the habitable zone as traditionally defined."
[...] "To sustain liquid water at the outer edge of the conventional habitable zone, a planet would need tens of thousands of times more carbon dioxide than Earth has today," said Edward Schwieterman, the study's lead author and a NASA Postdoctoral Program fellow working with Lyons. "That's far beyond the levels known to be toxic to human and animal life on Earth."
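The scale of that mismatch is easy to sketch with rough numbers. Note that the ~410 ppm figure for present-day Earth and the ~5,000 ppm occupational exposure limit are illustrative assumptions of this back-of-the-envelope calculation, not values taken from the paper:

```python
# Back-of-the-envelope: how does the CO2 needed at the outer edge of the
# habitable zone compare with levels toxic to animal life on Earth?
earth_co2_ppm = 410      # approximate present-day atmospheric CO2 (assumption)
multiplier = 10_000      # "tens of thousands of times more" (lower bound)
toxic_ppm = 5_000        # rough occupational exposure limit (assumption)

required_ppm = earth_co2_ppm * multiplier   # 4,100,000 ppm-equivalent,
                                            # i.e. several atmospheres of CO2
print(required_ppm / toxic_ppm)             # roughly 800x the toxic threshold
```

Even at the lower bound of "tens of thousands of times", the required CO2 exceeds the toxic threshold by nearly three orders of magnitude, which is the sense in which the traditional habitable zone shrinks for complex life.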
Similar difficulties occur with ultraviolet light, which leads to excess carbon monoxide; even small amounts of carbon monoxide preferentially bind to hemoglobin, leading to "death of body cells due to lack of oxygen."
More information: Edward W. Schwieterman et al. A Limited Habitable Zone for Complex Life, The Astrophysical Journal (2019). DOI: 10.3847/1538-4357/ab1d52
No word on what parameters would apply to the planet Vulcan.
Submitted via IRC for Bytram
In the past year, Congress has been happy to drag tech C.E.O.s into hearings and question them about how they vacuum up and exploit personal information about their users. But so far those hearings haven't amounted to much more than talk. Lawmakers have yet to do their job and rewrite the law to ensure that such abuses don't continue.
Americans have been far too vulnerable for far too long when they venture online. Companies are free today to monitor Americans' behavior and collect information about them from across the web and the real world to do everything from selling them cars to influencing their votes to setting their life insurance rates — all usually without users' knowledge of the collection and manipulation taking place behind the scenes. It's taken more than a decade of shocking revelations — of data breaches and other privacy abuses — to get to this moment, when there finally seems to be enough momentum to pass a federal law. Congress is considering several pieces of legislation that would strengthen Americans' privacy rights, and alongside them, a few bills that would make it easier for tech companies to strip away what few privacy rights we now enjoy.
American lawmakers are late to the party. Europe has already set what amounts to a global privacy standard with its General Data Protection Regulation, which went into effect in 2018. G.D.P.R. establishes several privacy rights that do not exist in the United States — including a requirement for companies to inform users about their data practices and receive explicit permission before collecting any personal information. Although Americans cannot legally avail themselves of specific rights under G.D.P.R., the fact that the biggest global tech companies are complying everywhere with the new European rules means that the technocrats in Brussels are doing more for Americans' digital privacy rights than their own Congress.
The toughest privacy law in the United States today is the California Consumer Privacy Act, which is set to go into effect on Jan. 1, 2020. Just like G.D.P.R., it requires companies to take adequate security measures to protect data and also offers consumers the right to request access to the data that has been collected about them. Under the California law, consumers not only have a right to know whether their data is being sold or handed off to third parties, they also have a right to block that sale. And the opt-out can't be a false choice — Facebook and Google would not be able to refuse service just because a user didn't want their data sold.
[...] Where the Warner/Fischer bill looks to alleviate the harmful effects of data collection on consumers, Senator Josh Hawley's Do Not Track Act seeks to stop the problem much closer to the source, by creating a Do Not Track system administered by the Federal Trade Commission. Commercial websites would be required by law not to harvest unnecessary data from consumers who have Do Not Track turned on.
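Mechanically, Do Not Track is just an HTTP request header (`DNT: 1`) that browsers can send; under a law like Hawley's, a server would have to branch on it before any collection happens. A minimal sketch of that branch (the header name is real; the `handle_request` function and its return strings are hypothetical illustrations):

```python
def handle_request(headers: dict) -> str:
    """Honor the Do Not Track request header before any data collection."""
    # Browsers with Do Not Track enabled send "DNT: 1" with every request.
    dnt = headers.get("DNT", "0").strip()
    if dnt == "1":
        # Harvest nothing beyond what is necessary to serve the page.
        return "serve page, no tracking"
    return "serve page, analytics enabled"

print(handle_request({"DNT": "1"}))
```

Today honoring this header is voluntary and widely ignored; the bill's novelty is making the "no tracking" branch legally mandatory rather than optional.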
A similar idea appeared in a more comprehensive draft bill circulated last year by Senator Ron Wyden, but Mr. Wyden has yet to introduce that bill this session. Instead, like Mr. Warner, he seems to have turned his attention to downstream effects — for the time being, at least. This year, he is sponsoring a bill for algorithmic accountability, requiring the largest tech companies to test their artificial intelligence systems for biases, such as racial discrimination, and to fix those biases that are found.
Up to one million plant and animal species face extinction, many within decades, because of human activities, says the most comprehensive report yet on the state of global ecosystems.
Without drastic action to conserve habitats, the rate of species extinction — already tens to hundreds of times higher than the average across the past ten million years — will only increase, says the analysis. The findings come from a United Nations-backed panel called the Intergovernmental Science-Policy Platform on Biodiversity and Ecosystem Services (IPBES).
According to the report, agricultural activities have had the largest impact on ecosystems that people depend on for food, clean water and a stable climate. The loss of species and habitats poses as much a danger to life on Earth as climate change does, says a summary of the work, released on 6 May.
The analysis distils findings from nearly 15,000 studies and government reports, integrating information from the natural and social sciences, Indigenous peoples and traditional agricultural communities. It is the first major international appraisal of biodiversity since 2005. Representatives of 132 governments met last week in Paris to finalize and approve the analysis.
Biodiversity should be at the top of the global agenda alongside climate, said Anne Larigauderie, IPBES executive secretary, at a 6 May press conference in Paris, France. "We can no longer say that we did not know," she said.
"We have never had a single unified statement from the world's governments that unambiguously makes clear the crisis we are facing for life on Earth," says Thomas Brooks, chief scientist at the International Union for Conservation of Nature in Gland, Switzerland, who helped to edit the biodiversity analysis. "That is really the absolutely key novelty that we see here."
Something strange has been going on in the friendly skies over the last day or so. Flights are being canceled. Aircraft are grounded. Passengers are understandably upset. The core of the issue is GPS and ADS-B systems. The ADS-B system depends on GPS data to function properly, but over this weekend a problem with the quality of the GPS data has disrupted normal ADS-B features on some planes, leading to the cancellations.
Automatic Dependent Surveillance-Broadcast (ADS-B) is a communication system used in aircraft worldwide. Planes transmit location, speed, flight number, and other information on 1090 MHz. This data is picked up by ground stations and eventually displayed on air traffic controller screens. Aircraft also receive this data from each other as part of the Traffic Collision Avoidance System (TCAS).
ADS-B isn't a complex or encrypted signal. In fact, anyone with a cheap RTL-SDR can receive the signal. Aviation buffs know how cool it is to see a map of all the aircraft flying above your house. Plenty of hackers have worked on these systems, and we've covered that here on Hackaday. In the USA, the FAA will effectively require all aircraft to carry ADS-B transponders by January 1st, 2020. So as you can imagine, most aircraft already have the systems installed.
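Because the signal is unencrypted, even a few lines of code can pull structure out of a raw frame captured with an RTL-SDR: every Mode S extended squitter starts with a 5-bit downlink format (17 for ADS-B) followed by the 24-bit ICAO airframe address. A hedged sketch (the sample message is a widely circulated test vector; real decoding, CRC checks included, is better left to a library such as pyModeS):

```python
def decode_adsb_header(hex_msg: str):
    """Extract downlink format and ICAO address from a raw Mode S hex frame."""
    raw = bytes.fromhex(hex_msg)
    df = raw[0] >> 3                 # first 5 bits: downlink format (17 = ADS-B)
    icao = raw[1:4].hex().upper()    # next 24 bits: ICAO airframe address
    return df, icao

# 112-bit extended squitter test vector (an aircraft identification message)
df, icao = decode_adsb_header("8D4840D6202CC371C32CE0576098")
print(df, icao)   # 17 4840D6
```

The position, altitude, and callsign fields live in the 56-bit payload that follows the address, with their own compact encodings.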
The ADS-B system in a plane needs to get position data before it can transmit. These days, that data comes from a global satellite navigation system; in the USA, that means GPS. GPS is currently having some problems, though. This is where Receiver Autonomous Integrity Monitoring (RAIM) comes in: safety-critical GPS systems (those in planes and ships) cross-check their current position. Reports of degraded or incorrect GPS data reach the FAA, which displays them on its website. The non-precision approach current outage map is showing degraded service all over the US Eastern seaboard, as well as the North. The cause of this signal degradation is currently unknown.
Hundreds or even thousands of flights have been cancelled, and teleconferences have been conducted with the FAA (Federal Aviation Administration). Flights are restricted to altitudes no higher than 28,000 feet. What's going on? Is GPS under attack?
As time went on and more reports came rolling in, it became clear that the problem reports were limited to aircraft flying with certain GPS receivers:
A corrupted upgrade to a large class of receivers was to blame for what was initially suspected to be a degradation of GPS service across much of the United States (see FAA graphic).
Work by the FAA and industry groups revealed that many Rockwell Collins receivers had received a bad update. See: AIN Online "Collins GPS Receivers Suffer Reception Outage."
The more that companies try to squeeze the last bit of income out of any kind of system or product, the tighter the tolerances become, and the less tolerant the systems seem to be of disruptions.
How close are we to the point of "peak optimization" where any profits from additional refinements are countered by systems being more brittle and thus more expensive to operate in challenging conditions?
SpaceX will launch a Falcon 9 rocket from Vandenberg Air Force Base in California on Wednesday, carrying the Canadian Space Agency's Radarsat trio of Earth observation satellites. The launch will be broadcast live on SpaceX's YouTube channel. If you're keen to follow along, we have all the details you need right here.
Canada's three Radarsat satellites, shaped like old rubber stamps, will gather data about the nation's coasts and waterways to help ships navigate the Arctic, provide agriculture solutions and help first responders save lives, according to the agency. Each satellite is almost as big as a Tesla Roadster, but only half as heavy. Eventually the satellites will settle into an orbit around 600 kilometers (around 370 miles) above the Earth.
[...] The launch window opens on Wednesday, June 12 at 7:17 a.m. PT[*] and closes 13 minutes later, at 7:30 a.m. PT. Like most launches, a backup window will open 24 hours later, on June 13, should something go awry during the first launch window. The satellites will deploy at 54 minutes into the flight.
takyon: The rocket will carry one of the most expensive payloads that SpaceX has ever attempted to launch. At more than $1 billion, the Radarsat satellites will cost more than 4 years of the Canadian Space Agency's approximately $250 million annual budget.
*Launch is scheduled for 1417-1430 GMT (10:17-10:30 a.m. EDT; 7:17-7:30 a.m. PDT). See it on YouTube.
Also at NASASpaceFlight.
Submitted via IRC for Bytram
Nine states and the District of Columbia today filed a lawsuit against T-Mobile and Sprint in an attempt to stop the wireless carriers from merging.
New York Attorney General Letitia James and California AG Xavier Becerra are leading the way, with co-plaintiffs from Colorado, Connecticut, the District of Columbia, Maryland, Michigan, Mississippi, Virginia, and Wisconsin.
"When it comes to corporate power, bigger isn't always better," James said in an announcement of the lawsuit. "The T-Mobile and Sprint merger would not only cause irreparable harm to mobile subscribers nationwide by cutting access to affordable, reliable wireless service for millions of Americans, but would particularly affect lower-income and minority communities here in New York and in urban areas across the country. That's why we are going to court to stop this merger and protect our consumers, because this is exactly the sort of consumer-harming, job-killing megamerger our antitrust laws were designed to prevent."
Becerra argued that the "merger would hurt the most vulnerable Californians and result in a compressed market with fewer choices and higher prices."
The AGs filed their complaint in US District Court for the Southern District of New York.
Submitted via IRC for Bytram
In their report published in the Bulletin of the Seismological Society of America, Louis Quinones and Heather DeShon of Southern Methodist University and colleagues confirm that seismicity rates in the basin have decreased since 2014, a trend that appears to correspond with a decrease in wastewater injection.
However, their analysis also notes that new faults have become active during this period, and that seismicity continues at a greater distance from injection wells over time, suggesting that "far-field" changes in seismic stress will be important for understanding the basin's future earthquake hazard potential.
"One thing we have come to appreciate is how broadly injection in the basin has modified stress within [the] entire basin," said DeShon. The first thing researchers noted with wastewater injection into the basin "was the reactivation of individual faults," she added, "and what we're now starting to see is essentially the leftover energy on all sorts of little faults being released by the cumulative volume that's been put into the basin."
[...] The researchers found that overall seismicity in the Fort Worth Basin has been strongly correlated in time and space with wastewater injection activities, with most seismicity occurring within 15 kilometers of disposal wells.
Submitted via IRC for Bytram
[A] Baylor University study [DOI: 10.1029/2019GL082252] [DX], published in the journal Geophysical Research Letters, combined data from NASA's Gravity Recovery and Interior Laboratory (GRAIL) and Lunar Reconnaissance Orbiter (LRO) missions and found the huge blob lurking over a hundred miles beneath the South Pole-Aitken basin.
The mass, which isn't immediately obvious on the surface, appears to be dragging down the lunar landscape above it by around half a mile. In terms of size, lead author of the paper, Peter B. James, compared it to a pile of metal five times the size of Hawaii's big island.
[...] The scientists have a number of theories for where this mass could have come from, including one that involves the solidification of an ocean of lunar magma.
The leading theory posits the mass comes from an asteroid with an iron-nickel core that smacked into the lunar surface four billion years ago. Scientists calculated that a sufficiently dispersed impactor core could remain suspended in the Moon's mantle rather than sink to the core.
Donations -- many of them anything but charitable -- are at the heart of the admissions scandal. Using a sham foundation, parents paid off coaches, in part with donations to their programs. Then the donors' sons and daughters ended up on lists of recruited athletes, easing their admission to competitive colleges. Everything was fake. The foundation was a tool for money laundering.
College officials have pointed out that those bribed were not admissions officials or development officers.
But not everyone is convinced that a sufficient divide is in place between fund-raising and admissions decisions at some institutions. U.S. senator Ron Wyden of Oregon in March pledged to introduce legislation to respond to the scandal. Wyden is the ranking Democrat on the Senate Finance Committee, an influential perch from which to shape tax policy, which is key to any regulation of tax deductions. In April, Wyden and Senator Charles Grassley, the Iowa Republican who chairs the Finance Committee, wrote to the Internal Revenue Service to urge tough enforcement against any abuses of the charitable contributions tax breaks that came to light during the admissions scandal.
Last week, Wyden released his promised legislation. Whether the bill moves or not, it is likely to focus attention on the way some colleges have admitted students in close proximity to large donations from their families. Among other things, the legislation would:
- Require colleges to establish formal policies to bar the consideration of family members' donations or ability to donate in evaluating an applicant.
- Require colleges to report "the number of applicants, admitted students and enrolled students who are the children of donors." The Department of Education would make the reports public.
- Limit deductions for gifts -- at colleges that don't bar consideration of donor status in applicants -- to $100,000 over a six-year period around an applicant's enrollment at a college. Larger donations would not be eligible for the standard tax deductions for charitable contributions.
Wyden's bill would make some of these changes through the Higher Education Act and others through the "quid pro quo" provisions in IRS regulations about charitable donations. That is a part of the tax code that covers situations where a donor gets a benefit related to a contribution. The value of that benefit can't be deducted -- and college fund-raisers object to even the idea that donations from the parents of an applicant would be considered under laws about quid pro quos.
Wyden issued a statement upon introducing his legislation that noted that colleges could assure the tax deductibility of all donations they receive simply by having a policy that bars consideration of donations in the admissions process.
“While middle-class families are pinching pennies to pay tuition and graduates are buried under tens of thousands of dollars in student debt, wealthy families are greasing the skids to get their children into elite schools on the backs of those same families and graduates. It’s absurd that the tax code subsidizes the top 1 percent buying their way into school,” said Wyden. “Colleges and universities would be able to preserve the tax deductibility of all donations if they simply bar their consideration in admissions decisions. It’s ‘one and done’ -- implement a policy and you’re in compliance.”
[...] The case that probably has received the most attention is that of Jared Kushner, a top aide to President Trump (and his son-in-law).
Daniel Golden, now an editor at ProPublica, first wrote about Kushner's admission to Harvard University in Golden's 2006 book The Price of Admission, which was published before people were imagining that Donald Trump would become president and before Kushner married into Trump's family. Golden recounted the story in 2016, after Trump was elected, in an article in ProPublica. The article and the book say that Charles Kushner, Jared's father, pledged $2.5 million to Harvard in 1998, shortly before Jared was admitted. The elder Kushner denied that the gift was related to his son's application.
It is difficult to prove that someone shouldn't have been admitted to Harvard. The university uses holistic admissions, under which an applicant's entire record is considered. There is no rubric of test scores and grades used to determine in a consistent way who will be admitted and who will be rejected. That said, Golden interviewed people at the high school Jared Kushner attended, and they expressed shock that he had won admission.
“There was no way anybody in the administrative office of the school thought he would on the merits get into Harvard,” one former official told Golden. “His [grade point average] did not warrant it, his SAT scores did not warrant it. We thought for sure, there was no way this was going to happen. Then, lo and behold, Jared was accepted. It was a little bit disappointing because there were at the time other kids we thought should really get in on the merits, and they did not.”
Submitted via IRC for Runaway1956
Universo Santi in the southern Spanish city of Jerez is dedicated to helping people with disabilities join the mainstream workforce
The first thing that strikes you is the calm, the light, the modern art on the walls – and then of course the food.
It's only later that you realise there is something different, and a little special, about Universo Santi, a restaurant in the southern Spanish city of Jerez.
"People don't come here because the staff are disabled but because it's the best restaurant in the area. Whatever reason they came for, the talking is about the food," says Antonio Vila.
Vila is the president of the Fundación Universo Accesible, a not-for-profit organisation dedicated to helping people with disabilities join the mainstream workforce. He has also been the driving force behind Universo Santi, the haute cuisine restaurant whose 20 employees all have some form of disability.
[...] The 20 staff, whose ages range from 22 to 62, were recruited from an original list of 1,500. To qualify, applicants had to be unemployed and have more than 35% disability.
[...] The Jerez restaurant takes its name from Santi Santamaria, chef at the Michelin three-star Can Fabes in Catalonia until his sudden death in 2011. Can Fabes closed shortly afterwards but his family wanted to carry on his name and culinary tradition and were keen to support the Jerez project.
The family's enthusiasm attracted the attention of Spain's top chefs, among them Martín Berasategui, Roca and Ángel León, all of whom have contributed recipes and their time as guest chefs at the restaurant.
Disciples of Santamaria helped establish the kitchen, whose equipment was transferred in its entirety from Can Fabes, and several of the dishes on the menu de degustación are Santamaria originals.
The restaurant has been visited by Michelin Guide personnel and may soon have its first Michelin star.
Submitted via IRC for Bytram
The Rowhammer exploit that lets unprivileged attackers corrupt or change data stored in vulnerable memory chips has evolved over the past four years to take on a range of malicious capabilities, including elevating system rights and breaking out of security sandboxes, rooting Android phones, and taking control of supposedly impregnable virtual machines. Now, researchers are unveiling a new attack that uses Rowhammer to extract cryptographic keys or other secrets stored in vulnerable DRAM modules.
[...] RAMBleed takes Rowhammer in a new direction. Rather than using bit flips to alter sensitive data, the new technique exploits the hardware bug to extract sensitive data stored in memory regions that are off-limits to attackers. The attacks require only that the exploit hammers memory locations the exploit code already has permission to access. What's more, the data extraction can work even when DRAM protected by error correcting code detects and reverses a malicious bit flip.
Besides opening a previously unknown side channel that allows attackers to deduce sensitive data, the attack also introduces new ways unprivileged exploit code can cause cryptographic keys or other secret data to load into the select DRAM rows that are susceptible to extraction. By combining the memory massaging techniques with this new side-channel attack, the researchers—from the University of Michigan, Graz University of Technology, and the University of Adelaide and Data61—were able to extract an RSA 2048-bit signing key from an OpenSSH server using only user-level permissions. In a research paper published on Tuesday, the researchers wrote:
Previous research mostly considers Rowhammer as a threat to data integrity, allowing an unprivileged attacker to modify data without accessing it. With RAMBleed, however, we show that Rowhammer effects also have implications on data confidentiality, allowing an unprivileged attacker to leverage Rowhammer-induced bit flips in order to read the value of neighboring bits. Furthermore, as not every bit in DRAM can be flipped via Rowhammer, we also present novel memory massaging techniques that aim to locate and subsequently exploit Rowhammer flippable bits. This enables the attacker to read otherwise inaccessible information such as secret key bits. Finally, as our techniques only require the attacker to allocate and deallocate memory and to measure instruction timings, RAMBleed allows an unprivileged attacker to read secret data using the default configuration of many systems (e.g., Ubuntu Linux), without requiring any special configurations (e.g., access to pagemap, huge pages, or memory deduplication).
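The inference at the heart of RAMBleed can be illustrated with a toy model. To be clear, this simulation is an assumption-laden sketch of the idea, not the paper's actual technique: a Rowhammer-susceptible bit in an attacker-owned row tends to flip only when the adjacent victim bits hold particular values, so the attacker hammers, reads back their own row, and deduces the neighbor's bit without ever touching victim memory.

```python
import random

def hammer_and_read(secret_bit: int, trials: int = 100) -> int:
    """Toy model: a sampling bit in the attacker's row flips with high
    probability only when the neighboring (secret) victim bit is 1."""
    flip_probability = 0.9 if secret_bit == 1 else 0.0
    flips = sum(random.random() < flip_probability for _ in range(trials))
    # The attacker observes flips only in memory it owns, then infers
    # the inaccessible neighbor bit from whether any flips occurred.
    return 1 if flips > 0 else 0

random.seed(0)
secret = [1, 0, 1, 1, 0, 0, 1, 0]                 # victim bits (off-limits)
recovered = [hammer_and_read(b) for b in secret]  # attacker's inference
print(recovered == secret)   # True: every bit deduced via the side channel
```

The real attack's difficulty lies in the "memory massaging" the paper describes: arranging physical memory so the secret actually lands in rows adjacent to flippable attacker-controlled bits.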
While RAMBleed represents a new threat that hardware and software engineers will be forced to protect against, it seems unlikely that exploits will be carried out in real-world attacks any time soon. That's because, like most other Rowhammer-based attacks, RAMBleed requires a fair amount of overhead and at least some luck. For determined attackers in the field today, there may be more reliable attacks that achieve the same purpose. While ordinary users shouldn't panic, RAMBleed and the previous attacks it builds on pose a longer-term threat, especially for users of low-cost commodity hardware.
An international team of researchers has found evidence that suggests the large dome found on the surface of the dwarf planet Ceres is made of slurry—a mix of salty brine and solid particles. In their paper published in the journal Nature Geoscience, the group describes their study of data from the Dawn spacecraft and what it revealed.
Back in 2015, NASA's Dawn space probe showed that there was a dome-shaped mountain approximately four kilometers high and seventeen kilometers wide—since named Ahuna Mons—rising from the surface of Ceres, a dwarf planet residing in our solar system's asteroid belt. Initial inspection suggested volcanism; the dome-shaped, streaked mountain with salt on its slope looked reminiscent of volcanoes here on Earth, or even the icy domes seen on some of the solar system's moons. But logic has suggested that the mechanics involved in creating volcanism on a dwarf planet would not work: because of its small size, such a body should cool down and solidify, preventing any interior activity. But that logic appears not to apply to Ceres, the team found.
[...] More information: Ottaviano Ruesch et al. Slurry extrusion on Ceres from a convective mud-bearing mantle, Nature Geoscience (2019). DOI: 10.1038/s41561-019-0378-7
Just because it's small doesn't mean it's dead.