New tool helps distinguish the cause of blood clots:
A new tool using cutting-edge technology is able to distinguish different types of blood clots based on what caused them, according to a study published in eLife.
The tool could help physicians diagnose what caused a blood clot and select a treatment that targets that cause to break it up. For example, it could help them determine whether aspirin or another kind of anti-clotting drug would be the best choice for a person who has just had a heart attack or stroke.
[...] To develop a more effective approach to identifying different types of blood clots, Zhou and her colleagues took blood samples from a healthy individual and then exposed them to different clotting agents. The team captured thousands of images of the different types of clots using a technique called high-throughput imaging flow cytometry.
They next used a type of machine-learning technology called a convolutional neural network to train a computer to identify subtle differences in the shape of different types of clots caused by different molecules. They tested this tool on 25,000 clot images that the computer had never seen before and found it was also able to distinguish most of the clot types in the images.
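To make the classification step concrete, here is a toy Python sketch of the core CNN operations (convolution, ReLU activation, pooling, softmax). It is purely illustrative: the random image, the random filters, and the three-class setup are assumptions for the sketch, not the authors' iPAC model, which learns its filters from thousands of labeled clot images.

```python
import math
import random

def conv2d(image, kernel):
    """Valid 2-D convolution of a single-channel image with one filter,
    followed by a ReLU activation (negative responses clipped to zero)."""
    ih, iw = len(image), len(image[0])
    kh, kw = len(kernel), len(kernel[0])
    out = []
    for i in range(ih - kh + 1):
        row = []
        for j in range(iw - kw + 1):
            s = sum(image[i + di][j + dj] * kernel[di][dj]
                    for di in range(kh) for dj in range(kw))
            row.append(max(s, 0.0))  # ReLU
        out.append(row)
    return out

def global_avg_pool(fmap):
    """Collapse a feature map to a single average response."""
    vals = [v for row in fmap for v in row]
    return sum(vals) / len(vals)

def softmax(scores):
    """Turn raw class scores into probabilities that sum to 1."""
    m = max(scores)
    exps = [math.exp(s - m) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

random.seed(0)
# An 8x8 stand-in for a clot image from the flow cytometer.
image = [[random.random() for _ in range(8)] for _ in range(8)]
# One random 3x3 filter per hypothetical clot class; a trained CNN
# would have learned these from labeled examples.
kernels = [[[random.uniform(-1, 1) for _ in range(3)] for _ in range(3)]
           for _ in range(3)]
features = [global_avg_pool(conv2d(image, k)) for k in kernels]
probs = softmax(features)
print(probs)  # one probability per clot class
```

A real network stacks many such convolution layers and trains the filter weights by gradient descent; the sketch only shows the forward pass that turns an image into class probabilities.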
Finally, they tested whether this new tool, which they named the intelligent platelet aggregate classifier (iPAC), can diagnose different clot types in human blood samples. They took blood samples from four healthy people, exposed them to different clotting agents, and showed that iPAC could tell the different types of clots apart.
"We showed that iPAC is a powerful tool for studying the underlying mechanism of clot formation," Zhou says. She adds that, given recent reports that COVID-19 causes blood clots, the technology could one day be used to better understand the mechanism behind these clots too, although much about the virus currently remains unknown.
Journal Reference
Yuqi Zhou, Atsushi Yasumoto, Cheng Lei, et al. Intelligent classification of platelet aggregates by agonist type, eLife (2020). DOI: 10.7554/eLife.52938
The United Launch Alliance and the U.S. Space Force are targeting Sunday morning for launching the secretive reusable X-37B spaceplane back into orbit for a sixth mission.
Watch live here (launch is scheduled for 9:14am Eastern time)
The launch was originally planned for 8:24 AM EDT Saturday May 16th but was scrubbed due to weather.
To date the space plane has spent a total of 2,865 days in orbit, with the longest mission running for 780 days and ending on October 27, 2019. The details of the current mission are, as with previous missions, mostly undisclosed; however, this mission has a notable difference:
This mission will have even more experiments than usual, thanks to the addition of a new service module — a cylindrical structure attached to the bottom of the spaceplane that will be packed with technology to be tested on orbit. "This will be the first X-37B mission to use a service module to host experiments," Randy Walden, director and program executive officer for the Air Force Rapid Capabilities Office, said in a statement. "The incorporation of a service module on this mission enables us to continue to expand the capabilities of the spacecraft and host more experiments than any of the previous missions."
Information on a few of the experiments has been made public:
Tagging along with the X-37B is a small satellite called FalconSat-8 developed by the US Air Force Academy that carries five experimental payloads. The spaceplane will supposedly deploy the FalconSat-8 when it reaches orbit. NASA is also sending two experiments up on this flight to study how space radiation degrades certain materials as well as seeds needed for food. And the US Naval Research Laboratory has included an experiment that will "transform solar power into radio frequency microwave energy" that can then be sent to the ground for use.
A tribute to COVID-19 victims, first responders, and front-line workers has been added to the side of the Atlas V rocket that will be used for the launch.
NASA's 'Artemis Accords' set forth new and old rules for outer space cooperation
NASA's plan to return to the Moon is ambitious enough on its own, but the agency is aiming to modernize international cooperation in space in the process. Today it published a summary of the "Artemis Accords," a new set of voluntary guidelines that partner nations and organizations are invited to join to advance the cause of exploration and industry globally.
Having no national affiliation or sovereignty of its own, space is by definition lawless. So these are not so much space laws as shared priorities given reasonably solid form. Many nations already take part in a variety of agreements and treaties, but the progress of space exploration (and soon, colonization and mining, among other things) has outpaced much of that structure. A fresh coat of paint is overdue and NASA has decided to take up the brush.
[...] First, the rules that could be considered new. NASA and partner nations agree to:
- Publicly describe policies and plans in a transparent manner.
- Publicly provide location and general nature of operations to create "Safety Zones" and avoid conflicts.
- Use international open standards, develop new such standards if necessary and support interoperability as far as is practical.
- Release scientific data publicly in a full and timely manner.
- Protect sites and artifacts with historic value. (For example, Apollo program landing sites, which have no real lawful protection.)
- Plan for the mitigation of orbital debris, including safe and timely disposal of end-of-life spacecraft.
Also at The Verge, Ars Technica, and Reuters.
Medieval arrows caused injuries similar to gunshot wounds, study finds:
The English longbow was a powerful medieval weapon said to be able to pierce an opponent's armor, and it may have been a decisive factor in several key military victories, most notably the Battle of Agincourt. A recent paper published in the Antiquaries Journal by a team of archaeologists at the University of Exeter in the UK has yielded evidence that longbow arrows created wounds similar to modern-day gunshot wounds and were capable of penetrating through long bones.
Historians continue to debate just how effective the longbow was in battle. There have been numerous re-enactment experiments with replicas, but no medieval-period longbows have survived, although many 16th-century specimens were recovered from the wreck of the Mary Rose. The University of Exeter's Oliver Creighton, who led the latest study, and his co-authors argue that such experiments are typically done over shorter ranges, so the arrows are not fully stable and spinning in flight. This, in turn, would affect the kinds of injuries combatants sustained. He and his team believe their analysis shows the importance of osteological evidence in helping to resolve such debates.
It's relatively rare to find direct evidence of violent trauma from weapons to skeletal remains in medieval burial sites, with the exception of mass burials from known historical battles. The best-known such sites are associated with the 1361 Battle of Visby in Gotland, Sweden, and the 1461 Battle of Towton in North Yorkshire, England. Per the authors, data from these sites has yielded useful information on "the realities of medieval warfare—how people fought and were killed, which weapons were used and what sorts of injuries these caused, and what armor (if any) was worn." Evidence of trauma specifically caused by arrowheads is even rarer.
The current study examined 22 bone fragments and three teeth, all showing clear signs of trauma. All were collected during the excavation of the burial ground of a medieval Dominican friary in Exeter from 1997 to 2007, to prepare for the construction of the Princesshay shopping district. Established in 1232 and officially consecrated in 1259, the friary's burial grounds likely included wealthy, high-status laypersons, according to the authors.
[...] "These results have profound implications for our understanding of the power of the medieval longbow; for how we recognize arrow trauma in the archaeological record; and for where battle casualties were buried," Creighton told Medievalists.net. "In the medieval world, death caused by an arrow in the eye or the face could have special significance. Clerical writers sometimes saw the injury as a divinely ordained punishment, with the 'arrow in the eye' which may or may not have been sustained by King Harold II on the battlefield of Hastings in 1066 the most famous case in point. Our study brings into focus the horrific reality of such an injury."
DOI: Antiquaries Journal, 2020. 10.1017/S0003581520000116 (About DOIs).
Powerful new AI technique detects and classifies galaxies in astronomy image data:
Researchers at UC Santa Cruz have developed a powerful new computer program called Morpheus that can analyze astronomical image data pixel by pixel to identify and classify all of the galaxies and stars in large data sets from astronomy surveys.
Morpheus is a deep-learning framework that incorporates a variety of artificial intelligence technologies developed for applications such as image and speech recognition. Brant Robertson, a professor of astronomy and astrophysics who leads the Computational Astrophysics Research Group at UC Santa Cruz, said the rapidly increasing size of astronomy data sets has made it essential to automate some of the tasks traditionally done by astronomers.
"There are some things we simply cannot do as humans, so we have to find ways to use computers to deal with the huge amount of data that will be coming in over the next few years from large astronomical survey projects," he said.
[...] The morphologies of galaxies, from rotating disk galaxies like our own Milky Way to amorphous elliptical and spheroidal galaxies, can tell astronomers about how galaxies form and evolve over time. Large-scale surveys, such as the Legacy Survey of Space and Time (LSST) to be conducted at the Vera Rubin Observatory now under construction in Chile, will generate huge amounts of image data, and Robertson has been involved in planning how to use that data to understand the formation and evolution of galaxies. LSST will take more than 800 panoramic images each night with a 3.2-billion-pixel camera, recording the entire visible sky twice each week.
[...] When Morpheus processes an image of an area of the sky, it generates a new set of images of that part of the sky in which all objects are color-coded based on their morphology, separating astronomical objects from the background and identifying point sources (stars) and different types of galaxies. The output includes a confidence level for each classification. Running on UCSC's lux supercomputer, the program rapidly generates a pixel-by-pixel analysis for the entire data set.
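The output format described above, a label plus a confidence for every pixel, can be illustrated with a toy Python sketch. The hand-written scoring rules and class names below are stand-ins for Morpheus' learned deep network, chosen only to show the per-pixel label/confidence idea, not the program's actual method.

```python
import math

CLASSES = ["background", "point_source", "extended"]  # illustrative label set

def classify_pixel(value, neighbourhood_mean):
    """Return (label, confidence) for one pixel.

    `value` and `neighbourhood_mean` are brightnesses in [0, 1].
    The hand-written scores stand in for a trained network's outputs.
    """
    scores = [
        1.0 - value,                     # background: dim pixel
        value - neighbourhood_mean,      # point source: brighter than surroundings
        min(value, neighbourhood_mean),  # extended source: bright region overall
    ]
    # Softmax turns the scores into a confidence for each class.
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return CLASSES[best], probs[best]

# A bright, isolated pixel is labeled a point source (star)...
print(classify_pixel(1.0, 0.1))
# ...while a dim pixel in a dark region is labeled background.
print(classify_pixel(0.05, 0.05))
```

Applying such a function to every pixel yields exactly the kind of color-codable, confidence-annotated map described above, though Morpheus derives its scores from deep learning rather than fixed rules.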
"Morpheus provides detection and morphological classification of astronomical objects at a level of granularity that doesn't currently exist," Hausen said.
More information: Astrophysical Journal Supplement (2020). DOI: 10.3847/1538-4365/ab8868
Geometry guided construction of earliest known temple, built 6,000 years before Stonehenge:
The sprawling 11,500-year-old stone Göbekli Tepe complex in southeastern Anatolia, Turkey, is the earliest known temple in human history and one of the most important discoveries of Neolithic research.
Researchers at Tel Aviv University and the Israel Antiquities Authority have now used architectural analysis to discover that geometry informed the layout of Göbekli Tepe's impressive round stone structures and enormous assembly of limestone pillars, which they say were initially planned as a single structure.
Three of Göbekli Tepe's monumental round structures, the largest of which is 20 meters in diameter, were initially planned as a single project, according to researchers Gil Haklay of the Israel Antiquities Authority, a Ph.D. candidate at Tel Aviv University, and Prof. Avi Gopher of TAU's Department of Archaeology and Ancient Near Eastern Civilizations. They used a computer algorithm to trace aspects of the architectural design processes involved in the construction of these enclosures in this early Neolithic site.
Their findings were published in Cambridge Archaeological Journal in May.
[...] Discovered by German archaeologist Dr. Klaus Schmidt in 1994, Göbekli Tepe has since been the subject of hot archaeological debate. But while these and other early Neolithic remains have been intensively studied, the issue of architectural planning during these periods and its cultural ramifications has not.
Most researchers have made the case that the Göbekli Tepe enclosures at the main excavation area were constructed over time. However, Haklay and Prof. Gopher say that three of the structures were designed as a single project and according to a coherent geometric pattern.
[...] "This case of early architectural planning may serve as an example of the dynamics of cultural changes during the early parts of the Neolithic period," Haklay says. "Our findings suggest that major architectural transformations during this period, such as the transition to rectangular architecture, were knowledge-based, top-down processes carried out by specialists.
"The most important and basic methods of architectural planning were devised in the Levant in the Late Epipaleolithic period as part of the Natufian culture and through the early Neolithic period. Our new research indicates that the methods of architectural planning, abstract design rules and organizational patterns were already being used during this formative period in human history."
Next, the researchers intend to investigate the architectural remains of other Neolithic sites throughout the Levant.
More information: Gil Haklay et al, Geometry and Architectural Planning at Göbekli Tepe, Turkey, Cambridge Archaeological Journal (2020). DOI: 10.1017/S0959774319000660
Nine in ten biz applications harbor out-of-date, unsupported, insecure open-source code, study shows:
Ninety-one per cent of commercial applications include outdated or abandoned open source components, underscoring the potential vulnerability of organizations using untended code, according to a software review.
Synopsys, a California-based design automation biz, conducted an audit of 1,253 commercial codebases in 17 industries for its 2020 Open Source Security and Risk Analysis report.
It found that almost all (99 per cent) of the codebases examined have at least one open source component and that 70 per cent of the code overall is open source. That's about twice as much as the company's 2015 report, which found only 36 per cent of audited code was open source.
Good news, then: open source code has become more important to organizations. But its risks have followed, exemplified by vulnerabilities like the 2014 Heartbleed memory-disclosure bug and the Apache Struts flaws identified in 2017 and 2018.
Ninety-one percent of the audited applications had components that are either four years out of date or have exhibited no active development for two years. In 2019 – the time-period covered by the 2020 report – the percentage of codebases containing vulnerable components rose to 75 per cent, up from 60 per cent in 2018.
The percentage of applications afflicted with high-risk flaws reached 49 per cent in 2019, up from 40 per cent in 2018.
[Ed Note - The company that produced this report, Synopsys, is a vendor in this space and is not a disinterested party.]
CBS is launching a new Star Trek series, Strange New Worlds, which will be a TOS prequel set prior to Kirk assuming command of the Enterprise. As in season 2 of Discovery, the new series will feature Anson Mount as Captain Pike, Rebecca Romijn as Number One, and Ethan Peck as Spock. Discovery has been polarizing for Star Trek fans, with many criticizing the writing of both Discovery and Picard, saying it deviated from the defining characteristics of Star Trek. Despite the criticisms, Mount's portrayal of Pike in Discovery was generally well received. The story for the pilot will be developed by Akiva Goldsman, Alex Kurtzman, and Jenny Lumet, the first two of whom are executive producers of Discovery. Because filming of TV shows has generally been halted by COVID-19, it is not known when the series will film or premiere on CBS' streaming service.
A combo of fasting plus vitamin C is effective for hard-to-treat cancers, study shows:
Scientists from USC and the IFOM Cancer Institute in Milan have found that a fasting-mimicking diet could be more effective at treating some types of cancer when combined with vitamin C.
In studies on mice, researchers found that the combination delayed tumor progression in multiple mouse models of colorectal cancer; in some mice, it caused disease regression. The results were published in the journal Nature Communications.
"For the first time, we have demonstrated how a completely non-toxic intervention can effectively treat an aggressive cancer," said Valter Longo, the study senior author and the director of the USC Longevity Institute at the USC Leonard Davis School of Gerontology and professor of biological sciences at the USC Dornsife College of Letters, Arts and Sciences. "We have taken two treatments that are studied extensively as interventions to delay aging -- a fasting-mimicking diet and vitamin C -- and combined them as a powerful treatment for cancer."
The researchers said that while fasting remains a challenging option for cancer patients, a safer, more feasible option is a low-calorie, plant-based diet that causes cells to respond as if the body were fasting. Their findings suggest that a low-toxicity treatment of fasting-mimicking diet plus vitamin C has the potential to replace more toxic treatments.
Journal Reference
Maira Di Tano, Franca Raucci, Claudio Vernieri, et al. Synergistic effect of fasting-mimicking diet and vitamin C against KRAS mutated cancers [open], Nature Communications (2020). DOI: 10.1038/s41467-020-16243-3
ALGOL 60 at 60: The Greatest Computer Language You've (Probably) Never Used:
2020 marks 60 years since ALGOL 60 laid the groundwork for a multitude of computer languages.
The Register spoke to The National Museum of Computing's Peter Onion and Andrew Herbert to learn a bit more about the good old days of punch tapes.
ALGOL 60 was the successor to ALGOL 58, which debuted in 1958. ALGOL 58 had introduced the concept of code blocks (replete with begin and end delimiting pairs), but ALGOL 60 took these starting points of structured programming and ran with them, giving rise to familiar faces such as Pascal and C, as well as the likes of B and Simula.
"In the 1950s most code was originally written in machine code or assembly code," said Herbert, former director of Microsoft Research in Cambridge, with every computer having its own particular twist on things.
[...] "Fortran," said Herbert, "emerged as the first real programming language for scientific and numeric work. That convinced people that having higher-level languages (as they called them then – they were pretty primitive by modern standards) made programmers more productive."
[...] "And a bunch of people thought you could do better."
[...] One group started on the design of what was then called an "Algorithmic Language": a language for writing algorithms. The output, in 1958, described the language "ALGOL 58". However, as engineers began to create compilers for the new system, they found "all kinds of things hadn't really been thought about or worked through properly," recalled Herbert.
[...] Eventually, Herbert told us, "they published the ALGOL 60 report, which is the baseline that everyone then worked to."
[...] "People were sorting out some of the things that we now take for granted like ideas in structured programming, data structures, data types," he added.
[...] Alas, those seeking a handy-dandy "HELLO WORLD" example will be disappointed. The Achilles' heel of the language that would go on to inspire so many others was that it lacked standard input/output capabilities.
[...] Oh dear. The omission pretty much did for vendor independence as manufacturers naturally went their own way, leaving large chunks of code incompatible between systems. There were also elements of ALGOL 60 that were open to interpretation, leaving it a little compromised from the start.
While ALGOL ploughed its furrow, Fortran continued to be developed in parallel. "People in the Fortran world," explained Herbert, "saw ideas in ALGOL they quite liked and brought them across." As the decades passed, Fortran remained the centre of gravity for scientific computing while ALGOL became more of an academic language, used for teaching computer science ideas.
[...] The story of ALGOL 60 is not so much that of the language's eventual fate as of the languages it inspired. ALGOL W, based by Niklaus Wirth and QuickSort creator Tony Hoare on a proposal for ALGOL X, would go on to inspire Wirth's Pascal and Modula-2. Pascal's influence continues to be felt today.
ALGOL 60 also heavily influenced the Combined Programming Language (CPL), developed in the 1960s but not implemented until the following decade. CPL in turn led to Basic CPL (BCPL), from which B descended. The B language was further developed to become C.
[...] As for taking ALGOL 60 itself out for a spin today, there are a few options for those not fortunate enough to have an Elliott 803 or 903 to hand. MARST will translate ALGOL 60 to C, or one can get a feel for the whole 803 experience via a simulator.
This humble scribe did not have occasion to use ALGOL, but did spend the better part of two years programming Pascal professionally. At a time when it seemed all other languages had a limited — and finite — set of data types, Pascal was different. It encouraged the creation of whatever data types and data structures best matched the task at hand.
Failure to delete hate speech could cost Facebook, Google billions in France
Lawmakers in France this week passed a controversial new law that could impose billions in fines on social media companies that fail to delete certain kinds of content quickly enough—within an hour, in some cases.
The new legislation (page in French) gives online platforms 24 hours from notification to remove certain kinds of content or else face fines.
Content subject to enforcement under the law includes: sexual harassment; child pornography; anything that promotes certain crimes; anything that promotes discrimination, hate, or violence; anything that denies crimes against humanity; and promotion of terrorism. The window for removing content related to child pornography or terrorism is shorter, only one hour.
A company that fails to remove such content within the correct time limit after being notified of it can be fined €1.25 million ($1.35 million). If a regulatory board finds a company is not meeting its obligations, it can impose a maximum fine equal to 4 percent of that company's annual global revenue.
The BBC notes:
Digital rights group La Quadrature du Net said the requirement to take down content that the police considered "terrorism" in just one hour was impractical.
"Except the big companies, nobody can afford to have a 24/7 watch to remove the content when requested," a spokesman for the group said. "Hence, they will have to rely on censorship before receiving a request from the police."
That might be in the form of using an automatic system provided by the largest companies, giving them "more power on what can exist on the web or not".
But there are also fears that such tech could be used against groups such as protesters.
"Since 2015, we already had such a law that allowed the police to ask for the removal of some content if they deemed it to be terrorist... this has been used multiple times in France to censor political content," the spokesman said.
"Giving the police such a power, without any control... is obviously for us an infringement on the freedom of speech."
Zerodium Temporarily Stops Purchasing iOS Exploits Due to High Number of Submissions
Zerodium this week announced that it will not be purchasing any iOS exploits for the next two to three months due to a high number of submissions. In other words, the company has so many security vulnerabilities at its disposal that it does not need any more.
Zerodium is an exploit acquisition platform that pays researchers for zero-day security vulnerabilities and then sells them to institutional customers like government organizations and law enforcement agencies. The company focuses on high-risk vulnerabilities, normally offering between $100,000 and $2 million per fully functional iOS exploit.
Also at The Register and Wccftech.
Previously: Zero-Day Broker Publishes a Price Chart for Different Classes of Digital Intrusion
Exploit Vendor Drops Tor Browser Zero-Day on Twitter
Jennifer Ouellette over at Ars Technica is reporting on new research on "how distrust in health expertise spreads through social networks."
The article, published on 13 May in the journal Nature, compares network relationships within both pro- and anti-vaccination groups on Facebook. From the Ars piece:
Last year, the United States reported the greatest number of measles cases since 1992. According to the Centers for Disease Control and Prevention, there were 1,282 individual cases of measles in 31 states in 2019, and the majority were among people who were not vaccinated against measles. It was yet another example of how the proliferation of anti-vaccine messaging has put public health at risk, and the COVID-19 pandemic is only intensifying the spread of misinformation and conspiracy theories.
But there may be hope: researchers have developed a "map" of how distrust in health expertise spreads through social networks, according to a new paper published in the journal Nature. Such a map could help public health advocates better target their messaging efforts.
[...] [Lead author] Johnson and his colleagues analyzed Facebook communities actively posting about the topic of vaccines during the 2019 measles outbreak—more than 100 million users in all—from around the world, mapping out the interconnected networks of information across cities, countries, continents, and languages. There were three main camps: communities that were pro-vaccine, communities that were anti-vaccine, and communities that were neutral or undecided regarding the topic (groups focused on parenting, for instance).
The researchers then tracked how the various communities interacted with each other to create a detailed map of the networks. "It's not geographic, it's to do with closeness in a social network sense—in terms of information, influence," Johnson told Ars. "It's not whether I'm here and someone's in Australia. It's the fact that someone in Australia agrees with my slightly twisted narrative on COVID-19 and I'm getting their feed. Although my neighbor doesn't understand me, the person in Australia does.
[...] The results were counter-intuitive. While there were fewer individual people who were anti-vaccine on Facebook, there were almost three times as many anti-vax communities clustered around Facebook groups and pages. As a result, pro-vaccine groups seeking to counter the anti-vaccine misinformation often targeted the larger communities and missed the small- to medium-sized clusters growing rapidly just under their radar, according to Johnson.
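The structural finding (fewer anti-vaccine individuals, but roughly three times as many distinct clusters) can be pictured with a toy graph in which connected components stand in for clusters of linked Facebook pages. All of the nodes, edges, stances, and member counts in the Python sketch below are invented for illustration; they are not data from the paper.

```python
from collections import deque

# Toy data: pages with a (stance, member_count) pair, plus links between pages.
pages = {
    "P1": ("pro", 500_000), "P2": ("pro", 300_000),
    "A1": ("anti", 8_000), "A2": ("anti", 5_000), "A3": ("anti", 7_000),
    "A4": ("anti", 4_000), "A5": ("anti", 6_000), "A6": ("anti", 3_000),
}
links = [("P1", "P2"), ("A1", "A2"), ("A3", "A4"), ("A5", "A6")]

def clusters(pages, links):
    """Group pages into connected components (clusters) via breadth-first search."""
    adj = {p: set() for p in pages}
    for a, b in links:
        adj[a].add(b)
        adj[b].add(a)
    seen, out = set(), []
    for start in pages:
        if start in seen:
            continue
        comp, queue = [], deque([start])
        seen.add(start)
        while queue:
            node = queue.popleft()
            comp.append(node)
            for nxt in adj[node] - seen:
                seen.add(nxt)
                queue.append(nxt)
        out.append(comp)
    return out

by_stance = {"pro": [], "anti": []}
for comp in clusters(pages, links):
    by_stance[pages[comp[0]][0]].append(comp)

for stance, comps in by_stance.items():
    members = sum(pages[p][1] for c in comps for p in c)
    print(stance, "clusters:", len(comps), "total members:", members)
```

In this invented example the pro side has one big cluster with far more members, while the anti side has three separate clusters despite fewer total members, which is the shape of network a large-cluster-focused counter-messaging strategy would miss.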
With the COVID-19 pandemic, the spread of misinformation has gotten even worse. "We didn't stop the day we submitted this paper," said Johnson. "We've been monitoring every day, every minute, the conversations and what you see in these Facebook pages, in these clusters, these communities. It's gone into hyper drive since COVID-19." He and his colleagues developed a predictive model for the spread, which showed anti-vaccine sentiment dominating public discourse on the topic within a decade. Furthermore, "that was a worst-case scenario if nothing was done as of December 2019, when we submitted the paper," said Johnson. "Now it's amplified. If we did that same study now, I think it would be a lot faster than ten years because of the COVID-19 situation. It's the perfect storm."
[...] A new study [Abstract. Preprint PDF available for download] published in the journal BMJ Global Health bolsters Johnson et al.'s findings. Scientists at the University of Ottawa in Canada searched YouTube for the most widely viewed videos in English relating to COVID-19. They narrowed it down to 69 videos with more than 247 million views between them and then assessed the quality of the videos and the reliability of the information presented in each using a system developed specifically for public health emergencies.
The majority of the videos (72.5 percent) presented only factual information. The bad news is that 27.5 percent, or about one in four, contained misleading or inaccurate information, such as claims that pharmaceutical companies were sitting on a cure and refusing to sell it; incorrect public health recommendations; racist content; and outright conspiracy theories. Those videos—which mostly came from entertainment news, network, and Internet news sources—accounted for about a quarter of the total views (roughly 62 million views). The videos that scored the highest in terms of accuracy, quality, and usefulness for the public, by contrast, didn't rack up nearly as many views.
DOI: Nature, 2020. 10.1038/s41586-020-2281-1
DOI: BMJ Global Health, 2020. 10.1136/bmjgh-2020-002604 [Full paper here, gratis]
Background:
Back in the early days of SoylentNews, things were often fly-by-the-seat-of-our-pants. We tried to plan ahead and anticipate future needs. In retrospect, I'd like to think we did pretty well, all in all. One early casualty was the choice of our discussion system. My memory is fuzzy on the details, but I seem to recall it was based on "phpBB Forum Software" (corrections welcome!). That eventually was superseded by IRC.
Internet Relay Chat (IRC):
Yes, SoylentNews has its own IRC service. It's used for all manner of purposes. Ostensibly, it's for staff to communicate with each other about site plans, development, and operations. But, multiple "channels" are readily implemented, so we have a bunch of channels up and running. If you are new to IRC, the easiest way to get started is to use our web portal — just select a nick, accept "#Soylent" as the channel, and you're there!
If you have heard about IRC and are curious about our IRC service, please read on past the fold. Otherwise, a new story will be along presently.
Unrelated:
Please join me in wishing NCommander a Happy Birthday!
Operating Systems:
One of the early missteps was the choice of CentOS as the operating system for one of our servers: beryllium. All of our other servers ran Ubuntu. That CentOS server, beryllium, became the server for all the other services that were not directly required for site operations. Quite frankly, it's a bit of a mess. For the curious, expand the following for a subset of what runs there:
- Charybdis, IRC server, http://irc.soylentnews.org - ports 6667, 6697 (SSL)
- Atheme, IRC services
- Iris, IRC web chat, http://chat.soylentnews.org - port 3989, forwarded from 80 by Apache
- Various IRC bots
- ZNC, IRC bouncer for staff, http://irc.soylentnews.org - port 60000
- Yourls, URL shortener service on http://sylnt.us - port 80
- MySQL, used for Yourls
- Postfix
- Mailman
- Dovecot
- Apache2/httpd
- OpenSSH
- ntpd
Progress:
We are in the process of cleaning things up.
We now have 3 servers running Gentoo: lithium, magnesium, and our new server aluminum. Gentoo lets us custom build our servers so they are only running the services we need. That gives us better security (smaller attack surface) and better performance, too. Oh, and no systemd.
The Nitty Gritty: At this point, I'll turn the microphone over to Deucallion (aka Juggs) on what's happening with IRC on aluminum (lightly edited):
So far we have brought a new ircd (Internet Relay Chat daemon) into the network: "call.me.al". The two key points are:
- Moving services (NickServ, ChanServ, GroupServ, HostServ, SaslServ et al.). Those are all provided by one server-side process (atheme); anyone not clued up won't really know they exist as a separate thing, and will just interact with it to register a nick and then see it as the channel bots with all the daft names.
- Will be reversing DNS entries for irc1 and irc3.
If I do my part right, there will be minimal to no outage time caused by any of it.
Then there are all the ancillary bits and bots that do logs and stats and story subs and the like but they are not intrinsic to the main IRC infrastructure and just an inconvenience if they go away for an hour or so while ported across.
I announce to everyone here on IRC when I am doing work on something and anticipate a possible outage of some kind as TBH the only people who care if IRC goes down or is degraded in some form are the people using it at that time. As a user it is nice to know in that scenario that it is not your client playing up, nor your network, or your ISP etc. it's just gone for maintenance and sit it out; do not bother investigating. Same reason I announce when I stop messing with stuff so people know there are no works underway.
And for clarification, the three ircds we currently have are all classified as hubs, no leafs; they are peers in a network. There is no master-slave relationship in play. We think of irc. as being master because all the other ancillaries sit on it, but they can just as well sit on irc2. or "call.me.al". The ircds and services do not give a flying monkey what DNS name resolves to them; it is just convention to name the ircd that resides at irc2.soylentnews.org "irc2.soylentnews.org", or as it is, "irc2.sylnt.us" - but it is just that, a name, a label.
This is specifically why I am going with "call.me.al" for aluminum: it breaks that cognitive second guessing about "do I need to match the reverse DNS here or not" questions in my mind at least when I come back to look at it in a year or 2 or 3 or 5. Maybe I am just a simpleton with OCD or some such, but to my mind - a label should be a label, the DNS should be another thing. If they do not need to match, make them different for clarity.
Epilog:
Do keep in mind, this is all being done by volunteers in their (limited) spare time and at no charge. There's still much to do, but we are making progress. Our goal is, over the next couple of months or so, to have all of our servers refreshed and moved over to Gentoo. There will be hiccups. Hopefully they will be minor, few, and far between. As always, we will keep the community apprised of our progress.
So cross your fingers, and join me in thanking these fine folk for all their efforts: TheMightyBuzzard, Deucallion, audioguy, and NCommander!
Previously:
(2020-05-09) Site Potpourri for Mother's Day [Updated]
CNet:
It's going to be a while longer before the US Postal Service receives new mail delivery vehicles. The USPS has reportedly once again delayed its ongoing proposal process because of the coronavirus pandemic.
Trucks.com first reported the delay on Tuesday after the request for proposal period was supposed to end on March 27. Now, the service's latest filing pegs July 14 as the final date. A handful of companies have already provided prototype next-generation mail delivery vehicles in hopes of receiving a multibillion-dollar contract for the business.
The update is meant to introduce 200,000 replacement vehicles that incorporate electrification.