
posted by cmn32480 on Friday August 12 2016, @04:12AM   Printer-friendly

Arthur T Knackerbracket has found the following story:

Russian security outfit Dr. Web says it's found new malware for Linux.

The firm says the “Linux.Lady.1” trojan does the following three things:

  • Collect information about an infected computer and transfer it to the command and control server.
  • Download and launch a cryptocurrency mining utility.
  • Attack other computers on the network in order to install its own copy on them.

The good news is that while the trojan targets Linux systems, it doesn't rely on a Linux flaw to run. The problem instead lies between the ears of those who run Redis without requiring a password for connections. If that's you, know that the trojan will use Redis to make a connection and start downloading the parts of itself that do real damage.
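
If you're unsure whether one of your own machines is exposed, the quickest check is whether Redis answers a PING without credentials. Here is a minimal sketch using the third-party redis-py client (the host and port are illustrative, and this is not the trojan's code):

```python
# Check whether a Redis instance accepts unauthenticated connections --
# the misconfiguration Linux.Lady.1 reportedly exploits.
import redis  # third-party package: pip install redis

def redis_is_open(host: str, port: int = 6379) -> bool:
    client = redis.Redis(host=host, port=port, socket_timeout=3)
    try:
        # PING succeeds only if the server requires no AUTH.
        return bool(client.ping())
    except redis.exceptions.RedisError:
        # Covers "NOAUTH Authentication required", refused
        # connections, timeouts, and anything else redis-py raises.
        return False

if __name__ == "__main__":
    print(redis_is_open("127.0.0.1"))
```

The fix is just as simple: set a requirepass directive in redis.conf, and don't bind the server to a public interface in the first place.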

Once it worms its way in, the trojan phones home to its command and control server and sends information including the flavour of Linux installed, the number of CPUs on the infected machine and the number of running processes. The Register imagines that this information lets whoever runs the malware make a decent guess at whether it is worth getting down to some mining, as there's little point working with an ancient CPU that's already maxed out.
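
That fingerprint takes only a few lines to gather, which illustrates how cheap the reconnaissance step is. A benign sketch of the same three data points, collected locally and sent nowhere (the /proc trick is Linux-specific):

```python
# Benign illustration: the three data points the trojan is said to
# report. Nothing here leaves the machine.
import os
import platform

def host_fingerprint() -> dict:
    return {
        "os_flavour": platform.platform(),  # e.g. "Linux-5.15.0-...-x86_64"
        "cpu_count": os.cpu_count(),
        # On Linux, every running process has a numeric directory in /proc.
        "process_count": sum(1 for d in os.listdir("/proc") if d.isdigit()),
    }

if __name__ == "__main__":
    print(host_fingerprint())
```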


Original Submission

 
This discussion has been archived. No new comments can be posted.
  • (Score: 2) by butthurt (6141) on Friday August 12 2016, @08:56AM (#386942) Journal

    Excluding spam, it looks as though 15 of the pending submissions in the current queue [archive.is] were manually submitted, whilst 4 were scraped by Arthur T Knackerbracket. Among the 10 approved submissions, 7 were scraped by Arthur T Knackerbracket. This leaves me with the impression that the editors are approving stories from that bot more readily than those submitted by humans.

  • (Score: 2) by c0lo (156) Subscriber Badge on Friday August 12 2016, @09:37AM (#386946) Journal

    Excluding spam, it looks as though 15 of the pending submissions in the current queue were manually submitted, whilst 4 were scraped by Arthur T Knackerbracket.

    Quality and potential interest (subjective, I know) also matter.

    Those 15 pending submissions:
    - 3 are book/movie/TV reviews - a topic mainly for the weekend
    - 2 are spam - those "Thank you for sharing" ones
    - 1 was marked by its author as being for the weekend (the phone-home vibrator story)
    - 1 is a badly formatted submission apparently about a movie (a Hook reunion - I don't know what Hook is, and I'm not in the mood to click on 20+ links to find out)
    - 1 is not a story but a poll suggestion

    4 by Arthur

    15-(3+2+1+1+1)-4=3

    --
    https://www.youtube.com/watch?v=aoFiw2jMy-0 https://soylentnews.org/~MichaelDavidCrawford
    • (Score: 0) by Anonymous Coward on Friday August 12 2016, @03:57PM (#387065)

      I'm not in the mood to click on 20+ links

      Before your post, my curiosity had gotten the better of me and I had looked at that submission.
      It mentions Robin Williams and the concurrent anniversary of his death.
      The term "The Lost Boys" is also significant within the post.

      Yes, it is link soup for those who don't recognize the shibboleth.
      (Not being a movie guy for many years now, I was surprised that I did.)
      The post refers to a live-action retelling of J.M. Barrie's "Peter Pan", whose nemesis, the pirate Captain Hook, had a hand bitten off by a crocodile.
      (Disney did an animated version in the 1950s.)

      -- OriginalOwner_ [soylentnews.org]

  • (Score: 4, Informative) by n1 (993) on Friday August 12 2016, @12:00PM (#386976) Journal

    Glad you brought this up, have a few thoughts on this myself.

    Firstly, I'd be happy if we stopped using The Register's stupid 'hey buddy, we're cool too' headlines. Other than that, multiple El Reg stories in a day is a coincidence, but it does perhaps illustrate how good they are at drawing interest and clicks with the way they frame stories; being a tech tabloid isn't a bad thing for them.

    Beyond that, there are times when we do approve stories by Arthur quicker than normal submissions; this is a decision made in the moment that could be for many reasons.

    1) The stories that Arthur submits have already been filtered by an editor, so they're usually worth running. As you can still see in the queue, my manual submissions as an editor don't get any special treatment.
    2) Sometimes it's because it's quicker to deal with a general-interest story found by Arthur than to try to make the dregs of the submissions queue into something worthwhile. This can be seen as lazy, but it's also a practical way to keep the content rolling at a reasonable level of quality.
    3) Personally, I try to keep a mix of subjects/topics. I don't like running several Arthur stories in a row, but if it creates a broader spectrum of topics and potential discussions, it's worth it.
    4) What you're seeing in the queue now is not necessarily what was there when the stories were chosen. Yesterday, the last time I checked, we were down to a small number of low-quality submissions (plus my awesome ones that no one wants to touch); the rest were Arthur's, and a lot of those are the ones that got picked up. The run of stories you're seeing now was cmn putting through a bunch literally moments before he went on vacation. I'd not be surprised to learn it was rushed... the rest of editorial was asleep or at work.

    Every time you see Arthur, myself, takyon, martyb and even the IRC bots MrPlow and exec... we're often submitting these stories because, either in our opinion or in the reality of a near-empty queue, we don't have enough stories to run. Give us submissions to work with so we don't have to resort to internally generated ones. There are stories every day that we miss, even when we're running lots. It's so easy to submit a story, but it's still too much work for most. Sometimes I wonder if people forget how the site works with regard to submissions.

    Arthur came out of necessity; there have been, and continue to be, periods on the site where 5-10 submissions a day of varying quality were all we'd get... it still happens. Arthur can be a crutch, but I don't think it has actually been a detriment to the quality of the site. Even so, we should be paying closer attention to the variety of sources when pushing through stories generated this way. I certainly don't want to give El Reg any more credit or attention than it deserves.

    • (Score: 0) by Anonymous Coward on Friday August 12 2016, @12:56PM (#386987)

      Can you scrape BBC news? That's where I get most stories from for submission to Soylent.

      • (Score: 1, Touché) by Anonymous Coward on Friday August 12 2016, @05:10PM (#387092)

        The Register often has a story before anyone else.

        scrape BBC

        Roy Schestowitz and his band of smart helpers over at TechRights regularly bust BBC for being a blatantly M$-friendly and FOSS-hostile environment.
        ...as well as GCHQ/NSA-friendly.

        IMO, BBC is only useful for tech news if you like your stuff biased toward the closed-source/proprietary sector and only useful for security news if you like that biased in favor of oppressive Imperialist regimes (USA/UK/AU).

        -- OriginalOwner_ [soylentnews.org]

      • (Score: 2) by janrinok (52) Subscriber Badge on Saturday August 13 2016, @06:01PM (#387567) Journal

        We do scrape BBC RSS feeds. For example: https://soylentnews.org/article.pl?sid=16/08/11/135225

        Now, finding a BBC story that is current, unbiased, and accurate can be slightly more difficult.

        Sorry about the formatting on the link - that is something it has only recently started doing, and it is the first time that I have noticed it...

  • (Score: 3, Informative) by janrinok (52) Subscriber Badge on Saturday August 13 2016, @05:45PM (#387560) Journal

    As the writer of 'Arthur', allow me to make some comments:

    When the site first began, we would receive over 30 submissions every day from which we tried to produce a day's output. If the queue went below 20 we would change to a slower release schedule to try to get by until the subs picked up again. Of course, not all stories are suitable for publication, and there are a fair number of dupes which we, as editors, have to filter out. Over time the submission rate fell, and we asked the community to rise to the challenge. The main obstacle, members claimed, was finding suitable stories for submission. Somebody on the team started monitoring the RSS feeds from various well-known sources, including the tech sites, news channels and security groups. These feeds are available to anyone (https://logs.sylnt.us/%23rss-bot/index.html) and provide links to brand-new stories as soon as they are released. They were used for a while, but the community seems not to look at them very often.

    As an editor, it is quite disheartening to discover that the submission queue is filled with more stories of racial inequality, police brutality, shootings or political electioneering which, while important, we have discussed so many times in the past. This is especially so when there are literally hundreds of new stories each weekday to be found on the RSS feeds. So I wrote a bot that downloads each of the stories from the feeds and dumps them, with a little bit of processing and formatting, onto my hard drive. I am trying to produce a fully automated system, but it is not quite there yet. So I go through the processed stories manually, decide which are most suitable for our site, tidy them up (and modify the bot to cope automatically with whatever I find, next time!), and then submit them - just as anyone else can do. They hit the sub queue with no preference or favour, other than that they have already been partly processed and are more likely to be topics of interest to our community. But not all of Arthur's submissions are used - they get rejected just like stories submitted by anyone else (88% accepted at the time of writing).
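
    For the curious, the core of such a bot is small. The following is a simplified sketch rather than Arthur's actual code; the feed URL and output file are illustrative, and real feeds need more care (Atom vs RSS, encodings, de-duplication):

    ```python
    # Simplified feed scraper: fetch an RSS feed, extract title/link
    # pairs, and dump them to a text file for manual review.
    import urllib.request
    import xml.etree.ElementTree as ET
    from pathlib import Path

    FEED_URL = "https://example.com/feed.rss"  # illustrative, not a real feed

    def fetch_items(url: str):
        with urllib.request.urlopen(url, timeout=10) as resp:
            root = ET.fromstring(resp.read())
        # RSS 2.0 keeps stories under channel/item; Atom would use "entry".
        for item in root.iter("item"):
            yield item.findtext("title", ""), item.findtext("link", "")

    def dump(items, outfile: Path) -> None:
        with outfile.open("w", encoding="utf-8") as fh:
            for title, link in items:
                fh.write(f"{title}\n{link}\n\n")

    if __name__ == "__main__":
        dump(fetch_items(FEED_URL), Path("scraped_stories.txt"))
    ```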

    If you look at the RSS feeds for any given day, you will clearly see that The Register is only one of dozens of feeds that we monitor. The fact is, even if you dislike their writing style, they do get interesting stories out fairly quickly.

    If the community doesn't want to see submissions from Arthur T Knackerbracket, the solution is in its own hands: find current stories that can inform the community and generate new and original discussion, and submit them! Avoid topics that we have discussed 'ad nauseam' unless they bring something new to the discussion. The emphasis should be on science and technology, but we can discuss anything that is of interest to the majority of the community.

    • (Score: 0) by Anonymous Coward on Thursday August 18 2016, @12:48PM (#389563)

      Thanks for helping keep the site alive.

      I typically only submit when the queue is drying up, but I've also noticed that it has been happening more often in recent times.