posted by Fnord666 on Saturday November 16 2019, @11:38PM   Printer-friendly
from the poker-analogies dept.

Arthur T Knackerbracket has found the following story:

Common behaviors shared across all families of ransomware are helping security vendors better spot and isolate attacks.

This according to a report from British security shop Sophos, whose breakdown (PDF) of 11 different malware infections, including WannaCry, Ryuk, and GandCrab, found that because ransomware attacks all share the same goal, encrypting user files until a payment is made, they generally have to perform many of the same tasks.

"There are behavioral traits that ransomware routinely exhibits that security software can use to decide whether the program is malicious," explained Sophos director of engineering Mark Loman.

"Some traits – such as the successive encryption of documents – are hard for attackers to change, but others may be more malleable. Mixing it up, behaviorally speaking, can help ransomware to confuse some anti-ransomware protection."
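One such hard-to-change trait, the successive encryption of documents, is detectable because encrypted output is statistically close to random. Below is a minimal sketch of the kind of entropy heuristic an anti-ransomware tool might apply; the function names and the 7.0 bits-per-byte threshold are illustrative assumptions, not Sophos's actual method.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte; encrypted or compressed data approaches 8.0."""
    if not data:
        return 0.0
    counts = Counter(data)
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts.values())

def looks_encrypted(before: bytes, after: bytes, threshold: float = 7.0) -> bool:
    """Flag a write that turns readable content into near-random bytes."""
    return shannon_entropy(after) > threshold and shannon_entropy(before) < threshold

plaintext = b"The quick brown fox jumps over the lazy dog. " * 50
ciphertext = bytes(range(256)) * 16  # uniform bytes as a stand-in for encrypted output
print(looks_encrypted(plaintext, ciphertext))  # True
```

A real product would combine a signal like this with the other behavioral traits the article mentions, since compressed archives are also high-entropy and would trip a naive threshold on their own.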

Some of that behavior, says Loman, includes signing code with stolen or purchased certificates so the ransomware can slip past some security checks. In other cases, ransomware installers will use elevation-of-privilege exploits (which often get overlooked for patching due to their low risk scores) or optimize code for multi-threaded CPUs in order to encrypt as many files as possible before getting spotted.


  • (Score: 2) by edIII on Sunday November 17 2019, @12:44AM (1 child)

    by edIII (791) on Sunday November 17 2019, @12:44AM (#921132)

    The journaling file system helps mitigate file corruption, which is what ransomware really is. At this point I wouldn't consider using a file system that didn't perform journaling.

    Good point about backups, though. They're definitely a major part of it, but it's not just frequency: retention, i.e. when you start to cycle backup media, is very important too. Frequent snapshots of important data let you effectively play back the changes, but like you alluded to, they should be airgapped.

    Now that I think about it, protecting file systems is a little more difficult than protecting databases. With the latter it's a lot easier to stream transactions to a backup that is regularly copied off site. Ransomware isn't even the reason database backup policies mitigate corruption so heavily. I've seen situations in which corruption crept into large databases a little each day, and nobody noticed for 6 months. The only way I was able to recover anything was from backup copies taken before and during the corruption.
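    That transaction-streaming idea can be sketched as an append-only off-site log; everything here (class name, JSON-lines format) is a hypothetical illustration, not any real DBMS replication API. The point is that recovery can replay history up to the moment corruption began.

```python
import json
import time

class TransactionShipper:
    """Toy sketch: append each committed transaction to an off-site log."""

    def __init__(self, log_path: str):
        self.log_path = log_path

    def commit(self, txn: dict) -> None:
        record = {"ts": time.time(), "txn": txn}
        # Append-only: corruption (or ransomware) in the live database
        # cannot rewrite history that has already been shipped.
        with open(self.log_path, "a") as f:
            f.write(json.dumps(record) + "\n")

    def replay(self, until_ts: float):
        """Yield transactions committed at or before until_ts, the
        point-in-time just before corruption was detected."""
        with open(self.log_path) as f:
            for line in f:
                rec = json.loads(line)
                if rec["ts"] <= until_ts:
                    yield rec["txn"]
```

    Real databases do this with write-ahead-log shipping and point-in-time recovery, but the principle is the same: the log lives somewhere the primary can't overwrite.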

    Fundamentally, I think that's what will protect us against ransomware: keeping many different versions of the data across time. That approach has plenty of benefits beyond ransomware protection, too.
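    For file data, the versions-across-time idea boils down to timestamped snapshots with a retention policy. Here is a hypothetical minimal sketch using full copies; real tools (filesystem snapshots, deduplicating backup software) do this far more efficiently.

```python
import shutil
import time
from pathlib import Path

def snapshot(src: Path, backup_root: Path, keep: int = 5) -> Path:
    """Copy src into a new timestamped version directory, then prune the
    oldest versions beyond `keep`. Each retained directory is a full,
    independent copy, so one encrypted/corrupted version can't poison
    the others."""
    # Zero-padded nanosecond stamp so lexical sort equals time order.
    dest = backup_root / f"{time.time_ns():020d}"
    dest.mkdir(parents=True)
    shutil.copy2(src, dest / src.name)
    versions = sorted(p for p in backup_root.iterdir() if p.is_dir())
    for old in versions[:-keep]:
        shutil.rmtree(old)
    return dest
```

    Note the pruning is exactly the "when you start to cycle backup media" question: ransomware that encrypts quietly for longer than your retention window defeats this, which is why long-duration, airgapped copies matter too.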

    --
    Technically, lunchtime is at any moment. It's just a wave function.
  • (Score: 1, Informative) by Anonymous Coward on Sunday November 17 2019, @07:18AM

    by Anonymous Coward on Sunday November 17 2019, @07:18AM (#921203)

    No, journaling file systems do nothing to prevent FILE corruption. They are designed to prevent FILE SYSTEM corruption. Specifically, they are designed to keep the different metadata structures of the disk in a consistent state. Ordered journals with barriers have the additional design benefit of keeping the metadata in a consistent state with the actual file content. Full journals have the additional design benefit of preventing some data writes from getting lost (which might incidentally cause corruption if interrupted) on replay of the journal. But do note that only full journaling prevents such incidental file corruption on overwrites.

    None of these do anything to prevent your data from being changed in place by raw writes, cosmic rays, or whatever. Nor do they let you recover from errors in the writes. And they definitely don't prevent damage from otherwise acceptable commands issued by your software, especially commands that complete successfully, which is exactly what ransomware's encryption looks like to the file system.
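    What the journal guarantees at the metadata level, applications have to provide for themselves at the file level. The standard pattern for surviving a crash mid-write is: write to a temp file in the same directory, fsync, then atomically rename over the target. A hedged sketch (this protects against torn writes, not against ransomware, which issues "acceptable commands" as described above):

```python
import os
import tempfile
from pathlib import Path

def atomic_write(path: Path, data: bytes) -> None:
    """Write data so readers see either the complete old content or the
    complete new content, never a partially written file."""
    fd, tmp = tempfile.mkstemp(dir=path.parent)
    try:
        with os.fdopen(fd, "wb") as f:
            f.write(data)
            f.flush()
            os.fsync(f.fileno())  # force data to disk before the rename
        os.replace(tmp, path)     # atomic on POSIX and modern Windows
    except BaseException:
        if os.path.exists(tmp):
            os.unlink(tmp)        # clean up the temp file on failure
        raise
```

    The temp file must live on the same filesystem as the target, since `os.replace` cannot atomically rename across filesystem boundaries.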