posted by janrinok on Thursday March 16, @11:51PM   Printer-friendly

The US government looks poised to force tech companies to do more about security:

The US government, worried about the continuing growth of cybercrime, ransomware, and countries including Russia, Iran, and North Korea hacking into government and private networks, is in the middle of drastically changing its cybersecurity strategy. No longer will it rely largely on prodding businesses and tech companies to voluntarily take basic security measures such as patching vulnerable systems to keep them updated.

Instead, it now wants to establish baseline security requirements for businesses and tech companies and to fine those that don't comply.

It's not just the companies that use these systems that might eventually need to abide by the regulations. The companies that make and sell them, such as Microsoft and Apple, could be held accountable as well. Early indications are that the feds already have Microsoft in their crosshairs — they've warned the company that, at the moment, it doesn't appear to be up to the task.

[...] In theory, if those standards aren't met, fines would eventually be imposed. Glenn S. Gerstell, former general counsel of the National Security Agency, explained it this way to the Times: "In the cyberworld, we're finally saying that Ford is responsible for Pintos that burst into flames, because they didn't spend money on safety." That's a reference to the Ford Pinto frequently bursting into flames when rear-ended in the 1970s. That led to a spate of lawsuits and a ramp-up in federal auto safety regulations.

But cybersecurity requirements backed by fines aren't here yet. Dig into the new document and you'll find that because the new strategy is only a policy document, it doesn't have the bite of law behind it. For it to go fully into effect, two things need to happen. President Biden has to issue an executive order to enforce some of the requirements. And Congress needs to pass laws for the rest.

It's not clear when lawmakers might get around to moving on the issue, if ever, although Biden could issue an executive order for parts of it.

[...] So, what does all this have to do with Microsoft? Plenty. The feds have made clear they believe Microsoft has a long way to go before it meets basic cybersecurity recommendations. At least one top government security official has already publicly called out Microsoft for poor security practices.

Cybersecurity and Infrastructure Security Agency Director Jen Easterly recently criticized Microsoft during a speech at Carnegie Mellon University. She said that only about one-quarter of Microsoft enterprise customers use multifactor authentication, a number she called "disappointing." That might not sound like much of a condemnation, but remember, this is the federal government we're talking about. It parses its words very carefully. "Disappointing" to them is the equivalent of "terrible job" anywhere else.

[...] Even without laws and executive orders, the company could be in trouble. The US government spends billions of dollars on Microsoft systems and services every year, a revenue stream that could be endangered if Microsoft doesn't adhere to the standards.


Original Submission

  • (Score: 2) by canopic jug on Friday March 17, @03:27PM (1 child)

    by canopic jug (3949) Subscriber Badge on Friday March 17, @03:27PM (#1296698) Journal

    Still, should algorithms / code blobs be copyrightable? I would say, No.

    Algorithms are different from their output. The US Copyright Office has decided that the output of AI algorithms is not eligible for copyright [theverge.com], so it wouldn't be much of a stretch to see the same view taken on code. However, the point being made is that the algorithms don't actually produce code but merely recombine and even outright plagiarize existing code.

    --
    Money is not free speech. Elections should not be auctions.
  • (Score: 2) by Freeman on Friday March 17, @05:05PM

    by Freeman (732) Subscriber Badge on Friday March 17, @05:05PM (#1296709) Journal

    Still, I think "plagiarizing" code shouldn't be a thing. Now, if you're actually trying to make X site look like Y site, sure, that should be covered by copyright/trademark or other laws. But if you're using similar snippets of code, or even snippets that are exactly the same, to design a totally different site — one that isn't designed to look like the other site — then those snippets shouldn't be a problem. You want security in software? Don't patent/copyright any code. An entire work, sure; large sections of an entire work, possibly; but not styles/snippets/best-practice kinds of things. There's already precedent that AI output isn't copyrightable. Maybe the future is no copyrighted code. One can dream.

    --
    Joshua 1:9 "Be strong and of a good courage; be not afraid, neither be thou dismayed: for the Lord thy God is with thee"