Sean Gallagher reports at Ars Technica that Dr. Andy Ozment, Assistant Secretary for Cybersecurity in the Department of Homeland Security, told members of the House Oversight and Government Reform Committee that encryption would "not have helped" in the recently discovered intrusion that gave attackers access to sensitive data on millions of government employees and contractors, because the attackers had obtained valid user credentials for the systems they attacked, likely through social engineering.
Ozment added that because of the lack of multifactor authentication on these systems, the attackers would have been able to use those credentials at will to access systems from within and potentially even from outside the network. "If the adversary has the credentials of a user on the network, they can access data even if it's encrypted just as the users on the network have to access data," said Ozment. "That did occur in this case. Encryption in this instance would not have protected this data."
The fact that the Social Security numbers of millions of current and former federal employees were not encrypted was one of the few new details to emerge about the data breach, and House Oversight member Stephen Lynch (D-Mass.) was the one who pried the SSN encryption answer out of the panel where others had failed. "This is one of those hearings where I think that I will know less coming out of the hearing than I did when I walked in because of the obfuscation and the dancing around we are all doing here. As a matter of fact, I wish that you were as strenuous and hardworking at keeping information out of the hands of hackers as you are in keeping information out of the hands of Congress and federal employees. It's ironic. You are doing a great job stonewalling us, but hackers, not so much."
See our earlier stories: U.S. Government Employees Hit By Massive Data Breach and Hacking of Federal Security Forms Much Worse than Originally Thought
(Score: 0) by Anonymous Coward on Thursday June 18 2015, @01:38PM
*riiinnngg*
"Hello, I forgot my password"
"That's horrible! Tell me the username and your birthday."
"Keyboard Stapler, January 32nd 1901"
"Ok, your new username is k-s-t-a-p-l-e-r-1-3-2-19-0-1"
Probably something like that. You can't even blame the person most proximal to the cause; they were trying to help. It is not their fault.
(Score: 2) by Immerman on Thursday June 18 2015, @02:47PM
No, it most absolutely IS their fault.
In your scenario they should have required some proof that the person who "forgot their password" actually was who they claimed to be. And the required proof should be at least as reliable as whatever was required to create the account in the first place. Any other policy completely undermines the point of having passwords at all.
Being "helpful" is no excuse for completely undermining the purpose of your job.
(Score: 2) by Dunbal on Thursday June 18 2015, @03:25PM
Not only that but designing a system where someone with "valid credentials" can obtain an unencrypted copy of every single entry in the database without raising any flags is pretty shoddy design. Or do they just hand out root powers to everyone?
(Score: 2) by Immerman on Thursday June 18 2015, @03:48PM
Not being a database engineer, I wonder how you would restrict it. If you have any access to the database at all, then presumably you have a potential need to access any single record within it, which in theory allows access to *every* record. Sure, red flags could be raised if someone is accessing large numbers of records, but actually terminating access would seem to be an invitation to a lot of grief - especially if there is ever any serious data-analysis/aggregation performed.
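Something like this is roughly what I'd picture for the flag-but-don't-block part (a Python sketch against a throwaway SQLite table; the threshold and table layout are invented):

import sqlite3
import time
from collections import defaultdict

HOURLY_FLAG_THRESHOLD = 1000   # invented number: records per account per hour

access_times = defaultdict(list)  # account name -> recent access timestamps

def fetch_record(conn, account, employee_id):
    """Return one personnel record, flagging (not blocking) bulk access."""
    now = time.time()
    recent = access_times[account]
    recent[:] = [t for t in recent if now - t < 3600]  # keep only the last hour
    recent.append(now)
    if len(recent) > HOURLY_FLAG_THRESHOLD:
        print(f"FLAG: {account} pulled {len(recent)} records in the last hour")
    return conn.execute(
        "SELECT name, ssn FROM personnel WHERE employee_id = ?", (employee_id,)
    ).fetchone()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE personnel (employee_id INTEGER, name TEXT, ssn TEXT)")
    conn.execute("INSERT INTO personnel VALUES (1, 'Jane Doe', '000-00-0000')")
    print(fetch_record(conn, "clerk42", 1))

Nobody gets cut off, but a human gets told when the access pattern looks like a dump rather than day-to-day use.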
(Score: 2) by Dunbal on Thursday June 18 2015, @04:58PM
We're talking millions of records. It is simply not possible for someone to access or copy them one record at a time. This is someone who must have had the ability to dump the whole database, unencrypted. That's different from having read access or modify access to individual records. That's a privilege that should pretty much belong only to the guy who makes the backups, and it's pretty silly if any user can do it from anywhere other than in front of the server.
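To sketch the split I mean (assuming a PostgreSQL-style server and the psycopg2 driver; all the role, view, and table names are made up):

import psycopg2

ROLE_SQL = [
    # Ordinary staff read single records through a narrow view, via the app.
    "CREATE ROLE hr_clerk NOLOGIN",
    "GRANT SELECT ON personnel_lookup_view TO hr_clerk",
    # Only the backup role may read the raw table wholesale.
    "CREATE ROLE backup_operator NOLOGIN",
    "GRANT SELECT ON personnel_records TO backup_operator",
    # And nobody else gets near the underlying table.
    "REVOKE ALL ON personnel_records FROM PUBLIC",
]

def apply_roles(dsn: str) -> None:
    """Apply the grants above; must be run by a DBA, not an ordinary account."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        for statement in ROLE_SQL:
            cur.execute(statement)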
(Score: 3, Insightful) by Immerman on Thursday June 18 2015, @06:53PM
SELECT * FROM records. Save results to disk. Done.
(Score: 2) by WillR on Thursday June 18 2015, @07:55PM
Plonk! (Unless you've pwned a DBA's machine, anyway...)
(Score: 2) by Dunbal on Friday June 19 2015, @01:14AM
Yeah, give EVERY user the ability to do this. Sorry, how does this change the point I was making? If anyone can get root access to the database through "social engineering" then there is no security at all. So how many employees have already obtained/sold the list without the agency knowing?
(Score: 5, Insightful) by schad on Thursday June 18 2015, @01:45PM
The answer makes sense on its face, but if you think about it, it all falls apart.
OK, so the access was gained by way of social engineering. Why on Earth would any user have the ability to download the entire database? There is exactly one user whose job responsibility might require that level of access: a sysadmin. And guess what? Sysadmins never need to look at the actual data. So encryption absolutely would help you. Yes, your sysadmin would be able to track down the decryption key on the app server, or wherever it lives; but an attacker wouldn't immediately know where to look for it, and he might not know to look at all. If the key is on another system, then at least you're forcing the hacker to gain access to that other system too. That other system might have different passwords, monitoring, etc., all of which increase the chances of detection or even stopping the intruder outright.
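To make the "key lives on another system" point concrete, here's a rough sketch using Python's requests and cryptography packages; the key-service URL and field names are invented, not a claim about how OPM actually stores anything:

import requests
from cryptography.fernet import Fernet

# Hypothetical key service on a separate, separately monitored host.
KEY_SERVICE_URL = "https://keys.internal.example/personnel-field-key"

def field_key() -> Fernet:
    # The app server fetches the key at runtime; the database host never has it,
    # so a raw dump of the table yields only ciphertext.
    resp = requests.get(KEY_SERVICE_URL, timeout=5)
    resp.raise_for_status()
    return Fernet(resp.content)  # expects a urlsafe base64-encoded 32-byte key

def encrypt_ssn(ssn: str) -> bytes:
    return field_key().encrypt(ssn.encode())

def decrypt_ssn(token: bytes) -> str:
    return field_key().decrypt(token).decode()

An attacker with stolen database credentials now has to find and compromise the key service too, which is exactly the extra hurdle and extra detection opportunity I'm talking about.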
It's all part of the security onion. You don't count on any one safeguard to stop everyone. Your objective is to slow the attackers, which increases the odds that they'll be detected and gives humans a chance to do something. In this case, imagine if the compromised account had only been able to download 1000 employee records per hour. It's very plausible that an audit system would flag 24 straight hours of max-rate downloads and bring it to an admin's attention. The admin would contact the user, who would claim not to know what's going on, and the account would be locked. Bam, problem solved. Yeah, you would've leaked roughly 25k accounts, but that's a far cry from the millions that were actually leaked.
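The audit side could be as dumb as this sketch (Python; the log format, cap, and 24-hour rule are just the numbers from the example above, not a real product):

from collections import Counter
from datetime import datetime

HOURLY_CAP = 1000  # the per-account download cap from the example above

def hours_at_cap(access_times: list[datetime]) -> int:
    """Count how many distinct hours an account spent at the download cap."""
    per_hour = Counter(t.replace(minute=0, second=0, microsecond=0) for t in access_times)
    return sum(1 for n in per_hour.values() if n >= HOURLY_CAP)

def audit(accounts: dict[str, list[datetime]]) -> list[str]:
    # Anyone maxing the cap around the clock gets reported; a real system would
    # page an admin and lock the account rather than just return names.
    return [acct for acct, times in accounts.items() if hours_at_cap(times) >= 24]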
But, as we always end up concluding when stuff like this happens, security is hard. It needs to be baked in from the beginning, not tacked on after the fact. And it needs to be done by actual experts, not people who've taken a 6-week correspondence course and passed some certification exams. Even after all that's done, you need to continue to devote resources to security for as long as the system is in operation. (And probably for a while after: securely disposing of the data that was on the system.) Until everyone truly accepts that this is just the way things are, we're going to keep seeing breaches.
(Score: 3, Interesting) by iamjacksusername on Thursday June 18 2015, @03:55PM
You are right - security is hard. I think the problem is that we are getting the article filtered through the PR department. Unless we are looking at the audit report, it is impossible to say one way or the other. It could mean anything - maybe they compromised the admin's personal equipment, so they could have retrieved the key at any time. Maybe they compromised the transit route before the data would have been encrypted. Who knows? We sure don't, because we will never get to see the unredacted report.
On a totally unrelated yet related note, I get a kick out of the fact that everybody is catching up to security practices that were baked into Novell Bindery Services by the late 80s. Seriously: audit accounts that can only monitor admins, admins who are superusers but cannot monitor the audit accounts, granular role controls. I remember taking my NetWare 3.x classes, and part of the story was that the audit role in the Bindery tree was added because of a CIA requirement that admins be monitored without being able to see what was being monitored or what accounts were monitoring them. I blame the move to AD for the complete breakdown in traditional security roles - AD had only a very limited understanding of inheritance, and good luck trying to do anything with granular controls when MS was marketing it as manageable by any 2-bit reboot jockey.
Sigh. Someone is on my lawn and they should definitely get off of it.
(Score: 2) by c0lo on Thursday June 18 2015, @07:27PM
Sorry, can't do: those who are on your lawn have audit roles
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 2, Insightful) by darthservo on Thursday June 18 2015, @04:34PM
It needs to be baked in from the beginning, not tacked on after the fact.
That's one of the major underlying problems in this case. Unfortunately it's not unique: a system designed decades ago, built from the ground up in an era, and especially in a culture, where security wasn't given proper consideration until after the system was in production. So these kinds of things do need to be tacked on later.
The result is that the process of getting more secure is slowed down significantly by compatibility/usability issues and also, as you addressed, by a lack of adequate experience. Because (and this is another underlying problem) which looks better from the perspective of, unfortunately, many higher-ups: 'Our systems are running and the agency can function efficiently', or 'We [ran|are going to run] into major problems while upgrading which [caused|will cause] significant downtime'? As is common in many other industries, the favor is quite typically given to the short-term focus.
A comment from Ars [arstechnica.com] also appropriately addressed the problem:
Congress: "it's all your fault for not replacing those archaic and insecure computer systems with the funding we refuse to give you!"
"Good judgment seeks balance and progress. Lack of it eventually finds imbalance and frustration." - Dwight D Eisenhower
(Score: 1) by unzombied on Thursday June 18 2015, @08:53PM
Certainly, considering the overwhelming dollars given to DHS for secret and non-secret activities, a shortage of funds is not the problem. Rather, the billions spent on US "National Security" are not going to the nation's security.
(Score: 2) by DeathMonkey on Thursday June 18 2015, @05:19PM
OK, so the access was gained by way of social engineering. Why on Earth would any user have the ability to download the entire database? There is exactly one user whose job responsibility might require that level of access...
They must have been valid credentials to the NSA's backdoor.
(Score: 1, Interesting) by Anonymous Coward on Thursday June 18 2015, @02:02PM
Who dunnit, or even how? All they know is their pants are down and their arse is hurting.
It's always a good guess to blame social engineering, like asking "But does it scale?" Yet it's not like there aren't defences, mostly simple common-sense stuff; after all, social engineering is much older than computers. https://en.wikipedia.org/wiki/Social_engineering_%28security%29#Countermeasures [wikipedia.org]
This does not increase my confidence in the competence of the DHS...
(Score: 2) by c0lo on Thursday June 18 2015, @07:33PM
No, they do know!
'twas those pesky Chinese state-sponsored hackers, they called from Shanghai and did some social engineering. 't worked; none of them had a British accent, nothing for us to suspect.
https://www.youtube.com/watch?v=aoFiw2jMy-0
(Score: 5, Insightful) by MrGuy on Thursday June 18 2015, @02:25PM
This is both technically true and illustrative of having no idea what you're talking about.
It's true that encryption will not protect data from its intended users using the data in its intended fashion.
But lemme get this straight. You have a single login that is SUPPOSED to be able to access ALL the data? And can do so directly, in a way where they can DOWNLOAD the whole lot and walk away with it? That's normal use?
If this data is so sensitive, why isn't it access controlled to "need to know" folks and compartmentalized with "least access" principles? And why do people have direct access to the raw data (as opposed to through a search or reporting tool)?
Encryption WOULD ABSOLUTELY have helped by protecting the vast majority of the data from a single bad actor if they had other reasonable and industry standard controls in place to compartmentalize data and restrict access to specific known channels that prevent a mass compromise.
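As a sketch of what "need to know" through a controlled channel might look like (Python, all names and data invented; not a description of any actual OPM system):

import logging

logging.basicConfig(level=logging.INFO)
audit_log = logging.getLogger("personnel.audit")

# Which records each account is cleared to see ("need to know"); invented data.
ACCESS_LIST = {
    "clerk42": {"E1001", "E1002"},
    "investigator7": {"E2001"},
}

PERSONNEL = {
    "E1001": {"name": "Jane Doe", "clearance": "Secret"},
    "E1002": {"name": "John Roe", "clearance": "None"},
    "E2001": {"name": "Jan Roe", "clearance": "Top Secret"},
}

def lookup(account: str, employee_id: str) -> dict:
    """Return a single record through the controlled channel, and log it."""
    audit_log.info("lookup account=%s record=%s", account, employee_id)
    if employee_id not in ACCESS_LIST.get(account, set()):
        raise PermissionError(f"{account} is not cleared for record {employee_id}")
    return PERSONNEL[employee_id]

One compromised clerk account then exposes the handful of records that account was cleared for, not the whole database.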
(Score: 2) by DeathMonkey on Thursday June 18 2015, @05:21PM
This is both technically true and illustrative of having no idea what you're talking about.
AKA, their core competency.
(Score: 0) by Anonymous Coward on Thursday June 18 2015, @02:50PM
And for all of these employees who have their data thrown out on the street, what do they get? "Here, have 18 months' worth of completely worthless 'credit monitoring'." As if that is going to do anything.
Since this is the federal government, I'm surprised that they can't give any one of these people a new identity thus making the compromised information worthless.
(Score: 0) by Anonymous Coward on Thursday June 18 2015, @10:30PM
What they will get is more computer security training and PII training instead of any meaningful measure to improve security. Been there and done that.
(Score: 2) by iamjacksusername on Thursday June 18 2015, @03:42PM
It is long past the time that we started treating security breaches as the rule rather than the exception. I found this analogy helpful (sorry, it is not in the form of a car analogy). Imagine that you and all your friends have $100 in a box in your respective houses. Now, a burglar walking down the street knows you have $100 in a box in your house. However, you all took some precautions like making sure doors are locked. The burglar looks at your houses and moves on because every house probably has $100 in it and he wants the easy score. Now, imagine you and all your friends have $10B in cash. Generally, you are not going to keep that in your house. Maybe you put it in a bank or some other secured location. Now imagine that a team of burglars know that there are hundreds of secured locations all with $10B in cash just waiting. There could be hundreds of guards with hundreds of cameras, dogs and thick concrete walls protecting each location. The problem is that the risk / reward ratio just went through the roof and, with a $10B jackpot, someone, somewhere will be willing to finance a try. It is the old weapons vs armour debate. Eventually, the weapon wins.
So, what is the solution? Keep lowering the reward. Do not keep large amounts of data in one location. There should not be a single location that, if breached, becomes a single point of failure. In the case of the government, there should not have ever been a single, master database with all the personnel records of everybody. The government is a huge target already - putting everything in one place just makes the problem worse. A breach WILL occur. There should be much more focus on mitigation after the fact. Start from assuming that the database will be breached and information leaked. Then, decide how one would mitigate that damage.
This applies to the private sector just as much. I am in favour of statutory damages for personal information breaches. That is, if your personal information is leaked by a company, you are entitled to statutory damages from the company. This would solve a lot of privacy and data security problems as it creates a potential financial liability from simply possessing the information. Right now, every company mines personal data because possessing the data is "free" to them; data breaches are an externalized cost. So, they mine the data and target behaviors. If we created a cost to holding onto that data, companies would think long and hard about data mining and personal data retention. In my mind, it would curb much of the intrusive, behavior-shaping marketing practice as well as contribute to personal data security, since fewer companies would hold onto the data.
(Score: 0) by Anonymous Coward on Thursday June 18 2015, @05:16PM
in a galaxy far far away ...
prolly just crazy paranoid, but maybe this government-personnel department didn't want to "hand over" responsibility to the NSA hackers because maybe they didn't have time to play "war" but the NSA desperately needed all that information to ferret out any peace-loving patriots because the long-term war .. err ... roadmap of the usa oligarchs requires servitude without doubt?
well anyways, if, just "if" the NSA wanted dominion over federal government employee data they sure as hell got it now.
also this computer security and privacy erosion hang-over attributed to possibly "behind-the-scene" string pulling of the top-of-the-class IQ.NSA has some really funny ramifications, like attacking the government-employee office for trying to keep stuff "in-house" and doing everything "wrong" : )
:messah, propose to give emergency power to the chancellor ...