NYT Link: https://www.nytimes.com/2023/05/23/opinion/cybersecurity-hacking.html
Archive Link: https://archive.is/wMAXA
In the movies, you can tell the best hackers by how they type. The faster they punch the keys, the more dangerous they are. Hacking is portrayed as a highly technical feat, a quintessentially technological phenomenon.
This impression of high-tech wizardry pervades not just our popular culture but also our real-world attempts to combat cybercrime. If cybercrime is a sophisticated high-tech feat, we assume, the solution must be too. Cybersecurity companies hype proprietary tools like "next generation" firewalls, anti-malware software and intrusion-detection systems. Policy experts like John Ratcliffe, a former director of national intelligence, urge us to invest public resources in a hugely expensive "cyber Manhattan Project" that will supercharge our digital capabilities.
But this whole concept is misguided. The principles of computer science dictate that there are hard, inherent limits to how much technology can help. Yes, it can make hacking harder, but it cannot possibly, even in theory, stop it. What's more, the history of hacking shows that the vulnerabilities hackers exploit are as often human as technical — not only the cognitive quirks discovered by behavioral economists but also old-fashioned vices like greed and sloth.
To be sure, you should enable two-factor authentication and install those software updates that you've been putting off. But many of the threats we face are rooted in the nature of human and group behavior. The solutions will need to be social too — job creation programs, software liability reform, cryptocurrency regulation and the like.
For the past four years, I have taught a cybersecurity class at Yale Law School in which I show my students how to break into computers. Having grown up with a user-friendly web, my students generally have no real idea how the internet or computers work. They are surprised to find how easily they learn to hack and how much they enjoy it. (I do, too, and I didn't hack a computer until I was 52.) By the end of the semester, they are cracking passwords, cloning websites and crashing servers.
Why do I teach idealistic young people how to lead a life of cybercrime? Many of my students will pursue careers in government or with law firms whose clients include major technology companies. I want these budding lawyers to understand their clients' issues. But my larger aim is to put technical expertise in its place: I want my students to realize that technology alone is not enough to solve the problems we face.
I start my class by explaining the fundamental principle of modern computing: the distinction between code and data. Code is a set of instructions: "add," "print my résumé," "shut the door." Data is information. Data is usually represented by numbers (the temperature is 80 degrees), code by words ("add"). But in 1936, the British mathematician Alan Turing figured out that code could be represented by numbers as well. Indeed, Turing was able to show how to represent both code and data using only ones and zeros — so-called binary strings.
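Turing's insight is easy to demonstrate. The snippet below (a minimal illustration, not from the article) takes the column's own examples, the instruction "add" and the temperature 80, and reduces both to the same currency of ones and zeros:

```python
# Code and data both reduce to binary strings, as Turing showed.
# "add" is code (an instruction); 80 is data (a temperature reading).

code = "add"
data = 80

# Each ASCII character of the instruction becomes eight bits.
code_bits = " ".join(f"{byte:08b}" for byte in code.encode("ascii"))
# The number becomes eight bits directly.
data_bits = f"{data:08b}"

print(code_bits)  # 01100001 01100100 01100100
print(data_bits)  # 01010000
```

Looking only at the bits, nothing distinguishes the instruction from the measurement; the computer knows which is which only by how it is told to interpret them.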
This groundbreaking insight makes modern computers possible. We don't need to rebuild our computers for every new program. We can feed our devices whatever code we like as binary strings, and they will run it. That zeros and ones can represent both code and data is, however, a blessing and a curse, because it enables hackers to trick computers that are expecting data into accepting and running malicious code instead.
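That trick, slipping code into a slot meant for data, can be sketched in a few lines. The toy `fragile_calculator` below is a hypothetical example of my own, not code from the article or the class, but it shows the pattern behind a whole family of injection attacks:

```python
# A toy "calculator" that expects data (an arithmetic expression)
# but naively evaluates whatever string it receives.

def fragile_calculator(user_input: str):
    # The programmer imagined inputs like "2 + 2" -- pure data.
    return eval(user_input)  # DANGER: eval runs arbitrary Python code

# Intended use: harmless data in, number out.
print(fragile_calculator("2 + 2"))  # 4

# An attacker supplies code disguised as data. Here it merely reads
# the working directory, but it could just as easily delete files.
malicious = "__import__('os').getcwd()"
print(fragile_calculator(malicious))  # runs attacker-chosen code
```

SQL injection, cross-site scripting, and buffer-overflow exploits all follow this same logic: the machine cannot tell, from the bits alone, that the "data" it was handed is really instructions.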
[...] Diversion programs in Britain and the Netherlands run hacking competitions where teams of coders compete to hack a target network; these programs also seek to pair coders with experienced security professionals who act as mentors and steer their charges toward the legitimate cybersecurity industry. At the moment, with an estimated 3.5 million jobs unfilled worldwide, one fewer attacker is one more desperately needed defender.
(Score: 3, Funny) by acid andy on Thursday May 25 2023, @07:49PM
When I was an overconfident teenager with a somewhat inflated ego, a colleague about my age introduced me to a Linux login prompt for the first time. I thought I'd show off how fast I could type in my login, not realizing this system had a delay between accepting the username and displaying the password prompt. Pretty embarrassing to see my password spewed out in the clear onto the screen as he watched! I learned a lesson that day!
Consumerism is poison.