
Will we ever catch up?

Working in cybersecurity sometimes feels like a race that can never be won. Programming and computer engineering began long before we realized the implications of what we were creating, and early decisions about how we architected computing and how the internet functions have come back to bite us in the butt in a big way. In general, the cybersecurity efforts we have applied to our technology stacks are still at least ten years behind the curve. So it raises the question: will cybersecurity ever catch up with the technology curve? There are four major areas to consider: technology, public policy, education, and hiring.


I am an optimist, and I say that we can, but it will take a long time. First, a quick history lesson. We could argue that computer programming has existed for a very long time in many forms; analog punch cards gradually gave way to digital ones and zeros. When computer programming first started is a wider historical discussion, but I’ll stake my claim that modern programming started in 1954 with the introduction of Fortran from IBM. Seventeen years later, in 1971, we have our first historical evidence of a computer virus: ‘Creeper’, the first known self-replicating computer virus, or ‘worm’. You can read more about it in my previous post.


Now to do some math: that puts us at a seventeen-year separation between modern programming and the first known instance of malware. That is consistent with the gap between the invention of the C programming language and the Morris Worm (1972-1988: sixteen years). Throughout both the ’70s and the ’80s, malware was not common knowledge. The majority of development work was still done in large companies and universities by supposedly big-brained individuals. The very idea of a program being written to do harm to a system wasn’t even part of the computing zeitgeist yet.


Then comes the internet and, boy-oh-boy, here we go. Web pages become a thing, and at first it’s just a simple exchange of HTML documents over a network connection; the HTTP protocol supports this in its most basic form. Then we begin to see smarter web-based applications with JavaScript, SQL databases for persistence, and authentication. Eventually we have the wonderful idea of storing and accessing private data this way. In 1998 and 2000 respectively, the internet euphoria comes crumbling down with a one-two punch: SQL Injection and Cross-Site Scripting.


Around this time, in the late eighties and nineties, hacking and cybersecurity got their first real mainstream exposure: computer viruses, phone ‘phreaking’, movies about hackers, buffer overflows and smashing the stack, and many others. Anti-virus software was created and sold, Microsoft was held to task over the insecurity of Windows, and some of the first information security groups appeared.


It’s the unfortunate truth that bleeding-edge technology will always be ahead of the curve on the cybersecurity scale. Flawed mentalities like ‘Move fast and break things’ from -- *shudder* -- Facebook (which they stopped using in 2014 because it is stupid) and other new development processes made the problem worse before it got better. Breaches became more commonplace than in the previous decade, and the impact of those breaches increased dramatically.


But it’s not all bleak, because we can say that InfoSec has caught up somewhat. Let’s look at some web app examples: SQL Injection was discovered in 1998, and prepared statements only became a known-good pattern in most programming languages around 2011, a thirteen-year gap. Cross-Site Scripting was discovered in 2000, and known-good frameworks like React, which have XSS protection built in, were released in 2013. So as of around 2012, we were at a thirteen-ish year delay as opposed to a sixteen-ish year delay.
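
To make that first fix concrete, here’s a minimal sketch of the difference between concatenating user input into a query and using a prepared (parameterized) statement. It’s written in Python with the standard-library sqlite3 module, and the table, column, and input values are hypothetical, purely for illustration:

```python
import sqlite3

# Hypothetical users table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0), ('bob', 1)")

user_input = "alice' OR '1'='1"  # attacker-controlled value

# Vulnerable: the input is pasted straight into the SQL text, so the quote
# characters change the meaning of the query and it matches every row.
vulnerable = conn.execute(
    "SELECT name FROM users WHERE name = '" + user_input + "'"
).fetchall()
print(vulnerable)  # [('alice',), ('bob',)]

# Known-good pattern: a prepared/parameterized statement keeps the input
# as data, never as SQL syntax.
safe = conn.execute(
    "SELECT name FROM users WHERE name = ?", (user_input,)
).fetchall()
print(safe)  # [] -- no user literally has that name
```

Frameworks like React apply the same idea to XSS: user-supplied values are treated as data and escaped by default rather than interpreted as markup.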


So in the last sixty-ish years, cybersecurity has caught up by about four to five years. Extrapolating from that, if we say we are currently around eleven to twelve years behind and it takes about ten years to catch up by one year, it will take roughly another 110-120 years of effort before cybersecurity can catch up (a quick back-of-the-envelope version of that math is sketched below). That assumes a perfect world, which we do not live in, and we cannot look at this linearly; a catastrophic hacking event, like one taking out utilities, could accelerate these advancements. It also assumes we do nothing about the other three areas of improvement.
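
Here’s that back-of-the-envelope math spelled out; the gap and catch-up rate are just the rough estimates from above, not measured data:

```python
# Rough, linear extrapolation -- the inputs are this post's estimates, not data.
current_gap_years = (11, 12)      # how far behind cybersecurity is today
years_to_close_one_year = 10      # ~10 calendar years to close 1 year of gap

low, high = (gap * years_to_close_one_year for gap in current_gap_years)
print(f"Roughly {low}-{high} more years to close the gap")  # -> 110-120 years
```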


Unfortunately, that brings us to the next catch-up point: public policy. The US is woefully behind Europe in the tech regulation game, and that is just on general privacy concerns. Regulating where data can be sent and who can read it is a good thing, but it does not stop hackers from attacking and succeeding in breaches. Every time that happens, bills are introduced to strengthen the nation’s cybersecurity posture, but they rarely see the light of day. Many states have introduced their own GDPR-style regulation, but that doesn’t get to the heart of the problem.


What I’m afraid to tell you is this: capitalism is antithetical to cybersecurity policy. Every time cybersecurity comes up in these discussions, it’s seen as a short-term money sink and a loss of profits. So the powers that be will lobby against these types of regulations ten times out of ten, and because the US congressional system is hopelessly corrupt, nothing will improve. Never mind that good cybersecurity practices, sustained over a long period of time, build trust in an organization, and trust is rewarded with more customers. Imagine what would have happened if, twenty-plus years ago, Microsoft hadn’t spent the time and effort securing Windows. Would you be running Windows 10 or 11 right now if it had the same security holes that Windows 98 had? Doubtful.


Education is the next key catch-up point. Collectively improving cybersecurity education and security awareness among the people creating applications will improve our overall security posture. Computer Science and IT courses need to bake cybersecurity into the curriculum; otherwise we are doomed to keep repeating the same mistakes.


Some colleges, like Carnegie Mellon, Stanford, Berkeley, and MIT, have introduced cybersecurity courses with some success, but we are still miles away from where we should be. According to a 2019 article from CyberCrime magazine, only three percent of bachelor’s graduates have cybersecurity-related skills. The growing skills gap and the number of unfilled jobs in the cybersecurity field indicate that new people coming into the field are expected to already have years of experience and be able to pass demanding, high-stakes, non-representative interviews.


Oh, the hiring process -- that is a complete travesty I’m pretty ashamed to have participated in for years. I’ve been doing interviews for Application Security for a decade, and only now do I feel like I’m interviewing people appropriately. After having gone through multiple rounds of on-the-spot coding tests, I can safely say that they are bullshit. I’ve been stepping away from interview “challenges,” especially for people who already have a few years under their belt. If you’re in a hiring position, I would encourage you to have more open discussions with interviewees, particularly about the experience and projects listed on their resumes. If someone is interviewing for a senior or principal position specifically, it’s a complete waste of time to ask if they know what Cross-Site Scripting is.


There is a huge gap in filling these necessary cybersecurity roles; however, the opportunities for people entering the market are still pretty slim. We are so afraid of messing up cybersecurity in an organization that we are reluctant to hire people with only beginner-level experience. It creates haves and have-nots among those who work in cybersecurity: a steady pool of people with years or decades of experience who are already working, and a pool of entry-level and junior candidates who cannot find work.


Cybersecurity needs to be more courageous about hiring people who are inexperienced, because the need for entry-level and junior hires isn’t going away. Additionally, even if we solve the education problem, most real cybersecurity experience is gained on the job; let’s give people the opportunity to get that on-the-job experience before the hiring gap gets much worse.


Cybersecurity has a lot of catching up to do with development. Security, as in all related fields, will always lag behind; it’s just a question of how much. It also depends on which verticals we are talking about: information security in the consumer goods/appliance market is much further behind than it is in banking/financial. Pure cybersecurity technology innovations will only take us so far. More cybersecurity education and better hiring practices will continue to bridge that gap, and Congress can issue new regulations to enforce these standards (not hopeful on that last one, unfortunately).


About the Author

Aaron is an Application Security Engineer with over 10 years of experience. His unorthodox career path has led to many unique insights in the security industry.


