By: Charissa Lee
The shortfall of talent to fill gaps in the cybersecurity job market is a popular topic of discussion. Some argue that employers are simply not looking for talent in the right places; others are much more alarmed about the situation. Wherever one lands on that spectrum, the problem remains that jobs are not being filled to the extent needed to manage today's cybersecurity threats. Martin Libicki and others at the RAND Corporation write, “A shortage exists, it is worst for the federal government, and it potentially undermines the nation’s cybersecurity.” While the federal government has difficulty competing with the private sector for cybersecurity professionals, given its security clearance and compensation constraints, the private sector, in particular the financial industry, has also raised the alarm in the wake of serious breaches that carry reputational as well as pecuniary costs. One explanation for this apparent shortfall of talent is millennials’ lack of participation in the cybersecurity job market.
The Millennial Generation
What is a millennial? Pew Research defines this demographic as “those born after 1980 and the first generation to come of age in the new millennium.” This usually refers to anyone who is between 18 and 34 years old today. Pew projects that the millennial generation—having already become the largest share of the U.S. workforce—will also become the nation’s largest living generation in 2015. That is to say, there will be no shortage of warm bodies for the next 20 years to take up jobs, thereby lowering unemployment rates, ensuring steady economic growth, and adding to social security. Yet, despite Pew’s findings, millennials do not seem to be participating as much in the cybersecurity job market as expected or desired (at least formally and legally). Why? Here are some thoughts.
First, the millennial generation did not fully comprehend, or perhaps overlooked, the security implications of the Internet's interconnectedness. The Web has only been worldwide since the mid-1990s; before that, it was the preserve of the military and a select segment of researchers. The generation that has grown up in the past 20 years, the millennial generation, can be divided into late and early millennials, aged 18 to 26 and 26 to 34 in 2015 respectively, based on the wireless technology available when they entered their teenage years. Those years of tumult, rebellion, and confiding in friends rather than family required secure, separate, untraceable communications (or so we thought) that the authorities (Mum and Dad) couldn't track or listen in on.
Late millennials, who have grown up in a time when the Internet is commonplace and widely accessible throughout the developed world, are so at ease with online technology and information sharing that they tend to be less circumspect about security, a 2013 survey suggests. Early millennials and the Internet grew up together. This demographic remembers playing Contra off a floppy disk and waiting for the distinct ping-ping-ping-bedang-bedang-shrrrrrr that indicated they were FINALLY connected to the Internet. This group, coming of age as the World Wide Web was gaining traction and fascinated by the novelty of being connected to people halfway around the world, was probably less interested in the security and wider policy implications of connectedness. That would and should have been the job of the generation that had laid the foundations of the Internet.
This brings us to a second explanation: The millennial generation was not adequately primed to become the type of human capital required for today’s cybersecurity jobs. The demand for talent is most intense for what a June 2014 RAND Corporation report refers to as the “upper-tier cybersecurity professionals.” These include the skilled network architects, programming specialists, crisis management and threat analysts, who as a team must shore up cyber defenses. There is also an urgent demand for management professionals who are not only practitioners, but who also adopt a multidisciplinary approach to cybersecurity, can translate tech talk into strategic speak, and can assemble a well-rounded team across verticals and disciplines to address cybersecurity threats throughout the operational spectrum. A cyber Nick Fury, perhaps? If these sound like tall requirements, they are.
Educating information security professionals to the level required to be entrusted with national or corporate security takes time, factoring in formal education, sufficient hands-on training, management experience, and advanced degrees, starting from the college level. All told, the process could take anywhere from six to 10 years to reach an entry-level position, and an additional five to 15 years if a Ph.D. or industry experience is needed. This means that the average entry-level applicant for cybersecurity jobs in 2015 will be between 26 and 30 years old, falling into the early millennial group.
Going by this rough estimate, the time to have seeded cybersecurity talent was at least six to 10 years ago, in order to ensure a sufficient pool of skilled professionals to fill the gap as pioneering and second-generation professionals move up the ladder. The sweet spot would have been 2005 to 2008, meaning that policy frameworks should have been put in place as early as 2003 to promote cybersecurity awareness and incentivize cybersecurity education. The earliest piece of legislation to address these issues is the Cybersecurity Research and Education Act of 2002, which identifies the need for cybersecurity researchers and faculty. Did it take? Evidently, not as effectively as was hoped. A survey by Raytheon and the National Cyber Security Alliance found that millennials did not have a high awareness of the cybersecurity profession, the types of work involved, the opportunities available, or even the possibility of putting their interest in cyber-related skills to use in the cybersecurity job market.
Indeed, what are the skills and jobs in the cybersecurity job market that make qualified talent so difficult to hire? A Burning Glass report summarizes the challenge as lying not just in the advanced technical skills required, but also in the hybrid nature of the job descriptions. At the entry level, cybersecurity professionals are expected to be technically qualified and at the same time possess public policy savvy. As was noted at a multidisciplinary cybersecurity workshop at SIPA this past summer, cyber policy programs have for the most part been taught separately from computer science, and seldom have the two met. The result is graduates who are qualified in one or the other. The advantage arguably belongs to the computer science graduate, since entry-level jobs prefer at least some familiarity with a programming language; thereafter, an International Relations 101 course would be more affordable and applicable than a Programming 101. With technical and policy education running on parallel tracks, graduates can only enter the market with one, go back to school at some point to gain skills in the other, and then meld the two disciplines to prepare for senior leadership and management positions.
A more amorphous explanation may be that it is a matter of perception. One aspect of this is a gender differential. Women and men in the early millennial age group grew up at a time when professions and careers were bound by gender stereotypes; women were discouraged or dissuaded from entering educational or professional fields that were heavily male-dominated. Nevertheless, tenacious and forward-looking women now hold senior positions in government, education, and the corporate world, and can serve as role models to draw more women and other underrepresented groups toward cybersecurity education. The situation has been improving, with more initiatives that seek to correct the imbalance.
Another aspect of the perception problem is that millennials do not think they have the skills required for such jobs. Almost a quarter of millennials worldwide perceive that they are not qualified for cybersecurity jobs; fewer than 3% of college students in the U.S. graduate with a degree in computer science. K-12 schools in the U.S. implicitly reinforce that perception by offering few computer science classes or listing them only for extracurricular credit, making computer science, programming, and information security courses costly and less accessible. In doing so, the education system also narrows the stream of students feeding into higher education in computer science and related disciplines. In fact, millennials may already have the requisite skills (problem solving, critical thinking, programming, data analysis, and so on), which need only be reoriented toward a cybersecurity job.
It’s Not Too Late
The millennial generation is not entirely a lost resource in the cybersecurity job market. Early millennials who have already started their careers in other fields may find it more challenging to switch to the cybersecurity and information technology world. Putting forth the effort to network and gain skills through any number of available resources will enable them to capitalize on their real-world work experience and become mid-stream entrants. Late millennials, college kids and even K-12 kids are target groups to nurture as the next generation of cybersecurity professionals and leaders. As Jamesha Fisher at Dark Reading says, “Take a cue from the Jedi and mentor at least one Padawan, actively offer your knowledge and time and support to those trying to join our ranks.”