Interview extracts from the first volume of an oral history of cybersecurity practitioners.
People who can transition easily from very low-level technical detail to very strategic, even speculative, thinking are few and far between. Phil Hagen has earned his place among this elite cadre through a series of baptisms-by-fire … from helping to ensure Y2K didn’t turn out the lights, to combating international money laundering, to imagining how cyberspace may impact warfare decades into the future. Today, most of his time is spent passing on a very specific set of skills to students around the globe.
Phil Hagen: I’m a SANS certified instructor in network forensics, which is really one job with two parts. When I’m teaching, it’s all about delivering content and interacting with students. That’s not something that stops in the classroom; it’s continuous during the week because you can’t always get to everything during the scheduled time frame. This is still in many ways a small community, and people are always helping each other out. When I’m not teaching, I’m working on new course content and editorial tasks. This might include working on marketing collateral and ways to position the course and promote it to potential customers.
I also work as a DFIR strategist for a company called Red Canary. My role is to provide a human front to a lot of the marketing work we’re doing. I get to talk to current and potential customers to understand what challenges they’re dealing with in their security programs. I’m trying to find out what their pain points might be and how our service can help them, or if not us, who might be a better fit. If we’re not addressing a problem that keeps coming up over and over again, I make sure we consider adding that to our capabilities so that we’re responsive to customer needs and competitive in the market.
What got you involved in security?
The inception point? Way back before the (U.S. Air Force) Academy. When I was a kid, learning how computers worked, and then how to make them do things they were not supposed to do. Somewhere around here I still have my Apple IIe from when I was a kid. The stuff we used to do with 128K of memory! That’s when things were really interesting. I didn’t realize this could be a career option, though, until the Academy, when “cyber” started to become a thing and not just a dirty hobby nobody wanted to talk about. When we realized that there were going to be military implications, espionage implications. People would look at us like we had two heads when we tried to bring it up. “What do you mean? Warfare is about bombs and guns, not computers.” We look back now, and that sort of thinking seems pretty silly, but back then it was the mainstream.
Back at the Academy, several friends and I started an extracurricular club that was dedicated to computer security research and testing. We used to joke that it was either a safe haven for us to experiment on things that would otherwise be frowned upon, or a place where the powers that be could keep track of us and shoot us if something went sideways. So, it was kind of a double-edged sword.
Once I got into the active duty Air Force I spent a lot of time using that knowledge of how computers worked and how they could be subverted to build resilient networks and processes that could still function during outages and other challenges, including Y2K. When I look back, I realize we were building a house, but we didn’t know what it was supposed to look like. We did a pretty good job of building a foundation that is still standing today.
You got a degree in Computer Science from the Air Force Academy, but it sounds like your security skills were largely self-taught?
Definitely self-taught. In high school I signed up for the Pascal class every semester, and every semester they cancelled it because no one else signed up. There was no formal training available to me, especially not in security. So, I taught myself BASIC and started exploring BBSs and trying to figure out why 7-bit data doesn’t work with 8-bit data when you’re trying to connect over a 2400-baud modem. Maybe you could find some text files that covered what you were interested in, but there was no one you could talk to, no one who could teach you … no Internet to search.
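The 7-bit/8-bit mismatch he mentions can be sketched briefly. Many early dial-up connections used 7-data-bit settings (e.g. 7E1), which transmit only the low seven bits of each byte, so plain ASCII survives but any byte with the high bit set arrives corrupted. A minimal illustration, with the masking function a hypothetical stand-in for the link:

```python
def send_7bit(data: bytes) -> bytes:
    """Simulate a 7-data-bit serial link: the high bit of every byte is dropped."""
    return bytes(b & 0x7F for b in data)

plain = b"HELLO"               # 7-bit ASCII survives intact
binary = bytes([0x89, 0x50])   # bytes with the high bit set do not

assert send_7bit(plain) == plain
assert send_7bit(binary) != binary   # 0x89 arrives as 0x09
```

This is why downloading binary files over such links required encodings like uuencode, which repack 8-bit data into the safe 7-bit ASCII range.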
What don’t you like about the security industry?
First and foremost: the inability to learn from our mistakes. You know as well as I do you would go blind if you took a shot every time someone said “wake-up call” or “cyber Pearl Harbor.” Organizations don’t learn from their mistakes, or the mistakes of others.
Second, shiny things syndrome. You see management getting all worked up over Spectre or Heartbleed or whatever, when they don’t have a list of computers in their inventory. I’m sorry, but you’re focused on the wrong things. The basics are not sexy but they’re the basics. If you don’t master them, you’re not going to get very far security-wise.
Third, I’d go back to talent management, because like I said, if you’re hiring junior staff but can’t get the basics right, they’re not going to become senior staff who know what they need to do to improve your security posture.
We have a security industry, which is not the same as a security lobby. Do you think we would see improvements in security writ large if we had the latter influencing policymakers?
The short answer is “yes.” I do think that the industry needs some kind of realistic representation because you have this perfect storm of stuff that is hard for non-experts to understand, and simultaneously easy to politicize. When you look at the Equifax breach, obviously the credit industry is going to lobby against a bill that imposes severe penalties. So, what does that look like for security? IEEE? ACM? It should be some outfit that will look towards first principles. I think a non-industry or community organization—the world that we work in, not the corporations we work for—would be beneficial.
Look at the never-ending debate about encryption and backdoors. That’s an issue I’m passionate about, and it would be nice to see someone with a grasp of the fundamental math behind encryption, who can also speak in a fashion that legislators can understand, who can make a case for the best course of action. That would be a benefit. The EFF plays that role to a degree, and maybe someone like them, or IEEE or ACM, is the jumping-off point for a representative body. It would be interesting to see how commercial interests would impact such an effort.
When you look at other domains, for the most part, security and safety practices tend to be pretty lax until a big enough pile of bodies accumulates. The list of names of people who died because seatbelts weren’t mandatory, or weren’t a thing, is far too long. Is that what it is going to take—bodies—before we start to take cybersecurity seriously?
I share the view that we’re not going to see significant improvement until there are more tangible impacts. You know that it comes with no small sense of irony that I say I have a small glimmer of optimism that I try to maintain on this issue. I sincerely hope that it doesn’t come to a body count. In the meantime, I think what we’re going to see is not a huge loss of life, but major financial impacts … to the point where it’s not just millions of credit cards or PII stolen, but a criminal element that figures out how to monetize things so that it really pains individuals. Right now, most computer crimes are painless. The store or bank makes you whole. But that’ll change as battles related to liability and culpability play out in the courts. When that changes, you’re going to see the cost of financial services and credit go up dramatically because they’ll basically be uninsurable. I think that’s probably what’s going to happen. As much as I hate to see people lose money, I’d strongly prefer that we suffer monetary pain rather than harm to people’s health or safety.
To read the full interview, and learn more about the working lives of a range of security practitioners, order Working in Cybersecurity at Amazon.com.