
In a recent podcast interview with Cybercrime Magazine host David Braue, Scott Schober, cyber expert, author of "Hacked Again," and CEO of Berkeley Varitronics Systems, reflects on a 2006 data breach that served as a cybersecurity wake-up call for the Department of Veterans Affairs and the rest of the federal government. The podcast can be listened to in its entirety below.

 

Welcome to the Data Security Podcast sponsored by Cimcor. I'm your host, David Braue.

Cimcor develops innovative, next-generation, file integrity monitoring software. The CimTrak Integrity Suite monitors and protects a wide range of physical, network, cloud, and virtual IT assets in real time, while providing detailed forensic information about all changes. Securing your infrastructure with CimTrak helps you get compliant and stay that way. You can find out more about Cimcor and CimTrak on the web at cimcor.com/cimtrak.  

Joining us today is Scott Schober, cyber expert, CEO of Berkeley Varitronics Systems and author of the popular books "Hacked Again," and "Senior Cyber."


David: Scott, thanks so much for joining us today.

Scott: Yeah. Great to be back here with you, David.

David: Always a pleasure to chat, and we always get into some pretty interesting discussions about some pretty interesting stuff that's happening in cybersecurity. I mean, it's a thrill a minute and one of the things I realized in reading some of the recent discussions about a breach 20 years ago or so is that some things just really don't change very much at all. Of course, I'm talking about the 2006 data breach of the Department of Veterans Affairs, way back when, it seems like a lifetime ago in technology terms. But things were a bit different back then. What happened? Give us a bit of a refresher about that.

Scott: Yeah, absolutely. And this is an interesting one, because oftentimes we hear the phrase, "you can learn from history," and there are certainly some lessons I think you can learn from the history of this, and some things that maybe the Government didn't learn, too. In this particular case, an employee working at the VA had a laptop and brought it home. That laptop contained personal data of a little more than 26 million veterans. And I guess their house got broken into, a robbery or something, and the laptop was stolen.

And right away panic sets in. Because, wow! First of all, they brought home a laptop with all this personal information on it. But as they dug in, they learned a lot more, and that's where all the red flags went up: the data was not encrypted, and the laptop did not have a passcode or key to keep anybody from getting in, so anybody could just open the laptop and jump right on.

So there were really few, if any, safeguards to protect this laptop. It ended up being chalked up to, I guess you could say, total human error, negligence, poor security practices. And the list goes on and on, and we'll probably talk a little more about the breach notifications and stuff. So what a tremendous mess. It really kind of set the line in the sand, maybe, as the worst government breach, to start the whole thing off to where we are today.

David: Yeah, there are so many aspects to it. You just don't even know where to start. But really, the idea of people taking home information on their computers. Back then, these were pre-Dropbox days, such that you physically had to carry your hard drive home with you, carry that work data around with you on some sort of device, which was not a 256 GB USB stick, I assure you. This was the kind of drive that would have spun up and, you know, taken 20 seconds to start going fast enough, and would have drained the battery of the laptop. That sort of thing. So cast your mind back, I think.

But it's incredible to think that, even then—I mean, this is a government agency with a reasonably important remit—it would have normalized storing data in unencrypted form and allowing it to be taken home like that. It's just sloppy, really, from the description and from the recollection.

Scott: Oh, yeah, yeah, across the board. It's just a complete mess, I guess you could say. And one thing that I did not realize, I realized certainly it was a very large government agency, but a couple of the little research notes I jotted down here when looking at this story: today, the agency has a $325 billion budget. They employ 420,000 people. And it's actually the second largest Federal department by workforce, right after the Department of Defense (DoD).

So obviously they've grown over the years. But it's a tremendous department, and they've got a lot of responsibility. They care for several different areas from the standpoint of protecting the veterans and keeping them safe: the health administration, the benefits administration, and also the cemetery administration. So it's kind of broken down into three different areas there to help and support them. But a tremendous amount of money and support staff are associated with that. So when we think about the personal information, there's a lot of it that they have and continue to protect.

David: Yeah, well, it definitely has expanded, I would imagine, over the years, not just as an organization, but in the technology that's running it, the data it's administering, the programs. This sort of thing does not get smaller over time.

Of course, there's more conflict, more veterans, and a lot more information and people that the information relates to, to deal with. So you would hope that the security approach has changed along with the investment in all that technology and all that supporting infrastructure. Do you get the sense that they learned a lot over the years based on what happened back then in 2006?

Scott: That's a great point. I think yes and no. Number one: it was certainly what I call a wake-up call. They had to reassess who's responsible for this, and it kind of shifted to the CIO, the Chief Information Officer, who really had to be given authority so that way they could act quicker. And why do I say that? In part, they didn't really respond quickly back then, in 2006. It took VA leadership about 2 weeks or so before anybody really noticed what happened and started putting out, "Hey, this is what happened, guys," and then it was about 3 weeks before word got to the general public. So there was a span of delay, which often happens in breaches. Everybody's trying to figure out what happened, and there's a little bit of the blame game, pointing fingers. But that seems to still be the case when we fast forward to where we are today. There are a lot of delays in some of the more modern breaches, even, which I guess is the concern. Did they learn anything? Maybe they did, and maybe they didn't.

And certainly the breaches of today are a lot larger. I was reflecting earlier and thinking about some of them, like the OPM data breach. That was huge: 22 million Federal employees had information compromised, including background checks and fingerprints, and hackers actually stole that information. And of course, the Equifax breach was pretty big. 147 million Americans had their credit records compromised. That was in 2017. And then, more recently, the SolarWinds attack in 2020. These are all tied to Federal agencies, and in those 3 cases I just mentioned, that information was all compromised. The nice part about the 2006 breach: really nothing was compromised. It was reported, and things were verified, but no data was actually stolen, and it didn't end up in hackers' hands, to our knowledge at least. But again, it was that wake-up call, wow! What could have happened? Because this affects so many people, and that's what really is a standout to me.

David: Well, it's such a concern. I mean, this is a standard espionage movie plot, isn't it? You know, there's this encrypted list of undercover operatives, and if it gets breached, then the world as we know it is going to come to a halt, so we have to recover the data, usually using some well-worn cybersecurity tropes in the movie, but that sort of thing. It is potentially the kind of thing that could be very, very damaging if it fell into the wrong hands. I guess, just as we talk about the cybersecurity practices within the agency not having been as mature, perhaps, as they are now, in many ways the actual cybercriminal community was learning a lot at that time as well, wasn't it? There weren't the sort of orchestrated networks of, you know, dark web gangs that would work together to utilize this information. They weren't going to call up the department and say, "Give us $10 million or we leak this." It was a different attitude to data back then, and I guess maybe that explains some of the complacency as well.

Scott: Yeah, I think you make a brilliant point, because there weren't the means to really monetize compromised data back then. Maybe they were doing something very specific, if a group was targeting a government agency, or what have you. But now it's the Wild West. I mean, hackers from around the globe will compromise information, exfiltrate it, and use it against you, and extortion, ransomware, smear campaigns, the list goes on and on, how they misuse stolen data. It wasn't as much back then, that's for sure. And I guess the value just wasn't there. Now there's value associated with different data sets that are stolen, and like you said, you can go on the dark web, and you can buy and sell things like a simple commodity.


We'll be right back after a quick word from our sponsor.

Cimcor develops innovative next-generation file integrity monitoring software. The CimTrak Integrity Suite monitors and protects a wide range of physical, network, cloud, and virtual IT assets in real time, while providing detailed forensic information about all changes. Securing your infrastructure with CimTrak helps you get compliant and stay that way. You can find out more about Cimcor and CimTrak on the web at cimcor.com/cimtrak.

And now back to the podcast.



David: One of the things this event has been credited with is really, I guess, waking up officials to the importance of having a centralized cybersecurity capability within an organization this large. That's grown into the idea of a separate CISO, which is still not a universal thing within enterprises. Do we still have a lot to learn about the sort of organizational vulnerabilities that allowed something like this to happen?

Scott: I think so, because if you analyze just the past few years, even within the U.S. Government, CISOs are not fully empowered within all agencies. They don't always have the budget and the authority to do everything they need to do. Certainly, if we look back from 2 decades ago to where we are today, I think they have more power in using encryption. There are more tools there: two-factor authentication, endpoint visibility. There are a lot of ways to streamline breach reporting, and ways to respond with different protocols when there are breaches, real-time monitoring. So they have different tools in their arsenal to combat the problems and to respond better, but they still don't always have the power they need within each agency.

They need backing from the top down to help get the word out there and alert the right people, so it doesn't get further on down the line. So it's a little bit of a challenge, I think. And in part, it's probably because there's a lot more data now, and there are a lot more people involved in the process. There still needs to be more accountability all the way up at the top, but also the ability to use all those tools effectively to really solve these problems.

David: Do we have the right tools in place to manage all this sort of thing? I mean, there are so many cybersecurity things out there, tools for basically every different point solution you could ever think of, but they don't always integrate well, and it can be pretty hard to reconcile the technical capabilities of something with, I guess, the need, for example in this case, to protect against human error. I mean, Cimcor, the sponsor of our podcast, has one of the few platforms that can actually protect infrastructure against human error. It's not always that easy to do that, or even to translate human activities into metrics and triggers and events that you can really monitor for and deal with on a real-time basis.

Scott: Yeah. And I think that's a nice point, because in this case, you're 100% right. The way I view the 2006 breach, it was really human failure. It wasn't a hacker breach. And what does that tell us? It's fixable. Humans tend to always be the weakest link. If you trace most breaches back, human error, human failure is often the starting point, and then it's often tied in with vulnerabilities one way or another. So, in part, now with things being so much larger, and a lot of oversight, compliance, and requirements, it's important to step back and say, we have to look at the human element. But at the same time, we have to stop just checking off boxes and really focus on how to secure systems properly, and view cybersecurity as perhaps national security. Tying that in with the right thought helps us stand back, look at the problem and say, this is a bigger problem, because this does affect national security when humans are not trained, when we get away from just the checking of the boxes and really think about the bigger picture. So I think with time, and reflecting upon historical breaches such as this, and the more modern breaches we touched on before, there are a lot of lessons on how to continually improve and not just accept the status quo. You know, here's where we are, and well, that's life. We're going to get breached. It's just a matter of when. We have to get away from that mentality and really start securing things.

David: So very, very true. At least now we have the infrastructure capabilities to centralize this data in ways that we probably couldn't 20 years ago. People don't even need to bring hard drives back to their houses, where they can potentially get stolen or dropped in the trash, or whatever happens to these really highly concentrated databases that used to just kind of travel around in people's trunks. It was a different sort of thing when you had to bring the data with you. Certainly, now we do have risks associated with centralizing the data, and we see that every day with breaches all the time. But at least we do have the ability to implement those controls, and even to tap things like the AI security tools that are out now, which can monitor for exceptions and alert people very quickly if anything weird is going on with the data. Whereas, you know, a hard drive goes missing, and it could be, as we found, weeks before people figure out what actually happened.

Scott: Yeah, yeah, I think that's so true. And it's so important to kind of reflect when you look back on these things. Yeah, they've made a lot of adjustments, and things have changed over time. But we really have to focus on the lessons that have been learned, and not forget them and not repeat the same mistakes. Because sometimes I feel like I read a story about a particular breach, and I say, "Geez, it feels like we're here again. I just read about this a couple of years ago, and they made the same mistakes." So sometimes certain groups and agencies, and even the government, just aren't learning enough from past mistakes, and all of us would do better to stop and reflect upon things a bit more, and make changes.

I do think it's kind of interesting, too, that you touched on this: there are a lot more things at our disposal. Just thinking about the cloud alone, like your point about how you don't have to bring home your laptop or an external hard drive. Everything's in the cloud now, and they've gotten so much better at securing things in the cloud, and encryption has gotten better, and there are lots of positive things that have allowed us to be more cyber secure. So there are not as many horror stories, possibly, as there could be. But there still are some, and that's what's concerning.

David: Well, this is the problem, because everything has become more accessible. There are a lot more people who are finding it a lot easier to sit there and just chip away until they get through. And unfortunately, they are getting through so many times. In a lot of cases, the data is secure, but then people need access to the data. So if you can get access to the people, you can work around any of the technologies. In many ways, it's a tale as old as time. I would say that these vulnerabilities are still there, and until we can figure out a way to just get all the humans out of the jobs, which I'm sure there are places working on, it's going to be a risk, isn't it?

Scott: Yeah, yeah, I think so. And I was imagining what this discussion would be like if, back in 2006, after that robbery, that laptop had gotten into another government's hands or cybercriminals' hands. What could have happened? Those are the really scary things. So fortunately, in this particular case, these are all learning lessons to make improvements, and maybe we are in a safer world as a result of it, since somebody didn't get their hands on that data.

David: Well, thank goodness. So it seems like it was just a case of, you know, someone taking their coffee mug home. Basically, it just kind of went with them and came back. It's a weird idea, because these days we leave so many footprints everywhere when we go online, and there's so much data everywhere, and these breaches just keep happening. So it's almost amazing to think about a breach that was as contained as that.

Scott: Absolutely.

David: Those were the days. Hey?

Scott: Yeah.

David: Well, we'll never see them again. Unfortunately.

Scott: No, I think they're gone.

David: Scott, it's been a pleasure. Thanks so much, as always, for your time.

Scott: Yeah, wonderful conversation. Thanks so much, Sir David. Stay safe, everyone.


I'm David Braue, and joining me today was Scott Schober, cyber expert, CEO of Berkeley Varitronics Systems, and author of the popular books "Hacked Again" and "Senior Cyber."

The Data Security Podcast is sponsored by Cimcor.

Cimcor develops innovative next-generation file integrity monitoring software. The CimTrak Integrity Suite monitors and protects a wide range of physical, network, cloud, and virtual IT assets in real time, while providing detailed forensic information about all changes. Securing your infrastructure with CimTrak helps you get compliant and stay that way. You can find out more about Cimcor and CimTrak on the web at cimcor.com/cimtrak.

To hear our other podcasts and to watch our videos, visit us at cybercrimemagazine.com.


Tags:
Podcast
Post by Lauren Yacono
August 7, 2025
Lauren is a Chicagoland-based marketing specialist at Cimcor. Holding a B.S. in Business Administration with a concentration in marketing from Indiana University, Lauren is passionate about safeguarding digital landscapes and crafting compelling strategies to elevate cybersecurity awareness.

About Cimcor

Cimcor’s File Integrity Monitoring solution, CimTrak, helps enterprise IT and security teams secure critical assets and simplify compliance. Easily identify, prohibit, and remediate unknown or unauthorized changes in real time.