In an earlier post, I spent some time talking about the recent contention that has arisen around the value of security awareness training. While my leanings tend to advocate the value of true (note the word :) ) education, training, and awareness, my major concern re: the current debate is that it tends to be absolutist -- i.e., either we focus on education and training or we pour our dollars into improving technology. In a recent article published in CSO Online, Geordie Stewart points out what should be obvious to us all: that there is room for improvement in both awareness and technology -- and that we need to be careful lest we allow our zealotry to blind us to the legitimate criticisms on both sides of the issue. Well worth a read. You can find a link to the article here.
Sunday, April 28, 2013
Thursday, April 18, 2013
Security Metrics In The News
If you want a practical example of some of the difficulties around security metrics, give a listen to the current immigration reform debates.
This week, the congressional "gang of eight" that has been working on an immigration reform package is due to release its proposals to the world. Prior to this release, many conservatives have lined up against some of the reform plan's supposed features. One of the rallying points for the opposition is that any sort of reform should be enacted only after our borders are secure.
Sounds reasonable, to be certain. The only problem right now is defining what "secure" means in this context.
In the past, the Department of Homeland Security has used a metric called "operational control" to define border security. This metric measured the percentage of the border over which the government had positive monitoring and could respond in a reasonable amount of time to suspected border incursions. The operational control metric has since been scrapped, as the data it provided was deemed "largely meaningless" in the political context of border security. For the past few years DHS has been struggling to come up with a metric that makes sense; to date, one is still lacking. Senator John McCain pointed out this gap in a recent congressional hearing on immigration, as well as the need for such a metric if immigration reform is to be meaningful.
The new immigration proposal is reported to tie reform to border security, with a requirement for the US to be able to successfully interdict at least 90% of illegal immigration. No information has been released as of yesterday as to how they intend to accomplish this...or, more fundamentally, how they are going to measure success.
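To make the measurement problem concrete, here is a quick, purely hypothetical sketch (the numbers and the approach are mine, not anything DHS or the gang of eight has proposed) of how an "interdiction effectiveness" rate might be computed. The catch is the denominator: interdictions can be counted, but crossings that succeed are, by definition, not observed, so the rate swings wildly with whatever estimate you plug in.

```python
def effectiveness_rate(interdictions: int, estimated_undetected: int) -> float:
    """Hypothetical 'interdiction effectiveness' metric (illustration only).

    interdictions        -- crossings detected and stopped (observable)
    estimated_undetected -- crossings believed to have succeeded (an estimate;
                            successful crossings are not directly observable)
    """
    attempted = interdictions + estimated_undetected
    return interdictions / attempted if attempted else 0.0

# Illustrative numbers only: the same observed count yields very different
# "security" depending on the assumed number of undetected crossings.
observed_interdictions = 400_000
for estimate in (40_000, 100_000, 400_000):
    rate = effectiveness_rate(observed_interdictions, estimate)
    print(f"assumed undetected = {estimate:>7,}  ->  effectiveness = {rate:.0%}")
```

Same data, three very different answers -- which is exactly why "90% effectiveness" means little until someone defines how the denominator gets counted.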
Sound familiar? :)
Many of us have faced (are facing?) similar problems with our metrics. Our organizational leadership has a desire to create a secure environment, but does not necessarily know what that environment looks and feels like. If we are lucky, our leaders understand that absolute security is oxymoronic as long as their doors are open for business. Still, what does an appropriately secure state look and feel like? What are its indicators? How do we measure it?
I submit to you that our failure as a profession to engage in these difficult conversations with our leadership is one of the reasons that compliance has become the de facto benchmark for security. Compliance is easy to define, easy to measure, and easy to recognize...but as we all know, compliance is not an accurate measure of security for complex, multi-faceted organizations. Being PCI compliant, for example, does nothing to ensure appropriate protections for your intellectual property, your healthcare data, or your personnel. While our metrics and the decisioning behind them are (hopefully) not as politically charged as those DHS must develop, we need to continue to evolve our reporting schema if we wish to show the true value of what we do for our organizations.
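As a small illustration of what I mean by evolving the reporting schema -- and only as a sketch; the indicator names, values, and targets below are invented for this example, not a prescribed standard -- here is how a scorecard might report operational indicators alongside, rather than instead of, the compliance checkbox:

```python
# A minimal sketch of a security scorecard that reports operational indicators
# alongside (not instead of) compliance status. The indicator names, values,
# and targets are invented for illustration only.

indicators = {
    # name: (current value, target, unit)
    "pci_compliant":            (True,  True,  "yes/no"),
    "mean_time_to_detect_hrs":  (72.0,  24.0,  "hours"),
    "critical_patch_coverage":  (0.83,  0.95,  "fraction of hosts"),
    "phishing_report_rate":     (0.12,  0.30,  "fraction of staff reporting"),
}

def meets_target(name, value, target):
    # In this toy model, lower is better only for time-based indicators.
    if isinstance(value, bool):
        return value == target
    return value <= target if name.startswith("mean_time") else value >= target

for name, (value, target, unit) in indicators.items():
    status = "OK " if meets_target(name, value, target) else "GAP"
    print(f"[{status}] {name:<26} {value!s:>6} (target {target}, {unit})")
```

The point is simply that an organization can show green on the compliance row while the operational rows tell a very different story.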
Keep an eye on the immigration debates in the weeks to come. Listen to the discussions of securing the borders. If nothing else, it's a useful reminder to us all of the importance of properly framing and engaging in metrics discussions within our own organizations.
Sunday, April 14, 2013
VuDu Headquarters Robbed; Private Customer Information Stolen
In a stark reminder that security requires a holistic view and approach, VuDu posted information on its website relating to a robbery which occurred at its California headquarters. Several hard drives were stolen which contained private customer data. While VuDu claims that no credit card data was stolen and that the drives were encrypted, consumers who utilize their VuDu password for other internet functions are being encouraged to change their passwords right away.
Link to the story may be found here. Happy Reading!
Sunday, April 7, 2013
Letting Our Guard Down With Web Privacy
I stumbled across a really interesting article in the New York Times this past week. Dr. Alessandro Acquisti, a behavioral economist at Carnegie Mellon University, points out some of the factors and circumstances which can drive our willingness to part with private information. Dr. Acquisti does not draw any definitive conclusions within the article itself, but his observations are quite telling regarding personal behavior and how we might shape human-technology interaction to improve that behavior. Companies like Microsoft and Google have taken notice, providing grants to fund some new research avenues for the good doctor.
You can read the article here; it's worth your time. If you want more information about Dr. Acquisti, including some of his more detailed research you can find that information at this link. Oh, and I've loaded a particularly interesting paper of his regarding user-tolerance for security-related technical delays in the document repository. Enjoy!
To Train or Not To Train. That is the Question
Three years ago, I started giving a presentation called The Human Nature of Security: Pink Elephants, Power Rangers, and the United States Marine Corps. The premise of the presentation was straightforward: the security profession has preached for decades that security is people, process, and technology yet we have continued to neglect the people portion of that triad to our detriment. I then spent some time talking about different aspects of how we were overlooking the human factor and how we might better harness the human element to our benefit. I didn’t think much about this presentation at the time, although there were very few people discussing human factors security in the mainstream back in 2010.
Now, it seems, everywhere I turn there is conversation on the Human Element – and that conversation is becoming more polarized within our community. On one side are those who contend that training computer users in security is the equivalent of training a car driver to be a master mechanic; indeed, Bruce Schneier entered the fray on this side of the argument just last week. On the other side are those who argue that the human factor is vital to our success; ISSA president Ira Winkler has, to date, been the standard bearer for that position.
I agree with the thesis Mr. Winkler presents in most of his arguments…but not for the reasons Ira puts forth. My reasons tend to be a tad more fundamental. To understand them, let’s take a look at Mr. Schneier’s closing lines from his blog entry on this argument:
If we security engineers do our job right, then users will get their awareness training informally and organically from their colleagues and friends…[t]hen maybe an organization can spend an hour a year reminding their employees what good security means at that organization, both on the computer and off.
...and therein lies the rub. Too many security professionals believe that one hour of mandatory, fluffy, check-the-block lectures constitutes security training. It does not.
In the 2+ decades that I've been in this profession, I would submit that we have done little more than pay lip service to the concept of security education, training, and awareness (that multi-faceted SETA thing goes beyond just training, remember? :) ). We have had (and continue to have) difficulty getting the attention of the folks we work for, so we limit our training efforts to the minimum set required for compliance. Further, if we can have a third party develop the material and deliver the training, it is that much easier for us to implement. Since we have more pressing matters to attend to, why pay attention to the content and quality of something that's not going to be effective anyway, right?
This attitude is not unexpected, nor should it be. Most of us came up geek. People – particularly non-geek people – are riddles wrapped inside of mysteries surrounded by enigmas for many of us. Worse, our collegiate programs require very little of those "fluffy" skills like psychology, sociology, and behavioral science within our engineering educational framework. It is therefore mind-boggling and downright bothersome to us when people can't see the importance of what we are doing and why we are doing it. When folks suggest things such as learning to communicate better and speak the language of the business, many of our brethren complain that we can only "dumb things down" so much without losing the efficacy of the message.
Sound familiar? :)
I recently asked a group of 50 of my peers about their SETA efforts. The results of this informal poll were interesting:
- All 50 members had some type of SETA program in place.
- 38 of those 50 were doing no more than the minimum compliance training required by various regulations.
- Of the remaining 12, 10 were doing training that was geared toward specific roles (such as developers, IT personnel, and managers).
- The remaining 2 individuals had put together comprehensive programs that included multi-faceted training, messaging, and communications that occurred regularly throughout the year and were exercised & measured accordingly.
In other words, only 4% of the sample population had implemented a comprehensive, end-to-end SETA program.
My point, folks, is this: before we dismiss the value of true SETA within an environment, how about we spend time as a profession in understanding, implementing, and measuring the impact of truly, holistic SETA programs within our environments? We are calling failure on a process we have never bothered to truly understand, implement, and embrace as a profession.
Don't get me wrong, now; all of the arguments re: better engineered and failure-proof technology that the SETA nay-sayers make are still valid. These improvements should be pursued aggressively and passionately. That being said, technology transformation -- even at its most revolutionary -- occurs over years. Modifying someone's opinion and subsequent behavior, even by a fraction, can happen in minutes. Modifying that opinion to a more security-oriented one reduces risk to my environment and to the individuals I serve. I don't need people to understand the intricacies of what I do; I just need them to understand that our decisioning isn't flippant and (more importantly) that they should alert us if anything they see or experience is out of the norm.
While it may not be as sexy as geeking out over code, as a profession we should be hesitant to dismiss the relevance of SETA before truly embracing the training paradigm, learning how to train and educate effectively, and applying the considerable skills and passion of this profession to the human element.
You might be surprised at the results…
Wednesday, April 3, 2013
Information-Sharing Critical To U.S. Cybersecurity
In a speech given at the Georgia Tech Cyber Security Symposium last week, General Keith B. Alexander stated that information-sharing and visibility into the threat landscape are "vital" for the public and private sectors to defend cyberspace. The key to sharing information, according to General Alexander, is crafting legislation that protects privacy while facilitating sharing among intelligence agencies, Internet Service Providers (ISPs), and critical infrastructure.
While this somewhat states the obvious, it's worth circulating the message :) Click here to read the article in Dark Reading Daily. Enjoy!
Mobile Device Security in the US Military Comes Under Fire
Kudos to my colleague & friend Will Lippert for cluing me into this one :)
On March 26th, the DoD Inspector General released a report on the effects of BYOD on the U.S. military. Among the report's findings:
- Mobile devices were not secured to protect stored information.
- The US Department of Defense (DOD) did not have the ability to wipe devices that were lost or stolen.
- Sensitive data was allowed to be stored on commercial mobile devices acting as removable media.
- DOD did not train users and did not have them sign user agreements.
- The Army CIO was unaware of more than 14,000 mobile devices used throughout the Army.