Tales from the Technoverse

Commentary on social networking, technology, movies, society, and random musings

Cyber-Security Discussion at the Fedscoop Conference

October 15th, 2009 · 3 Comments · cyber-security, General, government 2.0, government business

I was lucky enough to be part of a panel discussing cyber-security at a Fedscoop conference Wednesday, October 14, at the Newseum. The agenda for the conference is here: http://fedscoopevents.com/agenda.php. I thought it might be useful to summarize my general points for those who were not able to attend.

The theme of the conference was "Lowering the Cost of Government with Technology," though the panel's comments ranged from cost issues to government 2.0 and social networking to cyber-security in general.

The panel was moderated by Chris Dorobek, the afternoon co-anchor for WFED. The other panelists included Vance Hitch, the Department of Justice CIO; Pat Howard, the Chief Information Security Officer (CISO) for the Nuclear Regulatory Commission; Dr. Ron Ross, a key figure in defining security requirements and policy at the National Institute of Standards and Technology (NIST); Gary Galloway, the Deputy Director for Information Assurance at the Department of State; and Rue Moody, the Director of Strategic Technology at Citrix.

I was called on first after the introductions to frame the conversation based on the pre-meeting discussions the panelists had held. I discussed four issues.

First, there is an inherent conflict between data sharing and data protection. In my opinion, you cannot do both perfectly. Even though almost everyone will take the position that you have to pay attention to both, it is important to be clear about which way you lean, why, and what the implications are. I noted how impressed I was towards the end of the last administration when Mike McConnell, then the Director of National Intelligence (DNI), said that if he had to take some security risks in order to increase the ability to share information within the Intelligence Community, he would. I am sure that I am not capturing the nuances of his talk, but the message was very powerful. It is a position that those who know me recognize I agree with very strongly.

Second, security is difficult to measure and, more importantly, there is little agreement among security experts as to which metrics to use. This is a particular problem for those agencies and departments that do not have security as part of their day job.

What I mean by that last sentence is that departments that have security as part of their primary mission have a great deal of day-to-day experience in making tradeoffs involving security spending. Even if the rationale for decisions is merely experiential rather than quantitative, over time senior management becomes fairly skilled at making these kinds of decisions.

For most civilian departments and agencies this is not as true. Deciding whether to take money from safety inspections, which might be an agency's primary mission, and spend it instead on cyber-security is difficult. Without defined metrics, the likelihood of making the correct decision isn't very high.
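To make that tradeoff concrete, one widely taught (if imperfect) yardstick is annualized loss expectancy, ALE: the expected loss from a single incident multiplied by the expected number of incidents per year. The sketch below shows how an ALE comparison might be set up; the dollar figures, incident rates, and option names are entirely invented for illustration.

    # A minimal sketch of an annualized-loss-expectancy (ALE) comparison.
    # All figures here are hypothetical, purely for illustration.

    def ale(single_loss_expectancy: float, annual_rate_of_occurrence: float) -> float:
        # Expected loss per incident times expected incidents per year.
        return single_loss_expectancy * annual_rate_of_occurrence

    # Hypothetical options competing for the same budget dollars:
    # name: (cost, ALE before the control, ALE after the control)
    options = {
        "extra safety inspections": (500_000, ale(2_000_000, 0.50), ale(2_000_000, 0.30)),
        "intrusion monitoring":     (500_000, ale(5_000_000, 0.10), ale(5_000_000, 0.04)),
    }

    for name, (cost, before, after) in options.items():
        reduction = before - after
        print(f"{name}: ${reduction:,.0f} expected annual loss avoided "
              f"for ${cost:,.0f} spent (ratio {reduction / cost:.2f})")

Even a rough comparison like this gives senior management a common yardstick for decisions that would otherwise be made on instinct alone.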

I was heartened to read recently about the establishment of a Security Metrics Task Force by Vivek Kundra and the Federal CIO Council (http://it.usaspending.gov/?q=content/blog). The task force is chaired by Vance Hitch, who discussed this during his remarks at the panel, and by Rob Carey, the Department of the Navy CIO.

Third, it is hard for people in large organizations, especially governmental organizations, to prioritize; that is, to implement the results of risk analysis. The fundamental reason is that prioritization requires someone to decide to work on one set of requirements and thus NOT to work on the rest. Few, if any, want to be the person associated with the latter decision, the "not work on" part. If anything bad happens that can be tied to a requirement in the lower set of priorities, it will get extra attention from the various oversight groups that look over the shoulders of IT providers in the Federal Government. As someone who has had the pleasure of testifying on the Hill, I can promise you that is not a goal for most people.

The end result is that organizations often try to do everything and thus end up accomplishing very little of anything.
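The mechanics of prioritization are not the hard part; owning the output is. Here is a minimal sketch, with invented requirements and numbers, of scoring risk as likelihood times impact, funding from the top of the list, and explicitly recording what gets deferred:

    # A minimal sketch of risk-based prioritization. The requirements,
    # likelihoods, impacts, and costs are all hypothetical.

    requirements = [
        # (name, likelihood 0-1, impact in dollars, cost to address)
        ("patch legacy servers",     0.6, 4_000_000, 800_000),
        ("encrypt backup tapes",     0.2, 9_000_000, 300_000),
        ("harden public web portal", 0.4, 2_000_000, 500_000),
        ("upgrade badge readers",    0.1, 1_000_000, 400_000),
    ]

    budget = 1_200_000
    ranked = sorted(requirements, key=lambda r: r[1] * r[2], reverse=True)

    funded, deferred, spent = [], [], 0
    for name, likelihood, impact, cost in ranked:
        if spent + cost <= budget:
            funded.append(name)
            spent += cost
        else:
            deferred.append(name)  # the decision no one wants to own

    print("Funded:  ", funded)
    print("Deferred:", deferred)

The deferred list is the part no one wants their name on, which is exactly why it so rarely gets written down.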

Finally, I noted that the general overemphasis on protecting the end-points of networks is starting to be balanced against the need to create systems that are resilient and highly available. Obviously, it would not be a good plan to ignore investments in protection against bad guys getting into networks. But it is equally important to recognize that, regardless of the level of protection built into an architecture, at least some bad guys will get through. Therefore, it is also important to think about how to keep systems up and running, with data protected, even while a system has been penetrated.

As hard as it is to build in protections and measure the results, it is harder still to do the same for resilient systems. Thus the greater emphasis on protection first, which I believe still needs to be adjusted further.
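For the availability half of that argument, standard reliability arithmetic shows why resilience investments pay off: if replicas fail independently, a system of n redundant replicas is down only when all n are down at once. A minimal sketch, with a made-up component availability:

    # A minimal sketch of availability under redundancy.
    # The 99% component availability is a made-up example figure.

    def parallel_availability(component_availability: float, replicas: int) -> float:
        # The system is up if at least one replica is up
        # (assuming replicas fail independently).
        return 1 - (1 - component_availability) ** replicas

    for n in (1, 2, 3):
        a = parallel_availability(0.99, n)
        print(f"{n} replica(s) at 99% each -> {a:.6f} system availability")

This arithmetic does not capture a deliberate attacker, who may take out replicas in a correlated way, but it illustrates why architecture, not just perimeter protection, determines whether systems stay up.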

One point which I didn't make as well as I would have liked at the conference is that security has both positive and negative cost implications. It can be positive where it drives greater standardization, which tends to lower support costs, sometimes dramatically if done well. It can be negative where there is no clear-cut methodology for making investment decisions. Without associated risk management and security metrics, security spending becomes an endless investment with no well-defined result.

Many thanks to Goldy Kamali for inviting me to be part of the panel and for putting together a great conference. Everyone who missed it missed some great discussions and networking opportunities.

3 Comments so far

  • Scott Dowell

    Dan,
    Your comments are right on. The key is to define a good balance of security, risk, and cost models.

    As I read your post, I was struck by a similar, if smaller-scale, argument posed at a recent panel discussing cloud at a very tactical level. As you know, within the Navy context the C&A function can be a challenge, and with cloud there appears to be significant confusion as to the appropriate certification & accreditation process.

    Do you believe that your remarks regarding standardization and rational decision making extend to the C&A level as well?

    Great post
    -Scott

  • Dennis Filler

    You've captured the essence of the problem. I describe it as a water drop: a system in tension, a balance of security and access that depends on the surface the water drop is on. If it gets out of balance, the drop ceases to exist until a new equilibrium is established.

    As far as metrics, no one cares about the millions of intrusion attempts that one blocks. I routinely had to address the failed login (access) attempts that some of the security personnel wanted us to chase. I routinely dismissed them as security successes and didn't apply any resources to following up on these types of events. The only events that folks really get concerned about are the successful intrusions. The measures that you can (try to) use are the speed of detection (only available through post analysis), the quality (accuracy) of the data that your security folks pass to operations for event resolution, the speed of event containment and service restoration, and the scope (relative size/containment) of the security breach. All security folks will probably agree that it is only a question of when you'll be breached. What makes a difference is what you do operationally to manage the breach and work through the event.

  • Daniel

    The C&A issue, to me, is part of a different, broader issue.

    Whenever Federal Government IT initiatives are put in place that require action by the CIOs, whatever they do will be reviewed and publicly evaluated by a series of oversight organizations; these include the Inspectors General and GAO, with guidance provided by NIST and, in some cases, a framework contained in legislation.

    Thus, regardless of the quality and usefulness of the initiative, the actual implementation is shaped by the interaction of the political/organizational strength (and nerve) of the CIO and the perceived nature of the oversight evaluation.

    The result can be different from the intent.

    The point of all this is that when OMB or other senior management in an administration think through what they want to accomplish, it would be valuable to also think about how the oversight requirements will affect those goals.
