Tales from the Technoverse

Commentary on social networking, technology, movies, society, and random musings

Technologies to Watch in 2010

December 15th, 2009 · 1 Comment · government 2.0, government business, sensors

Recently Wyatt Kash, the Editor in Chief for both Government Computer News and Defense Systems, wrote me a note saying that GCN was working on an article about technologies to watch in 2010 and that he wanted my two cents.

Naturally I had more than two cents' worth of thoughts on the issue, and most likely my take was so orthogonal to what they were working on that it ended up being of marginal utility.

On the other hand, it gave me an excuse to think about the topic and allowed me to fill out another blog post. With Wyatt’s permission, the rest of the entry is what I sent to him in response to his request.

Thoughts On 2010 Technologies That Will Be Important to the Government, Pick 3-5

This is a pretty interesting question to answer.

Digital technologies are becoming so tightly integrated into almost everything we do that one's answer depends to some extent on whom we are answering for: the internal technologists, the operations managers, the CIO, or the people responsible for mission implementation.

In addition, we are in a period of increasingly rapid and radical change. Thus we need to make a judgment not only about which technology might ‘win’, e.g. your thought of 4G wireless winning out over WiMax, but also about that technology's impact; who could have predicted Apps for Democracy happening as a result of the increased capability of, and comfort level with, 2.0 technologies?

Let me focus on those with the potential to cause dramatic change, either in how Government relates to its external and internal stakeholders or in how it manages itself. I will suggest four to pay attention to in 2010:

  • Government 2.0
  • Virtualization
  • Real-time Security Situation Awareness
  • Mobile Network Endpoints

and one that I would start paying attention to in 2010, as it is just beginning to reshape our approach to network and business architectures:

  • Sensors.

The first two are technologies whose importance in 2010 is that they are moving from the edge of the art, or at least from limited use by senior management, into the mainstream.

Government 2.0. Where 2009 was a year in which 2.0 technologies started to see general use, and President Obama's team pushing their use could still be news, 2010 will be a year in which every Government agency is expected to have a robust 2.0 presence just to be average. The culture changes needed to allow exposure of increasing amounts of information, even in intermediate form, will take energy to overcome. But the result is extremely powerful, allowing external interested parties to create mashups and produce much more interesting, and often more user-friendly, versions of the data than the Government might ever have achieved on its own.
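
To make the mashup point concrete, here is a minimal sketch of the kind of reuse that exposed data enables: fetch a machine-readable dataset an agency has published and reshape it for presentation. The URL and field names are placeholders of my own, not a real endpoint.

```python
import json
import urllib.request

# Placeholder endpoint: stands in for any agency-published dataset.
DATA_URL = "https://example.gov/api/datasets/spending.json"

with urllib.request.urlopen(DATA_URL) as resp:
    records = json.load(resp)  # assume a list of {"agency": ..., "amount": ...}

# Aggregate by agency: the kind of reshaping a mashup might do
# before plotting the results on a map or a chart.
totals = {}
for rec in records:
    totals[rec["agency"]] = totals.get(rec["agency"], 0) + rec["amount"]

for agency, amount in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{agency}: ${amount:,.0f}")
```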

Government 2.0 will also lead to greater use of these technologies to implement various versions of crowdsourcing. Where Intellipedia and A-Space are big news now, internal wikis will become more second nature. Prediction markets, which use groups to forecast things like project results or other public-facing outcomes, are starting to be piloted by early adopters.
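
For readers curious about the mechanics, here is a minimal sketch of one common prediction-market mechanism, Hanson's logarithmic market scoring rule (LMSR). It is illustrative only and not tied to any of the pilots mentioned above; the liquidity parameter b is a value I chose for the example.

```python
import math

def lmsr_cost(quantities, b=100.0):
    """Market maker's cost function: C(q) = b * ln(sum_i exp(q_i / b))."""
    return b * math.log(sum(math.exp(q / b) for q in quantities))

def lmsr_prices(quantities, b=100.0):
    """Instantaneous price of each outcome; prices sum to 1 and read as probabilities."""
    weights = [math.exp(q / b) for q in quantities]
    total = sum(weights)
    return [w / total for w in weights]

# Two-outcome market: "project delivers on time" vs. "project slips".
q = [0.0, 0.0]
print(lmsr_prices(q))                   # [0.5, 0.5] before any trades
before = lmsr_cost(q)
q[0] += 40                              # a trader buys 40 shares of "on time"
print(round(lmsr_cost(q) - before, 2))  # what that trade costs the trader
print(lmsr_prices(q))                   # "on time" now prices above 0.5
```

The group's aggregate belief shows up as the price: as traders who expect on-time delivery buy that outcome, its price rises toward their collective probability estimate.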

Virtualization. In this case I am referring to virtualizing computing resources, not the virtual environments I mention later. It is the maturing of server virtualization, still used by too few agencies, that has enabled the frenzy around cloud computing, along with a dash of high-speed networking and the ability to manage multi-tenancy on the servers; though in 2010 there is likely to be as much or more work done with private or community clouds than with public ones.

This is a big tool for going-green supporters as well, since consolidating workloads onto fewer physical servers cuts power and cooling costs.
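
As a minimal sketch of what the consolidation side looks like in practice, here is how one might enumerate the guests sharing a single virtualization host using the libvirt Python bindings. This assumes a libvirt-managed hypervisor (e.g., KVM/QEMU) and is illustrative, not a recommendation of any particular stack.

```python
import libvirt

# Connect to the local hypervisor (assumes libvirt is installed and running).
conn = libvirt.open("qemu:///system")
try:
    for dom_id in conn.listDomainsID():   # numeric IDs of running guests
        dom = conn.lookupByID(dom_id)
        state, max_mem_kib, mem_kib, vcpus, cpu_time_ns = dom.info()
        print(f"{dom.name()}: {vcpus} vCPUs, {mem_kib // 1024} MiB in use")
finally:
    conn.close()
```

Each line of output is a separate 'server' that, pre-virtualization, would likely have been its own under-utilized physical box.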

Combine server virtualization with desktop virtualization and you start to get the incredibly big fight going on between desktop-client and remote-client providers; the shorthand would be Microsoft vs. Google. The implications are enormous in terms of technical architecture, application development, procurement, and security.

Real-time Security Situation Awareness. And speaking of security, I believe the big trend in 2010 will be away from static analysis focused on perimeter protection and toward situational awareness that enables mobile and distributed applications to run even while under attack.

This change underlies a lot of the ferment around how to rework the FISMA process. It also ties back to the thought that it is increasingly necessary to prioritize security investments based on risk, rather than trying to do everything everywhere (and thus nothing anywhere): moving from whack-a-mole security to a risk-based focus that emphasizes availability and resiliency first.

For those interested in a practical example, I would recommend looking at what the Department of State is doing in this space, which draws on the Consensus Audit Guidelines (CAG) effort put together by John Gilligan and Alan Paller, an effort I had the honor of participating in.
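
To illustrate the risk-based idea in its simplest form (this is a toy of my own, not the State Department's or CAG's actual methodology): score each asset by likelihood times impact and direct investment at the top of the list first.

```python
# Each asset: (name, estimated likelihood of compromise 0-1, mission impact 1-10).
# The assets and numbers are invented for illustration.
assets = [
    ("public web portal",  0.6, 7),
    ("internal wiki",      0.3, 4),
    ("benefits database",  0.2, 10),
    ("field laptops",      0.5, 6),
]

# Rank by risk score (likelihood x impact), highest first.
ranked = sorted(assets, key=lambda a: a[1] * a[2], reverse=True)
for name, likelihood, impact in ranked:
    print(f"{name}: risk score {likelihood * impact:.1f}")
```

Real programs replace these point estimates with continuously monitored data, which is exactly the shift from static snapshots to real-time situational awareness described above.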

Increasing Power of Mobile Network Endpoints. Cell phones and personal digital assistants (PDAs) continue to proliferate as their computing and communications capabilities increase and their interface to the Internet becomes increasingly robust and integrated.

Here also, three big arguments are being played out:

  • the previously mentioned desktop-client versus remote-client
  • commercialization versus standardization
  • data sharing versus data privacy

Each of these is being dealt with inconsistently across the Federal Government. Their resolution will produce winners and losers, organizationally and commercially.

Sensors. While I don't believe most Government agencies will necessarily pay attention to this topic, the increasing power and distribution of sensors may in fact overwhelm all of the other suggestions. They bring about two broad changes as they become ubiquitous:

  • they become participants in the network, creating an Internet of Things
  • they allow the collection of real-time data which can then be processed in real time

This latter change allows virtual environments to become increasingly commingled with physical environments (here ‘virtual’ refers to environments such as Second Life). Consider smart cities that interact with their citizens: San Francisco, where in some places you can find the location of empty parking spaces on your cell phone as you drive around, or the NYU/Cornell experiment wiring some of the rivers around New York City so their status can be checked from the web, including from your cell phone. Applications like Layar, which provide information about where you are based on web-provisioned information, will in the future pick up that information from the physical surroundings themselves, as everything acquires an IP address and/or becomes a Twitter participant.
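
The plumbing behind such applications can be remarkably simple. Here is a hypothetical sketch of a client consuming a real-time sensor feed in the spirit of the parking example; the endpoint and field names are placeholders, not a real city API.

```python
import json
import time
import urllib.request

# Placeholder endpoint: stands in for a city-published sensor feed.
FEED_URL = "https://example-city.gov/api/parking/spaces.json"

while True:
    with urllib.request.urlopen(FEED_URL) as resp:
        spaces = json.load(resp)  # assume a list of {"id": ..., "status": ...}

    # Act on the data as it arrives, e.g. surface currently open spaces.
    open_spaces = [s for s in spaces if s["status"] == "open"]
    print(f"{len(open_spaces)} open spaces reported")

    time.sleep(30)  # poll every 30 seconds; push/streaming would be the next step
```

Once every parking meter, buoy, and streetlight publishes this way, the same few lines generalize to the Internet of Things described above.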
