Web 1.0 was static: pages upon pages of content that a user searched through to find information. Web 2.0 brought a vast improvement by enabling user-generated content. Users could provide feedback on the information they were given and collaborate amongst themselves.
The next generation of the web will consist of applications that make decisions about what information to provide based on their analysis of the user and of conditions that could affect what the user wants to see. It will be a Priceline.com that discovers your flight has been delayed and sends alternate flight information to your Blackberry without you asking. These applications will be proactive in addressing their users' needs.
User Centric Web Sites
In order to better serve the public, government web sites need to find ways to do several things:
- Offer contextual data to users: Because the scope and mission of government agencies are so large, only small portions of the data they have available are useful or relevant to the average user who arrives at their web site. Unless the user is a researcher or a member of a regulated community, much of the information on government web sites is neither relevant nor remotely interesting. Successful web sites will find ways to drill through the mounds of data and serve up subsets that are relevant to the user's interests, geographic location, etc.
- Organize the data as it is relevant to the user, not as it is relevant to the organization: Many if not most sites are structured the same way as the bureaucracy of the particular agency. Information is relegated to the office that was responsible for producing it and then stove-piped within that corner of the website. Without knowledge of the bureaucracy's structure, the user cannot possibly hope to locate the information.
- Offer something beyond text search: Requiring that a user use a search box puts the onus of making the web site meaningful on the user. While good search is necessary, sites should aim to be usable at face value.
- Don't expect users to know what is available at the site: The average citizen probably has very little idea of the breadth and depth of information each agency holds. A smart site would find ways to present its information to the user, rather than burying it in deep levels of the site.
This combination of drawbacks severely limits the utility of government sites. At my day job it is estimated that over 60% of visitors arrive at the public site via a Google search. This not only means that they are not using the top layers of the site; by arriving deep into its structure they also miss out on whatever organization the site attempts. Additionally, this could indicate that a large amount of the traffic is incidental, meaning that people wound up at the site only because Google suggested that they visit.
Using new technologies to build or to supplement federal web sites could turn their current orientation on its head: a web site that actively searches for information to present, rather than one that lays itself out flat, waiting to be searched and navigated. This new kind of site would be a modern web application that gathers a small amount of information from the user and then analyzes which of the available content and data that user would be most interested in viewing.
This application would be constructed to consist of two separate parts: a pool of available data and the application that analyzes a user’s likely needs and selects data to be served from that available pool.
The data pool would be made of modular units that contain current static content, multimedia presentations, and database information that the particular agency has locked up in its site. New units could be added and subtracted as they become available or expired as they become outdated with no disruption to the application’s function.
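The pool described above could be sketched as a simple data structure. This is only an illustration, not a real agency system; the `ContentModule` schema and `DataPool` names are hypothetical, and a production version would use a content database rather than an in-memory list.

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ContentModule:
    """One self-contained unit of agency content (hypothetical schema)."""
    title: str
    topics: set                   # e.g. {"air quality"}
    region: Optional[str] = None  # e.g. "Region 10"; None = nationwide
    media_type: str = "html"      # "html", "video", "dataset", ...
    expires: Optional[date] = None

class DataPool:
    """Modules can be added or retired without disrupting the application."""
    def __init__(self):
        self._modules = []

    def add(self, module):
        self._modules.append(module)

    def active(self, today=None):
        """Return only modules that have not expired."""
        today = today or date.today()
        return [m for m in self._modules
                if m.expires is None or m.expires >= today]
```

The key property is that expiry is handled at read time, so retiring a module is just a matter of letting its date lapse; nothing in the serving application has to change.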
If a web application of this nature were used by, say, the EPA, a user making their first visit to the site would provide a small amount of information about themselves:
- I am from northwest Oregon
- I am interested in environmental news and happenings around my home
- I am interested in air quality
- I am able to view multimedia
The application would search its available data modules to find the most relevant information to present. The user would be served a web page consisting of:
- Region 10 news and events
- Information regarding a nearby Superfund site
- The latest and most relevant air quality statistics from their area
- A selection of current EPA multimedia
- Important agency news and policy initiatives that are shown to all users
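The matching step in the EPA example above could work as a simple ranking over the pool. This is a toy sketch, not a claim about how such a system would actually score content: the dict keys, the weights, and the `select_content` function are all invented for illustration.

```python
def select_content(modules, profile, limit=5):
    """Rank content modules against a user profile (hypothetical scheme).

    Each module is a dict like:
      {"title": ..., "topics": {...}, "region": ..., "media": "html"}
    """
    def score(m):
        if m["media"] == "video" and not profile.get("multimedia"):
            return None                      # skip formats the user can't view
        s = len(m["topics"] & profile["interests"])
        if m.get("region") == profile.get("region"):
            s += 2                           # local content ranks highly
        if "agency-wide" in m["topics"]:
            s += 10                          # policy news shown to all users
        return s

    scored = [(score(m), m) for m in modules]
    keep = [(s, m) for s, m in scored if s is not None]
    keep.sort(key=lambda pair: pair[0], reverse=True)
    return [m for s, m in keep[:limit]]
```

Given the northwest Oregon profile from the example, local air quality and Superfund modules would outrank content from other regions, agency-wide policy items would always appear, and video would be dropped for users who indicated they cannot view multimedia.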
Beyond providing users with rich, relevant, and contextual data, this style of web application comes with some additional benefits. By building custom views into an agency's data, the application can accommodate users in a much richer way than current web technology does.
- If a user has accessibility needs the application can configure their particular custom view to be screen reader and keyboard friendly while preserving the ability to harness rich and modern AJAX or animated display techniques for other viewers.
- The custom view can provide data that is specially formatted to be friendly to mobile browsing devices like Blackberries and iPhones.
- Custom views allow the internationalization of the application. A user could select a preferred language with which to operate the web site.
- Rather than being limited to just web pages, custom views would be capable of presenting the agency’s data through web services and APIs that would enable the public to consume the data in manner in which they found most advantageous.
- The application would bring data together from across the offices of the agency. Rather than having data distributed through the web sites of the particular offices that are responsible for it, the shared data pool would present a unified set of data to the public.
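The "custom views" idea above amounts to rendering one shared content pool through several output formats. A minimal sketch, assuming the `render` function and view names are hypothetical and that modules are simple dicts with a `title` key:

```python
import json

def render(modules, view="html"):
    """Serve the same content pool through different custom views (sketch)."""
    if view == "json":    # machine-readable output for public APIs and reuse
        return json.dumps([{"title": m["title"]} for m in modules])
    if view == "mobile":  # stripped-down text for small-screen devices
        return "\n".join(m["title"] for m in modules)
    # default: full HTML list for desktop browsers
    items = "".join(f"<li>{m['title']}</li>" for m in modules)
    return f"<ul>{items}</ul>"
```

Accessibility-friendly markup or a translated view would be further branches of the same function; the point is that the data pool stays unified while the presentation varies per user.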
Not Just A Tech Change
Making web applications like this functional would require more than the deployment of the applications themselves. Much of the data locked up in federal web sites is not suitable for public consumption. When a report on, say, a certain chemical is delivered, it will often be a 600-page document with no executive summary and no immediately identifiable chunks of information suitable for laypeople to consume. Additionally, much of the information in existing databases is not in forms that were intended to be extracted piecemeal and displayed by location or interest. Correcting this problem by supplementing government data with easily accessible chunks of information would require a top-down managerial commitment from CIOs (or a CTO, knock on wood): a culture change rather than a strictly technological one.
Above and Beyond Web 2.0
Government web sites have been slow to take hold of some of the most important Web 2.0 tools that cut their teeth on the consumer web. While there are many reasons for this, some of the biggest revolve around legal hurdles and concerns over the appropriateness of user-generated content on .gov sites.
This type of application is possibly even more suitable for government use than applications like blogs or wikis, and could produce more dramatic gains in usability. Rather than turning the operation of a website over to the users, as many Web 2.0 applications do, this generation of application allows a user to receive tailored but vetted content in a manner configured to appeal to them. It is the intersection of the customization and personalization that Web 2.0 hints at with the coordination and integrity that enterprise-level IT seeks to provide.