Understanding Web 2.0 Design

In the early years of web design, building a website was a matter of visual presentation. Animation came from GIFs and table hacks. Later, CSS separated style from structure: external CSS files defined the presentation, and the purpose of the design was to draw attention to the content. Web 2.0 brought XML, which made keywords and semantics more important than layout and style. As a result, a design now needs to catch the attention of search engines as much as that of human visitors. Put simply, designers should not stop making pages appealing, but they do need to think more like programmers in order to be successful.

The Internet continues to develop very quickly. Web 2.0 has replaced Web 1.0 both in design technique and in the way information is broken up and distributed. Web 2.0 divides information into “microcontent”: pieces of information that can be sent to many different domains.

Interfaces such as Flickr and Del.icio.us are examples of the changes occurring to domain content and the way that people store, access and share information. The interfaces of search engines, RSS aggregators, portals, web services and APIs (application programming interfaces) provide further evidence of microcontent's usefulness and popularity. As a platform for interactive microcontent, Web 2.0 has also had a profound impact on design. It is now possible to design a better interface by personalizing content or remixing it with other data in order to build tools and features that are innovative and more functional.

The first Web 2.0 trend that affects designers is the shift to semantic markup, which describes what content is rather than how it should look. HTML and XHTML are the two languages commonly used to mark up content, and designers use CSS to style the markup. These markup languages work well for simple documents, but in many Web 2.0 applications their tags cannot adequately describe the content.
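To see why semantic markup matters to machine readers, consider the sketch below. It contrasts a presentational fragment with a semantic one and extracts headings using Python's standard `html.parser`; both markup fragments are invented for illustration. A program can recover structure only from the semantic version, even though the two may look identical in a browser.

```python
from html.parser import HTMLParser

# Two hypothetical fragments with the same visual intent. A machine
# reader can only recover document structure from the semantic one.
PRESENTATIONAL = '<font size="5"><b>Quarterly Report</b></font>'
SEMANTIC = '<h1>Quarterly Report</h1><p>Sales rose in the third quarter.</p>'

HEADING_TAGS = ('h1', 'h2', 'h3', 'h4', 'h5', 'h6')

class HeadingExtractor(HTMLParser):
    """Collects the text of every <h1>-<h6> element."""
    def __init__(self):
        super().__init__()
        self.in_heading = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in HEADING_TAGS:
            self.in_heading = True

    def handle_endtag(self, tag):
        if tag in HEADING_TAGS:
            self.in_heading = False

    def handle_data(self, data):
        if self.in_heading:
            self.headings.append(data)

def extract_headings(markup):
    parser = HeadingExtractor()
    parser.feed(markup)
    return parser.headings
```

Running `extract_headings` on the semantic fragment yields the heading text, while the presentational fragment yields nothing: the `<font>` and `<b>` tags carry no meaning a program can use.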

To keep up with the changes in semantic markup, look into RSS, which uses the XML format to organize content more effectively. RSS also simplifies the process of notifying people when new content is available. Typing a site's RSS URI into a feed aggregator lets the aggregator poll the site periodically and deliver new content to users as it appears.
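The aggregator's core job can be sketched in a few lines of Python using the standard library's XML parser. The feed below is a minimal, invented RSS 2.0 document; a real aggregator would fetch it from the site's feed URI on a schedule and remember which links it has already shown.

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed, inlined for illustration. In practice an
# aggregator downloads this document from the site's feed URI.
FEED = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>Post one</title><link>http://example.com/1</link></item>
    <item><title>Post two</title><link>http://example.com/2</link></item>
  </channel>
</rss>"""

def new_items(feed_xml, seen_links):
    """Return (title, link) pairs not yet delivered to the user."""
    root = ET.fromstring(feed_xml)
    items = []
    for item in root.iter('item'):
        link = item.findtext('link')
        if link not in seen_links:
            items.append((item.findtext('title'), link))
    return items
```

On each poll, the aggregator passes in the set of links it has already delivered, so only genuinely new content reaches the reader.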

Websites in the ‘90s usually had no semantic meaning. Most pages were static HTML “brochure-ware,” and the interactive ones relied on JavaScript. As the new millennium dawned, XML technology became available and changed website design forever. It allowed content to be shared and transformed between different systems, and it let web services hook into sites. Rather than a visual interface designed around the content, a site could now expose a programming interface to the content. This means that with a web service API, anyone can build an interface to a domain's content. eBay and Amazon both illustrate how this technology is used: any participating developer can access their commercial data. Andale, for example, is an interface built on eBay that helps sellers track sales and prices.
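The idea of building your own interface on someone else's data can be sketched as follows. The JSON payload and field names here are invented for illustration; they are not eBay's or Amazon's actual API, and an Andale-style tool would fetch such data over HTTP rather than inline it.

```python
import json

# Hypothetical response of the kind a marketplace API might return.
# The structure and field names are assumptions, not a real API.
SALES_RESPONSE = json.dumps({
    "items": [
        {"title": "Vintage camera", "sale_price": 120.00},
        {"title": "Vintage camera", "sale_price": 95.50},
        {"title": "Record player", "sale_price": 60.00},
    ]
})

def average_sale_price(response_text, title):
    """One third-party view of the data: mean closing price of a listing."""
    items = json.loads(response_text)["items"]
    prices = [i["sale_price"] for i in items if i["title"] == title]
    return sum(prices) / len(prices) if prices else None
```

The point is that the marketplace supplies only raw data; the price-tracking view is a new interface created entirely by an outside developer.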

Early web design under Web 1.0 centered on creating websites. With Web 2.0, however, there is no way to confine content to a single site short of locking it down. The newer technology focuses on creating experiences: RSS lets people read content at any time in an aggregator, stripped of superfluous design. “Future searches” combine content from different sources and ignore each website's visual design, and mixing search with RSS produces feeds built around specified topics. Feedster and PubSub are excellent examples of this. Because content can be remixed online in this way, it is essential that designers learn to shift from branding websites to distinguishing the content itself.
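A “future search” of the Feedster/PubSub kind is, at its core, a standing keyword filter over items pooled from many feeds. The sketch below assumes a simplified item structure (source and title only) invented for illustration; note that the filter never looks at the source at all, which is exactly the point.

```python
# Items pooled from several hypothetical feeds. A standing topic search
# matches on the content and ignores which site each item came from.
ITEMS = [
    {"source": "blog-a", "title": "CSS layout tricks"},
    {"source": "blog-b", "title": "Cooking with basil"},
    {"source": "news-c", "title": "New CSS selectors arrive"},
]

def standing_search(items, keyword):
    """Return titles matching the topic, regardless of source or design."""
    kw = keyword.lower()
    return [i["title"] for i in items if kw in i["title"].lower()]
```

A subscriber to the topic “CSS” receives matching items from every source as they appear, without ever visiting the originating sites.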

Because of the way Web 2.0 operates, most users do not view content in its original domain. Chances are that a blog link, feed reader, aggregator or search engine delivers the content rather than the navigation designed to reveal it. This is called “distributed content.” Pathways and sources change, but content aggregators compensate by tracking the microcontent an account views and using that history to determine which content is pertinent in the future. Daypop, Del.icio.us and Blogdex feeds illustrate how past reading shapes what users are shown next. This shift toward third-party interfaces means that user behavior, more than design, will determine the navigation paths traveled.

For more information on why you need to understand Web 2.0 Design, call us at 1-800-978-3417 or Contact Active Web Group and let us help.

