Web 2.0 is both a new buzzword and real progress. In this article, I'll try to separate the myth from the reality.
This version integrates, in a very “Web 2.0” fashion, a number of comments from XMLfr editors and sparklingPoint participants, and I’d like to thank them for their contributions.
The first difficulty in forming an opinion about Web 2.0 is to delimit its scope.
When you need to say whether an application is XML or not, that’s quite easy: an application is an XML application if and only if it conforms to the XML 1.0 (or 1.1) recommendation.
That’s not so easy for Web 2.0, since Web 2.0 is not a standard but a set of practices.
In that sense, Web 2.0 can be compared to REST (Representational State Transfer) which is also a set of practices.
Fair enough, you will say, but it’s easy to tell whether an application is RESTful. Why would that be different with Web 2.0?
REST is a concept that is clearly described in a single document: Roy Fielding’s thesis, which gives a precise definition of what REST is.
On the contrary, Web 2.0 is a blurred concept that aggregates a number of tendencies, and everyone seems to have their own definition of Web 2.0, as you can see from the number of articles describing what Web 2.0 is.
If we really need to define Web 2.0, I’ll propose two definitions.
Web 2.0 is a term often used to describe what is perceived as an important transition of the World Wide Web, from a collection of web sites to a computing platform providing web applications to users. The proponents of this vision believe that the services of Web 2.0 will come to replace traditional office applications.
This article also gives a history of the term:
The term was coined by Dale Dougherty of O’Reilly Media during a brainstorming session with MediaLive International to develop ideas for a conference that they could jointly host. Dougherty suggested that the Web was in a renaissance, with changing rules and evolving business models.
It goes on to give a series of examples that illustrate the difference between the good old “Web 1.0” and Web 2.0:
Google, which launched AdSense in 2003, was doing Web 2.0 without knowing it, a year before the term was coined in 2004!
Let’s focus on the technical side of Web 2.0 first.
One of the characteristics of Web 2.0 is to be available to today’s users using reasonably recent versions of any browser. That’s one of the reasons why Mike Shaver said in his opening keynote at XTech 2005 that “Web 2.0 isn’t a big bang but a series of small bangs”.
Restricted by the installed base of browsers, Web 2.0 has no choice but to rely on technologies that can be qualified as “mature”:
- HTML (or XHTML pretending to be HTML, since Internet Explorer doesn’t accept XHTML documents declared as such) –the latest version of HTML was published in 1999.
- A subset of CSS 2.0 supported by Internet Explorer –CSS 2.0 was published in 1998.
- XML –published in 1998.
- Atom or RSS syndication –RSS was created by Netscape in 1999.
- The HTTP protocol –the latest HTTP version was published in 1999.
- URIs –published in 1998.
- REST –described in a thesis published in 2000.
The use of XML over HTTP in asynchronous mode has been given the name “Ajax”.
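As a minimal sketch of this pattern (not tied to any particular application), the page stays loaded while data is fetched asynchronously and a callback patches the result into the display. The transport is stubbed here with a hypothetical `fakeHttpGet` helper so the control flow can be read on its own; the real `XMLHttpRequest` calls a 2005 browser would use are shown in comments.

```javascript
// Stub transport standing in for XMLHttpRequest (hypothetical helper).
// In a browser, the body of this function would be:
//   const xhr = new XMLHttpRequest();
//   xhr.open("GET", url, true);  // true = asynchronous
//   xhr.onreadystatechange = () => {
//     if (xhr.readyState === 4) onSuccess(xhr.responseText);
//   };
//   xhr.send(null);
function fakeHttpGet(url, onSuccess) {
  // Simulate the asynchronous arrival of an XML response.
  setTimeout(() => onSuccess("<results><hit>example</hit></results>"), 0);
}

// The page issues the request and keeps responding to the user;
// the callback runs when the response arrives, without a page reload.
function search(query, render) {
  fakeHttpGet("/search?q=" + encodeURIComponent(query), (xml) => {
    render(xml); // in a browser: update a DOM node with the results
  });
}
```

The essential point is the split between issuing the request and handling the response: nothing blocks in between, which is what makes the user experience feel like a desktop application.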
Web 2.0 appears to be the full appropriation by web developers of mature technologies to achieve a better user experience.
If it’s a revolution, this is a revolution in the way to use these technologies together, not a revolution in the technologies themselves.
Could Web 2.0 applications replace traditional office applications? Probably not if the rule were to keep the same paradigm with the same level of features.
We often quote the famous “80/20” rule, according to which 80% of the features would require only 20% of the development effort, so that sensible applications should focus on these 80% of features.
Office applications crossed the 80/20 borderline years ago and have invented a new kind of 80/20 rule: 80% of the users probably use less than 20% of the features.
I think that a Web 2.0 application focusing on the genuine 80/20 rule for a restricted application or group of users would be tough competition for traditional office applications.
This seems to be the case for applications such as Google Maps (which could compete with GIS applications on the low-end market) or some of the new WYSIWYG text editing applications that flourish on the web.
A motivation that may push users to adopt these web applications is the attractiveness of systems that help us manage our data.
This is the case of Gmail, Flickr, del.icio.us or LinkedIn, to name a few: while these applications relieve us from the burden of technically managing our data, they also give us remote access from any device connected to the internet.
What is seen today as a significant advantage for managing our mails, pictures, bookmarks or contacts could be seen in the future as a significant advantage for managing our office documents.
If the French version of Wikipedia has the benefit of being concise, it is slightly out of date and doesn’t describe the second layer of Web 2.0, further developed during the second Web 2.0 conference in October 2005.
The English version of Wikipedia adds the following examples to the list of Web 1.0/Web 2.0 sites:
These examples are interesting because technically speaking, Wikipedia, blogs, wikis or folksonomies are mostly Web 1.0.
They illustrate what Paul Graham is calling Web 2.0 “democracy”.
Web 2.0 democracy is the idea that, to “lead the web to its full potential” (as the W3C tagline says), the technical layer of the internet must be complemented by a human network formed by its users to produce, maintain and improve its content.
There is nothing new here either and I remember Edd Dumbill launching WriteTheWeb in 2000, “a community news site dedicated to encouraging the development of the read/write web” because the “tide is turning” and the web is no longer a one way web.
This social effect was also the guideline of Tim O’Reilly’s keynote session at OSCON 2004, one year before it became the social layer of Web 2.0.
With a technical and a social layer, isn’t Web 2.0 becoming a shapeless bag in which we’re grouping anything that’s looking new on the web?
The technical layer can be seen as a consequence of the social layer: the technical layer is needed to provide the interactivity that the social layer requires.
This analysis would exclude from Web 2.0 applications such as Google Maps which have no social aspect but are often quoted as typical examples of Web 2.0.
Paul Graham tries to find common trends between these layers in the second definition that I’ll propose in this article:
Web 2.0 means using the web the way it’s meant to be used. The “trends” we’re seeing now are simply the inherent nature of the web emerging from under the broken models that got imposed on it during the Bubble.
This second definition reminds me of other taglines and buzzwords heard during these past years:
- The W3C tagline is “Leading the Web to Its Full Potential”. Ironically, Web 2.0 is happening, technically based on many technologies specified by the W3C, without the W3C… It is very tempting to interpret the recent announcement of a “Rich Web Clients Activity” as an attempt to catch a running train.
- Web Services are an attempt to make the web available to applications, something that was meant to be from the early ages of Web 1.0.
- The Semantic Web -which seems to have completely missed the Web 2.0 train- is the second generation of the web seen by the inventor of Web 1.0.
- REST is the description of web applications using the web as it is meant to be used.
- XML is “SGML on the web”, made possible by using HTTP as it was meant to be used.
Here again, Web 2.0 appears to be the continuation of the “little big bangs” of the web.
In maths, continuous isn’t the same as differentiable and in technology too, continuous evolutions can change direction.
Technical evolutions are often a consequence of changes in priorities that lead to these changes of direction.
The priorities of client/server applications that we developed in the 90’s were:
- the speed of the user interfaces,
- their quality,
- their transactional behaviour.
They’ve been swept aside by web applications whose priorities are:
- a universal addressing system,
- universal access,
- global fault tolerance: when a computer stops, some services might stop working but the web as a whole isn’t affected,
- scalability (web applications support more users than client/server ones ever dreamed of supporting),
- a relatively coherent user interface that enables sharing services through URIs,
- open standards.
Web 2.0 is taking back some of the priorities of client/server applications, and one needs to be careful that these priorities are met without compromising what makes the strength of the web.
Technically speaking, we are lucky enough to have best practices formalized in REST, and Web 2.0 developers should be careful to design RESTful exchanges between browsers and servers to take full advantage of the web.
Web 2.0 applications run in web browsers, and they should make sure that users can keep their Web 1.0 habits, especially with respect to URIs (including the ability to create bookmarks, send URIs by mail and use the back and forward buttons).
Let’s take a simple example to illustrate the point.
Have you noticed that Google, presented as a leading-edge Web 2.0 company, is stubbornly Web 1.0 on its core business, the search engine itself?
It is easy to imagine what a naïve Web 2.0 search engine might look like.
That might start with a search page similar to the current Google Suggest: when you start typing your query terms, the service suggests possible completions of your terms.
When you sent the query, the page wouldn’t move. Some animation could keep you waiting, even if that’s usually not necessary with a high-speed connection to Google. The query would be sent and the results brought back asynchronously; then the list of matches would be displayed in the same page.
The user experience would be fast and smooth, but there are enough drawbacks with this scenario that Google doesn’t seem to find it worth trying:
- The URI in the address bar would stay the same: users would have no way to bookmark a search result or to copy and paste it to send to a friend.
- Back and forward buttons would not work as expected.
- These result pages would not be accessible to search engine crawlers.
A web developer implementing this Web 2.0 application should take care to provide good workarounds for each of these drawbacks. This is certainly possible, but it requires some effort.
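To illustrate, one classic workaround for the bookmarking and back-button drawbacks (a sketch, not Google’s actual code) is to mirror the application state in the URI fragment, which browsers record in their history without reloading the page. The function names below are illustrative, not a standard API.

```javascript
// Encode the current search state into a URI fragment that can be
// bookmarked or pasted into a mail.
function stateToFragment(query, page) {
  return "#q=" + encodeURIComponent(query) + "&p=" + page;
}

// Restore the state from a bookmarked or pasted URI fragment.
function fragmentToState(fragment) {
  const params = {};
  fragment.replace(/^#/, "").split("&").forEach((pair) => {
    const [key, value] = pair.split("=");
    if (key) params[key] = decodeURIComponent(value || "");
  });
  return { query: params.q || "", page: Number(params.p || 1) };
}

// In a browser, after each asynchronous search one would set
//   window.location.hash = stateToFragment(query, page);
// and watch for hash changes to replay the search when the user
// presses Back or Forward, or follows a bookmark.
```

Because only the fragment changes, the page is not reloaded, yet each search gets a distinct, shareable URI and a distinct history entry.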
Falling into these traps would be really counterproductive for Web 2.0, since, as we have seen, ergonomics is precisely what justifies this evolution: making the web easier to use.
The last point on which one must be careful when developing Web 2.0 applications is development tools.
The flow of press releases by software vendors announcing development tools for Ajax-based applications may put an end to this problem, but Web 2.0 often means developing complex scripts that are subject to interoperability issues between browsers.
Does that mean that Web 2.0 should ignore declarative definitions of user interfaces (such as XForms, XUL or XAML), or even the 4GLs that were invented for client/server applications in the early 90s?
Catching up with the popular “Ruby on Rails”, web publication frameworks are beginning to propose Web 2.0 extensions.
This is the case of Cocoon, whose new version 2.1.8 includes support for Ajax, but also of Orbeon PresentationServer, whose version 3.0 includes fully transparent support for Ajax through its XForms engine.
Published in 2003, XForms is only two years old, way too young to be part of the Web 2.0 technical stack… Orbeon PresentationServer is a nifty way to use XForms before it can join the other Web 2.0 technologies!
What about the business model?
The definition of Paul Graham, for whom Web 2.0 is a web rid of the bad practices of the internet bubble, is interesting when you know that some analysts believe that a Web 2.0 bubble is on its way.
This is the case of Rob Hof (BusinessWeek), who makes a two-step argument:
1) “It costs a whole lot less to fund companies to revenue these days”, which Joe Kraus (JotSpot) explains by the fact that:
- “Hardware is 100X cheaper”,
- “Infrastructure software is free”,
- “Access to Global Labor Markets”,
- Internet marketing is cheap and efficient for niche markets.
2) Even though venture capital investment seems to stay level, cheaper costs mean that many more companies are being funded with the same level of investment. Furthermore, cheaper costs also mean that more companies can be funded outside of VC funds.
Rob Hof also remarks that many Web 2.0 startups are created with no other business model than being sold in the short term.
Even if it is composed of smaller bubbles, a Web 2.0 bubble might be on the way…
Here again, the golden rule is to profit from the Web 1.0 experience.
Data Lock-In Era
If we need a solid business model for Web 2.0, what can it be?
One of the answers to this question was in the Tim O’Reilly keynote at OSCON 2004 that I have already mentioned.
Giving his views on the history of computer technologies since their beginnings, Tim O’Reilly showed how this history can be split into three eras:
- During the “Hardware Lock-In” era, computer manufacturers ruled the market.
- Then came the “Software Lock-In” era dominated by software vendors.
- We are now entering the “Data Lock-In” era.
In this new era, illustrated by the success of sites such as Google, Amazon, or eBay, the dominating actors are companies that can gather more data than their competitors and their main asset is the content given or lent by their users for free.
When you outsource your mail to Google, publish a review or even buy something on Amazon, upload your pictures to Flickr or add a bookmark to del.icio.us, you tie yourself to that site and trade a service against their use of your data.
Faced with these falsely free services, users should be careful:
- to trade data against real services,
- to demand technical means, based on open standards, to get their data back.
What are the conclusions of this long article?
Web 2.0 is a term that qualifies a new web emerging right now.
This web will use the technologies that we already know in creative ways to develop a collaborative “two way web”.
Like any other evolution, Web 2.0 comes with a series of risks: technical, ergonomic, financial and threats against our privacy.
Beyond the marketing buzzword, Web 2.0 is a fabulous bubble of new ideas, practices and usages.
The fact that its shape is still so blurred shows that everything is still open and that personal initiatives are still important.
The Web 2.0 message is a message of hope!