Are You Ready For Web 3.0?

By Matthew Magain

Ok, so I admit it — the title of this post was indeed meant to draw you in. But before you jump straight to the comments form to leave your cynical wisecracks, hear me out.

Granted, we’ve struggled until now to even agree on what the term “Web 2.0” means (although SitePoint’s own Kevin Yank has done a pretty admirable job of explaining the term in the past). As a result of its overuse, anyone who doesn’t work in marketing now physically cringes whenever they see or hear it. Which is a shame, in my opinion — a term that could potentially be used in a positive way, to communicate a new phase of the Web, is instead being used to poke fun at the tech boom (yours truly, guilty as charged) and to mock those who generate more spin than substance.

So what if there was a way for the term to be more clearly defined? What if we were to recognize that the Web is actually an ecosystem that is evolving, and that a version number — when applied to the Web — is as good a way as any to describe the various phases of its evolution? How about a version number that increments with each decade of the Web?

Entrepreneur, blogger and technology visionary Nova Spivack thinks that this makes sense, and that the ultimate goal should be the realization of a Semantic Web, where data is universally searchable and understandable by both humans and machines. The diagram above illustrates where he thinks the Web is headed, and his startup company, Radar Networks, has been developing a proprietary platform for consumers to take advantage of this “next-generation” Web.

Regardless of whether you believe that the Semantic Web is all hype, or whether such a platform should in fact be proprietary, there’s no denying that this is an interesting space to keep an eye on. Microformats and tagged data are bringing more semantic meaning to the Web, and beyond Web 3.0 things get pretty interesting.
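As a concrete example of the microformats mentioned above, an hCard marks up contact details with an agreed vocabulary of class names, so a parser can extract structured data from an ordinary web page. The person, company and URL below are invented; only the class names come from the hCard specification:

```html
<!-- hCard microformat: the class names (vcard, fn, url, org, adr,
     locality) are the published vocabulary; the data is made up. -->
<div class="vcard">
  <a class="url fn" href="http://example.com/">Jane Example</a>,
  <span class="org">Example Widgets</span>,
  <span class="adr"><span class="locality">Melbourne</span></span>
</div>
```

A browser renders this as ordinary text and links, while a microformat-aware tool can pull a contact record out of the same markup with no extra API.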

Unless you have a better name for it?

  • Semantic web is the future? Where’s the fun in that???

    On the other hand, narrowing the gap between desktop and web is interesting (Apollo and Slingshot are just the beginning). Drag and drop between web and desktop apps, direct upload, rich content editing controls, HTML5… Now, that is something that I would like to play with!

  • Andrew Rickmann

    The problem with Web 2.0 and its next evolutions is that it suggests that the web as a whole is an iterative thing. It isn’t.

    The web is made up of many different people doing many different things, and any term such as Web 2.0 or Web 3.0 does all of these groups a disservice.

    The semantic web revolves around data, whereas the social web revolves around people. A site may fall into one, both, or neither; so where does it stand in the global evolution of the web? 2.0? 3.0? 2.5? 2.75? Where do you stop?

  • bungle

    Web 1.0: thin clients
    Web 2.0: fat clients
    Web 3.0: thin clients
    Web 4.0: fat clients

  • I think that the diagram is great. It’s a good tool to help better explain what Web 2.0 really means.

    As developers, it’s easy to get annoyed when marketing people go overboard with a term or idea. However, in many ways it can be a good thing. Much of the current success of internet-based companies is due to the buzz about Web 2.0.

    It’s easy to get frustrated and point out where they’re wrong from a technical standpoint, but it doesn’t change the fact that this Web 2.0 craze has helped lead to a lot of great tools and applications.

    Interesting prediction of how the technology can evolve, but if we keep calling this Web *.0, we’re just continuing to hype the term. Can someone come up with a less catchy naming convention? ;)

  • cob

    Hey junjun… how about, oh, I don’t know, “the web”. ;)

    It’s outrageous, I know, but I like to be on the cutting edge.

  • Anonymous

    i believe in the semantic web but it isn’t going to happen for a long time yet… even in this day and age, you have people (see highlighted thread in forums home page) still asking if we need web standards or not.

    of course we need them, but then again you have those who add to the confusion… html5? that ain’t a -beep- standard, but with the trash going about you’d think it were :(

    so, yes, we need to let those out there know about web standards first, get them taught the hows and the whys first, before you (or anyone else) make claims about web number two…

    a bemused dr livingston

    Even worse than the whole version numbers issue is the term “WebOS”.

  • first let web 2 come in and then talk about web 3 lol

  • jobe

    One key thing will be to limit the damage machines already do to search engines. I’m forever finding stuff I’ve written garbled up and put on some aggregate site that is obviously run by a poorly written machine. The damn things end up with a better Google rank than I do, so my stuff just gets buried within a few weeks. I think most of these junk sites don’t really care what kind of damage they do to organizing content so long as real people stumble onto them and give them traffic.

  • gout

    Technology is created to meet a demand. Unless you can predict what the demand will be in the future, you will be unable to predict the technology that will meet it. I would bet that the person who made the above graph has no more idea what future technology demand will look like than anyone else. But keep trying nonetheless.

    Also, without a standards-based web, future technology (demand or not) will have a difficult time coming to fruition on the web.

  • Matt

    Agree about the numbering; it’s all marketing rubbish. I think the big issue is convincing the web world that web standards are the prerequisite to everything to come in the future. People will always use the latest ‘toys’ that come along, but it’s all just a passing fad unless the standards on the web are improving. It would be great if the term “web standards”, and all it stood for, were a more popular buzzword than Web 2.0.

  • malikyte

    @Jobe: If you’re getting content ripped from you, add a (hidden via CSS?) link back to your site as the source of the content. It’s not an end-all solution, but it is free advertising if you use it to your advantage.
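    A minimal version of this suggestion might look like the fragment below. The class name, inline style and URL are all hypothetical, and note one caveat: search engines may discount or penalize hidden links, so a visible credit line is the safer variant.

    ```html
    <!-- Appended to each post's markup; scrapers that copy the HTML
         wholesale carry the link back. URL and class are made up. -->
    <p class="source-credit" style="display: none;">
      Originally published at
      <a href="http://example.com/my-original-post">example.com</a>
    </p>
    ```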

  • Anonymous

    > I think the big issue is convincing the web-world that web standards are
    > the pre-requisite to everything to come in the future.

    that is the job of web developers in my view, and not for those in marketing who have absolutely no idea about software development… they should just butt out of it.

    it is we the developers who advise our clients on what best suits their needs. trying to convince clients that standards are the way to go, even if it increases development time and costs (to a certain degree at least), is well worth it in the long run. it isn’t easy, but then it never is, is it?

    what makes our lives more difficult is some -beep- in the media making a song and dance about hyped technology that the client then reads about and wants, even though it’s not in their interests.

    i’m still bemused… dr livingston

  • George A. Maney

    There are many miracles on this map. Semantic databases, semantic search, SaaS, SOA, and such require very high levels of Semantic Web information quality and information safety to be more than trivially useful. This just isn’t going to happen any time soon.
    Note that the Internet document web is information-unsafe. Even so, it works great. The Internet data web is different. Any workable, worthwhile Internet data web must be intrinsically information-safe.
    All Internet data web architectures so far start with metaphysical information modeling architectures. Today all mainstream information modeling is metaphysical. Entity-relationship and object-oriented are the predominant forms. RDF and all alternatives are just alternative flavors of metaphysical pattern modeling.
    Metaphysical modeling is intrinsically low quality and thus intrinsically unsafe. This is readily demonstrable. So metaphysical information modeling mashup interoperability, insurability, and immortality cannot be modeled or managed. This is a killer. It eliminates nearly all customer value potential in data web model mashups.
    Today’s best institutional data processing operations are a sanity check. Today these are severely limited in scope and scale by workable information safety and quality limits. Model mashups within and among software packages require ruinously expensive recurrent reverse engineering. Most high-value mashups are impractical or infeasible. Those mashups that are done often suffer from reliability problems.
    Any workable data web build-out will, in effect, be a huge worldwide data center. This will be millions of times larger than the largest data center operations today. It will involve myriad thousands of independent modeling contexts and myriad millions of models. This simply isn’t going to fly with any metaphysical information modeling approach.
    Today alternative mechanistic subject modeling methods are limited to the applied-science automation software world. These scale without limit and provide fully manageable information safety and quality. Any workable Internet data web must and will ultimately adopt these alternative methods.
    Today these mature methods are unknown in the mainstream software world. There is no commodity infrastructure support for this sort of modeling. Moreover, this sort of modeling is incompatible with the huge legacy of SQL RDB data maintained today.
    So for the foreseeable future the Internet data web will be limited to a relatively small range of tactical applications that can tolerate information unsafety. These will provide some trivial value. The mother lode of Internet data web innovation value, amounting to at least a trillion dollars in financial market capitalization, will remain far out of reach.

  • Tim

    I agree with Jobe; Google starts losing its edge when the first page of results (from well-selected search terms) only produces conglomerate advertising pages that have no real content except what they scraped from other sites.

    I’m not really sure what George was saying…sounded like “data quality sucks, and thus a semantic web is meaningless”. But he said it so much better than that, which is why his version will get semanticized and mine won’t. :-)

  • dX-Xel

    hmm…I think I need to plan to develop web 4.0..hhehee

  • Net World

    Web 3.0 seems to be the actual implementation in real life. Tim Berners-Lee’s imagination is now going to rule actual life.

    Hats off to you, Tim

    S.N. Jha
