For a lot of people, the term “Web 2.0” ceased to mean anything real a long time ago. For some, it never really meant anything to begin with. As someone who writes about the so-called second version of the web for a living, I think I’ve held onto the Web 2.0 term as long as I could. But today, “Web 2.0” has officially jumped the shark for me. That doesn’t mean I’ll stop using it — as a blanket term to describe the industry I write about, it can be helpful — but I have to admit that it has now become something of a parody.
Defining Web 2.0 has been something like a fun parlor game for a few years now. There’s a long history of people trying to come up with a unified definition of Web 2.0. But like the elusive theory of everything in physics, a single, agreed-upon definition of what Web 2.0 really means has been hard to come by.
Probably the most widely accepted definition is Tim O’Reilly’s compact definition: “Web 2.0 is the business revolution in the computer industry caused by the move to the Internet as a platform, and an attempt to understand the rules for success on that new platform.”
But even O’Reilly’s definition has changed and evolved to get to that point.
So what caused me to finally admit that Web 2.0 has jumped the shark? It was waking up today, and finding a link to this story at PC World, a very mainstream computer publication: Web 2.0 Tactics for Successful Job-Hunting.
Among the “Web 2.0 tactics” that PC World recommends: letters of recommendation, staying current with your skills, and networking. Isn’t that how people have been searching for jobs nearly forever? What the heck is “Web 2.0” about that? The only item on the list that could be even mildly considered to have some sort of tie-in with what we generally like to think of as Web 2.0 was “Upgrade your online image,” in which the authors recommend joining relevant online social communities like LinkedIn and Twitter, blogging, and making sure your profiles at other social sites are clean of college party photos.
In other words: Web 2.0 is now a mainstream marketing term. In reality, Web 2.0 has always been a marketing term. O’Reilly’s company, which owns the trademark on the term, uses it to promote its hugely successful web-focused conference series, for example. But until today, I hadn’t actually seen it applied in a way that so blatantly targets a mainstream audience in an effort to make something rather dull appear more hip (I’m sure it’s happened before; this was just the first time I’ve seen it).
All that said, the confusion over Web 2.0 — whatever it means and however it is now being used — has been helpful.
Last April, I wrote that there really is no such thing as Web 2.0, or Web 3.0 for that matter — there is just the web. “Web 2.0 and Web 3.0 — they don’t really exist. They’re just arbitrary numbers assigned to something that doesn’t really have versions,” I said. “But the discussion that those terms have prompted has been helpful, I think, in figuring out where the web is going and how we’re going to get there; and that’s what is important.”
I think that’s still true, and as long as we continue to have that discussion and attempt to define these nebulous ideas, we’ll continue to get value from it. I wrote in April that instead of telling people I write about Web 2.0, I’d tell them that I “write about the web, what you can do with it now, and what you’ll be able to do with it in the future.” I haven’t done a very good job of keeping that promise, but I still like the idea.