At the start of the decade, the total storage required for every file on the web was estimated to have reached five zettabytes. Within a few years, the world will be generating fifty zettabytes of data every year. A zettabyte is one billion terabytes, or 10²¹ bytes. It's a lot, and hard disk manufacturers cannot keep pace with demand.
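For anyone who distrusts numbers that large, a quick Python sketch confirms the conversion quoted above (one zettabyte is a billion terabytes, i.e. 10²¹ bytes):

```python
# Express one zettabyte in bytes and in terabytes.
ZETTABYTE = 10 ** 21   # bytes
TERABYTE = 10 ** 12    # bytes
BILLION = 10 ** 9

# A zettabyte really is a billion terabytes...
assert ZETTABYTE == BILLION * TERABYTE

# ...so 50Zb per year works out at:
print(f"{50 * ZETTABYTE // TERABYTE:,} terabytes per year")
```

That is fifty billion terabyte drives annually, which rather supports the manufacturers' plight.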
The numbers are not surprising when you consider that more than four people are born every second, which comfortably outstrips disk drive production. The 50Zb estimate would be exceeded if everyone on the planet uploaded just two photographs annually.
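Readers who enjoy verifying such figures can run the arithmetic themselves. A rough back-of-envelope sketch (the eight-billion population and the ~4Mb-per-photo figure are my own assumptions, not from the taskforce):

```python
# Back-of-envelope check of the annual photo-upload volume.
POPULATION = 8 * 10 ** 9        # assumed world population
PHOTOS_PER_YEAR = 2             # per the claim above
BYTES_PER_PHOTO = 4 * 10 ** 6   # assumed ~4Mb per JPEG
ZETTABYTE = 10 ** 21

total = POPULATION * PHOTOS_PER_YEAR * BYTES_PER_PHOTO
print(f"{total / ZETTABYTE:.6f}Zb per year")
```

Whether the result supports the 50Zb claim is left as an exercise; this is, after all, an article soliciting spurious statistics.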
I interviewed Prof. Ali Lo who heads the Taskforce Utilizing Redundant Disk Space:
The web is at crisis point. It cannot cope with the sustained influx of new data. We have deleted temporary files, backed up to the cloud, archived to DVDs and defragmented several times. Nothing helps: every byte is used the moment it's freed.
Web Usage Breakdown
Sir Tim Berners-Lee devised the web to share research information in hyper-linked documents. Within twenty-five years, it has come to consist of:
- 28.65% pictures of cats
- 16.80% vain selfies
- 14.82% pointless social media chatter
- 12.73% inane vlogger videos
- 9.76% advertising/clickbait pages
- 8.70% scams and cons
- 4.79% articles soliciting spurious statistics
- 0.76% documents for the betterment of human knowledge
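The breakdown above can be tallied in a few lines of Python (the category names are abbreviated; the percentages are exactly those listed):

```python
# Tally the published web-usage breakdown.
web_usage = {
    "pictures of cats": 28.65,
    "vain selfies": 16.80,
    "social media chatter": 14.82,
    "vlogger videos": 12.73,
    "advertising/clickbait": 9.76,
    "scams and cons": 8.70,
    "spurious statistics": 4.79,
    "betterment of human knowledge": 0.76,
}

total = round(sum(web_usage.values()), 2)
print(f"{total}% accounted for, {round(100 - total, 2)}% unexplained")
```

Note that the categories fall just short of 100%; the taskforce has yet to account for the remainder.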
Prof. Ali Lo stated:
There are now more pictures of cats on the web than there are cats.
Social Media Stockpile
The success of social media has exponentially increased data usage. People are expected to document their daily routine with comments, photographs and videos. Much of this content is never viewed by anyone, but it remains stored forever. If a post does go viral, the same data is replicated thousands of times by different people across different networks.
The situation will worsen when Facebook’s Fetus project is launched. The ambitious service will automatically open an account, record a timeline and send status updates before you are born.
Weighty Web Pages
Web developers must share the blame. The average page reached 2Mb at the end of 2014. The problem is exacerbated by new fonts, high-resolution images and large client-side libraries. Prof. Ali Lo commented:
By the end of the decade, the code required for a single web page will exceed the size of the browser application used to render it.
The only solution is to legally enforce a 100Kb page weight limit and resort to OS fonts such as Comic Sans.
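If a 100Kb limit were ever enforced, auditing compliance would at least be trivial. A minimal sketch (the threshold comes from the professor's proposal; the function name and the illustrative asset breakdown, chosen to total the article's 2Mb average, are my own):

```python
# Hypothetical page-weight audit: sum the byte sizes of a page's
# assets and flag anything over the proposed 100Kb limit.
PAGE_WEIGHT_LIMIT = 100 * 1024  # 100Kb in bytes

def audit_page(assets: dict[str, int]) -> bool:
    """Return True if the combined asset weight is within the limit."""
    total = sum(assets.values())
    print(f"Total: {total / 1024:.1f}Kb of {PAGE_WEIGHT_LIMIT // 1024}Kb allowed")
    return total <= PAGE_WEIGHT_LIMIT

# An "average" 2Mb page from late 2014 fails spectacularly:
average_2014_page = {
    "html": 56 * 1024,
    "css": 63 * 1024,
    "js": 295 * 1024,
    "images": 1634 * 1024,
}
audit_page(average_2014_page)  # a 2Mb page is twenty times over the limit
```

Legislating the switch to Comic Sans is, regrettably, beyond the scope of this sketch.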
How Long Do We Have?
Current estimates indicate that one more viral post will push the web into a storage abyss. According to Prof. Ali Lo:
The web is on a precipice. All it needs is another photo of a blue/brown — not white/gold — dress to go global. We will be plunged into chaos.
Please spread the word and send this article to everyone you know before it’s too late.
Craig is a freelance UK web consultant who built his first page for IE2.0 in 1995. Since that time he's been advocating standards, accessibility, and best-practice HTML5 techniques. He's created enterprise specifications, websites and online applications for companies and organisations including the UK Parliament, the European Parliament, the Department of Energy & Climate Change, Microsoft, and more. He's written more than 1,000 articles for SitePoint and you can find him @craigbuckler.