As a web developer and someone who was borderline raised by the web, this is a topic that’s close to my heart. Those of us born before 2000 have watched the web grow in prominence, fueled by the rise of the social media giants, from a playground for a tech-savvy minority into an integral part of daily life. Today, web technologies connect, inform, and entertain. They play kingmaker, offering those willing to embrace them a pathway to success through many avenues of wealth generation and self-improvement. Yet the web now finds itself at a crossroads, criticized by many for having deviated from its founding principles. With that in mind, many people are hard at work building the next iteration of the web, offering new opportunities to those who can see beyond the horizon and become the Facebooks and Googles of tomorrow. But they can only do that by analyzing the web as a project: seeing where it has come from and what seeds have been sown for the web of tomorrow.
In the beginning, the web was static. In this first epoch, known more commonly as Web 1.0, the web was little more than an interlinked mesh of raw HTML files connected by hyperlinks. There was little to no interactivity, which meant that the vast majority of web users were consumers who all saw the same content, with only a minority creating it. This era had its advantages — the web was mostly decentralized (hold onto that, because it will be important later), which meant that anyone who knew HTML could spin up a server and host their own site, or have it hosted through an ISP. Much of its underlying software was also open-source, and many of the early innovations came from developers being able to build on top of existing software without having to get permission from the original authors. To a modern netizen, the web of the 90s and early 2000s might look bland, with very basic design and none of the interactivity we see today, but it served its purpose, and the relatively low barrier to entry made adoption easy, allowing the web to grow into what we see today.
Why does the web need to change?
We can explore this by first touching on the principles upon which the web was originally built. Accessibility was at its core, with the hope being that the web would enable communication, cooperation, and knowledge distribution on a scale never seen before. When Sir Tim Berners-Lee built the web in the early 90s, it was nothing more than a single web page served from his desktop computer, and the hope was that anyone with the right skills would be able to freely publish content and have it accessible over the internet. While the web has indeed become widely accessible, it has done so through an ever-shrinking number of providers who exert an increasing amount of control over who can publish and what they can publish. They have laid infrastructure that lets anyone establish an online presence with incredible ease, yet this has come at a cost for ordinary users. The egalitarian ideals that guided early web development, from open-source software to equal access and a platform for all, regardless of whether their opinions were widely accepted, seem to be disappearing, and something not entirely recognizable has taken their place.
As mentioned above, the web today has become more centralized than ever. This is not bad in itself — these companies have brought immense value to their billions of users. The problems, as mentioned earlier, are those of power, trust, and flexibility. For example, when OnlyFans announced in August 2021 that it would ban explicit content (a decision it quickly reversed under pressure), it showed just how much power these platforms now hold over their users. And because these platforms lock their users in, there was no simple way for creators to migrate their content and followings to a different platform, meaning they might have had to rebuild their businesses from the ground up. These companies also collect and store massive amounts of user-generated data, giving users little to no say over what is collected and how it is used. This has fueled the growth of an advertising-driven business model that generates massive wealth which, many argue, has disproportionately benefited the companies over their users. All this collected data also leaves users at risk of exposure, as has happened on so many occasions in recent years when bad actors gained access to millions of users’ records. To top it all off, because these platforms have the power to silence or boost certain voices, we must simply trust that their interests are aligned with those of the general public. These are just some of the problems that the next iteration of the web was envisioned to solve.
If you have been on the web lately, you have probably encountered the term Web3. There are many definitions of Web3 floating around, but what they all have in common is the vision of a better, more decentralized web where control is wrested from the Big Tech companies and returned to the web’s users. At its core, Web 2.0 relies on trust — we trust Google to be unbiased when we search for something, we trust Facebook and Twitter with our personal information, and we trust Big Tech not to pull the rug out from under us when we depend on their platforms for our livelihoods. Web3, on the other hand, is being built atop zero- or low-trust protocols such as Bitcoin and Ethereum. These platforms allow, for example, money transfers with no middlemen, using consensus algorithms that give every participant a say. Where Web 2.0 was characterized by walled gardens and extensive incompatibility, Web3 aims to usher in an age of standardization reminiscent of Web 1.0. We can see this standardization in action already — smart contracts are standardized programs that run on blockchains, NFTs are standardized non-fungible digital assets, and tokens are standardized fungible digital assets. This standardization means that, because your data will live on the blockchain, you’ll be able to simply retrieve it, take it with you, and continue on a different platform. Web3 also aims to solve the problem of identity by letting users control their digital identity instead of entrusting it to an unknowable third party and hoping to God that it doesn’t get stolen by hackers or used to manipulate them. With this power, users can decide how much of their personal information is accessible, keep track of who has it and what they are using it for, and revoke access when they see fit.
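To make the zero-trust idea a little more concrete, here is a toy sketch of the core trick behind blockchains: each block embeds the hash of the block before it, so anyone can verify the whole history without trusting a central record-keeper, and tampering with any past entry is immediately detectable. This is purely illustrative (the function names are my own, and real systems like Bitcoin add proof-of-work, signatures, and peer-to-peer consensus on top):

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    # Deterministically hash a block's contents.
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def build_chain(records: list) -> list:
    chain, prev = [], "0" * 64  # dummy hash for the genesis block
    for record in records:
        block = {"data": record, "prev_hash": prev}
        chain.append(block)
        prev = block_hash(block)  # the next block must commit to this hash
    return chain

def verify(chain: list) -> bool:
    prev = "0" * 64
    for block in chain:
        if block["prev_hash"] != prev:
            return False  # link broken: history was altered somewhere upstream
        prev = block_hash(block)
    return True

chain = build_chain(["alice pays bob 5", "bob pays carol 2"])
assert verify(chain)

chain[0]["data"] = "alice pays mallory 500"  # try to rewrite history
assert not verify(chain)  # every later link now fails verification
```

Because each hash depends on everything before it, a reader only needs the chain itself (plus, in real systems, the network's consensus on its tip) to check that nobody has quietly rewritten the record.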
Future users will also have the opportunity to partly own online platforms through DAOs (Decentralized Autonomous Organizations) where token holders vote and make decisions based on their holdings. We can see that Web3 has so much promise, even though it does have its critics.
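The token-weighted voting at the heart of a DAO can be sketched in a few lines. This is a hypothetical, simplified illustration (real DAOs implement this logic in on-chain smart contracts, with quorums and proposal lifecycles); here, each holder’s vote simply carries weight equal to their token balance:

```python
def tally(balances: dict, votes: dict) -> str:
    """Return the winning option, weighting each vote by token holdings."""
    totals = {}
    for holder, choice in votes.items():
        # A holder's voting power is their token balance (0 if they hold none).
        totals[choice] = totals.get(choice, 0) + balances.get(holder, 0)
    return max(totals, key=totals.get)

balances = {"alice": 100, "bob": 30, "carol": 30}
votes = {"alice": "yes", "bob": "no", "carol": "no"}

# alice's 100 tokens outweigh bob and carol's combined 60,
# so "yes" wins despite being a minority of voters.
assert tally(balances, votes) == "yes"
```

The example also hints at a common criticism of DAOs: because votes are weighted by holdings, large token holders can dominate governance, which is one reason Web3 has its skeptics.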
Looking Beyond The Web3 Hype
While the Web3 developer community is working hard to make this vision a reality, we can also look at how the web might evolve in the near future. Web technologies have become far more powerful than many people realize. Many of the applications we use daily, on both our phones and computers, are built with cross-platform frameworks like React Native, Flutter, and Electron, the first and last of which are rooted in web technology. This matters because the web has become the vehicle for the rise of the SaaS (Software as a Service) business model: a form of cloud computing and software delivery in which users access software over the internet, usually via a subscription. And because it is built on web technology, this software can run in any browser on any device. Users can access powerful coding (CodeSandbox), design and animation (Figma, Rive, Framer), and document editing (Office 365, Google Docs) tools, to name a few, on low-cost hardware. For example, through their Chromebooks, Google offers users cheap hardware that harnesses the immense power of the cloud. This does risk further consolidating the web, as users must place more trust than ever in the hands of the few who either own the hardware backbone of the internet (AWS, for example) or can afford to rent the hardware required to deliver their software as a service; still, there is no doubt that there will be tangible benefits for customers. Another issue is that this model will probably alienate those in less developed countries, where internet access is not as ubiquitous as in developed nations. Yet, as the prevalence of game, music, and video streaming shows, this model will only become more popular, likely helping to bring about the meme-world idea of a future where we will own nothing and be happy.
What the Future Holds
What we see, then, is a tug-of-war between competing ideas. On one hand, we can relinquish control to Big Tech and enjoy all the amenities that come with that; on the other, we can take back control and see where that leads. Should we choose the first path, instead of Web3 we get something more like Web 2.5, an extension of what we already have. If we choose the latter, we should still heed what Stephen Diehl says in his blog post Web3 is Bullshit: “Technology won’t save us from having to ask the hard questions of who should have the power to control our digital lives”. And even as Web3 is ushered in, there is evidence that today’s financial powerhouses are already moving to co-opt this new technology to maintain control. The question of regulation will also continue to cast a long shadow: a web filled with hatred and bigotry will only alienate most people, which means that decisions about what gets to stay online and what should be removed will still need to be made if the web is to remain as social and accessible as it is today. Regulation matters in another sense, too, as we have seen how vulnerable the less tech-savvy early adopters of blockchain technology are, with endless reports of people falling prey to pump-and-dump schemes and many other exploits.
As tech enthusiasts who get our kicks out of popping the hood to figure out how new technologies work, we may be liable to overestimate how much ordinary people care about how their technology works. What we see, in reality, is widespread adoration of Apple’s “it just works” philosophy, which lets people simply enjoy their gadgets without getting their hands dirty. With that in mind, I am inclined to believe that a Web3 where people must do more of the grunt work of setting up their online presence, keeping themselves secure, and managing their own identity will have far less appeal than one where things just work. Yet the cost of the latter is living with the knowledge that powerful entities not only hold our data and know the tiniest details of what we do online, but also control what we can and cannot see, and have the power to banish us from all that the web has to offer.
Amidst all this uncertainty, we have no reason to believe that the web (or any technology, for that matter) will stop evolving. As I said earlier, there is no clear cut-off point between Web 2.0 and Web3. In many ways, Web3 is already here, yet adoption will determine whether the vision outlined by its proponents solidifies. Meanwhile, Web 2.0 pioneers like Mark Zuckerberg are hard at work building their own vision of what the next epoch of the web should be. What’s certain is that, as web technologies have become more powerful, they have also become more indispensable to us, ensuring that whichever way the cookie crumbles, there’s something at stake for all of us.