The Third-Generation Web is Coming
Permanent link to this article: http://www.kurzweilai.net/meme/frame.html?main=/articles/art0689.html
Web 3.0, expected to debut in 2007, will be more connected, open, and intelligent, with semantic Web technologies, distributed databases, natural language processing, machine learning, machine reasoning, and autonomous agents.
Published on KurzweilAI.net December 17, 2006.
The Web is entering a new phase of evolution. There has been much
debate recently about what to call this new phase. Some would prefer
not to name it at all, while others suggest continuing to call it "Web
2.0." However, this new phase of evolution has quite a different
focus from what Web 2.0 has come to mean.
John Markoff of the New York Times recently suggested naming this third generation of the Web "Web 3.0." This suggestion has led to quite a bit of debate within the industry. Those who are attached to the Web 2.0 moniker have reacted by claiming that such a term is not warranted, while others have responded positively to the term, noting that there is indeed a characteristic difference between the coming new stage of the Web and what Web 2.0 has come to represent.
The term Web 2.0 was never clearly defined, and even today, if one asks ten people what it means, one will likely get ten different definitions. However, most people in the Web industry would agree
that Web 2.0 focuses on several major themes, including AJAX, social
networking, folksonomies, lightweight collaboration, social bookmarking,
and media sharing. While the innovations and practices of Web 2.0
will continue to develop, they are not the final step in the evolution
of the Web.
In fact, there is a lot more in store for the Web. We are starting
to witness the convergence of several growing technology trends
that are outside the scope of what Web 2.0 has come to mean. These
trends have been gestating for a decade and will soon reach a tipping point. At that juncture the third generation of the Web will begin.
A more intelligent Web
The threshold to the third-generation Web will be crossed in 2007. At this juncture the focus of innovation will start to shift back from front-end improvements towards back-end, infrastructure-level upgrades to the Web. This cycle will continue for five to ten years, and
will result in making the Web more connected, more open, and more
intelligent. It will transform the Web from a network of separately
siloed applications and content repositories to a more seamless
and interoperable whole.
Because the focus of the third-generation Web is quite different
from that of Web 2.0, this new generation of the Web probably does
deserve its own name. In keeping with the naming convention established
by labeling the second generation of the Web as Web 2.0, I agree
with John Markoff that this third generation of the Web could be
called Web 3.0.
A more precise timeline and definition might go as follows:
Web 1.0. Web 1.0 was the first generation of the Web. During
this phase the focus was primarily on building the Web, making it
accessible, and commercializing it for the first time. Key areas
of interest centered on protocols such as HTTP, open standard markup
languages such as HTML and XML, Internet access through ISPs, the
first Web browsers, Web development platforms and tools, Web-centric
software languages such as Java and Javascript, the creation of
Web sites, the commercialization of the Web and Web business models,
and the growth of key portals on the Web.
Web 2.0. According to Wikipedia, "Web 2.0, a phrase coined by O'Reilly Media in 2004, refers to a supposed second generation of Internet-based services—such as social networking sites, wikis, communication tools, and folksonomies—that emphasize online collaboration and sharing among users." I
would also add to this definition another trend that has been a
major factor in Web 2.0—the emergence of the mobile Internet
and mobile devices (including camera phones) as a major new platform
driving the adoption and growth of the Web, particularly outside
of the United States.
Web 3.0. Using the same pattern as the above Wikipedia definition,
Web 3.0 could be defined as: "Web 3.0, a phrase coined by John
Markoff of the New York Times in 2006, refers to a supposed third
generation of Internet-based services that collectively comprise
what might be called 'the intelligent Web'—such as those using
semantic web, microformats, natural language search, data-mining,
machine learning, recommendation agents, and artificial intelligence
technologies—which emphasize machine-facilitated understanding
of information in order to provide a more productive and intuitive
user experience."
Web 3.0 Expanded Definition. I propose expanding the above
definition of Web 3.0 to be a bit more inclusive. There are actually
several major technology trends that are about to reach a new level
of maturity at the same time. The simultaneous maturity of these
trends is mutually reinforcing, and collectively they will drive
the third-generation Web. From this broader perspective, Web 3.0
might be defined as a third generation of the Web enabled by the
convergence of several key emerging technology trends:
Ubiquitous Connectivity
- Broadband adoption
- Mobile Internet access
- Mobile devices
Network Computing
- Software-as-a-service business models
- Web services interoperability
- Distributed computing (P2P, grid computing, hosted "cloud computing"
server farms such as Amazon S3)
Open Technologies
- Open APIs and protocols
- Open data formats
- Open-source software platforms
- Open data (Creative Commons, Open Data License, etc.)
Open Identity
- Open identity (OpenID)
- Open reputation
- Portable identity and personal data (for example, the ability
to port your user account and search history from one service
to another)
The Intelligent Web
- Semantic Web technologies (RDF, OWL, SWRL, SPARQL, Semantic
application platforms, and statement-based datastores such as
triplestores, tuplestores and associative databases)
- Distributed databases—or what I call "The World Wide Database"
(wide-area distributed database interoperability enabled by Semantic
Web technologies)
- Intelligent applications (natural language processing, machine
learning, machine reasoning, autonomous agents)
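The statement-based datastores listed above all reduce to the same primitive: facts stored as subject-predicate-object triples, retrieved by pattern matching. A minimal sketch in Python may make the idea concrete; the facts here are invented, and real stores speak RDF and SPARQL rather than tuples:

```python
# Minimal sketch of a statement-based datastore: facts are
# (subject, predicate, object) triples, queried by pattern matching.
# Real Semantic Web stores use RDF and SPARQL; all data here is invented.

triples = [
    ("Paris",  "isCapitalOf", "France"),
    ("France", "locatedIn",   "Europe"),
    ("Berlin", "isCapitalOf", "Germany"),
]

def match(pattern, store):
    """Return triples matching the pattern; None acts as a wildcard."""
    return [t for t in store
            if all(p is None or p == v for p, v in zip(pattern, t))]

# Which cities are capitals?  Roughly analogous to the SPARQL query
# SELECT ?s WHERE { ?s :isCapitalOf ?o }
capitals = [s for s, _, _ in match((None, "isCapitalOf", None), triples)]
```

The wildcard-matching loop is the whole trick: adding a new kind of fact requires no schema change, only a new predicate string.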
© 2006 Nova Spivack.
Mind·X Discussion About This Article:
Re: The Third-Generation Web is Coming
I agree with Ted Stalets' post that these capabilities are coming and, as he suggests, will radically transform how we perceive not only the web, if that even continues to be known as a distinct entity, but also what it means to be human and have experiences in the "world".
With the expanded options and aspects of experience that will evolve with the web come intricate and interconnected rules and paradigms regarding how experiences can and should be structured. How does your virtual-reality experience interact with mine? Are there protections that need to be in place to keep the strong from subsuming the weak? What about experiential incompatibilities, roughly analogous to the interoperability obstacles found in today's devices, software, or platforms? How will these be integrated or reconciled?
The registering of future internet domain names is merely the very tip of one iceberg with regard to commercialization and profit-centered initiatives from emerging technologies. Already we are entering a world in which stock market moves are increasingly automated and linked to quick action on emerging knowledge. As Web 2.0 is transforming business models from media to commerce to volunteer and collaborative labor (and collaborative wisdom as well), so will future versions of the web, again if there is still such a distinct entity by that name, be impacted if not defined on a continually changing basis in 3.0, 4.0, and beyond.
In my own field of politics, www.hammer2006.politicalgateway.com, communications (blogs, video, media, volunteers) are just one area of Web 2.0 influence, and Google CEO Eric Schmidt recently predicted that the 2008 presidential winner will owe victory to Internet influencing factors.
Finally, principles of integration, organization, and expansion, among others, some not yet even evolved, will be intricately woven with web developments. As Google seeks increasingly to organize the world's information, and Bill Gates talks about a new paradigm of software beyond creating what is known to organizing what is known, to virtual-reality experiences, nanotechnology, and the singularity expansion of knowledge into the universe, expression of these principles will become more sophisticated, competing, interwoven, and possibly strong.
Re: The Third-Generation Web is Coming
Speaking to the last comment you made in your post, regarding the obtuse toolsets available for RDF construction: isn't that what we're talking about here ... improved tools for creating this type of content?
I daresay that everything is already possible, if only we had the tools ... semantic internet tech isn't being discovered with Web 3, it's being enabled.
As for the first post, about Web 4 and 5, I think that a step is being ignored: the step where we lose these bulky desktops and heavy laptops and move into neural-interfaces which are portable insofar as they are inside our bodies. To me, that's the real gimmick behind Web 4 -- integration into the network, which will allow VR to flourish ... but not until the interface is perfected. So, I peg Web 5 as the VR revolution.
On a related note, I peg MMORPGs as a valuable testing ground for comp-sci enthusiasts to begin to explore just how different our society will be once we are all protected from physical harm (usually that which comes as a consequence of an action) and persecution (personalities are swapped at will). In some ways, these games (Second Life being the most esoteric and humanist, World of Warcraft being an example of the most utilitarian and lowbrow) are Web 3 already; a network of other players constitutes the semantic web. Need an answer? Ask a question.
Re: The Third-Generation Web is Coming
"But the development of pervasive computing environments changes all that, since it becomes increasingly hard to draw a line that says 'here endeth virtual reality and here beginnith real life'. It may still be valid to talk about cyberspace as a distinct geographical concept, but its relationship with reality in both a physical and a societal sense is getting more and more complex."
True -- it just depends on what sort of timeline you're talking about. Personally, I'm partial to the idea of an "enhanced reality," combining what you *want* to see with what you *need* to see in order to navigate safely through the world. This becomes more plausible as things like traffic become automated and "quantified at will." I like to call that effect "uncanny awareness" or "casual omniscience." I also like to call the hypothetical network which can provide the positional data -- as well as the most likely positional data at Time + x -- of every object/entity in a given region The Metronome, and it is through this mechanism that we may someday orchestrate our personal illusions in such a way that they do nothing to interrupt the world around us, and vice versa.
However, for my money, that's down the road another fifteen to twenty years and the tools for Web 5 are probably five to ten years away. Unless we find a way to alter our outward physical appearance on a whim before then (even when we do, it's something which we may never legally allow as a society), we will find a strong demand for escapist-inspired pocket universes -- with very strong barriers between their planes and those of the real world -- tomorrow, just as we do today.
People don't like to be fat. They don't like to be ugly. And even more to the point: they don't like to be "ordinary." Nanotech may provide solutions to many health problems and perhaps even to most cosmetic problems, and this may undermine everything I've just said, but I am not hopeful that AI is going to give anyone any more purpose than they have today -- in fact, it will likely diminish the need for us to be here at all quite gravely. Therefore, we will find the money chasing the dreams and the dreams chasing human beings' insecurities ... voila: geographically distinct, VR pocket universes, in which our existence matters, our deeds have an impact (albeit on an imaginary storyline), and in which we can still become famous. That is the true opiate of the masses. We're talking about technologies that are going to emerge in a world that has automated every blue collar job and most of the less interesting white collar jobs as well. Having a universe in which a single, average, human existence is still meaningful may very well become the driving force of our economy given all that we know of our species' ego.
Re: The Third-Generation Web is Coming
I agree with this article's position that the Semantic Web is growing and has a lot of potential to be useful. It is not easily accessed at the moment, however, and very few tools help authors add semantic information to Web content. I think one of the reasons for this is that corporations just haven't figured out how to make money from the Semantic Web yet. It will remain only an academic interest until it can be made easily accessible to the lay user. The problem, however, is that the Semantic Web will not be commercialized and made popular until it has significant value, which is only added when people or machines voluntarily add semantic information to documents. But if people don't know about the Semantic Web, then very little information will be added, resulting in a Catch-22-like scenario. The responsibility for populating the Semantic Web lies with its enthusiasts and risk-taking businesses. If a large Web company like Google made a significant effort to contribute to, or at least use, the Semantic Web, it would accelerate its growth and adoption. Google is already using some microformats (like the rel="nofollow" link attribute), but that is not enough.
The Semantic Web certainly seems to be the way of the future, but is it a significant leap forward? The main theme behind Web 2.0, usability aside, is to relay the task of sorting and tagging content from the machine to the human. Flickr's image tags are added by people who use the service in order to describe their content and make it easily searchable. Books are tagged on LibraryThing by its users to help the machine recommendation engine make educated guesses as to the relations between different books. It seems like many isolated semantic webs are already here and we're contributing to them every day, but they need to be more consolidated to reach their full potential.
Re: The Third-Generation Web is Coming
Well, it is March 23rd, 2007 now, and here are some of my observations as to the nature of this article. I claim no prophetic capacity whatsoever.
First, back in the day of web 1.0, I saw the main trend being the unfiltered access to information, and the open sharing of ideas without too much bias or fear. Everyone was just happy and jazzed up about communicating in this medium, and that excited the whole planet, me along with it. It was beautiful, glorious, transcendental, and lasted up until web 2.0 killed it for everyone.
Web 2.0, which we may or may not now be at the end of, was/is the age of corporate and government control. No information on the Internet is free, everyone wants to sell you something, and most web content has to have a big money sponsor to survive. Combined with this is the government's (American and other) sudden need to police all electronic action for signs of subversive behaviour. Most search engines have become context sensitive advertising engines for those who will pay them, and if you do a search for anything outside of social constraints, you are on some list somewhere. This is web 2.0, but there is hope...
Web 3.0, if it is actually happening, will mark a return, with advanced technology, to the wild and free days of Web 1.0: faster, better Internet technology living up to an ideal of access and community, not exploitative capitalism and low-class mind control. In order for this to happen, though, a lot of big corporations are going to have to lose some really hefty lawsuits, and the governing bodies of a few powerful countries are going to have to return to being under the control (maybe for the first time in their history) of their people, and not corporate interests.
Otherwise, we're going to have web 2.1, which will be useless to the betterment of humanity, at one low price while supplies last.
Re: The Third-Generation Web is Coming
As a computer science student, I do believe that the third generation of the Web is coming, though not in 2007; it might take at least two more years or so.
After researching the definitions of Web 1.0, Web 2.0, and Web 3.0 on the web, I still find it hard to provide a proper definition for each.
So in my own opinion, I think of them as this: First, we had Web 1.0 - the read-only web. Then came Web 2.0 - the read-write web - all of these services that make it easy for us to contribute content and interact with others. If we keep up the programming analogy, the next phase would be Web 3.0 - the Read-Write-Execute Web.
What I mean by that is this: Web 1.0 gave us the ability to view information on the internet. Then 2.0 allowed us to share information among users, as with YouTube, P2P, and so on. Thus, Web 3.0 should give users, and perhaps the computer itself, the ability to create their own tools online rather than just uploading stuff for other people.
Officially, Tim Berners-Lee, the man who invented the World Wide Web, interprets Web 3.0 as something called the Semantic Web, a term he coined. In his concept, the Semantic Web is a place where machines can read Web pages much as we humans read them, a place where search engines and software agents can better find what we're looking for. "It's a set of standards that turns the Web into one big database," says Nova Spivack, CEO of Radar Networks, one of the leading voices of this new-age Internet.
For instance, one of the examples presented by the Semantic Web's developers is:
"The goal of the semantic Web is to build a system that can give a reasonable and complete response to a simple question like: 'I'm looking for a warm place to vacation and I have a budget of $3,000. Oh, and I have an 11-year-old child.' Under today's system, such a query can lead to hours of sifting through lists of flights, hotels, and car rentals, and the options are often at odds with one another. Under Web 3.0, the same search would ideally call up a complete vacation package that was planned as meticulously as if it had been assembled by a human travel agent."
Rough Type, Posted by nick at November 11, 2006
http://www.roughtype.com/archives/2006/11/welcome_web_30.php
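The quoted vacation query can be caricatured in a few lines: once travel data is structured, the question becomes a constraint filter rather than keyword sifting. A toy sketch, in which all package data and field names are invented:

```python
# Toy version of the quoted travel query: with structured (semantic) data,
# "warm place, $3000 budget, 11-year-old child" becomes a simple constraint
# filter instead of hours of manual sifting.  All data here is invented.

packages = [
    {"place": "Cancun", "avg_temp_c": 29, "price": 2800, "kid_friendly": True},
    {"place": "Oslo",   "avg_temp_c":  2, "price": 2500, "kid_friendly": True},
    {"place": "Fiji",   "avg_temp_c": 27, "price": 4200, "kid_friendly": True},
]

def plan(budget, min_temp, with_child):
    """Return places satisfying every constraint at once."""
    return [p["place"] for p in packages
            if p["price"] <= budget
            and p["avg_temp_c"] >= min_temp
            and (p["kid_friendly"] or not with_child)]
```

The hard part of the Semantic Web is not this filter; it is getting every airline, hotel, and rental agency to publish their data in a form the filter can consume.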
In fact, our current web technology is already trending toward the new web generation, as you can see here: "Is every single page you see at Amazon written by hand? There's a massive amount of information already in machine-friendly form, tucked away in databases around the world. A lot of this is already exposed as HTML. But HTML is designed for human legibility; you have to scrape to get it back into a machine-friendly form. But if you're publishing HTML in this fashion, it's no harder to publish data in a form that's more convenient for computers. With RDF and the associated Semantic Web technologies there are the languages and tools for joining together all the databases in a useful fashion."
Posted by: Danny Ayers at November 12, 2006
With new technologies like the Resource Description Framework (RDF) and the Web Ontology Language (OWL), all the standards will need to be generalized, and the process of users and programmers getting used to them will take some time.
In addition, while it will be easy for you to mine meaning about vacations and other things, it will also be easy for others to mine meaning about you. It is especially crucial for developers to have a flawless design, so that both security and quality are kept in good shape.
With all of these reasons in mind, I believe that Web 3.0 will take some time before it is as commonly used as Web 2.0 technologies.
Re: The Third-Generation Web is Coming
Spivack: The threshold to the third-generation Web will be crossed in 2007. At this juncture the focus of innovation will start to shift back from front-end improvements towards back-end, infrastructure-level upgrades to the Web. This cycle will continue for five to ten years, and will result in making the Web more connected, more open, and more intelligent. It will transform the Web from a network of separately siloed applications and content repositories to a more seamless and interoperable whole.
I would posit that the focus of innovation will not simply shift from front-end improvements to back-end improvements. Rather, back-end improvements will start arriving at the same pace that front-end improvements have been arriving. The web will have two focuses of innovation.
The single most exciting recent back-end announcement for me was Google Caja (http://code.google.com/p/google-caja/), which builds object-capability security into Javascript. It will finally allow meaningful interactions between websites, whether through user-driven drag-and-drop or through more automated means. Hitherto, most websites have only allowed static, HTML-text-only third-party content (e.g. Facebook user profiles, Flickr photo albums, and Wikipedia articles). The ability to publish Turing-complete Javascript code that can only do what the host website permits promises incredible things.
It will also enable rapid and inexpensive implementation of things like Google Opensocial (http://code.google.com/apis/opensocial/) and Facebook Platform (http://developers.facebook.com/) standards for hosted applications.
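The object-capability idea behind Caja can be sketched in a few lines of Python: guest code receives only an attenuated proxy exposing exactly what the host grants. This is a loose analogy to the pattern, not Caja's actual mechanism, and every class and method name here is invented:

```python
# Sketch of the object-capability pattern behind Caja: guest code gets
# only an attenuated proxy, so it can call exactly what the host grants
# and nothing else.  All names here are invented for illustration.

class HostPage:
    def read_title(self):
        return "My Profile"

    def delete_account(self):          # powerful: must never leak to guests
        raise RuntimeError("account deleted!")

class Attenuator:
    """Expose only a whitelisted subset of the host's methods."""
    def __init__(self, host, allowed):
        for name in allowed:
            setattr(self, name, getattr(host, name))

# The host decides what third-party code may touch.
guest_view = Attenuator(HostPage(), allowed=["read_title"])
```

Because the guest never holds a reference to `HostPage` itself, no amount of Turing-complete cleverness lets it reach `delete_account`; authority flows only through the references it was given.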
Spivack: Web 3.0. Using the same pattern as the above Wikipedia definition, Web 3.0 could be defined as: "Web 3.0, a phrase coined by John Markoff of the New York Times in 2006, refers to a supposed third generation of Internet-based services that collectively comprise what might be called 'the intelligent Web'—such as those using semantic web, microformats, natural language search, data-mining, machine learning, recommendation agents, and artificial intelligence technologies—which emphasize machine-facilitated understanding of information in order to provide a more productive and intuitive user experience."
In addition to the technological trends that you name, I would mention one very crucial item: HTML5 (http://www.whatwg.org/).
First of all, HTML5 explicitly specifies how browsers currently interpret HTML with maximal inter-browser compatibility. It is the first HTML specification that one can use as a sole reference when writing a semi-intelligent web-crawler and expect a significant degree of success.
Secondly, it adds many semantically meaningful constructs to HTML that are both backwards-compatible and reflect actual internet practices. One basic example is that there is now a meaningful way to convey the idea of a set of navigation links. Such things are of huge importance for accessibility tools and other programs that need to parse HTML documents for meaning.
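The navigation-links example can be made concrete: with a semantically meaningful element for navigation, a crawler can find a site's navigation by meaning instead of guessing from class names. A sketch using only Python's standard-library HTML parser, with the sample markup invented:

```python
# Sketch: a semantic <nav> element lets a crawler locate navigation
# links by meaning rather than by heuristics over class names.
# Uses only the Python standard library; the markup is invented.

from html.parser import HTMLParser

class NavLinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_nav = False
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "nav":
            self.in_nav = True
        elif tag == "a" and self.in_nav:
            # collect href targets, but only inside the <nav> block
            self.links.extend(v for k, v in attrs if k == "href")

    def handle_endtag(self, tag):
        if tag == "nav":
            self.in_nav = False

page = '<nav><a href="/home">Home</a><a href="/blog">Blog</a></nav><a href="/ad">Ad</a>'
parser = NavLinkExtractor()
parser.feed(page)   # parser.links now holds only the navigation targets
```

The `/ad` link outside `<nav>` is ignored, which is exactly the distinction that accessibility tools and crawlers could not make reliably before such elements existed.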
Finally, by incorporating Google Gears (http://gears.google.com/) and other powerful proposals into the specification, it makes client-side web development more powerful, more reliable, more stable, and more secure. Google Gears adds domain-specific relational database capability to Javascript. The immediate use case is offline mode for web applications, but that's just the immediate use case.
Strategos: http://en.wikipedia.org/wiki/Grid_computing Welcome in the GRID.
Indeed. And not just a grid circa 1994 of dumb clients and slightly less dumb servers, but a grid of very, very smart clients. For one thing, each and every client will have several website-specific relational databases with some very serious data-crunching capabilities.
What can a distributed grid with a billion nodes that runs a high-level language, allows meaningful exchange of Turing-complete code between nodes, and has powerful tools such as universally available relational databases accomplish? I don't know yet.
AMCKeon: This vision, as with Web 1 and Web 2, is all about wires, routers, networks, and computers. It ignores the data. It's all very well having a W3C Semantic Web common framework for data sharing and searching. But this assumes data reliability.
I think the technologies mentioned will contribute greatly towards eliminating or, rather, ignoring bad data. Google PageRank (http://en.wikipedia.org/wiki/PageRank) and Bayesian spam filtering (http://en.wikipedia.org/wiki/Bayesian_spam_filtering) are a very rudimentary step in that regard. Given a domain-specific approach, it should be possible to apply similar techniques to discard irrelevant and erroneous web pages. The semantic capabilities of HTML5, and exported data from sites such as Reddit (http://reddit.com/), could improve this by an order of magnitude. As Paul Baran conceptualized it in 1962, the network "routes around damage." Bad data is damage, and semi-intelligent filtering tools should be able to route around it most of the time if given sufficient cues.
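The Bayesian filtering invoked above can be sketched in a few lines: score a document by how strongly its words are associated with junk in a training set. A toy version with add-one smoothing; the training corpus is invented and far too small for real use:

```python
# Minimal sketch of Bayesian filtering as "routing around" bad data:
# score text by how strongly its words are associated with junk in a
# (tiny, invented) training set.  Real filters train on large corpora.

import math

junk = ["buy pills now", "cheap pills online"]
good = ["semantic web standards", "web data standards"]

def word_counts(docs):
    counts = {}
    for d in docs:
        for w in d.split():
            counts[w] = counts.get(w, 0) + 1
    return counts, sum(counts.values())

def junk_score(text):
    """Summed log-odds that `text` is junk, with add-one smoothing."""
    jc, jt = word_counts(junk)
    gc, gt = word_counts(good)
    score = 0.0
    for w in text.split():
        p_junk = (jc.get(w, 0) + 1) / (jt + 2)
        p_good = (gc.get(w, 0) + 1) / (gt + 2)
        score += math.log(p_junk / p_good)
    return score   # positive means "looks like junk"
```

Replacing "spam words" with "claims contradicted by trusted data" turns the same machinery into the bad-data filter the comment envisions.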