Future Internet aims to transform the way information is shared online
London: Researchers have taken the first step towards a radical new architecture for the Internet, which they claim will transform the way in which information is shared online, and make it faster and safer to use.
A revolutionary new architecture aims to make the internet more "social" by eliminating the need to connect to servers and enabling all content to be shared more efficiently.
The prototype, which has been developed as part of an EU-funded project called "Pursuit", is being put forward as a proof-of-concept model for overhauling the existing structure of the internet's IP layer, through which isolated networks are connected, or "internetworked".
Its creator said that the Pursuit Internet would enable a more socially-minded and intelligent system, in which users would be able to obtain information without needing direct access to the servers where content is initially stored. Instead, individual computers would be able to copy and republish content on receipt, providing other users with the option to access data, or fragments of data, from a wide range of locations rather than the source itself.
Essentially, the model would enable all online content to be shared in a manner emulating the "peer-to-peer" approach taken by some file-sharing sites, but on an unprecedented, internet-wide scale.
That would potentially make the internet faster, more efficient, and more capable of withstanding rapidly escalating levels of global user demand. It would also make information delivery almost immune to server crashes, and significantly enhance the ability of users to control access to their private information online.
While this would lead to an even wider dispersal of online materials than we experience now, the researchers behind the project argue that by focusing on information rather than the web addresses (URLs) where it is stored, digital content would become more secure.
They envisage that by making individual bits of data recognisable, that data could be "fingerprinted" to show that it comes from an authorised source.
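One simple way to sketch such fingerprinting is content-based naming, where the identifier is derived from the data itself by hashing it. This is an illustrative assumption, not Pursuit's actual design: the `fingerprint` function and sample data below are hypothetical.

```python
import hashlib

def fingerprint(content: bytes) -> str:
    """Derive an identifier from the content itself, so any copy of the
    data can be verified wherever it happens to be found."""
    return hashlib.sha256(content).hexdigest()

# A publisher announces the fingerprint of an episode alongside the content.
original = b"episode-042 video data"
published_id = fingerprint(original)

# A consumer who fetches a copy from *any* node can check it against the
# announced fingerprint, without having to trust the hosting server.
copy = b"episode-042 video data"
assert fingerprint(copy) == published_id

# Any tampering changes the hash, so the altered copy is rejected.
tampered = b"episode-042 video data (modified)"
assert fingerprint(tampered) != published_id
```

Because the identifier depends only on the bytes, a copy republished by another user is just as verifiable as one fetched from the original server.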
Dr Dirk Trossen, a senior researcher at the University of Cambridge Computer Lab, and the technical manager for Pursuit, said: "The current internet architecture is based on the idea that one computer calls another, with packets of information moving between them, from end to end. As users, however, we aren't interested in the storage location or connecting the endpoints. What we want is the stuff that lives there."
"Our system focuses on the way in which society itself uses the internet to get hold of that content. It puts information first. One colleague asked me how, using this architecture, you would get to the server. The answer is: you don`t. The only reason we care about web addresses and servers now is because the people who designed the network tell us that we need to. What we are really after is content and information," he said. For example, at the moment if a user wants to watch their favourite TV show online, they search for that show using a search engine which retrieves what it thinks is the URL where that show is stored. This content is hosted by a particular server, or, in some cases, a proxy server.
If, however, the user could correctly identify the content itself - in this case the show - then the location where the show is stored becomes less relevant.
The designers of Pursuit hope that, in the future, this is how the internet will work.
Technically, online searches would stop looking for URLs (Uniform Resource Locators) and start looking for URIs (Uniform Resource Identifiers).
In simple terms, these would be highly specific identifiers which enable the system to work out what the information or content is.
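The name-then-resolve flow described above can be sketched as a toy resolution table that maps a content identifier to every location holding a copy. Everything here is an illustrative assumption: in an information-centric network the mapping would be maintained by the network itself, and the identifier scheme, function names, and hostnames are hypothetical.

```python
import hashlib

# Hypothetical resolution table: content identifier -> locations with a copy.
resolution_table: dict = {}

def content_id(data: bytes) -> str:
    """Name content by what it is (a hash of its bytes), not where it lives."""
    return "uri:sha256:" + hashlib.sha256(data).hexdigest()

def publish(data: bytes, location: str) -> str:
    """Register that `location` holds a copy of `data`; return its identifier."""
    cid = content_id(data)
    resolution_table.setdefault(cid, []).append(location)
    return cid

def resolve(cid: str) -> list:
    """Return every known location for this identifier. A requester can fetch
    from any of them, so no single server is a point of failure."""
    return resolution_table.get(cid, [])

# The original server publishes an episode; two viewers republish their copies.
episode = b"favourite-show-s01e01"
cid = publish(episode, "origin.example.net")
publish(episode, "viewer-a.example.net")
publish(episode, "viewer-b.example.net")

assert len(resolve(cid)) == 3  # three independent sources for the same content
```

The design point this illustrates is that the identifier stays stable while the set of locations grows and shrinks, which is what makes delivery resilient to any one server going down.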