Internet might someday lose its dependency on servers, rely on P2P instead

November 4, 2013, 10:00 AM
internet, peer to peer, tcp

It's hard to even imagine the Internet without servers, but researchers at Cambridge University have developed a proof of concept for a new server-free Internet architecture. The prototype was developed as part of PURSUIT, a €5.2 million project that brings together representatives from European research institutes, universities and telecommunications companies. The new architecture is designed to meet the ever-growing traffic requirements of web services and the security concerns of Internet users.

Today, online data is stored on servers residing at different locations around the globe. Requests made by client devices like PCs, tablets or smartphones are fulfilled by the geographically closest server, making the information exchange quick but server-dependent. This centralized approach opens the door to problems like server attacks and traffic overloading, and it gives users less control over how and when their data is accessed.

PURSUIT users wouldn't face these security and privacy problems, as the architecture does away with the need for individual computers to connect to dedicated servers. Instead, it uses a peer-to-peer information-sharing technique that enables individual computers to copy and republish data on receipt.

This, if deployed, would replace the existing client-server TCP/IP networking model and could radically change the way information is stored, searched and shared online. Users would be able to fetch the requested data (or smaller data fragments) from a wide range of computers around the world, and search engines would look up URIs (Uniform Resource Identifiers) rather than URLs (Uniform Resource Locators).
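To make the URI-versus-URL distinction concrete, here is a minimal sketch (not PURSUIT's actual protocol; the `Peer` class and `uri:sha256:` naming scheme are invented for illustration) of how content named by what it *is*, rather than where it *lives*, can be served and verified by any peer holding a copy:

```python
import hashlib

def content_uri(data: bytes) -> str:
    """Derive a location-independent identifier from the content itself."""
    return "uri:sha256:" + hashlib.sha256(data).hexdigest()

class Peer:
    """Toy peer that stores and republishes data keyed by content URI."""
    def __init__(self):
        self.store = {}  # URI -> data

    def publish(self, data: bytes) -> str:
        uri = content_uri(data)
        self.store[uri] = data
        return uri

    def fetch(self, uri: str):
        return self.store.get(uri)

def fetch_from_any(peers, uri: str) -> bytes:
    """Ask peers in turn; the hash in the URI lets the receiver
    verify the copy regardless of which machine served it."""
    for peer in peers:
        data = peer.fetch(uri)
        if data is not None and content_uri(data) == uri:
            return data
    raise KeyError("no peer holds " + uri)

# Any peer that has received the data can republish it:
alice, bob = Peer(), Peer()
uri = alice.publish(b"hello, server-free internet")
bob.store[uri] = alice.store[uri]         # bob republishes on receipt
print(fetch_from_any([bob, alice], uri))  # served by bob, not a fixed server
```

Because the identifier is derived from the data, a request can be satisfied by whichever peer answers first, and a tampered copy simply fails verification.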

The project builds on earlier successful projects such as TRILOGY and PSIRP, and won the Future Internet Award 2013 at the Future Internet Assembly (FIA) in Dublin earlier this year.

Image via Shutterstock

User Comments: 4

amstech, TechSpot Enthusiast, said:

I'd still rather have servers do all the heavy lifting. Data cannot be secured properly now, so what makes you think I want all my neighbors' PCs holding my stuff with the wet-paper-bag strength of P2P?

Also, today's devices are still not that powerful; we can easily bring some of our new ProLiant Xeon servers to their knees. Just because your iPhone can play 3D games made 12 years ago doesn't mean it's fast, or versatile.

Burty117, TechSpot Chancellor, said:

But this would also mean upload speeds on the ISP end would need to be increased substantially. I don't see this taking off, to be honest :/

Skidmarksdeluxe said:

This makes sense. It could be one of the ways forward, but not for a while yet, I suspect.

Guest said:

Not necessarily. You'd still need a server to specify the URI to fetch, and PKI to make sure you get the right resource. But the ISP could traffic-shape the P2P element so it only went to other subscribers (so it didn't need more peering, which is why upload is so limited on normal home connections). You could also just stick it in JavaScript, so that anyone logging onto your website would join your cloud. Problem is, your customers will hate you if you take too many resources, especially on mobile, where bandwidth is expensive.
