Li Fan, Pei Cao, Jussara Almeida, and Andrei Z. Broder - In this new protocol, each proxy keeps a summary of the cache directory of every other participating proxy and checks these summaries for potential hits before sending any queries.
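The summaries in this protocol (Summary Cache) are Bloom filters: compact, lossy set representations that can report false positives but never false negatives, so a proxy wastes a query only on a false positive and never misses a real hit. A minimal sketch of the idea; the sizes and hash counts below are illustrative, not the paper's tuned parameters:

```python
import hashlib

class BloomFilter:
    """Minimal Bloom filter: a compact, lossy summary of a set of URLs."""
    def __init__(self, size=1024, hashes=4):
        self.size = size
        self.hashes = hashes
        self.bits = bytearray(size)

    def _positions(self, key):
        # Derive k bit positions from k salted hashes of the key.
        for i in range(self.hashes):
            h = hashlib.sha256(f"{i}:{key}".encode()).digest()
            yield int.from_bytes(h[:8], "big") % self.size

    def add(self, key):
        for p in self._positions(key):
            self.bits[p] = 1

    def might_contain(self, key):
        # False positives are possible; false negatives are not.
        return all(self.bits[p] for p in self._positions(key))

# Each proxy keeps one such filter per peer; a query goes out only
# when a peer's summary reports a potential hit.
peer_summary = BloomFilter()
peer_summary.add("http://example.com/page")
```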
implements TCP splicing for the Linux kernel. TCP splicing is a technique for joining two connections inside the kernel, so that data relayed between them can move at near-router speeds.
The purpose of the Online Games White Papers is to provide online games market statistics, business model descriptions, technology summaries and publisher listings.
explores the challenges of constructing a distributed e-business architecture based on the concept of the Request Based Virtual Organization (RBVO) and presents a solution based on ebXML and Open Source e-business components.
aims to build scalable, robust distributed systems using peer-to-peer ideas. The basis for much of our work is the Chord distributed hash lookup primitive.
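Chord's lookup primitive maps node and key identifiers onto one circular ID space and assigns each key to its successor: the first node at or after the key on the circle. A toy single-process sketch of that assignment rule (real Chord finds the successor in O(log N) hops between machines via finger tables; the names below are illustrative, not the paper's API):

```python
import hashlib
from bisect import bisect_left

ID_BITS = 16  # toy-sized identifier space; Chord uses e.g. 160-bit SHA-1 IDs

def chord_id(name: str) -> int:
    """Hash a node or key name onto the circular identifier space."""
    digest = hashlib.sha1(name.encode()).digest()
    return int.from_bytes(digest, "big") % (2 ** ID_BITS)

class Ring:
    def __init__(self, node_names):
        self.nodes = sorted(chord_id(n) for n in node_names)

    def successor(self, key_name: str) -> int:
        """Return the ID of the node responsible for the given key."""
        key = chord_id(key_name)
        i = bisect_left(self.nodes, key)   # first node >= key
        return self.nodes[i % len(self.nodes)]  # wrap around the circle

ring = Ring(["node-a", "node-b", "node-c", "node-d"])
owner = ring.successor("some-file.txt")
```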
Onion Networks builds and licenses data transfer technologies that drastically improve the speed, scalability, reliability, and security of file transfers over global networks.
to provide a framework for finding information within a distributed environment. NeuroGrid is based on the idea of automating the process we use in human society to find out things that we want to know, or the locations of things that we need.
Network Address Translation (NAT) causes well-known difficulties for peer-to-peer (P2P) communication, since the peers involved may not be reachable at any globally valid IP address.
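The usual workaround is UDP "hole punching": a rendezvous server tells each peer the other's public endpoint, and both peers send datagrams at roughly the same time, so each NAT installs an outbound mapping that then admits the other side's traffic. A sketch of the send/receive pattern, run here on loopback with no real NAT in the way; the addresses would normally come from the rendezvous server:

```python
import socket

# Two "peers"; in reality each sits behind its own NAT and learns the
# other's public address:port from a rendezvous server.
a = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
b = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
a.bind(("127.0.0.1", 0))
b.bind(("127.0.0.1", 0))
a.settimeout(2)
b.settimeout(2)

# Step 1: both sides send first. Each outbound datagram creates the
# NAT mapping ("punches the hole") that will let the peer's packet in.
a.sendto(b"punch", b.getsockname())
b.sendto(b"punch", a.getsockname())

# Step 2: both sides can now receive through the freshly opened holes.
got_a, _ = a.recvfrom(1024)
got_b, _ = b.recvfrom(1024)
```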
Graham Wihlidal - "There is no such thing as the “perfect networking code”, due to the Internet’s unreliability, but there are a few tricks you can use to improve it to the point where the illusion can hold."
the Peer Distributed Transfer Protocol provides a method of transferring files using peers to aid in distribution of content, similar to BitTorrent. PDTP servers export a dynamically changing directory hierarchy, making it somewhat more like HTTP or FTP.
science which solves a large problem by giving small parts of the problem to many computers to solve and then combining the solutions for the parts into a solution for the problem.
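The divide/solve/combine pattern described here can be sketched in a few lines: split the input into chunks, hand each chunk to a worker, and merge the partial results. Threads on one machine stand in for the many computers of a real deployment:

```python
from concurrent.futures import ThreadPoolExecutor

def partial_sum(chunk):
    """The 'small part of the problem' handed to one worker."""
    return sum(chunk)

def distributed_sum(numbers, workers=4):
    # Divide: split the input into roughly equal chunks.
    size = max(1, len(numbers) // workers)
    chunks = [numbers[i:i + size] for i in range(0, len(numbers), size)]
    # Solve in parallel, then combine the partial answers.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return sum(pool.map(partial_sum, chunks))

total = distributed_sum(list(range(1000)))  # equals sum(range(1000))
```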
a novel decentralized infrastructure, based on distributed hash tables (DHTs), that will enable a new generation of large-scale distributed applications.
O. Hassan, O. Aderibigbe, O. Efijemue, and T. Onasanya. Proceedings of the 2005 ACM SIGMOD International Conference on Management of Data, pages 906–908. New York, NY, USA: ACM, (2024)