File-hosting services like RapidShare were supposed to put an end to P2P networks, BitTorrent included. Nothing of the sort happened: torrents are doing great, and their popularity keeps growing. Discover the technology that has conquered the world.
The word “BitTorrent” has several meanings. We use it to refer to the technology (protocol) for exchanging and distributing files over the Internet, created in 2001 by the programmer Bram Cohen. BitTorrent is also the name of the first file-sharing program to use this technology, and today it often serves simply as a synonym for any application that lets you “download torrents”. In turn, the network formed by users exchanging files with programs based on Cohen’s innovative protocol is called the BitTorrent network – to distinguish it from ordinary Internet traffic.
What made this initially modest invention so popular? Simple: it turned out to be better and more convenient than other file-sharing technologies.
Every direct file-exchange network is built on the peer-to-peer communication model, P2P for short. Normally, when we download files from, say, websites, we use the standard client-server model: our computer is a client that sends requests to remote servers, which respond appropriately so that we can view pages, download files, and so on. The P2P model is different: each computer, called a node, plays the role of both client and server. It connects to other machines and downloads data from them, but it also accepts connections from them and serves them data. All computers are equal.
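The dual client/server role of a node can be sketched in a few lines of Python. This is a toy illustration, not any real file-sharing protocol: one thread plays the “server” half of a peer and hands out data, while the main thread plays the “client” half and downloads it (the port and message are arbitrary):

```python
import socket
import threading

def serve(host, port, data, ready):
    """Server half of a peer: hand our data to whoever connects."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    srv.bind((host, port))
    srv.listen(1)
    ready.set()                      # signal that we are accepting connections
    conn, _ = srv.accept()
    conn.sendall(data)
    conn.close()
    srv.close()

def fetch(host, port):
    """Client half of a peer: connect to another node and download its data."""
    cli = socket.create_connection((host, port))
    chunks = []
    while True:
        chunk = cli.recv(4096)
        if not chunk:
            break
        chunks.append(chunk)
    cli.close()
    return b"".join(chunks)

if __name__ == "__main__":
    ready = threading.Event()
    # One thread acts as the "server" side of this peer...
    t = threading.Thread(target=serve,
                         args=("127.0.0.1", 9001, b"hello from peer", ready))
    t.start()
    ready.wait()
    # ...while the main thread acts as the "client" side.
    print(fetch("127.0.0.1", 9001))  # b'hello from peer'
    t.join()
```

A real node would of course run both halves at once, serving many connections while downloading from many others; the sketch only shows that both roles live in the same program.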
In theory, a P2P network should run on a fully decentralized structure of connections between nodes, with no server in the traditional sense. For that, however, all nodes would have to be permanently available, and each would have to store the information needed to search for and retrieve resources. (It is worth noting that such a database changes dynamically over time, so to keep it up to date all clients would have to exchange information quite often, further slowing down the entire network.) For practical reasons, therefore, the most popular P2P networks – BitTorrent among them – use central servers in various ways. Thanks to them, new users of a given network are immediately informed about active connections and the list of available files, which can be searched quickly and easily. This greatly improves convenience and speeds up the network.
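The job of such a central server can be pictured as a simple in-memory index. The sketch below is a toy illustration – the class and method names are invented, and real trackers speak a network protocol rather than Python calls – but the core idea is the same: each peer announces what it shares and gets back the list of other peers that have it:

```python
from collections import defaultdict

class Tracker:
    """Toy central index: remembers which peers share which file."""

    def __init__(self):
        self.swarms = defaultdict(set)   # file name -> set of peer addresses

    def announce(self, filename, peer):
        """A peer reports interest in a file; it gets back the rest of the swarm."""
        others = self.swarms[filename] - {peer}
        self.swarms[filename].add(peer)
        return sorted(others)

tracker = Tracker()
print(tracker.announce("movie.avi", "10.0.0.1:6881"))  # [] - first peer, nobody else yet
print(tracker.announce("movie.avi", "10.0.0.2:6881"))  # ['10.0.0.1:6881']
```

Note that the tracker never touches the file itself; it only matchmakes, which is why it stays cheap to run even for huge swarms.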
A broader explanation of the term P2P can be found, among other places, on Wikipedia.
A brief history of the technology
Mininova – one of the largest torrent sites in the world.
The story of the craze known as “P2P file downloading” began in 1999 with the arrival of Napster. This program let users download MP3 files from one another rather than from servers. A server was still needed for coordination, however: client programs contacted it to learn where (from whom) which files were currently available for download. Napster conquered the hearts of millions of Internet users hungry for illegal MP3s. Technologically, it was rather primitive: a file could be downloaded from only one source, only in its entirety, and there were no integrity checks. There was no guarantee that the downloaded file was really what its name suggested – you could verify it only after downloading.
The answer to the court cases that finished Napster off came from Nullsoft programmers (the same team that wrote Winamp), who created a program called Gnutella. It was a whole new quality: it let Internet users connect to one another without any intermediary server and exchange MP3s and other files with impunity. Since the network had no central server, no court could shut it down – there was nothing to shut down. Gnutella paid for its independence, however, with difficult resource searching and slow operation.
The next incarnation of P2P was Kazaa, based on FastTrack technology, which introduced a two-layer network structure. Computers on slow lines served as ordinary nodes like those in Gnutella, while machines connected to the Internet over fast links automatically became so-called super-nodes, forming the high-throughput core of the network. The file indexes were stored on them, and they handled all queries. This change in network design brought a large gain in efficiency: Kazaa was incomparably faster than Gnutella. It also introduced – for the first time in P2P networks – the ability to download a file from multiple sources at the same time. Still, files had to be downloaded in their entirety, and there were still no safeguards, which is why first pranksters, and later (supposedly) record companies, flooded Kazaa’s network with fake files “pretending” to be the originals. Not so bad if the fake was a 2–3 MB MP3. Far worse when, after long hours of downloading a 1.4-gigabyte film, it turned out not to be a movie at all, just ordinary junk.
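The safeguard these early networks lacked is a per-piece checksum: split the file into pieces, publish a cryptographic hash of each piece alongside it, and a fake is detected as soon as the first piece arrives instead of after hours of downloading. A minimal sketch of the idea in Python (the piece size is absurdly small for the demo, and the function names are illustrative):

```python
import hashlib

PIECE_SIZE = 4  # tiny pieces for the demo; real systems use e.g. 256 KiB

def piece_hashes(data, piece_size=PIECE_SIZE):
    """Compute a SHA-1 hash for every piece of the file."""
    return [hashlib.sha1(data[i:i + piece_size]).hexdigest()
            for i in range(0, len(data), piece_size)]

def verify(downloaded, expected_hashes, piece_size=PIECE_SIZE):
    """Check every downloaded piece against the published hash list."""
    return piece_hashes(downloaded, piece_size) == expected_hashes

original = b"a genuine 1.4-gigabyte film (in miniature)"
hashes = piece_hashes(original)          # published by the uploader

print(verify(original, hashes))          # True  - the real file
print(verify(b"ordinary junk padding!", hashes))  # False - a fake is caught
```

In practice this also means each piece can be verified independently and redownloaded from a different source if it fails, which is what makes safe multi-source downloading possible.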