The Evolution and Development of Peer to Peer (P2P) Networking

1.  Introduction

There is no doubt that we are living in an era in which information matters. We live today in a world well beyond the hype of the dot-com bubble. It is a world in which even the most staid and traditional businesses, such as banks and insurance companies, would not be without their online and telephone service channels. In organizations, for example, the purchasing department, senior managers and the marketing department will be pulling together a range of newly integrated, up-to-date information and using it to perform more effective analysis and decision making. That same infrastructure is also allowing closer links with suppliers and collaborating organizations, and enabling personnel to operate more flexibly and remotely – in the field (with clients), at home (as tele-workers) or with other colleagues and partners (in virtual project teams) (Jackson, Harris, & Eckersley, 2003, p. 4).

Thus, old approaches are slowly giving way to a new way of doing things. But as they do, most organizations are still trying to fathom the implications and opportunities this shift presents. As managers increasingly recognize, the Electronic Age calls into question many established ‘good’ business practices while also creating new challenges and dilemmas.

            As the Internet is a shared resource, a cooperative network built out of millions of hosts all over the world, businesses are taking advantage of more applications than ever that use the network, consume bandwidth, and send packets far and wide. Since 1994, the general public has been racing to join the community of computers on the Internet, placing strain on the most basic of resources: network bandwidth. Our increasing reliance on the Internet for critical applications has brought with it new security requirements, resulting in firewalls that strongly partition the Net into pieces. Thus, Network Access Providers (NAPs), e-mail, and the system as a whole have been extended vastly beyond the network’s original design (Minar & Hedlund, 2001).

In fact, the World Wide Web (WWW) emerged as a powerful means of communication. In 1993, there were only about 50 servers dedicated to the WWW, and by 1997 there were over 100,000 (Staiman, 1997). This new online environment was becoming a burgeoning distribution medium, allowing the public to transfer and to share information on a global scale. New business opportunities of selling server space to ordinary users emerged through the creation of Internet service providers (ISPs), which offered a certain amount of server space and Internet access for a fixed subscription fee. The Internet is a worldwide network of computers, comprised of thousands of separate networks, interconnected by a common information transmission standard known as TCP/IP (Transmission Control Protocol/Internet Protocol) (Segkar, 2003).

Because the phenomenal growth of the Internet and the World Wide Web has provided the opportunity for electronic commerce (e-commerce) to grow faster than any other innovation in recent years, we have devoted an entire section to this subject. E-commerce growth has not been without setbacks, however, as businesses experimented with new approaches to utilizing information technology and the Internet.

Since e-commerce is based on an interactive model to conduct business, it has expanded the methods for maintaining business relationships. The nature of the Internet has created tremendous opportunities for businesses to forge relationships with consumers and business customers, target markets more precisely, and even to reach previously inaccessible markets. The Internet also facilitates business transactions, allowing companies to network with manufacturers, wholesalers, retailers, suppliers, and outsource firms to serve customers more efficiently (Pride & Ferrell 2000, p. 597).

According to Rayport & Jaworski (2004), there are four categories of e-commerce: business-to-business, business-to-consumer, peer-to-peer, and consumer-to-business:

  • Business-to-business (B2B) activity refers to the full spectrum of e-commerce that can occur between two organizations. Among other activities, this includes purchasing and procurement, supplier management, inventory management, channel management, sales activities, payment management, and service and support. Major players such as FreeMarkets, Dell, and General Electric may be familiar names, but there are also some exciting emerging consortia that combine the purchasing power of traditional competitors such as GM, Ford, and DaimlerChrysler, which jointly created Covisint. Similar initiatives are under way in industry groups including pharmaceuticals, commercial real estate development, and electronic subcomponents.
  • Business-to-consumer (B2C) e-commerce refers to exchanges between businesses and consumers, such as those managed by Amazon, Yahoo, and Charles Schwab & Co. B2C transactions can include the exchange of physical or digital products or services and are usually much smaller than B2B transactions.
  • Peer-to-peer (P2P) exchanges involve transactions between and among consumers. These exchanges can include third-party involvement, as in the case of the auction website eBay. Other operations that support peer-to-peer activity include Owners.com and Craigslist (classified ads), Gnutella (music), Monster (jobs), and Lavalife (personal services).
  • Consumers can band together to present themselves as a buyer group in a consumer-to-business (C2B) relationship. These groups may be economically motivated, as with demand aggregators, or socially oriented, as with cause-related advocacy groups at SpeakOut.com.

1.1 Rise of Peer-to-Peer Networking

            The Internet public became aware of the peer-to-peer (P2P) model, which enables users to transfer files directly, in early 2000 when the popular download software for music files, Napster, became a media event. It was already well known to a dedicated following mostly composed of teens and young adults. It emerged from the shadows when colleges began to filter Napster from their networks. The music files that students were downloading in computer labs and dorm rooms were clogging networks and devouring storage space. The music industry’s unwavering opposition to Napster forced first its sale to European media giant Bertelsmann and later a court-ordered filtering of all copyrighted material on the site.

            Godwin-Jones (2005) indicated that the surprising standout of P2P networking is the commercial success of a small electronic device called the iPod, Apple’s digital music player. While Apple has had success in selling digital music (with its own proprietary digital rights management) through its iTunes music store, the initial popularity of the iPod was built on the widespread sharing of music files through the original Napster and subsequent P2P networks (FastTrack, Gnutella, BitTorrent).

Despite lawsuits and crackdowns on file swapping at many universities, Godwin-Jones (2005) argued that P2P file swapping remains widespread, with free downloads of music still running, by most estimates, at many times the rate of sales of digital music through iTunes and other online music services. Although some language educators have recommended the use of P2P for sharing teaching resources, it has not been widely used for that purpose, due perhaps to the discrediting of the P2P process (through copyright infringements) and of P2P software (through intrusive adware and spyware). One interesting example, however, is the built-in P2P functionality of the Canadian LLEARN project for learning French (at the secondary school level). It is being used as part of the learning infrastructure to provide students a means to find and exchange resources.

The P2P system that has generated the most buzz lately is BitTorrent, which is particularly well suited to transferring large files, since it enables files to be transferred in smaller chunks from a variety of sources. Some software is being distributed this way (including language software), as well as media such as films from independent filmmakers. An interesting example of the use of BitTorrent is Fugu Tabetai’s large collection of annotated Manga stories (over 4,000 pages) for learning Japanese. One of the powerful possibilities for a P2P system like BitTorrent is the potential integration of digital rights management to identify legal use and of RSS (Really Simple Syndication) to enable searching and retrieving. As it stands, BitTorrent is not a good way to circumvent copyright, since the IP addresses of downloaders and uploaders can be tracked (Godwin-Jones, 2005).
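
To make the chunked-transfer idea concrete, the following minimal Python sketch splits a file into fixed-size pieces and records a SHA-1 hash for each piece; BitTorrent relies on per-piece hashes of this kind so that a downloader can verify every chunk independently, whichever peer supplied it. The file name and piece size here are illustrative.

```python
import hashlib

PIECE_SIZE = 256 * 1024  # 256 KiB, a typical piece size (illustrative)

def piece_hashes(path):
    """Split a file into fixed-size pieces and hash each one.

    Because every piece can be verified independently, chunks may
    safely arrive from many different peers.
    """
    hashes = []
    with open(path, "rb") as f:
        while True:
            piece = f.read(PIECE_SIZE)
            if not piece:
                break
            hashes.append(hashlib.sha1(piece).hexdigest())
    return hashes

# Hypothetical usage:
# for i, h in enumerate(piece_hashes("lesson-video.mp4")):
#     print(i, h)
```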

Today’s organizations face increasingly complex problems resulting from global competition; real-time, 24/7/365 operations; the explosion in information; and rapid technological change. The potential of P2P networking in organizations is profound: this type of model could solve numerous business problems in a cost-effective manner while making effective use of underutilized computing resources. P2P unlocks underused computing resources and allows a larger community to share them. This is often described as peer-to-peer computing (also called grid computing), a type of distributed computing that exploits the resources of the dispersed computers of a network. Instead of being clients in a client/server network, each connected device becomes a fully participating peer in the network (Turban et al., 2003).

1.2 Objectives

            Given the potentials of peer-to-peer (P2P) networking in organizations, the objectives of this study are:

  1. To find out how P2P emerged as network configurations
  2. To survey an organization that has successfully applied P2P networking
  3. To delineate the advantages and disadvantages of P2P networking in organizations
  4. To recommend possible solutions to the issues in P2P networking in organizations

2. Research Methodology

            In finding out the relevance of P2P in organisations, it will be vital to trace the history of peer-to-peer networking alongside the development of the Internet. Books, journals and articles will be used in researching the historical background. As organisations at present are engaging in electronic commerce, selected businesses will be studied to learn about their experiences with P2P and how they have improved their systems and processes by incorporating networking. Since this is a qualitative type of research, all information will be derived from secondary sources.

Once enough data has been gathered, suitable material will be chosen and analysed. To some extent, the data gathered will dictate the research findings, conclusion and recommendations of this study. As an example, consider historical data, those pieces of information gleaned from written records of past events. Little meaning can be extracted from historical documents used in isolation; they must be related to actual, factual events that transpired or were applied in an organisation.

            From there, we will construct interpretative narratives from our data and try to capture the complexity of the phenomenon under study. The analysis will necessarily be interpretative, based on the acquired information, and will include the perspectives of the sources. Although the validity of such analysis is subjective, it is expected that, with the support of academic and credible sources, sound and precise judgments will emerge from this research. This research could inform future studies of P2P networking in organisations.

3.  Review of Literature

In the public eye, peer-to-peer (P2P) computing has become almost synonymous with Napster, but it really goes beyond Napster and music sharing to include a broad sharing of resources and information through the direct linking of systems within networks. The underlying technology has the potential to change the way the growing number of participants in the wired world share information, content, and resources. The question that remains, however, is whether a viable business model exists underneath all the hype. Even in today’s relatively unfriendly technology market, over a hundred companies occupy the P2P computing space. Although many companies and individuals have entered this space, only a few contenders appear to have a sustainable business model (Gulati, Sawhney, & Paoni, 2003, p. 284).

Peer-to-peer (P2P) computer architecture is a type of network in which each workstation (or PC) has similar capabilities. This is in contrast with client-server architecture in which some computers serve other computers. The peer computers share data, processing, and devices with each other directly, rather than through a central server. The main benefit of P2P is that it can expand enormously the universe of information accessible from a personal computer—users are not confined to just Web pages. Additionally, some proponents claim that a well-designed P2P system can offer better security, reliability, and availability of content than the client-server model, on which the Web is based. Note that the acronym P2P also can stand for people-to-people, person-to-person, or point-to-point (Turban & King, 2003). In a web article, Ropelato (2006) simplified what P2P means:

Peer-to-Peer networking, known as P2P, is similar in concept to a browser. It is an application that runs on your PC and allows sharing of files. Napster used to be one of the most popular peer-to-peer application programs, sharing MP3 music files, until it was shut down by the U.S. Justice Department. Today, favorites like Limewire, Gnutella, Morpheus, Bearshare, and Kazaa share center stage. The following table is a list of some of the common peer-to-peer file sharing applications, most of which are free:

Kazaa        LimeWire     Gnucleus       Gnutella       JungleMonkey
Morpheus     Bodtella     iMesh          FileNavigator  MojoNation
MyNapster    BearShare    DirectConnect  eDonkey2000    Konspire
Gnutella     Mactella     Freenet        Newtella       Filetopia
Aimster      WinMX        Hotline        Flycode

Table 1. The Most Popular Peer-to-Peer File-Sharing Applications (Source: TopTenReviews.com).

According to Turban and King (2003), peer-to-peer systems have seven key characteristics:

  1. User interfaces that load outside of a Web browser.
  2. User computers can act as both clients and servers.
  3. The overall system is easy to use and is well integrated.
  4. The system includes tools to support users wishing to create content or add functionality.
  5. The system provides connections with other users.
  6. The system does something new or exciting.
  7. The system supports “cross-networking” protocols such as SOAP or XML-RPC.

These characteristics of P2P computing show that devices can join the network from any location with little effort. Instead of dedicated LANs, the Internet itself becomes the network of choice. Easier configuration and control over the applications enables people without network savvy to join the user community. In fact, P2P signifies a shift in peer networking emphasis—from hardware to applications. Peer-to-peer networking connects people directly to other people. It provides an easy system for sharing, publishing, and interacting that requires no knowledge of system administration. The system wraps everything up into a user-friendly interface and lets people share or communicate with each other.
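
Characteristics 2 and 7 above can be illustrated with a short sketch. The following Python snippet, a minimal illustration rather than any particular product's code, uses the standard library's XML-RPC support so that one process acts as both a server (exposing a list of shared files) and a client (querying a peer); the port number and file names are invented.

```python
import threading
import xmlrpc.client
from xmlrpc.server import SimpleXMLRPCServer

SHARED_FILES = ["notes.txt", "demo.mp3"]  # hypothetical local content

def list_files():
    """Remote procedure that other peers may call on us."""
    return SHARED_FILES

# Server half: expose our shared files over XML-RPC.
server = SimpleXMLRPCServer(("localhost", 9000), logRequests=False)
server.register_function(list_files)
threading.Thread(target=server.serve_forever, daemon=True).start()

# Client half: the very same process queries a peer (here, itself).
peer = xmlrpc.client.ServerProxy("http://localhost:9000")
print(peer.list_files())  # -> ['notes.txt', 'demo.mp3']
```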

An example of a P2P network is graphically illustrated in Figure 1. The workstations shown in the drawing perform computer-to-computer communication directly through their own operating systems, and each resource can be shared by all devices. In a nutshell, the concept of P2P networking is to allow computers to communicate directly with each other, rather than through a central server like a website. Once you have a peer-to-peer application installed, you can allow anyone in the world to copy files from your home PC. This can be a single file, an entire directory, or your entire hard drive. If care is not exercised, your entire hard drive, including any confidential documents, may be wide open to anyone in the world (Ropelato, 2006).

Figure 1. An Example of a Peer-to-Peer Network

3.1 History of the Emergence of P2P Networking

            Back in the 1960s, when the Internet was conceived, it was a P2P system. The Advanced Research Projects Agency (ARPA), of the U.S. Department of Defence, created ARPANET, which was the precursor of the Internet. The agency’s goal was to develop a host-to-host protocol that would improve and increase computer-research productivity through resource sharing over a common network. The University of California, Los Angeles, and the Stanford Research Institute were the first two hosts, with independent computing sites and equal status on the network, and ARPANET connected these universities as computing peers. After ARPANET was up and running, the researchers and developers realized that assisting human communication was the network’s most important contribution.

Early applications of the network provided a fundamental symmetry that made it radical in relation to other network structures. FTP and Telnet were client/server applications: an FTP client sent and received files from a file server, while a Telnet client logged into a computer server. The applications were client/server, but the usage patterns were symmetrical. The first e-mail program for a distributed network was released in 1971, and Telnet appeared in 1974. By 1983, ARPA research had developed the TCP/IP protocol suite (Transmission Control Protocol/Internet Protocol), to which it converted all the interconnected research networks of ARPANET, and the “Internet” became official. It had 500 hosts.

In 1984, the Domain Name System (DNS) was established as a solution to a file-sharing problem: it blends P2P networking with a hierarchical model of information ownership, and it was developed to distribute data sharing across the P2P Internet. Nearly a decade later, in 1993, Mosaic, the first widely used graphical Web browser, was released and took the Internet by storm; by 1994 the Internet comprised over three million hosts.

The explosion of the World Wide Web caused the Internet to become more and more restricted and pushed development toward the client/server model. This structure has plagued the Internet with firewalls and thus with extreme partitioning. There is a need to return the Internet to its initial design to effectively serve its original purposes of sharing and cooperating.

            However, the technological development that shaped P2P networking came when Shawn Fanning, an 18-year-old student with the nickname ‘the Napster’, was intrigued by the challenge of enabling his friends to ‘see’ and share their personal record collections. He argued that if they held these in MP3 format, then it should be possible to set up some kind of central exchange program which facilitated their sharing. The result—the Napster.com site—offered sophisticated software which enabled P2P transactions.

The Napster server did not actually hold any music in its files—but every day millions of swaps were made by people around the world exchanging their music collections. Needless to say this posed a huge threat to the established music business since it involved no payment of royalties. A number of high-profile lawsuits followed, but whilst Napster’s activities have been curbed the problem did not go away. There are now many other sites emulating and extending what Napster started—sites such as Gnutella take the P2P idea further and enable exchange of many different file formats—text, video, etc. In Napster’s own case the phenomenally successful site concluded a deal with entertainment giant Bertelsmann which paved the way for subscription-based services that provide some revenue stream to deal with the royalty issue (Tidd et al., 2005).

Napster’s website facilitated the sharing of MP3 files by holding a centralized index on its server containing the names of all MP3 music files held by other Napster users. Within a year, by August 2000, Napster had 20 million members using its free software, leading the pack of similar file-sharing networks. Similar file-sharing networks such as KaZaA, Grokster and Limewire are decentralized systems that convert a user’s hard drive into a directory that can be accessed by other members, rather than using a centralized directory of files (Segkar 2003, p. 29).

3.2 P2P Network Topologies

            There is no central coordinating authority for the organization of a P2P network (setup aspect) or for the use of resources and communication between the peers in the network (sequence aspect). This applies in particular to the fact that no node has central control over another. In this respect, communication between peers takes place directly, and each computer is able to respond to requests for information or computing resources from other peer computers.

This deviation from the classic client/server system has led to two common network topologies: pure P2P and hybrid P2P. Each offers advantages, and both coexist with the current Internet topology. Because all components share equal rights and equivalent functions, pure P2P networks represent the reference type of P2P design. Within these structures there is no entity that has a global view of the network (Barkai, 2001, p. 5). In hybrid P2P networks, selected functions, such as indexing or authentication, are allocated to a subset of nodes that, as a result, assume the role of a coordinating entity. This type of network architecture combines P2P and client/server principles.

            In a pure P2P architecture, no centralized servers exist. All peers are equal, thus creating a flat, unstructured network topology (Peter et al., 2002). In order to join the network, a peer must first contact a bootstrapping node (a node that is always online), which gives the joining peer the IP address of one or more existing peers, officially making it part of the ever-dynamic network. Each peer, however, will only have information about its neighbours, the peers that have a direct edge to it in the network. Since there are no servers to manage searches, queries for files are flooded through the network (Kurose & Ross, 2003). Query flooding is far from an ideal solution, as it entails large overhead traffic in the network. An example of an application that uses this model is Gnutella.
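
A toy simulation can make query flooding concrete. In the Python sketch below (illustrative only, not the actual Gnutella wire protocol), each peer knows only its neighbours, and a query is relayed hop by hop with a time-to-live (TTL) and a visited set so that it eventually dies out; the network layout and file names are invented.

```python
def flood_search(peers, start, filename, ttl=3):
    """peers maps a peer id to {'files': set, 'neighbours': list}."""
    hits, visited = [], set()

    def relay(node, remaining_ttl):
        if node in visited or remaining_ttl < 0:
            return                       # query dies out here
        visited.add(node)
        if filename in peers[node]["files"]:
            hits.append(node)            # a query hit travels back
        for n in peers[node]["neighbours"]:
            relay(n, remaining_ttl - 1)  # forward to every neighbour

    relay(start, ttl)
    return hits

network = {
    "A": {"files": set(),        "neighbours": ["B", "C"]},
    "B": {"files": {"song.mp3"}, "neighbours": ["A", "D"]},
    "C": {"files": set(),        "neighbours": ["A"]},
    "D": {"files": {"song.mp3"}, "neighbours": ["B"]},
}
print(flood_search(network, "A", "song.mp3"))  # -> ['B', 'D']
```

Even in this tiny example, every reachable peer is visited for a single query; the overhead grows quickly with network size, which is exactly the inefficiency noted above.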

Figure 2. Two Major Forms of Peer-to-Peer Networks (Source: O’Brien, 2003).

On the other hand, more complex real-world systems generally combine several basic topologies into one system, known as a hybrid architecture (Yang & Garcia-Molina, 2003). In such systems, nodes will usually play more than one role.

One common example of a hybrid topology is the centralized-and-ring topology, a very common sight in the world of Web hosting. Heavily loaded Web servers usually rely on a ring of servers that specializes in load balancing and failover; the servers themselves thus maintain a ring topology. The clients, however, are connected to the ring of servers through a centralized topology (i.e., a client/server system), as shown in Figure 3. The entire system is therefore a hybrid, a mixture of the sturdiness of a ring topology and the simplicity of a centralized system.

Figure 3. A Simplified Illustration of the Centralized and Ring Topology

With centralized hybrid P2P topologies, the Super Nodes maintain a database that maps file names to the IP addresses of all peers assigned to them (Yang & Garcia-Molina, 2002). It should be noted that a Super Node’s database only keeps track of the peers within its own group, which greatly reduces the scope of peers it needs to serve; any ordinary peer with a high-speed connection can qualify to be a Super Node. The best example of a P2P application that utilizes such a topology is Kazaa/FastTrack. It should also be noted that the hybrid topologies mentioned so far are just the common ones. A great many combinations of hybrid topologies can be built from the basic ones; however, if too many are combined, the resulting topology may become too complex, and hence difficult to manage.
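
The Super Node's role as a group-local directory can be sketched in a few lines. The Python fragment below is a simplified illustration (not FastTrack's actual protocol); the addresses and file names are made up.

```python
class SuperNode:
    """Indexes only the peers assigned to its own group."""

    def __init__(self):
        self.index = {}  # file name -> set of peer IP addresses

    def register(self, peer_ip, filenames):
        """Called when an ordinary peer joins this Super Node's group."""
        for name in filenames:
            self.index.setdefault(name, set()).add(peer_ip)

    def lookup(self, filename):
        """Answer a query without flooding the whole network."""
        return self.index.get(filename, set())

sn = SuperNode()
sn.register("10.0.0.5", ["song.mp3", "clip.avi"])
sn.register("10.0.0.7", ["song.mp3"])
print(sn.lookup("song.mp3"))  # -> {'10.0.0.5', '10.0.0.7'}
```

A query thus touches one Super Node (or a handful of them) instead of every peer, which is what keeps the hybrid design scalable.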

3.3 Controversies That Abound

Amid the technological advantages of P2P networking, the issue of copyright infringement has been magnified. Online music-sharing networks have been used by millions of people since they first appeared in 1999 (Leuf 2002; Associated Press 2004). The popularity of peer-to-peer (P2P) technology and the low-cost music distribution channel it created threaten the dominant position of a small number of large firms in the commercial music industry (Alexander 2002). Threatened by the possible restructuring of the entire industry within five to ten years (BBC 2002; Mann 2003), the dominant players responded with a fierce campaign against online copying.

They oppose music-sharing networks on the grounds that the networks facilitate mass copyright infringement and erode the industry’s profits. This type of institutional response orchestrated by the incumbent economic players has precedents: eighteenth-century publishers in Great Britain resisted the emergence of public circulating libraries (Roehl and Varian 2000), and Hollywood studios perceived video technology, when it first appeared in the late 1970s, as a threat to their movie revenues (Roehl and Varian 2000; Alexander 2002).

According to the Recording Industry Association of America (RIAA), which represents music copyright owners, online file sharing causes a significant decline in CD sales and erodes the financial incentives for the production of new material (RIAA 2003). Several research studies, however, suggest that online file sharing may not have a negative effect on the music market (Alexander 2002). In a widely publicized study, Felix Oberholzer and Koleman Strumpf (2004) compared directly observed data on CD sales and downloading. They concluded that “downloads have an effect on sales which is statistically indistinguishable from zero.” The authors explained the result by noting that “most [P2P] users … would not have bought the album even in the absence of file sharing.” Contrary to the RIAA claims, file-sharing networks may have also increased the supply of music–the networks brought visibility to many “non-marketable” artists that could not use distribution channels controlled by the profit-seeking commercial music establishment (Gallaway and Kinnear, 2001).

The commercial music industry pursues an offensive strategy comprising litigation, lobbying, and self-help (Yu, 2003). Often testing the boundaries of legal and regulatory systems, the war against file swapping set off sharp and sagacious debates on the nature of intellectual property, the role of the copyright law, and fundamental notions of citizenry such as freedom of speech (Goldstein 2003; Green 2003). Since institutions invariably affect the economy (North 1992), the outcomes of the polemics in courts will have considerable economic consequences for the recording industry and the entire economy.

New laws and new interpretations of old laws may cause new industries that are attempting to grow on the platform of peer-to-peer technology to flourish or decline (Elkin, 2002). But it is still not clear to what extent the recording industry can control the recalcitrant music networks (France and Grover, 2003). Some analysts predict that peer-to-peer networks will ebb under pressure from the music industry, while other people prophesise a continuous rise in popularity (BBC, 2002). There also have been warnings that the true danger to the existence of file-sharing networks is not the belligerence of the music industry but the prevalence of free riding on P2P networks (Adar and Huberman 2000; Alexander 2002).

As the Internet is a modern-day abyss, the amount of information that can be communicated is infinite; access to files is achieved at the click of a button, anonymously, anywhere in the world. The consequences of digital copying of music files are much more detrimental than those of the copying familiar in the analog world. The nature and scope of digital copying is vastly different from analog copying due to the ease, quality and cost of making musical copies. The term “digital copying” here represents online copying, which includes copies made through peer-to-peer networks, while “analog copying” refers to pre-Napster copying on cassette tapes and CDs, which presents a lesser threat to copyright holders since such copying can be detected and controlled.

The first important distinction arising from these two kinds of copying is the simplicity in making digital copies. There is greater availability and accessibility to works on the Internet. In a paperless world, there is more room to store information. Unlike the physical world, there is no spatial limit and therefore there is an enormous range of available works online. Furthermore, there are no physical boundaries in the virtual world and no need to be in a particular location in the world to access information; therefore there is greater ease in accessing all available works online. With a vast availability of works online coupled with an easy access to such works, it is understandable why making digital copies is so simple and why there is a greater scope of copying on the Internet.

Secondly, digital copies pose a severe problem to the music industry due to the high quality of these copies. Analogue copies such as paper photocopies or cassette tape recordings are never as good as the original. Digital copies, including CD copies, are clones of the originals and thus possess the same high-quality characteristics. More importantly, digital copies preserve their original state through successive copying, which is why a music file on a P2P network can be shared and downloaded by millions of users without impairing the quality of the file. On the contrary, analogue copying is limited by the very quality of the copy and its durability in subsequent copies. Making a copy of a copy will eventually stop since the quality of each ensuing copy will diminish. Thus, digital copies occur more frequently and in larger numbers as the copies are never inferior to the original.

The last distinction is the cost of digital copies. Monetary expenses, time, effort and convenience can all fall under the rubric of “copying costs”. Starting from the last category, digital music copies can be made anywhere in the world, in the comforts of home, within minutes using a high-speed Internet connection. The primary monetary expense is a user’s Internet subscription fee and unlike media such as blank CDs, the RIAA does not impose a levy on hard drive memory integrated in PCs. Thus, the cost-effectiveness of making digital copies makes downloading music from P2P networks much more desirable, which in turn contributes to the high prevalence of illegal file-sharing.

The longevity of file-sharing networks relies heavily on the continued uploading and sharing of music. This underlying principle of sharing is a primary reason why P2P systems are such powerful distribution channels, especially when distribution spans the world. However, the concept of “sharing” in P2P philosophy is different from its traditional understanding, which requires the sharer to give something up. Napster brought us a new kind of “sharing”, one in which recipients could enjoy the giver’s munificence while the “giver never had to give anything up”. File-sharing continues in abundance primarily because most users are not truly sharing.

There is no scarcity of available music, let alone any need to give up something, such as money, in order to acquire it. P2P “sharing” is a sham, and its activity has overshadowed the artist and the importance of ascertaining whether the artist has given permission for his music to be freely copied and shared in the first place. Hence, file-sharing networks are not necessarily the culprits; it is the unauthorized copying occurring on these networks that forms the nucleus of the problem.

Perhaps this misguided perception of P2P file-sharing results from a lack of clarity surrounding the legalities of P2P activities. Or perhaps it is due to indifference, a lack of sympathy, or even a lack of education and understanding. The scope and extent of copying on P2P networks gives rise to consequences that are detrimental to a musician’s copyright. It is the musician, not the public, who should decide whether his music ought to be freely shared over the Internet. The evolution of file-sharing technology has stripped this choice from the artist and put it in the hands of the masses. Whether the public argues that its actions are not responsible for the music industry’s troubles, or that P2P networks are merely addressing a market void for digital music, unauthorized sharing and copying of music is morally and ethically wrong. Without proper redress of this problem, society is hurting an important contributor to any nation’s economy, hindering the development of a global industry that serves as a strong platform to share popular culture with the world, and perpetuating an ethos that condones theft.

Thus, the expectations that legal protection would limit the impact of this revolution have been dampened by a U.S. Court of Appeals ruling which rejected claims that P2P violated copyright law. The judgment said, “History has shown that time and market forces often provide equilibrium in balancing interests, whether the new technology be a player piano, a copier, a tape recorder, a video recorder, a PC, a karaoke machine or an MP3 player” (Personal Computer World, 2004).

Significantly, the new opportunities opened up by this were seized not by music industry firms but by computer companies, especially Apple. In parallel with the launch of its successful iPod personal MP3 player, Apple opened a service called iTunes which offered users a choice of thousands of tracks for download at 99 cents each. In its first weeks of operation it recorded 1 million hits, and it has gone on to be the market leader in an increasingly populated field, having notched up over 50 million downloads since opening in mid-2003 (Digital Music News, 1 December 2004).

3.4 Potential of P2P to Organizations

            With P2P, users can sell digital goods directly from their computers rather than going through centralized servers. If users want to sell on eBay, they are required to go through eBay’s server, putting an item on eBay’s site and uploading a photo. However, if an auction site uses file sharing, it can direct customers to the seller’s PC, where buyers can find an extensive amount of information, photos, and even videos about the items being sold. In this case, an auction site serves as an intermediary, making the C2C link between the sellers and buyers.

Several companies are using P2P to facilitate internal collaboration. For example, in 1990, Intel wrote a file transfer program called NetBatch, which allows chip designers to utilize the additional processing power of colleagues’ computers across sites in California, Arizona, and even foreign countries such as Israel. Intel saved more than $500 million between 1992 and 2001 (Intel.com, 2001).

Furthermore, P2P could be a technology panacea for systems innovators building B2B exchanges. With P2P, people can share information, but they are not required to send it to an unknown server. Some companies fear that exchanges make it possible for unauthorized personnel to gain access to corporate data files. P2P applications enable such companies to store documents in-house instead of on an unknown, and possibly unsecured, server. Several companies are using the P2P architecture as a basis for speeding up business transactions, as shown in the following examples.

  • Hilgraeve of Monroe, Michigan, has a technology called DropChute that establishes a connection between two computers and allows users to transfer files. The company has won a U.S. patent for its P2P communication process that touts four levels of encryption and virus-scanning protection.
  • Fort Knox Escrow Service in Atlanta, Georgia, which transmits legal and financial documents that must be highly secure, has leveraged DropChute to enable clients to deliver material electronically. ‘Instead of having to wait for an overnight package, we can do it all over the Internet,’ said Jeanna Israel, Fort Knox’s director of operations (Lason, 2001).
  • Certapay.com is a P2P e-mail payment platform that enables e-banking customers to send and receive money using only an e-mail address.

Peer networks effectively address the Web’s B2B deficiencies. The model is a natural fit for the needs of business, as business relationships are intrinsically peer to peer. Peer networks allow businesses to communicate, interact, and transact with each other as never before by making business relationships interactive, dynamic, and balanced—both within and between enterprises. However, the success of P2P in B2B is not guaranteed. It depends in part on the ability of the technology to address security and scalability issues.

There are several potential applications of P2P in marketing and advertisement. For example, fandango.com is combining P2P with collaborative filtering. Assuming a user is conducting a search for a product, Fandango’s product will work as follows (a sketch of the relay appears after the list):

  1. The user enters a search keyword.
  2. The keyword is sent to 100 peers, which search local indexes of the Web pages they have visited.
  3. Those computers also relay the query to 100 of their peers, and that group submits it to 100 of theirs, yielding, in theory, up to 1 million computers queried.
  4. The resulting URLs are returned to the user, weighted in favor of the most recently visited pages and peers with similar interests.
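
As a rough illustration of steps 1 to 4, the Python sketch below relays a keyword through a small peer graph with a fixed fan-out and depth, scoring URLs by how recently each peer visited them. The peer data, scoring rule, and tiny fan-out are all invented for the example; the scheme described above would use a fan-out of 100 per hop.

```python
def relay_search(peers, peer_id, keyword, depth=3, fanout=2):
    """Return {url: score} gathered from this peer and its relays."""
    results = {}
    node = peers[peer_id]
    # Step 2: search this peer's local index of recently visited pages.
    for url, recency in node["index"].items():
        if keyword in url:
            results[url] = results.get(url, 0) + recency  # favour recent pages
    # Step 3: relay the query to a bounded number of further peers.
    if depth > 0:
        for neighbour in node["neighbours"][:fanout]:
            relayed = relay_search(peers, neighbour, keyword, depth - 1, fanout)
            for url, score in relayed.items():
                results[url] = results.get(url, 0) + score
    # Step 4: the accumulated, weighted URLs flow back to the caller.
    return results

peers = {
    0: {"index": {"shop.example/phone": 2},        "neighbours": [1, 2]},
    1: {"index": {"blog.example/phone-review": 5}, "neighbours": [2]},
    2: {"index": {"news.example/sports": 1},       "neighbours": []},
}
print(relay_search(peers, 0, "phone"))
# -> {'shop.example/phone': 2, 'blog.example/phone-review': 5}
```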

Also, P2P enables new e-commerce applications. Although sensitive information may require special security arrangements and many users may encounter scalability issues, P2P is a very promising technology. Bleicher (2001) suggested that the final frontier of P2P computing is distributed computing. The vast majority of the time, the central processor of your computer is idle: you and your programs just don’t give it enough work to do, especially when you are away from your desk. Distributed computing makes use of those idle “cycles” by distributing a large computing project among many computers.

The best-known example of this is SETI, the Search for Extraterrestrial Intelligence project started a few years ago at the University of California, Berkeley. It uses the Internet to distribute the massive computational workload required to analyze radio telescope signals for signs of intelligent life. More than 2.4 million people run a screen saver that performs complex mathematical computations continuously when their computers would otherwise be idle. When a computation is completed (sometimes taking a day or two), the results are uploaded to the central server, and a new problem is downloaded. The result is a distributed computer that is more powerful than the fastest computer in the world.
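
The work cycle just described (download a chunk, compute while idle, upload the result, fetch the next) can be sketched schematically. The Python fragment below is a stand-in for that loop, not the actual SETI@home software: the work units, client names, and placeholder "analysis" are all invented.

```python
import queue

work_units = queue.Queue()
for chunk_id in range(5):        # pretend these are telescope-data chunks
    work_units.put(chunk_id)

results = {}                     # what the central server accumulates

def run_client(client_name):
    """One volunteer machine: fetch a unit, compute, upload, repeat."""
    while True:
        try:
            unit = work_units.get_nowait()   # download the next problem
        except queue.Empty:
            return                           # no work left
        # Placeholder for the long-running signal analysis.
        results[unit] = f"analysed by {client_name}"

for name in ("alice-pc", "bob-laptop"):      # two volunteer machines
    run_client(name)

print(results)   # every chunk ends up processed by some idle client
```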

Lest we think that this concept is restricted to odd applications, consider this: Juno, one of the largest suppliers of free Internet access, just announced that it is creating a “virtual supercomputer network.” Users of its free Internet service will be required to install a screen saver that will use the idle computational power of their computers. Juno plans to sell this capacity to pharmaceutical and biotechnology companies that require massive computation power for bioinformatics, including genomics database mining.

Since P2P computing uses the processing power of millions of individual personal computers to perform a variety of applications, many have considered applying it in organizations. Philanthropic examples of these applications include searching for extraterrestrial intelligence (setiathome.ssl.berkeley.edu), exploring AIDS treatments (fightaidsathome.org), and researching potential cancer drugs (ud.com). But peer-to-peer is also about companies tapping their employees’ PCs to analyze numbers or share knowledge. Examples of the P2P model providing benefits for businesses include:

  • J.P. Morgan Chase & Company (jpmorganchase.com) uses P2P for large processing tasks such as risk management calculations.
  • The law firm of Baker & McKenzie (bakerinfo.com) uses P2P to capture and share the knowledge of its 3,000 attorneys in 60 offices.
  • Pratt & Whitney (prattwhitney.com) uses P2P to analyze engine turbine blades.
  • Intel (intel.com) uses P2P to test new chip designs.
  • Boeing (boeing.com) uses P2P to analyze the fuselages of fighter planes.
  • IBM’s (ibm.com) computing grid, known as the Distributed Terascale Facility (DTF), is capable of 13.6 trillion calculations per second. The grid enables scientists to share computing resources in search of breakthroughs in many disciplines.

However, P2P does have its disadvantages. Handing off sensitive corporate data to unknown PCs over the public Internet poses serious technical and security issues. PCs vary in power, operating systems, and availability. Also, if a participant shuts down his or her PC during a crucial bit of computation, that computation must be reassigned. Limited bandwidth can slow down network traffic, forcing some pieces of processing to wait in long queues.

Another disadvantage of P2P networks is obtaining the number of machines needed for effective computing. With some projects, people are happy to volunteer their unused computing capacity for a good cause. However, it is unclear whether people would be as willing to volunteer to help businesses. As a result, some businesses are offering money for participation. For example, a company called Distributed Science (distributedscience.com) rents CPU cycles and network bandwidth on the public’s personal computers. Most of the value of P2P computing is not yet developed or even conceived.

4. Research Findings

With the development of networks, information has multiplied and become a critical resource. Within this sea of information, we have some that others may want. There is no reason for us not to provide it to them, but we have no system that allows us to find one another effectively. Napster overcame the challenge for music. By allowing its users to find the content they were looking for anywhere in the world, Napster greatly increased the content available to all.

Most of the technical requirements to handle this type of file-sharing community are already available. Today’s file-sharing software has only marginal use outside the music industry, but that will change. P2P computing allows people and companies to significantly decrease the complexity and expense of networking. It is not difficult to create Napster- or Gnutella-like programs that are specific for other communities, such as business to business (B2B). XML will be a major enabler for interoperability among networks.

P2P technology is already giving rise to new ways of networking and collaborating directly with peers and avoiding the complexities of centralization. It is not hard to imagine the value of a private collaborative network. The idea of sharing spaces with peers and working together without redundancy on a project is fundamental in any industry. Security, however, is a serious concern even in traditional networks that depend on centralized servers, and P2P networks exacerbate the problem. “The old days [i.e., the current Internet] were all about centralization and control, almost Soviet style,” said Miko Matsumura, the CEO and cofounder of Kalepa Networks, a 3-year-old start-up that intends to link P2P networks and create a sort of alternative Internet. “In this new topology, everyone brings his or her own resources. The new network will be built on top of the old network, like Rome was built in different layers.”

One recognized application using aggregated resources is the SETI@home project. Launched May 17, 1999, at the University of California, Berkeley, the project set out to help find extraterrestrial life beyond our solar system. Radio-frequency signals received from outer space are distributed among the participating computers in the network to enable an unprecedented number-crunching application, as those computers use the gaps in their processing time to analyze chunks of the signals for a repeated pattern that would indicate an intelligent source. To date, over 2.8 million users in 226 countries have contributed nearly 583,000 years of computing to the cause. This computing rate of 25 teraflops more than doubles the speed of IBM’s ASCI White, the fastest supercomputer in the world.

The power behind this direct two-way exchange of information and the vast pool of underused resources represent a wealth of opportunities waiting to be exploited. Pioneering individuals and firms have organized, planned, and are already off and running in search of the pot of gold that awaits them. Before declaring any winners, it is imperative to consider what lies beneath the P2P technology, understand the sources of value, consider the landscape as it exists today, identify the technology enablers, and address the barriers that still need to be overcome. Clay Shirky, from the Accelerator Group, indicates that two questions test whether we have a P2P network:

  1. Does it allow for variable connectivity and temporary network addresses?
  2. Does it give the nodes at the edges of the network significant autonomy?

If the answer to both questions is yes, the application qualifies as peer to peer. P2P computing allows for the sharing of resources and information through the direct linking of systems within a network. Since the Internet is the most extensive computing network available, P2P computing can take advantage of that existing infrastructure to connect peers. In a typical Internet client/server interaction, the computer at the edge of the network acts as a client while central servers store information. In the P2P system, an additional software layer enables the computers at the edge of the network to act as clients and/or servers depending on the need. Now the computer is able to respond to a request for information or computing resources from another peer computer. This deviation from the classic client/server system has led to common network topologies: pure P2P and hybrid P2P. Each offers advantages and is coexisting with the current Internet topology.

4.1 SETI@home Project

One dramatic use of P2P is aggregating the unused computing power of individual personal computers into a computer power grid to create a virtual supercomputer. Some analysts suggest that today’s Internet-connected personal computers, with their vastly improved resources, represent more than 10 billion MHz of processing power and 10 thousand terabytes of storage — with more than half of that capability going unused! An interesting example of tapping into that unused computing power is the SETI@home project.

When researchers at the University of California at Berkeley faced a lack of funds to pay for computer analysis of radio telescope data gathered for the SETI (Search for Extraterrestrial Intelligence) SERENDIP project, they found a unique solution: harnessing the unused power of thousands of personal computers around the world. The SETI@home project allows people interested in participating in the radio telescope data analysis to download a screensaver program from the SETI@home Web site. When their computer is turned on but not being used, the screensaver program downloads chunks of radio telescope data and examines them for patterns. The results of the analysis are then uploaded to the SETI@home project servers and combined with the pattern analysis performed by thousands of other personal computers participating in the project. The SETI@home project has likely surpassed the wildest dreams of its creators, as more than 3.7 million people have downloaded the software, contributing almost one million years of computing time to the project.

4.2 Napster

Peer-to-peer’s next big event was Napster, an e-business that allowed users to share music and other files (stored on their individual personal computers) with other users via the Internet. Technically speaking, Napster didn’t employ a pure P2P network. That’s because, in a pure P2P network, every file search traverses the entire P2P network, returning a list of all computers on the network that contain a copy of the searched-for file. By contrast, Napster relied on a master server that maintained a searchable list of all available files and their locations. (A pure P2P network would not include a master server.) To find a music file, Napster users first searched the master server’s list and then directly contacted the individual computer where the file was stored to download it.
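
Napster's two-step model, as described above, can be summarized in a short sketch: ask the central index who has a file, then contact one of the listed peers directly. The Python fragment below is purely illustrative; the index contents, addresses, and the stubbed-out transfer are invented.

```python
MASTER_INDEX = {  # central server: file name -> peers currently sharing it
    "track01.mp3": ["192.0.2.10:6699", "192.0.2.44:6699"],
}

def napster_search(filename):
    """Step 1: ask the central server who has the file."""
    return MASTER_INDEX.get(filename, [])

def download(filename):
    """Step 2: contact a listed peer directly; no data passes the server."""
    for peer in napster_search(filename):
        print(f"connecting to {peer} for {filename} ...")
        return peer  # a real client would open a socket and transfer here
    return None      # nobody is sharing the file right now

download("track01.mp3")
```

The sketch also makes the legal chokepoint obvious: remove MASTER_INDEX and every search fails, which is why shutting down the central server effectively shut down the service.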

File swapping — especially of music files — exploded in popularity, and soon millions of people were using Napster to share music files. Of course, the recording industry was not pleased, and several suits were filed against Napster for copyright violations. The courts agreed with the recording industry that Napster’s music file-sharing operations violated copyright laws. In the summer of 2001, the courts ordered Napster to shut down its service until it could demonstrate a reliable way to prevent the swapping of copyrighted files. By June 2002, the embattled Napster, without revenues or sources of additional funding, filed for bankruptcy protection, and its assets were acquired by Bertelsmann AG, the German media conglomerate.

4.3 P2P Technologies: Gnutella, KaZaA, and JXTA

The courts’ actions against Napster did not stop file sharing or progress in the development of P2P technologies, of course. One example of a pure P2P technology is Gnutella. The Gnutella open-source P2P technology (created at Nullsoft and later carried forward by open-source developers, among them the late Gene Kan) allows file sharing directly between individual computers across an IP network. However, Gnutella exposed a fundamental inefficiency in many P2P networks: it didn’t scale well. In other words, as more people became part of the Gnutella P2P network, file searches became slower and slower, because each file search had to traverse the entire P2P network.

The solution to this P2P scalability problem was the introduction of supernodes, computers with fast connections that act as hubs for several computers. The supernodes can also be supplemented by super-supernodes, computers that act as hubs and indexes for multiple supernodes. Thanks to the supernode solution, P2P file searches now only have to traverse a limited number of super- and super-supernodes. One e-business that uses this technique to allow file searching and sharing between its customers is KaZaA, which has a P2P network developed by Sharman Networks.

The Gnutella network is declining in size now that a better solution (supernodes) is being provided by competitors such as KaZaA. By some measures, the Gnutella network is, as of this writing, only about one-quarter the size it reached in mid-2001 at the height of its popularity. Other technology providers such as Sun Microsystems, Inc. are also involved in developing P2P technologies that can be used for more than just music file sharing. For example, Sun’s JXTA project involves developing P2P technologies that would allow any devices connected to a network (such as servers, personal computers, wireless cell phones, or PDAs) to communicate with each other in a P2P environment.

4.4 P2P Payments

Person-to-person (P2P) payments are one of the newest and fastest-growing payment schemes. They enable the transfer of funds between two individuals for a variety of purposes, like repaying money borrowed from a friend, sending money to students at college, paying for an item purchased at an online auction, or sending a gift to a family member. One of the first companies to offer this service was PayPal (paypal.com). PayPal claimed, in late 2001, to have 8 million customer accounts, handling 25 percent of all eBay transactions and funneling $5 billion in payments through its servers annually. Although PayPal had not made a profit by fall 2001, this kind of business activity has drawn the attention of a number of other companies trying to get in on the action. Citibank c2it (c2it.com), AOL QuickCash (aol.com), Bank One’s eMoneyMail, Yahoo! PayDirect, and WebCertificate (webcertificate.com) are all PayPal competitors.

Virtually all of these services work in a similar way. Assume you want to send money to someone over the Internet. First, you select a service and open an account with it. Basically, this entails creating a user name and password, giving the service your e-mail address, and providing it with a credit card or bank account number. Next, you add funds from your credit card or bank account to your P2P account. Once the account has been funded, you’re ready to send money.

You access PayPal (for example) with your user name and password. Now you specify the e-mail address of the person to receive the money, along with the dollar amount that you want to send. An e-mail is sent to the payee’s e-mail address. The e-mail will contain a link back to the service’s Web site. When the recipient clicks on the link, he or she will be taken to the service. The recipient will be asked to set up an account to which the money that was sent will be credited. The recipient can then credit the money from this account to either his or her credit card or bank account. The payer pays a small amount (around $1) per transaction.
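
The funding-and-claim flow described in these two paragraphs can be modelled in a few lines. The Python sketch below is hypothetical: the flat $1.00 fee echoes the "around $1" figure cited above, while the function names, balances, and in-memory "service" are invented for illustration.

```python
FEE = 1.00      # flat per-transaction fee, roughly the "around $1" cited above

accounts = {}   # e-mail address -> account balance
pending = {}    # payee e-mail -> amount awaiting claim via the e-mailed link

def fund(email, amount):
    """Add funds from a credit card or bank account (stubbed out here)."""
    accounts[email] = accounts.get(email, 0.0) + amount

def send(payer, payee, amount):
    """Debit the payer and park the money until the payee claims it."""
    if accounts.get(payer, 0.0) < amount + FEE:
        raise ValueError("insufficient funds")
    accounts[payer] -= amount + FEE
    pending[payee] = pending.get(payee, 0.0) + amount  # e-mailed link points here

def claim(payee):
    """The recipient follows the e-mailed link and credits the money."""
    fund(payee, pending.pop(payee, 0.0))

fund("alice@example.com", 50.00)
send("alice@example.com", "bob@example.com", 20.00)
claim("bob@example.com")
print(accounts)  # -> {'alice@example.com': 29.0, 'bob@example.com': 20.0}
```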

4.5 Other Commercial Use of P2P

And how are e-businesses using P2P today? As of now, P2P technologies are being used to share drug data between more than 10,000 employees and researchers at a major drug manufacturer. One technology provider uses P2P technologies to distribute computer-based training materials to employees more efficiently. And a major aircraft manufacturer uses P2P technologies (instead of a wind tunnel) to create a computing power grid that simulates wind flow over a structure. According to Bleicher (2001), other companies are already using distributed computing to model influenza, the interaction of chemotherapeutics with cancer cells, protein folding, and other complex biological systems.

4.6 Educational Uses of P2P

            The advent of the P2P revolution (which began around 1999 with Napster and with the introduction of new compression formats for digital media such as MP3 and DivX) has provided language teachers with a potentially invaluable, multi-faceted didactic tool, thanks to the ease of use, availability and flexibility of the digital realia it can provide. Obviously, some digital realia are more effective than others.

We all have, at one point or another, used a Web page (or online newspapers or streaming TV newscasts) in the target language to illustrate a linguistic or a cultural point. According to Bregni (2006), two formats of digital realia prove to be the most effective in teaching: songs (music videos primarily, then audio) and videos (TV shows such as TV series, cartoons, and soap operas, but also commercials and short movies). Also, the most effective results are obtained through diverse combinations of media. Such variety would have been nearly impossible before the advent of the P2P revolution, and since then the horizons have been constantly widening.

            Also, Turban & King (2003) have seen the potential of P2P in edutainment, a combination of education and entertainment, often through games. One of the main goals of edutainment is to encourage students to become active rather than passive learners. With active learning, a student is more involved in the learning process, which makes the learning experience richer and the knowledge gained more memorable. Edutainment embeds learning in entertainment to help students learn almost without their being aware of it. The media often exchanged over P2P networks are a great resource that can be shared with students swiftly and conveniently.

5. Gnutella: A Case Study

            The Gnutella network is a P2P network that was designed to allow users the ability to share files without worrying about network access being blocked by system administrators. Targeted towards university students, it instantly became a popular network because of its many built-in privacy measures. Over the course of its continued use, it has gained a lot of attention from developers who have produced dozens of client applications that could be used to access the network (Piccard 2005, p. 240).

            Compared with Napster (and comparable systems such as iMesh), Gnutella operates on a pure P2P topology. Instead of working through a central server, Gnutella creates a network of users over the Internet and passes each search request from user to user until the file is found. Napster connects users to other users but does not store any music, pass any music through its servers, or connect users on one server to those on another; Gnutella, by contrast, has no company behind it and no central servers, so it is very difficult to “shut off” or regulate. Gnutella (like iMesh, Morpheus, and KaZaA) also differs from Napster in that it can carry any kind of file, not just WMA or mp3 music files. Gnutella is also open-source software; consequently, many different versions of the Gnutella-style interface exist on the Web, with names like Gnotella, Gnut, LimeWire, and BearShare (Jones & Lenhart, 2004).
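
The user-to-user relaying amounts to flooding a query through the overlay with a hop limit. The sketch below is a simplified illustration of that idea, assuming each peer keeps a neighbour list and a loop guard; it mirrors the protocol’s behaviour, not its wire format.

# Simplified Gnutella-style query flooding: each peer forwards a query
# to its neighbours with a decremented TTL and ignores repeats.

class Peer:
    def __init__(self, name, files):
        self.name = name
        self.files = set(files)
        self.neighbours = []
        self.seen = set()        # query IDs already handled (loop guard)

    def query(self, query_id, keyword, ttl, results):
        if query_id in self.seen or ttl == 0:
            return
        self.seen.add(query_id)
        if any(keyword in f for f in self.files):
            results.append(self.name)    # a "QueryHit" in real Gnutella
        for n in self.neighbours:        # flood to all neighbours
            n.query(query_id, keyword, ttl - 1, results)

a, b, c = Peer("A", []), Peer("B", []), Peer("C", ["song.mp3"])
a.neighbours, b.neighbours = [b], [c]
hits = []
a.query("q1", "song", ttl=7, results=hits)   # TTL 7 is Gnutella's usual default
print(hits)                                   # ['C']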

Gnutella was designed and created by the two founders of Nullsoft, the company that also created Winamp, SHOUTcast, and WASTE. Justin Frankel and Tom Pepper designed the protocol and its related client application to help users easily share data while allowing total control over how the data was shared. This control allowed knowledgeable users to bypass firewalls and join networks without centralized servers. Nullsoft started out as a small business; however, America Online purchased it in June 1999, with the founders staying on as full-time developers. The protocol and the original Gnutella application were unveiled in March 2000. It was first publicized on the Web site Slashdot.org, where tens of thousands of users downloaded it within the first day (Piccard 2005, p. 241).

There were also plans to release the source code to the Internet public, but these plans were short-lived. Within a day of the original announcement, America Online forced the removal of the source code from Nullsoft’s Web site. This move was not unexpected: the program had been described as an alternative service for transferring copyrighted material and other data across the Internet, which raised too many legal concerns for America Online, and the developers were ordered to stop work on the project. The closure was not quick enough, however, to prevent the source code from ending up in the hands of thousands of users. Some of the more enterprising computer experts immediately began reverse-engineering the protocol to design their own clients (known as servents) that could be used on the network.

5.1 LimeWire

LimeWire is a free, open-source Gnutella client located at http://www.limewire.com. It is programmed in Java, which allows it to run on any operating system with a Java engine (e.g., Microsoft Windows, Linux, and Mac OS X), and it is the most popular Gnutella client currently in use (see Figure 4). In late 2005, under pressure from the U.S. legal system and the Recording Industry Association of America (RIAA), LimeWire began altering its client to reduce the sharing of copyrighted material on the Gnutella network. This announcement caused the LimeWire source code to be forked, meaning that developers used the existing code to build a spin-off of the original application.

Figure 4. The LimeWire Client

5.2  BearShare and Morpheus

BearShare is another popular Gnutella client that was developed by Free Peers, Inc. and released as a closed-source application. It is currently provided in different variations, from an ad-supported free version to an ad-free commercial version. BearShare can be downloaded from its Web site at http://www.bearshare.com. Morpheus, on the other hand, is a P2P client that has traveled a variety of P2P networks. Though originally designed for the OpenNAP network, it moved to the FastTrack protocol after OpenNAP’s RIAA-induced shutdown. Morpheus’ stay with FastTrack was short-lived, though, as it was barred from the network in February 2002. It then recreated itself as a Gnutella client, using existing code made available by Gnucleus.

5.3 Gnutella Architecture: Strengths and Weaknesses

The Gnutella network is based on a decentralized architecture, making its topology a pure P2P network that involves no central servers for authentication, indexing, or assisting in file searches. This architecture eliminates the problems associated with a centralized server, including traceability and limited scalability, thus allowing the network to grow as users connect to it. When a Gnutella servent joins the network, it must know which systems to connect to in order to begin sharing files. To that end, servents first connect to several Internet Protocol (IP) addresses that were installed with the client software; those hosts know of other servents on the network and pass their addresses on to the newly connected servent.
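
The sketch below illustrates that bootstrap step under simple assumptions: a new servent works through a short list of addresses shipped with the client, and every host that answers contributes more peer addresses. The seed hosts and the handshake stub are placeholders, not real systems.

# A sketch of serverless bootstrapping: try seed addresses shipped with
# the client; each responding host hands back further peer addresses.

import random

SEED_HOSTS = ["198.51.100.7:6346", "203.0.113.9:6346"]  # shipped with client

def handshake(host):
    # Stand-in for the real "GNUTELLA CONNECT/0.6" exchange: pretend the
    # host answers with a few peer addresses it knows about.
    return [f"192.0.2.{random.randint(1, 254)}:6346" for _ in range(3)]

def bootstrap(known_hosts, want=4):
    connected, candidates = [], list(known_hosts)
    while candidates and len(connected) < want:
        host = candidates.pop(0)
        peers = handshake(host)          # addresses of other servents
        if peers is not None:
            connected.append(host)
            candidates.extend(p for p in peers if p not in candidates)
    return connected

print(bootstrap(SEED_HOSTS))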

Due to some of the many features and design implementations of the Gnutella network, it is open to a wide variety of security risks. Many of the issues that arise from using the Gnutella network are due to the nature of the content that is traded within it. The main issues covered here are the type of data that can be downloaded from the Gnutella network and the type of data that can end up being shared across it.

Furthermore, with the Gnutella architecture it is more difficult to catch users, who operate in a file-sharing system that has no central directory. Instead of keeping a central directory on the Web, Gnutella establishes direct links between its users. Gnutella supports the sharing of all kinds of files, including movies, and its growing popularity threatens the music and movie industries. Because there is no central directory, many Gnutella users assume they are anonymous and therefore safe from detection when they share illegal copies of movies. Behind the scenes, however, the Gnutella user’s IP address is associated with each file transfer. In a criminal proceeding, courts can order an ISP to identify the name of the user associated with an IP address.

Citing the Digital Millennium Copyright Act, the Motion Picture Association of America (MPAA) has begun sending legal notices to the ISPs of Gnutella users sharing pirated movies (Gomes, 2001). The MPAA wants to identify the perpetrators and stop this peer-to-peer distribution of illegal copies.

Besides the tightening copyright rules, Gnutella users can face other problems. Improperly configuring the file-sharing program puts the system at risk by sharing confidential or sensitive files, or possibly an entire hard drive. Another problem is that P2P applications such as Gnutella can let viruses and trojan software into a network. While blocking and monitoring Gnutella traffic can be difficult, a few basic firewall and IDS rules can be implemented to do so; these make downloads fail or take far longer.
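
As a toy illustration of such a rule, the sketch below flags traffic aimed at the commonly cited default Gnutella ports or beginning with the Gnutella handshake string. A real deployment would express this as a firewall policy or IDS signature; only the matching logic is shown here.

# Toy IDS-style check for Gnutella traffic: match the default ports or
# the plaintext handshake that opens a Gnutella connection.

GNUTELLA_PORTS = {6346, 6347}            # commonly cited default ports
HANDSHAKE = b"GNUTELLA CONNECT/"         # start of the v0.4/0.6 handshake

def looks_like_gnutella(dst_port, payload):
    return dst_port in GNUTELLA_PORTS or payload.startswith(HANDSHAKE)

print(looks_like_gnutella(6346, b""))                          # True
print(looks_like_gnutella(8080, b"GNUTELLA CONNECT/0.6\r\n"))  # True
print(looks_like_gnutella(80, b"GET / HTTP/1.1\r\n"))          # False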

6. Research Analysis

            Though P2P computing is not completely revolutionary, technological advances have enabled it to progress to where it is today and will propel it further into the future. However, several issues loom: bandwidth availability, hybrid centralized/decentralized models, network externalities, scalability, and security. These issues should not be ignored simply because the sheer growth in the number of participants has enhanced the P2P offering. If organizations want to realize P2P’s benefits to the fullest, these issues must be examined with scrutiny.

6.1 Bandwidth Availability

Widespread use of broadband enables P2P computing because it facilitates communication between the nodes at the edge of the network. “Last mile” bottlenecks limit the speed of data transfer on the network. This becomes even more critical in P2P networking, since a majority of the communication occurs over the last mile. The rollout of broadband makes P2P computing much more powerful. Although the increase of broadband has not lived up to expectations, its steady growth improves the prospects for the success of P2P computing.

P2P applications are changing the assumption that end users only want to download from the Internet and never upload. The Web was the killer application of the Internet, and it is made up mostly of clients, not servers. New P2P players face great challenges sparked by the rise of asymmetrical network connections such as cable modems and Asymmetric Digital Subscriber Line (ADSL). Until this asymmetry is resolved, digital subscriber line (DSL) providers offering relatively symmetrical technologies have an advantage in the race to enable P2P networking.
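
A back-of-the-envelope calculation shows why the asymmetry matters; the line rates below are illustrative assumptions, not measurements.

# Why asymmetric lines hurt P2P: a peer serving a file uploads far more
# slowly than its counterpart downloads.

FILE_MB = 5                    # roughly a typical mp3
down_mbps, up_mbps = 8.0, 1.0  # assumed asymmetric ADSL rates

download_s = FILE_MB * 8 / down_mbps
upload_s = FILE_MB * 8 / up_mbps
print(f"download: {download_s:.0f} s, upload: {upload_s:.0f} s")
# download: 5 s, upload: 40 s -- the upstream "last mile" is the bottleneck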

6.2 Hybrid Centralized/Decentralized Models

Napster struck oil by integrating the benefits of the direct P2P connection with a centralized directory search to efficiently pair up the seeker and provider of content. P2P success stories will properly balance this blend of centralization and autonomy to facilitate people’s tasks more efficiently or to allow them to perform new functions that do not exist today. As noted, the goal is to be just decentralized enough.
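
The sketch below illustrates that hybrid in miniature, with illustrative names: a central index answers the “who has this file?” question, while the transfer itself is a direct peer-to-peer connection, as in Napster’s design.

# Napster-style hybrid: centralized lookup, decentralized transfer.

class CentralIndex:
    def __init__(self):
        self.index = {}                    # filename -> set of peer addresses

    def register(self, peer, files):
        for f in files:
            self.index.setdefault(f, set()).add(peer)

    def lookup(self, filename):
        return self.index.get(filename, set())

index = CentralIndex()
index.register("10.0.0.5:6699", ["song.mp3"])
index.register("10.0.0.9:6699", ["song.mp3", "other.mp3"])

providers = index.lookup("song.mp3")       # centralized search...
print(f"download song.mp3 directly from one of {providers}")  # ...direct transfer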

6.3 Network Externalities

As the number of participants grows, the available shared resources (content, storage, or processing) increase as well. The increase in available resources then attracts new participants, and the cycle continues. Napster managed to build a database of more than 50 million addresses, and virtually any song from any genre was available because of the enormous number of participants. For example, Napster changed the economics of the music industry because the marginal cost of downloading a song is virtually zero. By creating one more copy of a song in the network, a user increased the chances that the next user who searched for that song would find it. Value was thus created in the network almost by accident. Designing systems that create value automatically will be key to succeeding in the P2P space.
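
A toy probability model makes the externality concrete (the reachability figure is an assumption, not data): if each copy of a song is online and reachable with probability p when a search runs, every additional copy raises the chance that the next search succeeds.

# Each extra copy of a file raises the probability a search finds it.

def p_found(copies, p_reachable=0.3):
    # probability that at least one copy is online and reachable
    return 1 - (1 - p_reachable) ** copies

for n in (1, 2, 5, 10):
    print(n, f"{p_found(n):.2f}")
# 1 0.30 / 2 0.51 / 5 0.83 / 10 0.97 -- each copy adds value for everyone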

6.4 Scalability

By using the resources of the end user, peer networks benefit from the scalability inherent in their design. In the corporate environment, instead of fighting through bureaucratic channels to obtain additional storage and processing capacity for the servers as the user base grows, each new user brings the incremental resources that grow the network.

6.5 Security

Security and authentication are two crucial factors that will determine the success or failure of P2P networks. It is predictable that P2P networks will become the killer application for virtual private networks, and it is hard to imagine those networks without state-of-the-art security systems. Some players in the P2P space have already created maverick forms of security, including one passphrase per account, one asymmetric key pair per identity for signing and verification, and another asymmetric key pair for encrypting and decrypting symmetric keys. There is little doubt that the latest technological advances in security will be key for these applications.
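
The sketch below illustrates that two-key-pair arrangement with Python’s cryptography package: one RSA pair per identity signs and verifies, and a second pair wraps the symmetric key that actually encrypts the payload. It is a minimal sketch of the scheme described above, not any vendor’s implementation.

# One key pair for signature/verification, another for wrapping the
# symmetric key, as in the per-identity schemes described above.

import os
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes

sign_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
wrap_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

message = b"shared file block"
signature = sign_key.sign(
    message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)
sign_key.public_key().verify(  # raises InvalidSignature if tampered
    signature, message,
    padding.PSS(mgf=padding.MGF1(hashes.SHA256()),
                salt_length=padding.PSS.MAX_LENGTH),
    hashes.SHA256(),
)

symmetric_key = os.urandom(32)  # e.g., an AES-256 key for the payload
wrapped = wrap_key.public_key().encrypt(
    symmetric_key,
    padding.OAEP(mgf=padding.MGF1(hashes.SHA256()),
                 algorithm=hashes.SHA256(), label=None),
)
assert wrap_key.decrypt(wrapped, padding.OAEP(
    mgf=padding.MGF1(hashes.SHA256()),
    algorithm=hashes.SHA256(), label=None)) == symmetric_key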

7. Conclusion and Recommendations

As peer-to-peer (P2P) technology allows direct communication for sharing files and for collaboration, its potential is enormous. Although Napster gets a lot of publicity for its support of music and game sharing among millions of its members, the same technology is used in both B2B and intrabusiness settings. P2P’s importance is magnified because knowledge management and e-learning use the same ‘coin of the realm’: knowledge. Knowledge is one of the most important assets in any organization, and so it is important to capture and apply it.

Whereas e-learning uses that ‘coin’ for the sake of individual learning, knowledge management uses it to improve the functioning of an organization. This is one of the major purposes of knowledge management. In our view, knowledge management (KM) refers to the process of capturing or creating knowledge, storing and protecting it, updating it constantly, and using it whenever necessary (Turban & King, 2003). Knowledge is collected from both external and internal sources. Then it is examined, interpreted, refined, and stored in what is called a knowledge base, the repository for the enterprise’s knowledge.

P2P enhances the knowledge part and can help the organisation achieve its mission and vision by making it work efficiently and effectively. By contrast, the traditional client/server network requires and allows only minimal interaction between clients. Figure 5 illustrates this lack of communication between peers: the focus is on the interaction between the client and the server, and information has to be transferred manually between the clients, with no direct means of communication.

Figure 5. The Functionalities of P2P-Enabled Client Interactions (Source: Gulati et al., 2003)

P2P applications will allow clients to interact in ways that do not erect barriers between them. Clients will be able to peer through the firewall and share information and resources. These applications will bring together in a unique way the functional capabilities already offered in the market.

7.1 File Sharing

File sharing, the current killer application for P2P computing, has certain key value drivers. Its success depends on both distributed content storage and dynamic search capabilities. Initially, the information must be indexed in a way that makes it accessible and understandable; then a powerful search capability is required to find the data. This can be implemented through a system in which the index either resides on central servers or is created dynamically by each user who logs on. Distributing content storage prevents overburdening any individual node and optimizes storage usage.
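
A minimal sketch of the dynamically built index described above, under the assumption that each user contributes keyword entries on login (all names are illustrative): a search becomes a lookup instead of a network-wide scan.

# A dynamic keyword index: each login adds entries; search is a lookup.

from collections import defaultdict

index = defaultdict(set)          # keyword -> peers holding a matching file

def on_login(peer, filenames):
    for name in filenames:
        for word in name.lower().replace(".", " ").split():
            index[word].add(peer)

def search(keyword):
    return index.get(keyword.lower(), set())

on_login("peerA", ["Blue Monday.mp3"])
on_login("peerB", ["monday lecture.avi"])
print(search("monday"))           # {'peerA', 'peerB'}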

7.2 Instant Messaging

Instant messaging is another popular function in P2P computing. The critical value driver in instant messaging is direct communication and information distribution. To find another user on the network, the instant messaging system must allow users to reach intermittently connected computers whose IP addresses change. This capability allows the program to connect any two or more users and lets them communicate directly.
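
A minimal sketch of that rendezvous step, assuming a simple presence registry (all names are illustrative): a stable user name is mapped to the address of the moment, after which peers talk directly.

# Presence registry: map a stable user name to the current address,
# refreshed whenever the user reconnects with a new IP.

presence = {}                          # user name -> current (ip, port)

def sign_on(user, ip, port):
    presence[user] = (ip, port)        # refreshed on every (re)connection

def locate(user):
    return presence.get(user)          # None if the user is offline

sign_on("alice", "203.0.113.4", 5190)
sign_on("alice", "198.51.100.8", 5190) # new IP after reconnecting
print(locate("alice"))                 # peers now talk to this address directly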

7.3 Distributed Computing

The key value driver of distributed computing is parallel processing. The ability to take a large task, break it into manageable pieces, and send them out to individual machines is at the heart of distributed computing. Direct communication further enhances this proposition by enabling direct connectivity with the nodes.
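
As a toy illustration of the pattern, the sketch below splits a large summation into independent pieces and farms them out to worker processes; a real distributed-computing network would ship the pieces to volunteer machines instead.

# Divide-and-distribute: split a task into chunks, process them in
# parallel, and combine the partial results.

from concurrent.futures import ProcessPoolExecutor

def work(chunk):
    lo, hi = chunk
    return sum(range(lo, hi))          # stand-in for a heavy computation

def split(n, pieces):
    step = n // pieces
    return [(i * step, (i + 1) * step if i < pieces - 1 else n)
            for i in range(pieces)]

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        partials = pool.map(work, split(10_000_000, 8))
    print(sum(partials))               # equals sum(range(10_000_000))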

7.4 Role of P2P in Organisations

Rather than users simply receiving information from a network, there will be true two-way communication, and the computers at the edge of the network will play a critical role. This will require more computing resources in organisations’ networked computers. The resources will be used for many applications, including collaboration, knowledge management, games, and enterprise processing. As P2P computing continues to develop, we will see further integration of the different functions into more sophisticated applications. This includes the integration of instant messaging and file sharing into collaboration applications, which is already occurring. This trend will continue as more of the functions come together to create killer applications.

The ability to build a network around individual PCs dramatically changes the network architecture. Instead of relying on central server farms that are vulnerable to outages, the network gains stability by being spread throughout the system.

For instance, two approaches potentially offer viable business models in the knowledge-management area: software licensing and solutions providing. The first, licensing, is straightforward in that a vendor provides a software program allowing a company to index, store, and retrieve information. This is the first area of focus for companies interested in the market. As additional competitors enter the market, however, this space will become commoditized. Those vendors who understand the functions of the company will be able to add more value as solutions providers and generate higher revenues. In addition, a vendor might be able to use a transaction model, charging on a per-megabyte-of-data basis, to generate revenue.

Also, the aggregators of computing power who seek to serve the enterprise market must focus on securing a transaction-based revenue stream or becoming a solutions provider. The transaction-based system is the first step and is attractive for customers whose tasks are not critical. These customers would be willing to let their processing be done outside their firewalls and pay a per-usage charge. However, the majority of major corporations would like to maintain control over their processing. In these cases, P2P computing vendors will need to propose an enticing value proposition that will allow them to get their foot in the door. That entails becoming a solutions provider and understanding the inner workings of a customer’s business and the intricacies of P2P computing.

In addition, the incremental investment to expand a network is dramatically reduced. Rather than buying a new server to expand the network, firms can simply add another PC to the system. This means that building relationships will drive the future development of P2P computing; the technology will be an enabler. To build a P2P network, traditional walls of separation must be torn down. This applies in both the corporate setting and the home-user environment.

Even within corporate divisions, the traditional mode of operation is to divide the business into departments and maintain autonomy and information within those silos. The influence of P2P computing will help tear down these divisions. The information that is available in one part of the corporation will be readily accessible to all. That presents some management challenges in implementing P2P systems. Managers unfamiliar with the technology behind peer-to-peer systems may be reluctant to embrace what may appear to be radical changes. Although adoption may initially be difficult, tremendous returns await companies taking advantage of the benefits of P2P computing.

Despite its advantages, P2P networking has obvious weaknesses. As seen in the Gnutella case study, P2P networks that use the Gnutella architecture carry enormous risk when used in an office, because they can accidentally share vital or confidential records. It is recommended not to run them in corporate networks at all; if they must be run, they should be kept outside the firewall, on a test system, or on a machine set up only for P2P. In reality, a freeware file-sharing program should not be used inside a company’s network unless there are genuine business reasons. If a file-sharing program must be used, it is advisable to go through the default configuration settings and tighten the security to match one’s needs.

In this day and age of globalisation and information, P2P puts organisations at the forefront of their business. P2P networking will definitely change the way we communicate and share information. Whether it becomes a revolution, as Patrick Gelsinger of Intel predicted, or simply merges the centralized model of today with the decentralized model of yesterday, we all stand to gain from the functionality it brings. Like all technological tools, it should be used fairly, legally, and ethically, so that we do not trample on the rights of others.

8. Bibliography

Adar, E. and Huberman, B.A. 2000. “Free Riding on Gnutella.” First Monday 5(10).

Alexander, P.J. 2002. “Peer-to-Peer File Sharing: The Case of the Music Recording Industry.” Review of Industrial Organization 20(2): 151.

Associated Press. 2004. Illegal Music Downloading Climbs. The New York Times, (January 15).

Barkai, D. 2000. An Introduction to Peer-to-Peer Computing. Acquired online 10 September 2006 at http://www.intel.com/technology/magazine/systems/it02012.pdf

BBC. “Music Swapping on the Net Rising.” October 9, 2002. Retrieved 25 August 2006 at http://news.bbc.co.uk/1/hi/technology/2309671.stm.

Bleicher, P. 2001. “The Next New Thing.” Applied Clinical Trials, 10(3, March).

Bregni, S. 2006. “Enhancing Language & Culture Learning through P2P (Peer-to-Peer Networks).” Academic Exchange Quarterly 10(1): 33-39.

Elkin, N. 2002. “The Future of the P2P Transaction Market.” Entrepreneur.com, (January 2).

France, M. and Grover, R. 2003. “Striking Back.” Business Week, (September 29).

Gallaway, T. and Kinnear, D. 2001. “Unchained Melody: A Price Discrimination-Based Policy Proposal for Addressing the Mp3 Revolution.” Journal of Economic Issues, 35(2): 279-287.

Godwin-Jones, R. 2005. “Emerging Technologies: Messaging, Gaming, Peer-to-Peer Sharing: Language Learning Strategies & Tools for the Millennial Generation.” Language Learning & Technology 9(1): 17.

Goldstein, P. 2003. Copyright’s Highway. Stanford, Calif.: Stanford University Press.

Gomes, L. 2001. “Entertainment Firms Target Gnutella”. Wall Street Journal, (May 4) p. B6.

Green, H. 2003. “The Underground Internet.” Business Week, (September 15): 80-82.

Gulati, R., Sawhney, M., & Paoni, A. (Eds.). 2003. Kellogg on Technology & Innovation. Hoboken, NJ: Wiley.

Hute, S.E., Burrus, A. and Hof, R.D. 2001. “In Search of the Net’s Next Big Thing,” Business Week, (March 26): 141.

Jackson, P., Harris, L., & Eckersley, P. M. (Eds.). 2003. E-Business Fundamentals: Managing Organisations in the Electronic Age. New York: Routledge.

Kurose, J. F., & Ross, K. W. 2003. Computer Networking: A Top-Down Approach Featuring the Internet. Boston: Addison-Wesley.

Leuf, B. 2002. Peer to Peer: Collaboration and Sharing over the Internet. Boston: Addison-Wesley.

Mann, C.C. 2003. “The Year the Music Dies.” Wired Magazine, (February).

North, D.C. 1992. “Institutions and Economic Theory.” American Economist 36(1): 3-7.

O’Brien, J.A. 2003. “Chapter 4: Telecommunications and Networks”. Management Information Systems: Managing Information Technology in the Business Enterprise, 6th ed. New Jersey: The McGraw-Hill Companies.

Oberholzer, F. and Strumph, K. 2004. “The Effect of File Sharing on Record Sales: An Empirical Analysis.” Working paper.

Backx, P., Wauters, T., Dhoedt, B., & Demeester, P. 2002. A Comparison of Peer-to-Peer Architectures. Ghent, Belgium: Broadband Communication Networks Group (IBCN), Department of Information Technology (INTEC), Ghent University.

Piccard, Paul. 2005. Securing IM and P2P Applications for the Enterprise. Rockland, MA: Syngress Publishing.

Pride, W.M. and Ferrell, O.C. 2000. Marketing: Concepts and Strategies. Boston: Houghton Mifflin.

Recording Industry Association of America (RIAA). 2003. “Music Industry Unveils New Business Strategies and Combats Piracy during 2002.”

Roehl, R. and Varian, H. 2000. Circulating Libraries and Video Rental Stores.

Ropelato, J. 2006. P2P Networking – Kids Know! Do Mom and Dad? Retrieved 25 August 2006 at http://internet-filter-review.toptenreviews.com/peer-to-peer-file-sharing.html

Segkar, A. 2003. “The Napster Decision and Beyond – a Look at Music Copyright Issues in the Internet Age,” Journal of the Intellectual Property Society of Australia and New Zealand, (June): 28.

Shirky, C. 2001. Listening to Napster. In A. Oram (Ed.), Peer-to-Peer: Harnessing the Power of Disruptive Technologies. Sebastopol, CA: O’Reilly.

Staiman, A. 1997. “Shielding Internet Users from Undesirable Content: The Advantages of a PICS Based Rating System.” Fordham International Law Journal 866.

Tidd, J., Bessant, J. and Pavitt, K. 2005. Chapter 1: Key Issues in Innovation Management. Managing Innovation: Integrating Technological, Market, and Organizational Change, 3rd ed. NJ: John Wiley & Sons, Ltd

Turban, E. and King, D. 2003. Chapter 7: Intrabusiness, E-Government, C2C, E-Learning, and More, Introduction to E-Commerce, 1st ed. NY: Pearson Education, Inc.

Yang, B. H., & Garcia-Molina, H. 2002, February. Designing a Super-Peer Network. Stanford, CA: Stanford University.

Yu, P.K. 2003. “The Escalating Copyright Wars.” Lecture in the Frontiers in Information and Communications Policy 2003 Lectures Series sponsored by the James H. and Mary B. Quello Center for Telecommunication Management and Law at Michigan State University.
