Where is processing done in P2P applications?
Although the early file-sharing programs are no longer in play, P2P file sharing is alive and well: think BitTorrent and the like. Even instant messaging (IM) clients can serve this function, since the majority of them support sharing files in addition to chatting.
While there are numerous legitimate uses for P2P networking, the file sharing aspect raises both intellectual property and cybersecurity concerns. Any time people are sharing music, movies, software, or any other proprietary content, questions of intellectual property and copyright laws surface. In fact, some internet service providers have attempted to ban torrents and other P2P applications, despite the valid and perfectly legal functions P2P can serve.
P2P networks are also highly vulnerable to denial-of-service attacks, since each device helps route traffic through the network.
Turning to deployment models, we compare the levels at which an organization can deploy P2P applications as akin to the intranet, extranet, and Internet for information sharing. As an organization moves from one level to another, it expands the scope of the node pool, both in terms of diversity and geographical distribution. The scope expands from completely internal, to partners, to completely external, leading to an increase in the complexity of the issues discussed earlier. As the scope moves from being completely internal to completely external, the level of direct control that an organization can exert on the node pool diminishes, and the organization must resort to complex mechanisms to ensure performance. In discussing the models, we also briefly touch on the pricing or rewarding of resource use and on enabling technologies.
As discussed before and generally accepted in the industry, all knowledge-intensive and creative work is best done on desktop-class machines [9], with good processing and storage power and the ability to run an appropriate user interface. The consultancy Gartner also supports this assertion, stating that the desktop computer is not going to be replaced entirely [38]. It is anticipated that organizations will continue using desktop machines for the foreseeable future. These desktops and other machines can be centrally managed using technologies such as Microsoft Active Directory and Group Policies. This, along with system-level management utilities like Intel System Management, allows fine-grained, low-level control of the hardware and software on a machine. Routing, searching, and other resource coordination can be done much more efficiently, since the pool of computing devices is relatively static and deterministic. As such, an organization should be able to use the resources on employee end-user machines while retaining a high level of control, as the sketch below illustrates. This model would be ideal for most organizations deploying their P2P applications.
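To make this concrete, here is a minimal sketch, in Python, of how a central coordinator might hand work to a static pool of employee desktops. All class, host, and task names are hypothetical; a real deployment would sit on top of directory services and management tooling such as those mentioned above.

```python
# Minimal sketch of central task coordination over a static B2E node pool.
# Because the pool is known in advance, scheduling reduces to a simple scan
# rather than a distributed search. All names here are hypothetical.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class DesktopNode:
    hostname: str
    idle: bool = True                       # known centrally via management tooling
    completed: list = field(default_factory=list)

class B2ECoordinator:
    """Central controller over a static, deterministic node pool."""
    def __init__(self, nodes):
        self.pool = {n.hostname: n for n in nodes}

    def dispatch(self, task: str) -> Optional[str]:
        for node in self.pool.values():
            if node.idle:
                node.idle = False
                node.completed.append(task)  # stand-in for real execution
                return node.hostname
        return None                          # no idle desktop; queue the task

pool = [DesktopNode("desk-001"), DesktopNode("desk-002")]
coord = B2ECoordinator(pool)
print(coord.dispatch("render-frame-42"))     # -> desk-001
```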
Hence we can offer the following propositions.
Proposition 1: For most organizations, the way forward with P2P computing would be a B2E model using employee desktops. A distributed model spread across thousands of end-user machines is most likely infeasible for most practical business applications since, in a highly distributed model, sufficient control over performance, liability, and security cannot be implemented with reasonable overhead.
Proposition 2: For business applications, strong central control is desirable and recommended so that reasonable SLAs can be ensured.
Table 3. P2P business application models.
This is most likely achievable with the strong central control present in a B2E model. Strict central control ensures that issues of performance, end-user compensation, business liability, end-user liability, and security are in a deterministic state at all times. Totally or highly decentralized models are not desirable, though with a deterministic node population they can be implemented more effectively. The B2E model has been successfully implemented in several companies.
Many financial institutions use servers with spare capacity to execute computationally intensive tasks through the use of grids [16]. An instance of a serverless file system was implemented within Microsoft using desktops [39]. Medical image sharing has been implemented using a collaborative P2P and hybrid P2P architecture [40].
Another example is cloud-bursting: a need for additional resources arises when the cloud resources purchased to manage the IT infrastructure run out. An organization may add resources from a P2P grid to those it purchases from a commercial cloud provider because, although cloud resources are cheap, they are not actually free. P2P resources may be obtained, for example, from organizational employees without budgetary powers, or the organization may use cloud resources for required applications and P2P resources for discretionary applications without quality-of-service requirements [41]. A simple placement policy along these lines is sketched below.
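A minimal sketch of such a placement policy, assuming hypothetical job attributes and a simple budget check, might look as follows: QoS-bound work goes to the commercial cloud while discretionary work rides the nominally free P2P grid.

```python
# Hypothetical placement policy for the cloud-bursting scenario above.
# Required, QoS-bound jobs go to the commercial cloud; discretionary jobs
# without QoS requirements go to the P2P grid, which also absorbs overflow
# once the cloud budget is exhausted.
def place_job(requires_qos: bool, cloud_budget_left: float,
              est_cloud_cost: float) -> str:
    if requires_qos and est_cloud_cost <= cloud_budget_left:
        return "cloud"
    if requires_qos:
        return "queue"      # QoS job but no budget: wait rather than risk P2P
    return "p2p-grid"       # discretionary work uses the P2P grid

print(place_job(True,  10.0, 2.5))   # -> cloud
print(place_job(True,   1.0, 2.5))   # -> queue
print(place_job(False, 10.0, 2.5))   # -> p2p-grid
```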
The B2B P2P model can offer many of the same advantages as the B2E model, such as a predictable and static end-user computing node population, strong central control, service-level agreements, and security. However, the expansion of the span of control will lead to some of the issues discussed earlier related to the point of responsibility amongst the organizations. For instance, who is responsible and liable if data is compromised? Some of these issues may be easier to resolve, since it can be assumed that all organizations involved in the partnership will exercise due diligence in maintaining and securing their computing infrastructure, and contractual agreements can be signed between the parties. Here, utility pricing models [42] and emerging cloud computing pricing models may be used to compensate partners, along the lines of the sketch below.
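As an illustration only, a utility-pricing settlement between partners could be as simple as metering contributed CPU-hours against an agreed rate; the rate and partner names below are invented for the sketch.

```python
# Toy utility-pricing meter for compensating B2B partners, in the spirit of
# the pricing models cited in the text. Rates and names are illustrative.
from collections import defaultdict

RATE_PER_CPU_HOUR = 0.04            # hypothetical $/CPU-hour

usage = defaultdict(float)          # partner -> CPU-hours contributed

def record(partner: str, cpu_hours: float) -> None:
    usage[partner] += cpu_hours

def settle() -> dict:
    """Periodic settlement: each partner is credited for resources it lent."""
    return {p: round(h * RATE_PER_CPU_HOUR, 2) for p, h in usage.items()}

record("partner-a", 120.0)
record("partner-b", 45.5)
print(settle())                     # {'partner-a': 4.8, 'partner-b': 1.82}
```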
There are some examples of B2B P2P models that can be used for content distribution and distributed processing. Deloitte UK aggregates massive amounts of diverse regulatory information, corporate policies, and best practices, some of which is generated internally and some of which comes from outside vendors like B2B provider ABG Professional Information. It would be virtually impossible to maintain up-to-date versions of all of this material on centralized servers.
The data is maintained on, and resides on, servers at different offices and even different companies, but to an auditor at Deloitte the information is all available from a single web-page interface and looks as if it all sits in one place [43]. Another application involves sharing data on proteins [44]. This platform uses fully distributed P2P technologies to share specifications of peer-interaction protocols and service components that are no longer centralized in a few repositories but are gathered from experiments in peer proteomics laboratories.
Distributed or grid computing can be done using software like Legion (whose development has since stopped) and Global ROME [32]. In ROME, the size of the network can be controlled: through a number of defined actions, extra nodes can be recruited into the network structure to deal with overload, and unnecessary nodes can be removed to deal with underload, thus optimizing the size, and therefore the lookup cost, of the network. Nodes that are not currently members of the structure are held in a node pool on a machine designated as the bootstrap server, as in the sketch below. Since node utilization is monitored, cost or revenue metrics may be used to compensate partners in the B2B P2P network.
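The recruit-and-release cycle just described can be sketched as follows; this is not ROME's actual API, merely a hypothetical illustration of resizing an overlay against load thresholds using a bootstrap-server node pool.

```python
# Hypothetical sketch of ROME-style network resizing (not ROME's real API):
# a bootstrap server holds standby nodes; the overlay recruits on overload
# and releases on underload to keep network size, and lookup cost, optimal.
standby_pool = ["node-%02d" % i for i in range(10)]   # held by bootstrap server
active = ["node-a", "node-b"]

HIGH_WATER = 0.80   # recruit above this average utilization
LOW_WATER = 0.20    # release below it

def rebalance(avg_utilization: float) -> None:
    if avg_utilization > HIGH_WATER and standby_pool:
        active.append(standby_pool.pop())      # recruit an extra node
    elif avg_utilization < LOW_WATER and len(active) > 1:
        standby_pool.append(active.pop())      # shed an unnecessary node

rebalance(0.95)
print(active)        # a recruit has joined from the bootstrap pool
rebalance(0.05)
print(active)        # the network shrank again under light load
```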
For implementing applications at the B2C level, we stress that a specialized intermediary is needed who can take care of the technological, business, and other issues; keep costs down; and ensure P2P application affordability. This intermediary may control a pool of nodes for content distribution or distributed processing. Accounting models that can measure the overall contribution of a computing resource to a task can facilitate this. An example of a simplistic model might be a couple of cents for rendering one frame of an animation (sketched below). Grid computing applications for the B2C space may be developed using the Globus toolkit [20,21]. Globus complies with the Open Grid Services Architecture (OGSA) and provides grid security, remote job submission and control, data transfer, and other facilities. These may ensure some level of service, though not necessarily high performance, and address some liability issues, but Globus has no mechanism for compensating end-users.
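The per-frame accounting model mentioned above amounts to little more than piece-rate arithmetic, as this toy sketch (with invented peers and rate) shows.

```python
# Toy per-contribution accounting in the spirit of the "cents per frame"
# example in the text; the rate and peer names are illustrative only.
CENTS_PER_FRAME = 2

frames_rendered = {"peer-1": 340, "peer-2": 125}   # measured contributions

payouts = {peer: n * CENTS_PER_FRAME / 100.0       # dollars owed per peer
           for peer, n in frames_rendered.items()}
print(payouts)   # {'peer-1': 6.8, 'peer-2': 2.5}
```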
Content distribution, especially audio and video streaming applications, has been successfully implemented in the P2P B2C space [8,45]. An Internet-based storage application that compensates users for their participation, while implementing features like security, has also been proposed.
A P2P C2C application may either operate through an intermediary (e.g., Skyrider [47]), which is essentially the same as a B2C model, or operate in a totally decentralized fashion. The C2C model is perhaps the most distributed and decentralized in its scope. This is also where most current P2P activity is underway. However, as outlined before, the lack of central control and the inability to guarantee performance levels make this configuration inappropriate for business applications (Proposition 2). At the same time, owing to the lack of centralized control and their vastly distributed nature, such networks are well suited to preserving the privacy of users, and users are endowed with anonymity. For example, The Onion Router (TOR) is used by people all over the world to escape government censorship and to report on oppressive governments [48]. As tracking of activity and control of the Internet become more pervasive, C2C P2P will come to play a more important part on the Internet. It appears that anonymity and privacy based on decentralization have been a prime aim of application design in the C2C P2P realm [49], but that is not a prime consideration for a business P2P application.
Proposition 3: The C2C P2P model, highly decentralized and distributed, will form the backbone of most anonymity and privacy mechanisms on the Internet.
Given the structure of connectivity of the Internet and the ability to control traffic at various exchange and access points, a P2P C2C architecture may be the only feasible way to protect anonymity and privacy. An incentive to participate in a P2P network is important for its success [36]. Although users are not compensated for participating in decentralized C2C P2P networks, these networks are the most heavily used today. Users participate of their own accord, and most often the rewards are indirect. Participation in networks like TOR is for altruistic and humanitarian reasons. Participation in file sharing by end-users may be a form of rebellion against big corporations, or a monetary benefit obtained without any explicit compensation mechanism. End-users may also participate out of reciprocity, in a spirit of giving back when they have gained something. With all of these networks, including those like TOR, liability questions are complex and the subject of various lawsuits across the globe. On file-sharing networks, security has been a concern, with spyware and viruses spreading through innocuous-looking files. At the infrastructure level, an important set of applications in the P2P C2C realm is enabled by the standards for ad-hoc networks, especially wireless ad-hoc networks: mobile ad-hoc networks, wireless mesh networks, and wireless sensor networks are all important applications in this area [50]. These network standards, developed by the Internet Engineering Task Force (IETF), enable the formation of networks on the fly, without the need for central routing, with nodes entering and leaving the network at will.
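To illustrate the networks-on-the-fly idea (and only that; this is not an implementation of any IETF MANET standard), the following sketch lets nodes on a LAN announce themselves by UDP broadcast and learn of neighbours from the announcements they hear. The port and node name are arbitrary.

```python
# Hedged sketch of on-the-fly peer discovery: each node broadcasts a hello
# on the local network and builds a neighbour set from the hellos it hears,
# with no central router; nodes may appear and disappear at will.
import socket
import time

PORT = 50007                      # arbitrary port for this demo

def broadcast_hello(node_id: str) -> None:
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
    s.sendto(node_id.encode(), ("<broadcast>", PORT))
    s.close()

def listen_for_peers(timeout: float = 2.0) -> set:
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    s.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    s.bind(("", PORT))
    s.settimeout(timeout)
    peers = set()
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            data, addr = s.recvfrom(1024)
            peers.add((data.decode(), addr[0]))
        except socket.timeout:
            break
    s.close()
    return peers

broadcast_hello("node-42")
print(listen_for_peers())         # peers currently announcing on this LAN
```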
Ad-hoc network standards have been successfully used by the US Army on the battlefield. The ability of mobile devices to come together and form a functioning network, as these devices become more popular and powerful, creates possibilities for some interesting applications [51]. The growing popularity of mobile devices has been discussed earlier in this paper. Another example of the increasing importance of mobile devices is the recent discussion around the IPO of Facebook (the social networking site), where several experts pointed out that Facebook has a mobile problem, since it does not monetize mobile traffic [52]. Mobile devices in use today have limited capabilities due to various technology constraints (e.g., processing power, storage, and battery life). While more and more cases seem to be made for mobile-device-centered, cloud-backed centralized computing, thick computing still has its place. All serious work that requires any significant computing power is better done on thick clients like desktops and workstations. Within business enterprises and other organizations, the primary mode of work is still anticipated to be desktop machines and other thick clients for some years to come.
In addition, many external legal, political, and environmental constraints may still keep desktop computing in fashion. For instance, most cellular network providers in the USA have imposed caps and restrictions on the maximum amount of data a user is allowed within a given mobile service plan. The cost of using cellular data is significantly higher than it was a few years ago, and the speeds pale in comparison to wired networks. The implication may be that most users connect through Wi-Fi networks rather than cellular networks, so that the true mobility of a device is significantly hindered. This may also hinder the uptake of these cloud-backed devices as primary computing devices and relegate them to secondary or supplementary computing devices.
Hence we may state our fourth proposition as follows.
Proposition 4: Desktops and thick clients are likely to retain their status as the primary computing device for a majority of the population, especially businesses. Mobile devices are more likely to be used as supplementary devices by most knowledge workers.
This implies that P2P business applications will remain pertinent in the enterprise realm. Given the popularity of mobile devices, one may like to explore how P2P may be used with mobile devices and the kinds of applications that may be implemented. This discussion is based on the P2P features discussed earlier; hence we are not focused on applications that merely serve as clients to a P2P infrastructure, but on those that participate in that infrastructure in some server capacity.
For instance, an application like PeerBox [53], which allows connection to P2P networks for downloading but does not allow serving any files, may not qualify. There are several mobile P2P applications in existence; the mBit P2P application, for example, allows mobile phone users to share files, pictures, music, etc. The uptake of these applications, apart from Skype and certain messaging applications, may be questionable. Given the dependency of mobile devices on the cloud back end to handle heavy-duty processing and storage tasks, they may not be well suited to resource-intensive applications.
Some lightweight tasks that rely on the processing and storage that happen as a matter of routine on these mobile devices may be good candidates for P2P applications. Applications that take advantage of the ad-hoc networks these devices can form automatically when in the vicinity of each other are other natural candidates. It is, however, natural to assume that until there is a significant increase in the processing and storage capacity of these devices, they will continue to operate in a cloud-coupled mode. Security is also a concern on the mobile platform; for example, a new Android botnet has been used to send spam through the Yahoo email service [56]. Hence we can forward the following propositions.
Proposition 5: The current generation of mobile devices, which are not apt for P2P computing and mostly perform satisfactorily only in a cloud-coupled mode, will be ready for P2P applications in another three to four years, based on hardware and battery power trends.
Proposition 6: The economics of wireless network connectivity will be an important factor in determining the success of the P2P computing model on mobile platforms.
The current breadth and depth of research in P2P computing points to its potential as a viable and useful infrastructure for business applications.
It can be used for several useful applications like content distribution, load balancing, and grid computing. P2P is a natural evolution of decentralized computing and of the increase in the power of client machines. Though businesses have not fully utilized its potential, business P2P applications operating in a B2E mode should be easy for enterprises to implement and offer a viable path for uptake and forward movement in this area. These client machines will predominantly reside within enterprises, and hence the B2E and B2B P2P computing models will remain viable and useful even within the current computing shift to mobile devices at the consumer level. Moving forward, then, we can say that the P2P computing model does not lose its viability due to increased uptake of mobile devices by consumers, at least for most businesses and enterprises.
If we can learn something from history and plot a trend, we can safely state that the power of mobile computing devices will increase and, in a few years, match or exceed that of today's thick clients. Just as computing became more distributed and moved out of the confines of mainframes and powerful servers, the same trend may follow with mobile devices. Though network connectivity will be pervasive, computing may move back from the cloud to the mobile client devices. One reason for this may be the cellular data price structure and the wireless spectrum issues that are likely to restrict the replacement of wired connectivity by wireless connectivity. Another reason is likely to be that end-users demand and enjoy freedom, flexibility, and having their own span of control. While some control has been ceded by users to cloud-based services due to the limitations of mobile devices, users will be inclined to regain it as the capacity of these devices increases. At that stage, P2P computing on mobile devices will once again become feasible, and they can be incorporated into the P2P infrastructure. Anonymity and privacy can only be reasonably preserved through a C2C P2P architecture.
As the desire of governments and businesses to control the Internet increases, this architecture will become more and more popular for that purpose. Hence one can infer that P2P computing is an architecture that will stay relevant in both consumer and business spaces for the foreseeable future, even in this era of cloud-coupled mobile computing. It is therefore important for MIS academics to take a holistic and practical approach to P2P applications. Understanding what is feasible will allow us to channel our energies into the study of issues that bring both immediate and practical benefits to business organizations. Detailed study of issues related to applications running on end-user machines, to benefit organizations through better use of slack resources, should be undertaken. There are many areas for potential future research, the most important of which is a payment or compensation scheme using microtransactions that would allow for-profit businesses to make the transition to the B2C model.