I am a college student, and I have started working in the area of grid computing. But now I am confused about the definition of grid vs. distributed computing. Can you please resolve my confusion and provide a clear-cut difference between grid and distributed computing?

Grid computing is the latest name for the hoped-for universal distributed computing facility. The promise of ubiquitous, cheap, and almost infinitely scalable computing is alluring, and many descriptions paint a future in which grid computing gives every user and every application access to "supercomputing on demand".

In the early 1990s, the Beowulf project (http://www.beowulf.org) demonstrated a practical application of large numbers of PCs to solving grand-challenge supercomputing problems. However, the early successes required applications that were highly scalable and custom-tailored to the environment. In the years since, many researchers and vendors have searched for ways to apply ever larger collections of computers to a wider variety of computing problems.

Today, numerous technical forums, conferences, and Web sites are buzzing with highly detailed debates about how to achieve these goals and how to solve some of the sticky problems that stand in the way. For example, even if one has access to large pools of shared computing resources, questions of security, priority, licensing, authorization, and billing must be answered. Further, at present, the sharing model requires either the applications themselves, or at least a shared infrastructure of middleware, to fully comprehend the distributed computing environment. Issues of reliability, data integrity, and time-determinism complicate the situation further.

In short, grid computing is a desirable vision, but its implementation is still in its infancy.