I am a college student, and I have started working in the area of grid computing. But now I am confused about the definition of grid versus distributed computing. Can you please resolve my confusion and provide a clear-cut difference between grid and distributed computing?
Today, numerous technical forums, conferences, and Web sites are buzzing with highly detailed debates about how to achieve the goals of grid computing, and how to solve some of the sticky problems that must be overcome first. For example, even if one has access to large pools of shared computing resources, questions of security, priority, licensing, authorization, and billing must still be answered. Further, the sharing model currently requires either the applications themselves, or at least a shared infrastructure of middleware, to fully comprehend the distributed computing environment. Issues of reliability, data integrity, and time-determinism complicate the situation further.
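To make the middleware point concrete, here is a minimal sketch in Python of a toy scheduler that must weigh several of the concerns listed above (security, priority, and billing) before it can place a job on a shared resource. All names, attributes, and the placement policy are hypothetical illustrations, not any real grid API; production middleware (the Globus Toolkit, for example) addresses these concerns in far more depth.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    name: str
    free_cpus: int
    trusted: bool             # security/authorization: is this site in our trust domain?
    cost_per_cpu_hour: float  # billing: what the resource owner charges

@dataclass
class Job:
    owner: str
    cpus: int
    priority: int             # would order a queue of many jobs (unused in this one-job sketch)
    budget: float             # maximum spend the owner has authorized

def place(job, resources):
    """Pick the cheapest trusted resource that fits the job within its budget."""
    candidates = [
        r for r in resources
        if r.trusted
        and r.free_cpus >= job.cpus
        and r.cost_per_cpu_hour * job.cpus <= job.budget
    ]
    return min(candidates, key=lambda r: r.cost_per_cpu_hour, default=None)

pool = [
    Resource("campus-cluster", free_cpus=64,   trusted=True,  cost_per_cpu_hour=0.02),
    Resource("partner-site",   free_cpus=512,  trusted=True,  cost_per_cpu_hour=0.05),
    Resource("unknown-site",   free_cpus=1024, trusted=False, cost_per_cpu_hour=0.01),
]
job = Job(owner="student", cpus=32, priority=5, budget=1.00)
print(place(job, pool))  # campus-cluster: the cheapest *trusted* fit, even though unknown-site is cheaper
```

Even this toy version shows why the environment is hard to "fully comprehend": the scheduler cannot treat remote CPUs as interchangeable, because every placement decision is entangled with trust and cost policy.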
In short, Grid Computing is a desirable vision, but one whose implementation is still in its infancy.
This was first published in October 2003.