What is middleware?
The term middleware depends on one's point of view. It is used to describe a broad array of tools and data that help applications use networked resources and services. Some tools, such as authentication and directories, appear in every categorization. Other services, such as coscheduling of networked resources, secure multicast, object brokering, and messaging, are the main middleware interests of particular communities, such as scientific researchers and business systems vendors. One definition that reflects this breadth of meaning is: "Middleware is the intersection of the stuff that network engineers don't want to do with the stuff that applications developers don't want to do."
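The layering this definition describes can be sketched in code: a service such as authentication sits between the network and the application, so the application developer does not have to write it. The following is a minimal illustrative sketch of that middleware pattern; the names (require_auth, app_handler, the token set) are assumptions for the example, not from the source.

```python
# Illustrative sketch of the middleware pattern: an authentication
# check layered between the network and the application.
# All names here are assumptions made for the example.

def app_handler(request):
    """The application itself: returns a greeting for the user."""
    return f"Hello, {request['user']}"

def require_auth(handler, valid_tokens):
    """Middleware: wraps a handler with a token check that the
    application developer does not have to implement."""
    def wrapped(request):
        if request.get("token") not in valid_tokens:
            return "401 Unauthorized"
        return handler(request)
    return wrapped

# The application is composed with the middleware, not modified by it.
secured = require_auth(app_handler, valid_tokens={"s3cret"})

print(secured({"user": "alice", "token": "s3cret"}))
print(secured({"user": "mallory", "token": "bad"}))
```

The design point is composition: the same require_auth wrapper could be layered around any handler, which is why such services are worth providing once rather than rebuilding in every application.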
Why is middleware important?
Middleware has emerged as a critical second level of an enterprise IT infrastructure, sitting on top of the network level. The need for middleware stems from growth in the number of applications, in the customizations within those applications, and in the number of locations in our environments; these and other factors now require that a set of core data and services be moved from their multiple instances into a centralized institutional offering. This central provision of service eases application development, increases robustness, assists data management, and provides overall operating efficiencies.
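The move from multiple instances to a centralized offering can be sketched as follows: rather than each application keeping its own copy of core data such as a user directory, every application consults one institutional service. This is an illustrative sketch under assumed names (CentralDirectory, mail_app, portal_app), not a description of any particular campus system.

```python
# Illustrative sketch: a core service (a user directory) moved from
# per-application copies into one central institutional offering.
# All names and data here are assumptions made for the example.

class CentralDirectory:
    """One shared directory instead of a copy inside each application."""

    def __init__(self):
        self._people = {}

    def register(self, uid, email):
        self._people[uid] = {"email": email}

    def lookup(self, uid):
        return self._people.get(uid)

# Two different applications consult the same central service,
# so a change made once is visible everywhere.
directory = CentralDirectory()
directory.register("jdoe", "jdoe@example.edu")

def mail_app(uid):
    """A mail application resolves addresses via the directory."""
    person = directory.lookup(uid)
    return person["email"] if person else None

def portal_app(uid):
    """A portal application checks existence via the same directory."""
    return directory.lookup(uid) is not None

print(mail_app("jdoe"))
print(portal_app("jdoe"))
```

Because both applications read from one source, an update to a person's record is made once, which is the robustness and data-management gain the paragraph above describes.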
Okay, so it is important. Lots of things are these days. Why is it urgent? There are several drivers bringing middleware to campus: advanced scientific computing environments such as PACI are placing requirements on campus researchers for middleware services such as authentication and directories; library projects such as the UCOP/Columbia certificate project will extend across a broader higher-education community; the Federal government is preparing requirements for digital signatures on student loan forms; and new versions of software, such as Windows 2000, ship with the tools to build ad hoc middleware components.