Without true finite capacity scheduling (FCS), no implementation for manufacturing execution, whether ERP, SCM, or MES, can realize the goal of enterprise efficiency and agility. Every aspect of OM for manufacturing execution follows the lead of FCS, which is the bridge between planning and execution. Real, tangible return on assets rests with FCS.
Integrating a diverse collection of resources to accomplish a goal is an issue that has faced humankind since the first city arose and food and services needed to be provided to the populace. The modern challenge for operations management (OM) is the speed and volume at which data are presented to OM systems. This explosion of data holds the promise of efficiency and agility unrealized in the past, but it forces analysts and engineers to convert the flood of data into a usable form to move from planning to action. Systems such as MES, SCM, and ERP are information-hungry beasts that must be fed the right information at the right time to direct enterprise resources. OM requires a well-coordinated dispatch of its resources to realize efficiency and agility. This paper addresses the need to look at OM from an information-centric perspective as a necessary complement to emerging process-centric views. The discussion then moves to the execution systems, also treated from an information-centric perspective, and concludes with why finite capacity scheduling (FCS) is the key to OM for manufacturing execution.
WHEN DATA BECOMES INFORMATION
Despite the advances in information technology, notably object-oriented software, systems continue to be defined by functional decomposition. Functional decomposition creates complex definitions with fragile coupling and cohesion, leaving a great chasm between those definitions and the methods actually used to build modern information systems. Information itself is an under-designed component of modern systems. Information is a series of objects made from atoms of data, and data becomes information only through context and the inferences derived from that context. A good example is the use of spreadsheets to attempt to understand data, rather than application software designed with the operational context in mind.
Figure 1: Hierarchy of Data Fusion Inferences
Figure 1 shows the hierarchy of inferences produced through a process called data fusion. Data fusion simulates the cognitive processes humans use to continuously integrate data from their senses and make inferences about the external world. Information systems collect data through sensors and other assets, and in the hierarchy of data processing, multiple data sources are combined to approximate or estimate the condition of some aspect of the enterprise operation. This is the first translation of data to a level of inference. Parametric data is then processed to begin specific identification of a situation. As more parametric data are collected, different aspects of the situation come together to allow a contextual analysis of an increasingly complex set of conditions. Once integrated, the situation can be compared to the goals or desired state of the system.
Parallel to the types of data processing are the types of inference. With raw data, an inference can be made about the general condition. While this level of inference rarely points to a specific corrective action, it does begin to isolate which subsystems require attention. The next level of inference reveals a specific characteristic behavior of the system. With more integrated data, the identity of an operational system or process is revealed. The next inference is the behavior of a process, which then leads to an assessment of a situation. At the highest levels of inference, performance is assessed to determine the deviation from the performance goals, acceptable risks, or desired state.
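The paper does not specify any particular fusion algorithm, but the progression it describes, from combining raw sensor readings into an estimate up to assessing deviation from a goal, can be illustrated with a minimal sketch. The inverse-variance weighting, sensor values, goal, and tolerance below are all illustrative assumptions, not content from the paper:

```python
# Toy sketch of the lowest and highest data-fusion inference levels (illustrative only).
# Two noisy readings of the same quantity are combined by inverse-variance weighting,
# then the fused estimate is compared against a performance goal.

def fuse(readings):
    """Combine (value, variance) pairs into a single estimate.

    Lower-variance (more reliable) sensors get proportionally more weight.
    """
    weights = [1.0 / var for _, var in readings]
    total = sum(weights)
    value = sum(w * v for w, (v, _) in zip(weights, readings)) / total
    return value, 1.0 / total  # fused value and its (reduced) variance

def assess(estimate, goal, tolerance):
    """Highest-level inference: deviation of the fused estimate from the desired state."""
    deviation = abs(estimate - goal)
    return "within goal" if deviation <= tolerance else "deviation detected"

# Hypothetical (value, variance) readings from two sensors of one process variable.
readings = [(101.2, 4.0), (99.1, 1.0)]
value, variance = fuse(readings)
status = assess(value, goal=100.0, tolerance=2.0)
```

Here the fused estimate is pulled toward the more reliable second sensor, and only then is the result judged against the goal, mirroring the separation between estimating a condition and assessing performance described above.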
Data fusion is not a new concept, having its origins in...