Formal modelling techniques that employ principles of finite state automata have been increasingly used for shop-floor control. These approaches have focused primarily on unmanned systems, where the self-contained control logic is embedded within the model(s). To date, human operators have seldom been represented at the level of modelling detail these systems require, so the integration of humans into automatic systems has been limited. In general, humans performing physical activities in an automatic system must mimic the detailed responses normally transmitted via control computers and equipment, making the human interaction cumbersome and prone to error. This paper discusses (
Keywords: Shop-floor control; Supervisory control; Human–computer interaction
One of the main objectives of automated manufacturing systems is to decrease the level of human interaction in order to maximize system performance, mitigate human error and alleviate safety issues. However, human interaction and intervention in automated systems cannot be completely eliminated, because of the complexities of developing computer control systems, the flexibility a human brings to real-time decision-making, and the physical limits imposed by machines. Human operators are therefore needed in automated systems both as process providers and as supervisors. The concept of supervisory control was developed for modelling the relationship between the human supervisor and the computer-controlled, automated subsystem (Sheridan [
In addition to decision-making, human operators augment many automated manufacturing systems by performing manual or semi-automated tasks. Complex material-handling operations as well as some inspection and packaging tasks are done by humans. Although such systems are not fully automated, there is no reason for them not to be computer-integrated. Partially automated manufacturing systems can also be seen in small- and medium-sized manufacturing companies that usually do not have the capital to invest in a fully automated system, or they can be present in fully automated manufacturing systems as a temporary replacement of automated equipment with manual equipment and human operators because of a breakdown or unavailability. Even in partially automated systems, computer integration can result in higher overall efficiency by facilitating a higher level of planning, control and monitoring.
Consequently, there is a need for integrating the human operator with the computer-automated controller system. In contrast to supervisory control, the computer controller directs the actions of the human operator in the system. The controller must provide unambiguous directions to the operator, guide him/her through the required steps of the tasks and ensure the tasks are executed to achieve the system's goals. In advanced implementations, the system must also be capable of error detection and recovery.
A recent modelling approach for controlling flexible manufacturing systems uses physical graphs (and the related finite state automata) of the system that depict product flow through processing and handling resources (Smith [
Researchers have also defined different architectures that are necessary to design, develop and implement shop floor control systems. The characteristics of several related architectures that are required for factory control such as factory, functional, control, information and data, communication, and information system were discussed by Hoberecht et al. ([
Smith and Joshi ([
The generic MPSGs were based on the equipment classifications provided by Smith et al. ([
Model complexity, a quantitative measure, is an important way of characterizing a methodology for modelling shop-floor control. The efficiency of model construction, a qualitative measure, is another key factor (Qiu [
Much of the on-going work in control has focused on automata-based methods for computer control of complex systems. These techniques have worked reasonably well for unmanned systems. Unfortunately, few fully unmanned systems exist in industry, and control systems incapable of interacting with human operators limit their application and flexibility. The focus of this paper is to investigate the inclusion of a human MH in an automatic control environment. Although there has been significant work in human supervisory control, little work has appeared on integrating a human into a physical environment controlled by a computer so that the capabilities of both the human handler and the computer controller are effectively used. This paper develops a collaborative (automatic equipment and human handlers) control environment that has the deterministic character of a finite state machine but also accommodates human flexibility within its boundaries. To this end, it defines the required controller extensions and a wrapper around the finite state controller that allows the human to interact with the control system as effortlessly and unobtrusively as possible.
From the human perspective, the problem can be described by a plot of human–computer control combinations at different levels of automation and task predictability (figure 1). Based on Sheridan ([
Graph: Figure 1. Combinations of human–computer control at different levels of task predictability (based on Sheridan [
Note that the control formalism presented here does not address the specific problem of creating a flexible control environment for humans. Rather, the focus is on the inclusion of humans in the control loop of a computer-integrated manufacturing system without creating an information overload on them. To this end, the following issues are addressed during the design and implementation of the control system: (
In order to begin the development of a collaborative control, the paper will first address the finite state character of the system. It has been shown that the size of a Petri net model in terms of the number of control states grows linearly with the number of system components, while the size of the reachability graph grows exponentially (Dotan and Ben-Arieh [
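The contrast between linear model growth and exponential reachability growth can be illustrated with a minimal sketch (not the authors' Petri net model, and the state names are illustrative): composing n independent two-state components yields 2^n joint states.

```python
from itertools import product

def composed_states(n_components, states_per_component=2):
    """Enumerate the joint states of n independent components.

    Each component (machine, buffer, handler) has a small local state
    set, e.g. {'idle', 'busy'}; in the worst case the reachable joint
    states of the composed system form the full cross product.
    """
    local = ['idle', 'busy'][:states_per_component]
    return list(product(local, repeat=n_components))

# The model (one automaton per component) grows linearly with n,
# but the joint state space grows exponentially: 2**n states.
for n in (2, 4, 6):
    assert len(composed_states(n)) == 2 ** n
```

This is why automata-based controllers operate on the component models rather than on the fully enumerated reachability graph.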
The addition of a human as an MT/MH provides different control states that are not possible with an automatic MH device. Human activities such as: (
Few manufacturing systems are completely unmanned. Although automation has been increasingly used in manufacturing systems, most such applications are limited to machine tending and transportation. Highly automated systems are typically associated with a low product mix and high volume with fixed process routings, such as in flow shops. As the product mix increases (such as in a job shop), the challenges of automated process control also increase. Many of these challenges arise from the scheduling requirements of the system; in addition, the flexibility of the system is frequently limited by the MH devices. For instance, an end effector may not be capable of moving a part because of size, geometric or weight limitations. Designing a new end effector to handle only a few parts cannot be economically justified, so the system should be able to support and track a human handler while he/she performs the operations.
The limitations of modelling and control software generation techniques with respect to the number of control states were discussed above. Current modelling techniques can be implemented in scenarios where the product mix and part routings are predetermined and fixed. The addition of a new product type, or of a new alternative routing for an existing product (resulting from a different equipment layout, which can alter the reachability graph, or from the addition/removal of equipment), cannot always be accommodated by the modelling techniques used to generate control software. This is primarily caused by the limitations imposed by automatic MH devices, which have a limited workspace and accessibility, and by the availability of appropriate end-effectors. The difficulties caused by the lack of appropriate end-effectors are two-fold. First, the process planning and scheduling activities now need to ensure the availability of the appropriate end-effectors at the required times. Second, the planning needs to incorporate the time required for changing end-effectors. Such an approach can also necessitate planning for the sharing of end-effectors among the MH devices.
Steele and Wysk ([
The addition of a human offers the potential of increased flexibility in the collaborative control system. While guidelines for function allocation in prespecified environments exist (Fitts [
Consistent with research findings in human–machine interaction, we propose a collaborative control system in which tasks and responsibilities are shared between the MPSG and the human operator. The collaboration consists of allocating functions to the MPSG controller and to the human so that factory resources are preserved and human abilities are exploited. Our use of a human MH obviates the need for the different end-effectors that may be necessary for automatic MH devices. Moreover, it provides opportunities to circumvent some pre-conditions that may be part of the shop floor control system. As an MT, the human operator can unload a part, physically transport it to the next station and load it into the next MP device, all without the need of a port. As a controller, the human operator is able to handle 'deadlock' situations such as that shown in figure 2 without the need of any additional ports or buffers.
Graph: Figure 2. Deadlock with an automated MH.
The resource model proposed by Wysk et al. ([
MH devices are entities that transfer products between equipment for processing, storage or transportation. Typically, they have the kinematic flexibility to change the orientation and position of the part to insert the part properly into equipment fixturing. MT devices, such as automated guided vehicles (AGVs), are entities that usually lack manipulation and are associated with moving parts from one location to another. Steele and Wysk ([
The use of physical graphs in creating models, which were then used to generate control software, was outlined above. Figure 3 shows the physical models for MH (automatic), MT and a human MT/MH based on Mettala's ([
Graph: Figure 3. Physical models: (a) material handler, (b) material transporter and (c) hybrid material handler and transporter.
Graph: Figure 4. MPSG for a human MT/MH.
A key aspect of the proposed control architecture is the impact of the human operator on the communication systems. This is not limited to the communication protocols used in the system, but also in the design and implementation of necessary human interfaces. Jones and Mitchell ([
Another key consideration when incorporating the human into the automata-based control architecture is the implication on the complexity of the controller models. The physical model and the MPSG corresponding to the MT/MH controller and its differences with the individual controllers for MT and MH have been discussed above. Apart from the complexity of the equipment level controller for the human MT/MH, a more important concern is the increase in the complexity of the shop level controller that will collaborate with the human MT/MH.
The potential benefit of having a human on the shop floor lies in the superior flexibility of the system. However, a higher degree of flexibility implies a more complicated control system. Figure 5 shows an example automated manufacturing system with two MPs, one MH, a buffer (B) and an MT, and the corresponding reachability/accessibility graph. It is assumed that the AGV is capable of placing the part onto the port, or obtaining the part from the port, without requiring any external device. As can be seen from figure 5, the MH device can access all MPs, but the MT is not capable of performing any pick-and-place operation directly with the machines. The flexibility of having the human operator in the system can be demonstrated through the accessibility graph shown in figure 6. The human is capable of assuming a supervisory role by taking actions in response to contingency events inside the workstation such as a breakdown or a deadlock. In addition, the automated control system is given more flexibility by using the human transporter as a second handler for the workstation. It can be seen that the number of additional arcs in the accessibility graph is 2n, where n is the number of pieces of equipment the MT/MH is 'allowed' to interact with. In the maximum condition, n is the number of all MPs in the system. This increase in model complexity is a linear function of the number of resources in the system.
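The 2n growth can be sketched as follows; the arc labels and function name are illustrative, not the notation of the actual accessibility graph.

```python
def added_arcs_for_human_handler(n_accessible_mps):
    """Each MP the human MT/MH may serve contributes one 'pick' arc
    and one 'put' arc to the accessibility graph, i.e. 2n arcs total.
    Arc labels here are illustrative only."""
    arcs = []
    for i in range(1, n_accessible_mps + 1):
        arcs.append(f'human_pick_mp{i}')
        arcs.append(f'human_put_mp{i}')
    return arcs

# A human allowed to serve 2 MPs adds 2n = 4 arcs.
assert len(added_arcs_for_human_handler(2)) == 4
```

The linear growth in arcs (rather than a combinatorial explosion) is what keeps the human-augmented accessibility graph tractable.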
Graph: Figure 5. Example workstation with an automated MT.
Graph: Figure 6. Accessibility graph with a human MT/MH.
It has already been discussed that the shop floor level MPSG is an extension of the physical model and the accessibility graph associated with a shop floor. It is of interest to understand the impact of the human MT/MH on the shop floor controller MPSG, given the fact that the accessibility graph will be increased by a maximum of 2n arcs as discussed above. For this purpose, consider the scenario of two cells each with two MPs, one MH and a buffer (figure 7). Figure 8 represents the shop level controller MPSG for the first scenario with the automated MT device.
Graph: Figure 7. Example CIM system.
Graph: Figure 8. Shop level controller MPSG with automated MT devices (message structure:
The MPSG in figure 8 is an aggregate version of the complete MPSG necessary to control the system. Each arc denotes a collection of arcs and nodes corresponding to a specific control event, and each node represents a state of the part in the system. Events are represented in the form a_b_c, where a is the equipment to which the event is directed (active resource), b is the event itself and c is the location for the event (passive resource). For instance, mh1_pick_b1 means that MH1 in Workstation 1 is ordered to pick a part from B1. For a better understanding, the non-aggregate form of the mt_put_xx arc is shown in figure 9. Here, although an AGV is not capable of handling the material, it is assumed that there is a physical connector system between the AGV and the buffers that enables the AGV to place a part in the buffers. Therefore, pick and put events are available for the automatic MT device, but only when the MT is in contact with the buffers. The non-aggregate version of each event is not presented here, since MPSG implementation is beyond the focus of this paper. A prototype implementation of the MT/MH has been undertaken at The Pennsylvania State University's CIMLAB.
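As a minimal sketch of the a_b_c event convention described above (the function name is ours, not part of the MPSG implementation):

```python
def parse_event(event):
    """Split an a_b_c control event into its three fields:
    (active resource, event, passive resource).
    E.g. 'mh1_pick_b1' -> MH1 is ordered to pick a part from B1."""
    active, action, passive = event.split('_')
    return active, action, passive

assert parse_event('mh1_pick_b1') == ('mh1', 'pick', 'b1')
```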
Graph: Figure 9. Expanded version of the mt_put_xx arc.
In the aggregated MPSG (figure 8) there are 15 nodes and 22 arcs. In a complete shop floor controller MPSG for the discussed scenario, the numbers of nodes and arcs are significantly higher than those shown in figure 8. A complete MPSG for the above scenario in a simulation-based architecture will have 110 nodes and 125 arcs. When the automated MT is replaced with the human MT/MH, the aggregate MPSG takes the form shown in figure 10. The complete MPSG for this scenario has 150 nodes and 165 arcs. As can be seen from the above example, the MPSG for the shop floor controller grows significantly even though the accessibility graph grows only by 2n (in this case, 2 × 6 = 12 arcs). Different scenarios were used to study the growth in the number of arcs and nodes in the shop floor controller MPSG.
Graph: Figure 10. Shop level controller MPSG with a human MT/MH (message structure and resource names are the same as defined in figure 8.)
As shown in figure 9, a put event for MT/MH with an MP is different from the put event of an MT with a passive buffer. Since there are four MPs in this system, this means eight new aggregate arcs and, as shown in figure 10, each aggregate arc adds five more nodes and arcs in the complete MPSG. The human MT/MH has at most 2cn more arcs in the complete MPSG, where c is the maximum number of arcs represented by an aggregate arc and n is the number of MPs.
The above discussion illustrates that the complexity of the shop floor controller MPSG grows as a linear function of the number of resources on the shop floor. Obviously, this approach does not accommodate any intricacies that can arise from introducing more flexibility into the controller by allowing extra decision-making paths for greater human autonomy. This modelling formalism addresses only a basic way of including a human in the control loop; however, the authors believe that it is a promising scheme that can be augmented with various other formalisms for greater flexibility and to address ergonomic aspects of the phenomenon. The next section presents the implementation of this formalism.
As discussed above, the main issues that need to be considered while designing a control system with a human MT/MH are the communication system (language, semantics), ergonomic considerations (memory, repetitive actions, safety), error detection and recovery, and the level of aggregation of information. A preliminary implementation of a control system for shop floor activities with a human MT/MH is discussed here. The control system is simulation-based. A real-time discrete event simulation model generated in ARENA software is used as a task generator. The use of real-time simulation models for shop floor control was discussed by Smith et al. ([
The human MT/MH presented here has been implemented as a part of the computer control system in CIMLAB. The automated manufacturing system in CIMLAB consists of five computer-numerically controlled (CNC) machine tools, an automated storage/retrieval system (AS/RS) and an industrial robot. Currently, all material transportation activities are handled by human operators. This created the need for a collaborative controller for the MT/MH. The simulation model (Arena 3.0 RT) was used as the shop floor controller. The main aim of the controller implementation was to make the control system as easy to use as possible for human MT/MH operators. For this reason, numerous sensors and switches were used in the shop floor.
Control of a human MT/MH requires an interface between the control system and the MT/MH for communication of control system messages and operator responses to those messages. Visual display boards are used for conveying the messages of the control system to the human MT/MH. Figure 11(a) shows an example light-emitting diode (LED) display board for communicating pick, put and move messages, the basic messages required for controlling an MT/MH. The LED on the upper part of the display shows the location number for pick/put commands, while the two LEDs at the bottom (labelled 'PICK' and 'PUT') indicate the message type. This display box is connected to one of the serial communication ports of the control computer using a DB25 serial connector (shown on the left side of the display box). Whenever the control system sends a pick or put message with a location parameter attached to it, the hardware interface (through the use of a digital input/output card in the controller computer) is used to send the appropriate signals to the LED display board.
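The mapping from a controller message to the display state can be sketched as follows; the field names are illustrative and do not correspond to the lab's actual I/O register layout.

```python
def display_signal(message, location):
    """Map a pick/put controller message to the LED display state:
    the upper LED shows the location number, while the 'PICK' and
    'PUT' LEDs flag the message type. Field names are illustrative,
    not the actual digital I/O card registers."""
    if message not in ('pick', 'put'):
        raise ValueError('display board only handles pick/put messages')
    return {'location_led': location,
            'pick_led': message == 'pick',
            'put_led': message == 'put'}

# A pick message for location 3 lights the location LED and 'PICK'.
assert display_signal('pick', 3) == {
    'location_led': 3, 'pick_led': True, 'put_led': False}
```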
Graph: Figure 11. Example display boards used in implementation: (a) for MT/MH and (b) display box for a CNC lathe (HAAS SL20 with bar feeder).
Moreover, there are four push buttons on the board used for obtaining the confirmation signal from the MT/MH for each location. The MT/MH is required to confirm that the required activity has been successfully accomplished by using these push buttons. The decision-making entity, in this case the simulation model, will not generate additional messages until it receives the confirmation from the MT/MH. Similarly, there is another push button for the verification of the move command. Whenever the MT/MH completes executing a task (pick, put or move), he/she is required to push the corresponding button on the display board. Each machine tool and the AS/RS have one display board for the MT/MH, and confirmation push buttons for the move command are placed on the corresponding machine's display board (figure 11(a)). The display box in figure 11(a) is the one used for the AS/RS. For machine tools, a simpler display board is used (figure 11(b)). First, since the MP devices used in this system have no internal buffer, multiple location signals were not necessary. (This would not be the case if an MP device with a pallet exchanger were used.) Second, there is no pick/put command verification push button on these display boards.
A limit switch placed inside the machine door checks the status of the machine door. After the control system sends a pick or put message to the MT/MH, the corresponding LED first lights up on the display board. The control system then waits until the limit switch on the machine door changes its status twice in succession, in opposite directions (whenever the human operator wants to place a part inside the machine, he/she has to open the door of the machine, place the part and close the door for the machining process to start). Therefore, the control system reads the confirmation message by checking the machine door status, which eliminates the need for the MT/MH to push a button. More robust and reliable automatic response systems can be designed through the use of special sensors placed inside the machine chuck that check for the part and its proper alignment. A similar method using automatic switching was used in the system buffer, so that whenever a part is placed on a buffer location or picked from that location, the switches placed at the bottom of the location signal the completion of a pick or put task. Similar sensory systems can be used for the move command by placing photo-sensors at a proper location near the machine or by using pressure-sensitive mats in front of machine tools and other equipment.
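The door-switch confirmation logic, two consecutive status changes in opposite directions, can be sketched as follows (a simplified illustration assuming binary switch readings, not the actual controller code):

```python
def door_cycle_complete(switch_readings):
    """Return True once the door limit switch has toggled twice in
    opposite directions (e.g. closed -> open -> closed), which the
    controller reads as confirmation of a pick/put at the machine."""
    # Collect the status transitions from the raw reading stream.
    transitions = []
    for prev, curr in zip(switch_readings, switch_readings[1:]):
        if prev != curr:
            transitions.append((prev, curr))
    # Look for two consecutive, opposite transitions: a full cycle.
    for t1, t2 in zip(transitions, transitions[1:]):
        if t1 == (t2[1], t2[0]):
            return True
    return False

# closed(0) -> open(1) -> closed(0): a complete load/unload cycle.
assert door_cycle_complete([0, 0, 1, 1, 0]) is True
# Door opened but never closed again: no confirmation yet.
assert door_cycle_complete([0, 0, 1, 1]) is False
```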
Although the above-discussed hard-wired system of LEDs and push buttons is sufficient to serve as a communication medium, a graphical user interface (GUI) was also developed and embedded in the MT/MH controller software for conveying the controller messages to the operator. This system is intended for novice operators who need audio-visual explanations of the tasks to be executed.
After an input message is received from the shop floor controller, the MT/MH controller displays a pop-up dialogue box on a display screen that gives a brief description of the task assigned to the operator along with a description of the part involved. When a part is assigned, the operator is asked to pick the part from a specified location, transfer it to a new location and put it into a new position given by the input message. From a control point of view, this activity contains three distinct tasks: pick, move and put. However, from a human MH's perspective, it is a single job of transferring a part from one location to another on the shop floor. Therefore, rather than creating three different messages and separate dialogue boxes, the three tasks can be aggregated and displayed as a single task on the GUI, which decreases the complexity of the job description for the operator. This way, the detailed information embedded in the controller is not revealed to the operator, preventing redundant communication and information overload. While the operator only observes a single GUI describing the details of the transfer job, the controller still monitors the proper execution of the three separate tasks (pick, move and put) that take place within an aggregate transfer task. Improper execution of any of these three tasks triggers the controller to halt the execution sequence and interrupt the operator with error dialogue boxes. This approach does not require any changes in the MPSG of the controller.
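The aggregation described above can be sketched as follows; the class and method names are illustrative, and the real controller follows the MPSG rather than this simplified object.

```python
class TransferTask:
    """Aggregate the controller's pick/move/put messages into one
    operator-facing 'transfer' job while still tracking each subtask
    in order. Names are illustrative, not the controller's API."""

    SUBTASKS = ('pick', 'move', 'put')

    def __init__(self, part, source, destination):
        self.part = part
        self.source = source
        self.destination = destination
        self.done = []

    def describe(self):
        # The single line shown on the GUI instead of three messages.
        return f'Transfer {self.part}: {self.source} -> {self.destination}'

    def confirm(self, subtask):
        """Record a sensor-confirmed subtask; out-of-order execution
        raises, standing in for the controller's error dialogue."""
        expected = self.SUBTASKS[len(self.done)]
        if subtask != expected:
            raise RuntimeError(f'expected {expected!r}, got {subtask!r}')
        self.done.append(subtask)
        return len(self.done) == len(self.SUBTASKS)  # True when complete

task = TransferTask('P42', 'AS/RS tray 2 slot 5', 'SL20 chuck')
assert not task.confirm('pick')
assert not task.confirm('move')
assert task.confirm('put')  # transfer complete
```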
Figure 12 shows the GUI designed for the MT/MH controller. This dialogue box appears on the screen of the computer running the controller software after the first message of the transfer task sequence, pick_bt, is received from the shop floor controller, and is accompanied by a brief audio alert to notify the operator.
Graph: Figure 12. Graphical user interface for the transfer operation.
As seen in figure 12, the GUI shows the part number (or a description of the part with its picture) and the parameters related to the source and destination of the transfer operation. There are two common parameter fields displayed under 'Source' and 'Destination' boxes: location and position. The location field shows the name or the physical location of the equipment holding the part. The position field shows the relative position of the part within the equipment holding it. In the example given in figure 12, the source location for the part to be transferred is the second tray of the AS/RS system in the CIMLAB, while the position of the part is the fifth slot in that tray. The AS/RS system consists of several trays for storing material and each tray has several labelled slots. The destination location for the part in this example is the HAAS SL20 CNC turning centre, which has only one position to hold the part, its chuck. The location and position information for all parts are stored in the manufacturing database. When an order corresponding to a specific part is released to the shop floor, the simulation model starts reading the necessary parameters for the operations required to manufacture the part from the process plans stored in the manufacturing database. Eventually, these parameters are passed with the messages to the subsequent controllers for the execution of the tasks required to accomplish the manufacturing operations.
From the GUI in figure 12, the operator has three choices: accept the task by clicking on the OK button, refuse the task by clicking on the ABORT button, or request a detailed explanation of the task by clicking on the HELP button. If the operator accepts the task by hitting the OK button, the controller will start monitoring the sensors to capture the completion of the three embedded tasks: pick, move and put. If the task is aborted, the controller will decide that the operator cannot handle the task, and the system will automatically release the MT/MH controller from the part and notify the simulation model, through the shop floor controller, to find alternative ways to carry out the required transfer operation. Both of the above-described activities are accomplished by changing the initial structure of the controller MPSG and modifying the automatically created controller software (this modification adds extra arcs into the MPSG for alternative actions and is an extension of the basic structure explained above). The operator can also choose to see help information about the transfer operation by clicking on the HELP button. When help is requested, the control software opens a 'Task Help' dialogue box, which gives a brief verbal description of the transfer task. This dialogue box also offers links to the following:
- HTML page, which shows an animated pictorial explanation of the transfer task by showing the transfer operation on the animated shop floor layout.
- Video file, which shows an example video of the transfer job recorded previously.
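A minimal sketch of the extra MPSG arcs for the operator responses (OK, ABORT, HELP); state and event names are illustrative, not those of the generated controller.

```python
# Each (state, event) pair maps to a successor state, mirroring the
# extra arcs added to the controller MPSG for operator responses:
# accepting proceeds to monitoring, aborting releases the MT/MH so
# the shop controller can reroute, and help leaves the state unchanged.
MPSG_ARCS = {
    ('task_offered', 'ok'): 'monitoring',
    ('task_offered', 'abort'): 'released',
    ('task_offered', 'help'): 'task_offered',
    ('monitoring', 'subtasks_done'): 'idle',
}

def step(state, event):
    """Follow an arc if one exists; otherwise stay in place."""
    return MPSG_ARCS.get((state, event), state)

assert step('task_offered', 'abort') == 'released'
assert step('task_offered', 'help') == 'task_offered'
```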
The 'Task Help' dialogue box and the 'Video Box' that displays the sample video are shown in figure 13. The sample web page created for task animation is shown in figure 14.
Graph: Figure 13. Task help and video dialogue boxes.
Graph: Figure 14. Sample HTML page for the transfer task.
This paper has discussed the use of a recent modelling approach (MPSGs) for controlling flexible manufacturing systems when a human operator is present as an MT/MH. The difficulties in designing and implementing a control system with a human operator were discussed in comparison with a shop floor with automated MH devices, which have a predetermined and limited workspace. The growth of the accessibility graphs and MPSGs (both defined above), in terms of the number of nodes and arcs, when a human is introduced into a hitherto automated system was also analysed; as discussed above, this growth is linearly dependent on the number of resources. The formal modelling specifics (MPSGs) required to include a human in a shop floor control system, and their impact on the complexity of the accessibility graph, were discussed from a graph-theoretic perspective, along with the associated human–computer and human–systems issues.
A major challenge in implementing a control system as discussed is developing the required supervisory control framework and interfaces so that the system allows the human handler to make good decisions while tracking product flow in the system. The issues that need to be considered while designing and implementing a control architecture for an automated manufacturing system with a human were also outlined. Some human–computer and human–systems implementation issues were discussed, but the full resolution of these is beyond the scope of the paper. The use of hardware interfaces to communicate with the human operator and understand his/her responses and actions was discussed with an example implementation.
By B. Altuntas; R. A. Wysk and L. Rothrock