How Do People Contribute To The Catastrophic Breakdown Of Complex Automated Technologies?

As scientific knowledge progresses and technological advances are made, greater dependence is placed upon automated systems and their complexity is, necessarily, increased. Whilst the systems themselves may be rigorously tested to ensure they operate correctly, errors can enter the system via the weak link in the chain: the human designers and operators. Unlike the machines they operate, humans are not very good at doing the same task for a prolonged period, or at doing two things at once, and their performance becomes impaired if they are asked to do so (e.g. Casali & Wierwille, 1984). Human error therefore becomes almost an inevitability in a complex system, and this has led to much research into the causal factors behind errors and into ways of implementation that minimise their occurrence.

Reason (1990) distinguishes between two types of error: latent errors, problems caused by poor design or implementation at a high level which may not be immediately apparent, and active errors, errors caused by front-line operators which are often inherited from latent errors, although the consequences here are usually seen on site and are more immediately apparent. Latent errors are the more serious category for complex automated systems, as they may not be apparent when the system is first implemented and can lie dormant until triggered by an active error (giving rise to the 'pathogen metaphor'). As Reason observes, these errors "constitute the primary residual risk to complex, highly-defended technological systems."

Errors may also be exacerbated by the increasing opacity of automated systems, and this theme is central to the issue of automated system breakdown. As automated systems become more complex, the human operators become increasingly distanced from the actual processes and lose their 'hands-on' knowledge of the system. Such distancing and complexity of function can lead to 'mode errors', in which the human operators perform the action appropriate to one mode when they are in fact in another; for example, the pilots of an Aero Mexico DC-10 made a mode error in using the autopilot, causing the engine to stall in mid-air and damaging the plane (Norman, 1983).

Systems as complicated as these are often 'safeguarded' by features designed to accommodate errors without breakdown, though ironically these features often exacerbate the problem. Automated systems frequently have in-built defence mechanisms for errors and can compensate for them. Often, such systems operate on the 'defence in depth' principle, in which a system incorporating a hierarchical structure of processes has a corresponding defence at each level. In such cases, an active error by the operator may be subjected to attempts by the system to compensate at several levels, only returning to the operator as a last resort if the error cannot be fully compensated for. However, this mechanism is often unseen by the operator, who may be unaware that an error has even occurred until compensation is no longer possible and the system breaks down. At this point the error may have been compounded by the system's attempts to cope with the situation and be of a much larger, more complex and more obscure nature than when first encountered. Also, the delay by the system in informing the operator of the error may well have caused the system to go beyond the point where the operator is able to save it.
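
To make the delayed-feedback problem more concrete, the short Python sketch below illustrates the 'defence in depth' idea: each layer silently absorbs part of a disturbance, and the operator is only alerted once no layer can compensate any further. It is purely illustrative; the layer names, capacities and fault sizes are assumptions invented for the example, not details of any real system.

    # A minimal sketch of 'defence in depth' compensation: each layer silently
    # absorbs part of a disturbance, and the operator only hears about it once
    # no layer can compensate any further. All names and numbers are illustrative.

    class Layer:
        def __init__(self, name, capacity):
            self.name = name
            self.capacity = capacity   # how much disturbance this layer can absorb

        def compensate(self, disturbance):
            absorbed = min(disturbance, self.capacity)
            return disturbance - absorbed   # residual passed to the next layer

    def run(disturbance, layers):
        """Pass a disturbance through successive defensive layers."""
        for layer in layers:
            disturbance = layer.compensate(disturbance)
            if disturbance == 0:
                # Fully compensated: in this sketch the operator is never told.
                return "no alert raised"
        # Only now does the operator see the problem, after it has grown past
        # every layer's ability to contain it.
        return f"ALERT: residual fault of size {disturbance} handed to operator"

    layers = [Layer("autopilot trim", 3), Layer("thrust balancing", 2)]
    print(run(4, layers))   # silently absorbed  -> "no alert raised"
    print(run(9, layers))   # surfaces very late -> "ALERT: residual fault of size 4 ..."

In this sketch the alert that finally reaches the operator describes only the residual fault, not the history of silent compensation that preceded it, which is precisely the feedback problem discussed next.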
Such a situation is cited by Norman (1990), in which the loss of power to one aeroplane engine is compensated for by the autopilot until such compensation is no longer possible and it is too late to prevent the plane from rolling. Norman argues that in such cases automation is not the problem, but rather the inappropriate level of feedback given by the system. Considering a similar scenario to the one outlined above, we can envisage problems even if the system informs the operator of the problem whilst it is still possible to act. Humans have the unique ability to apply knowledge-based problem-solving routines to novel stimuli, such as those returned by a machine faced with an error that its programming does not equip it to cope with. However, this very ability of the human operator is not an optimal one, especially when the individual is under stress, as would be the case if an error in the automated system could have catastrophic consequences, for example in an air traffic control system. The operator is therefore likely to be under considerable pressure to produce a solution, and this is likely to interfere with already less than perfect heuristic problem-solving techniques. In attempting to match the situation with previously experienced ones (a technique known as 'similarity matching') and thus reuse previously successful solutions, it is quite likely that the individual will distort the problem space and arrive at a solution which does not fully meet the requirements of the problem. Here again, then, we see that the strategy of using the ill-informed operator as a last resort in the rectification of errors, which have possibly been made more complex by the system's own attempts to correct them, is a seriously flawed and potentially catastrophic one.
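
The similarity-matching heuristic can likewise be sketched in a few lines of Python. The incident features and remedies below are entirely hypothetical; the point is only that the closest past incident wins, however weak the match, which is how a distorted problem space can produce an ill-fitting solution.

    # A rough sketch of 'similarity matching': the operator compares the current
    # fault with previously experienced incidents and reuses the remedy from the
    # closest match, even when it fits poorly. All features and remedies are
    # purely hypothetical examples.

    past_incidents = [
        ({"pressure": "high", "temperature": "normal", "valve": "open"},   "vent pressure"),
        ({"pressure": "normal", "temperature": "high", "valve": "closed"}, "start cooling"),
    ]

    def similarity(a, b):
        """Count how many features two situations share."""
        return sum(1 for key in a if b.get(key) == a[key])

    def choose_remedy(current):
        # Pick the remedy of the most similar past incident, regardless of how
        # weak that similarity actually is: the source of distorted solutions.
        best = max(past_incidents, key=lambda item: similarity(current, item[0]))
        return best[1]

    novel_fault = {"pressure": "high", "temperature": "high", "valve": "closed"}
    print(choose_remedy(novel_fault))   # a past remedy that only partly fits the new fault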
Reason (1990) has highlighted another area in which humans can cause the breakdown of automated systems: violations. Reason identifies intentionality as what differentiates errors from violations. Within the violations category, routine violations are a consequence of the natural human tendency to take the path of least effort. The problem here is not a sub-conscious mistake but a decision taken by the individual in response to an indifferent environment in which such practices can go unnoticed; daily safety violations were made for a long time in the lead-up to the Chernobyl disaster. Human factors such as sloppiness of procedure, mismanagement and the practice of placing economic considerations above safety can all contribute to system failure. For example, in the Bhopal tragedy the staff were insufficiently trained, the increased reading of a pressure gauge was not seen as abnormal and factory inspectors' warnings were ignored; with such large latent violations in procedure the disaster was not entirely unpredictable (Stix, 1989). Exceptional violations are those in which the operating circumstances make them inevitable: in the Zeebrugge ferry disaster, for example, it was first thought that the blame rested with the crew member who did not close the bow doors, though later evidence showed that poor time management and a lack of checking by those higher up were at fault. Again, though, the less than optimal performance of the people running the operation is seen to be the root cause of breakdown.

Human errors will always occur. Attention lapses, performance limitations, slips and the like are all far too unpredictable ever to be eliminated altogether, and so perhaps the aim of systems designers should be to minimise the effects of errors and to maximise their early detection. Automation is an area in which efficiency can be greatly improved, safety standards raised and economies made, though as Wiener & Curry (1980) observe, attempting to remove errors by automation is a flawed idea in itself, since humans will be monitoring the systems and thus the errors are merely relocated.

In conclusion, it is perhaps ironic to note that with the continued implementation of more advanced technologies, humans are increasingly assigned to the role of monitors. Here, then, we see ourselves falling into a situation in which each half of the system is engaged in doing what the other half does best: computers are excellent at repeatedly performing mundane and tedious tasks without getting distracted, whilst humans have very limited attention spans and become bored very easily. As Bainbridge (1983) points out, vigilance studies have shown that it is impossible to maintain effective visual attention on a source of information on which very little happens for more than half an hour. Perhaps if more emphasis were placed on implementing technologies to monitor the performance of humans rather than the reverse, accidents such as Chernobyl and Bhopal might be avoided.

Bibliography

Bainbridge, L. (1983) Ironies of automation. Automatica, 19(6), pp. 775-779.
Eysenck, M.W. & Keane, M.T. (1991) Cognitive Psychology. Lawrence Erlbaum.
Norman, D.A. (1983) Design rules based on analyses of human error. Communications of the ACM, 26, pp. 254-258.
Norman, D.A. (1990) The problem with automation: Inappropriate feedback and interaction, not over-automation.
Reason, J. (1990) Human Error. Ch. 7. Cambridge University Press.
Stix, G. (1989) Bhopal: A tragedy in waiting. IEEE Spectrum.
