Things that have never happened before happen all the time.

Scott D. Sagan, The Limits of Safety

SPAD situation

Drive a mile in my seat: signal design from a systems perspective.

A paper by Dr Anjum Naweed and John Aitken.


You can get lonely out there!

Keeping in communication with people in isolated locations. A paper by John Aitken.


IN CASE OF EMERGENCY PRESS BUTTON "B" ... or is it "C"?

A paper presented to the International Railway Safety Council.


Resilience

Enhancing error tolerance, error detection and error recovery together to produce system safety.
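
A loose sketch of how those three capabilities can combine in practice (the function names, retry limit and plausibility range below are invented for this illustration, not taken from the paper): a reading routine that tolerates transient faults by retrying, detects implausible values, and recovers by falling back to a known safe default.

    # Sketch: error tolerance (retry), error detection (validation) and
    # error recovery (safe fallback) layered to keep the overall function safe.
    # All names and thresholds are illustrative, not from the paper.
    import random

    SAFE_DEFAULT = 0.0      # value known to leave the system in a safe state
    MAX_ATTEMPTS = 3        # tolerance: allow transient faults to clear

    def read_sensor() -> float:
        """Stand-in for a flaky measurement source."""
        if random.random() < 0.3:
            raise IOError("transient read failure")
        return random.uniform(-2.0, 2.0)

    def plausible(value: float) -> bool:
        """Detection: reject values outside the physically plausible range."""
        return -1.5 <= value <= 1.5

    def safe_reading() -> float:
        for _ in range(MAX_ATTEMPTS):          # tolerance
            try:
                value = read_sensor()
            except IOError:
                continue                       # tolerate a transient fault and retry
            if plausible(value):               # detection
                return value
        return SAFE_DEFAULT                    # recovery: fall back to a safe state

    print(safe_reading())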

Biological principles for future internet architecture design

Biological systems have remarkable capabilities of resilience and adaptability. These capabilities are found in various biological organisms, ranging from microorganisms to flocks of animals and even human society.

...

There are two especially appealing aspects of biological systems that could be beneficial in designing architectures for the future Internet. First, biological systems are always composed of a multitude of protocols that combine various processes to control different elements of an organism. Second, biological systems as a whole exhibit a hierarchical ecosystem structure that allows various organisms and systems to coexist.

Highly Resilient Organisations

Highly resilient organisations can be recognised by the following four behaviours:

  • They anticipate critical disruptions and situations and their consequences
  • They notice the critical disruptions and situations when they occur
  • They plan how to respond
  • They adapt and move into different actions.

Mechanistic thinking vs Systems thinking about Failures

Mechanistic thinking about failures, that is, the Newtonian-Cartesian approach, means going down and in. Understanding why things went wrong comes from breaking open the system, diving down, finding the parts, and identifying which ones were broken. This approach is taken even when the parts are located in different areas of the system, such as procedural control, supervisory layers, managerial levels, or regulatory oversight.

Network Resilience: A Systematic Approach

Whether used for professional or leisure purposes, for safety-critical applications or e-commerce, the Internet in particular has become an integral part of our everyday lives, affecting the way societies operate. However, the Internet was not intended to serve all these roles and, as such, is vulnerable to a wide range of challenges. Malicious attacks, software and hardware faults, human mistakes (e.g. software and hardware misconfigurations) and large-scale natural disasters threaten its normal operation.

Response to Stress

The response of an organisation to stress is strikingly similar to the response of a ductile metal to stress. For a ductile metal, as the load (or stress) increases, the metal can recover and return to its original form when the load is removed, up to a point (the elastic limit).
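
A toy numerical sketch of that elastic behaviour (the yield stress and modulus are arbitrary illustrative values, not data for any particular metal, let alone an organisation): strain recovers completely while the applied load stays below the yield point, and a permanent set remains once it is exceeded.

    # Toy model of ductile behaviour: full recovery below an (illustrative) yield
    # point, permanent deformation above it. Values are arbitrary, for illustration.
    YIELD_STRESS = 250.0         # MPa, illustrative yield point
    ELASTIC_MODULUS = 200_000.0  # MPa, illustrative Young's modulus

    def residual_strain(applied_stress: float) -> float:
        """Strain remaining after the load is removed (simplified, idealised)."""
        if applied_stress <= YIELD_STRESS:
            return 0.0                                    # elastic: springs back fully
        return (applied_stress - YIELD_STRESS) / ELASTIC_MODULUS  # permanent set

    for stress in (100.0, 250.0, 400.0):
        print(f"load {stress:5.1f} MPa -> residual strain {residual_strain(stress):.6f}")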

Task descriptions and dynamic behaviour of systems

While a system is traditionally modelled by structural decomposition into structural elements, the dynamic behaviour of systems and their actors is modelled by decomposition of the behavioural flow into events, acts, decisions, and errors. Such decomposition is the basis for identification of activity elements in terms of 'tasks' and task elements in terms of 'acts'. The problem is that all work situations leave many degrees of freedom for choice by the actors, even when the objectives of work are fulfilled.
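
One way to picture that kind of decomposition (a sketch only; the class names and the driving example are invented here, not taken from the source): a task is broken into acts, and the actors' degrees of freedom appear as alternative acts that all satisfy the same objective.

    # Sketch of decomposing a behavioural flow into tasks and acts.
    # Names (Task, Act, alternatives) and the example are illustrative.
    from dataclasses import dataclass, field

    @dataclass
    class Act:
        name: str
        alternatives: list[str] = field(default_factory=list)  # degrees of freedom left to the actor

    @dataclass
    class Task:
        objective: str
        acts: list[Act]

    approach_signal = Task(
        objective="Bring the train to a stand before the signal at danger",
        acts=[
            Act("observe signal aspect"),
            Act("decide braking point", alternatives=["brake early and coast", "brake late and hard"]),
            Act("apply brake"),
        ],
    )

    # Either braking alternative fulfils the objective, so the task description
    # alone does not determine which act the driver will actually perform.
    print(len(approach_signal.acts), "acts;",
          len(approach_signal.acts[1].alternatives), "equally acceptable ways to brake")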

Things that go right

Resilience Engineering sees the "things that go wrong" as the flip side of the "things that go right" and therefore assumes that they are a result of the same underlying processes. In consequence of that, "things that go right" and "things that go wrong" should be explained in basically the same way.

It therefore makes as much sense to try to understand why things go right as to understand why they go wrong. In fact, it makes more sense, because there are many more things that go right than things that go wrong, the ratio depending on how (im)probable an accident is considered to be.
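
To put an illustrative number on that ratio (the probability is assumed purely for the example, not a measured figure): if one operation in ten thousand ends in an adverse event, then roughly 9,999 operations go right for every one that goes wrong.

    # Illustrative arithmetic only: the accident probability is an assumption.
    p_accident = 1e-4                               # one adverse event per 10,000 operations
    right_per_wrong = (1 - p_accident) / p_accident
    print(f"about {right_per_wrong:,.0f} things go right for each one that goes wrong")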

"The enemy of safety is complexity."

Behind Human Error, Woods et al., Ashgate, 2010, p. 23

"Knowledge and error flow from the same mental sources, only success can tell one from another."

Ernst Mach, 1905