Risk! Engineers Talk Governance

The Judicial Need for Reliable Knowledge and the Need to Demonstrate Causation

Richard Robinson & Gaye Francis Season 4 Episode 1

In the first episode of Season 4 of Risk! Engineers Talk Governance, due diligence engineers Richard Robinson and Gaye Francis discuss the judicial need for reliable knowledge and the need to demonstrate causation. 

They highlight some of their Expert Witness examples to demonstrate that, for the courts to come to a decision, they must understand not only what went wrong, but also whether there was something that could have been done that would've prevented the incident. And if this hasn't been made clear, then making a decision becomes very complicated.

From the point of view of the courts, if things don't happen because of a causal link to which you can assign liability or responsibility, how can they make a decision?

They finish by talking about due process and how, on a certain level, it is more important to have a decision society can live with than to actually get it right.

If you’d like to find out more about Richard and Gaye’s consulting work, head to https://www.r2a.com.au.

Find out more about their publications at https://www.r2a.com.au/store.

Find out more about their training at https://www.r2a.com.au/education.

Megan (Producer) (00:01):

Welcome to season four of Risk! Engineers Talk Governance. In this episode, due diligence engineers Richard Robinson and Gaye Francis discuss the judicial need for reliable knowledge and the need to demonstrate causation.

(00:17):

We hope you enjoy the episode. If you do, please give us a rating. Also subscribe on your favorite podcast platform. If you have any feedback or topic ideas, we'd love to hear from you. Please email us at admin@r2a.com.au.

Gaye Francis (00:36):

Hi, Richard, welcome to a podcast session.

Richard Robinson (00:38):

Good morning Gaye. Good to be back.

Gaye Francis (00:40):

Good to be back. Today we're going to talk about the judicial need for reliable knowledge and how that sort of fits in society now, and how expert witnesses behave in court. This is one of the topics that you're very, very interested in and have a lot of information on. So I think you were going to open up with an example from one of our (R2A) courses.

Richard Robinson (01:05):

Yeah. The difficulty here is that in order for the courts to come to a decision, they have to have an understanding of what went wrong and, more to the point, whether there was something that could have been done by somebody in the process that, had it been done, would've stopped it all going wrong. And if you can't come to a clear understanding of what that is, then making a decision becomes very complicated.

(01:25):

Now, the simplest way to do this is just to explain a case that went through the courts, and we used it in our training a lot. It's about a woman who alleged she slipped over outside the freezer area of a supermarket. There's no doubt she'd gone over, because she'd had a laminectomy - that's where you take a piece of bone from somewhere else in your body and fuse two vertebrae together. So I turned up with her shoe, a kilogram weight and a spring balance, trying to work out how slippery the floor was. I parked the kilogram weight on her leather-soled shoe on the lino floor and dragged it around, working out the coefficient of friction - static and dynamic, wet and dry - to work out whether the floor was slippery or not. As I have mentioned, if you want to feel like a bit of a goose in a public place, I can recommend putting a kilogram weight on a woman's shoe and dragging it around the floor.
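The test Richard describes is the classic drag method: the coefficient of friction is the drag force read off the spring balance divided by the normal force from the ballast weight. A minimal sketch of the arithmetic - the spring-balance readings below are made-up illustrative numbers, not values from the actual case:

```python
# Coefficient of friction from a spring-balance drag test:
# mu = drag force / normal force, where normal force = mass * g.

G = 9.81  # gravitational acceleration, m/s^2

def coefficient_of_friction(drag_force_n: float, mass_kg: float) -> float:
    """Drag force in newtons, ballast mass in kilograms."""
    return drag_force_n / (mass_kg * G)

# Hypothetical readings for a 1 kg weight on a leather-soled shoe:
# the force needed to start it moving (static) and to keep it moving (dynamic).
mu_static_dry = coefficient_of_friction(4.4, 1.0)   # breakaway force, dry floor
mu_dynamic_dry = coefficient_of_friction(3.4, 1.0)  # steady sliding, dry floor

print(f"static dry:  {mu_static_dry:.2f}")
print(f"dynamic dry: {mu_dynamic_dry:.2f}")
```

Repeating the readings wet and dry, static and dynamic, gives the four coefficients mentioned; the point of the exercise was simply to establish whether the floor was unusually slippery.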

Gaye Francis (02:10):

You get some weird looks.

Richard Robinson (02:11):

You get some weird looks. Anyway, I got a good result. So I then trotted over to the management and said, guys, what happens if you get a spill? And they said, oh, we've trained our people. If we see a spill, we put up those plastic posts and tape and block it off, and we're out there with buckets and mops, and we don't take the tapes down and the poles away until the floor's back in pristine condition. So I had to go back to this woman and say, look, so far as I can tell, there is nothing which this supermarket has failed to do which led to your slip, fall and injury. Tell me again what happened. She said, well, it's quite interesting actually. I actually fell forward. I said, well, that's interesting, because the comedians have got it right. If you step on a banana skin, you'll tend to fall on your back, but if you trip, you'll tend to fall forward.

(02:53):

So I said, it sounds like a trip. So I tracked back to where she said she'd fallen over, and precisely where she said she'd fallen over there was a little access hatch in the lino floor with a little bronze lip around the edge - a lip of a few millimeters - and she was wearing strapless sandals. So I reckon what had happened is she'd been walking in the supermarket for 15 minutes, and as she walked over this little hatch, the heel caught on the back of the hatch, which stopped her foot, which means she fell forward.

(03:21):

So what this means in causation terms is, if she sues the supermarket because the floor was slippery, which led to her slip, fall and injury, then I don't think there's a case to answer. But if she sues the supermarket for her trip, fall and injury because they failed to maintain the floor in a level condition, I think there is a case to answer. And that's why causation is so critical to the courts. If you can't satisfy causation, you've got troubles.

Gaye Francis (03:45):

We haven't got a case to start with.

Richard Robinson (03:48):

So then a little while later we got another case, from the Supreme Court of Victoria. They were fretting about the impact of rolling blackouts if we had a strike in the Latrobe Valley, with the power supplies and things like that. And the problem you've got here is that - I mean, we know from memory, with the blackout in South Australia, all these dreadful things happened, but that was after a windstorm blew over all the towers - the problem you have is that you don't know... You can say that the blood bank went wrong and a whole lot of in-vitro samples died and a whole number of other things. People got trapped in trains and lifts and all these other things. But you don't actually know predictively, in advance, which one of these things would actually happen. So you sort of have to start doing a probabilistic thing. The problem you normally have is, if there's a storm and the power goes off, you can't say that somebody didn't make it to hospital just because the lights were out, because often there's a flood on the road or a tree across the road and that blocks things up. So you can't separate out what's the actual cause.

Gaye Francis (04:42):

There are a number of mechanisms that all add to it.

Richard Robinson (04:45):

Now, as it turned out, after a bit of scrambling around, we actually did find a way forward: we found there was what they called a blue-sky blackout in New York, about - I don't know - 2008 or something like that. There was no storm. For various reasons the network failed and the power went off for 24 hours, something like that. And all the things that we've talked about went wrong. People got stuck in lifts and subway tubes, pharmacies lost their freezers, people couldn't get their prescriptions. All sorts of things went wrong. But because it was a blue-sky one, you can actually adopt an epidemiological view. You didn't have to know what happened in each case. You say, look, what was the increase in mortality because of that event? Because that's what epidemiologists do. And the study we were reading was by epidemiologists.

Gaye Francis (05:29):

Right.

Richard Robinson (05:30):

Now what that means, though, is you're taking a probabilistic view of the way things behave. You can't say, like with the woman who went over: you failed to maintain the floor in a level condition, leading to a trip, fall and injury. You can only say, oh, I think there might be a percentage increase of this in the same circumstance for a similar city. Now, Melbourne's not as dense or as high-rise as New York, so what factors are important? Now, the reason why this gets particularly complicated is because you start getting into this probabilistic... The first model we talked about was Newtonian - there's a causal link and a time sequence and events in series. With a predictive model like an epidemiological model, you can't say that's the case. I mean, that's part of the problem with the COVID work. You could do these mathematical models, but you can't say that's definitely what's going to happen. There was never that nice crisp connection.

Gaye Francis (06:22):

No, there were too many mechanisms that led to all of the consequences that came out of it.

Richard Robinson (06:27):

Correct. And that's why Schrödinger's cat popped up - everybody's heard about that in popular culture. But basically what it was talking about was the fact that when you start talking about atomic decay and things like that, you can't predict it. You can put a probability number on it, but you can't say that in the next half hour this one thing will happen. The way Schrödinger's cat worked, it was a thought experiment of, I thought, a pretty robust sort. Basically you put a cat in a box with a poison container and a radioactive source of some sort, and if there's a certain decay in that radioactive source, it'll set the poison off and kill the cat. And so, if you put it in there for an hour, you can't say whether that event will happen or not.

(07:09):

You don't know whether the cat's dead or alive. And the only way to find out is to open the box up and have a look, which, as I said, I always thought was a fairly macabre sort of example. But everybody kept talking about it. So that's what you do.

(07:21):

Now, you can see, from the point of view of the courts and the possibility of causation, that if you say everything is strictly probabilistic in nature - that things don't happen because of a causal link to which you can assign liability or responsibility - how can a court make a decision? You're in a very difficult situation. Now, obviously, in the case of the Supreme Court work we were doing here, we could actually say that mortality in New York increased by about 30% during that blackout. And if you know what the mortality figures in Melbourne are - obviously we're a different city - you could do a quick and dirty first cut, but it's still a pretty rubbery first cut.
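The "quick and dirty first cut" Richard describes is just scaling: apply the relative increase in mortality observed in the New York blackout to another city's baseline death rate. A sketch of that arithmetic - the 30% figure is from the study mentioned above, but the baseline rate and duration below are entirely hypothetical placeholders, not Melbourne data:

```python
# Epidemiological first-cut estimate:
# excess deaths = baseline deaths over the outage period
#                 * relative increase observed in a comparable event.

def excess_deaths(baseline_deaths_per_day: float,
                  relative_increase: float,
                  duration_days: float) -> float:
    """Rough scaling estimate of extra deaths attributable to the event."""
    return baseline_deaths_per_day * duration_days * relative_increase

# Hypothetical: ~90 deaths/day baseline, 30% increase, 1-day blackout.
print(excess_deaths(90, 0.30, 1.0))  # roughly 27 excess deaths
```

As Richard says, this is rubbery: it assumes the two cities respond alike, and it attributes the whole increase to the blackout without identifying any particular causal chain - which is exactly why it sits uneasily with the courts.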

Gaye Francis (08:01):

There's not a direct link between the two events.

Richard Robinson (08:04):

You can't say 'if this, then that'. You don't get... Remember, common law is decided on the balance of probabilities. Well, that's really hard to show. And if you start talking about beyond reasonable doubt, I do not see how a probabilistic basis of causation could help a court at all.

Gaye Francis (08:23):

So going forward with the way that information is at the moment and the reliability of that, the courts are still relying heavily on the experts to get it right?

Richard Robinson (08:35):

Correct.

Gaye Francis (08:37):

And there's a lot of information out there, and I don't know about you - sometimes I read some things and I'm like, oh, can that quite be true? But how do the courts deal with this way of thinking, and this vast amount of information that's coming our way, when they're making those sorts of decisions?

Richard Robinson (08:59):

Well, obviously it depends on your experts, and that's why - I mean, the courts are very clear that they want the experts to be expert. You were talking about that other example which we use, in the court where a...

Gaye Francis (09:14):

Storeman hurt his back.

Richard Robinson (09:15):

Yeah. Tipping up a drum, a 44-gallon drum, which was lying on its side. The lower court took the advice of an orthopedic surgeon that if you've got a 200 kilogram drum lying on its side, that's a 200 kilogram lift to tip it vertical. Well, no, it's a simply supported object - 200 kilograms is 100 kilograms at either end. So it's a 100 kilogram lift to raise it up, which decreases as you bring it up. Well, it took the High Court of Australia to work that out, after two Supreme Court appeals. Now, that was because of the data that went before the first trial judge, who was not a physicist, obviously - and I'm still not too clear how the High Court became aware of this distinction - but if you get a bad input, you'll get a bad output. And that's the advice we got from Engineers Australia; I assume their lawyers are good on that. What's a fact, legally speaking? Is it what actually happened? No, most emphatically not. At best, it's what the trial court - the trial judge or jury - thinks happened. Now, the trial court may be hopelessly incorrect, but that doesn't matter legally speaking, because you've got to remember, the reason why we have courts isn't so much to get it right as to make sure we stop events escalating.
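The drum correction is simple statics: a uniform drum lifted at one end while the other rests on the ground is a simply supported load, so the initial lift is half the weight, and as the drum tips toward vertical the centre of mass moves over the ground-contact edge and the required force falls toward zero. A sketch of that moment balance - the drum mass is from the case, but the length and radius are typical 44-gallon figures assumed for illustration:

```python
import math

# Tipping a drum lying on its side: take moments about the ground-contact
# edge. The vertical lift force F at the far end balances the weight acting
# at the centre of mass:  F * x_lift = m * g * x_com (horizontal lever arms).

G = 9.81        # m/s^2
MASS = 200.0    # kg - drum weight from the case
LENGTH = 0.88   # m  - typical 44-gallon drum height (assumed)
RADIUS = 0.29   # m  - typical 44-gallon drum radius (assumed)

def lift_force(theta_rad: float) -> float:
    """Vertical force (N) needed at the raised end at tilt angle theta."""
    # Horizontal distances from the pivot edge to the centre of mass
    # and to the lift point at the far end of the axis.
    x_com = (LENGTH / 2) * math.cos(theta_rad) - RADIUS * math.sin(theta_rad)
    x_lift = LENGTH * math.cos(theta_rad) - RADIUS * math.sin(theta_rad)
    return max(0.0, MASS * G * x_com / x_lift)

# With the drum horizontal, the lift is half the weight: 100 kg, not 200.
print(f"initial lift: {lift_force(0.0) / G:.0f} kg-equivalent")  # 100
print(f"at 30 degrees: {lift_force(math.radians(30)) / G:.0f} kg-equivalent")
```

At zero tilt the lever-arm ratio is exactly one half, which is the High Court's point; the force then decreases with angle, as Richard says, because the weight's lever arm shrinks faster than the lift's.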

Gaye Francis (10:31):

It's about due process.

Richard Robinson (10:33):

It's due process, and getting a decision. At one level, it's more important to have due process and a decision we can live with than to actually get it right. Although that would offend an awful lot of people I know.

Gaye Francis (10:45):

I think it's an interesting space, especially the way the world's going. As you said, there's a whole lot of different ways that you can think about information, and that traditional view of causation relies on an almost linear understanding of events. I think events are becoming more and more complex, and there are fewer events that can be shown linearly.

Richard Robinson (11:10):

Well, you might remember the maritime rules of the road. We've talked to a lot of Master Mariners about these things at different times, and we've been told a couple of times that the only reason why those rules of the road exist is so they can assign liability after the event. If you can't stay away from each other in big ships...

Gaye Francis (11:27):

You've got more problems...

Richard Robinson (11:27):

...more problems than that. But after it's all gone wrong and somebody's sunk and people have been drowned, you need a way to make a decision. And so you have these rules, which are sometimes more about making decisions in hindsight than they are about preventing things from occurring in the first place - which, speaking as due diligence engineers, we find a frustration.

Gaye Francis (11:48):

I think you'd say that with a lot of the rules, policies and regulations that are around, it's about assigning liability after the fact.

Richard Robinson (11:56):

It's like Sidney Dekker's line: there are now so many safety rules out there that nobody, least of all the people doing the job, knows what they are. So what the heck are they for? And a lot of the time you've got to say, oh, it's for if it all goes wrong: you should have known, even if you didn't.

Gaye Francis (12:12):

I think that's probably a space that's becoming more and more about liability. The due diligence work that we're doing is becoming more a liability management exercise than a safety improvement exercise.

Richard Robinson (12:29):

I think that was always the case, Gaye.

Gaye Francis (12:32):

I was optimistic that it wasn't, but liability is something that we'll talk more about in this podcast season. So thanks, Richard, for the podcast today. Nice talking to you, and we hope to see you next time.

Richard Robinson (12:48):

Thanks, Gaye.

 
