June 11, 2025
Sinclair Koelemij warns that the Cyber-Informed Engineering (CIE) notion of "credibility" (treating any scenario that seems plausible as a reason to act) does not fit the IEC 61511 risk standard used in the process industry.
IEC 61511 measures risk as the probability of a hazardous event multiplied by its consequence, and requires that the final, residual risk remaining after safety measures are applied stays below a set limit. It does not matter whether the cause is a cyberattack, a mechanical failure, or an operator mistake; only the outcome counts.
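As a rough illustration of that quantitative framing, the check boils down to comparing frequency times consequence against a tolerable-risk target. The sketch below is a minimal example of that arithmetic; the figures and the risk target are hypothetical and are not taken from IEC 61511 or from Koelemij's article.

```python
# Illustrative residual-risk check in the spirit of IEC 61511.
# All numbers below are hypothetical examples, not values from the standard.

def residual_risk(event_frequency_per_year: float, consequence_cost: float) -> float:
    """Residual risk = likelihood of the hazardous event x its consequence."""
    return event_frequency_per_year * consequence_cost

# Example: an event expected once per 1,000 years with a $50M consequence.
risk = residual_risk(event_frequency_per_year=1e-3, consequence_cost=50e6)

tolerable_risk = 1e5  # hypothetical corporate risk target ($ per year)
verdict = "acceptable" if risk <= tolerable_risk else "needs more risk reduction"
print(f"Residual risk: {risk:,.0f} per year -> {verdict}")
```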
CIE, by contrast, calls for action whenever a cyber scenario seems credible, regardless of how likely it is to happen. Koelemij argues that a credible event alone is not enough under IEC 61511: you have to show, quantitatively, that the risk has been reduced to a tolerable level.
So what is the alternative? Koelemij recommends a metric he calls "Probability of (defense) Failure if Attacked" (PFA), analogous to the Probability of Failure on Demand (PFD) used for safety instrumented systems. Instead of guessing how often an attack might occur, PFA estimates how likely your defenses are to fail if an attack does happen, based on known weaknesses in your system, past test results, and remaining vulnerabilities.
With PFA you stay inside the IEC 61511 framework because the risk calculation remains quantitative: you combine PFA with the potential consequence, compare the result against your tolerable-risk limit, and decide whether further protection is needed, just as you would for any other hazard.
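A minimal sketch of how such a PFA-based check could slot into the same arithmetic is shown below. The defensive layers, their failure probabilities, the consequence figure, and the risk target are all hypothetical, and the assumptions that independent layers' failure probabilities multiply (as PFDs do in layer-of-protection analysis) and that the comparison is made per attack rather than per year are mine for illustration, not necessarily Koelemij's.

```python
# Hypothetical PFA-based risk check, mirroring PFD-style arithmetic.
# Assumes independent defensive layers, so their failure probabilities multiply,
# and evaluates risk conditional on an attack occurring (per attack, not per year).

from math import prod

# Probability that each defensive layer fails if attacked (hypothetical values).
layer_pfa = {
    "network segmentation": 0.1,
    "application whitelisting": 0.2,
    "SIS hardwired interlock": 0.05,
}

pfa = prod(layer_pfa.values())           # overall Probability of (defense) Failure if Attacked
consequence_cost = 50e6                  # same consequence figure as any other hazard ($)
attack_conditional_risk = pfa * consequence_cost

tolerable_risk = 1e5                     # hypothetical risk target ($)
print(f"PFA = {pfa:.4f}, conditional risk = {attack_conditional_risk:,.0f}")
print("acceptable" if attack_conditional_risk <= tolerable_risk else "add more defenses")
```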
Koelemij's main point: cyber threats should be treated exactly like other safety hazards in process plants. We need to measure, calculate, and demonstrate that risks stay below the same safety limits; credibility alone is not enough.