Quantitative Robustness Analysis of Sensor Attacks on Cyber-Physical Systems
Merro, Massimo
2023-01-01
Abstract
This paper contributes a formal framework for quantitative analysis of bounded sensor attacks on cyber-physical systems, using the formalism of differential dynamic logic. Given a precondition and postcondition of a system, we formalize two quantitative safety notions, quantitative forward and backward safety, which respectively express (1) how strong the strongest postcondition of the system is with respect to the specified postcondition, and (2) how strong the specified precondition is with respect to the weakest precondition of the system needed to ensure the specified postcondition holds. We introduce two notions, forward and backward robustness, to characterize the robustness of a system against sensor attacks as the loss of safety. To reason about robustness, we introduce two simulation distances, forward and backward simulation distances, which are defined in terms of the behavioral distance between the original system and the system with compromised sensors. The forward and backward simulation distances, respectively, characterize upper bounds on the degree of forward and backward safety loss caused by the sensor attacks. We verify the two simulation distances by expressing them as formulas of differential dynamic logic and prove the formulas with existing tool support. We showcase an example of an autonomous vehicle that needs to avoid collisions.
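To make the quantitative notions concrete, the following is a minimal illustrative sketch in dL-style notation. The margin function \rho_Q, the attacked program \widetilde{\alpha}, and the distance d_F are assumptions introduced here for exposition only; they need not match the paper's exact definitions.

% Illustrative sketch, not the paper's definitions: \rho_Q(s) is an assumed
% real-valued margin by which state s satisfies Q, [\alpha]Q is the standard dL
% weakest precondition of program \alpha for postcondition Q, \widetilde{\alpha}
% is the system with compromised sensors, and d_F is a forward simulation
% distance bounding how far attacked behaviors deviate from original ones.
\[
  \mathrm{SafeF}(\alpha, P, Q) \;=\;
    \inf_{s \models P}\ \inf_{s' \in \mathrm{Reach}_{\alpha}(s)} \rho_Q(s')
  \qquad
  \mathrm{SafeB}(\alpha, P, Q) \;=\;
    \inf_{s \models P} \rho_{[\alpha]Q}(s)
\]
\[
  \underbrace{\mathrm{SafeF}(\alpha, P, Q) \;-\; \mathrm{SafeF}(\widetilde{\alpha}, P, Q)}_{\text{forward safety loss caused by the attack}}
  \;\le\; d_F(\alpha, \widetilde{\alpha})
\]

Under this reading, forward safety measures how deeply all states reachable from the precondition lie inside the postcondition region, backward safety measures how deeply the precondition lies inside the weakest precondition, and the simulation distance gives an upper bound on the safety lost when sensors are compromised.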