Security mistakes in information system deployment projects
Purpose ‐ This paper aims to assess the influence of a set of human and organizational factors in information system deployments on the probability that a number of security-related mistakes are present in the deployment.
Design/methodology/approach ‐ A Bayesian network (BN) is created and analyzed to model the relationship between mistakes and their causes. The BN is built by eliciting qualitative and quantitative data from experts on industrial control system deployments in the critical infrastructure domain.
Findings ‐ The data collected in this study show that domain experts share a perception of how strong the influence of human and organizational factors is: according to them, this influence is strong. The study also finds that security flaws are common in industrial control systems operating critical infrastructure.
Research limitations/implications ‐ The model presented in this study was created with the help of a number of domain experts. While they agree on its qualitative structure and quantitative parameters, future work should verify that their assessments hold more generally.
Practical implications ‐ The paper quantifies the influence of a set of important organizational and human variables on information security flaws.
Social implications ‐ The context of this study is deployments of systems that operate nations' critical infrastructure. The findings suggest that initiatives to secure such infrastructures should not be purely technical.
Originality/value ‐ Previous studies have focused either on the causes of security flaws or on the actual flaws present in installed information systems; however, little research has addressed the relationship between the two. The model presented in this paper quantifies such relationships.
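To illustrate the kind of model the abstract describes, the sketch below shows how even a two-node Bayesian network can quantify the relationship between a cause and a mistake. It is not the paper's elicited model: the factor name ("lack of a formal security review") and all probabilities are illustrative assumptions.

```python
# Minimal two-node Bayesian network sketch: one hypothetical
# organizational factor -> one security mistake.
# All probabilities are illustrative placeholders, not elicited data.

# Prior: probability that the deploying organization lacks a formal
# security review process (assumed factor, not from the paper).
p_no_review = 0.4

# Conditional probability table: P(mistake present | review state).
p_mistake_given = {
    "no_review": 0.7,  # mistake assumed more likely without a review
    "review": 0.2,     # mistake assumed less likely with a review
}

# Marginal probability of the mistake, summing over the parent node.
p_mistake = (p_no_review * p_mistake_given["no_review"]
             + (1 - p_no_review) * p_mistake_given["review"])

# Bayesian update: P(no review | mistake observed).
p_no_review_given_mistake = (
    p_no_review * p_mistake_given["no_review"]) / p_mistake

print(round(p_mistake, 3))                  # -> 0.4
print(round(p_no_review_given_mistake, 3))  # -> 0.7
```

In the paper's setting, the network would contain many such factor and mistake nodes, with structure and conditional probability tables elicited from domain experts rather than assumed.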