Research

Published Work

Peer-reviewed publications in cybersecurity, blockchain, and information systems. I serve as a reviewer for 20+ journals and conferences.

Journal Articles


The Vigilance Paradox: Automation Reliance Inside the Modern SOC

Tilbury, J., & Flowerday, S. V.

Information and Computer Security·2026

Abstract

Automation and artificial intelligence (AI) are increasingly leveraged in Security Operations Centers (SOCs) to assist security analysts in managing growing alert volumes and escalating threats. However, their rapid integration introduces the cognitive risk of automation complacency (AC), which can lead to automation bias (AB) among security analysts. This study adopted a mixed-methods approach. First, four qualitative SOC observations were conducted to validate the alert overload issue. Next, large-scale survey data (n = 696) were collected to assess the research model, investigating the interplay between the automation-induced phenomena of AB and AC, trust in automation, and dual information processing techniques among security analysts. The model was analyzed using the partial least squares (PLS) algorithm. Finally, to validate the quantitative findings, structured interviews were conducted with 29 security analysts. The study found that security analysts strategically reallocate their cognitive resources toward manual, non-automated tasks. This inadvertently leads to the complacent behavior of reduced monitoring, increasing reliance on automated results. The results show that systematic verification procedures act as a mitigating factor. This study enriches the Information Systems literature by evaluating the antecedents of AB and their effects on analysts' susceptibility to automation overreliance, deconstructing the monolithic concept of automation complacency, and empirically modeling its attitudinal and behavioral components as distinct constructs, revealing a 'tale of two complacencies'.

Read full paper

The Rationality of Automation Bias in Security Operation Centers

Tilbury, J., & Flowerday, S.

Journal of Information System Security, 20(2)·2024

Abstract

Security Operation Centers (SOCs) comprise people, processes, and technology and are responsible for protecting their respective organizations against any form of cyber incident. These teams consist of SOC analysts, ranging from Tier 1 to Tier 3. In defending against cyber-attacks, SOCs monitor and respond to alert traffic from numerous sources. However, a commonly discussed challenge is the volume of alerts that need to be assessed. To aid SOC analysts in the alert triage process, SOCs integrate automation and automated decision aids (ADAs). Research in the human-automation field has demonstrated that automation can lead to cognitive skill degradation, because human operators can become over-reliant on automated systems despite the presence of contradictory information. This cognitive bias is known as automation bias. The result of this study is the development of four critical success factors (CSFs) for the adoption of automation within SOCs in an attempt to mitigate automation bias: (1) Task-based Automation; (2) Process-based Automation; (3) Automation Performance Appraisal; and (4) SOC Analyst Training of Automated Systems. Applying these CSFs achieves a beneficial balance between the SOC analyst and the use of automation. This study promotes the human-in-the-loop approach whereby experienced and cognitively aware SOC analysts remain at the core of SOC processes.

Read full paper

Automation Bias and Complacency in Security Operation Centers

Tilbury, J., & Flowerday, S.

Computers, 13(7), 165·2024

Abstract

The volume and complexity of alerts that security operation center (SOC) analysts must manage necessitate automation. Increased automation in SOCs amplifies the risk of automation bias and complacency, whereby security analysts become over-reliant on automation and fail to seek confirmatory or contradictory information. To identify automation characteristics that assist in the mitigation of automation bias and complacency, we investigated the current and proposed application areas of automation in SOCs and discussed its implications for security analysts. A scoping review of 599 articles from four databases was conducted. The final 48 articles were reviewed by two researchers for quality control and were imported into NVivo14. Thematic analysis was performed, and the use of automation throughout the incident response lifecycle was recognized, predominantly in the detection and response phases. Artificial intelligence and machine learning solutions are increasingly prominent in SOCs, yet support for the human-in-the-loop component is evident. The research culminates in the SOC Automation Implementation Guidelines (SAIG), comprising functional and non-functional requirements for SOC automation tools that, if implemented, permit a mutually beneficial relationship between security analysts and intelligent machines. This is of practical value to human-automation researchers and SOCs striving to optimize processes. Theoretically, a continued understanding of automation bias and its components is achieved.

Read full paper

Humans and Automation: Augmenting Security Operation Centers

Tilbury, J., & Flowerday, S.

Journal of Cybersecurity and Privacy, 4(3), 388–409·2024

Abstract

The continuous integration of automated tools into security operation centers (SOCs) increases the volume of alerts for security analysts. This amplifies the risk of automation bias and complacency to the point that security analysts have reported missing, ignoring, and not acting upon critical alerts. Enhancing the SOC environment has predominantly been researched from a technical standpoint, without adequately considering the socio-technical elements. Our research fills this gap and provides practical insights for optimizing processes in SOCs. The synergy between security analysts and automation can potentially augment threat detection and response capabilities, ensuring a more robust defense if effective human-automation collaboration is established. A scoping review of 599 articles from four databases led to a final selection of 49 articles. Thematic analysis resulted in 609 coding references generated across four main themes: SOC automation challenges, automation application areas, implications on analysts, and human factor sentiment. Our findings emphasize the extent to which automation can be implemented across the incident response lifecycle. The SOC Automation Matrix represents our primary contribution to achieving a mutually beneficial relationship between analyst and machine. This matrix describes the properties of four distinct human-automation combinations and is of practical value to SOCs striving to optimize their processes.

Read full paper

Conference Papers


Human-Automation Preferences in Security Operations Centers

Tilbury, J., & Flowerday, S.

MWAIS 2025 Proceedings·2025

Abstract

In automation-heavy security operations centers (SOCs), analysts face the risks of automation bias and complacency. We surveyed 720 analysts and analyzed the data using PLS-SEM to explore preferences for human versus automated task handling at two different stages of the incident response lifecycle. Results exposed a stark dichotomy: analysts favored entirely human-driven or fully automated processes, expressing little desire for intermediate collaborative approaches. This presents a challenge for integrating necessary automation effectively. Gradual, evolutionary implementation is advised over revolutionary, disruptive changes to encourage acceptance.

Read full paper

Cybersecurity in the Age of Uncertainty: A Call for Resilient and Antifragile Systems

Flowerday, S. V., Tilbury, J. L., & Higgs, J.

AMCIS 2024 Proceedings·2024

Abstract

To date, research on incident response has predominantly focused on system resilience in terms of recovery mechanisms, falling short of discussing how systems improve post-disruption. This work offers a novel conceptual investigation into systems that improve because of disruption experienced within the cybersecurity context: antifragile systems. Chaos engineering represents a prominent method for resilience engineering, accomplished by exposing systems to short-term stressors in a controlled environment to establish long-term sustainability. This paper contributes to the cybersecurity incident response literature by reframing how resilient systems are defined. The main contribution is the Resilient Systems Model, defining five system classifications: fragile, reliable, robust, recovery, and antifragile. This is necessary because these systems are often defined incorrectly and applied inconsistently. Organizational systems must strive to be as close to the top of the model as possible, fostering anticipatory practices, system improvement, and controlled experiments that stimulate learning.

Read full paper

Epidemiology Triad Analysis Guiding Malware Control Expenditure

Flowerday, S. V., Higgs, J., Flowerday, E., & Tilbury, J. L.

AMCIS 2024 Proceedings·2024

Abstract

Malware-related threats continue to pose significant challenges for protecting organizational perimeters. Proposed solutions for defending against malware threats are often devised without considering the socio-technical environment in which they are implemented. This paper argues that by treating malware-related threats as an epidemic, a framework is provided for formulating targeted security controls to prevent and disrupt malware-related attacks. This paper adapts the well-known epidemiology triad into a decision-making and risk-quantification model to assist organizations in formulating security controls. The proposed model can be used to formulate security controls, quantify the probability of a cyber infection, and assess the economic feasibility of the formulated controls. This model constitutes a novel contribution to the knowledge base, demonstrating the potential for fruitful discourse and engagement to co-exist at the boundaries of disciplines.

Read full paper

Business Process Models of Blockchain and South African Real Estate Transactions

Tilbury, J. L., de la Rey, E., & van der Schyff, K.

2019 International Conference on Advances in Big Data, Computing and Data Communication Systems (icABCD). IEEE·2019

Abstract

The current real estate purchasing process in the South African sector can be described as inefficient due to its heavy reliance on multiple third parties, which results in high transaction costs and prolongs the time in which property transactions are completed. Additionally, the extensive manual review and verification of financial and legal documents, as well as the manual updating of multiple systems with redundant information, not only takes time but is also prone to error and fraudulent activity. Blockchain technology presents an opportunity for the real estate sector as it has the potential to bring about more efficient transactions. This study examines two approaches to executing real estate transactions: the South African case and an international blockchain technology use case. Two conceptual models are presented using Business Process Model and Notation. The findings show that the South African real estate transaction process is inefficient because it is manual, involves paper-based documents, and relies heavily on third parties, resulting in numerous bottlenecks. The study revealed that blockchain-based transactions are more efficient and reduce reliance on third parties and manual processes. The study contributes two conceptual models illustrating how the two different processes are conducted, as well as a list of the challenges and opportunities related to blockchain-based real estate transactions.

Read full paper

Peer Review

I serve as a trusted peer reviewer for more than 20 academic journals and conferences across cybersecurity, information systems, and blockchain research.