
From usable design characteristics to usable information security policies: a reconceptualisation

  • Dennis Lawo and Gunnar Stevens
Published/Copyright: March 26, 2025

Abstract

Information Security Policies (ISPs) are crucial artefacts in organisations, governments, and civil societies to mitigate information security threats and risks. However, poorly designed ISPs can lead to hidden costs and decreased compliance in daily practices. While behavioural factors such as social norms, positive attitudes, and knowledge are well known to influence compliance, the usability of ISPs, which takes the context of use seriously, remains understudied. To address this, we introduce the concept of the Usable Information Security Policy (UISP). This concept is derived from the argument that usability is not just about the usable design of the document itself, but a relational property of the ISP in a specific context of regulation. We argue that UISPs integrate usability as an inherent feature of policies alongside compliance. Based on this, an extended scope of content, adapted policy management methods, and strong alignment with said context are required. Our research provides implications for theory and practice. By providing a new concept for engagement, including a research agenda, we give usable security research a new tool to strengthen the alignment, and thus the protection, between socio-technical contexts and artefacts. For practitioners, the concept offers initial guidance on how to incorporate usability more strongly into otherwise formal policy-making processes.

1 Introduction

Advancing digitalisation and ubiquitous computing have fundamentally changed our societies. The increasing penetration of everyday life by technologies such as the internet of things (IoT), artificial intelligence (AI) and cloud computing is opening up new opportunities – but also new risks. Cyber attacks, data leaks, and security breaches are the order of the day and threaten information security, but also the well-being and health of users. 1 Organisations 2 and governments 3 are facing the challenge of protecting themselves and the users of IT systems from these threats. Moreover, governmental, organisational, and civil spheres are highly interconnected when it comes to issues of security. 4 , 5 , 6 , 7

In this context, Information Security Policies (ISPs) are becoming increasingly important. 8 , 9 These policies are established to ensure that digital systems are designed to be robust, reliable, and secure, and that users comply with the boundaries and secure usage conditions. 10 Until now, such measures have been limited predominantly to the organisational sphere, i.e. companies, offices, and institutions, although the objectives and functions of ISPs in companies, at the governmental level, and at the civil level are similar – ISPs focus on protecting against cyber threats, ensuring the integrity of data and systems, and preventing damage. 11 For example, in Europe, we are witnessing the emergence of new laws and regulations governing the use of digital technologies 12 , 13 such as artificial intelligence based on the AI Act. 14

Usability is a key challenge when implementing such policies. 15 , 16 , 17 , 18 The regulatees – be it a whole organisation, a developer, or an individual user – must be able to understand the guidelines and implement them in practice to comply with the ISP. Comprehensibility and practicability are therefore decisive factors for the success of security measures; for example, over half of security breaches result from non-compliant security behaviour. 19 Current approaches to ISPs often examine usable design characteristics but fail to consider the context of regulation, use, and practice. ISPs are unlikely to succeed when they neglect to focus on individuals and their work routines or everyday practices. 20 , 21 , 22

Based on these considerations, it is necessary to revisit and reconceptualise ISPs. In this paper, we adopt a conceptual research approach to systematically analyse the role of usability in ISPs. Following the tradition of conceptual research, e.g., 23 , 24 we aim to develop a theoretical argument for integrating usability as a central feature of ISPs. We do not conduct empirical data collection, but instead build on existing literature and theoretical frameworks to develop a novel perspective on ISPs. Specifically, we understand usability not just as the application of usable design characteristics to ISP design, but as a relational property between ISPs as an artefact and their context. 25 , 26

To ground our approach within the Human-Computer Interaction (HCI) research landscape, we align our contribution with the conceptual (see, e.g. 24 ) and theoretical contribution types outlined in the HCI literature. 27 This paper aims to refine the theoretical understanding of ISPs by critically synthesising literature and identifying conceptual gaps, thus following an argumentative approach based on a selective literature review. By deliberately selecting literature that addresses both usability and ISPs, we construct a foundation for rethinking ISPs. This method allows us to develop a well-founded argument that moves beyond isolated usability recommendations and instead integrates usability as a core feature of ISPs in their socio-technical context.

To this end, this conceptual paper provides an overview of ISPs, introduces usable design characteristics for such documents, and argues for a stronger focus on context in the usable design of ISPs to enhance their usability and, ultimately, security. Moreover, we outline a research agenda that addresses gaps and improvement opportunities to increase usability. We first examine ISPs, their foundations, and compliance aspects, followed by an extended scope that includes ISPs at governmental and civil levels. We then emphasise usability as an inherent feature alongside compliance and propose the reconceptualisation of ISPs as Usable Information Security Policies (UISPs). Based on this analysis, we examine challenges from multiple perspectives, highlighting various issues, and present a research agenda to refine the UISP concept. Finally, we discuss the implications for both theory and practice.

2 Information Security Policies

2.1 Organisational Information Security Policies

2.1.1 Foundations

ISPs are crucial organisational artefacts designed to ensure expected security behaviours. While most organisations have established some form of ISP, 28 the content varies based on organisational values, information sensitivity, and regulatory requirements. 29 The primary objective of an ISP is to ensure data confidentiality, integrity, and availability. 30 However, beyond this objective, ISPs lack a standardised definition due to the varying scopes and concepts. 31 Consequently, ISPs are commonly categorised into three levels: 32

Enterprise-Specific ISP: Also known as the security programme policy, this executive-level document articulates strategic directions for information security across all departments, projects, and systems within the organisation. These policies are primarily strategic, guiding the overall development, implementation, and management of information security efforts, such as security programmes. Their main objective is to meet regulatory requirements by demonstrating a comprehensive security programme, rather than providing day-to-day guidance for employees. 32

System-Specific ISP: These policies focus on the security architecture of technical systems. Unlike enterprise and issue-specific policies, system-specific security policies are not distributed to non-IT employees but are aimed at IT professionals managing the organisation’s IT infrastructure. They offer clear guidance for daily administrative tasks, translating managerial directives into specific standards for IT configuration and maintenance. System configurations based on these policies, like access control lists, are also considered system-specific ISPs. 32

Issue-Specific ISP: These policies address particular areas of information technology, such as employees’ devices, network infrastructure, or specific applications such as browsers or email clients. They also cover the usage of external websites, including social media. 31 , 32 Issue-specific ISPs set formal procedures, guidelines, roles, and responsibilities that employees must follow to protect and properly use organisational information and technology resources. These policies may also include penalties for non-compliance. 32 , 33
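
To make the system-specific level tangible: such policies are often operationalised directly as machine-readable configuration. The following minimal Python sketch (our illustration; the roles, resources, and permissions are hypothetical and not taken from any real policy) encodes a fragment of a system-specific ISP as an access control list and checks requests against it:

# Minimal sketch: a system-specific ISP fragment encoded as an access control list.
# Roles, resources, and permissions are illustrative examples, not a real policy.
ACL = {
    "hr_share":     {"hr_staff": {"read", "write"}, "it_admin": {"read"}},
    "mail_gateway": {"it_admin": {"read", "write", "configure"}},
    "public_wiki":  {"hr_staff": {"read"}, "it_admin": {"read", "write"}},
}

def is_permitted(role: str, resource: str, action: str) -> bool:
    """Return True if the ACL derived from the policy allows the request."""
    return action in ACL.get(resource, {}).get(role, set())

# Example checks: an HR employee reconfiguring the mail gateway is denied.
print(is_permitted("hr_staff", "mail_gateway", "configure"))  # False
print(is_permitted("it_admin", "mail_gateway", "configure"))  # True

In this reading, the deployed configuration is itself part of the policy artefact, which is why inconsistencies between the written document and the configuration directly affect usability and compliance.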

2.1.2 Compliance

The specification of an ISP is a crucial initial step in safeguarding an organisation’s information technology. However, the mere presence of an ISP does not ensure compliant behaviour; active commitment from employees is essential. Historically, security incidents caused by a lack of commitment were termed “insider threats”. 20 This terminology must be used cautiously, as it is essential to distinguish between criminal behaviour, intentional deviant behaviour, insider misbehaviour, and simple mistakes. 34 In security research, end users often have a negative image, commonly being ascribed bad habits such as lack of awareness, carelessness, ignorance, negligence, apathy, mischief, and resistance as prime reasons for misbehaviour and errors. 21 , 35 , 36 , 37 , 38 Yet, in most cases, users are not the enemy; 39 often, they are simply trying to focus on their primary work tasks, such as accounting or project management. 40

Instead of blaming individuals, a deeper understanding of the underlying reasons for non-compliant behaviour is necessary. Various theories of compliance behaviour are proposed in the literature. 41 These theories exhibit considerable overlap but differ in their specific focus and nuances. Below is an overview of the most important explanatory models:

Negative Motivators and Punishment: Negative motivation 42 , 43 , 44 , 45 , 46 in the form of threats, punishments, and severe sanctions aims to increase the cost of non-compliant behaviour. Through these measures, non-compliance becomes irrational when pros and cons are weighed rationally. ISPs may also serve as negative motivators by implying that non-compliant behaviour signifies a lack of loyalty, solidarity, and ethical standards.

Punishments, sometimes mentioned in ISPs, range from warnings to severe measures, such as job termination or legal action against the employee. However, it is crucial to note that while punitive measures may deter non-compliance, they may not necessarily foster the desired behaviour, especially in the long term.

Positive Motivators and Rewards: Positive motivation 42 , 47 , 48 , 49 focusses on encouraging compliant behaviour rather than suppressing deviant behaviours. This motivation stems from a positive attitude and normative beliefs, such as a personal commitment to ethical behaviour or a sense of responsibility for the organisation’s information security. Additionally, psychological factors such as perceived behavioural control and self-efficacy play an important role. These factors influence an individual’s perception of their ability to comply with the ISP, their confidence in doing so, and their belief in the effectiveness of compliance measures, ultimately influencing their motivation for compliant behaviour.

Facilitating Conditions: Facilitating conditions 15 , 28 , 50 focus on promoting compliant behaviours through measures such as increasing awareness, education, and training. These enhance the “ease of compliance” and reduce barriers. Additionally, environmental factors such as organisational information security culture, support, and technical assistance further promote compliant behaviour. Enhancing ISP comprehensiveness and readability, and making them easy to use, will also facilitate compliance.

2.2 Landscape of policies

Besides organisational ISPs, governments and civil organisations also issue policies, i.e., “document(s) regulating human actions regarding information security or expressing the organization’s [or contexts] information security aims”. 10 Based on this broad definition, we broaden our view to also reflect on the usability and compliance of governmental and civil ISPs.

2.2.1 Governmental laws & regulations

“The proliferation of digital technologies has expanded the opportunities for data and knowledge exchange, 51 , 52 yet it also presents new challenges for governance.” 53

Although technology regulation is not a new topic, the emergence of new digital technologies poses new challenges to governments around the world. 53 A particular challenge in regulating these new digital technologies lies in ensuring information security within an ever more connected society as well as ensuring safety during use, e.g., of high-risk AI or robotics. 54 , 55 Governmental ISPs are an essential part of providing governance and guidance to the economy as well as to users. 54 , 56 Their primary objective is quite similar to that of organisational ISPs, but their scope is much broader than the narrow focus on a particular organisation.

2.2.1.1 Foundations

National Security Policy: National security policies are usually high-level documents that outline the strategic directions for information security at the national (or supra-national) level. These documents establish the foundation for all subsequent policies and regulations. 3 , 11 , 57 They aim to ensure a cohesive approach to information security across national legislation. These policies are more strategic in nature and aim to protect national interests by establishing a comprehensive security framework. 11 , 58 They provide a basis for developing sector-specific or issue-specific policies that are meant to provide more specific regulation.

Sector-Specific Policies: Sector-Specific Policies are tailored to the unique situation of different sectors, 59 such as healthcare, 60 finance, public sector, 61 or critical infrastructure. 62 These policies provide more detailed governance and guidance on security measures that are relevant to the specific sector. They address the unique risks and regulatory requirements of the sector in focus. 59 Hence, they aim to ensure that organisations within these sectors implement appropriate security mechanisms when developing or operating technologies. Thus, the policies are directed at both IT professionals and non-IT employees, providing clear guidelines for maintaining security in their respective domains.

Technology-Specific Policies: Besides high-level strategic documents and governance for specific sectors, some technologies are regulated by technology-specific policies, e.g., an AI-specific policy such as the AI-Act. 14 These policies provide more specific instructions on the implementation, management, and use of these technologies with respect to information security. 11 They are designed to translate high-level strategic directives into actionable structures that can be followed by the regulated entity.

2.2.1.2 Compliance

The establishment of governmental policies on information security is a crucial initial step in safeguarding national information technology infrastructure. 11 , 58 However, the mere existence of these policies does not ensure compliant behaviour. Instead, the active commitment of organisations and individuals is essential. Similar to ISPs at the organisational level, compliance with governmental policies is influenced by a variety of factors.

Enforcement by Deterrence: Enforcing compliance by deterrence is an old concept, 63 which is based on utilitarian theories. The basic idea is that rational actors comply with policies by weighing the expected benefit of non-compliance against the expected punishment and the probability of detection. 64 Thus, to regulate rational entities, e.g., corporations or individuals, the regulator should either increase detection rates or the severity of punishments. 65 The basic assumption of this approach is also supported by empirical evidence. 66 , 67 , 68
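
The underlying calculus can be made explicit with a small worked example (our formalisation for illustration, not a formula taken from the cited works). Let B denote the benefit of non-compliance, p the probability of detection, and S the severity of the sanction. A purely rational regulatee violates the policy only if B > p · S, i.e. if the expected sanction does not outweigh the gain; deterrence therefore operates by raising p or S until this inequality no longer holds. For instance, if circumventing a security control saves about 10 minutes of effort per week, while detection occurs with a probability of 5 % and the sanction is experienced as a loss of 1,000 minutes, the expected cost of 0.05 × 1,000 = 50 minutes exceeds the benefit, and compliance becomes the rational choice.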

However, other research suggests that this formula of sanction severity and detection probability has only a minor influence on the behaviour of regulatees. For example, the utilitarian view does not explain the high level of voluntary compliance in the face of low detection rates in some domains. Moreover, enforcing compliance through punishment and detection is quite costly. 64 We can assume that, especially in our domain of interest, increasing detection rates is expensive, as IT experts would need to be hired. In addition, research has shown that the use of sanctions, especially when perceived as illegitimate, can lead to even less compliance. 69

Enforcement by Cooperation: In response to the high costs and potential negative effects of deterrence, cooperative approaches to regulation became a popular tool. 64 In contrast to a rational actor-based view, this approach views the regulatee as a social actor who has a basic attitude of compliance with the law, because they are socialised by, and believe in, the rule of law and its value for their well-being. 70

Instead of following a simple formula of punishment and detection, this approach aims to establish cooperation between the regulator and the regulatee, 71 as non-compliance is more frequently based on ignorance of the regulations, negligence, incompetence, or disagreement. 70 Regulators should therefore act as service providers that support the compliance of the regulatee, 70 facilitating compliance by providing information and education to regulatees to increase comprehension, incentivising compliance, e.g., through fewer inspections, and offering assistance and additional information on operationalising the regulation. 72

2.2.2 Civil guidelines & recommendations

2.2.2.1 Foundations

“The increased use of digital technologies and services brings with it a similarly increasing requirement for their end-users to have the awareness and ability to protect the security and privacy of their devices and data. However, this raises the related questions of what they need to know and from where they may obtain related guidance.” 73

Civil guidelines and recommendations on information security come in various forms and with various functions. Thus, they represent the most diverse set of policies. Nevertheless, we will try to provide a broad categorisation based on the regulator – regulatee relationship.[1]

Governmental Recommendations & Guidelines: This type of document directly relates to the government as well as to the governmental policies introduced in the previous section. In contrast to policies that are operationalised as law, guidelines and recommendations have no binding character, but attempt to govern based on standards, reference architectures, or best practices. 73 Examples of such recommendations are guidelines for groups or individuals provided by national information security agencies, e.g., the guidelines issued by the German Federal Office for Information Security, which, for instance, provides recommendations for individuals on securing their smart home. 74

Product Manuals & Documentation: Besides governmental agencies, the producers of IT products provide documents that aim to increase information security for their customers when using their products. 73 , 75 Often, the manuals of these products include specific recommendations on how to operate the IT product securely and safely. Still, it is well known that these often poorly written documents are a barrier to secure usage. 75 , 76

Community Recommendations & Guidelines: Lastly, individuals, non-governmental organisations, or associations might publish recommendations and guidelines to increase information security and safety. 73 Depending on the regulator, these policy-like artefacts take different forms and focus on certain products or domains, such as social media. For example, users might provide best practices for others, or a consumer protection NGO might publish recommendations for a certain category of products.

2.2.2.2 Compliance

In private contexts, compliance with information security guidelines and recommendations is a matter of positive motivators and facilitating factors, as there is no regulatory or contractual relationship between the regulator and the regulatee.

Positive Motivators: While we are not aware of any behavioural research on ISPs as recommendations or guidelines in a civilian context, there are adjacent areas and artefacts that give us an idea of the prerequisites for compliance. Security awareness training is a prominent example. Here, for example, gamification approaches increase the motivation to participate and engage with such policies. 77 , 78 , 79 The policy as such does not motivate; rather, the engagement with it and the resulting awareness, an important compliance factor, are stimulated. Another motivator, well known from the end-user or consumer segment, is nudging to motivate certain security behaviours. 80 For example, artefacts could nudge users into choosing more secure settings for their devices or creating longer passwords.
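
As a minimal illustration of such a nudge (our sketch; the 16-character threshold and the wording are arbitrary assumptions, not taken from the cited studies), a dialogue could gently steer users towards longer passphrases without blocking them:

# Minimal sketch of a password-length nudge: it suggests, but does not enforce.
def password_nudge(password: str, suggested_length: int = 16) -> str:
    if len(password) >= suggested_length:
        return "Nice - a long passphrase is one of the best protections."
    missing = suggested_length - len(password)
    return (f"This password works, but adding {missing} more characters "
            f"(e.g. an easy-to-remember phrase) would make it much stronger.")

print(password_nudge("correct horse"))                   # nudges towards a longer phrase
print(password_nudge("correct horse battery staple"))    # positive reinforcement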

Facilitating Factors: In general, from usable security and privacy research, we are well aware of the importance of increased usability of security features as a foundation for safe usage. 81 , 82 , 83 However, there are also other textual artefacts that allow us to better understand the required usability features of policies. For example, research found that the understanding of complex privacy policies of websites can be facilitated by security icons as a simple and summarising tool. 84 , 85 , 86

Within this conceptual research, we focus on all human-facing ISPs, e.g., issue-specific ISPs guiding employees’ client usage, sector-specific ISPs (regulations) guiding IT professionals’ work, or handbooks explaining safe usage to end users, as their usability, as we will argue below, has the strongest implications for the relation between human behaviour, the ISP, and the context.

3 From usable design characteristics to usable information security policies

The prevailing discourse on ISPs, as shown above, has largely revolved around compliance. We acknowledge the traditional concept of ISPs, primarily characterised by standards, strict adherence to established procedures, best practices, and regulatory frameworks. 10 Our proposal aims to build upon and expand this established understanding, integrating usability without diminishing the regulatory component. In the rapidly evolving digital landscape, a rigid focus on compliance can often overlook the practical usability of these policies for end users, as well as local requirements arising from the context.

Our approach follows the assumption that usability is not just an inherent design characteristic of an artefact, but an emergent relational property that depends on the interactions of users, artefacts, tasks, and environments. 25 , 26 Any change to the design of the artefacts, the goals, the context of use, or the user impacts usability. 87 Therefore, even well-designed artefacts can exhibit a lack of usability if there is a misalignment with regard to goals, context, or users. 88 From this stance, our focus is not only on the usable design characteristics of ISPs as artefacts, but on the usability of the interaction of ISPs with their contextual features (see Figure 1).

Figure 1: From usable design characteristics to usable information security policies.

3.1 Usable design characteristics of ISPs

Usable design of ISPs can be considered a facilitating factor, 89 although most ISPs are based on compliance-focused content and traditional formulation methods that do not consider such design characteristics. 10 Despite this disconnect, the usable design of ISPs can be studied from different perspectives, 15 , 18 such as ISP Content, ISP Presentation, and ISP Management. This research mostly stems from the organisational context; however, some work has also focused on the broader landscape of policies. Below is a brief overview of these perspectives:

ISP Content: The content of a policy refers to instructions, guidelines, or regulations defined to direct or control specific behaviour. Issue-specific ISPs must also be congruent with the overarching enterprise ISP. 28 , 90 , 91 Inconsistencies could lead to insecurities and wrong interpretations of how to comply with the ISP.

The ISP should provide actionable advice 28 , 60 , 92 , 93 , 94 to guide employees toward secure use of technologies. This includes managing, saving, and protecting assets in alignment with organisational information security risks 92 and goals. 28 Guidance should also be applicable in uncertain scenarios. 95

ISPs should consider the work context and be adjusted to match employees’ work routines and procedures. 96 This includes considering the organisational risks, and specific risks of employees’ work environments and routines. 92 , 97

Policies and prescribed measures must align with the responsibilities, roles, and the vertical/horizontal hierarchy of the organisation. Clear responsibilities and expectations for the roles involved should be communicated, 91 , 98 including sanctions for security incidents and policy breaches. 98 , 99

Research on governmental policies especially supports such design features, as it, e.g., shows the importance of comprehensive categories 55 , 100 or well-defined terms or conditions. 100

ISP Presentation: The presentation of an ISP refers to the style of its communication and visual representation to the intended audience. This includes the format, language, structure, and visual elements used to convey the ISP’s content and directives. A clear structure and communicative strategy are needed to avoid information overload. 90 , 101 , 102 The user should be able to identify the current version of the ISP, 99 the relevant chapters for task-specific issues, 90 , 99 and the overall scope of the document. 103

The writing style must be direct, clear, and understandable. 104 Terms should be clearly defined, especially those uncommon in the work context or with varying or ambiguous meanings. 90 , 105 The challenge lies in reducing the complexity of details, making them user-centred to avoid being ignored. 106 A well-designed ISP should have a visual appearance relevant to its readers, be easy to understand and read, 15 , 28 and be appropriate in length and tone. 107 Achieving a balance between aesthetics, transparency, and intuitiveness is crucial to facilitate employees’ adoption of security measures. 104

Conceptual frameworks, such as the Consolidated Communication-Human Information Processing (C–HIP), help align the presentation with human mental information processing. 108 This includes adapting the presentation to the specific language skills, educational levels, and communication practices of the target audience. 31 , 60 , 91 Large organisations with diverse groups may need to present the same content in multiple ways. 60

Although there is little research on the presentation of ISP content in governmental or civil contexts, 73 adjacent research 109 allows us to get a glimpse of presentation issues in this context. Previous research supported the importance of comprehensiveness 110 and understandability. 111 Moreover, it highlights awareness and findability of information. 110 , 111

Other research highlights the alignment with users’ education and ability. 109 Usable handling and efficiency should be enabled by security and privacy artefacts. 109 , 111 A level of accessibility for non-experts should be ensured. 109 , 112

ISP Management: ISP management refers to the process of developing, maintaining, and updating security policies and procedures. Effective ISP management must provide up-to-date information. Outdated information can lead to gaps in addressing risks 107 arising from changes in technology, work procedures, organisational structure, culture, goals, or legal requirements. 90 , 92 , 94 , 113 ISPs must be regularly reviewed and aligned with the organisation’s objectives and overarching documents. 91 , 92 , 94 , 95 , 113 A mismatch between documents could lead to conflicts and decrease usability, requiring users to determine which requirements are accurate and relevant for compliance.

Neither organisational ISP methods nor design methods for governmental regulations 114 consider usable design characteristics within their processes. These are mostly oriented towards formal comprehensiveness and compliance goals. 91 , 92 , 94 , 95 , 113 This is why Jakobi et al. 16 argue that a more usability-oriented process of policy management could improve presentation, e.g., by providing policy evaluations to design more easily consumable legal documents and interactions with them.

3.2 Usable information security policies

To bridge the gap between designing ISPs based on usable design characteristics and the contextual requirements, we argue for a stronger integration of usability as a relational property. 17 , 25 , 26 The following examples show how traditional approaches to the design of ISPs often fail because their focus is on comprehensive content and presentation, as well as alignment with strategic goals and other documents, instead of deep engagement with the context of application.

  1. Organisational ISPs: Lawo and Stevens 17 show how a policy that demands the encryption of mails is not usable when the work context, especially the mail client and the crypto tool provided by the organisation, is not well aligned with the policy. For example, the crypto tool does not encrypt mails automatically by default but requires additional effort and awareness. Thus, users probably do not encrypt, which results in a lack of effectiveness of the policy. This is in line with previous research on the poor usability of tools for email encryption and their lack of integration into communication routines. 115

  2. Governmental ISPs: Alhazmi et al. 116 report that it is difficult for developers to understand the principles of the GDPR and implement them accordingly. Developers are, of course, legally obliged to implement the law, but due to a lack of understanding it cannot always be implemented in accordance with the regulation. As a result, the GDPR does not achieve its full protective effect. More guiding formulations would support developers in creating fully compliant implementations.

  3. Civil ISPs: A recommendation for securing the smart home, issued by the German Federal Office for Information Security, 74 states that consumers should set up a separate network for their smart home devices. Here, we can safely assume that most users do not have the expertise to carry out the necessary configuration of their router. An alternative formulation, ‘Only connect your smart home devices to the guest Wi-Fi’, would probably increase usability, as such a guest Wi-Fi is available by default in most routers. Moreover, the security-increasing network separation would be achieved regardless of whether a guest Wi-Fi or a newly created IoT Wi-Fi is used for device separation.

The examples show that usable design characteristics alone are not enough to make a policy effective. It is crucial that mismatches between the policy and its application context are recognised and continuously addressed. For example, an encryption obligation remains ineffective if the provided tool is not integrated by default and requires additional effort. The GDPR cannot fully develop its protective effect if developers have difficulties implementing its principles in practice. And a recommendation on network separation in the smart home is useless if it fails due to users’ lack of technical expertise. Thus, usability is not, and should not be understood as, just a question of good policy design; it is a relational property between policy and context of use. 25 , 26

In addition, the inherent complexity of modern digital ecosystems requires flexible and adaptable policies for different contexts. 11 , 73 ISPs, although legally sound in structure, often do not take into account the diverse user interactions with technology or other contextual features. This discrepancy can result in policies that are sound in theory but fail in practical application, leading to vulnerabilities and security breaches. 17 A usable security approach recognises this by prioritising usability and ensuring that policies are not only clear and understandable, but also actionable in real-world scenarios. 28 , 60 , 92 , 93 , 94 By incorporating user feedback and iterative design principles, usable security policies can adapt to the changing technological landscape and user needs.

In the following, we argue for a fundamental change, UISP Context Alignment, as well as subsequent changes in UISP Content and UISP Management, to move towards more usable ISPs. Usability and compliance should be considered inherent features of ISPs. These features should not replace the protection goal of the policy, but should inform the whole lifecycle of the artefact as well as its management. For our argument, we will not revisit ISP presentation, as our goal is to move beyond usable design characteristics. Still, the research presented in Section 3.1 remains valid.

UISP Context Alignment: Traditional ISPs primarily focus on compliance, treating socio-technical practices as objects of monitoring and deterrence. However, usability and appropriation research reveal that contextual factors significantly influence the actual use of IT artefacts, often creating a gap between design intentions and real-world application. 87 , 117 To address this, we argue that UISPs must be embedded within the socio-technical contexts they aim to regulate, considering the dynamic interactions between users, tasks, and environments.

Current ISP approaches tend to overlook these contextual factors, focusing solely on compliance rather than on addressing real-world practices. This oversight limits the effectiveness of policies, as they fail to account for how users actually interact with technology in their specific environments. While traditional policy-making aims to involve society, 118 it often disregards the nuances of regulatees’ lived experiences and practices.

To improve ISPs, we propose integrating policy with socio-technical practices, emphasising the need for long-term, usability-informed evaluations. These evaluations would help track how artefacts become embedded in evolving practices and contexts, acknowledging the dual relationship between practices and artefacts. 119 For example, the introduction of an ISP may require new competencies from users or lead to changes in the workplace practices themselves, 120 , 121 such as the adoption of new tools like encryption software. 122

Given these considerations, we define UISPs as artefacts that are not only designed but also evaluated and managed in the context of users’ practices. These policies should be adaptable, ensuring they align with the socio-technical environment in which they operate. Changes in the policy or context can influence both the artefacts and the users, creating a feedback loop that requires continuous monitoring and adjustment.

Definition: UISPs are embedded in the socio-technical contexts of users and organisations, addressing both security and practical usability concerns. While traditional ISPs often overlook the influence of context on policy effectiveness, UISPs recognise the dynamic relationship between contexts, user practices, and the artefacts in use, ensuring that policies are aligned with real-world tasks and conditions.

UISP Content: Traditionally, ISPs have focused primarily on organisational regulatory compliance, emphasising confidentiality, integrity, and availability of information. 10 However, these practices often overlook the usability aspect crucial for ensuring that policies are not only adhered to but also effectively utilised in daily work environments. 96 We therefore argue that UISPs extend the concept of ISP content by ensuring that policies fit the regulatees’ practices, tools, and contextual needs. For example, in an organisational context, UISPs consider the tools available as well as the work routines. 60 , 90 , 96

Aligning ISPs and the regulated context is critical to ensure compliance and minimise barriers such as techno-stress, distractions, and interruptions. 123 , 124 For example, if work environments are misaligned with the policies, employees are more likely to deviate from security directives. Workplace safety regulations, such as the EU Directive on Workplace Safety and the US OSH Act, already emphasise that protective measures must be compatible with the work conditions to minimise interference and physical or psychological harm. 125 , 126 , 127 Thus, it is the responsibility of employers to ensure that work environments and IT systems are designed to facilitate seamless compliance with ISPs. For example, if encrypted email communication is required, employers should provide a tool that is well-integrated into the existing email client, used by default, and easy for employees to use without additional effort.
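
What ‘compliance by default’ could look like at the tool level is sketched below (our illustration; the key store and the encryption step are placeholders, not real cryptography or a real mail API): the wrapper encrypts automatically whenever a recipient key is available and fails safely, rather than silently sending plaintext, when it is not.

# Conceptual sketch of a secure-by-default mail wrapper: encryption is applied
# automatically, so complying with the ISP requires no extra user action.
# The key store and the "encryption" are placeholders, not real cryptography.
KEYSTORE = {"alice@example.org": "ALICE-PUBLIC-KEY"}  # hypothetical key directory

def send_mail(recipient: str, body: str) -> str:
    key = KEYSTORE.get(recipient)
    if key is None:
        # Fail safely instead of silently violating the encryption policy.
        return f"NOT SENT: no key for {recipient}; request one before sending."
    encrypted_body = f"<encrypted with {key}>{body}</encrypted>"  # stub only
    return f"SENT (encrypted) to {recipient}: {encrypted_body}"

print(send_mail("alice@example.org", "Quarterly figures attached."))
print(send_mail("bob@example.org", "Quarterly figures attached."))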

As digital technologies, such as AI, IoT, and robotics, blur the lines between physical safety and information security, new safety risks [2] emerge in these contexts of regulation. 128 Governmental regulations are adapting to this shift, as seen with the European AI Act, which combines safety and security considerations. 55 , 129 Similarly, manufacturers’ ISPs are expanding to address both operational and information technologies due to their growing interconnection. 130 , 131 Numerous examples show safety risks stemming from security risks, such as hacker-induced energy supply failures 132 or the takeover of transport systems. 133 A stronger conceptual link between safety and security is necessary. 131 , 134

However, these evolving ISPs often suffer from fragmentation, inconsistent terminology, and disjointed document structures. From a usability perspective, the multitude of policies and approaches exacerbates existing problems, making it challenging for regulatees to ensure compliance. 134 , 135 UISPs address this by harmonising terminology and integrating security and safety considerations across multiple levels. This enables a more cohesive and actionable approach to risk mitigation that centres around the user of a certain technology and all risks arising from it, no matter whether digital or physical. 131 , 134 , 136

Research highlights the interconnectedness of governments, organisations, and civil society in security and sovereignty issues 4 , 5 , 6 , 7 and scholars advocate for an architectural approach to national information security, linking policies across different levels. 11 , 73 , 137 As digitalisation intensifies, the regulatory landscape expands, 12 , 13 and civil associations are increasing their policy recommendations. However, ISPs often lack an overarching architecture, leading to methodological and content fragmentation among the levels. 11

An integrated architecture[3] for UISPs ensures that terminology is consistent and aligned with overarching protection goals, thus improving usability for the regulatee, 28 , 90 who is regulated by multiple concurrent policies within the same context. Independent and uncoordinated ISPs covering the same content hinder compliance and shift the burden of understanding to users. 11 , 73 Aligned content minimises user confusion, enhances compliance, and supports a unified, coordinated strategy across all levels of policy-making. 11 , 73 Without such integration, both governmental and organisational ISPs risk inefficiency, as regulatees might face different recommendations from different policies, which undermines the effectiveness of protective measures. 10

We propose embedding UISPs – ”document(s) regulating human actions regarding information security or expressing the organization’s [or context’s] information security aims” 10 – into a multi-level architecture, combining governmental, organisational, and civil guidelines into a cohesive security framework. 10 This integrated approach will facilitate both compliance and usability, fostering a more effective system of protection against emerging risks in the interconnected digital landscape.

Definition: UISPs extend traditional ISPs by integrating security and safety goals into a unified user-centred framework within a structured architecture. Unlike ISPs, UISPs’ content accounts for the socio-technical contexts in which security practices are applied, ensuring that policies are not only legally compliant but also practically usable, adaptable, and integrated across different organisational and societal levels.

UISP Management: The shift from ISPs to UISPs introduces several methodological challenges. Usability, as an emergent property from interactions among users, tasks, products, and environments, requires iterative and user-centred design approaches. 25 , 26 While usability evaluations traditionally rely on qualitative methods such as interviews, observations, and questionnaires, 15 , 138 , 139 there is a lack of standardised, transferable methods for large-scale, legal-compliant usability evaluation of ISPs. 17

A key barrier to improving ISP usability is the lack of established interfaces for stakeholder participation. Traditional policy-making methods, often bureaucratic and one-way, fail to incorporate user feedback during policy creation. 10 , 140 However, policy interfaces, crucial for integrating stakeholder knowledge, are well-researched in environmental legislation 141 , 142 , 143 and traditional governmental policy-making, 140 where research advocates for updated communication and participation channels. In contrast, user-centred design emphasises active user participation throughout the design phase, allowing policies to better align with real-world tasks and needs. 144 Yet, despite the value of user-centred approaches, these are rarely integrated into ISP design, which typically follows a more rigid, risk-focused process. 145

Integrating different policy levels already poses significant challenges at the organisational level. 91 , 92 , 94 , 95 , 113 The complexity of integrating technical guidelines with overarching policy objectives is compounded by the need to incorporate legal texts, especially as emerging technologies complicate this process. 28 , 90 , 91 However, harmonised and user-friendly documents are essential for usability and effective protection. 91 , 107 The absence of integration across governmental, organisational, and civil policies undermines their overall effectiveness in achieving security and safety goals.

To address this gap, we propose to integrate participative as well as usability design methods, e.g., user-centred design, into the lifecycle of UISPs. This lifecycle, similar to iterative user-centred design, involves continuous user feedback and evaluation, 144 integrating both qualitative and quantitative approaches to ensure policies are both usable and enforceable. Such an approach differs significantly from traditional policy-making cycles, 10 , 114 , 140 as traditional cycles, while iterative, do not focus on participatory, usability-focused policy creation (see Figure 2). 10 , 146

Figure 2: Organisational policy-making (see, e.g., 146 ), the governmental policy-making cycle (see, e.g., 140 ), and the user-centred design cycle (see, e.g., 144 ).

Lawo and Stevens’ Information Security Policy Usability Scale (ISPUS) offers a useful model for evaluating organisational policies, but similar tools are lacking for governmental or private ISPs. 17 Traditional methods of risk assessment in policy-making fail to fully capture the nuances of usability, leading to policies that are often ineffective in practice, despite having appropriate technical instruments. To enhance the management of UISPs, we define them as user-centred artefacts, guided by usability-informed methods throughout their lifecycle. These methods promote active user involvement, iterative design, and multidisciplinary collaboration, ensuring that policies remain adaptable and relevant as contexts change. 147
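
Until such standardised instruments exist for all levels, usability ratings gathered from regulatees can at least be aggregated in a simple, transparent way. The following sketch shows a generic Likert-scale aggregation in Python (our illustration, similar in spirit to SUS-style scoring; it is explicitly not the published ISPUS scoring procedure):

# Generic sketch: aggregate 1-5 Likert ratings of ISP usability statements into
# a 0-100 score. This is an illustrative aggregation, not the ISPUS method.
def usability_score(ratings: list[int]) -> float:
    """Map mean agreement on positively worded items to a 0-100 scale."""
    if not ratings or any(r < 1 or r > 5 for r in ratings):
        raise ValueError("ratings must be integers between 1 and 5")
    mean = sum(ratings) / len(ratings)
    return (mean - 1) / 4 * 100  # 1 -> 0, 5 -> 100

# One respondent's ratings for, e.g., clarity, actionability, fit with routines:
print(round(usability_score([4, 3, 5, 2, 4]), 1))  # 65.0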

Definition: UISPs redefine the management of ISPs by adopting iterative, user-centred design principles. In contrast to traditional ISPs, which rely on rigid compliance frameworks, UISPs prioritise continuous user feedback, usability evaluations, and legal validity, fostering an adaptive policy lifecycle that evolves in response to both user needs and regulatory changes.

In summary, we define UISPs as follows: UISPs extend traditional ISPs by integrating usability as a relational property between policies and their socio-technical context, with the goal of alignment. Unlike conventional ISPs, UISPs prioritise clarity, accessibility, and adaptability, ensuring that policies are not only legally compliant but also practical, intuitive, and aligned with real-world user practices. This includes a stronger alignment with contextual factors, such as physical security or multi-regulatory environments. By incorporating user-centred design principles, continuous feedback, and iterative improvements, UISPs foster an adaptive policy lifecycle that evolves with organisational, technological, and regulatory changes.

3.3 Challenges & research avenues

In light of this analysis, it becomes evident that a shift in our conceptualisation of ISPs is imperative. Considering the arguments presented above, it becomes clear that this is not a minor adjustment but a fundamental shift in the conceptualisation of ISPs, their integration within the context, and the processes involved in their creation and use. Consequently, a comprehensive re-evaluation of the role of ISPs is necessary, which in turn requires a corresponding transformation in established standards and security cultures. Such change comes with multiple challenges that cannot be answered by this conceptual research. Thus, in the following, we present these challenges as we anticipate them. Each challenge 24 is introduced with a discussion of the issues arising from the shift towards usability and a corresponding research agenda in direct response to these issues.

3.3.1 Challenge 1: Balancing compliance with usability in policy goals

ISPs have focused primarily on compliance, often serving as legal safeguards rather than tools to achieve protection goals. This compliance-centric approach neglects usability, leading to impractical policies that are difficult to follow, ultimately hindering compliance. 60 , 93 A fundamental challenge in reconceptualising ISPs as UISPs is to effectively integrate usability into policies while maintaining legal robustness. Achieving this balance requires ensuring that policies are not only compliant but also intuitive and actionable for users in real-world contexts, fostering genuine engagement with security practices. The challenge lies in creating policies that enhance security without sacrificing the legal framework required for enforcement. Hence, we formulate the following related research questions for future research:

  1. How can compliance and usability be effectively integrated to enhance the overall effectiveness of ISPs while maintaining legal robustness?

  2. How can cooperative enforcement styles be designed effectively to achieve compliance and adequate protection levels for security and safety policies?

3.3.2 Challenge 2: Integrating safety and security in protection goals

The traditional focus of ISPs on organisational information security fails to account for the increasingly interconnected nature of digital and physical worlds. 134 As technologies like AI, IoT, and robotics evolve, they introduce new safety risks that stem directly from security vulnerabilities. 128 , 132 Addressing both safety and security within a unified framework presents a significant challenge in creating UISPs. Integrating these two perspectives requires not only a conceptual shift but also a redefinition of terminology, guidelines, and risk mitigation strategies across domains.

While research has begun to bridge the gap between safety and security, 135 there is no established standard or methodology to effectively merge these two areas. Furthermore, integrating safety and security measures within UISPs risks compromising the legal robustness of existing risk-management frameworks. Therefore, a key challenge lies in designing UISPs that can address both safety and security concerns while maintaining compliance and usability. Related research questions for future avenues are:

  1. How can safety and security measures be effectively integrated into UISPs to increase overall protection and usability?

  2. How can we ensure a usable and legally robust design of safety and security integrated UISPs?

  3. What are the impacts of integrated safety and security measures on the practices of regulators and regulatees?

3.3.3 Challenge 3: Architectural integration of ISPs across levels

The lack of a cohesive, multi-level architecture for ISPs presents a major challenge in creating UISPs. 11 , 73 Traditional ISPs are often fragmented across various levels – government, organisation, and civil society – resulting in inconsistent terminology, redundant guidelines, and uncoordinated enforcement. 4 , 7 As digitalisation increases, so does the complexity of the regulatory landscape, with new technologies and expanding safety concerns further complicating the integration of policies. 12 , 13 Without a unified approach, policies become less actionable, forcing users and stakeholders to navigate multiple disconnected documents, each with its own terminology and objectives. 10 , 11

To address this, UISPs must be embedded within a comprehensive architecture that links governmental, organisational, and civil policies, ensuring consistency, alignment, and clarity across the ecosystem. 148 However, designing and implementing such an integrated system presents several challenges. Regulators need support in harmonising policies across levels, while the complexity of maintaining coherence among diverse documents adds another layer of difficulty. 28 , 90 Related research questions are:

  1. How should civil UISPs be implemented and governed to ensure usability and overall protection?

  2. How can we support regulators in aligning their ISPs with the entire architecture and vice-versa?

  3. How can we implement and govern ISP architectures to increase overall protection and usability?

3.3.4 Challenge 4: Methodological integration of usability in ISP design

The shift toward usability in ISPs introduces significant methodological challenges, particularly in terms of integrating user-centred design principles within the policy-making cycle. 25 , 26 While traditional policy-making often follows a formal, bureaucratic, and iterative process, it lacks the participatory, usability-focused approach needed to create effective ISPs. 10 , 146 Usable design principles emphasise the importance of iterative, micro-cycle evaluations that involve users and stakeholders at every stage of the policy lifecycle. However, such methods are currently underdeveloped, particularly in the context of ISPs, leading to a disconnect between user needs, usability goals, and legal compliance. 17 , 28

A key issue is the absence of established methods for creating and evaluating ISPs that integrate both usability and legal robustness. Existing frameworks focus on either usability or compliance but fail to combine the two in a way that can support iterative design and evaluation. 139 Moreover, while the policy-making cycle is established for governmental and organisational policies, these cycles do not typically involve the iterative, user-centred methods necessary for designing truly usable ISPs. 140 , 146

The challenge is to design and implement methods that integrate usability into the ISP lifecycle, combining qualitative and quantitative evaluation methods, and ensuring that stakeholder involvement is continuous and meaningful. This shift would require the establishment of a new kind of policy interface – a channel through which regulators, regulatees, and other stakeholders can exchange feedback and co-create usable policies. 147 The success of UISPs hinges on the ability to adapt existing policy cycles to incorporate user-centred design principles while maintaining legal and compliance requirements. We formulate the following related research questions for future research:

  1. How can we combine the policy-making cycle with user-centred design to include stakeholders?

  2. How can we adapt policy management cycles to increase overall usability for regulators and regulatees?

  3. How can we measure the usability of ISPs (qualitatively and quantitatively)?

3.3.5 Challenge 5: Socio-technical integration in ISP design and evaluation

ISPs are primarily compliance-driven, focusing on monitoring and sanctioning violations. However, research on socio-technical practices demonstrates that the usage of IT artefacts is deeply influenced by the context in which they are used, and the gap between the intended design and actual use can significantly impact their effectiveness. 87 , 117 Approaches based on usable design characteristics miss the opportunity to consider how users’ practices, behaviours, and contexts evolve over time, which hinders the potential for continuous improvement in policy design and usage.

The disconnect between the compliance-driven nature of ISPs and the actual practices and contexts in which these policies are enacted leads to missed opportunities for enhancing usability and protection. Long-term policy evaluations informed by usability research are needed to understand the role of ISPs within shifting socio-technical environments. Furthermore, ISPs’ design and enforcement often fail to recognise the dual nature of artefacts and practices: the implementation of a policy can change both the practices of the users (regulatees) and the context in which the policy is applied. 119 , 120 , 121 Related research avenues are:

  1. How do regulatees appropriate and use UISPs, and how do these UISPs influence and shape the socio-technical contexts in which they operate?

  2. How does the architecture of UISPs influence socio-technical practices across traditional ISP boundaries?

  3. How do civil UISPs influence consumer practices and the overall protection level of society?

4 Implications

4.1 Theoretical implications

From a theoretical point of view, the new conception of ISPs as UISPs, as presented here, offers a new concept within usable security/safety. This concept lies at the interface between HCI and information/IT security, which forms the core of this research area. The UISP concept thus addresses the problems identified by scientists such as Malik et al. 11 and Holton et al. 73 They argue in favour of a more coordinated ISP architecture in order to guarantee information security throughout society.

Scholars are most prominently challenged by the changing role of ISPs. The shift in the function of ISPs from purely legal instruments to tools that promote both compliance and usability has far-reaching implications for the understanding and application of ISPs. For example, the traditional understanding of security hierarchies 148 is challenged by more non-contractual, non-legal forms of influencing and regulating human behaviour through a stronger alignment of document and context. It is necessary to develop models that resolve the tension between usability as a relational property and legal requirements, beyond merely applying usable design characteristics.

The proposed reconceptualisation of ISP content to include government policies and civil programmes broadens the scope of design and research. Policy makers and researchers should broaden their perspective to include interdependencies beyond their organisational boundaries and focus on a whole-of-society approach to information security. 149 , 150 This work builds on research in recent years focusing on the intersection of law and HCI, 16 e.g., the analysis of the AI Act. 55

Moreover, the integration of design methods for aligning document and context into the considerably longer macro-cycles of policy-making 10 , 114 , 140 , 146 is a challenging interdisciplinary task. New methods are required and will change the way scholars and practitioners understand and create these artefacts. These methods need to involve usability experts, designers, and security/safety experts alongside legal experts. Similarly, combining security and safety into one policy requires change across an ecology of methods, standards, and practices. 131 , 134

Lastly, this approach also offers usable security/safety a link to the recently observed turn-towards-policy in HCI. 151 , 152 Here, our findings on the research and implementation of policies, combined with the wider discourse, can open up future-oriented fields for HCI involvement.

However, beyond the argumentatively derived concept, many open questions remain. The transfer between governmental, organisational, and civil policies offers initial hypotheses for their design and impact, but a research programme that addresses these questions has yet to be established. The research questions and open issues raised here are certainly not exhaustive; further open points will become apparent in the course of future research.

4.2 Practical implications

Even if many theoretical questions remain unresolved, the UISP approach outlined here already offers implications for the practice of regulators. Practitioners should start by aligning their documents more strongly with the context of regulation.

In the UISP approach, we argue on the basis of existing research for changing the function of ISPs. Practitioners should no longer create them solely as tools for legal compliance; they should already incorporate initial steps towards usability design. This would support regulatees in their implementation, so that security/safety gains can be achieved.

Beyond this overarching perspective, the framework also shows concrete ways to implement such an increase in usability.

For example, practitioners, particularly in robotics, optical technologies, or AI-related industries, should consider the physical risks resulting from a lack of information security. As a first step, the boundaries between the respective artefacts will likely need to be dissolved: joint risk-management workshops can uncover new risks, and a shared glossary can help to harmonise content and presentation (a minimal terminology check along these lines is sketched below).
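
To illustrate how low the entry barrier for such glossary-based harmonisation can be, the following sketch checks policy texts for deprecated synonyms against a shared list of preferred terms. It is a minimal illustration only: the glossary entries, file names, and matching rules are our own assumptions, not part of any standard or of the UISP framework itself.

```python
# Minimal sketch (illustrative assumptions only): flag deprecated or
# inconsistent terminology in policy documents against a shared glossary.
import re
from pathlib import Path

# Preferred term -> synonyms that should be replaced for consistency
GLOSSARY = {
    "multi-factor authentication": ["two-step verification", "2FA login"],
    "security incident": ["security event", "breach event"],
    "personal protective equipment": ["protective gear"],
}

def find_inconsistencies(text: str) -> list[tuple[str, str]]:
    """Return (deprecated term, preferred term) pairs found in the text."""
    hits = []
    lowered = text.lower()
    for preferred, synonyms in GLOSSARY.items():
        for synonym in synonyms:
            if re.search(r"\b" + re.escape(synonym.lower()) + r"\b", lowered):
                hits.append((synonym, preferred))
    return hits

if __name__ == "__main__":
    # Hypothetical security and safety policy files
    for path in ["it_security_policy.txt", "machine_safety_policy.txt"]:
        p = Path(path)
        if not p.exists():
            continue
        for deprecated, preferred in find_inconsistencies(p.read_text(encoding="utf-8")):
            print(f"{p.name}: replace '{deprecated}' with '{preferred}'")
```

Such a check does not replace joint workshops; it merely makes terminological drift between security and safety documents visible early.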

The focus on compliance, moreover, can create a checklist mentality in which the goal is to meet the minimum requirements of a standard rather than to achieve optimal and usable security. 10 This can lead to a false sense of security: organisations believe they are protected because they comply with standards, despite potential gaps in actual security practices. Practitioners should shift this focus from mere compliance to genuine understanding of and commitment to security practices. Teaming up with UX and usability experts from their own companies could, for example, help them improve their policies’ contextual fit.

Even if methods for the design and evaluation of ISPs are still lacking, we already know today which criteria make an ISP usable. Accordingly, practitioners should establish a UISP interface with the regulatees in their specific context and take the criteria of a usable policy into account in their daily work. At first glance, even user-centred design approaches would loosely fit the policy-making cycle, so that first experiments could be made here. Even without a perfectly aligned method, practitioners should involve experts from other fields, especially usability experts, to uncover the most critical usability issues. Even small steps, such as the automated readability screen sketched below, can increase usability here.
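
Readability is one of the better-documented usability criteria for ISPs (cf. 15 ), and a coarse automated screen can serve as one such small step before usability experts are involved. The following sketch computes the standard Flesch Reading Ease score for an English-language policy draft; the syllable counter is a rough heuristic and the example sentence is our own, so the score should only trigger, never replace, expert review.

```python
# Minimal sketch: coarse readability screen for an English policy draft
# using the Flesch Reading Ease formula. Heuristic only; not a substitute
# for a proper usability evaluation with regulatees.
import re

def count_syllables(word: str) -> int:
    """Rough syllable count: number of consecutive vowel groups."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_reading_ease(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))

if __name__ == "__main__":
    draft = ("Employees must utilise multi-factor authentication mechanisms "
             "prior to the initiation of any remote maintenance session.")
    # Lower scores indicate harder-to-read text.
    print(f"Flesch Reading Ease: {flesch_reading_ease(draft):.1f}")
```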

Lastly, even without changing the whole policy, practitioners could provide an accompanying guideline that visualises or explains the most important rules based on practical examples. Research on icons, for example, could help to find meaningful and supportive visualisations (see, e.g., 84 , 86 ).

5 Conclusions

In line with the recent turn-towards-policy in HCI, this article examined the challenges and opportunities of UISPs. Based on the argument that conventional ISPs are often too complex, confusing, and impractical, the need for a new concept became clear. The UISP approach focusses on usability and aims to create policies that are understandable, applicable, and aligned with the socio-technical practices of the regulatees.

A central element of the UISP framework is the integration of security and safety measures. As the physical and digital worlds become increasingly interconnected, ISPs must ensure not only information security but also safety. This requires a rethink on the part of regulators, who need to connect the traditionally disconnected disciplines.

In addition, UISPs must be embedded in a coherent architecture that encompasses governmental, organisational and civil society policies. The use of common terminology and the pursuit of common protection goals are essential to avoid inconsistencies and duplication of effort. The establishment of civil society security programmes that are integrated into overarching documents and government strategies is an important step in this direction.

We argued that involving users in the design process is crucial to ensure that the guidelines meet their needs and are applicable in practice. However, there is still a need for research in this area, as few user-centred methods for the development and evaluation of ISPs exist to date.

However, implementing the UISP approach is a complex challenge that requires the cooperation of different actors. Regulators, businesses, academia, and civil society must work together to develop and implement UISPs. We argue that adopting this concept is critical for the usable security and safety community, as current approaches have failed to provide meaningful support. We invite other scholars to critically engage with the concept, to address the open research questions, and to work with us on sharpening the concept of UISPs.


Corresponding author: Dennis Lawo, University of Siegen, Verbraucherinformatik Research Group, Siegen, Germany, E-mail:

  1. Research ethics: Not applicable.

  2. Informed consent: Not applicable.

  3. Author contributions: All authors have accepted responsibility for the entire content of this manuscript and approved its submission.

  4. Use of Large Language Models, AI and Machine Learning Tools: Microsoft CoPilot to improve language.

  5. Conflict of interest: The authors state no conflict of interest.

  6. Research funding: None declared.

  7. Data availability: Not applicable.

References

1. Admass, W. S.; Munaye, Y. Y.; Diro, A. A. Cyber Security: State of the Art, Challenges and Future Directions. Cyber Secur. Appl. 2024, 2, 100031. https://doi.org/10.1016/j.csa.2023.100031.Search in Google Scholar

2. Ustundag, A.; Cevikcan, E.; Ervural, B. C.; Ervural, B. Overview of Cyber Security in the Industry 4.0 Era. Industry 4.0: Manage. Digital Transform 2018, 267–284. https://doi.org/10.1007/978-3-319-57870-5_16.Search in Google Scholar

3. Montasari, R. Cyber Threats and the Security Risks They Pose to National Security: An Assessment of Cybersecurity Policy in the united kingdom. Countering Cyberterrorism: Confluence Artif. Intell. Cyber Forensics Digit. Policing US UK National Cybersecur. 2023, 7–25. https://doi.org/10.1007/978-3-031-21920-7_2.Search in Google Scholar

4. Haddad, C.; Binder, C. Governing through Cybersecurity: National Policy Strategies, Globalized (In-) Security and Sociotechnical Visions of the Digital Society. Osterr. Z. fur Soziol. 2019, 44 (1), 115–134. https://doi.org/10.1007/s11614-019-00350-7.Search in Google Scholar

5. Lawo, D.; Neifer, T.; Esau, M.; Stevens, G. Human-centred Digital Sovereignty: Explorative Conceptual Model and Ways Forward. In International Conference on Computer-Human Interaction Research and Applications; Springer: Rome, 2023; pp 84–103.10.1007/978-3-031-49368-3_6Search in Google Scholar

6. Lawo, D.; Neifer, T.; Esau-Held, M.; Stevens, G. Digital Sovereignty: What it is and Why it Matters for Hci. In Extended Abstracts of the 2023 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: Hamburg, 2023; pp 1–7.10.1145/3544549.3585834Search in Google Scholar

7. Lehto, M. The Ways, Means and Ends in Cyber Security Strategies. In Proceedings of the 12th European Conference on Information Warfare and Security; Academic Conferences and Publishing International Limited: Jyväskylä, 2013; pp 182–190.Search in Google Scholar

8. Kumar, S.; Benigni, M.; Carley, K. M. The Impact of Us Cyber Policies on Cyber-Attacks Trend. In 2016 IEEE Conference on Intelligence and Security Informatics (ISI); IEEE: Tucson, 2016; pp 181–186.10.1109/ISI.2016.7745464Search in Google Scholar

9. Vitel, P.; Bliddal, H. French Cyber Security and Defence: An Overview. Inf. Secur. 2015, 32 (1), 1.10.11610/isij.3209Search in Google Scholar

10. Paananen, H.; Lapke, M.; Siponen, M. State of the Art in Information Security Policy Development. Comput. Secur. 2020, 88, 101608. https://doi.org/10.1016/j.cose.2019.101608.Search in Google Scholar

11. Malik, W. J. Information Security Policy in the Us National Context. In Information Security; Routledge: New York, 2016; pp 175–195.Search in Google Scholar

12. Bygrave, L. A. The ‘Strasbourg Effect’ on Data Protection in Light of the ‘Brussels Effect’: Logic, Mechanics and Prospects. Comput. Law Secur. Rev. 2021, 40, 105460. https://doi.org/10.1016/j.clsr.2020.105460.

13. Siegmann, C.; Anderljung, M. The brussels effect and Artificial Intelligence: How Eu Regulation Will Impact the Global Ai Market. arXiv preprint arXiv:2208.12645 2022, 1–97; https://doi.org/10.48550/arXiv.2208.12645.Search in Google Scholar

14. Future of Life Institute. High-Level Summary of the AI Act | EU Artificial Intelligence Act, 2024. https://artificialintelligenceact.eu/high-level-summary/.

15. Alkhurayyif, Y.; Weir, G. R. Evaluating Readability as a Factor in Information Security Policies. Int. J. Trend Res. Dev. 2017, 54–64.Search in Google Scholar

16. Jakobi, T.; von Grafenstein, M. What Hci Can Do for (Data Protection) Law–Beyond Design. Hum. Factors Priv. Res. 2023, 115, 115–136. https://doi.org/10.1007/978-3-031-28643-8_6.Search in Google Scholar

17. Lawo, D.; Stevens, G. Information Security Policy Usability Scale: A Questionnaire for Evaluating the Usability of Information Security Policies. In Mensch und Computer 2024-Workshopband; Gesellschaft für Informatik eV: Karlsruhe, 2024; pp 10–18420.Search in Google Scholar

18. Rostami, E.; Karlsson, F.; Gao, S. Requirements for Computerized Tools to Design Information Security Policies. Comput. Secur. 2020, 99, 102063. https://doi.org/10.1016/j.cose.2020.102063.Search in Google Scholar

19. Vance, A.; Siponen, M.; Pahnila, S. Motivating Is Security Compliance: Insights from Habit and Protection Motivation Theory. Inf. Manag. 2012, 49 (3–4), 190–198. https://doi.org/10.1016/j.im.2012.04.002.Search in Google Scholar

20. Chiu, C. M.; Cheng, H. L.; Hsu, J.; Huang, C. H. Examining Employees’ Intention to Comply with Information Security Policies: The Roles of Loafing and Commitment. In PACIS 2022 Proceedings; Taipei-Sydney, 2022.Search in Google Scholar

21. Safa, N. S.; Von Solms, R.; Furnell, S. Information Security Policy Compliance Model in Organizations. Comput. Secur. 2016, 56, 70–82. https://doi.org/10.1016/j.cose.2015.10.006.Search in Google Scholar

22. Stanton, J. M.; Stam, K. R.; Mastrangelo, P.; Jolton, J. Analysis of End User Security Behaviors. Comput. Secur. 2005, 24 (2), 124–133. https://doi.org/10.1016/j.cose.2004.07.001.Search in Google Scholar

23. Eckhardt, G. M.; Houston, M. B.; Jiang, B.; Lamberton, C.; Rindfleisch, A.; Zervas, G. Marketing in the Sharing Economy. J. Market. 2019, 83 (5), 5–27. https://doi.org/10.1177/0022242919861929.Search in Google Scholar

24. Richter, A.; Richter, S. Hybrid Work–A Reconceptualisation and Research Agenda. I-Com 2024, 23 (1), 71–78. https://doi.org/10.1515/icom-2023-0027.Search in Google Scholar

25. Bevan, N.; Carter, J.; Harker, S. Iso 9241-11 Revised: What Have We Learnt about Usability since 1998? In Human-Computer Interaction: Design and Evaluation: 17th International Conference, HCI International 2015, Los Angeles, CA, USA, August 2-7, 2015, Proceedings, Part I 17; Springer: Los Angeles, 2015; pp 143–151.10.1007/978-3-319-20901-2_13Search in Google Scholar

26. Lewis, J. R. Usability: Lessons Learned-And yet to Be Learned. Int. J. Hum. Comput. Interact. 2014, 30 (9), 663–684. https://doi.org/10.1080/10447318.2014.930311.Search in Google Scholar

27. Wobbrock, J. O.; Kientz, J. A. Research Contributions in Human-Computer Interaction. Interactions 2016, 23 (3), 38–44. https://doi.org/10.1145/2907069.Search in Google Scholar

28. Goel, S.; Chengalur-Smith, I. N. Metrics for Characterizing the Form of Security Policies. J. Strat. Inf. Syst. 2010, 19 (4), 281–295. https://doi.org/10.1016/j.jsis.2010.10.002.Search in Google Scholar

29. Landoll, D. J. Information Security Policies, Procedures, and Standards: A Practitioner’s Reference; Auerbach Publications: New York, 2017.10.1201/9781315372785Search in Google Scholar

30. Knapp, K. J.; Morris, R. F.Jr; Marshall, T. E.; Byrd, T. A. Information Security Policy: An Organizational-Level Process Model. Comput. Secur. 2009, 28 (7), 493–508. https://doi.org/10.1016/j.cose.2009.07.001.Search in Google Scholar

31. Cram, W. A.; Proudfoot, J. G.; D’arcy, J. Organizational Information Security Policies: a Review and Research Framework. Eur. J. Inf. Syst. 2017, 26, 605–641. https://doi.org/10.1057/s41303-017-0059-9.Search in Google Scholar

32. Whitman, M. E. Security Policy: From Design to Maintenance. In Information Security; Routledge: New York, 2016; pp 123–151.Search in Google Scholar

33. Lowry, P. B.; Moody, G. D. Proposing the Control-Reactance Compliance Model (Crcm) to Explain Opposing Motivations to Comply with Organisational Information Security Policies. Inf. Syst. J. 2015, 25 (5), 433–463. https://doi.org/10.1111/isj.12043.Search in Google Scholar

34. Crossler, R. E.; Johnston, A. C.; Lowry, P. B.; Hu, Q.; Warkentin, M.; Baskerville, R. Future Directions for Behavioral Information Security Research. Comput. Secur. 2013, 32, 90–101. ISSN 0167-4048. https://doi.org/10.1016/j.cose.2012.09.010.Search in Google Scholar

35. Chen, C. C.; Medlin, B. D.; Shaw, R. S. A Cross-Cultural Investigation of Situational Information Security Awareness Programs. Inf. Manag. Comput. Secur. 2008, 16 (4), 360–376.10.1108/09685220810908787Search in Google Scholar

36. Safa, N. S.; Maple, C. Human Errors in the Information Security Realm–And How to Fix Them. Comput. Fraud Secur. 2016, 2016 (9), 17–20. https://doi.org/10.1016/s1361-3723(16)30073-2.

37. Safa, N. S.; Sookhak, M.; Von Solms, R.; Furnell, S.; Ghani, N. A.; Herawan, T. Information Security Conscious Care Behaviour Formation in Organizations. Comput. Secur. 2015, 53, 65–78. https://doi.org/10.1016/j.cose.2015.05.012.Search in Google Scholar

38. Yerby, J.; Floyd, K. Faculty and Staff Information Security Awareness and Behaviors. J. Colloq. Inf. Syst. Secur. Educ. 2018, 6, 23.Search in Google Scholar

39. Adams, A.; Sasse, M. A. Users Are Not the Enemy. Commun. ACM 1999, 42 (12), 40–46. https://doi.org/10.1145/322796.322806.Search in Google Scholar

40. D’Arcy, J.; Herath, T.; Shoss, M. K. Understanding Employee Responses to Stressful Information Security Requirements: A Coping Perspective. J. Manag. Inf. Syst. 2014, 31 (2), 285–318. https://doi.org/10.2753/mis0742-1222310210.Search in Google Scholar

41. Moody, G. D.; Siponen, M.; Pahnila, S. Toward a Unified Model of Information Security Policy Compliance. MIS Q. 2018, 42 (1), 285–A22. https://doi.org/10.25300/misq/2018/13853.Search in Google Scholar

42. Bulgurcu, B.; Cavusoglu, H.; Benbasat, I. Information Security Policy Compliance: an Empirical Study of Rationality-Based Beliefs and Information Security Awareness. MIS Q. 2010, 34, 523–548. https://doi.org/10.2307/25750690.Search in Google Scholar

43. Moody, G. D.; Siponen, M.; Pahnila, S. Toward a Unified Model of Information Security Policy Compliance. MIS Q. 2018, 42 (1), 285–311. https://doi.org/10.25300/misq/2018/13853.Search in Google Scholar

44. Myyry, L.; Siponen, M.; Pahnila, S.; Vartiainen, T.; Vance, A. What Levels of Moral Reasoning and Values Explain Adherence to Information Security Rules? an Empirical Study. Eur. J. Inf. Syst. 2009, 18 (2), 126–139. https://doi.org/10.1057/ejis.2009.10.Search in Google Scholar

45. Siponen, M. T. Critical Analysis of Different Approaches to Minimizing User-Related Faults in Information Systems Security: Implications for Research and Practice. Inf. Manag. Comput. Secur. 2000, 8 (5), 197–209. https://doi.org/10.1108/09685220010353178.Search in Google Scholar

46. Straub, D. W. Effective Is Security: An Empirical Study. Inf. Syst. Res. 1990, 1 (3), 255–276. ISSN 1047-7047. https://doi.org/10.1287/isre.1.3.255.Search in Google Scholar

47. Blythe, J. M.; Coventry, L.; Little, L. Unpacking Security Policy Compliance: The Motivators and Barriers of Employees’ Security Behaviors. In Eleventh Symposium On Usable Privacy and Security (SOUPS 2015), 2015; pp. 103–122.Search in Google Scholar

48. Hong, Y.; Xu, M. Autonomous Motivation and Information Security Policy Compliance: Role of Job Satisfaction, Responsibility, and Deterrence. J. Organ. End User Comput. (JOEUC) 2021, 33 (6), 1–17. https://doi.org/10.4018/joeuc.20211101.oa9.Search in Google Scholar

49. Siponen, M.; Mahmood, M. A.; Pahnila, S. Employees’ Adherence to Information Security Policies: An Exploratory Field Study. Inf. Manag. 2014, 51 (2), 217–224. https://doi.org/10.1016/j.im.2013.08.006.Search in Google Scholar

50. Dinev, T.; Goo, J.; Hu, Q.; Nam, K. User Behaviour towards Protective Information Technologies: the Role of National Cultural Differences. Inf. Syst. J. 2009, 19 (4), 391–412. https://doi.org/10.1111/j.1365-2575.2007.00289.x.Search in Google Scholar

51. Hanelt, A.; Bohnsack, R.; Marz, D.; Antunes Marante, C. A Systematic Review of the Literature on Digital Transformation: Insights and Implications for Strategy and Organizational Change. J. Manag. Stud. 2021, 58 (5), 1159–1197. https://doi.org/10.1111/joms.12639.Search in Google Scholar

52. Verhoef, P. C.; Broekhuizen, T.; Bart, Y.; Bhattacharya, A.; Dong, J. Q.; Fabian, N.; Haenlein, M. Digital Transformation: A Multidisciplinary Reflection and Research Agenda. J. Bus. Res. 2021, 122, 889–901. https://doi.org/10.1016/j.jbusres.2019.09.022.Search in Google Scholar

53. Hanisch, M.; Goldsby, C. M.; Fabian, N. E.; Oehmichen, J. Digital Governance: A Conceptual Framework and Research Agenda. J. Bus. Res. 2023, 162, 113777. https://doi.org/10.1016/j.jbusres.2023.113777.Search in Google Scholar

54. Dziundziuk, V. B.; Kotukh, Y. V.; Krutii, O. M.; Solovykh, V. P.; Kotukov, O. A. State Information Security Policy (Comparative Legal Aspect). Cuest. Polit. 2021, 39 (71), 166–186. https://doi.org/10.46398/cuestpol.3971.08.Search in Google Scholar

55. Recki, L.; Lawo, D.; Krauß, V.; Pins, D. A Qualitative Exploration of User-Perceived Risks of Ai to Inform Design and Policy. In MuC (Workshopband); GI: Rapperswil, 2023.Search in Google Scholar

56. Rogerson, K.; Milton, D. A Policymaking Process “Tug-of-War”: National Information Security Policies in Comparative Perspective. J. Inf. Technol. Politics 2013, 10 (4), 462–476. https://doi.org/10.1080/19331681.2013.843989.Search in Google Scholar

57. OECD National Strategies, Agendas and Plans AI Strategies and Policies, 2024. https://oecd.ai/en/dashboards/policy-instruments/National_strategies_agendas_and_plans (accessed 2024 10 20).Search in Google Scholar

58. Ku, C. Y.; Chang, Y. W.; Yen, D. C. National Information Security Policy and its Implementation: A Case Study in Taiwan. Telecommun. Policy 2009, 33 (7), 371–384. https://doi.org/10.1016/j.telpol.2009.03.002.Search in Google Scholar

59. Colabuono, C.; Wiemer, D.; Marabello, M. V.; Lofù, D.; Pappalardo, M.; Bogacki, P.; Dziech, A.; Derkacz, J.; Sanchez, L. A. G.; Konieczna, E.; Bojilova, M.; et al.. Approach to Sector-specific Cybersecurity Schemes: Key Elements and Security Problem Definition. In International Conference on Multimedia Communications, Services and Security; Springer: Kraków, 2022; pp 104–117.10.1007/978-3-031-20215-5_9Search in Google Scholar

60. Stahl, B. C.; Doherty, N. F.; Shaw, M. Information Security Policies in the uk Healthcare Sector: a Critical Evaluation. Inf. Syst. J. 2012, 22 (1), 77–94. https://doi.org/10.1111/j.1365-2575.2011.00378.x.Search in Google Scholar

61. Hagen, J.; Albrechtsen, E. Regulation of Information Security and the Impact on Top Management Commitment–A Comparative Study of the Electric Power Supply Sector and the Finance Sector. In Safety, Reliability and Risk Analysis; CRC Press: London, 2008; pp 445–452.10.1201/9781482266481-65Search in Google Scholar

62. Tvaronavičienė, M.; Plėta, T.; Beretas, C. P.; Lelešienė, L. Analysis of the Critical Infrastructure Cyber Security Policy. Insights Reg. Dev. 2022, 4 (1), 26–39. https://doi.org/10.9770/ird.2021.4.1(2).Search in Google Scholar

63. Bentham, J. Principles of Penal Law, Reprinted in: Jh Burton; The Works of Jeremy Bentham: Edinburgh, 1983.Search in Google Scholar

64. Murphy, K. Enforcing Tax Compliance: to Punish or Persuade? Econ. Anal. Pol. 2008, 38 (1), 113–135. https://doi.org/10.1016/s0313-5926(08)50009-9.Search in Google Scholar

65. van Velthoven, B.; van Wijck, P. Becker’s Theory on Crime and Punishment, a Useful Guide for Law Enforcement Policy in the netherlands. Recht Werkelijkh. 2016, 37 (1), 6–31. https://doi.org/10.5553/rdw/138064242016037001002.Search in Google Scholar

66. Farrar, J.; King, T. To Punish or Not to Punish? the Impact of Tax Fraud Punishment on Observers’ Tax Compliance. J. Bus. Ethics 2023, 183 (1), 289–311. https://doi.org/10.1007/s10551-022-05061-w.Search in Google Scholar

67. Franzoni, L. A. Tax Evasion and Tax Compliance; University of Bologna: Italy, 1998. https://ssrn.com/abstract=137430.10.2139/ssrn.137430Search in Google Scholar

68. Tittle, C. R. Crime Rates and Legal Sanctions. Soc. Probl. 1969, 16 (4), 409–423. https://doi.org/10.2307/799950.Search in Google Scholar

69. Murphy, K.; Harris, N. Shame and Recidivism: A Test of Reintegrative Shaming Theory in the White-Collar Crime Context. Brit. J. Criminol. 2007, 47 (6), 900–917.10.1093/bjc/azm037Search in Google Scholar

70. Kagan, R. A.; Scholz, J. T.; Hawkins, K.; Thomas, J. The ‘Criminology of the Corporation’ and Regulatory Enforcement Strategies. Boston 1984, 67–95. https://doi.org/10.1007/978-94-017-5297-8_4.

71. Grabosky, P.; Braithwaite, J. Of Manners Gentle: Enforcement Strategies of Australian Business Regulatory Agencies; Oxford University Press: Canberra, 1986.Search in Google Scholar

72. Burby, R. J.; May, P. J.; Paterson, R. C. Improving Compliance with Regulations: Choices and Outcomes for Local Government. J. Am. Plan. Assoc. 1998, 64 (3), 324–334. https://doi.org/10.1080/01944369808975989.Search in Google Scholar

73. Holton, N.; Furnell, S. Assessing the Provision of Public-Facing Cybersecurity Guidance for End-Users. In 2020 IEEE 6th International Conference on Collaboration and Internet Computing (CIC); IEEE: Atlanta, 2020; pp 161–168.10.1109/CIC50333.2020.00028Search in Google Scholar

74. Bundesamt für Sicherheit in der Informationstechnik. Smarthome – Den Wohnraum Sicher Vernetzen, 2024. https://www.bsi.bund.de/DE/Themen/Verbraucherinnen-und-Verbraucher/Informationen-und-Empfehlungen/Internet-der-Dinge-Smart-leben/Smart-Home/smart-home.html?nn131484 (accessed 2024 10 20).

75. Johnson, E. Computer Documentation: Writing about Technology. Comput. Hum. 1995, 29 (5), 409–411. https://doi.org/10.1007/bf02279530.Search in Google Scholar

76. Szewczyk, P.; Valli, C. Insecurity by Obscurity: A Review of Soho Router Literature from a Network Security Perspective. J. Digi. Forensics Sec. 2009, 4 (3), 1.10.15394/jdfsl.2009.1060Search in Google Scholar

77. Denning, T.; Lerner, A.; Shostack, A.; Kohno, T. Control-alt-hack: the Design and Evaluation of a Card Game for Computer Security Awareness and Education. In Proceedings of the 2013 ACM SIGSAC Conference on Computer & Communications Security; Association for Computing Machinery: Berlin, 2013; pp 915–928.10.1145/2508859.2516753Search in Google Scholar

78. Dincelli, E.; Chengalur-Smith, I. Choose Your Own Training Adventure: Designing a Gamified Seta Artefact for Improving Information Security and Privacy through Interactive Storytelling. Eur. J. Inf. Syst. 2020, 29 (6), 669–687. https://doi.org/10.1080/0960085x.2020.1797546.Search in Google Scholar

79. Gjertsen, E. G. B.; Gjære, E. A.; Bartnes, M.; Flores, W. R. Gamification of Information Security Awareness and Training; ICISSP: Porto, 2017; pp 59–70.10.5220/0006128500590070Search in Google Scholar

80. Prange, S.; Thiem, N.; Fröhlich, M.; Alt, F. “secure Settings Are Quick and Easy!”–Motivating End-Users to Choose Secure Smart Home Configurations. In Proceedings of the 2022 International Conference on Advanced Visual Interfaces; Association for Computing Machinery: Frascati, 2022; pp 1–9.10.1145/3531073.3531089Search in Google Scholar

81. Brodie, C.; Karat, C. M.; Karat, J.; Feng, J. Usable Security and Privacy: a Case Study of Developing Privacy Management Tools. In Proceedings of the 2005 Symposium on Usable Privacy and Security; Association for Computing Machinery: Pittsburgh, 2005; pp 35–43.10.1145/1073001.1073005Search in Google Scholar

82. Mathis, F.; Vaniea, K.; Khamis, M. Prototyping Usable Privacy and Security Systems: Insights from Experts. Int. J. Hum. Comput. Interact. 2022, 38 (5), 468–490. https://doi.org/10.1080/10447318.2021.1949134.Search in Google Scholar

83. Prange, S.; Von Zezschwitz, E.; Vision, F. A. Exploring Challenges and Opportunities for Usable Authentication in the Smart Home. In 2019 IEEE European Symposium on Security and Privacy Workshops (EuroS&PW); IEEE: Stockholm, 2019; pp 154–158.10.1109/EuroSPW.2019.00024Search in Google Scholar

84. Holtz, L. E.; Zwingelberg, H.; Hansen, M. Privacy Policy Icons. In Privacy and Identity Management for Life; Springer: Heidelberg, 2011; pp 279–285.10.1007/978-3-642-20317-6_15Search in Google Scholar

85. Rossi, A.; Palmirani, M. What’s in an icon? Promises and pitfalls of data protection iconography. In Data Protection and Privacy: Data Protection and Democracy; Hart Publishing: Oxford, 2020.10.5040/9781509932771.ch-003Search in Google Scholar

86. Yamagishi, R.; Fujii, S. Survey and Analysis of User Perceptions of Security Icons. In European Interdisciplinary Cybersecurity Conference; Association for Computing Machinery: Xanthi, 2024; pp 202–209.10.1145/3655693.3661295Search in Google Scholar

87. Bevana, N.; Kirakowskib, J.; Maissela, J. What Is Usability. In Proceedings of the 4th International Conference on HCI; Elsevier: Citeseer, 1991; pp 1–6.Search in Google Scholar

88. Dzida, W.; Freitag, R. Usability Testing—The Datech Standard. Softw. Qual.: State Art Manage. Test. Tool 2001, 160–177. https://doi.org/10.1007/978-3-642-56529-8_12.Search in Google Scholar

89. Niemimaa, M.; Niemimaa, E. Abductive Innovations in Information Security Policy Development: an Ethnographic Study. Eur. J. Inf. Syst. 2019, 28 (5), 566–589. https://doi.org/10.1080/0960085x.2019.1624141.Search in Google Scholar

90. Karlsson, F.; Hedström, K.; Goldkuhl, G. Practice-based Discourse Analysis of Information Security Policies. Comput. Secur. 2017, 67, 267–279. https://doi.org/10.1016/j.cose.2016.12.012.Search in Google Scholar

91. Maynard, S.; Ruighaver, A. B. What Makes a Good Information Security Policy: a Preliminary Framework for Evaluating Security Policy Quality. In Proceedings of the Fifth Annual Security Conference, Las Vegas, Nevada USA, 2006; pp. 19–20.Search in Google Scholar

92. Al-Hamdani, W. A.; Dixie, W. D. Information Security Policy in Small Education Organization. In 2009 Information Security Curriculum Development Conference, 2009; pp. 72–78.10.1145/1940976.1940991Search in Google Scholar

93. Doherty, N. F.; Fulford, H. Aligning the Information Security Policy with the Strategic Information Systems Plan. Comput. Secur. 2006, 25 (1), 55–63. https://doi.org/10.1016/j.cose.2005.09.009.Search in Google Scholar

94. Lopes, I.; Oliveira, P. Applying Action Research in the Formulation of Information Security Policies. In New Contributions in Information Systems and Technologies; Springer: Ponta Delgada, Vol. 1, 2015; pp 513–522.10.1007/978-3-319-16486-1_50Search in Google Scholar

95. White, G. L. A New Value for Information Security Policy Education. In Proceedings of the Information Systems Educators Conference ISSN, Vol. 2167, 2013; p. 1435.Search in Google Scholar

96. Doherty, N. F.; Fulford, H. Do Information Security Policies Reduce the Incidence of Security Breaches: an Exploratory Analysis. Inf. Resour. Manag. J. 2005, 18 (4), 21–39. https://doi.org/10.4018/irmj.2005100102.Search in Google Scholar

97. Corpuz, M. The Enterprise Information Security Policy as a Strategic Business Policy within the Corporate Strategic Plan. In Proceedings of the 15th Multi-Conference on Systemics, Cybernetics and Informatics, WMSCI 2011: Volume III; International Institute of Informatics and Systemics (IIIS): Orlando, 2011; pp 275–279.Search in Google Scholar

98. Whitman, M. E. In Defense of the Realm: Understanding the Threats to Information Security. Int. J. Inf. Manag. 2004, 24 (1), 43–57. https://doi.org/10.1016/j.ijinfomgt.2003.12.003.Search in Google Scholar

99. Palmer, M. E.; Robinson, C.; Patilla, J. C.; Moser, E. P. Information Security Policy Framework: Best Practices for Security Policy in the E-Commerce Age. Inf. Secur. J. A Glob. Perspect. 2001, 10 (2), 1–15. https://doi.org/10.1201/1086/43314.10.2.20010506/31399.4.Search in Google Scholar

100. Balebako, R.; Shay, R.; Cranor, L. F. Is Your Inseam a Biometric? a Case Study on the Role of Usability Studies in Developing Public Policy. Proc. USEC 2014, 14.10.14722/usec.2014.23039Search in Google Scholar

101. Al-Mukahal, H. M.; Alshare, K. An Examination of Factors that Influence the Number of Information Security Policy Violations in Qatari Organizations. Inf. Comput. Secur. 2015, 23 (1), 102–118. https://doi.org/10.1108/ics-03-2014-0018.Search in Google Scholar

102. Corpuz, M.; Barnes, P. Integrating Information Security Policy Management with Corporate Risk Management for Strategic Alignment. In The 14th World Multi-Conference on Systemics, Cybernetics, and Informatics, Proceedings Volume III; International Institute of Informatics and Systemics: Orlando, 2010; pp 337–342.Search in Google Scholar

103. Alshaikh, M.; Maynard, S. B.; Ahmad, A.; Chang, S. Information Security Policy: a Management Practice Perspective. arXiv preprint arXiv:1606.00890 2016, 1–14; https://doi.org/10.48550/arXiv.1606.00890.Search in Google Scholar

104. Carroll, F. Usable Security and Aesthetics: Designing for Engaging Online Security Warnings and Cautions to Optimise User Security whilst Affording Ease of Use. In Proceedings of the 2021 European Symposium on Usable Security; Association for Computing Machinery: Karlsruhe, 2021; pp 23–28.10.1145/3481357.3481376Search in Google Scholar

105. Buthelezi, M. P.; Van Der Poll, J. A.; Ochola, E. O. Ambiguity as a Barrier to Information Security Policy Compliance: A Content Analysis. In 2016 International Conference on Computational Science and Computational Intelligence (CSCI); IEEE: Las Vegas, 2016; pp 1360–1367.10.1109/CSCI.2016.0254Search in Google Scholar

106. Zaaba, Z. F.; Furnell, S. M.; Dowland, P. S. Literature Studies on Security Warnings Development. Int. J. Percept. Cogn. Comput. Eng. 2016, 2 (1); https://doi.org/10.31436/ijpcc.v2i1.22.Search in Google Scholar

107. Höne, K.; Eloff, J. What Makes an Effective Information Security Policy? Netw. Secur. 2002, 2002 (6), 14–16. https://doi.org/10.1016/s1353-4858(02)06011-7.

108. Wogalter, M. S. Communication-human Information Processing (C-hip) Model. In Forensic Human Factors and Ergonomics; CRC Press: Boca Raton, 2018; pp 33–49.10.1201/9780429462269-3Search in Google Scholar

109. Habib, H.; Li, M.; Young, E.; Cranor, L. “Okay, Whatever”: An Evaluation of Cookie Consent Interfaces. In Proceedings of the 2022 CHI conference on human factors in computing systems; Association for Computing Machinery: New Orleans, 2022; pp 1–27.10.1145/3491102.3501985Search in Google Scholar

110. Feng, Y.; Yao, Y.; Sadeh, N. A Design Space for Privacy Choices: Towards Meaningful Privacy Control in the Internet of Things. In Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: Yokohama, 2021; pp 1–16.10.1145/3411764.3445148Search in Google Scholar

111. Schaub, F.; Cranor, L. F. Usable and Useful Privacy Interfaces. Intro. privacy Technology Profession. 2020, 176–299.Search in Google Scholar

112. Morville, P. User Experience Design; Semantic Studios LLC: Scottsville, Virginia, 2004.Search in Google Scholar

113. Tuyikeze, T.; Flowerday, S. Information Security Policy Development and Implementation: A Content Analysis Approach. In Haisa; Plymouth, 2014; pp 11–20.Search in Google Scholar

114. Jann, W.; Wegrich, K. Phasenmodelle und politikprozesse: der policy cycle. Lehrbuch der Politikfeldanalyse 2003, 2 (2), 75–113.Search in Google Scholar

115. Ruoti, S.; Andersen, J.; Zappala, D.; Seamons, K. Why Johnny Still, Still Can’t Encrypt: Evaluating the Usability of a Modern Pgp Client. arXiv preprint arXiv:1510.08555 2015, 1–5; https://doi.org/10.48550/arXiv.1510.08555.Search in Google Scholar

116. Alhazmi, A.; Arachchilage, N. A. G. I’m All Ears! Listening to Software Developers on Putting Gdpr Principles into Software Development Practice. Personal Ubiquitous Comput. 2021, 25 (5), 879–892. https://doi.org/10.1007/s00779-021-01544-1.Search in Google Scholar PubMed PubMed Central

117. Stevens, G.; Pipek, V.; Wulf, V. Appropriation Infrastructure: Supporting the Design of Usages. In International Symposium on End User Development; Springer: Siegen, 2009; pp 50–69.10.1007/978-3-642-00427-8_4Search in Google Scholar

118. Sokolovska, N.; Fecher, B.; Wagner, G. G. Communication on the Science-Policy Interface: an Overview of Conceptual Models. Publications 2019, 7 (4), 64. https://doi.org/10.3390/publications7040064.Search in Google Scholar

119. Stevens, G.; Pipek, V. Making Use; Oxford University Press: Oxford, 2018.10.1093/oso/9780198733249.003.0005Search in Google Scholar

120. Draxler, S.; Stevens, G. Supporting the Collaborative Appropriation of an Open Software Ecosystem. Comput. Support. Coop. Work (CSCW) 2011, 20 (4-5), 403–448. https://doi.org/10.1007/s10606-011-9148-9.Search in Google Scholar

121. Draxler, S.; Stevens, G.; Stein, M.; Boden, A.; Randall, D. Supporting the Social Context of Technology Appropriation. In Proceedings of the 2012 ACM annual conference on Human Factors in Computing Systems - CHI ’12; ACM Press: Austin, 2012.10.1145/2207676.2208687Search in Google Scholar

122. Bødker, S.; Klokmose, C. N. Dynamics in Artifact Ecologies. In Proceedings of the 7th Nordic conference on human-computer interaction: Making sense through design; Association for Computing Machinery: Copenhagen, 2012; pp 448–457.10.1145/2399016.2399085Search in Google Scholar

123. Chen, H.; Liu, M.; Lyu, T. Understanding Employees’ Information Security–Related Stress and Policy Compliance Intention: the Roles of Information Security Fatigue and Psychological Capital. Inf. Comput. Secur. 2022, 30 (5), 751–770. https://doi.org/10.1108/ics-03-2022-0047.Search in Google Scholar

124. Kisekka, V.; Goel, S. An Investigation of the Factors that Influence Job Performance during Extreme Events: The Role of Information Security Policies. Inf. Syst. Front. 2023, 25 (4), 1439–1458. https://doi.org/10.1007/s10796-022-10281-6.Search in Google Scholar PubMed PubMed Central

125. Bundesministerium der Justiz. Verordnung über Sicherheit und Gesundheitsschutz bei der Benutzung Persönlicher Schutzausrüstungen bei der Arbeit (PSA-Benutzungsverordnung - PSA-BV), 1996. https://www.gesetze-im-internet.de/psa-bv/__2.html (accessed 2024 05 23).

126. EUR-Lex. Council Directive 89/656/EEC of 30 November 1989 on the Minimum Health and Safety Requirements for the Use by Workers of Personal Protective Equipment at the Workplace (Third Individual Directive within the Meaning of Article 16 (1) of Directive 89/391/EEC), 1989. http://data.europa.eu/eli/dir/1989/656/oj/eng (accessed 2024 05 23).

127. U.S. Department of Labor. OSH Act of 1970, 2004. https://www.osha.gov/laws-regs/oshact/completeoshact (accessed 2024 05 23).

128. Butterfield, M. Hacked U.S. Robot Vacuums are Yelling Racial Slurs, Chasing Pets: Report, 2024. https://www.msn.com/en-ca/news/politics/hacked-u-s-robot-vacuums-are-yelling-racial-slurs-chasing-pets-report/ar-AA1s7bzu (accessed 2024 10 20).Search in Google Scholar

129. Junklewitz, H.; Hamon, R.; André, A.; Evas, T.; Soler Garrido, J.; Sanchez Martin, J. Cybersecurity of Artificial Intelligence in the Ai Act. Luxembourg: Pub. Office European Union 2023, 10, 271009. KJ-NA-31-643-EN-N (online), ISSN 1831-9424. https://doi.org/10.2760/271009.

130. Staves, A.; Gouglidis, A.; Maesschalck, S.; Hutchison, D. Risk-based Safety Scoping of Adversary-Centric Security Testing on Operational Technology. Saf. Sci. 2024, 174, 106481. https://doi.org/10.1016/j.ssci.2024.106481.Search in Google Scholar

131. Von Solms, R.; Van Niekerk, J. From Information Security to Cyber Security. Comput. Secur. 2013, 38, 97–102. https://doi.org/10.1016/j.cose.2013.04.004.Search in Google Scholar

132. Mikellidou, C. V.; Shakou, L. M.; Boustras, G.; Dimopoulos, C. Energy Critical Infrastructures at Risk from Climate Change: A State of the Art Review. Saf. Sci. 2018, 110, 110–120. https://doi.org/10.1016/j.ssci.2017.12.022.Search in Google Scholar

133. Chapman, M. Teenager Hacks Polish Tram System - Security - iTnews, 2008. https://www.itnews.com.au/news/teenager-hacks-polish-tram-system-100838 (accessed 2024 10 20).Search in Google Scholar

134. Boustras, G.; Waring, A. Towards a Reconceptualization of Safety and Security, Their Interactions, and Policy Requirements in a 21st Century Context. Saf. Sci. 2020, 132, 104942. https://doi.org/10.1016/j.ssci.2020.104942.Search in Google Scholar

135. Blokland, P.; Reniers, G. An Ontological and Semantic Foundation for Safety and Security Science. Sustainability 2019, 11 (21), 6024. https://doi.org/10.3390/su11216024.Search in Google Scholar

136. Fallatah, W.; Furnell, S.; He, Y. Refining the Understanding of Usable Security. In International Conference on Human-Computer Interaction; Springer: Copenhagen, 2023; pp 49–67.10.1007/978-3-031-35822-7_4Search in Google Scholar

137. Ghernaouti-Hélie, S. An Inclusive Information Society Needs a Global Approach of Information Security. In 2009 International Conference on Availability, Reliability and Security; IEEE: Fukuoka, 2009; pp 658–662.10.1109/ARES.2009.127Search in Google Scholar

138. Hopf, C. Qualitative Interviews: An Overview. A companion to qualitative research 2004, 203 (8), 100093.Search in Google Scholar

139. Randall, D.; Harper, R.; Rouncefield, M. Fieldwork for Design: Theory and Practice; Springer Science & Business Media: London, 2007.10.1007/978-1-84628-768-8Search in Google Scholar

140. Janssen, M.; Helbig, N. Innovating and Changing the Policy-Cycle: Policy-Makers Be Prepared. Gov. Inf. Q. 2018, 35 (4), S99–S105. https://doi.org/10.1016/j.giq.2015.11.009.Search in Google Scholar

141. Lacey, J.; Howden, M.; Cvitanovic, C.; Colvin, R. Understanding and Managing Trust at the Climate Science–Policy Interface. Nat. Clim. Change 2018, 8 (1), 22–28. https://doi.org/10.1038/s41558-017-0010-z Search in Google Scholar

142. Orr, B.; Cowie, A.; Castillo Sanchez, V.; Chasek, P.; Crossman, N.; Erlewein, A.; Louwagie, G.; Maron, M.; Metternicht, G.; Minelli, S.; Tengberg, A.; Walter, S.; Welton, S. Scientific Conceptual Framework for Land Degradation Neutrality: A Report of the Science-Policy Interface; United Nations Convention to Combat Desertification: Bonn, Germany, 2017.10.1016/j.envsci.2017.10.011Search in Google Scholar

143. Perrings, C.; Duraiappah, A.; Larigauderie, A.; Mooney, H. The Biodiversity and Ecosystem Services Science-Policy Interface. Science 2011, 331 (6021), 1139–1140. https://doi.org/10.1126/science.1202400.Search in Google Scholar PubMed

144. Dwivedi, M.; Upadhyay, S.; Tripathi, A. A Working Framework for the User-Centered Design Approach and a Survey of the Available Methods. Int. J. Sci. Res. Pub. 2012, 2 (4), 12–19.Search in Google Scholar

145. Knill, C.; Tosun, J. Policy Making. In Comparative Politics, Working Paper Series/Chair of Comparative Public Policy and Administration; Caramani, D., Ed.; Oxford Univ. Pr.: Oxford, 2008; pp. 495–519.Search in Google Scholar

146. Flowerday, S. V.; Tuyikeze, T. Information Security Policy Development and Implementation: The what, How and Who. Comput. Secur. 2016, 61, 169–183. https://doi.org/10.1016/j.cose.2016.06.002.Search in Google Scholar

147. Mao, J. Y.; Vredenburg, K.; Smith, P. W.; Carey, T. User-centered Design Methods in Practice: a Survey of the State of the Art. In Proceedings of the 2001 Conference of the Centre for Advanced Studies on Collaborative Research; IBM Press: Toronto, Vol. 12, 2001.10.1145/503376.503460Search in Google Scholar

148. Kuerbis, B.; Badiei, F. Mapping the Cybersecurity Institutional Landscape. Digit. Policy Regul. Govern. 2017, 19 (6), 466–492. https://doi.org/10.1108/dprg-05-2017-0024.Search in Google Scholar

149. Cederberg, A. Comprehensive Cyber Security Approach: The Finnish Model. In Cybersecurity Best Practices: Lösungen zur Erhöhung der Cyberresilienz für Unternehmen und Behörden; Springer Vieweg Wiesbaden: Wiesbaden, 2018; pp 83–105.10.1007/978-3-658-21655-9_8Search in Google Scholar

150. Wigell, M.; Mikkola, H.; Juntunen, T. Best Practices in the Whole-Of-Society Approach in Countering Hybrid Threats. European Parliam. Coordinator: Policy Dep. Extern. Relations Dir. Gen. Extern. Policies Union 2021, 10, 379.Search in Google Scholar

151. Manuel, J.; Crivellaro, C. Place-based Policymaking and Hci: Opportunities and Challenges for Technology Design. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: Honolulu, 2020; pp 1–16.10.1145/3313831.3376158Search in Google Scholar

152. Spaa, A.; Durrant, A.; Elsden, C.; Vines, J. Understanding the Boundaries between Policymaking and Hci. In Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems; Association for Computing Machinery: Glasgow, 2019; pp 1–15.10.1145/3290605.3300314Search in Google Scholar

Received: 2024-11-27
Accepted: 2025-03-07
Published Online: 2025-03-26

© 2025 the author(s), published by De Gruyter, Berlin/Boston

This work is licensed under the Creative Commons Attribution 4.0 International License.
