In 2018, Rick Smith, founder and CEO of Axon, a stun gun and body camera manufacturer based in Scottsdale, Arizona, was concerned that advances in technology were creating new and difficult ethical challenges. So he created an independent AI ethics board, made up of ethicists, AI experts, public policy experts, and law enforcement officials, to advise Axon’s leadership. In 2019, the board recommended against adding facial recognition technology to the company’s body camera lineup, and in 2020 it made recommendations regarding the use of automatic license plate recognition technology. Axon management followed both recommendations.
In 2022, the board recommended dropping a proposal to develop a taser-equipped drone designed to combat mass shootings. After initially accepting the board’s recommendation, the company changed its mind: in June 2022, following the Uvalde school shooting, it announced that it would move forward with the taser drone program anyway. The board’s reaction was dramatic: Nine of its 13 members resigned and released a letter expressing their concerns. In response, the company announced a freeze on the project.
As societal expectations for the responsible use of digital technologies rise, firms that promote best practices will have a clear advantage. According to a 2022 study, 58% of consumers, 60% of employees, and 64% of investors make key decisions based on their beliefs and values. Increasing your organization’s digital responsibility can drive value creation, and brands that are seen as more responsible will enjoy higher levels of stakeholder trust and loyalty. These enterprises will sell more goods and services, find it easier to hire staff, and maintain more fruitful relationships with shareholders.
However, many organizations have struggled to balance the legitimate interests of competing stakeholders. The main tensions arise between business goals and responsible digital practices. For example, requirements for data localization often conflict with the aspirations for efficiency in globally distributed value chains. The ethical and responsible checks and balances that need to be put in place during AI/algorithm development tend to slow down development speed, which can be a problem when time to market is paramount. Better data and analytics can improve service personalization, but at the cost of customer privacy. The risks associated with transparency and discrimination issues may discourage organizations from using algorithms that could help reduce costs.
If managed effectively, digital responsibility can protect organizations from threats and open up new opportunities for them. Based on our ongoing digital transformation research and in-depth study of 12 major European consumer goods, financial services, information and communications technology, and pharmaceutical firms that are actively engaged in digital responsibility, we have developed four best practices for maximizing business value and minimizing resistance.
1. Tie digital responsibility to corporate values.
Commitment to digital responsibility can be formulated as a charter that sets out the key principles and guidelines that your organization will follow. Start with a basic question: how do you define your digital responsibility goals? The answer can often be found in your organization’s values, which are articulated in your CSR mission or commitment.
According to Jakob Wessner, manager of organizational development and digital transformation at beauty and personal care company Weleda, “Our values determined what we wanted to do in a digital world where we set our own limits on where we would go or not go.” The company’s core values are fair treatment, sustainability, honesty, and diversity. So when it came to setting up a robotic process automation program, Weleda’s executives were careful to make sure it didn’t involve job losses, which would violate the core value of fair treatment.
2. Extend digital responsibility beyond compliance.
While corporate values provide a useful anchor point for digital responsibility principles, relevant regulations governing data privacy, intellectual property rights, and artificial intelligence cannot be overlooked. Forward-thinking organizations are taking steps to go beyond compliance and improve their practices in areas such as cybersecurity, data protection, and privacy.
For example, UBS Banking Group’s data protection efforts began with GDPR compliance, but have since become more focused on data governance practices, AI ethics, and climate-related financial disclosure. “It’s like puzzle pieces. We started with GDPR and then you just start using these blocks and the level goes up all the time,” said Christoph Tammers, head of data at the bank.
We have found that the key is to establish a clear link between digital responsibility and value creation. One way to achieve this is to complement compliance efforts with a forward-thinking approach to risk management, especially in areas where technical implementation standards are lacking or where the law is not yet enforced. For example, Deutsche Telekom (DT) has developed its own risk classification system for AI-related projects. The use of AI can expose organizations to the risks of biased data, inappropriate modeling techniques, or inaccurate decision making. Understanding risks and developing methods to mitigate them are important steps in digital responsibility. DT includes these risks in the scorecards used to evaluate technology projects.
Turning digital responsibility into a shared outcome also helps organizations go beyond compliance. The Swiss insurance company Die Mobiliar has created an interdisciplinary team consisting of representatives from compliance, business security, data science and IT architecture. “We structured our efforts around a shared vision where business strategy and personal data work together to proactively create value,” explains Matthias Brandle, Data Science and AI Product Owner.
3. Set clear controls.
Ensuring that digital responsibility is properly governed is not easy. Axon had the right idea when it created an independent AI ethics board. However, the governance structure was not well thought out, so when the company disagreed with the board’s recommendation, it fell into a governance gray area marked by competing board and management interests.
Establishing a clear governance structure can minimize such tensions. There is an ongoing debate about whether to create a separate team for digital responsibility or distribute responsibility throughout the organization.
Pharmaceutical company Merck took the first path, creating a digital ethics council to provide guidance on complex issues related to data use, algorithms, and new digital innovations. The company decided to act because of its growing focus on AI-based approaches to drug development and big data applications in human resources and cancer research. The council makes recommendations for action, and any decision that runs contrary to its recommendations must be formally justified and documented.
Swiss Re, a global insurance company, took the second approach, based on the belief that digital responsibility should be part of an organization’s overall operations. “Whenever there is a digital component, the owner of the initiative, who usually sits in the business, is responsible. Business initiative owners are supported by experts on central teams, but the business lines are responsible for implementation,” explained Lutz Wilhelmi, risk and regulatory adviser at Swiss Re.
Another option we have seen is a hybrid model, consisting of a small group of internal and external experts who guide and support managers across business lines to implement digital responsibility. The benefits of this approach include increased awareness and shared responsibility throughout the organization.
4. Make sure employees understand digital responsibility.
Today’s employees not only need to assess the opportunities and risks associated with working with different types of technology and data; they must also be able to ask the right questions and hold constructive discussions with colleagues.
Training staff in digital responsibility has been one of the key priorities at Otto Group, a German e-commerce company. “Lifelong learning becomes a success factor for every individual and also for the future viability of the company,” explained Petra Scharner-Wolf, member of the executive board for finance, controlling, and human resources. To jump-start its efforts, Otto developed an organization-wide digital education initiative built on a central platform featuring videos on topics such as digital ethics, responsible use of data, and ways to resolve conflicts.
Learning about digital responsibility is both a short-term challenge of upskilling the workforce and a long-term challenge of creating a culture of self-directed learning that adapts to the evolving nature of technology. Since issues of digital responsibility rarely arise in a vacuum, we recommend including aspects of digital responsibility in ongoing ESG professional development programs that also aim to promote ethical behavior toward a broader set of stakeholders. This type of contextual learning can help employees navigate the complexities of digital responsibility in a more applied and meaningful way.
The needs and resources of your organization will determine whether you decide to upskill your entire workforce or rely on a few specialists. A balance of the two can be ideal, providing a solid foundation of digital ethics knowledge and understanding throughout the organization while keeping experts available to provide specialized guidance when needed.
Digital responsibility is fast becoming an imperative for today’s organizations, but success is by no means guaranteed. By taking a proactive approach, however, progressive organizations can create and maintain responsible practices around their use of digital technologies. These practices not only improve digital performance but also advance organizational goals.