A guest article by Werner Kunz, Jochen Wirtz, Nicole Hartley, and James Tarbit.
Welcome back to our blog series on Corporate Digital Responsibility (CDR). In case you missed our previous article, we introduced the concept of CDR and discussed the significance of CDR for service companies in the Age of AI. In part two of this series, we dive deeper into this topic by exploring why firms may not follow good CDR practices and what can be done to overcome these challenges.
Given the risks associated with poor CDR practices, it is surprising that many companies do not prioritize good CDR. One would expect that digitally sophisticated service firms, in particular, would want to follow good CDR practices.
However, it is important to acknowledge that implementing good CDR practices may come with barriers and costs that could discourage firms from taking action. Here, we have identified four distinct categories of motivations for poor CDR behaviors.
First, service firms can benefit from capturing and utilizing consumer data in their sales and marketing, which creates a sales and profit opportunity. For example, applying technology and data allows for enhanced consumer targeting, price discrimination, and personalized pricing. By using data and technology, service firms can better understand their customers and tailor their products and services to meet their needs.
Second, customization and personalization can enhance the customer experience by offering more convenience, enhanced accessibility, and speed of service.
Third, service firms are interested in cutting costs by automating customer service processes, which often results in enhanced agility, adaptability, performance, and productivity.
While these objectives frequently align with customer needs, they can also carry CDR risks related to privacy, fairness, transparency, and discrimination. These tensions intensify as service firms increasingly rely on and integrate technology and data to achieve their business objectives.
Fourth, we acknowledge that it is not always easy for service firms to allocate the resources needed to establish and maintain a strong CDR culture. Investing in CDR has uncertain ROI: its returns take the form of avoided legal and reputational risks and potential long-term gains in brand equity when consumers recognize and value a firm's good CDR practices.

The discussion above shows that organizational objectives often stand in the way of adopting good CDR practices. The resulting tensions can be viewed as trade-offs that service firms need to navigate in order to be profitable yet ethical. We call this the Service Firm's CDR Calculus and argue that firms weigh the benefits and costs of good CDR to determine how much they will engage in CDR-positive practices. Figure 3 provides an overview of the benefits and costs of good CDR for service firms; a deeper discussion can be found in Wirtz et al. (2023).
How Can Firms Build a Strong CDR Culture?
CDR behaviors can be shaped and enhanced by addressing CDR tensions that arise from factors related to customers in the ecosystem front-end and business partners at the back-end (see the digital service ecosystem model that was introduced in part one of this blog). Figure 4 provides an overview of these factors. In the following, we introduce the key CDR practices. Some are relatively simple and can be put in place quickly without impacting sales or profits. However, other practices may involve more significant trade-offs that require strategic consideration, as outlined in the CDR calculus.
First, it is crucial to establish company-wide norms that prioritize ethical practices. To achieve this, all stakeholders, including developers and frontline employees, need to understand and embrace the standards they must uphold, and they should be motivated and rewarded for committing to these standards, creating a collective culture of CDR throughout the organization. To support this culture, digital governance practices should be formalized and embedded across all organizational units. This ensures that everyone is on the same page and that CDR practices go beyond mere box-checking and window-dressing. In particular, firms must avoid machine-washing, which refers to intentionally misleading behaviors and communications about a firm's ethical AI practices. Establishing a strong CDR culture thus requires a concerted effort from all stakeholders, with managers leading the charge. By prioritizing ethical practices, formalizing digital governance, and avoiding misleading communications, firms can create a sustainable culture of responsibility and trust that benefits customers, employees, and the organization as a whole.
Second, customer privacy and data protection must be at the forefront of managers' minds as core principles of good CDR. Typical violations include Amazon's Ring sharing private camera footage with government agencies and Facebook's WhatsApp sharing user data with advertising partners. Good CDR requires organizational transparency about business models and customer data use. In the future, companies may need external audits of the design features of their services against industry standards and regulatory requirements for CDR.
Finally, to minimize the chances of accidental or forced CDR problems, managers must strive for equitable power dynamics between service companies and their business partners. Shared governance models, which can draw from current regulations, external non-profit auditing groups, and industry codes of conduct, should be employed to reduce conflicts and friction among collaborating organizations. By fostering a fair and cooperative environment, managers can ensure that all parties involved in the service ecosystem uphold CDR principles, thus reducing the risk of any potential ethical or legal issues.
New CDR Challenges Emerging from Generative AI
Generative AI (e.g., DALL-E, ChatGPT, Bard) has reached the general public and forcefully demonstrated the revolutionary power of AI and how it will impact how we work, live, play, learn, and communicate. These technology developments have also reemphasized the importance of CDR for service firms and are paving the way for much-needed new research.
Important new opportunities and challenges linked to these AI advances include:
- Automated decision-making processes: AI is increasingly used to automate decision-making. Given the sheer amount of data needed for generative AI, how can companies ensure fair outcomes and avoid discriminating against certain groups or individuals? How can service firms weigh the costs versus the benefits of depending on such systems?
- AI complexity: As AI systems become more complex and opaque, it can be difficult or even impossible for stakeholders to understand how results are generated (the "black box" problem). This leads to serious transparency issues that may arise unintentionally or even be created intentionally for profit motives.
- AI monitoring: On the one hand, AI can enhance CDR by providing new tools and techniques for monitoring and managing digital risks. On the other hand, AI can be used to create detailed profiles of individuals that were not possible just one or two years ago. An example is the Cambridge Analytica case, which used big datasets to micro-target (often deceptive) messages and systematically and effectively influence individuals' voting behavior. Further, the sheer capability to process vast amounts of personal data raises concerns about data privacy and protection. How can firms harness the positive side of AI monitoring without also enduring its negative consequences? And how far can we rely on these advanced AI systems without a human in the loop?
- AI accountability: AI systems can make decisions with serious real-world consequences, ranging from college admissions to insurance coverage, raising the question of who is accountable. AI systems themselves, as entities without self-awareness, cannot be accountable for their decisions. Thus, human oversight is required to ensure that decisions made by AI systems are ethical and in line with a company's values. How do we embed meaningful human oversight into the operation of these technologies, given the complexity and the vast amounts of data AI systems deal with?
- AI replacement: In numerous knowledge-based service sectors (e.g., law, financial planning, marketing and communications, and the creative industries), generative AI can replace roles and tasks within service firms that were once human-centered. This has ramifications for service workforces in terms of changing roles or a decreased need for staff, and it calls for increased training and development so that staff can work alongside AI agents and systems. How do service firms best prepare their workforces for the growing adoption of AI systems in the development and delivery of services?
These are only a few of the new issues that are unfolding with generative AI and its application in service contexts. We hope that this blog series helps to show the increasing importance of CDR for service firms and sparks a discussion around CDR in our service research community.
- You can read more about CDR in our paper:
Wirtz, J., Kunz, W.H., Hartley, N., Tarbit, J. (2023). Corporate digital responsibility in service firms and their ecosystems. Journal of Service Research, forthcoming.
- All associated slides are here (and free to use):
- And a YouTube masterclass here:
Werner Kunz is Professor of Marketing at the University of Massachusetts Boston, U.S.A.
Jochen Wirtz is Vice Dean MBA Programmes and Professor of Marketing at NUS Business School, National University of Singapore, Singapore.
Nicole Hartley is MBA Director and Associate Professor at The University of Queensland, Australia.
James Tarbit is a PhD Candidate at The University of Queensland, Australia.