This article first appeared in CUInsight and is Part II of a four-part series about digital responsibility.
Credit unions pride themselves on being socially responsible, and with good reason. Social responsibility is one of the nine credit union operating principles, and the movement prioritizes meaningful community involvement.
But is your credit union being socially responsible in the digital realm?
As part of our four-part series on digital responsibility, we're working to better understand the various ways digital can lead to unintended consequences, as well as the critical steps credit unions should consider taking to avoid negative outcomes for their communities.
Our first article covered the environmental impacts of digital, including both eye-opening facts — such as the reality that if the IT industry were a country, only the United States and China would contribute more to climate change — and some “try now” steps for credit unions to improve their digital environmental footprint.
This month, we’ll focus on the social impacts of digital.
As we look back on the last 12 months, there are powerful examples of the positive effects of digital. Could you imagine getting through 2020 without Zoom, online grocery orders, at-home deliveries, and virtual healthcare and learning options?
But sadly, while digital tools were imperative to staying connected and fulfilling basic needs during COVID-19, they were just as frequently used to drive partisan wedges. Every flavor of digital, but especially social media, was used to spread violent, racist, and hateful rhetoric. The English Premier League even enacted a three-day social media blackout to call attention to the racist online abuse to which its Black players were being subjected.
And that's not to mention the frequent hacks of businesses, hospitals, utilities, government entities, and more, which demonstrate not only the fragility of our digital ecosystem but also our utter dependence on it. The recent shutdown of the Colonial Pipeline following a ransomware attack, which caused widespread gas shortages, is a particularly ominous reminder.
And let’s not forget other social aspects of digital, such as disparities in access — geographic, economic, physical/cognitive abilities — and the potential for built-in bias when using tools like artificial intelligence (AI).
Here are the three critical questions your credit union should be asking:
1. When it comes to personal data, are we offering our members insight, control, and security?
In 2018 the European Union enacted strict laws to manage how personal data is collected and used by digital players, and now there's something similar closer to home: the California Consumer Privacy Act (CCPA). The CCPA, which went into effect last year, gives consumers control over if and how their data is collected and used. The law applies to businesses that meet any of three thresholds: at least $25 million in annual revenue; personal data from at least 50,000 California consumers, households, or devices; or more than half of revenue earned from selling consumers' personal data.
It seems inevitable we’ll see similar regulation in more states in the coming years. Although credit unions don’t typically sell consumer data, as we mentioned in this 2020 article, data breaches and privacy issues are increasingly common, and we may see a trajectory similar to what we saw when the Americans with Disabilities Act (ADA) requirements were applied to online technology.
The move to give consumers more control over how their data is used isn't surprising. Pew Research reported that most Americans feel they have little control over how their data is collected, are at least somewhat worried about how it's used, and believe the risks of data collection outweigh the benefits. With more than 36 billion consumer records exposed in the first three quarters of 2020—the worst year on record—members are right to be concerned.
What can your credit union do now?
- Recognize that protecting member data is the right thing to do. We know it won’t be easy and will impact a wide variety of credit union efforts, but being able to protect your members’ data and offer them critical control will go a long way toward building strong member relationships.
- Involve resources from multiple departments in your strategy development: legal, marketing, IT and compliance to start.
- Audit your current data collection practices. Understand what you’re actually tracking.
- Create a policy for data collection. Consider sharing this on your website — members are increasingly likely to want to know what you’re collecting and how you use their data.
- Let members control or opt out of data collection. Civic has a tool that can help, and at PixelSpoke we’ve created a custom solution for some of our California-based clients, like USE Credit Union.
- Take steps to protect data security. This is a broad topic that certainly can’t be addressed in a paragraph, but here are some key things to consider.
- Security isn’t IT’s job—it’s everyone’s. Act accordingly.
- Ransomware attacks are likely your main area of concern. Ransomware attacks grew by 150% in 2020, and the average payment as of Q4 2020 was an eye-popping $154,108.
- If you haven’t added information security to your C-suite, now’s the time to consider it.
One thing to keep in mind as you improve your data security: the impact it could have on your marketing. Some of the regulations built into laws like CCPA might make it more difficult to tap into useful member data, harder to personalize your messaging and more challenging to know what elements of your marketing are/aren’t working. It will be up to you to find the right balance between protecting your members and tapping into data that will help you better serve them.
One example to consider is Apple, which has taken an aggressive stance with its new tools to help customers restrict how they are tracked on iPhones. Apple is championing the cause of consumer control over personal data and, in the process, differentiating itself from one of its biggest competitors, Google. Can credit unions differentiate around this approach?
2. Do our digital tools meet the needs of all our members?
Over the past year, we’ve seen rapid deployment and adoption of digital tools in credit unions across the country—a true pandemic silver lining. But if you haven’t done it already, make sure those resources meet everyone’s needs, not just those of the “average user.”
As an example, though this article from the World Bank is nearly seven years old, it advocates for “digital financial inclusion,” demonstrating a prescient grasp of the importance of e-money and mobile banking options to provide the financially excluded and underserved with access to financial services.
What can your credit union do now?
- Ask members what they need from you. Don’t assume you know what frustrates your members about your digital tools or which digital updates they’ll appreciate most—ask them. We’ve found members are surprisingly willing to share their thoughts when given the opportunity.
- Focus on “stress cases.” Instead of designing products, services and digital resources for the “average” user, design them for the outliers. As we discussed in an earlier CUInsight article, this mindset will allow you to serve all of your members well. For instance, closed captioning doesn’t just help members with hearing impairment, it also helps those in a noisy environment. Similarly, voice-to-text can give those with limited vision access to your website and help those who can’t look at a screen at that moment (say someone who’s driving or a parent juggling the needs of young children). And what are you doing to create products, services and support that meet the needs of the under-served—especially people of color?
3. Do our digital tools contain built-in bias?
Tim Frick—a recent PixelSpoke podcast guest and digital ethics expert—pointed out that organizations don't intentionally put irresponsible digital practices into place. He suggested a host of issues could be to blame: for instance, a failure to include diverse perspectives in the research, development, or creation process, or a failure to fully understand and address the impact of your digital products, services, and practices.
Another underlying cause: a reliance on algorithms with built-in biases. According to this article from Brookings, our country's current legal and regulatory structure around non-discriminatory lending doesn't mesh well with AI, and many of the factors built into AI underwriting — like credit scores — have a disparate impact on people of color. According to Federal Reserve Governor Lael Brainard, “Depending on what algorithms are used, it is possible that no one, including the algorithm’s creators, can easily explain why the model generated the results that it did.”
We'd love to think algorithms offer impeccable objectivity, but there's plenty of evidence to suggest they've replicated — and sometimes worsened — human bias. After all, any algorithm is only as good as the data and assumptions that power it, and many are built on decades of inequity. A 2018 study from UC Berkeley found that fintech lending algorithms discriminated against minority borrowers roughly 40% less than face-to-face lenders did — a finding that could be used as an argument in favor of algorithmic lending. But the rates those algorithms offered minority borrowers were still higher than the rates offered to non-minority borrowers. That means we still have a lot of work to do.
What can your credit union do now?
- Manage potential problem areas upfront. Before diving into your next digital project, hold a pre-mortem: Imagine yourself in the future, faced with a disastrous rollout or unintended consequences. Then, come up with all the things that could have caused it and solve for them going in.
- Work from Frick's list of problem areas, which provides built-in solutions. Start with appropriate research/development teams and provide the resources they need. Involve critical voices—for instance, don't launch digital tools designed to help underserved people of color if you don't involve them in the process. Make an effort to understand what happens after you launch your product/service/practice—collaborating with people outside your organization might be a good way to address these challenges.
- Remove bias from your data and improve your models. A Harvard Business Review article about how AI can make bank loans more fair recommended the following three steps:
- Use AI to remove bias, not reinforce it. Existing data tends to reflect previous bias, and it's incredibly difficult to uncover and address this manually. HBR found AI does a better job spotting and correcting patterns of historic discrimination in raw data.
- Regularize the algorithm. After data clean-up, the article recommends building in an additional layer of fairness by “regularizing” the algorithm so that it penalizes the model if it treats protected classes differently.
- Add an AI-driven “adversary.” HBR recommends one final step to ensure that what appears to be a neutral model truly is: creating an “adversarial” AI-driven model. It's this model's job to predict bias in your chosen model and pinpoint underlying problems.
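For readers who want a concrete picture of what "regularizing" a model for fairness can mean, here is a minimal sketch in Python. It is an illustration on synthetic data under our own assumptions (the toy "lending" dataset, variable names, and penalty weight are invented for this example), not HBR's actual method: a simple logistic approval model is penalized whenever its average predicted score differs between two groups.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy historic lending data. Group 1 was approved less often at the same
# income level (the built-in bias), and "zip_score" is a proxy feature
# that correlates with group membership.
n = 2000
group = rng.integers(0, 2, n)
income = rng.normal(0, 1, n)
zip_score = 0.9 * group + rng.normal(0, 0.5, n)
approved = (income - 0.8 * group + rng.normal(0, 0.3, n) > 0).astype(float)

# Note: group itself is NOT a feature, but the proxy still leaks bias in.
X = np.column_stack([income, zip_score, np.ones(n)])

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

def train(X, y, group, fairness_weight, lr=0.1, steps=2000):
    """Logistic regression with a demographic-parity penalty: add
    fairness_weight * (gap in mean predicted score between groups)**2
    to the loss, and descend its gradient along with the log-loss."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = sigmoid(X @ w)
        grad = X.T @ (p - y) / len(y)          # standard log-loss gradient
        gap = p[group == 1].mean() - p[group == 0].mean()
        s = p * (1 - p)                         # sigmoid derivative
        g1 = (X[group == 1] * s[group == 1][:, None]).mean(axis=0)
        g0 = (X[group == 0] * s[group == 0][:, None]).mean(axis=0)
        grad += fairness_weight * 2 * gap * (g1 - g0)
        w -= lr * grad
    return w

def approval_gap(w):
    p = sigmoid(X @ w)
    return abs(p[group == 1].mean() - p[group == 0].mean())

gap_plain = approval_gap(train(X, approved, group, fairness_weight=0.0))
gap_fair = approval_gap(train(X, approved, group, fairness_weight=5.0))
print(f"approval-score gap without penalty: {gap_plain:.3f}")
print(f"approval-score gap with penalty:    {gap_fair:.3f}")
```

With the penalty turned on, the gap between the two groups shrinks, usually at some cost in raw predictive accuracy; weighing that trade-off is part of what any fairness effort has to do.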
Credit unions’ cooperative, member-centric focus drives a special appreciation for the social ramifications of digital. Although it won’t necessarily be easy to address the issues we’ve brought up throughout this piece, the first step is to be aware of potential pitfalls.
In our next article, we’ll discuss the economic ramifications of digital responsibility. Meanwhile, let’s encourage each other to find better ways to use digital to responsibly protect and assist the members who need us most.