Brunswick Review: The Resilience Issue

Techlash

A new partnership is being proposed between technology, public policy and society. Liam Benham, IBM’s Vice President for Government and Regulatory Affairs in Europe, maps the future for Brunswick’s Nick Blow.

As the internet turns 30, criticism is mounting of how it is being exploited by some technology companies. Concerns about abuse of market power, privacy violations, fake news and tax avoidance have engulfed the technology sector in what appears to be a perfect reputational, and increasingly regulatory, storm.

Brunswick Insight’s online consumer polling shows a majority of informed consumers in Europe and the US view the contribution of technology companies to society as positive. They have, however, a much more negative view of social media companies. And even as their social media use increases, they want platforms to be liable for the content they host and to be more transparent about how personal data is used.

In the absence of a unified response from tech, these concerns draw the attention of regulators—of particular concern for companies whose business is spread over many regulatory jurisdictions. In a white paper published in April, for instance, the UK’s Department for Digital, Culture, Media & Sport and Home Office, backed by Prime Minister Theresa May, proposed significant penalties against companies that ignore the spread of harmful content as they focus on growth.

In its 100-plus-year history, IBM has seen more than a few technological waves—and has led many of them. Leading government affairs activities for IBM in Europe, Liam Benham is at the center of the tech regulation debate and ideally placed to comment on the challenges that the sector faces. In our interview, Mr. Benham shares his view on trends and attitudes in tech and regulation and how IBM is responding.

Are the informed consumer views we’re hearing in our polls a problem for social media alone or for the entire technology sector?
Your consumer polling reflects exactly what we are seeing. Technology—whether blockchain, quantum, internet of things or artificial intelligence—can transform all sectors of society for the better. Yet this digitization comes with responsibility. For some time now, some in the tech sector have played fast and loose with people’s trust, and that is having an impact across the whole industry.

We would agree with the consumers in your poll: Platforms need to accept liability for the content they host. And they need to treat data more responsibly while being transparent about how they do it. Companies need to step up and take action to strengthen consumer trust. Your polling shows that use of social media is increasing—but will that be sustained if companies don’t make meaningful changes to their behavior?

Our polling points to the fact that consumers don’t want to give up their technology, but they do want to be protected from its excesses. Consumers continue to place data and privacy as their top concerns and want technology companies to be more transparent. What’s your view?
Companies need to face a new reality. It is no longer enough to repeat “sorry” or say the issue is too complicated when there is yet another abuse of customer data. Principles and practices need to be put in place up front to demonstrate good data handling.

IBM has publicly declared principles on trust and transparency that we translate into the development and delivery of our offerings across our business. As a business-to-business company we have always believed that our clients’ data is their own. We believe the unique insights derived from clients’ data are their competitive advantage, and we will not share them without their agreement. We make clear when and why AI is being applied, where the data comes from, and which methods were used to train algorithms. These training methods must not only be transparent, they must be explainable.

Pressure from politicians for action on platform liability is growing. Indeed, 76 percent of the consumers we polled in Europe agreed that technology companies should actively edit content, and remove fake news and hate speech from their platforms. Is existing regulation enough? Is significant change required?
Liability and transparency are at the core of this debate. Collectively, dominant online platforms have more power to shape public opinion than newspapers or television ever had, yet they face very little regulation or liability. They are no longer startups that need to be shielded from liability in order to find their footing.

Some sort of regulation on consumer-facing technology companies is coming. But exactly what is less clear. Businesses have leverage when they negotiate for someone to use their data. Consumers don’t, and that’s where government may need to step in.

By using a regulatory scalpel, not a sledgehammer, governments and regulators can focus on real problems where there is harmful behavior, while avoiding collateral damage.

Regulators are also very wary of impeding innovation and competitiveness. Where is the balance between business freedom and excessive regulation?
To avoid excessive regulation, the onus is on companies to demonstrate that it is not necessary. For example, at IBM we are setting out our vision for a new partnership between technology, public policy and society. This is particularly timely as the European Union, under new leadership, will move to the next phase of the Digital Single Market.

We believe that for the DSM to be successful globally, it must focus on a digital future that is responsible, open and inclusive. This cannot be achieved through regulation alone; companies themselves must commit to changing their mindset and focus on actions that build trust.

That’s not to say that regulation doesn’t play a part. There are areas where precision regulation is warranted. Platforms that tolerate the dissemination of illegal content should not be shielded from liability. People also need to know who is behind the political messaging they see in their feeds.

Some in the sector say they don’t need to change their business model or their lobbying tactics and cite healthy user figures and financials. How should technology companies be engaging with regulators?
First, regulators want companies to walk the talk on good behavior. That means moving from theory and promises to tangible actions. I’ve referred to IBM’s principles for trust and transparency, but we have also launched a service that brings greater transparency to AI decision-making—for the first time, businesses will be able to monitor AI live for bias. We have also published “Everyday Ethics for Artificial Intelligence,” a guide to help designers and developers embed ethics in their work.

Secondly, companies can work with policymakers to develop alternatives to regulation. IBM and other companies worked with the European Commission for four years to develop the Cloud Code of Conduct. Companies that sign up to the independently governed code commit to a gold standard of data handling in the cloud. We are also a member of the European Commission’s Expert Group that developed the recently published guidelines on AI ethics. Investing in being part of the solution pays off.

Thirdly, technology companies should accept that recurring bad behavior needs targeted regulation to root it out.

And finally, like-minded companies should come together to engage with regulators. For example, IBM is a founding member of the Charter of Trust, a global cross-industry initiative centered on 10 cybersecurity principles designed to strengthen trust in the security of the digital economy. We are engaging with policymakers in Europe, the US and Asia to help make high standards of security the norm across all sectors.

Nick Blow is a Partner in Brunswick’s Brussels office, where he specializes in EU and international public policy and government relations.

Illustration: David Plunkert
