Martin Chavez on how to regulate big tech like big banks


THE LIST of complaints against the leading web platforms runs long, from privileging their own products and operating both sides of an advertising exchange, to ignoring the problems that result from their activities. Applying conventional rules like antitrust law is difficult, since it is hard to show consumer harm when the service is free. But there is another way to regulate big tech: treat the companies like big banks.

The American banking industry learned much from the financial crisis of 2008, the regulatory crackdown and the collapse of return on equity as banks had to build up reserves of capital and liquidity under the new Dodd-Frank regulations. Today, banks lend to clients and make markets for them, with occasional stresses but no panic. Although many bankers look back longingly at their pre-2008 returns on capital, they still operate lucrative businesses.

I saw this transformation first hand as an executive at Goldman Sachs. Yet I came to banking from the technology sector and today I invest in and sit on the boards of several tech companies. So I appreciate how both industries work, and how regulatory trends from one domain can usefully apply to the other. The lessons from banking offer a way forward that regulators, the public and even the web firms themselves can buy into.

Today big tech is in disrepute, not unlike banks after the Wall Street Crash of 1929 and the situation in 2008. In both cases, regulators marched in. The 1933 and 1934 securities laws required that investors receive financial information and prohibited deceit, misrepresentations and other fraud. They curbed the worst excesses. Since then, caveat emptor has given way to the “best interest rule”, which requires that broker-dealers place retail customers’ interests first, and creates an obligation of disclosure, diligence, care and skill to the customer.

Lawmakers and regulators should apply that ethos by imposing similar obligations on the tech titans. The state could require the companies to create and manage meaningful information barriers. For banks, this means separating mergers-and-acquisitions departments from the trading business, or the trading business from prime brokerage. For tech platforms, those barriers could limit the interactions between a logistics business and a product business, or an app store and a hardware business, or the ad-exchange operator and the advertiser- and publisher-facing units.
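To make the idea concrete, here is a minimal sketch, in Python, of how such a barrier might be expressed as a compliance rule inside a platform. The unit names and the `check_transfer` helper are illustrative inventions, not any company's real controls.

```python
# A minimal, hypothetical sketch of an information barrier between business
# units, expressed as pairs of units that may not exchange data.

BARRIERS = {
    ("ad_exchange", "advertiser_sales"),   # exchange operator vs. advertiser-facing unit
    ("ad_exchange", "publisher_sales"),    # exchange operator vs. publisher-facing unit
    ("app_store", "hardware"),             # app store vs. first-party hardware business
    ("logistics", "retail_products"),      # logistics platform vs. product business
}

# Normalise the pairs so the barrier applies in both directions.
_BLOCKED = {tuple(sorted(pair)) for pair in BARRIERS}

def check_transfer(from_unit: str, to_unit: str) -> bool:
    """Return True only if no information barrier separates the two units."""
    return tuple(sorted((from_unit, to_unit))) not in _BLOCKED

# Example: exchange bid data may not flow to the advertiser-facing sales team,
# while an unrelated transfer is allowed.
assert check_transfer("ad_exchange", "advertiser_sales") is False
assert check_transfer("logistics", "hardware") is True
```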

Applying the spirit of banking regulation to web platforms, lawmakers should ask: Does the content served up to retail users meet appropriate standards for truthfulness and accuracy? Does the digital business establish and enforce a code of conduct, with structures and practices where the customer comes first? Has the business attested that customers understand what the firm does with their data, and how its algorithms influence customer behaviour? Does the business, in its role as a “common carrier” of information, give fair and symmetric access to the products and services of rivals?

Next is the issue of identification and anonymity. The 1970 Bank Secrecy Act introduced anti-money-laundering rules to safeguard the financial system from abuses, such as terrorist financing and organised crime. The “know your customer” rules from the 2001 Patriot Act require banks to implement a client-identification programme. Similar rules could apply to tech platforms through a global system of digital identities, perhaps along the lines of Singapore’s National Digital Identity or India’s Aadhaar, with one difference: individuals could opt in to get a better, more trustworthy user experience.

For example, the rules could require digital businesses to flag content created by unverified identities and highlight content created by verified people. This would give consumers the ability to consider the source of the content and to discount it appropriately. The bots and trolls that stifle online interactions would be pushed back to a corner of the web. Anonymity and privacy are sacred rights, but just as in the financial system, there can be no digital right to cover for deliberate misinformation, incitement to violence, human trafficking and other illicit activities.
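As a rough illustration of the mechanism, the sketch below labels content by the author's verification status and discounts unverified sources in ranking. The `Post` fields, the discount factor and the scoring rule are invented for the example; they are not any platform's actual system.

```python
# An illustrative sketch: label content by identity status and down-weight
# unverified sources when ordering a feed. All field names are hypothetical.

from dataclasses import dataclass

@dataclass
class Post:
    text: str
    author_verified: bool
    engagement_score: float  # the platform's own relevance/engagement estimate

def label(post: Post) -> str:
    return "verified author" if post.author_verified else "unverified author"

def ranking_score(post: Post, unverified_discount: float = 0.5) -> float:
    """Discount unverified content so labelling and weighting work together."""
    weight = 1.0 if post.author_verified else unverified_discount
    return post.engagement_score * weight

posts = [
    Post("Official statement on the outage", True, 0.7),
    Post("Shocking claim from an anonymous account", False, 0.9),
]
for p in sorted(posts, key=ranking_score, reverse=True):
    print(f"[{label(p)}] {p.text}")
```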

Then there are stress tests: simulating crises as a way to protect against them. The Comprehensive Capital Analysis and Review (CCAR) programme, established under the Dodd-Frank rules, is an annual exercise to assess whether the largest American banks account for their unique risks and have the capital to continue in times of economic and financial disruption without having to turn to the government. Banks are presented with disaster scenarios and have to simulate their cash flows, income statements and balance sheets for nine quarters into the future, showing that they could continue ordinary operations. The process requires collaboration inside banks, and between banks and regulators. (CCAR’s rigour is one reason why banks operated smoothly amid the economic gyrations of the covid-19 crisis.)
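A heavily simplified sketch of that kind of projection appears below: given a stress scenario, roll a bank's capital forward nine quarters and check that it stays above a minimum ratio. The figures, parameter names and the single-ratio model are invented for illustration; real CCAR submissions project full income statements and balance sheets.

```python
# A toy nine-quarter stress projection in the spirit of CCAR.
# All numbers and names are illustrative only.

def project_capital(capital, risk_weighted_assets, quarterly_income,
                    quarterly_stress_loss, min_ratio=0.045, quarters=9):
    """Return (passes, path): whether capital stays above the minimum ratio."""
    path = []
    for _ in range(quarters):
        capital += quarterly_income - quarterly_stress_loss
        ratio = capital / risk_weighted_assets
        path.append(round(ratio, 4))
        if ratio < min_ratio:
            return False, path
    return True, path

# A hypothetical bank: $50bn capital, $500bn risk-weighted assets,
# $1.5bn quarterly pre-provision income, $2bn quarterly stressed losses.
passes, path = project_capital(50.0, 500.0, 1.5, 2.0)
print("Survives the scenario:", passes)
print("Capital ratio by quarter:", path)
```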

The rules place onerous burdens on 34 major financial institutions. Tens of thousands of people, and perhaps millions of computers, work year-round to calculate and interpret the data, spurring deep and manifold changes in the decision processes that underlie mergers and acquisitions, leverage, capital- and risk-allocation, share buybacks and dividends. Bankers complain and grumble, of course. But there is also nearly universal acknowledgment that the regime has greatly reduced the probability of a catastrophic banking crisis.

There are useful parallels between CCAR and the regulation of big tech platforms. Regulators can require that digital businesses simulate potential disasters and develop policies to prevent them. As in banking, the online companies can be required to measure, capitalise for and compensate society for negative externalities, ranging from lost peace of mind and competitiveness, to self-harm, violence and human-rights abuses.

For example, tech platforms measure emotional resonance, engagement and outrage when they choose to increase their advertising revenue by amplifying or promoting specific content. Those companies should have to pay the equivalent of a carbon tax, estimated by their own models and governed by the regulator, if they pollute the infosphere with falsehoods and incendiary content.
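One way to picture the analogy is the sketch below: a levy proportional to the harm a platform's own models attribute to content it chose to amplify. The scoring inputs, the rate and the function name are hypothetical, and the real design would sit with the regulator, as the article argues.

```python
# A hypothetical "infosphere levy": charge for amplified content in proportion
# to the platform's own model-estimated harm score. Rates and scores invented.

def infosphere_levy(items, rate_per_unit_harm=0.01):
    """Sum a levy over amplified items.

    Each item is (amplification_boost, model_harm_score), where the harm score
    is the platform's own estimate (e.g. falsehood or incitement risk) and the
    boost is how much extra distribution the platform gave the content.
    """
    return sum(boost * harm * rate_per_unit_harm
               for boost, harm in items
               if harm > 0)

amplified = [
    (1_000_000, 0.0),   # benign viral content: no levy
    (500_000, 0.8),     # incendiary content the platform's own model flags
]
print(f"Quarterly levy: ${infosphere_levy(amplified):,.2f}")
```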

Just as CCAR places limits on the leverage in derivatives portfolios by imposing capital requirements, digital regulators would insist that platforms constrain the algorithmic amplification of outrage and emotional resonance by limiting the propagation of viral content. Regulators would base their rules on a deep and shared understanding between the regulator and the regulated of how digital companies make money, just as banks simulate their complex chain of interlinked businesses into the future in a stress test.
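The sketch below illustrates, under a deliberately crude geometric model of resharing, how a cap on algorithmic amplification bounds projected reach in the way a leverage limit bounds a derivatives book. The cap value, the reshare model and the function are assumptions for the example, not a proposal for any actual platform.

```python
# A minimal sketch of "leverage limits" on virality: cap the reshare factor
# the ranking system is allowed to apply. The model is illustrative only.

from typing import Optional

def projected_reach(initial_audience: int, reshare_factor: float,
                    hops: int, amplification_cap: Optional[float] = None) -> int:
    """Estimate cumulative reach after `hops` rounds of resharing, optionally capped."""
    if amplification_cap is not None:
        reshare_factor = min(reshare_factor, amplification_cap)
    reach, audience = 0, initial_audience
    for _ in range(hops):
        reach += audience
        audience = int(audience * reshare_factor)
    return reach

# Uncapped, outrage-driven content compounds; a regulator-set cap bounds it.
print(projected_reach(1_000, 3.0, hops=6))                          # explosive growth
print(projected_reach(1_000, 3.0, hops=6, amplification_cap=1.2))   # constrained growth
```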

No one would call the current financial regulations perfect. Bright lines do not divide global, systemically important banks from other institutions. Fines and remedies vary inconsistently. Still, regulators and bankers can step back and acknowledge their common efforts to create a safer, sounder financial system. And it is a system—a collection of rivals and regulators working together with society’s interest in mind.

The parallels between financial networks and social networks are obviously inexact. Some digital firms will balk at rules that mandate they simulate their business and capitalise for externalities (though they have the computing power and financial resources to do the job). Yet just as the public deserves a banking system it can trust, why should it settle for less when it comes to sharing and accessing information, which is the basis of liberal democracy?

_________

R. Martin Chavez is a senior advisor to Sixth Street Partners and the former chief financial officer of Goldman Sachs, which he joined in the 1990s after earning a PhD in bioinformatics. He currently invests in and serves on boards of several technology startups. He is also president of the Board of Overseers of Harvard University.