Facebook Ignores Pleas on Fair Lending Rules, Unnerving Financial Services Execs

Published on Dec 10, 2021

Facebook parent Meta Platforms (FB) is refusing to guarantee that its advertising algorithms comply with U.S. rules on fair lending, leaving banks and other financial service providers worried that the social media giant could expose them to fines and hits to their reputations, executives said.

The federal rules forbid lenders to base their marketing on traits such as race, religion and national origin. Ads that target one racial or ethnic group and exclude another break the law, even if the targeting is done by a computer instead of a human.

Meta’s chief operating officer, Sheryl Sandberg, has said that protecting people from discrimination is one of the company’s “top priorities.” But financial services executives have told The Capitol Forum that the platform will neither certify that its ad systems comply with lending rules nor modify them to meet the requirements.

“Facebook has the power to change their algorithms to support financial inclusion and avoid discrimination related to access to credit, but has failed to make any such changes,” said Roger Hochschild, the chief executive of Discover Financial Services (DFS).

Hochschild expressed his frustrations in a personal letter to Sandberg in August 2020, but she never responded, according to three people familiar with the note.

A Meta spokesperson declined to comment on the letter but said the company was aware of Hochschild’s concerns. Meta has taken steps to prevent discrimination and has told advertisers “how our products work,” the spokesperson said. Beyond that, lenders must make their own decisions about Facebook advertising.

“Financial institutions regulated by fair lending rules are best suited to make their own compliance determinations,” the spokesperson said.

Advertising brings in most of Facebook’s revenue, which surged 35% from a year earlier to $29 billion in the third quarter.

Divided responses. Questions about Facebook’s advertising algorithms have left financial services companies divided over whether to market credit offers through the platform and its sister services, including the photo-sharing app Instagram.

JPMorgan Chase and American Express advertise credit cards on Facebook, according to a searchable list of Facebook ads. Citigroup, Wells Fargo and Bank of America do not.

An American Express (AXP) spokesperson said the company does what it can to ensure that its social media presence complies with the law. A Chase spokesperson said the bank does use Facebook on a limited basis to promote the benefits of customers’ existing accounts.

PNC Financial Services Group (PNC), the nation’s seventh-largest bank by assets, decided more than a year ago that advertising with Facebook was too risky.

“We made the decision to pause lending-specific advertising on Facebook and Instagram,” said a PNC spokesperson. “This pause continues today.”

PNC and some other leading banks that shun Facebook do advertise on Google. The world’s largest search platform is more open about its advertising algorithms and provides assurances about its fair lending safeguards, said several financial services officials.

Google declined to describe its algorithm in detail but said it strives to tailor its ad software to avoid bias.

“Our personalized advertising policies have prohibited advertisers from targeting users on the basis of sensitive categories,” said spokesperson Elijah Lawal at the Google unit of Alphabet (GOOG).

The difference between the algorithms deployed by Google and Facebook might lie in how finely tuned they are, said Peter Romer-Friedman, a lawyer who worked on a legal team that sued Facebook over housing discrimination in 2017.

“If Google’s predictive algorithms for deciding which ads to send users are more conservative than Facebook’s, then Google might avoid some of the problems Facebook has had with employment, housing and credit ads,” said Romer-Friedman, who works at law firm Gupta Wessler.

‘Black-box algorithms.’ Facebook’s reluctance to certify that its advertising systems avoid bias could draw renewed attention from U.S. regulators, who have long raised questions about studies showing how Facebook’s ad systems can be used to discriminate.

Rohit Chopra, the new head of the Consumer Financial Protection Bureau (CFPB), recently said that secret advertising algorithms must not be used to sidestep fair lending rules.

“I am very worried about black-box algorithms,” Chopra told a congressional hearing in October. “We need to make sure that firms cannot dodge fair lending laws and anti-discrimination laws under the guise of their secret algorithm.”

Lingering bias. U.S. lenders were forced to open their doors wide after the Equal Credit Opportunity Act (ECOA) of 1974 banned ads that might “discourage” prospective customers. The legislation expanded protections provided in the Fair Housing Act of 1968, which sought to stamp out racial discrimination in housing and targeted “redlining” practices that denied credit and insurance to minority neighborhoods seen as “hazardous” to investment.

But these protections came under pressure with the arrival of targeted online advertising, which allowed advertisers to exclude users from viewing housing and credit ads based on their race, sex, age and other characteristics. In 2016, civil-rights groups and housing advocates began filing lawsuits that accused Facebook of violating fair lending rules.

Three years later, Facebook agreed to pay $5 million to settle suits alleging that its ad platform had allowed for discrimination.

As part of the settlement in 2019, Facebook agreed to harden its controls on advertisers offering housing, employment and credit. Under the new policies, advertisers would no longer be able to target based on gender, age or zip code, which can serve as a proxy for race.

But Facebook ads can still show bias, said researchers and lending executives. Facebook targeting tools such as Lookalike Audiences and Special Ad Audiences allow advertisers to skirt anti-bias controls, according to researcher Jinyan Zang of the Public Interest Tech Lab at the Harvard Kennedy School, Harvard’s public policy school.

A lender who could no longer target or exclude Facebook groups for “African Americans” or “Hispanics” could instead focus on “interest groups” such as “African-American Culture” and “Hispanic American Culture,” according to Zang.

His conclusion: “Facebook still has a discrimination problem by race and ethnicity on its advertising platform,” he wrote in an article for the Brookings Institution. This poses “a significant threat to the public interest,” he wrote.
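To make the proxy problem concrete, here is a minimal sketch of the kind of audit Zang’s argument implies. The data, segment names and demographic labels below are entirely hypothetical illustrations, not anything drawn from Facebook’s actual systems: the idea is simply that if the demographic mix of an interest-based audience skews far from the overall audience baseline, that segment is effectively standing in for a protected characteristic.

```python
# Illustrative sketch only: hypothetical ad-delivery records, each noting the
# interest segment that was targeted and the demographic group of the recipient.
# We compare each segment's demographic mix against the overall baseline; a
# skew well above 1.0 suggests the segment is acting as a proxy for race or
# ethnicity, which is the pattern Zang describes.

from collections import Counter, defaultdict

# Hypothetical delivery log: (interest_segment, demographic_group)
deliveries = [
    ("African-American Culture", "Black"), ("African-American Culture", "Black"),
    ("African-American Culture", "White"), ("Hispanic American Culture", "Hispanic"),
    ("Hispanic American Culture", "Hispanic"), ("Hispanic American Culture", "White"),
    ("Golf", "White"), ("Golf", "White"), ("Golf", "Black"), ("Golf", "Hispanic"),
]

# Baseline demographic shares across all deliveries.
overall = Counter(group for _, group in deliveries)
total = sum(overall.values())

# Demographic counts within each interest segment.
by_segment = defaultdict(Counter)
for segment, group in deliveries:
    by_segment[segment][group] += 1

# Report how over- or under-represented each group is inside each segment.
for segment, counts in by_segment.items():
    seg_total = sum(counts.values())
    for group, n in counts.items():
        seg_share = n / seg_total
        base_share = overall[group] / total
        skew = seg_share / base_share  # >1.0 means over-represented in this segment
        print(f"{segment:28s} {group:9s} share={seg_share:.2f} "
              f"baseline={base_share:.2f} skew={skew:.2f}")
```

Run on this toy data, the culture-themed segments show large skews toward a single group while the neutral segment tracks the baseline, which is the proxy effect regulators and researchers worry about.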

Quietly stewing. Several bank executives told The Capitol Forum that they had enlisted the American Bankers Association (ABA) to raise fair lending issues with Facebook.

“It’s not surprising that [banks] want and expect their vendors and partners to do everything possible to help them meet their fair lending compliance obligations,” said an association spokesperson. “ABA supports those efforts.”

Financial services executives said they had been quietly stewing about Facebook’s reluctance to stand behind its online ad software when Hochschild asked Facebook’s Sandberg to intervene last year.

In his letter, Hochschild beseeched Sandberg to modify Facebook algorithms so that they would not inadvertently discriminate.

Facebook’s algorithms “may cause your financial institution clients to unwittingly violate important anti-discrimination laws such as the Equal Credit Opportunity Act,” the letter said, according to people who have seen or been briefed on it. Discover declined to make a copy of the letter available to The Capitol Forum.

CFPB actions. Several bank regulators police fair lending alongside the Justice Department. Two other agencies charged with enforcing the rules are the Federal Trade Commission and the CFPB.

In October, the CFPB faulted Trustmark National Bank, a Mississippi lender, for marketing to white borrowers in several southern states while doing relatively little to draw Black business.

Under the Trump administration, the CFPB sued a Chicago-based mortgage broker for using a radio show to discourage Black borrowers from applying for a home loan. The broker—Townstone Financial—has denied the charges and is fighting the bureau in court.

During the congressional hearing in October, Rep. Sean Casten (D-IL) asked Chopra whether lenders should be worried when “Facebook cannot share and has refused to share any information about whether the algorithms they use to boost their ad-tracking are in fact intentionally targeting certain racial groups.”

“Facebook, in your hypothetical, may be liable for that,” Chopra said.