Does your financial institution have a gender gap?

Fairness Through Unawareness: The Use of Sex as a Variable in Credit Underwriting

In a small number of countries (most notably the United States and South Africa), the use of socio-demographic information, including sex-disaggregated data ("SDD"), in credit decisioning is strictly prohibited. The exclusion of this information is justified by the belief that objective risk models built on neutral predictive variables, such as past repayment history, outstanding debt, gross annual income, and employment status, are likely to be fairer and more inclusive than models that use socio-demographic data. This approach is often called "fairness through unawareness."

In the United States, the Fair Housing Act (FHA) and the Equal Credit Opportunity Act (ECOA) are together known as the Federal Fair Lending laws. Both were viewed as extensions of the Civil Rights Act of 1964. In fact, President Lyndon Johnson used the tragic assassination of Martin Luther King, Jr. in April 1968 to push for the Fair Housing Act as a tribute to the slain civil rights leader, who had long advocated for fairer federal housing laws. The FHA of 1968 expanded the Civil Rights Act to prohibit discrimination in the sale, rental, and financing of housing based on race, color, religion, and national origin; sex was added as a protected class in 1974, and handicap and familial status in 1988. Yet even with fair "access" to residential mortgage credit nominally guaranteed, many women still could not obtain credit owing to discriminatory policies practiced by lenders.

Protections against sex discrimination were extended beyond housing with the enactment of ECOA in 1974, which prohibited discrimination in any credit transaction on the basis of sex or marital status. The law was passed largely in response to widespread discrimination against women across credit markets, and especially continued discrimination in residential mortgage lending. A common practice among US mortgage lenders at the time was to discount a married woman's income, particularly if she was of child-bearing age. Single women suffered discrimination as well: groups lobbying in support of ECOA cited data showing single women being rejected for mortgage loans at a much higher rate than comparably qualified single men. In 1976, ECOA was amended to also prohibit discrimination in lending on the basis of race, color, religion, national origin, age, receipt of public assistance, or the good-faith exercise of rights under the Consumer Credit Protection Act. But overwhelmingly the impetus was discrimination against women in lending markets.

These laws were passed during a period of great social unrest in the United States, driven by an increasingly unpopular war in Vietnam (which disproportionately harmed lower-income people and members of minority groups) and by generations of injustice and discrimination suffered by African Americans and women. While progress in lending to women in particular, and to members of minority communities, is evident in the US since the enactment of the Federal Fair Lending laws, some argue that institutional discrimination within the US financial system remains a major barrier to affordable mainstream credit.

These policies worked very well in housing markets. By 1981 single women owned more homes in the US than single men. This trend continues, as a recent study by LendingTree found that single women own 2.64 million more homes in the US than do single men.

Are Women Better Credit Risks than Men in the U.S.?

The conventional wisdom in the US (and globally) is that women are, on average, less risky than men. In credit markets, this would mean women are less likely to default.

A U.S. Federal Reserve staff report (called a "note") used detailed credit report data enriched with sociodemographic information. Although it could not account for every relevant characteristic, it did control for age, educational attainment, race, and income. After adjusting for these variables, the study found that single females generally had higher installment loan balances, greater revolving credit utilization rates, and a higher incidence of delinquency and bankruptcy histories than comparable single males. These differences in debt usage and credit history result in single female consumers having, on average, lower credit scores than their single male counterparts.
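For readers curious what "adjusting for these variables" means in practice: studies of this kind typically regress the outcome on a group indicator plus the control variables, and read the coefficient on the indicator as the adjusted gap. Below is a minimal sketch in Python on simulated data; the variable names and every number are invented for illustration and do not reflect the Fed's actual data or methodology.

```python
# Hypothetical sketch: estimate a credit-score gap "controlling for"
# covariates via ordinary least squares on simulated (invented) data.
import numpy as np

rng = np.random.default_rng(1)
n = 50_000
female = rng.integers(0, 2, n).astype(float)   # group indicator
age = rng.uniform(21, 70, n)                   # control variable
income = rng.normal(55_000, 15_000, n)         # control variable

# Simulated score: depends on the controls plus a built-in -8 point gap.
score = (600 + 1.2 * age + 0.0006 * income
         - 8.0 * female + rng.normal(0, 25, n))

# Design matrix: intercept, group indicator, then the controls.
X = np.column_stack([np.ones(n), female, age, income])
beta, *_ = np.linalg.lstsq(X, score, rcond=None)

# The coefficient on `female` is the gap net of the controls (~ -8).
print(f"adjusted gap (female vs male): {beta[1]:.1f} points")
```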

Another, more recent study by Federal Reserve Board staffers examined over 1 million single male and female mortgage borrowers over a 10-year period. It found that female borrowers had an average bankcard credit limit nearly 6 percentage points below that of comparable male borrowers. The gap narrowed during the Great Recession (2008-2010) but widened thereafter and continued to grow through 2017. Absent sufficient data, the staff study was careful not to offer causal explanations as to why women with similar credit risk profiles (based on credit bureau data from Equifax) had lower bankcard credit limits than male borrowers.

While it would be tempting to read this as evidence of discrimination, it is likely not so straightforward. For instance, the same study notes that women carried lower bankcard balances and held more bankcards on average than male borrowers. And while bankcard issuers undeniably care about credit risk, they care about other things too, such as projected margins and lifetime expected value. A borrower who is occasionally late and revolves a larger average balance, but who still pays down the debt, is far more profitable for a card issuer than a person who pays their balance in full each month, and proportionally more profitable than a person who carries a lower balance.

Put differently, single male cardholders in the US may be offered higher credit limits on average precisely because they are riskier: they are more likely to pay late (generating fee revenue) and more likely to carry larger balances (generating more interest income). A simple illustration follows.
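To make that intuition concrete, here is a minimal sketch comparing the expected annual value of two hypothetical cardholder profiles. All of the economics (the APR, fee, default, and loss figures) are invented for illustration and come from neither the study nor any issuer.

```python
# Hypothetical sketch: expected annual profit of two cardholder profiles
# under made-up economics. All numbers are invented for illustration.

def expected_annual_profit(avg_balance, apr, late_months_per_year,
                           late_fee, p_default, loss_given_default):
    interest = avg_balance * apr             # income on revolved balances
    fees = late_months_per_year * late_fee   # late-fee income
    expected_loss = p_default * loss_given_default
    return interest + fees - expected_loss

# "Transactor": pays in full each month, never late, near-zero risk.
transactor = expected_annual_profit(
    avg_balance=0, apr=0.24, late_months_per_year=0,
    late_fee=35.0, p_default=0.005, loss_given_default=2_000)

# "Revolver": carries a balance, occasionally late, but repays the debt.
revolver = expected_annual_profit(
    avg_balance=3_000, apr=0.24, late_months_per_year=2,
    late_fee=35.0, p_default=0.03, loss_given_default=3_000)

print(f"transactor: ${transactor:,.0f} / year")  # ~ -$10
print(f"revolver:   ${revolver:,.0f} / year")    # ~ $700
```

Under assumptions like these, offering a higher limit to the occasionally late revolver is a rational profit decision rather than necessarily evidence of favoritism.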

Is Discrimination Against Women in Credit Markets Alive and Well in The US?

In 2019, Apple rolled out a credit card promising to bring the transparency and simplicity of Apple products to financial services. Early on, however, the card (issued by Goldman Sachs bank) was marred by controversy. Software developer David Heinemeier Hansson (who has a large online following) posted on Twitter (now "X") that he had received 20x the credit limit of his wife, despite the couple filing taxes jointly, having been married a long time, and living in a community property state. Shortly thereafter, Apple co-founder Steve Wozniak posted that his Apple Card credit limit was 10x his wife's.

Mainstream and social media alike were abuzz with stories alleging discrimination and decrying the bias of algorithms. However, Goldman was cleared of wrongdoing. An investigation by the New York Department of Financial Services concluded that Goldman did not use sex to discriminate, and instead made credit limit decisions based upon variables including credit bureau report data and income. The regulator stated that Goldman did not consider prohibited characteristics and, after analyzing impacts on a sample of 400,000 persons, concluded that the algorithm would not lead to disparate impacts. The Department went on to point out that "…the idea that spouses with shared finances would receive the same credit terms was a common misconception. Normal guidelines like credit history or unpaid debt were what determined whether or not a spouse received a higher limit."

While the NY Department of Financial Services was satisfied with Goldman Sachs' defense of its algorithm, there are reasons to remain concerned. Algorithms that do not use sex as a variable can still discriminate by sex even when they are programmed to be "blind" to it. A gender-blind algorithm can be biased to the extent that it uses other variables that are correlated with gender; where a person shops, for instance, may be highly correlated with gender. A second, related reason for concern is that imposing blindness to a variable makes it harder to detect bias on that very same variable. Companies using algorithms for credit decisioning must collect protected attributes like sex and race in order to measure the performance of their algorithms and guard against bias and discrimination. Both points are illustrated in the sketch below.
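Here is a minimal simulation in Python of both points. The "shop_type" feature and all numbers are invented: the scoring rule never sees sex, yet it approves men and women at different rates because one of its inputs is correlated with sex, and the disparity can only be measured because the simulation retains the protected attribute.

```python
# Hypothetical illustration of "proxy" bias on simulated data: a score
# that never sees `sex` can still produce different approval rates by
# sex when one of its inputs is correlated with sex.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Protected attribute (never given to the scoring function).
sex = rng.integers(0, 2, n)  # 0 = male, 1 = female

# `income` is a legitimate variable; `shop_type` is a proxy that is
# strongly correlated with sex but unrelated to repayment behavior.
income = rng.normal(50_000, 12_000, n)
shop_type = (rng.random(n) < np.where(sex == 1, 0.8, 0.2)).astype(float)

# A "gender-blind" score: uses only income and the proxy.
score = 0.00004 * income - 0.5 * shop_type
approved = score > np.quantile(score, 0.5)  # approve the top half

# Detecting the disparity REQUIRES the protected attribute we excluded.
for s, label in [(0, "male"), (1, "female")]:
    print(f"approval rate, {label}: {approved[sex == s].mean():.1%}")
```

Note that the final loop, which quantifies the disparity, cannot be run at all unless the protected attribute was collected in the first place.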

Ultimately, "fairness through unawareness" as an approach has been largely successful in stamping out gender discrimination in US credit markets. As they say: "We've come a long way, baby."