Social media and the sharing economy have created new opportunities by leveraging online networks to build trust and remove marketplace barriers. But a growing body of research suggests that old gender and racial biases persist, from men’s greater popularity on Twitter to African Americans’ lower acceptance rates on Airbnb.

Now, using the photo-sharing site Instagram as a test case, Columbia researchers demonstrate how two common recommendation algorithms amplify a network effect known as homophily, in which similar or like-minded people cluster together. They further show how algorithms turned loose on a network with homophily effectively make women less visible: the women in their dataset, whose photos were slightly less likely to be ‘liked’ or commented on, became even less popular once recommendation algorithms were introduced.

By working out the math of how this happens, the researchers hope to pave the way for algorithms that correct for homophily. Their work will be presented April 25 at the Web Conference in Lyon.

“We are simply showing how certain algorithms pick up patterns in the data,” said the study’s lead author Ana-Andreea Stoica, a graduate student at Columbia Engineering. “This becomes a problem when information spreading through the network is a job ad or other opportunity. Algorithms may put women at an even greater disadvantage.”

The researchers scraped their data from Instagram in 2014, after Facebook bought the company but before automated prompts made it easier to connect with friends-of-friends. Though women outnumbered men in their sample of 550,000 Instagram users (54 percent to 46 percent), the researchers found that men’s photos tended to be better received: 52 percent of men received at least 10 ‘likes’ or comments compared to 48 percent of women.

As expected, homophily played a role. The researchers found that men were 1.2 times more likely to ‘like’ or comment on other men’s photos than on women’s, while women were just 1.1 times more likely to engage with other women’s photos than with men’s.
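
As a rough illustration of how such within- versus cross-gender engagement ratios might be computed (a hypothetical sketch, not the paper’s method; the event records, field layout, and group sizes below are invented), one can tally each ‘like’ or comment by the genders of the two accounts involved and compare per-capita rates:

```python
from collections import Counter

# Hypothetical engagement log: (actor_gender, target_gender) per 'like' or comment.
# In the study these events came from the scraped 2014 Instagram data; the records
# and group sizes below are made up purely for illustration.
engagements = [
    ("M", "M"), ("M", "F"), ("F", "F"), ("F", "M"), ("M", "M"), ("F", "F"),
]
group_size = {"M": 46, "F": 54}  # assumed number of men and women in the sample

counts = Counter(engagements)

def homophily_ratio(gender: str) -> float:
    """Per-capita rate of engaging with one's own gender divided by the
    per-capita rate of engaging with the other gender."""
    other = "F" if gender == "M" else "M"
    same_rate = counts[(gender, gender)] / group_size[gender]
    cross_rate = counts[(gender, other)] / group_size[other]
    return same_rate / cross_rate if cross_rate else float("inf")

# On the real data the study reports roughly 1.2 for men and 1.1 for women.
print("men  :", round(homophily_ratio("M"), 2))
print("women:", round(homophily_ratio("F"), 2))
```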

When they applied two widely used recommendation algorithms — Adamic-Adar and Random Walk (friends-of-friends) — the researchers found that the percentage of women connected to, or predicted to be recommended to, at least 10 other Instagram users fell from 48 percent in the original dataset to 36 percent and 30 percent, respectively. As predicted by a series of mathematical proofs in the paper, the disparity was greatest among Instagram’s super-influencers — people like Instagram CEO Kevin Systrom, whose popular posts and 1.5 million followers put him in the top tenth of one percent for engagement.
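
For readers unfamiliar with the two link-prediction methods, here is a minimal sketch of the textbook Adamic-Adar score and a simple friends-of-friends (two-hop) count on a toy undirected graph; the graph and user names are invented, and this generic version is not the paper’s implementation:

```python
import math
from itertools import combinations

# Toy undirected follower graph; names are invented for illustration only.
graph = {
    "alice": {"bob", "carol", "dana"},
    "bob":   {"alice", "carol"},
    "carol": {"alice", "bob", "dana", "erin"},
    "dana":  {"alice", "carol"},
    "erin":  {"carol"},
}

def adamic_adar(u: str, v: str) -> float:
    """Textbook Adamic-Adar score: each shared neighbor contributes 1 / log(degree),
    so shared contacts who themselves have few connections count for more."""
    return sum(1.0 / math.log(len(graph[w])) for w in graph[u] & graph[v])

def friends_of_friends(u: str, v: str) -> int:
    """Simple friends-of-friends signal: the number of two-hop paths from u to v,
    a rough stand-in for a short random-walk score."""
    return sum(1 for w in graph[u] if v in graph[w])

# Score every pair of users that is not already connected.
for u, v in combinations(graph, 2):
    if v not in graph[u]:
        print(f"{u}-{v}: adamic_adar={adamic_adar(u, v):.2f}, fof={friends_of_friends(u, v)}")
```

Because both scores are built entirely from existing ties, a network in which same-gender ties are slightly over-represented will tend to surface same-gender recommendations, which is the amplification the researchers describe.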

When the algorithms were turned loose on this exclusive network of ultra-engaging individuals, women’s visibility plunged. Though women in the top 0.1 percent for engagement (those with at least 320 connections) outnumbered men 54 percent to 46 percent, the men were far more likely to be suggested to new users and to expand their networks rapidly. Under Adamic-Adar, just 26 percent of women in that top tier were likely to be recommended at least 23 times; under Random Walk, just 28 percent were likely to be recommended at least 12 times, the researchers found.

“Algorithms pick up subtle patterns and amplify them,” said the study’s senior author, Augustin Chaintreau, a computer scientist at Columbia Engineering and a member of Columbia’s Data Science Institute. “We’re not asking that algorithms be blind to the data, just that they correct their own tendency to magnify the bias already there.”

The study is the latest to show that recommendation algorithms, in addition to filtering content, may influence the long-term structure of a social network. “It’s remarkable that a simple assumption of homophily leads algorithms to amplify disparities in social status,” said Amit Sharma, a researcher at Microsoft Research India who was not involved in the study but recently spoke at Columbia about his own work exploring recommendation engines and social influence.

Algorithmic interventions that balance convenience with ethical goals may be one way to address the problem, he added. “Through studies like this, we’re learning that the practice of optimizing a single metric exclusively, for example, number of new friends added, is not the right way. Unfortunately, the alternative is unclear. We are still scratching the surface of understanding how algorithms affect long-term human behavior.”
