EASTON, Pa. (WLVT) — Search engine results are not as neutral or objective as some may think.
For instance, a Google image search for "cute baby" returns mostly pictures of white babies.
"Search the term genius and you’ll see mostly white males pop up," Cindy Casey, who heads up the computer science program at Gwynedd Mercy University, told PBS39. "Search basketball player. You’ll see mostly African Americans."
Search engines, she says, are not always neutral.
"The data used to train the machine, the algorithm, could be biased," explained Casey. "For example, if I train a machine to recognize a banana, maybe I train it to recognize multiple sizes of bananas. That’s great. But, if I exclude red bananas, the machine won’t recognize them."
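Casey's banana example can be illustrated with a toy sketch. This is a hypothetical simplification, not real image-recognition code: the "detector" simply remembers the colors it was trained on, so anything excluded from training is never recognized.

```python
# Toy sketch of biased training data (hypothetical example):
# a detector trained only on yellow, green, and brown bananas
# will never recognize a red one.
def train_banana_detector(training_colors):
    """Learn the set of colors the machine will accept as 'banana'."""
    return set(training_colors)

def is_banana(detector, color):
    """The detector only recognizes colors it saw during training."""
    return color in detector

# Training data excludes red bananas, mirroring the bias Casey describes.
detector = train_banana_detector(["yellow", "green", "brown"])

print(is_banana(detector, "yellow"))  # True: seen in training
print(is_banana(detector, "red"))     # False: excluded from training
```

The machine is not "wrong" by its own logic; it faithfully reflects the gaps in the data it was given.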
Casey says the issue stems from a lack of diversity in the tech industry. She pointed out a recent issue with some restroom technology.
"There was an automatic hand soap dispenser that was found to only dispense soap to Caucasian hands. Why? The developers only used white hands to test and train the machine," said Casey.
Jenn Rossmann co-directs the Hanson Center for Inclusive STEM Education at Lafayette College. She agreed that software development becomes subjective when the developers all look the same.
"If you search for beautiful, you’ll see mostly white women," Rossmann told PBS39. "The people who wrote the code had to teach the machine what beauty looked like. They created metrics for beauty and fed the machine data sets about beauty."
Some of the bias comes from users, too.
"The algorithms learn from what we type into the search bar and what we click on," said Rossmann.
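The feedback loop Rossmann describes can be sketched in a few lines. This is a hypothetical simplification of how click data can reinforce a ranking, not an actual search engine's algorithm:

```python
# Toy sketch (hypothetical) of a click-driven ranking feedback loop:
# results that users click rise to the top, reinforcing existing patterns.
from collections import Counter

clicks = Counter()

def record_click(result):
    """Log a user's click on a search result."""
    clicks[result] += 1

def rank(results):
    """More-clicked results float to the top of the ranking."""
    return sorted(results, key=lambda r: -clicks[r])

results = ["image_a", "image_b", "image_c"]
for _ in range(3):
    record_click("image_b")  # users repeatedly click the same image

print(rank(results))  # image_b now ranks first
```

Once one result climbs the ranking, it attracts more clicks, which pushes it higher still: the algorithm amplifies whatever preferences users already show.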
Casey believes more inclusive tech will be developed over time.
"This is really combated by getting more minorities and women into computer science," said Casey.
Rossmann suggests users reach out to tech companies and call for change.
"The biggest mistake we’re making is entrusting machines to make decisions that ought to have some diverse human judgment applied to them," said Rossmann.