Since the creation of the camera, photography has been technologically optimized to capture white people best. Engineers at Google are trying to change that.
On Tuesday at its developer conference, Google I/O, the company announced that it is reworking the algorithms and the training data behind the Pixel camera so that it captures people of color more accurately and brilliantly.
Specifically, it is working to better light people with darker skin and to represent skin tone more accurately; silhouettes of people with wavy or curly hair will also stand out more sharply from the background.
Photography plays an important role in shaping how people see you and how you see yourself. That’s why we’ve been working with industry experts like @thekirakelly, @micaiah_carter & @deunivory to build a more accurate and equitable camera for people of color. #ImageEquity pic.twitter.com/Vz6z9Gox6k
— Google (@Google) May 18, 2021
Google isn’t the only company having a technological reckoning with racial bias. Just last month, Snap announced it was reworking its camera software to better represent people of color.
Google is calling its project “Image Equity.” Like Snap, the company worked with outside experts in photography and representation to guide the undertaking.
Some of the changes will involve training the algorithms that render the photos on a more diverse dataset, so white people and white skin aren’t the default definition of “person.” Google will also be tweaking the Pixel’s auto white-balance and auto-exposure capabilities to better optimize for people with darker skin.
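To make the white-balance idea concrete, here is a minimal sketch of a classic "gray-world" white-balance heuristic. This is a generic textbook illustration only: Google has not published the Pixel's actual algorithm, and the function name `gray_world_white_balance` is hypothetical.

```python
import numpy as np

def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """Scale each RGB channel so its mean matches the overall mean.

    A classic gray-world heuristic; real camera pipelines such as
    the Pixel's are far more sophisticated than this sketch.
    """
    # Per-channel means over all pixels (image is H x W x 3, floats in [0, 1]).
    channel_means = image.reshape(-1, 3).mean(axis=0)
    # Target: the average brightness across all three channels.
    gray = channel_means.mean()
    # Gains that pull each channel's mean toward the gray target.
    gains = gray / channel_means
    return np.clip(image * gains, 0.0, 1.0)

# A tiny image with a warm (red-heavy) color cast.
img = np.full((2, 2, 3), [0.6, 0.4, 0.2])
balanced = gray_world_white_balance(img)
# After balancing, the three channel means are equal.
print(balanced.reshape(-1, 3).mean(axis=0))
```

The heuristic assumes the scene averages out to gray, which is exactly the kind of assumption that can misjudge exposure and tone for darker skin; tuning what the pipeline treats as "correct" is the sort of change the article describes.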
Source: Mashable