Is this the most racist object you’re carrying today? Have you ever considered
your smartphone to be a racist object?
The Black Lives Matter movement is gaining global prominence, and the smartphone is becoming an important evidence-gathering tool. But is that same device actually guilty of racial bias? Since technology is designed by people, and people have plenty of prejudices, known or otherwise, biases find their way into technology design.
Back in the 1950s, Kodak developed something called a Shirley card. It was a picture of a beautiful white woman, and the card was used in film processing to get skin tones right. There was a big problem, however: although various women played the role of Shirley, they were always white. This meant that photographs featuring darker-skinned people became less and less distinct the darker their skin was. People of colour just weren't considered in film technology.
This was the case until the 1970s, when something prompted the film manufacturers to consider broadening their range of colour tones: colour was becoming increasingly important in marketing. Photographs in colour were common and colour TVs were appearing in every home, but furniture and chocolate makers were not happy with their marketing material. The pictures of their dark brown products lacked detail, because film processing was fixed on showing white skin tones at their best. Colour management in film needed to change to meet marketers' needs.
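The mechanism behind those flat, detail-poor pictures of dark products can be sketched with a toy tone curve. The numbers below are purely illustrative, not drawn from any real film stock: a response curve tuned so that its steepest, most detail-rich region sits around light-skin reflectance leaves darker subjects on the flat "toe" of the curve, where their tonal detail collapses into a handful of values.

```python
import math

# Toy model of a film/sensor tone curve calibrated around light skin
# (the "Shirley card" assumption). All numbers are illustrative.

def tone_curve(reflectance, mid=0.4, slope=8):
    """S-shaped response; steepest (most tonal separation) near `mid`.
    mid=0.4 approximates light-skin reflectance, mimicking Shirley-card tuning."""
    return 255 / (1 + math.exp(-slope * (reflectance - mid)))

def output_range(lo, hi):
    """How many 8-bit code values span a subject's reflectance range."""
    return round(tone_curve(hi) - tone_curve(lo))

# Illustrative reflectance ranges for the fine detail on each subject.
light_skin = output_range(0.30, 0.50)
dark_skin  = output_range(0.05, 0.15)

print(light_skin)  # → 97: detail spread across ~a hundred code values
print(dark_skin)   # → 16: the same detail squeezed into a few values
```

In this toy model the dark subject is left with roughly a fifth of the tonal separation given to the light one, and that is the squeeze that flattened chocolate, furniture and dark skin alike.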
These woes of chocolate and furniture manufacturers had the unintended side effect of
helping to create films that could better portray black people. That unplanned side
effect didn’t go unnoticed by the film manufacturers. They realised that darker skin tones
would also benefit. However, in order not to offend white customers, the improvements in the handling of dark skin tones were presented as the film now being good enough to photograph the details of a dark horse in low light.
There were, however, some glimmers of hope. Shirley cards did eventually portray more racial and ethnic diversity. They were still always glamorous women, but different skin tones and races began to be included. In the 1990s, Philips designed a TV camera chip system that could capture different skin tones with more elegance. Stars like Oprah Winfrey were on the rise, and the limitations of photographic, TV and film technology in capturing the beauty of diversity were being recognised.
However, for the most part people of colour remained an afterthought in film technology. Generations of photographers and filmmakers have been trained to think "white first", and to treat dark skin tonality as a problem to be fixed rather than a beauty to be explored.
Sadly, the digital age has not resolved the racial bias in the technology of photography. In 2009, Hewlett-Packard's webcam face-tracking technology failed to work with dark-skinned people. In 2015, Google's facial recognition algorithms categorised black people as gorillas. And studies continue to show that biases in the algorithms of globally used facial recognition technology lead to the misidentification of women and of black and Asian people. To be reliably recognised, you need to be a white male.
The Continuing Struggle
The filmmaking industry continues to struggle with getting dark skin tones right. Ava Berkofsky, cinematographer on HBO's show Insecure, had to use a variety of techniques to bring the cool beauty of the black cast onto the screen. This is because, in the implementation of the technology, people of colour remained an afterthought: light skin tones are the expected norm. Now, as the world learns how to live in the shadow of Coronavirus, video telecommunication technology has become vital to many businesses and institutions. Zoom, Google Hangouts and Microsoft Teams are all affected by camera technology that carries a legacy of "whites first".
Consider job interviews: you put on your sharpest clothes and switch on the webcam for your remote assessment. But because the camera design is focused on getting light skin tones right, if you are black, many details will be lost in the image the interviewer sees. How many remote job interviews have been lost because black candidates simply didn't look as clear on camera as white ones, because the technology treats them as second-class citizens?
The Rise of Artificial Intelligence
We’re now entering the era of Artificial Intelligence enhanced photography and filmmaking. On the latest smartphones, when you take a photograph or make a video, the device makes thousands of decisions about the light, shadow and even the beauty of the image. But the algorithms of AI, like the Shirley cards of the last century, are designed by people. AI is rapidly moving into self-learning, but these are relatively early days for AI in photography, and there is a great risk that people of colour will still be mistreated as second-class citizens in the processing of images. If a bias is programmed by human operators into the algorithm that starts the AI process, then as the technology learns, it will build upon the human bias it started with.
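That feedback loop can be illustrated with a deliberately tiny sketch. Everything here is hypothetical — a "face detector" reduced to a single brightness average, seed data skewed toward light faces — but the dynamic is the point: a self-learning system retrained on its own detections drifts further toward the group it already favours.

```python
import statistics

# Toy self-training loop: a "detector" accepts a face if its brightness is
# near the mean of its training examples. Hypothetical numbers; real systems
# are far more complex, but the feedback dynamic is similar.

def detects(face_brightness, training_set, tolerance=60):
    return abs(face_brightness - statistics.mean(training_set)) <= tolerance

# Seed data skewed toward light skin: the human-introduced bias.
training = [180, 175, 185, 190, 170, 60]

# A mixed population of new faces (darker and lighter).
population = [55, 60, 65, 170, 175, 180, 185, 190]

for _ in range(3):
    # Self-learning: whatever the model detects is added to its training set.
    detected = [f for f in population if detects(f, training)]
    training += detected

final_mean = statistics.mean(training)
print(round(final_mean))  # → 174: drifted up from the seed mean of 160
print([f for f in population if detects(f, training)])  # → only the light faces
```

With every round the training mean drifts further toward the light faces, so the darker faces the system failed to detect at the start become even less detectable later: human seed bias, amplified by the machine.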
Will the 21st century continue to embrace the prejudices of the 20th in technical design? Or is now the right time to challenge the technology companies creating smartphone cameras, along with movie and TV hardware and software, to demand that Black lives do matter and are not a secondary choice or an afterthought in the design and implementation process?
Further reading for those interested in knowing more:
Looking at Shirley, the Ultimate Norm: Colour Balance, Image Technologies, and Cognitive Equity by Lorna Roth
The Racial Bias Built Into Photography by Professor Sarah Lewis