Founder and CEO, uCodeGirl | Bush Foundation Leadership Fellow | PhD Candidate | Lecturer in Computer Science | YWCA Woman of the Year in Science and Technology
I will never forget a moment in the fancy bathroom of a conference center in the upper Midwest. It was a bit embarrassing, actually. I was in a frantic and visible struggle with a faucet that conspired with the water to deny me a simple hand-washing ritual. Worse, I already had soap on my hands. “Why can’t I get this automatic faucet to dispense water?” I tried every which way, for well over ten seconds. It didn’t look like rocket science. Simple algorithm: put your hands under the faucet, the faucet dispenses water, wash your hands, done. Every other faucet in the bathroom did the same thing to me while my colleagues took care of business without a hitch. “Why can’t I?” It was puzzling.

Little did I know the automatic water dispenser was actually racist. Well, sort of. The technical explanation is that the faucet’s sensor works by measuring the infrared (IR) light reflected from an object in its field of view (from “How Do Touchless Faucets Work?” | Hunker). When that object is a darker skin tone, more of the light is absorbed rather than bounced back. If only I had waved a white paper towel at the sensor, I would have been spared from wiping off the soap with it instead.

Later, I came to realize this was not an isolated incident. According to American Standard’s installation instructions, a faucet’s sensing range can be adjusted during installation, typically from 2 to 10 inches, with a default of 6 inches. The installation and calibration of that particular bathroom’s sensors didn’t account for me. I wasn’t the problem.
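The faucet’s failure mode can be sketched in a few lines. This is a hypothetical, simplified model, not any real sensor’s firmware; the trigger logic, threshold, and reflectance numbers are all illustrative assumptions. The sensor emits IR light and dispenses water only when enough of it bounces back:

```python
# Minimal sketch of a threshold-based IR proximity trigger, the kind of
# logic a touchless faucet might use. All numbers are made up for
# illustration, not taken from a real product.

def reflected_intensity(emitted: float, reflectance: float) -> float:
    """IR light bouncing back to the sensor; darker surfaces reflect less."""
    return emitted * reflectance

def faucet_triggers(emitted: float, reflectance: float, threshold: float) -> bool:
    """Dispense water only if enough IR light returns to the sensor."""
    return reflected_intensity(emitted, reflectance) >= threshold

# Two hands at the same distance; only reflectance differs.
print(faucet_triggers(emitted=100.0, reflectance=0.60, threshold=40.0))  # True
print(faucet_triggers(emitted=100.0, reflectance=0.30, threshold=40.0))  # False
```

The point of the sketch is that nothing in the code mentions skin tone at all; the bias enters through a single calibration constant chosen during design and installation.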
Wearable tech is another space where the problem of sensing across skin tones has yet to be resolved. When first released to market, smartwatches and fitness trackers from Fitbit to the Apple Watch produced inaccurate heart-rate readings for people of color. An article by STAT Health Tech found that several of these brands relied solely on optical sensors that use beams of green light, which has a shorter wavelength (simpler and cheaper to sense) but is more readily absorbed by melanin, the natural skin pigment more prevalent in people with darker skin. Mounting complaints from people with tattoos have forced some manufacturers to do more research and augment the optical reading with other methods. It begs the deeper question, though: who is in the room when these products are researched, designed, prototyped, developed, and tested on humans?
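The wavelength trade-off can be illustrated with a toy Beer-Lambert-style attenuation model. The absorption coefficients below are invented for illustration, not physiological measurements; the only assumption carried over from the text is that melanin absorbs shorter (green) wavelengths more strongly than longer ones:

```python
import math

# Toy Beer-Lambert-style attenuation: light passing through tissue is
# attenuated exponentially by absorbers such as melanin. Coefficients
# are hypothetical, chosen only to show the direction of the effect.

def returned_signal(incident: float, absorption_coeff: float, melanin: float) -> float:
    """Fraction of light surviving a pass through skin at a given melanin level."""
    return incident * math.exp(-absorption_coeff * melanin)

# Green light gets a larger (hypothetical) absorption coefficient than red.
green = returned_signal(incident=1.0, absorption_coeff=3.0, melanin=1.0)
red = returned_signal(incident=1.0, absorption_coeff=1.0, melanin=1.0)
print(f"green: {green:.3f}, red: {red:.3f}")
```

A weaker returned signal means a noisier pulse reading, which is why augmenting green-light sensors with longer wavelengths or other methods helps.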
Computing technologies are designed to solve real-world problems. Computer code at the machine level is binary, zeros and ones, neutral and value-free. But once we start writing instructions to the computer and wrapping meaning around them, the result, unfortunately, is no longer socially neutral, as is reflected in sensor design, in algorithm development, and in user interface design.
I am often the invisible dark one in group photos, as the autofocus and exposure features of phone cameras adjust for lighter pigmentation. My own computer login screen doesn’t recognize me half the time. OK, I will admit I change my hairstyle every other day, but never mind: the software is called facial recognition, not fashion tracker. In response to Google Photos misidentifying and mislabeling images of Black people in 2015, Google’s Chief Architect for Social, Yonatan Zunger, said the failure was due to the way machines recognize faces, or more specifically, the way algorithms learn. He elaborated that, in this case, the algorithm was fooled by some aspect of an image’s pattern, and assured users that the algorithm would learn from their feedback. Could this have been prevented by a leadership decision to delay product release until more research and more testing with diverse data sets had been done? How could ethics in software development be addressed? Recently, Microsoft joined Amazon and IBM in pausing sales of facial recognition A.I. software to U.S. police departments until there are laws in place, grounded in human rights, governing the use of such technology. The cost of inaccurate recognition, coupled with the lack of regulation, has had dire consequences.
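How an algorithm “learns” a bias from unrepresentative data can be shown with a deliberately tiny sketch. Everything here is invented for illustration: a one-feature “detector” fits a threshold on training data dominated by one group, then performs well on that group and poorly on the other.

```python
import random

random.seed(42)

# Toy sketch of biased learning. The single feature, the group
# distributions, and the 90/10 training split are all hypothetical.

def gauss_samples(mu: float, n: int) -> list[float]:
    return [random.gauss(mu, 1.0) for _ in range(n)]

# Training faces: 90% group A (feature ~ N(5,1)), 10% group B (~ N(2,1)).
train_faces = gauss_samples(5.0, 90) + gauss_samples(2.0, 10)
train_nonfaces = gauss_samples(0.0, 100)

# "Learned" model: the midpoint between the two class means.
face_mean = sum(train_faces) / len(train_faces)
nonface_mean = sum(train_nonfaces) / len(train_nonfaces)
threshold = (face_mean + nonface_mean) / 2

def accuracy(faces: list[float]) -> float:
    """Fraction of faces the threshold model correctly detects."""
    return sum(1 for x in faces if x > threshold) / len(faces)

acc_a = accuracy(gauss_samples(5.0, 1000))  # well-represented group
acc_b = accuracy(gauss_samples(2.0, 1000))  # underrepresented group
print(f"group A: {acc_a:.0%}, group B: {acc_b:.0%}")
```

The model is not told which group a face belongs to; it simply fits whatever the training data over-represents, which is why testing with diverse data sets before release matters.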
A 2019 research paper from Georgia Tech, entitled “Predictive Inequity in Object Detection,” suggested that pedestrians with darker skin may be more likely to be hit by self-driving cars than those with lighter skin. Fatal consequences. “Companies don’t want the public to know about any issues of inaccuracy, so consumers need to learn to ask a lot of questions,” said Jamie Morgenstern, School of Computer Science (SCS) assistant professor and the study’s lead author.
Millions of “tiny” things like these exist in our world, and together they demonstrate a significant usability gap for a diverse spectrum of people. Diverse representation in the technology industry and in leadership roles, user research, and Value Sensitive Design (VSD) all contribute to the case, economic and otherwise, for the diversity of thought needed to create inclusive and just environments and institutions. At uCodeGirl, we are dedicated to seeing a world where the creators and builders of technology mirror the people and societies for which they create and build.
“It is not faith in technology. It is faith in people, that they’re basically good and smart, and if you give them tools and opportunities, they will do wonderful things with them.” Steve Jobs, Co-founder of Apple.
About uCodeGirl
The vision of uCodeGirl is to inspire and equip young women to become the future face of innovation in technology. uCodeGirl is uniquely designed to inspire, engage and equip young women with computational design thinking skills, leadership traits, and an entrepreneurial mindset. uCodeGirl strives to remove roadblocks and bridge the gender gap in technology so that young women can confidently pursue opportunities suitable for the 21st century. By building confidence, enhancing skill sets and tapping into their intellect and curiosity, uCodeGirl helps young women chart a pathway to the T of STEM careers. More information: www.ucodegirl.org | Twitter: @ucodegirl | Facebook: /ucodegirl