Meet the Disruptive Women Powering Our Autonomous Future

Source: Pexels, Trace Hudson

Women’s History Month makes space for summits like Disruptive Women Powering Our Autonomous Future. On March 25, Velodyne Lidar, a company known for its lidar sensors for autonomous vehicles, and the Association for Unmanned Vehicle Systems International (AUVSI), a trade association for the unmanned systems industry, brought together a formidable ensemble of women to discuss their roles in the autonomous vehicle (AV) industry. Autonomous technology promises to reshape our society and its infrastructure, and with its growing ubiquity come ample opportunities for employment in the AV industry.

Though women make up about half the population, they account for only 8% of top executives and 16% of all workers in the automotive industry, and just 20% of college engineering students identify as women. These lopsided statistics point to a yawning gap in gender equity, especially in STEM fields. Disruptive Women Powering Our Autonomous Future addressed female representation in the AV industry, how gender inequality manifests in tech and how to support young women who aspire to work in AV technology.

Making a place for women in STEM

Photo by ThisIsEngineering on Pexels.com

TMS spoke with two women involved in the summit about what diversity means to them and how it shapes the tech industry. Sally Frykman, Velodyne Lidar’s chief marketing officer, began her career as a social worker and educator supporting people with disabilities before working her way into administration.

Velodyne organizes summits on autonomous vehicles and safety to demystify the industry by addressing safety issues and public concerns. The company aims for transparency and ultimately argues that autonomous vehicles are integral to reducing the harm caused by the roughly 94% of collisions that result from human error. These summits inspired Disruptive Women Powering Our Autonomous Future.

“When we were planning the world safety summit, it just so happened that one of our panels was all women, and we thought, gosh, this is amazing … This should not be unique,” says Frykman.  “And born from that reflection is Disruptive Women Powering Our Autonomous Future … There is not a dearth of female leadership, there is a dearth of female representation, and we want to bring these female leaders forward.”

These leaders don’t pop up in the workforce; they emerge in classrooms. “It’s critical to open STEM opportunities for young women and young girls really, really early,” explains Frykman. “I also think that from the industry perspective, and this is what we’re trying to do with our event, is show the women leaders there so that there are role models for anyone to look up to. And I’m talking anyone from the age of five to if you want to enter the career later in life.

“I think what happens, too, is that companies get hyper focused on innovation, and wanting to put out the very best technology. And something that fortunately now is becoming more prominent, but has been an afterthought, is the importance of diversity. As long as we’re committed to innovation, we need to be committed to being innovative in diversity as well.”

Diversity and representation in tech

Photo by fauxels on Pexels.com

How does a lack of diversity affect technology, and what does it mean for you when you are not represented among tech developers, or in the algorithms and data sets they build? Genevieve Smith, associate director of the Berkeley Haas Center for Equity, Gender & Leadership (EGAL), moderated the summit’s Eliminating Hidden Bias in Autonomy and Beyond panel. Her work at Berkeley focuses on advancing inclusive artificial intelligence (AI) by mitigating bias.

“I think it helps, when thinking about bias and AI, to recognize that AI systems are biased because they are created by humans,” says Smith. “There are ultimately decisions around what data is used for an AI system and what the AI system should be thinking about, different kinds of variables that it should consider. Those are decisions made by humans.

“And so who is making the decisions, forming AI systems, who is on the team developing AI systems really matters because it’s their perspective, values, etc. that are integrated into these different technologies. There’s a huge gender gap in who’s represented in computing and within AI. A crazy statistic that I like to think about is there are fewer women in computing today than there were in 1960.”

The gendered associations of industries, activities and even colors shift throughout history, which speaks to how arbitrarily gender is assigned. Just as pink was once considered a masculine color, data collection was once a feminized occupation. UNESCO examined how tech developers gender AI technology, specifically digital voice assistants like Apple’s Siri and Amazon’s Alexa, in its report “I’d Blush If I Could: Closing Gender Divides in Digital Skills Through Education.” By making women the default virtual assistants, the feminine names and synthetic voices reinforce gender bias, perpetuate the idea of women as subservient and link women to the errors that result from the flaws of hardware and software designed predominantly by men.

Gender bias also infiltrates AI in less conspicuous ways. “Bias comes into AI systems on a more granular level based on the data the AI systems are fed,” explains Smith. “There are really large gender data gaps, and one way to think about this is there’s a really big gender digital divide. 300 million fewer women than men access the internet on a mobile phone. And a lot of these technologies, like [the] internet, smartphones, etc. generate data about their users, but if fewer women have access, [the data] will be inherently skewed toward males.”

These imbalanced data sets then affect algorithms, which “are told what data sets to learn from and what variables to consider when making decisions,” says Smith.

Smith continues: “I think a really [good] example to illustrate how bias can come into play within algorithms is this online tech hiring platform called Gild, which enabled employers to use ‘social data’ and other resources like resumes to rank candidates. And basically social data is a proxy that refers to how integral a programmer is in the digital community.

“And Gild drew from the time an individual spent sharing and developing code on platforms like GitHub, but because women have more expectations on their time … they have less time to chat online. And also a lot of research has shown that women are made to assume male identities on platforms like GitHub because of online gender-based violence or gender-specific safety concerns around harassment, trolling, etc. And so, by prioritizing [social data in the hiring process], [the platform] is actually biased against women.”
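The dynamic Smith describes, a model shaped mostly by whoever dominates its training data, can be sketched in a few lines of code. The example below is illustrative only and is not drawn from the summit or from any system Smith mentions; the group sizes, features and outcome rule are all invented for demonstration. It trains a simple classifier on synthetic data in which one group supplies 90% of the examples and follows a different relationship between inputs and outcomes than the other.

```python
# Illustrative sketch only: invented groups, features and outcome rule.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, weight):
    # Features are random; the outcome depends on a group-specific weight,
    # standing in for the idea that different groups can have different
    # relationships between inputs and outcomes.
    x = rng.normal(size=(n, 2))
    y = (x @ np.array([1.0, weight]) + rng.normal(scale=0.5, size=n)) > 0
    return x, y.astype(int)

# Training data: the majority group vastly outnumbers the minority group.
x_maj, y_maj = make_group(9000, weight=1.0)
x_min, y_min = make_group(1000, weight=-1.0)

model = LogisticRegression().fit(
    np.vstack([x_maj, x_min]),
    np.concatenate([y_maj, y_min]),
)

# Evaluate on fresh samples from each group: the single learned rule
# tracks the majority group and misfits the underrepresented one.
x_maj_test, y_maj_test = make_group(2000, weight=1.0)
x_min_test, y_min_test = make_group(2000, weight=-1.0)
print("accuracy on majority-like data:", model.score(x_maj_test, y_maj_test))
print("accuracy on minority-like data:", model.score(x_min_test, y_min_test))
```

Under these toy assumptions the model scores noticeably better on the well-represented group, which is the same skew that real-world data gaps produce at far greater scale and consequence.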

Bias in data, algorithms and AI as a whole is not limited to gender. Race, disability and sexual orientation all expose further gaps in the tech industry’s diversity, or lack thereof. If the team creating the tech does not represent the full consumer population, it not only excludes the experiences of marginalized identities but can also pose a danger to those communities. “Diversity is good for everyone,” says Frykman, “and it’s critical for us in this particular industry to be able to not just recognize that, but see that there is space for it already.”

Have a story to share? Get in touch at contributors@themilsource.com