Utopias, Dystopias and Today's Technology

Techno Womanism

January 25, 2023 Johannes Castner Season 1
Chapters
0:00
theme song (written, performed and mixed by Neal Rosenfeld, sung by Jennifer Youngs)
0:35
welcoming message
0:47
introducing Shamika Klassen
2:04
introduction to Techno Womanism
2:30
what is Womanism
3:48
the three waves of Womanism
4:53
how Shamika came to create Techno Womanism
8:02
Liberation Theology
10:06
theology and ethics
11:12
the tenets of Techno Womanism
15:00
biases and the need for intersectionality
17:19
Foucault's critique of human categories as domination
19:12
is removing categories cultural erasure?
19:49
can AGI (Artificial General Intelligence) force us to rethink our identities as humans?
20:48
where does identity come from?
21:51
human identity after the white supremacist system falls
31:57
technological perpetuation of the white supremacist system
32:59
biases and blind justice, the case of recidivism
41:47
technologists must understand whom they are building things for
43:58
what does scaling mean for diversity; is Techno Womanism a global concept?
45:03
the need to teach humanities alongside computer science
45:46
the Black Mirror Writer's Room exercise
46:33
Speculative Design
49:24
the ethical dimension of Speculative Design
52:18
Augmented Intelligence
55:19
who is and who will be in the driver's seat, AI, corporate leaders, or most of us?
58:33
user experience research to bring marginalized peoples into the process of building the future
59:44
how to keep up with Shamika's work
1:00:30
call to keep the conversation going
1:00:50
what's next on the show
More Info

This is a conversation between Shamika Klassen and host Johannes Castner about Shamika's original concept, Techno Womanism: what influenced her to create it, and many other related ideas, such as identity.
Shamika's paper on Techno Womanism can be downloaded here:

https://academiccommons.columbia.edu/...

Here is her paper, with Dr. Xeturah Woodley: https://www.learntechlib.org/p/207712/

In our discussion we referred to a number of people and books:

https://en.wikipedia.org/wiki/In_Sear...

https://www.amazon.co.uk/Search-Our-M...

https://en.wikipedia.org/wiki/Emilie_...

https://www.amazon.com/Womanist-Cultu...

https://www.amazon.com/Breaking-Fine-...

https://www.amazon.com/Aint-Womanist-...

Transcript

Johannes Castner:

Hello, I'm Johannes. Welcome to the show. I'm here today with Shamika Klassen, and let me introduce her, for which I always read a little script. She is the founder of the Tech Chaplaincy Institute.

Shamika Klassen:

Mm-hmm.

Johannes Castner:

She founded it in 2020 and recently sold it to Learning Forte in the fall of 2022. She's currently a doctoral candidate in Information Science at the University of Colorado Boulder, advised by Dr. Casey Fiesler. She's the creator of the concept of Techno Womanism, which is what we will be talking about today. She developed it during her time at Union Theological Seminary in the City of New York, where she received a Master of Divinity, across the street from where I got my master's and also my undergraduate degrees. She is passionate about making technology better for everyone and using her skills and talents for social good. The voices and experiences of BIPOC people are integral to every phase of technology design and development, and she hopes that her work in the world facilitates that as often as possible. So we are quite kindred spirits in this, and I'm very excited to be speaking with you today, Shamika. Let's get right into it then: what is Techno Womanism?

Shamika Klassen:

That's a great question, and I really appreciate you having me on the podcast; thank you so much for the invitation. Techno Womanism is applying the womanist ethic to social justice issues that happen in and around the digital space and technology. Which, of course, may beg the question for some in your audience: what is womanism? Exactly. There was an author, Alice Walker, who's still with us, and in 1983 she wrote a book, In Search of Our Mothers' Gardens: Womanist Prose. In that book, she defines womanism in four parts. The first part is that a womanist is a black feminist or feminist of color. In the second part, she talks about what a womanist does: a womanist is not a separatist, except for health, and loves women and loves men. In the third part of the definition, she goes into what a womanist loves, such as roundness, the moon, the struggle, the folk, herself, regardless; all of these things she loves regardless. And the fourth part of the definition is the most famous: womanist is to feminist as purple is to lavender. That quote really encapsulates what womanism is. After this definition was provided, there were theologians, black women theologians, who took this concept of womanism and anchored it in their religious experiences and their perspectives on divinity. So there are women like Emilie Townes. There are also waves of womanism, similar to the waves or cadences of feminism. The first wave of womanism was about establishing womanist theology in the experiences of black women and making the case that black women's religious experiences were a valid starting point for theological reflection. The second wave was more about expanding who was a part of the conversation and what womanism was about: establishing the womanist ethic and opening up the conversation beyond just religious experiences. And the third wave, I think, is captured well in the book Ain't I a Womanist, Too?, where the conversation expanded even further, to people who don't identify as black women, and outside the context of theology and ethics into the classroom, into healthcare, into all these different spaces. By the time I came to learn about womanism, I was at Union Theological Seminary, as you mentioned, in the city of New York, and I was learning from Dr. James Cone, may he rest in power. He was teaching a class at the time, an introduction to liberation theology, and one thing I'll never forget him saying was that each of us has our own theology. So I was on a quest to find my own liberation theology, and I was really drawn to womanism. There are people like Delores Williams and others who had gone through Union Theological Seminary at the beginning of the first wave of womanist theology. And so I wanted to find a way to engage womanism with my own passions, which were technology and social justice. When I was an undergrad at Stanford University, I was going to study either engineering or math or computer science. Unfortunately, I was weeded out of those classes in my first and second year. So I ended up studying African and African American Studies, and that's what I got my bachelor's degree in.
And it was such a blessing in disguise, because I've been able to use that initial education to really undergird the way I approach social justice issues and the way I understand myself as a black woman, and then to combine that with what I did at Union around liberation theologies, and to combine those with the social justice issues found in technology in particular. That's how I came to create this concept of Techno Womanism. For my Master of Divinity thesis, I wrote about the social justice issues that were occurring at the time. This was when the Black Lives Matter movement was really focused on things like Eric Garner, and different hashtags were coming up on social media, such as #IfTheyGunnedMeDown. There were people of color on places like Twitter and Facebook posting the pictures of themselves that the news media would use if they were gunned down, versus the picture they would have wanted to be used. This is something that happened in Missouri with Mike Brown. All these different social justice issues that were occurring in and around the digital space were things that I brought up in my thesis. So I wanted to think of a way to put womanism in conversation with these social justice issues, specifically focusing on the things that were happening on social media and with different technologies. This was 2014, 2015, when I was writing the thesis, around the time that Google Photos was first starting to tag photos, and there was a black engineer at Google whose pictures of himself and his friend were being tagged as gorillas.

Johannes Castner:

Just for clarification purposes: you mentioned a few things, for example liberation theology. That's a big term that not everyone is necessarily familiar with. Could you elaborate on that a little bit?

Shamika Klassen:

Absolutely. Liberation theologies are theologies that are rooted in the religious experiences and perspectives of people of color, of marginalized people. For example, there's black liberation theology, which recognizes that black people and their experiences of religion and faith, liturgies, et cetera, are central starting points, valid in and of themselves, and that they can also inspire theological thought and reflection. There are different aspects of theology, for example Christology: who is Christ? And so, for example, in

Johannes Castner:

Mm-hmm.

Shamika Klassen:

black liberation theology, Christ is a man of color. He's not the white Jesus that is

Johannes Castner:

Mm-hmm.

Shamika Klassen:

seen in traditional church iconography; it's recognizing that Jesus was a person who lived in the Middle East so many centuries ago and was not a white man

Johannes Castner:

of course. Yeah.

Shamika Klassen:

with long flowing blonde hair. So that kind of thing; that's black liberation theology. But there's also Dalit theology, from the Indian perspective. There's Mujerista theology, rooted in Hispanic women's experiences; it's about the quotidian, the everyday, and that being a valid starting point for religious experience and faith traditions. So that's just a little bit about liberation theology, if that helps.

Johannes Castner:

Mm-hmm. Yeah, absolutely. Just to put your work into context: you're in some sense a theologian, is that correct?

Shamika Klassen:

Well, I've never thought of myself as one, but I imagine you could make the case.

Johannes Castner:

Okay. Because your master's degree is in theology, then.

Shamika Klassen:

It's in social ethics and

Johannes Castner:

Ah, I see.

Shamika Klassen:

Yes, I was exposed to a lot of the same kinds of training that a pastor or an imam or a rabbi would be exposed to on the way to becoming those things. But instead of going into a faith tradition or a faith community, I went out into the world with a desire to really address the social justice issues that are happening in technology, from that perspective,

Johannes Castner:

Mm-hmm.

Shamika Klassen:

as opposed to the perspective of a practitioner in a faith community.

Johannes Castner:

And you are also a technologist, so you're combining these; it's an ethics, basically, that you're applying to technology in particular, coming from a sort of theological background. So then I would like to ask you: usually when you have an ethical system, you have some sort of principles, or axioms if you will. What are the tenets of Techno Womanism?

Shamika Klassen:

Yeah. So I recently did my dissertation proposal, and I was able to write more specifically about Techno Womanism since I had first established it in my Master of Divinity thesis. One of the things I came out of that process with was a clearer definition of what Techno Womanism is and the way I want it to be understood in terms of tenets. I had done a paper with Dr. Xeturah Woodley, who's also on my dissertation committee, back in 2019. When we wrote that paper, it was the next step from my thesis, where I had just been introducing this concept, into fleshing it out and really thinking about how it exists in the context of not only liberation theology but also the next step outside of that discipline. So what I'm doing with Techno Womanism and the tenets is going from the definition that Dr. Woodley and I created and breaking that down. Techno Womanism is an opportunity to look at the digital space and technology; that focus on that particular industry, discipline, life experience, et cetera, is one of the tenets. But it also does so from an interdisciplinary and intersectional perspective. Interdisciplinary meaning I'm not just using theology, I'm not just using ethics, I'm not even just using science and technology in society, or the philosophy of science, or information science; I'm using all kinds of disciplines, informed by critical race theory, black feminism, womanism, digital black feminism, all of these different concepts, theories, and disciplines that go into how we address those social justice issues in the digital space and technology. And then there's also the intersectional piece. A lot of folks will tie that to Kimberlé Crenshaw, who coined the term, but intersectionality is something that has existed for decades. There are women like Sojourner Truth, in her often misquoted speech "Ain't I a Woman", where she talks about how her womanhood is not recognized because she's a black woman. She talks about how she did all of these things as a slave, had all of these children, was able to work as hard as anyone else, but no one holds her hand as she's stepping off of a carriage, if she's ever even given the opportunity to ride in one. People didn't treat her the way they treated white women at the time.

Johannes Castner:

Yeah. Right.

Shamika Klassen:

Sojourner Truth was around in the 1800s; then you had the Combahee River Collective in the seventies. They talked about having identities which were a part of their own political action, recognizing each of those kinds of identities across race, sexuality, and class. And then, when Kimberlé Crenshaw wrote about intersectionality, she was inspired to do so by a court case against General Motors. There were five black women who sued because they were not offered positions, and the judge ruled against them because the General Motors factory hired black men and the offices hired white women.

Johannes Castner:

Mm-hmm.

Shamika Klassen:

So the reasoning went: it's not racist, because they hire black men, and it's not sexist, because they hire white women. But here you were with a group of black women who were not going to be hired in the factory because they were not men, and who were not going to be hired in the office because they were not white women. Kimberlé Crenshaw, who comes from the legal field, saw this particular court case and wanted to name this liminal space that the legal system put these black women in. She also expanded the conversation to other identities that we all carry, and put those in conversation with Patricia Hill Collins's Matrix of Domination, which is the interlocking systems of oppression at play in these kinds of scenarios.

Johannes Castner:

Then it's really important not to perpetuate these things as more and more decisions are made by AI algorithms, for example. So I see how this connects very well to technology. Is that the direction you are working in: to say that we have to be careful that all of these systemic issues, as you just described them, are not perpetuated by algorithms made by particular types of people thinking in particular types of ways? That is absolutely critical, and it's also part of why I invited you, because this is something I wanted to talk about as well. But in the very long run, my worry is a little different. I can see all of these things as huge issues, and it's very important to understand this intersectionality issue, but at the same time I think we should maybe not perpetuate it, because the categories themselves were made up by particular types of people at a particular time. In a way, the deepest form of colonialism is the categories themselves: black, and even woman. There is a critique of this by Michel Foucault, the French intellectual, who maybe went too far in it, I would say, but I still find it very interesting to think that these categories of being black and being white were made up by a particular type of people in a particular time period, and that they were meant to dominate somebody. So in the long run, would we not be better off to dispense with all categorization of humans, and to say that we have finally arrived at a place where we are all just human? But I can see that the right wing often uses this type of thinking to say: let's just get rid of these categories so that we can keep discriminating based on them. And without those categories, you would not know that all the women who were hired were white women and all the black people who were hired were men, right? You have to have the categories to actually see the bias. So it's kind of a trap. But the categories themselves: I feel like at some point in the future, the dream should be to not have them. What do you think of that?

Shamika Klassen:

I feel like that is a tricky conversation, in the sense that the idea of removing all categories and having everyone be simply human, instead of whatever various categories they identify with, carries so much erasure in it. My identity as an African American

Johannes Castner:

right

Shamika Klassen:

woman, female-identified, woman-aligned: all of these different pieces are the way that we organize our identities. So

Johannes Castner:

mm-hmm.

Shamika Klassen:

here's an example that you might appreciate. Once we have, for example, artificial general intelligence, sentient AI, AI that's able to think for itself, and if these are embodied in humanoid robots, that will force us to recategorize how we identify ourselves as humans, as a collective, because now there's a whole other entity on the scene that is identifying as being on the level of, or even beyond, where humans are. It would be similar if aliens came and we were able to officially say: okay, these UFOs that the government has confirmed have now been identified, and there's another sentient being out there in the universe. We would have to figure out how we identify ourselves in the midst of that other entity. So it forces

Johannes Castner:

So you...

Shamika Klassen:

us to think about it from that perspective.

Johannes Castner:

Okay. So basically identity is always kind of constructed by otherness: by opposition to, or not necessarily opposition, but by contrast with some other, some otherness.

Shamika Klassen:

Well, or you could see it as: who is within my community, right? I have various communities that I belong to. So I don't necessarily identify my blackness against whiteness. I identify my blackness within my history, my culture, my context, the black oral traditions that I'm able to draw from.

Johannes Castner:

Mm-hmm.

Shamika Klassen:

And I think that's a really important distinction to make. If whiteness is centered and dominant, and then everyone else is sorted by "this is how you're different from whiteness" or "this is how you are aligned with whiteness", and that's the way we categorize things, that's part of the white supremacist system.

Johannes Castner:

Yeah.

Shamika Klassen:

It centers that whiteness and that dominant culture, and then has everyone else be the spokes around it.

Johannes Castner:

Right, right. But what if we could break this dominant white supremacist system, which I hope we can? Let's say we break it at some point; if we can all get together and break that system, would it not be beautiful to have a community that is really global in nature, just human to human?

Shamika Klassen:

I think that's possible without necessarily erasing the context, history, culture, and,

Johannes Castner:

okay. Okay sure yeah,

Shamika Klassen:

all of the different offerings of the communities that exist. Yeah.

Johannes Castner:

And so then, yeah,

Shamika Klassen:

so there doesn't need to be animosity across diversity.

Johannes Castner:

Yeah, okay, that's true. I hope that is true, because I feel like maybe that's always the danger there, at some point.

Shamika Klassen:

Mm-hmm.

Johannes Castner:

But the other question I would have around that is: ultimately, at some point, there was no whiteness or blackness, right? People just lived wherever they lived, and they saw whoever they saw, and they didn't actually have that concept. That concept came about, I think, through the process of colonization. Because suddenly you had some other people, who looked rather different from yourself, invading your villages somewhere, killing everyone and raping everyone. Suddenly they were there, and they looked very different, and they were not particularly friendly, and they started to dominate everyone. And then they started calling you "black", for example. And what is that? I think it started there. It didn't exist before.

Shamika Klassen:

Well, even before

Johannes Castner:

historically,

Shamika Klassen:

Right, so even before those racial categories existed, there were still empires, like the Roman Empire or the British Empire, that existed before racial categories, and there were power dynamics in place. There were nationalities in place, there were religious communities in place, and those were used to otherize and to create hierarchies.

Johannes Castner:

Right, right. And that's where this thing of blackness and whiteness came about, through this process. So is it not true that, if this process actually defined who we are, we should maybe rethink this definition of ourselves, because it was given to us by a force that we should maybe not be aligned with? In other words, historically,

Shamika Klassen:

I think it wasn't just the categories, it was the hierarchy of the categories.

Johannes Castner:

Yeah.

Shamika Klassen:

It was the assignment of better, worse,

Johannes Castner:

no. Right, right.

Shamika Klassen:

Higher.

Johannes Castner:

But the categories themselves as well, right? This is the Foucault argument: that these categories themselves, like being gay, for example, were a product of the Enlightenment period. Before that, there were some men having sexual and romantic relationships with other men, and there were some women who had romantic relationships with other women, but they weren't calling themselves gay. This idea, this category, came about in the Enlightenment period; that's the Foucauldian argument. So I'm just saying, maybe there is something about tearing it all down. Tearing down not as in erasure, but something more conscious than that, a conscious choice. Because erasure is what happens when Disney comes into town and erases your existence from their movies, basically; that, I think, is erasure. But if you were to make a conscious choice, if you said: okay, where do these categories come from to begin with? Are we in agreement with that? Should we agree with that? Or should we just tear that whole sucker down? That is basically what I'm saying.

Shamika Klassen:

Mm-hmm. Well, I also feel like, because the categories have been embedded for centuries now, there is a human understanding of who we are and of how those categories have affected us as a society. So if we were to change those categories or remove them, we would still be contending with the human perspectives on those categories. You would have to erase everyone's experiences and understandings of themselves and others

Johannes Castner:

not erase, I think

Shamika Klassen:

and start from scratch from there.

Johannes Castner:

No, I wouldn't say erase, but reinterpret. The idea would be that we can still understand the pains that, for example, James Baldwin was going through as a young man in America, for reasons that had to do with certain definitions and categorizations. We can still understand everything that he's writing, but we can maybe reinterpret it a bit, and in some ways make our own identities out of these. Instead of owning identities that were given to us by the Enlightenment period, which is hundreds of years ago, I agree, we could reinterpret them and make those old definitions part of our new definitions, if you will. But in a very conscious, creative approach, rather than an approach where we take these categories as given, you see?

Shamika Klassen:

mm-hmm.

Johannes Castner:

You see what I mean? Because where I came from, and I want to tell you this too, I was just weird. Okay? That's a weird category. I didn't want to own that category, for sure, but there was a lot of hatefulness toward me, actually; you wouldn't believe it. I was living in Austria, and it wasn't all hateful. Some of it was even pity, pity toward me, and some people had hatred. So it was a mixture of various things like that, and

Shamika Klassen:

uh mm-hmm.

Johannes Castner:

it had nothing to do with anything you could really pinpoint. I was thinking rather differently from most other people, I was wearing my hair differently, I was living differently than they would expect.

Shamika Klassen:

Mm-hmm.

Johannes Castner:

I had

Shamika Klassen:

mm-hmm.

Johannes Castner:

The category is actually "not normal". They have a category called normal and another one called not normal, and it's so ingrained that when I explained to some people back in Austria that this kind of categorization, normal versus not normal, doesn't exist in America, they wouldn't believe me. They said: surely there must just be a different normal. It's very interesting. But obviously, someone who's "not normal" can own that as well and say: okay, I'm not normal, I agree, that's who I am now, and that's going to be part of my brand. In a way you can do that. But in Austria, that's actually very, very hard.

Shamika Klassen:

Mm-hmm.

Johannes Castner:

It's a fight all the time, you know? And in some cases physical; I mean, you get physically attacked by people. Yeah, and no guns involved, attacked for being weird, not normal, basically. So what do you do with that kind of identity?

Shamika Klassen:

Right. Well, I would say marginalized people have been shifted into the category of not just not normal, but subhuman. And because of that differentiation, "I'm human, and this person is less than me, therefore less than human", that's where a lot of the violence comes from for LGBTQ folks, for black and brown people. Since Africans were brought to the United States and enslaved, they were considered subhuman; they were counted as three-fifths of a person under the Three-Fifths Compromise. And that was done so that the enslaved people held in the South wouldn't each count as a full person, because that would give so much power to the South politically as they were figuring out how many people to count in these states in order to give them electoral college votes, whatever the case was.

Johannes Castner:

Yeah.

Shamika Klassen:

But because black people in America have been advocating for their humanity for so many centuries, through abolition and then through the Civil Rights Movement and even today, we are still not seen as human by so many people; we're still considered subhuman by a number of people, far too many people. And because of that discrepancy, within our own communities we've had to tell ourselves that not only are we human, we are valid, we are worthy. That's where, in the seventies, after the Civil Rights Movement, we had the Black is Beautiful movement, we had "I'm Black and I'm Proud", all these kinds of things to refill our own understanding of ourselves against a society that doesn't see us that way. And it's found in many marginalized communities around the world, like with Dalits in India, within that kind of categorization as well. But I don't think it's the categories themselves; I think it's the human perspectives on the hierarchy within those categories. So even if we kept the categories but didn't have the white supremacy or the power dynamic, depending on the culture and communities, it's the hierarchy where the difficulties lie.

Johannes Castner:

And so, to bring that back to technology: a lot of white supremacists, I think, are not able to build technology, and they're not really involved much in that scene. Is that fair to say?

Shamika Klassen:

Oh, yes, yes, yes, yes. Uh, white supremacy is not a person, it's a system!

Johannes Castner:

Yeah. Is that system perpetuating itself in the technology space?

Shamika Klassen:

Absolutely. Think about the way machine learning works: you feed a dataset into the system, and then you get an output. The datasets that are collected for different purposes are often skewed toward white men, and those feed into systems that make decisions about people who are not white men. And because those people are not well represented in the data, they get poorer outputs and outcomes from these machine learning algorithms.
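To make the dynamic Shamika describes concrete, here is a minimal sketch in Python. Everything in it is synthetic and hypothetical (the two groups, the feature distributions, and the 95/5 training split are illustrative assumptions, not any real dataset or system): a model fit mostly on one group's data performs measurably worse on the group that is under-represented.

```python
# A sketch of training-data skew: synthetic data, hypothetical groups.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, shift):
    # Each group has the same two classes, but its features are offset by
    # `shift`, so a boundary fit on one group sits badly for the other.
    y = rng.integers(0, 2, n)
    X = rng.normal(loc=(shift + 2.0 * y)[:, None], scale=1.0, size=(n, 2))
    return X, y

# Skewed training set: 95% from group A, 5% from group B.
Xa, ya = make_group(950, shift=0.0)
Xb, yb = make_group(50, shift=1.5)
model = LogisticRegression().fit(np.vstack([Xa, Xb]), np.concatenate([ya, yb]))

# Balanced held-out evaluation: the under-represented group fares worse.
for name, shift in [("group A", 0.0), ("group B", 1.5)]:
    Xt, yt = make_group(4000, shift)
    print(f"{name} accuracy: {(model.predict(Xt) == yt).mean():.2f}")
```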

Johannes Castner:

Yeah. I'm familiar with that. Biases. Yeah. Absolutely.

Shamika Klassen:

Yeah.

Johannes Castner:

That's the bias, the basic bias argument, which is a very important ethical principle in many cases. It runs into a problem, though, in some cases. This other principle, that justice is blind, can sometimes come up against a bias concern, because if you're blind, you can't know whether you're biased or not, right? This came up in the case of a recidivism algorithm that is supposed to predict whether someone who committed a crime, say six years ago, was imprisoned for it, and is now coming up for review will commit another crime. The algorithm was suggested as a way to predict whether the person will re-offend, and it was later found to be very biased. But the problem is that it can't really check its own biases, because the principle used was that justice should be blind. So there are correlated variables going on that it uses, things that are correlated with certain categories,

Shamika Klassen:

Mm-hmm.

Johannes Castner:

that then perhaps make the prediction biased. And this is not clear-cut, because you then have to compare the bias with respect to the actual rates. If you had, for example, a measured African American recidivism rate, the predictions of this thing would give an average African American male a greater risk factor than was warranted, on average, within that particular category. But the algorithm obviously is not allowed to know whether a person is black or white, because if it were allowed to know that, it could create an even worse bias, or make decisions in ways that go against that other principle, that justice should be blind. So you have issues like that, and they are very difficult to resolve in practice.

Shamika Klassen:

Cathy O'Neil, who wrote Weapons of Math Destruction,

Johannes Castner:

yes, yes.

Shamika Klassen:

Yes. She speaks about how databases, machine learning, and predictive algorithms can only predict the past, because that's what's

Johannes Castner:

Clearly.

Shamika Klassen:

historical

Johannes Castner:

absolutely.

Shamika Klassen:

And so when you have proxies for race, such as zip codes, or questions like "do you know anyone who has committed crimes?" or "were you in the foster system?", some of these are not explicitly about race, but they are effectively about race,

Johannes Castner:

Right.

Shamika Klassen:

that's how these things get baked into the systems of these algorithms.
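A minimal sketch of the proxy effect just described, again with entirely synthetic, hypothetical variables: the model is never shown the protected attribute, yet a correlated feature like a zip-code flag lets group-skewed history flow straight through into the scores.

```python
# Proxy leakage sketch: the model never sees `group`, but scores still skew.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 20_000

group = rng.integers(0, 2, n)  # protected attribute, withheld from the model
# Residential segregation makes zip code a strong proxy for group membership.
zip_flag = ((group + rng.normal(0, 0.4, n)) > 0.5).astype(float)
# Historically skewed enforcement: more recorded prior contact in flagged zips.
priors = rng.poisson(1.0 + 1.5 * zip_flag)
# Labels come from the same historical records, so they inherit the skew.
label = (rng.random(n) < 0.1 + 0.08 * np.minimum(priors, 5)).astype(int)

X = np.column_stack([zip_flag, priors])  # no race column anywhere
scores = LogisticRegression().fit(X, label).predict_proba(X)[:, 1]
print(f"mean risk score, group 0: {scores[group == 0].mean():.2f}")
print(f"mean risk score, group 1: {scores[group == 1].mean():.2f}")
```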

Johannes Castner:

So you have to be very careful which variables you take, because you want them to be predictive of actual recidivism; you want to be pretty good in that regard. But it also turns out that often, when a murder is committed against, for example, an African American, the murder was committed by an African American, so the victims are also African American, right? And you could say that if you want to be just from the perspective of the different types of victims, and just with respect to the perpetrators at the same time, this can represent a conflict. So you have to, in a way, be really, truly blind, but at the same time you want to take into consideration enough variables that are very predictive of the true risk, right?

Shamika Klassen:

Mm-hmm. Yeah. I feel like "black-on-black crime" is sort of like, what would you call it, a dog whistle against the black community. This sort of clarion call that says: well, black-on-black crime, black people kill each other more than anyone else,

Johannes Castner:

I'm definitely not saying that, in any kind of category; it doesn't really matter which category you're talking about. Because America is so segregated, right? If you think about it, America is super segregated. You are more likely to kiss someone from your own category than someone from a different one. Similarly, you're much more likely to kill someone from your own category. So if you are predicted not to be a risk to society, say, and you're let go, and you do actually go on to kill someone, it's very likely going to be someone who looks like you and is like you along other dimensions as well. That's to say, you now have a bias in whom you put at risk. You see? And I'm not saying this at all in a dog-whistle way. I don't believe for one minute

Shamika Klassen:

Okay.

Johannes Castner:

that particular phenotypes have a higher tendency to kill people of their same phenotype. I don't believe that. What I do think is that people do end up killing people who have the same phenotype, and that's true of all categories. So if you want to prevent someone of a particular phenotype from being killed, you've got to make sure that you're not underestimating the risk of a person coming out and killing again. At the same time, you don't want to overestimate it, because then you're keeping someone in prison unjustly. So this is a very difficult problem when you look at the various things you want to optimize simultaneously, because you really want an accurate risk profile for an individual, regardless of whatever their category is.

Shamika Klassen:

Mm-hmm.

Johannes Castner:

And at the same time, you also don't want to unjustly overestimate someone's risk because of some category. You have to do both at once, and what I'm saying is that this is not easy to do.
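A small numeric sketch of the tension Johannes is pointing at, with made-up base rates: even a score that is perfectly calibrated (the score equals the true individual risk), thresholded identically for everyone, produces different false positive rates for groups whose underlying rates differ, so non-reoffenders in the higher-base-rate group get flagged more often.

```python
# Calibrated scores + one shared threshold -> unequal false positive rates.
import numpy as np

rng = np.random.default_rng(2)

def false_positive_rate(base_rate, threshold=0.5, n=200_000):
    # Individual risks vary around the group base rate; the score IS the
    # true risk, i.e. perfectly calibrated within the group.
    a, b = 10 * base_rate, 10 * (1 - base_rate)  # Beta mean == base_rate
    risk = rng.beta(a, b, n)
    reoffend = rng.random(n) < risk
    flagged = risk > threshold
    return (flagged & ~reoffend).sum() / (~reoffend).sum()

# Two hypothetical groups with different measured base rates:
print(f"FPR at base rate 0.5: {false_positive_rate(0.5):.2f}")
print(f"FPR at base rate 0.3: {false_positive_rate(0.3):.2f}")
```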

Shamika Klassen:

No it's not.

Johannes Castner:

Yeah. So, you know,

Shamika Klassen:

yeah, I agree.

Johannes Castner:

What I'm saying is that if someone in Silicon Valley or wherever, most likely someone of Indian origin, by the way, codes up some algorithm and it turns out to be biased, it's not necessarily true that this person was a racist. Because it's such a difficult problem, it's very likely that this person was not a racist, actually. That's what I'm saying.

Shamika Klassen:

Yeah, and I don't think the individual racism that exists is the thing to be most concerned about. Like I said, it's a systemic sort of force, and this is what the Matrix of Domination talks about.

Johannes Castner:

Mm-hmm.

Shamika Klassen:

There are these four different components of oppression that exist throughout society, that affect different people in different ways, that privilege some people and oppress other people based on their intersectional experience. So, all of those different pieces.

Johannes Castner:

Mm-hmm.

Shamika Klassen:

But then also, I would say, the people who are creating these recidivism algorithms, for example: how many of them are in contact with activists who have been trying to fight the prison industrial complex, who have a deep understanding of how the system works?

Johannes Castner:

They should be, I agree. They certainly should be. And it's really important that you create a network of people with different voices, which, by the way, is what I'm trying to do on this show as well, so that people who care about particular types of technologies come into contact with your idea of Techno Womanism, for example. People must come in seeking more connections between radically different areas that might be affected by their actions. I agree with that 100%. In fact, that is in some ways Amartya Sen's approach; he is a professor I had when I was briefly at Harvard, and he wrote this book right here, The Idea of Justice.

Shamika Klassen:

Mm.

Johannes Castner:

He and Martha Nussbaum together worked on a concept called the Capability Approach, and it is really related to what you were saying: you have to bring the people who will be affected by your policies, in their case, or in our case by algorithms and technologies, into contact with, as you said, spokespeople who have a lot of experience, activists in this case. That is actually a really important missing link. Technologists, people who go deep into the weeds of building algorithms, are very much occupied by that: they spend a lot of time thinking about the math of the object and about how to deploy it. I know this because I do it, and it occupies a lot of your time. So it's very difficult to then reach out to all the different stakeholders, all the different people whom you're actually affecting by building these things. This is the kind of connection I'm talking about here on the show, and I really want to stress the importance of reaching across the divide, really understanding whom we are serving when we build this technology and what effects it will have on them, and then getting feedback from them. I think this is very important: to close the loop and have feedback from the whole diverse community that you're serving. When you put out an algorithm, it's supposed to scale, right? We say that all the time. What does that mean? Scaling means it's going to affect very different people all over the world, globally. So that brings me to this question: do you think of Techno Womanism as a global concept, or mostly as an American concept?

Shamika Klassen:

Well, I think that it was created by me, as a person who lives in the United States, but I feel like anyone who resonates with any of the tenets could certainly use it anywhere in the world. The idea of having an interdisciplinary approach to addressing these social justice issues while keeping intersectionality in mind is, I feel, flexible enough that it can be applied in a number of different scenarios, like the one we're talking about: when you have a human in the loop of your AI system, that human doesn't always need to be the machine learning specialist or analyst; it could also be some of these other stakeholders. And I think about the way that computer science education is being taught. There's more emphasis now on the ethics of AI and the ethics of technology, but there's also such a need for the humanities to be taught alongside computer science. There's so much to learn in computer science, but that's just the practical piece. There's also the thinking around what you're doing, why it's important, who it involves, the context of it, its historical and cultural impact. All these different pieces that, I would say, a lot of computer scientists and people working in technology have, until very recently, thought: oh, that's someone else's job, I'm not an ethicist, I can't predict what someone's going to do with my algorithm or my technology. And that's one of the things my advisor, Dr. Casey Fiesler, works on, as someone who created the Black Mirror Writer's Room exercise, for example. Black Mirror is a sci-fi anthology that tells stories rooted in technology and how technology is being used or misused by humans. She created an activity where people can put on the hat of a Black Mirror writer and come up with a Black Mirror episode pitch for a particular technology. This helps people develop ethical speculation skills. I think that's so important for people who are in the driver's seat, or even in the seat of creating or implementing these various technologies: to be able to think about what someone might do with this, how it could go left, you know?
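A minimal sketch of the human-in-the-loop pattern Shamika mentions, with hypothetical thresholds: the model auto-decides only the confident cases and routes the uncertain ones to a human reviewer, who need not be the machine learning specialist.

```python
# Deferral rule sketch: auto-decide confident cases, route the rest to people.
def route(prob, lo=0.35, hi=0.65):
    # `lo`/`hi` are illustrative cutoffs; a real system would tune them.
    if lo < prob < hi:
        return "defer to human reviewer"
    return "auto-decision: " + ("positive" if prob >= hi else "negative")

for p in (0.10, 0.50, 0.90):
    print(f"score {p:.2f} -> {route(p)}")
```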

Johannes Castner:

I've seen that you've done something in the area of speculative design. Is that correct? I came across this idea recently; a friend of mine made me aware of the concept, and then I think I saw it on your LinkedIn page. Could you tell me how that connects to Techno Womanism, and what it is as well?

Shamika Klassen:

Absolutely. Well, I first found out about it in my first year in the doctoral program, when my advisor gave me the book Speculative Everything, by Dunne and Raby. I think that book is a really great overview of how you can design something whose use is not necessarily traditional, but which causes the viewers of the design to think more critically about what it is saying or representing. For example, I'm forgetting the name of the designer now, but I believe he had created a telephone with a chip that you could put in your body, and that would be how you communicate with people. It made people think about the connection between the human body and technology. But there are also different ways to do speculative design. I've written a design fiction, for example, where I create a story based around a technology that doesn't yet exist, a technology based in the future, and I walk people through what that world looks like with this futuristic technology: how people respond to it, what it means in society. So many folks have done either design fictions in particular, or you can look to science fiction writ large. There have been people who wrote about submarines decades before they existed, or da Vinci's designs of flying machines, years before the Wright brothers. That's like a speculative design, right? It didn't exist, it was certainly provocative at the time, and it's something that did come to fruition later. I think speculative design is something that can invite more critical engagement with whatever is being designed, with particular technologies. The designs I've seen for speculative design have ranged wildly, so it could be any number of things. I encourage folks to take a look at Speculative Everything, because they give some really great examples.

Johannes Castner:

I recently heard about this, and it's also connected to the ethic we were talking about: the ethic of reaching out to the people you affect, the various stakeholders, if you will, when you're building technology. I saw it on your profile and made the connection. There is something ethical about the speculative design concept as well, right?

Shamika Klassen:

Yes, in the way that it, again, invites people to think critically about what the design is saying. There's this Japanese concept called Chindōgu, where you design something like an everyday gadget, an ingenious invention that seems to be an ideal solution to a really specific problem, but that may cause more problems than it actually solves. One example of this that actually became a real thing is the selfie stick. It was a useless invention at the time, but in the 21st century it obviously blew up. And then we had people using selfie sticks in scenarios that were dangerous; people have been fatally injured using selfie sticks. I think at the time, the person who created this stick with a camera on the end maybe wasn't thinking about that. But it does allow us to think about the things that are being invented or introduced to the market now, the different technologies that are coming up, like the chatbots, the Lensa AI image creator, all these different ways that machine learning and AI are interacting with society. A lot of people are talking about how the chatbot, ChatGPT, is going to be very negatively impactful in education, as students are going to use it for essays in junior high, high school, and undergrad. It's going to really cause educators to think more critically, carefully, and intentionally about how they assign essays. Because if I were a student and I had access to this chatbot, I could just create a quick essay and turn it in. It's not necessarily plagiarized, because I'm not copying someone else's work; I'm creating my own work, but I'm using this chatbot. So it's raising these questions

Johannes Castner:

yes it is.

Shamika Klassen:

that I think are really important to think through

Johannes Castner:

Mm-hmm.

Shamika Klassen:

and should have been thought through before it was introduced to the market. Maybe they were, and they...

Johannes Castner:

But it is interesting; you could also think of it this way. You could say that AI could be an augmentation of our intelligence; that's what the letters could stand for, right? Augmented intelligence. I'm a proponent of this idea, because in the end, the idea that there would be robots walking around with their own minds or something like that seems a bit absurd to me. They don't really have any wants or needs; they're not human in that way. They don't have children to raise, they don't have any worries, they don't have apartments to live in; they don't need any of that, and therefore they don't really have their own interests. They're always going to be, in some ways, fulfilling the interests of someone else, of a human being. So in a way, they're augmenting that person's intelligence; they're doing something for someone else. It's a tricky thing, but I don't think you can really think of it as its own artificial intelligence. So I prefer the term augmented intelligence, because it also makes very clear what you can expect. And then what you can say is that instead of asking people to write a very particular essay, you now demand much more from them, because you know that their intelligence is augmented, and your tests are going to be very different, because you're actually expecting this augmentation. So instead of saying they're cheating on a human task, we now expect them to have these tools, and we expect them to do way more than before, and in very particular ways. These articles that AI writes, to me personally, lack experience; they lack human experience, in my mind. So I'm not a big fan, and I can pretty quickly tell it's either a really bad human writer, someone who is not very good at bringing human experience to their writing, or it's an AI. In either case, the grade deserves to be not very high, at this point. But if the AI can truly assist you in making actual arguments that are novel and interesting, then suddenly we will expect much more from you, because in the end the AI will only be an augmentation of yourself. Whatever you call it, art or writing, it really depends on the inputs you give it; a lot depends on what you put into it. So in a way it's a tool to perhaps do more, or do something more interesting, or figure out some things that are difficult for the human mind to grapple with, and then make arguments in this augmented way. And then it could be that the bar for grades is just going to be much higher.

Shamika Klassen:

Yeah, I think that's a good point. It also reminds me of a comment that Ruha Benjamin made. Ruha Benjamin is the author of Race After Technology; she is a sociologist at Princeton, I believe. She has talked about how Hollywood pitches this story that technology or AI is going to destroy us, and then Silicon Valley tells us that technology or AI is going to be our savior. But either way, technology is in the driver's seat; technology has the agency in both of those scenarios. Whereas she is advocating for people to be agents in this process.

Johannes Castner:

Yeah, people really have to be. You cannot let the juggernauts, Google and Amazon, run away with all of it, because then they will be doing the driving. I mean, there will be people in the driver's seat, for sure, no question about it. We have no commercial reason to build something that has its own driver's seat; it doesn't make any sense, at least not to me. I've heard people talk about this individual kind of intelligence that has its own wants or needs, but I just don't see how those would come about. What would they really want? Why would they want it? So in the end, it'll be a human in the driver's seat; but whether it's going to be Mark Zuckerberg and a few others, or you and me and everybody else, that is the big thing. That is the struggle, in a way; we have to actually make sure. So next week I will be speaking with Ashish Kumar Singh about the metaverse and its decentralization. And this is a theme, this decentralization of systems, that we come back to on this show over and over again. I think it is a very important part of the discussion of ethics around AI, and around technology more generally. Who's in control of your data? Who's in control of the computation? There are several things at play here. Who has the computational power, who has all the computers sitting somewhere so someone can crunch all the numbers, and who has all the data? Of course, Mark Zuckerberg wants us to continue giving him all of our data, because that has been very lucrative for him. But that's clearly not in our interest, I don't think. So there is something else involved here: when we talk about reaching out to your stakeholders, maybe one should integrate the stakeholders directly into one's organization, in a way where we co-build everything, so that we are all active players in this world. And this is exactly the topic I will be discussing with Ashish Kumar Singh next week on this show. I see it as a struggle of going from a centralized to a decentralized system, and it is emerging as a major theme here. You will see this theme arise over and over again on this show.

Shamika Klassen:

Hmm. Yeah. This is the kind of thing I would like to do after I graduate, with user experience research. Because I feel like, as a user experience researcher, I would want to bring in marginalized voices who are impacted by various digital products, whether they're websites or apps or hardware or software, whatever the case may be; bring those folks in to be able to advocate for their experiences, advocate for their needs, advocate for their pain points, and bring them into the conversation from the inception of a technology all the way through to when it's released out into the market. I think the role of the UX researcher is becoming more and more prominent, and I'm hoping that through that kind of role, especially in a position of social entrepreneurship, or technology for good, or social impact technology, even civic tech, these kinds of corners of the industry would allow for conversations with the stakeholders who are impacted the most, especially negatively impacted, by technologies. Bringing them into the process, just like you're saying.

Johannes Castner:

Well, this has been a really fascinating conversation. Maybe you want to say something else to the audience? You might want to plug a project you're working on.

Shamika Klassen:

Well, I think right now the biggest project I'm working on is my dissertation, but my dissertation is rooted in Techno Womanism. It's actually my first opportunity to use it both in theory and in practice, putting it into practice in particular. If folks want to follow my research journey or learn more about the research that I've done up to this point, they can visit my doctoral website at shamikalashawn.com. That's S-h-a-m as in Mary, i-k-a, L as in Larry, a-s-h-a-w-n as in Nancy, dot com.

Johannes Castner:

Fantastic, Shamika. That's been a really great conversation, and I hope that we continue it. Maybe some people will write some comments and we can continue the discussion, and maybe we'll have you on the show again at some point.

Shamika Klassen:

Absolutely, I'd be happy to. Thanks so much again.

Johannes Castner:

This show is published every Wednesday at 5:00 AM Eastern Standard Time. Next week I will be speaking with Ashish Kumar Singh about the metaverse. Join me and subscribe now.

Ashish Kumar Singh:

The metaverse is something like what the internet was way back when, you know, when it was just an experiment around the world. But today, if I have to put it in very simple terminology, it's the next evolution of the social revolution.
