
Confronting Bias in Learning Experience Design: How We Do It at Extension Engine

It’s an unfortunate fact of life that bias is everywhere: in the people with whom we interact, in the technology we use, and even in ourselves. Whether it's conscious or unconscious, bias can and does affect the behaviors and outcomes all around us.

At Extension Engine, we’re mindful about confronting bias in learning design not only because it’s the right thing to do but also because it promotes better learning. We need to deal with it head-on so that we can create the best possible learning experience platforms and assessments for all types of learners. As a result, addressing everything from digital accessibility to algorithmic bias is part of our everyday work.

Before we talk about how we deal with the bias we face, it’s crucial to state that this kind of work is never done. Interrogating bias must be an ongoing practice, one that is undertaken with the awareness that we’re all aiming to improve, all the time, but with the realization that it’s a process we will never complete.

With that said, here’s how our teams at Extension Engine approach bias, both conscious and unconscious, on our projects.

Recognize up front that bias will be an issue. It’s almost universally acknowledged these days that tech like social media amplifies and responds to biased content. But many people still think of online learning programs as somehow more objective than in-person learning because they’re driven by algorithms. The truth is that most of these programs still contain biases, because humans created them. So even unconscious, unintentional biases must be addressed so they don’t make their way into learning experience designs and negatively affect learners.

Of course, we identify obvious potential biases and related pitfalls when we begin a project, but as People Operations Manager Stacey Vamvas asks, “How do we account for the biases that we don't know about, and that no one knows about?” We’re constantly striving to answer this question — which brings us to our next point.

Create assessments and programs thinking not only of the client but also of the client’s clients. Extension Engine Vice President Evan Brown says, “Trying to correct for bias, if you identify it, is great — but listening to the actual users and what they're saying, instead of making guesses, is better. The problem often is that we and our clients are making guesses about the composition of an incoming group. Step one is just saying, hey, this might be an issue. Let’s think about it.” Centering the user of the learning experience platform you’re designing is key to ending up with a design that is as free of bias as possible. For example, as General Counsel Glyn Polson says, “We have to create assessments that aren’t going to exacerbate technological inequities, which we’re seeing a lot of with schooling from home and home networks.” Other sources of bias can be trickier to anticipate, such as whether a learner feels safe turning on a camera in their home, or whether they’re learning in an environment where they can actually hear recorded content.

In addition to knowing you should center your learner, it’s important to know which learner(s) to center. Partner Furqan Nazeeri says, “The temptation is to always look at the profile of the median user, but also, looking at the longer tail, it turns out biases are mostly for the outliers. It’s less likely to get a bias for the median user. We talk about accessibility, and obviously we talk about racial, economic, and gender diversity. We have to be mindful of that, but I also think it’s important for us to look at the long tail and see who’s being left out. Those long tails are where bias exists.” This issue is why we try to follow our next point closely.

With each project, focus on the principles of universal design. Senior Learning Experience Designer Lexie Bryan states, “The philosophy behind universal design is to design for the needs of the most vulnerable parts of your population. Basically, when you design for the people with the greatest need, then it ends up benefiting the entire population.” You may not be familiar with the term “universal design,” but you’re probably familiar with at least one real-world outcome of this approach: curb cuts. These areas where curbs slope down to meet street level didn’t always exist; they were originally created to give wheelchair users access to sidewalks. Because curb cuts are now everywhere, many more people can use sidewalks easily: parents with strollers, small children, and people on bikes, among others, can all move from the sidewalk down to the street and back up onto another sidewalk more smoothly.

Remember some of the less obvious issues that can arise when counteracting bias in a media environment. One issue we frequently encounter is availability, and the other is cost. Media Producer Savannah Gillespie says, “When you look for stock [images or footage], it’s complicated, because you want to make sure you’re finding footage that’s representative of the population and everybody in the target audience, but you also don’t want to use the really blatantly trying-to-be-diverse stock footage, where it starts to look fake. And there’s a cost associated with this. It takes a lot longer to find footage that is appropriate and diverse — because there’s less of it.”

In addition to considering the extra cost, it’s also necessary to look at the accessibility of what you’re creating. Media and Content Lead Joey Azoulai shares, “We’re always trying to use media to make experiences more dynamic and engaging, and media is a very visual medium. So if we imagine media doing those things for people who, as an example, might be visually impaired, and maybe those people represent 1% of users, even if we could find clever ways to create experiences that fit a lot of different types of people, how do we justify the cost and the time that would take versus just shooting a simple talking-head video? The alternative is to caption or transcribe the video.” Although the up-front cost of making a more accessible media experience may be greater, it pays off in the long run: you won’t have to retrofit accessibility after the fact, and accessibility expands your potential pool of learners.
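To make that alternative concrete, captioning a video usually means attaching a transcript-based WebVTT file to the video player a platform already uses. The snippet below is a minimal sketch in TypeScript using the standard HTML5 track element; the element ID, file path, and function name are hypothetical examples, not taken from any Extension Engine project.

```typescript
// Minimal sketch: attach a captions track to an existing HTML5 video element.
// The element ID and the .vtt file path are hypothetical placeholders.
function addCaptions(videoId: string, vttUrl: string, language = "en"): void {
  const video = document.getElementById(videoId) as HTMLVideoElement | null;
  if (!video) {
    throw new Error(`No <video> element found with id "${videoId}"`);
  }

  // <track> is the standard way to associate WebVTT captions with a video.
  const track = document.createElement("track");
  track.kind = "captions";
  track.src = vttUrl;           // a WebVTT file generated from the transcript
  track.srclang = language;
  track.label = "Captions";
  track.default = true;         // show captions unless the learner turns them off

  video.appendChild(track);
}

// Example usage, with a hypothetical lesson video and caption file:
addCaptions("lesson-video", "/media/lesson-01-captions.vtt");
```

The same transcript that feeds the caption file can also be published alongside the video, which serves learners who prefer reading, are in a noisy environment, or are on a connection too slow for streaming.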

It may seem as if these issues are coming up more frequently now, given how much everyone is online. But as Furqan comments, “It’s worth acknowledging: this is not new. When we say algorithmic bias, it sounds like it’s this fancy new thing, [but] this has been around for centuries, if not longer. ‘Algorithm’ is a fancy way of saying bureaucracy or the system.” The systems we’re working with can always be improved to open up learning to a wider variety of learners, and that’s our goal. Any improvements made to avoid bias in the digital world are bound to help in the offline world, too.