- Thank you to everyone here joining us as we talk about setting accessibility in motion at Peloton. Before we get started, we'll take a second to introduce ourselves. Be? - Hi, I'm Be Birchall. My pronouns are she/her. For a bit of visual description on my video background, well, I'll tell you that I'm wearing an axe-con t-shirt, and I've also got a Peloton jacket that says Peloton software engineering. So I'm wearing both of those allegiances, and in my background, I've got some windows and bookcases. I'm an engineering manager at Peloton. I've led some accessibility initiatives at Peloton, including in web accessibility, and a bike initiative we'll be telling you about today. Currently, we have a product accessibility group at Peloton where I'm the engineering lead, which means I get to work with Oliver. - Thanks, Be. I'm Oliver. My pronouns are he and him. I'm a young white man with a beard and earrings. I'm wearing green today for St. Patty's Day, and behind me is a virtual background up in the Berkshires, a lake house that is near and dear to me. At Peloton I'm a product manager leading accessibility and inclusion, and I am also involved in accessibility programming across all of Peloton. Glenda? - Awesome. I'm Glenda the good witch Sims. My pronouns are she/her. I will not reveal my age, because inside I'm a four-year-old, and I'm coming to you from Austin, Texas. And in my background, I have interesting Easter eggs, including a rainbow caticorn. I'm the team accessibility lead at Deque, and I have had the great honor of working with Peloton on some of these beautiful accessibility projects. - So excited to have you here. For today's talk, we have a couple of main takeaways that you'll see throughout our presentation. The first is involving people with disabilities has a large positive impact. Secondly, designing equivalent experiences requires creativity, especially when there's a ton of visual data.
Inclusive design and retrofitting can coexist, is our final point. So, before we jump into things, what is Peloton? You'll see on the left side, there's a woman taking a class on her Peloton bike. On the top right, there's a person taking a digital class outside on their phone; they're looking at running classes outside. And on the bottom, there's a person taking a Peloton class on a treadmill. Peloton is a connected fitness product, and that connection looks like a lot of different things. We have media, we have amazing world class instructors, and Glenda might speak to one of her favorites. These instructors are near and dear to people who exercise on Peloton, as is the community that connected fitness brings. You don't just get the metrics. You don't just get your performance over time. You're able to take classes with all of your friends. You're able to see your output change over time, and really connect with the fitness experience from your home in a way that wasn't available to people before. So that looks like hardware in our bikes and our treadmills, that looks like software that runs on that equipment. It's also our digital app. We have TV programming, and then there are the communities that form out of Peloton too, that are very special. So for today's talk, we're gonna focus mostly on the screen reader for our connected fitness equipment, the tread and the bike, but our commitment to accessibility goes a lot further than that. So you'll see, first off on the far right, we are so excited to welcome Logan Aldridge to our team. Logan is an incredible athlete and an amputee. He is missing his left arm from the shoulder down, and is an absolute wonder when it comes to CrossFit. He'll be teaching classes, as well as leading our strategic adaptive fitness programming content.
There's going to be a lot more to come on that, so stay tuned, but on the other side of things, our software: we have subtitles for live and on-demand classes. Our Bike+ offers an automatic resistance follow, so for people who have motor dexterity limitations with their hands, or who just like to keep their hands off as they ride, this tracks with the instructor's guidance. And so as the instructor says, "Turn your resistance up," or, "Turn your resistance down," this will do that for you. Our target metric zones have been a really powerful tool, because when you're working out, you have a high cognitive tax. And so the target metric zones really help you follow the instructor's guidance as well, letting you know if you're pedaling fast enough, if you're in the right zone, or if you need to ramp things up or take things down a notch. We've also been working on touch target sizes. As you're working out, it can be a lot harder to hit those buttons. And then across the board, improving web and mobile accessibility. On the hardware side, this has been a bit more behind the scenes, and our team has started including inclusion requirements. So when we're defining new products, we're looking at the diverse perspectives that might be impacted differently as we build hardware, and bringing those voices in early on in the process as we do the design and research and development. This has also led to a number of different accessibility biomechanics workshops, and so our team is just very excited across the board, and it's been a unifying effort. I'm gonna pass it over to Be. - Hey, thanks, Oliver. So I'm gonna tell you about our journey making the Peloton bike accessible to blind and low vision members. We initially released screen reader support for the bike in 2020, and then have continued to test and gather input to iterate and improve. On the slide, there's a picture of the Peloton bike touch screen with a rider in the foreground, leaning over the screen.
Here's a little more about the bike and Peloton. Sorry, previous slide still. Peloton is an inclusive fitness community. Our bike brings the fitness studio experience to the home. It provides an immersive experience by streaming live and on-demand classes to you from our fitness studios in New York and London. This accessibility initiative started from our members. We heard from blind bike owners about how much they love their bikes. One of our members even described it as an ideal fitness machine for blind people, but you need to operate the touch screen to navigate into a class, so these members were frustrated that they had to depend on the assistance of somebody sighted. Also, as you ride, metrics are displayed and updated on the touch screen, such as your cadence, resistance, and output; there's an image of this on the slide. Blind members didn't have access to this. We heard from advocacy groups and members that there's actually a growing problem here, that more and more fitness equipment requires touch screens or controls with only visual feedback. This is a challenge in gyms, as well as for home fitness equipment. Our blind and low vision members wanted to be able to ride their bikes independently. Next slide. So why did we prioritize this work once we heard about this need? It really comes down to Peloton's mission and values. On this slide is text from Peloton's accessibility page, which we crafted as part of this initiative, and published when we released the screen reader on the bike. Let me read to you from this page. Accessibility at Peloton, our commitment. Peloton is committed to providing the best, most immersive and accessible experience for our members. Everyone has different fitness abilities and ambitions, and we strive to provide a variety of classes and content that allows all our members to reach their personal goals.
Our core values are putting our members first, and empowering people to be the best version of themselves, and we want that to be inclusive of the abilities of all of our members. Next slide. So what was our approach? Right from the start, we set out to define our approach for the project, which was to involve members of the blind and low vision community throughout the process, and to test and iterate. This was actually partly influenced by accessibility conferences like this one, and the broader accessibility community. The advice I kept getting was: involve people with disabilities. This slide has a collage of images. There are two pictures of blind or low vision members with their bikes. Since we were already in touch with blind members who use Peloton bikes, we could talk to them about how they used the bike at the time, before the improvements, what they expected, and what was most important to them. Also on the slide, there's a logo for the American Council of the Blind. Working with advocacy groups was important to get a broader community perspective about fitness equipment. Finally, on the slide, there's a picture of me interacting with the bike touch screen. I'm wearing a sleep mask in the picture to avoid light perception getting in the way. Testing with members, as well as internally, was critical in this project. Something that surprised me a little that I learned from working on this was how important it was to block light perception for this type of testing. Even a screen curtain, which is something that makes the screen black, let me rely on a lot of visual cues about position on the large screen, based on my familiarity with the app. Next slide. Adding screen reader support to the bike had some unusual challenges. The bike tablet is an Android tablet, so we knew we'd be using Google's TalkBack screen reader, but there were some unusual aspects that meant we couldn't just straightforwardly apply screen reader best practices designed with phones in mind.
First of all, you're using this tablet while exercising, which means there's a cognitive load from your brain being focused on working out, as well as competing inputs, such as from the instructor, music, and your metrics. Another unusual aspect is the bike has a very large touch screen. This can make it unwieldy or harder to find things by swiping. Finally, it's a dedicated tablet. The bike tablet isn't a phone, for example, where the user already uses it with a screen reader, and has expectations of how the screen reader will work with that device. For instance, if a user usually used VoiceOver with an iPhone, they might not be familiar with Android's TalkBack at all. Next slide. With all the community input and challenges, we had to carefully define goals to narrow down scope for this project. We defined the scope of the project with Peloton members and the American Council of the Blind. We decided to focus on the bike initially; we realized the tread would have additional complexity due to safety requirements. Oliver is actually gonna be telling you more about that in a moment. Our product goal was to let members use key features independently. Key features were determined from community input, and these included the ability to turn on TalkBack independently, the ability to browse and select a class, the ability to take a class, and the ability to check your metrics. And our process goal was to ensure that we had good feedback loops with the community. Next slide. I'm gonna show you a demo of our initial launch, which was in 2020. After that, Glenda's gonna tell you a little bit about the usability study we did right after launch. In the demo, you're gonna see TalkBack being used to find a class and start a class. After Glenda tells you a little bit about the usability study, we're gonna tell you about some improvements that we made, so I want you to be thinking about what could be easier here. Oliver, do you wanna play the video?
- [Speaker 1] 30 minute intervals and arms ride. Cody Rigsby, seven hours ago, taken by me. Row two, double tap to activate. Start button, double tap to activate. Cody Rigsby, 30 minute intervals and arms ride. Exit button, double tap to activate. 30 minute intervals and arms ride. Cody Rigsby, cycling, Thursday, July 9th, '20 at 8:00 AM. You will need light weights. Start button. Double tap to activate. - What's up, Peloton? - [Glenda] Awesome. So in the summer of 2020, Peloton asked Deque to help them conduct a usability study for people that were blind and low vision, and what was unique about this? I've done more accessibility usability studies than I can count, and I was super excited about this, but from the very beginning, I realized it was unique. Number one, we had to recruit people with disabilities that already used Peloton, so that was greatly restricting the pool of people we could recruit from. Luckily, Peloton already had a great relationship with the American Council of the Blind, and with that relationship and one blog post, we were able to recruit all the participants that we needed. The next challenge was that a usability test we were originally gonna conduct in person had to go remote, because we were in the early months of the pandemic. And so to keep everyone safe, we did do it remote via Zoom. It's a little bit challenging for me as the facilitator to help a person on the other end position their camera so that I can see them for the interview portion, but then reposition the camera so that I can watch them interact with the bike, especially if that person might have no light perception. But we figured it out, and if you want really detailed hints, I've got some. That remote testing challenge was a lot of fun. It also gave us an opportunity to record all of the sessions. And you heard earlier, and maybe you, like me, picked this up, okay, there's cognitive load when you exercise. I didn't get how significant the cognitive load was.
It's huge, and so trying to make sure that it's a good user experience with little to no friction is incredibly important. Now, before I move on from this slide, I need to have a call out to some people at Peloton that were crucial to the success. Andrea Sutyak, and Onyx, and others did the recruiting for the people with disabilities, and helped moderate the sessions so that we had a really, really rich and positive experience. Let me go to the next slide. There were also some really surprising insights. As I said, I've been doing this for years. I've done remote usability testing, I've done in person, so I should have this all down. Well, one of the things that we determined from the very beginning is that we were definitely going to make sure that we didn't just have people that were blind, but that we also had people that were low vision. And I have been, for many years, relying on the ADA definition of legally blind for that cut point. And so when we were recruiting, we were asking for people if they had low vision, which we defined as acuity less than 20/200 in the best corrected eye. What was fascinating is that as we looked at the results, that distinction was nowhere near as meaningful as expected. We actually had to group between people that primarily used their vision and used TalkBack as a supplement, versus people that had no vision, or who chose to rely on the TalkBack screen reader. And this was so surprising to me, because one of the participants had one degree of visual field, but used their vision. Another person that was at 20/1600 vision did not rely on the screen reader, but used their vision. So really important results, and why is that so important? What I realized is, since we had gotten 10 participants and it was roughly half and half, if I didn't pay attention to the different use cases, if I just stuck to this 20/200 legal definition, I might have missed the nuances of what was happening from a friction and usability standpoint. So, next slide.
- [Oliver] Just a quick time check here too. - Oh, yeah. Accessibility usability heat map. I just wanna tell you that this is a quick, focused look at how we scored tasks. And while we did 12 tasks, there were three tasks that I want you to focus on here: taking a class and starting a class, you saw Be show you that; and checking metrics to understand the cadence, how hard you're pedaling, and your output, both the experimental and the basic. And then we scored these as we observed, where the lowest score is best, it's like golf. Zero means zero difficulty; one, minor problems; two, medium; and three, point of failure. The only purpose to this heat map, which you can review at a later time, is that if I had just looked at overall usability, I would've found that taking a class was a 0.6, which is not bad, 'cause remember, one is just minor. But if I looked closer, the people that were blind had a score of one, minor problems and frustration, whereas the low vision group had 0.33, and so this allowed us to find those subtle friction points for different user types. The same was true in that Be had provided experimental metrics for checking how fast and hard you're pedaling, and we were able to see whether this was working or not. Handing it back over to you. - [Be] Thanks, Glenda. This usability study kind of brought the project to life for me in a way, and we got to see both how much our blind members valued the work, and we also found pain points and room for improvement. So I wanna tell you about some improvements we made based on what we learned from this study. I'm gonna focus on two areas: starting a class, and reading your metrics during a class. Next slide. So the first example is about starting a class. When I showed the demo video earlier, starting a class, you might have noticed it took quite a few swipes. On the slide, there's a picture showing a screen on the bike that you encounter just before starting a class. There's information about a class, and a start button.
The green TalkBack focus indicator is around the class title and instructor. Below that, there's information about equipment needed, then below that there's a large red start button. In our initial implementation, we used a logical swipe order, which we took to mean top to bottom, left to right, and all the information perceivable to the visual user could be reached by the TalkBack user. This follows standard screen reader best practices, and would be a great approach if this screen was on a phone, for example, and not encountered as part of exercising. However, we learned from our users that it was hard to locate the start button on the large screen, and cumbersome to have to swipe through information when you just wanted to start your workout. The TalkBack user experience wasn't really equivalent to the visual user's in an important way. The visual user has attention directed to this large red start button right away, and so they can immediately select it. An accessible product offers an equivalent experience to people with different abilities, but what is the right equivalence here? Next slide. On this slide, we see the green TalkBack focus ring around the start button. In our next iteration, we autofocus the start button when the user enters the screen; that way, attention is immediately brought to the start button. You hear "Start," and can double tap anywhere on the touch screen to begin the class. In this case, figuring out what would amount to an equivalent experience took some testing and iteration. Next slide. The second improvement I'll tell you about is to reading your metrics during a class. The slide has a picture of the bike screen with a lot of visual information. We see an instructor in the class video, it's Ally Love for those of you Peloton members in the audience. There's also a lot of other visual information, including a row of metrics along the bottom of the screen, telling you about your cadence, output, and resistance.
These metrics constantly update as you ride. What would be an equivalent audio experience? Figuring out where to tap the screen to read metrics is difficult while exercising, so in our initial TalkBack implementation, metrics were read out automatically every time there was an update. This turned out to be a top user complaint in our first usability study. Metrics could change frequently, and then TalkBack was much too chatty, and distracted from the music and instructor. So our first improvement was just to turn off those chatty- - [Oliver] (inaudible) - [Be] Sorry, there's some feedback. So our first improvement was to immediately turn off those chatty readouts, but we knew we had to make improvements to help people be able to understand their metrics easily. The challenge is that attention works differently, or it can work differently, for visual and auditory inputs. If you're a visual user, it's easy to direct attention to metrics along the bottom of the screen, or to the instructor. Audio inputs don't let us selectively direct attention in the same way. Next slide. So thinking about what's important for the visual user: a visual user can easily read metrics when they're interested. They can also ignore metrics, and they also have the ability to collapse or hide the metrics altogether. So the solution we designed, and Glenda and I collaborated on this solution, drawing on the usability study, is a metrics auto-read feature. The user hears metrics automatically read out at 90 second intervals. They can also tap to hear metrics whenever they want otherwise, and they can also collapse or hide the metrics auto-read. We found that members were really delighted with this update that we made, the metrics auto-read feature. It really helped solve the problem of giving people an equivalent experience to the one visual users got.
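The auto-read behavior Be describes can be pictured as a tiny scheduler: speak the metrics every 90 seconds, speak immediately on a tap, and go quiet when the feature is collapsed. Here's a hedged Python model of that logic, for illustration only; the class and method names are invented, and this is not Peloton's actual implementation.

```python
class MetricsAutoRead:
    """Illustrative model of a metrics auto-read scheduler: speaks
    metrics at a fixed interval, on demand, or not at all."""

    def __init__(self, interval_s=90):
        self.interval_s = interval_s   # auto-read cadence (90 s in the talk)
        self.enabled = True            # user can collapse/hide auto-read
        self.last_read_s = 0.0

    def _speak(self, metrics):
        # Stand-in for handing text to the screen reader / TTS engine.
        return "cadence {cadence}, output {output}, resistance {resistance}".format(**metrics)

    def on_tick(self, now_s, metrics):
        """Called as ride time advances; returns an announcement or None."""
        if self.enabled and now_s - self.last_read_s >= self.interval_s:
            self.last_read_s = now_s
            return self._speak(metrics)
        return None  # stay quiet between intervals: no chatty readouts

    def on_tap(self, now_s, metrics):
        """User explicitly asked for metrics: always speak, reset the timer."""
        self.last_read_s = now_s
        return self._speak(metrics)


# Example: auto-read fires at 90 s intervals and stays quiet in between.
reader = MetricsAutoRead()
m = {"cadence": 82, "output": 145, "resistance": 40}
print(reader.on_tick(30, m))   # None: not yet 90 s
print(reader.on_tick(90, m))   # spoken readout
reader.enabled = False         # "collapse" the auto-read
print(reader.on_tick(300, m))  # None: disabled
```

The key design point the talk makes is baked into `on_tick`: the default is silence, and speech happens only at the interval, on an explicit tap, or not at all.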
I'd like to give one of our members the last word in this section about the bike before handing off to Oliver. This is a short audio clip from the usability study where you'll hear Glenda, and one of our members. - [Glenda] What are your overall impressions of the new accessibility features on your Peloton bike? - [Natalie] Well, I love it, 'cause it's a game changer. The whole reason I ride is it's the one exercise I can do independently, right? And now I can even set up my ride independently, so it's huge. - So powerful, so cool. And now we're going to launch into talking about building the screen reader experience for our treadmill. This is a project that we started last year, and it's still in progress, and you're actually some of the first people to hear about it. I think this is the first time we're talking about this project publicly, so very, very exciting. So before we started, we learned from our experience with the bike that it was important to be working with people with disabilities all throughout the process. So to begin with, we jumped into research: how were blind people currently using treadmills? And what we found was while blind people do use treadmills very frequently as an exercise tool, there's no treadmill on the market that can talk to people as they're using it. There were users who we talked to who reported having to count every single time they hit the speed or the incline buttons just to know how fast they're going. There was one user who talked about finding a treadmill where they could punch in the values for how fast they wanted to go, but they still were having frustration coming back and trying to remember, wait, did I punch in 4 miles per hour 10 minutes ago? 'Cause folks are going through so much cognitive load.
We heard about people finding workarounds, and then eventually getting frustrated with how much cognitive tax was on them as they were working out, and eventually giving up on using their treadmills. Another point of frustration was that a lot of the treadmills today have metrics and programming that's entirely visual. So that programming and those metrics were completely inaccessible to these folks. They were having to use workarounds, using different sorts of wearables to tell what their heart rate might be, to estimate how long they've been working out, and to follow along with their progress. Another thing that we talked about: we described our hardware today, and we described the sighted person's experience, the knobs that someone turns to adjust the speed and adjust the incline, and the general layout of both the tread as you're standing on it and the screen, and then we just listened. We asked people what their expectations would be for that experience. And one of the top things that people reported was the importance of having meaningful auditory feedback as the speed and incline changes. One thing that they did call out was what Be had mentioned, that information overload. They didn't want to hear about every single speed change, but they did want to have tactile feedback. So they were hoping to have, as you turn a knob, little increments that they could feel as they change the speed, which is a little tricky when your hardware doesn't actually have that, so that was one thing that we kind of tucked away. Also, instructors are constantly calling out the target metric zones that they want people to be in during classes, so naturally people we talked to wanted to have that information available to them so that they could be sure that they were in the ranges that the instructors were calling out.
Essentially, again, coming back to that equivalent experience, people wanted the same level of access to onscreen information and interactions that anyone else had. Another thing that came up was that it can be really hard to hear what's going on during a class. So when they were thinking about the TalkBack interactions they might have, or any sort of speed or incline changes, they wanted to make sure that those were perceptible, and they noted that on our bike experience, sometimes it can be hard to hear things read out while they're in class. So with all this information, we asked this question: can we just turn TalkBack on? And Kimberly McCarty, one of our software engineers, had this amazing quote: "TalkBack is a screen reader, and our treadmill is more than just a screen." So while we found that the navigation was usable, everything getting into the class, and even in class, was very close to operational, the hardware itself and the system changes as someone cranks up the incline or speed were another story: we weren't able to deliver reliable and perceptible speed and incline changes and alerts. One of our core safety metrics for evaluating the tread, for both sighted people and blind and low vision folks, was that as someone changes the speed, as someone changes the incline, we flash that information on screen. That needs to be top of mind for people as they're making these changes, because when you're biking, you can stop your feet and the pedals stop. But on the treadmill, the motor actually keeps going. So there were important safety alerts that we wanted to make sure were constantly available to people. The other thing is, when you're working out on a treadmill, you not only have your own exertion, you're breathing hard, your music, your instructors, you also have the foot strikes. You have the motor, which is pretty loud. So you're in a lot more noise competition to get people's attention.
We also found limitations in TalkBack in making sure that really important announcements weren't interrupted midway. Again, TalkBack is a screen reader, and it's told to jump focus to whatever someone might be touching on screen, even if there is a more important safety alert that might need to take priority. So at the end of the day, we wanted to build an equivalent experience, and balance that entertainment aspect with perceptible feedback and safety alerts. The challenge was condensing how much information we had into a single accessibility audio stream that plays nicely with everything else. So we looked to this model that has nothing to do with treadmills: the car infotainment system. There's a picture here of someone driving; they're interacting with their GPS or something on the dashboard. They're seeing all these different visual cues about other cars around them, how fast they're going, different dangers that might present themselves. And so we modeled things after that: when you're driving a car, you wanna listen to music, you might want your GPS to interrupt that music. But then if you're drifting off into a lane, you really want to have that safety alert that blares over the GPS. And so we established this hierarchy of announcements. The most important things, things like if your safety key, which is attached to your belt, pulls out of the treadmill 'cause you've drifted back too far, that's the most important thing. That's far more important than the speed changes that might happen, which in turn are more important than auto metrics readouts. The other thing we learned was how important and how helpful chimes and different noises can be to really quickly draw attention, and minimize entertainment interruption. We ended up using these as a kind of stand-in for the tactile feedback requests: as someone changes increments of speed, a chime really quickly lets them know that that's happening, but without interrupting the rest of the entertainment.
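The hierarchy Oliver describes, safety alerts over speed and incline feedback over auto metrics readouts, behaves like a small interruption policy: a new announcement only cuts in if it outranks whatever is currently being spoken. Here's a hedged Python sketch of that policy; the priority levels, class, and method names are illustrative assumptions, not the tread's actual code.

```python
from enum import IntEnum


class Priority(IntEnum):
    # Higher value outranks lower; the levels mirror the hierarchy
    # described in the talk, but the numbers are illustrative.
    AUTO_METRICS = 1   # periodic metrics readouts
    SPEED_CHANGE = 2   # speed / incline feedback
    SAFETY = 3         # e.g. safety key pulled, belt stopping


class Announcer:
    """Infotainment-style policy: only a higher-priority announcement
    may interrupt the one currently being spoken."""

    def __init__(self):
        self.current = None  # (priority, text) or None when idle

    def request(self, priority, text):
        """Returns True if the announcement plays (possibly interrupting)."""
        if self.current is None or priority > self.current[0]:
            self.current = (priority, text)
            return True
        return False  # suppressed: something equally or more important is speaking

    def finished(self):
        """Current announcement finished playing; channel is free again."""
        self.current = None


# Example: a safety alert interrupts a metrics readout,
# but a speed chime can't interrupt the safety alert.
a = Announcer()
a.request(Priority.AUTO_METRICS, "Speed 4.0, incline 2.0")        # plays
a.request(Priority.SAFETY, "Safety key removed, stopping track")  # interrupts
a.request(Priority.SPEED_CHANGE, "5.0 miles per hour")            # suppressed
```

This is the same shape as the car analogy: music yields to the GPS, and everything yields to the lane-drift alarm.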
Finally, audio ducking: we actually dove into interesting human factors research about the desirable amount of audio ducking, so you don't startle the user into hurting themselves, but the announcements are still perceptible enough. So for this next video, I actually would love you to focus mostly on the audio here. This was our initial proof of concept to make sure that the audio chimes were going to be sufficient to guide someone through using all of the different hardware interactions. So this isn't a complete video of all of the different interactions that are possible, and it's not the final thing, but it was what we used to validate our approach. So Haley's going to demonstrate changing the speed, which is done by the knobs on the right side of the treadmill. - [Speaker 2] You'll hear a different chime if the speed's going up or down. You'll also get an announcement if you pass an increment of two miles an hour- - [Speaker 1] 2.0 miles an hour. - [Speaker 2] Or if you let the speed settle for 1.5 seconds, you'll get an announcement of that speed. - [Speaker 1] Speed, 2.7 miles per hour. - [Speaker 2] You also get a chime and an announcement if the belt stops, so whether that's the stop button or turning the knob all the way down... - [Speaker 5] Stopping track. - So, that was our proof of concept, and Patrick... Patrick, who works at Deque, and who is a blind accessibility expert, was one of the people that we evaluated that with to validate our approach. And he has been working with us, our whole development team, to validate this approach. - You realize you're gonna spoil me. You also do realize this is revolutionary, right? I know of no treadmill that can do that. I did look around about four or five years ago when I was buying a treadmill. Like, is there any way this thing could talk to me? And everyone said no.
And I think the other thing is just adding the audio to the controls for a blind person, that is revolutionary in itself, but you just now reminded me: not only did we get that, but we got the Peloton training experience added on to it, so it's like a double revolutionary thing. Not only do I get a treadmill where I can operate the controls and know how fast I'm going, I'm getting one that allows me to be included in the classes, so that's really good. - Be, do you wanna take it from here? - [Be] All right, thanks so much, Oliver, that was great. Yeah, so wrapping up, I wanna highlight three key takeaways we want you to have from this talk. The first is that involving people with disabilities has a large positive impact. The slide, sorry, same slide. Just wanted to explain that the slide has a photo of one of our blind members riding the bike. The projects we told you about would be completely different without community input and feedback loops. It's almost hard to imagine what they would be like. First of all, improving the accessibility of our connected fitness products started from hearing from our members. Gathering input, testing, and refining had a big impact on results. There are also feedback loops within our teams: hearing from blind and low vision members about the positive impact we're having helps to motivate our development teams as well as leadership. Now we're making sure to have those same user-involved approaches in developing TalkBack for the tread. Next slide. The second takeaway is that designing equivalent experiences can require creativity. This slide has a picture of part of the bike screen, part of an image shown earlier, with a lot of visual information. An accessible product gives an equivalent experience to people with different abilities. You perceive the same things, even if in different ways, but it's not always straightforward to figure out what an equivalent experience is.
This is especially so when there's a lot of visual information and you're trying to design an audio equivalent. Attention works differently with visual and audio inputs, so it can require some creativity, testing, and iteration to figure out the best equivalents. We saw this with reading metrics on the bike, and now we're also seeing it with understanding safety alerts on the tread. Next slide. The final takeaway we wanna leave you with is that inclusive design and retrofitting can coexist, maybe a surprising takeaway for an accessibility conference. An inclusive approach is about involving people with disabilities and incorporating more perspectives to make your product accessible, and it typically goes hand in hand with iterating and testing as you gather feedback from people with disabilities along the way. Retrofitting, by contrast, is often used in a derogatory way, as something you shouldn't be doing: it involves not taking accessibility into account at the start of a project, but adding it in later. We often say that retrofitting is to be avoided in favor of inclusive design, that we should bake accessibility in, not add it on later. I completely agree and often say this myself. However, in reality, many accessibility efforts do involve making products accessible after the fact. I hope the stories we've shared today make you excited to bring an inclusive design approach to your accessibility initiatives, whatever stage your products are at in their accessibility journey. I think we have a little bit of time for Q&A. - Yes, we do. Awesome presentation; let's get into a couple questions in our remaining time. So the first question is: what were some of the challenges in performing accessibility testing while people are exercising? - So one of the things I'll say is I didn't get it at first.
We ran the first two usability tests, and Andrea had to tell me, "Glenda, you need to let them hit full exertion." I was a little nervous, because I'm not used to making people work out in the middle of a usability study, but how much of your brain is available to interact with the device changes when you're at full exertion, so we wouldn't have gotten good usability results if I hadn't let that occur. - Absolutely, thank you, Glenda. The next question is about process: what's your process for managing accessibility feedback and ongoing user touch points, and how do you decide what to prioritize? - That's a great question, and it's something I think we're constantly asking ourselves too. We tend to look at things based on the degree of impact to users, and I think Be outlined some of the core user flows: starting a class, exiting a class, browsing for a class. Those are areas that are core to our user experience, so we weigh both how impactful something is to being able to participate fully in a class, and how critical it is to the user experience. Is it just annoying, or is it an absolute blocker for someone to participate? We have gotten really great feedback via our accessibility email that has pointed us to issues, and we have developers and QA folks who point us directly to bugs, which we jump on immediately because they are critical. There are other things as well. We ran a more experiential study last summer with Glenda's help, and that was not about usability at its core; it was about inclusion. We did this among a number of different cohorts, but one of the groups was blind and low vision users, and we really wanted to see, from end to end, participating in both the exercise and the community, across all sorts of different features: were people feeling like they belonged in that experience? What could we do to improve upon that? - Wonderful, thank you.
I think we have time for one more question here. This is from Kimmy. They write, "Thinking of the customized settings for screen reader users, have you considered adding settings for specifics, like metrics and detailed info for the classes?" - That's a great question. It is something that we've talked about, but it hasn't been on our immediate roadmap, because we've been prioritizing making strides on the treadmill right now. Personalization, of course, is always very important when it comes to accessibility, and it's something that we do want to work on; there's just only so much we can work on at once. - Fantastic, thank you. Okay, I think we are just about at time, so I want to thank Oliver, Be, and Glenda for your presentation today, super exciting stuff. I'm really excited to see the kinds of accessibility features that Peloton develops next. - Thank you, Grace. - Thank you so much. - Thanks.