Accessibility in a Product with Thousands of Pages (TurboTax)
Type: Breakout
Track: Development
Learn how engineers at Intuit tackled accessibility issues across thousands of screens in their tax-filing product, TurboTax. Attendees will learn how the team used axe to pinpoint high impact issues, ultimately reaching zero violations in the product’s core interview experience.
 
All righty, I think we'll go ahead and get started. Hi everyone. My name is Grace Kirkly, with the Deque team here. It's my pleasure to be moderating Accessibility in a Product with Thousands of Screens with our special guests, Kendall and Tyler Krupicka. I'm going to go over a couple of housekeeping notes. This will probably be the last time you hear them, so I'll go ahead and get through them. Today's session will be recorded, and the recording will be hosted on demand for everybody to enjoy and watch again on this same presentation page.
Second, the slides for today's session are also available on the presentation page, where it says "download the documentation" on a nice blue button there. If you don't see them, you might have to refresh the page. Finally, we are going to save the last 10 minutes of today's session for Q&A, so if you have questions, please post them in the Slido Q&A section that's located next to the chat and next to the video screen. I also recommend sorting your chat by recent so new comments scroll dynamically to the top. So with that, I will turn things over to Kendall and Tyler to get us started.
>> Thanks so much for the introduction, Grace. I'm Kendall and I'll be speaking with Tyler. If you're joining us today, thank you. We know we're the last conference of the day so thanks for taking your personal time, probably not work hours, to join us on accessibility in a product with thousands of screens.
So a little bit about Intuit. We're the makers of TurboTax, QuickBooks and Mint. Our headquarters are in Mountain View, but both Tyler and I are from the San Diego area. We were founded in 1983 and we're about 9,400 employees with 50 million customers worldwide. For our agenda, we'll be focusing first on the scope of the problem, the routes we took to solve that problem, hitting our stride cleaning up our code, and where we are today and the progress we've made.
>> So over the course of this talk we are going to be talking about a project that took place over the last couple of years working on TurboTax, which is a product not everyone on this call may be familiar with. What is TurboTax?
TurboTax is an online tax preparation product that is primarily used in the United States and Canada. What that means is every year people have to go and submit some paperwork to the government for their taxes, and TurboTax helps customers fill out those forms in what is hopefully an easy and intuitive way. People aren't going onto a website and filling out all the forms that the United States government put together. The product is taking those and breaking them down into simple questions that hopefully are a lot more intuitive to navigate, and on top of that, the product is doing a lot of work to either help you import documents or figure out what forms you even need to fill out and file. Hopefully you're only answering the questions that you actually need to answer. When referring to that style of experience, we call that the interview experience. It's mirrored after an interview that you might have in person, where somebody is asking you questions about your past year; that's what the whole product tries to feel like. The whole goal of that is it's more personal, more conversational.

But with that interview experience comes a lot of complexity. In order to express all of this tax code and all of these different forms as a bunch of broken-down, simple questions, it takes thousands and thousands of possible screens. Some of those screens are hit by almost every customer, and some of them are only hit if you have very specific things happening in your tax return. Maybe there's a specific form for boats or something, and so only the subset of customers that that applies to are actually going to see that screen. On top of that, screens can get dated, or they're updated less frequently if the tax code around them is also updated less frequently. If you think about what teams are actually focusing a lot of their time on in the interview, a lot of it is focused on the screens everyone sees and also the things that we need to refresh to match new tax code. So you have got thousands and thousands of possible screens, you have got these teams working across it, and some of the screens may not have been worked on in the last year or two. And with all of that, a lot of the same product constraints that every other product has still apply. When somebody is going through TurboTax, ideally it will be a consistent experience that feels like it was built by one company and one team of designers and developers. So there's a lot of complexity in making all these screens feel like one cohesive experience even if there's a bunch of people working on it and they may not have worked on a given screen in a long period of time.

To manage a lot of that complexity, we treat a lot of the central interview experience as a design system where it's all managed and deployed centrally. So the page is built out of components like input fields and radio buttons and other common patterns that appear in the forms or in our question-and-answer experience. Those will all be designed and updated centrally, and even a screen that hasn't necessarily been updated by a team in the last year will still get the latest designs.
And on top of that, in order to kind of make that system work, we've also built out about 500 of what we call mocks which are basically example screens of the common patterns you'll see. So even though we might not run a test that touches all of the thousands of different tax scenarios that are possible, we hope that we're testing all the different normal UI patterns that might appear on those screens.
And so over the course of this talk we are going to be going through a project that we did to be a lot more proactive and fix a lot of the problems in the interview experience with regards to accessibility. Taking you back in time to the beginning of the project and where we were when we started: we had this huge number of screens, and a team of anywhere from three to five people actively working on managing the central set of designs for all of those screens. We had some accessibility fixes that we were doing, but we were doing those mostly through bug reports. And bug reports tend to skew towards really common screens, like the ones that everyone hits; more people either tell us that there's an issue, or more of our QA resources go towards those, so it's easier to find them. We didn't really have any processes in place from the development or the design side to proactively audit issues from our end. It was mostly a feedback loop where we would put out new releases and then hear back from the different quality teams about things we needed to fix. So scoping the whole accessibility problem space was pretty difficult, because we knew we had all these screens but we really didn't have great visibility into how well we were doing on accessibility.
So with that, we put together some goals for being what we call proactive about accessibility: taking a more active role in our design and development process to, one, figure out how many problems we actually have, and two, build a solution for all of this so we can be proactive about accessibility, integrate it into our developer workflow, and get rid of issues that may not have been found yet. The first goal we knew had to be in there was using automation. This was just due to the constraints we have: a huge number of screens and a small number of people to work on this, so we need some sort of tools to help us focus in on what's important. We wanted to make sure that as a first step we tried to stop developers from introducing any new accessibility issues into the interview experience. That would come before actually learning more about what issues we have out there and starting to fix them over time.
And then we wanted to be able to focus in on the most widespread issues first, since fixing something that occurs on almost every screen has a much bigger benefit for customers than fixing the experience on just one screen. We also went into the project knowing this would be a long-term thing: learn more about the accessibility of the product, go ahead and fix it, and maintain it long-term. We wanted to be able to track our improvement over time and have that as a benchmark to keep us focused on continuing to solve problems. With that, I'm going to hand over to Kendall, who will talk about automation.
>> Thank you, Tyler. We had our goals and knew we had to find a way to automate accessibility testing, so we began looking at a few options. After looking around a bit we settled on axe-core. It fulfilled most of our requirements; zero false positives, for example. Axe will not add an accessibility testing rule unless they're sure it will not create false positives. You might not cover every single test case using axe, but you'll be able to look through the errors in the output and be sure the errors showing up are all accurate results. It was also important to us to find something that's easy to integrate into our system. For us, adding axe into our builds didn't require changing much; we were able to integrate it seamlessly and have it work. We began integrating it into our code. We wanted to utilize the fact that we were running cross-browser testing already, so after each automation test that ran on a browser, we ran axe before leaving the web page. Once we gathered results we compiled them together to provide feedback. And with that, we got our initial results. After integrating axe-core we found that we had approximately 2,000 violations across our 500 mock screens. This was the number of violations after we turned off some rules that we were saving for later in our accessibility story, such as color contrast and duplicate IDs. We then began recording those violations on every PR. This way we could ensure that we were not introducing new bugs while we worked to eliminate the existing ones. We determined that many of the violations were caused by 13 high-impact components, such as links and buttons, common components that were used a lot in our design system. Therefore, by focusing on those high-impact components, we were able to eliminate many bugs across several different pages at once.
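As a rough illustration of that kind of integration (not the exact TurboTax code), a browser test that runs axe against a page after the functional steps might look something like the sketch below, assuming the @axe-core/puppeteer package; the URL and the disabled rules are placeholders.

// a11y-check.ts -- minimal sketch: run axe-core against a page at the end of a browser test
import puppeteer from 'puppeteer';
import { AxePuppeteer } from '@axe-core/puppeteer';

async function checkPage(url: string) {
  const browser = await puppeteer.launch();
  const page = await browser.newPage();
  await page.goto(url);

  // ... normal functional test steps would run here ...

  // Run axe before leaving the page; rules like color-contrast can be
  // disabled while they are being handled separately on the design side.
  const results = await new AxePuppeteer(page)
    .disableRules(['color-contrast', 'duplicate-id'])
    .analyze();

  await browser.close();
  return results.violations; // each violation lists the rule id and offending nodes
}

// Example: fail the run if a hypothetical mock screen has any violations.
checkPage('https://example.com/mocks/simple-question').then((violations) => {
  if (violations.length > 0) {
    console.error(`${violations.length} axe violations found`);
    process.exit(1);
  }
});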
Then, after the initial results, we wanted to create a process for documenting these test cases. We began adding axe into our PR pipeline. We categorized the results to showcase our biggest failures and where we needed to spend the majority of our time. We only added features and bug fixes that would improve accessibility, not hinder it. If a PR had a violation, we would wait to merge it until that violation was fixed. We provided links to the screens that were causing failures so we knew where those bugs were happening, which made them easier for us to fix.
>> On the right side of the screen here, we have an example of that. Basically this is what would be shown to a developer. So in their actual PR flow, a bot will come in and comment and say hey, here's the accessibility report for your changes, here's the number of screens or tax topics, here's the number of errors that were found in master, in the main branch of the code base, and here are all of the errors that were in there with your change. So at a glance, the developer can get an idea of any of the things that may have been impacted. That goes back to what Kendall said about helping the developers narrow in on individual pages that were impacted by the change.
>> Thank you. And then, charting our progress. When we were going through this process, we wanted to chart the progress and show our effect on the product. So we created a script that would collect the axe violations every release, and we could track the number of violations over time to ensure we were making improvements and showcase the work we had been doing on our product.
To the right we have a little chart showing just like an idea of what our charts might look like.
>> So after laying all of that groundwork, finding out we have 2,000 violations across all of these screens, identifying some high-impact components, and having things in place to actually chart progress over time, we went in and started addressing issues proactively.
In the first month we were actually able to fix a surprising number of the issues. I think a huge takeaway from the project was that when you have issues that are really widespread, fixing very small, trivial mistakes can have a really huge impact across the entire product, and it definitely multiplies as your product gets more complicated and has more screens. In the first month we were able to fix 1,350 out of the approximately 2,000 violations, which is a huge amount. When you break down some of the fixes we started doing, it makes sense.

One of the things is links. Links are something that appear on most screens in the product. They're very small and fundamental, but you need to make sure you're doing them properly, because a violation in your core link component is going to be reflected on every screen. For us there were some minor ARIA attribute issues, so it was really important to fix those. That got rid of almost 500 violations. Another really big example for us was tables. Inside of TurboTax you can encounter screens that might have a table grid with a lot of inputs in it. Those relate back a little bit more to the forms that are going into the tax return. With those it's really important that we're using all of the proper roles to allow people to navigate within a table grid, but it's also important that we're linking every input to the rows and columns and making sure they're all labeled, because when they're just floating in the center of a table, that might be confusing for a screen reader without all the context on the rest of the page. Those were a fairly common pattern, and going in and addressing those issues could have a widespread impact as well. Probably the last thing I'll address on this topic is hidden labels. Sometimes as we resize screens, we might hide some of the labels because they would be duplicated by different section headings or things like that; it just makes it clearer visually when using a mobile device. But when we do things like that, we have to make sure that all of the labels are still available and linked properly using ARIA attributes so screen readers wouldn't suddenly have their label disappear. Those were the core changes that we made, and they contributed most of the over half of the violations that we were able to remove. (A small sketch of those labeling patterns follows below.)

It's also worth noting that in that time, due to the checks we put in place to stop people from adding new violations, there were none added. Another side effect was around color contrast. As Kendall mentioned, in our reports for developers we turned off the color contrast rules initially. That was for two reasons. One was that there were a lot of color contrast violations, so our whole reports would be flooded with them if we turned it on. The other was that in order to fix them, it requires a full design approach that refreshes a lot of the designs through the product to have better contrast, and that's not something we would necessarily be addressing on the development side of things. But we still wanted to push that project forward, and we had a lot of great design partners who were working on actually doing those refreshes. So now that we had reports available, we could turn on color contrast and generate a report to start talking with our design partners about where to prioritize their work to improve the contrast.
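To make the table and hidden-label fixes described above concrete, here is a simplified, hypothetical sketch in React; the component names, ids, and CSS class are made up for illustration and are not TurboTax's actual code.

import React from 'react';

// A label hidden visually on smaller layouts but still announced by screen readers,
// using a "visually hidden" CSS class rather than display:none.
const AmountField = () => (
  <>
    <label htmlFor="wages" className="visually-hidden">Wages for this job</label>
    <input id="wages" type="text" inputMode="decimal" />
  </>
);

// An input inside a table grid, labeled by both its row header and column header
// (th elements elsewhere in the table with ids "row-spouse" and "col-withheld"),
// so a screen reader hears "Spouse, Federal tax withheld" instead of an unlabeled field.
const WithholdingCell = () => (
  <td>
    <input id="spouse-withheld" type="text" aria-labelledby="row-spouse col-withheld" />
  </td>
);

export { AmountField, WithholdingCell };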
And so even though in the first month we were able to get rid of over half the violations, progress became increasingly slower as time went on. As you can imagine, we were addressing the widespread problems first, which means the last things we were addressing might only occur once in the entire product, and they required specialized tweaking and debugging to figure out what was actually going on. In fact, the final 15 violations that we solved were a problem that occurred only on a single page, so it required as much development time to solve one axe violation as it did to solve 500 in the early part, which makes it a bit harder. But even with that, we were able to reach zero axe violations in about five months. That was just with some constant work each sprint, focused on whittling away at remaining problems and prioritizing it. One thing I would note is that over time we did find more violations, because we would update axe, and axe would have new rules available and would find new violations, which is great, but it also meant our numbers would go up and we would have to address those as well.

One of the things that did help make this process easier is that as we were going along, we were also doing development work to introduce new designs or refresh designs in the product. So as we were taking on those tasks, we could come into them and make sure that we were also addressing some accessibility issues at the same time, or making sure that any new designs we added had no axe violations. That helped us include accessibility in other work streams as well. To make sure we were tracking the project along the way, we took time each sprint to create axe reports.

As we were going through and solving those problems, we found a few things that made it easier along the way. One thing that really helped was publishing a build of the TurboTax UI every time somebody was making a code change, in such a way that you could go on different devices, or different team members could easily navigate to a URL, test out the change and see what happened. Why this was really helpful is, for example, if we're testing in multiple screen readers, some of those might require Windows; some of our developers might be on Windows versus Mac, and you want to be able to quickly send a URL either to another computer you have or to another teammate or QA and have them pull it up and take a look. That also helps you with some screen size testing as well. On the development side, we focused on adding a few linting rules that could help us be more accessible and find accessibility problems inside of our code editor, instead of getting all of our results from axe once it's actually on the page. One of the tools that was helpful for that was a package called eslint-plugin-jsx-a11y. That one helps you find issues or get accessibility hints in React. We also added another one for CSS called stylelint-a11y. One of the things that was really helpful in that plugin is that over time we started to make sure we had reduced motion support across the product, and the stylelint plugin let us warn developers any time they had a CSS animation that didn't account for reduced motion.
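For reference, wiring up those two linting tools is mostly configuration. A sketch of what the two config files might look like follows, assuming the packages referenced are eslint-plugin-jsx-a11y and stylelint-a11y; the exact rule choices are illustrative, not Intuit's actual config.

// .eslintrc.js -- sketch: enable the JSX accessibility rules for React code
module.exports = {
  plugins: ['jsx-a11y'],
  extends: ['plugin:jsx-a11y/recommended'],
};

// .stylelintrc.js -- sketch: warn when a CSS animation ignores prefers-reduced-motion
module.exports = {
  plugins: ['stylelint-a11y'],
  rules: {
    'a11y/media-prefers-reduced-motion': true,
  },
};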
>> So yeah, we have some kind of outcomes of this and our thoughts on automation in general. Do you want to go to the next slide?
So automation covers about one-third of all accessibility violations. It's important to lean on it, but also not to rely on it solely when you're testing accessibility. Automation can fall short in areas with subjective rules, for example, things that are harder to test. So it's important to do manual testing, such as visual, keyboard nav, and screen reader testing, as well.
And expect things to change. Guidelines are always being updated to improve user experience. We're learning things every day about users and better ways to display information on the web, so expect new versions of rules and automated tools to align with those guidelines, and it's important to stay up to date on those.
And accessibility should start at the design level, not at the development level. Overall, if your designs are accessible, you'll save time during the development stage by not having to go back and forth with design later. One way we found that we can help with that is to encourage accessible design by creating tools that make accessible designs easier to produce. An example of this was adding a color contrast checker that pulls in our color schemes and allows designers to check the colors they select from the Intuit design library to make sure they meet color contrast requirements.
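The kind of check a tool like that relies on is the standard WCAG contrast calculation. A small TypeScript sketch of it (our own illustration, not Intuit's internal checker) looks like this:

// Sketch: WCAG 2.x contrast ratio between two sRGB colors given as [r, g, b] in 0-255.
type RGB = [number, number, number];

function relativeLuminance([r, g, b]: RGB): number {
  const linearize = (channel: number) => {
    const c = channel / 255;
    return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
  };
  const [R, G, B] = [r, g, b].map(linearize);
  return 0.2126 * R + 0.7152 * G + 0.0722 * B;
}

function contrastRatio(fg: RGB, bg: RGB): number {
  const [lighter, darker] = [relativeLuminance(fg), relativeLuminance(bg)].sort((a, b) => b - a);
  return (lighter + 0.05) / (darker + 0.05);
}

// Normal-size text needs at least 4.5:1 to meet WCAG AA.
console.log(contrastRatio([255, 255, 255], [0, 115, 119]).toFixed(2)); // white text on a dark teal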
Right here on this slide we have an image of our old version of TurboTax. We have a lot of things happening here that are hard for users to see: buttons that have very light coloring, as well as text that's not very dark, so it's hard for users to find those on the page and navigate to them. From what our axe automation showed us, we were able to make a different display, if you go to the next slide. We increased color contrast, using a darker teal from our library, and we increased typography sizing as well as changing the colors. So again, it's an ongoing process, but we're trying to update designs to reflect what axe is showing us on the automation side.
I think you're muted, Tyler.
>> So as we get towards the end of the talk, we would like to point out a couple of open source projects that we created along the way while doing all of this work. If you're doing a similar sort of work on your projects, then maybe these would be helpful for you and could help you integrate some of the things we talked about. The first project here is called proof. It is available on public GitHub at github.com/intuit/proof. What it is is a test runner for a project called Storybook. Storybook is a very popular project for documenting your user interface, so if you're building React components or Vue or anything, there's a chance you're using Storybook for your documentation site. If you're doing that, this test runner lets you run tests against your documentation. It includes a plugin for axe, which we use on our projects, and that will generate the report of axe violations. You can tell it to grab every story on our documentation site, run axe on it, give us the results and compile that. That's really handy if you have the same workflow as us.
And the second project, which was released very recently, actually only in the last week, is called accessibility snippets. It's an extension for the Visual Studio Code editor, and it adds a bunch of ARIA snippets that help you quickly piece together an accessible interface if you're using React. It's got a lot of the common pieces of code that you would put in as you're building out React components. Since this was only released very recently, it might not be in the distributed versions of the slides, but we'll try to make sure that the link is posted in the channel. And it's free and open source. So with that, it looks like we're done a little bit early, but we're happy to take some time with questions, so feel free to drop those in.
>> All right. Thank you so much, Tyler and Kendall. Let's get into some questions. Just a reminder, you can use the little thumbs up icon in the Q&A if there's a particular question that you would like to see answered.
So I'll kick it off with this first question  here. How did you identify the 13 high impact components that you referenced?
>> That's a great question. So that was actually a little bit more difficult; that's probably something we could have covered a little more. When we started running axe against our pages, we could see a few things that helped us group and identify them. One was that we already had a grouping of our types of pages, so it was pretty easy to start to pick out that more violations were happening on a particular type of page and zero in on that. Part of it, too, was that when axe was giving us results, it would report something like an invalid aria-owns property, and we could see there was a ton of them. With that we could drill in and ask what's causing this. Okay, that is coming from links or something like that. So it was a two-pronged approach. We would look at types of screens that had a high number of violations and drill into the components manually by looking at the screen, and we would look at axe violations by count and try to figure out why a particular one was so high. Sometimes that could be multiple components involved and you have to figure that out through a manual review, but often if one rule was really sticking out, there was a chance it was a widespread component that had that particular issue.
>> Awesome. Thanks for the clarification on that. Next question here is: how do you consolidate accessibility reports from different teams/products?
Do you do it manually, or are there any tools that helped automate that?
  
  
>> That's a great question. Luckily for us, the set of mock screens that we use for testing, as well as some of the other screens in the interview experience, already span a large number of teams, so we don't necessarily have to coordinate down to specific teams as much as we do for the design refreshes. But I know there is some different work that goes on with maybe the marketing sites or something, and we don't necessarily have combined sets of data for all of them, but there is an internal community of people who are working on accessibility across products at Intuit. So between the different teams and the representatives who are involved with accessibility, we have more visibility into issues. Definitely not a perfectly solved problem, but yeah, it's something that's always ongoing.
>> Yeah, to touch on that, with our design systems we do know which teams are using which components. We are able to get metrics on which teams our components show up in. Recently we found an accessibility violation and were able to look at our metrics, see that 20 repos are using our component, and know we needed to get on it and update it. The nice part is we make the fix in the design system and all it takes is the team updating the version.
>> Awesome. All right. Next question here has a couple of parts. How did you prevent new accessibility bugs from being merged?
Automated testing on PRs that prevented them from being merged while there were violations?
What did you do about developers saying I don't know how to fix this if they didn't have accessibility skills?
>> The nice thing is we're a smaller team, so for us it was fairly easy. We were a team of four and we were adamant about not adding more violations. I definitely think at different companies it varies; you'll have a project with lots of people working on it. I guess it falls under the same thing as tests in general: you don't merge code that has a failing automated test, right?
This is the same thing, so you're not going to merge the code if a new axe violation is showing. Granted, if you have 2,000 violations and [Away from mic] can't tackle all 2,000, that's understood. We focused on not introducing new ones and making sure that count stayed pretty consistent. And again, the results did show up on our PR, so when someone on the team is reviewing your PR, they can easily see that it was 2,000 violations before and now it's 5,000, and you have a problem. That's kind of an exaggeration.
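Conceptually, that gate is a per-screen comparison of violation counts between the main branch and the PR branch. A simplified sketch of that comparison (hypothetical data shapes and screen names, not the actual Intuit tooling) might look like this:

// Sketch: flag a PR if any screen has more axe violations than it did on master.
type ViolationCounts = Record<string, number>; // screen or mock name -> number of axe violations

function findRegressions(master: ViolationCounts, branch: ViolationCounts): string[] {
  return Object.keys(branch).filter(
    (screen) => branch[screen] > (master[screen] ?? 0)
  );
}

const regressions = findRegressions(
  { 'simple-question': 2, 'table-grid': 5 },   // counts from the main branch
  { 'simple-question': 2, 'table-grid': 7 }    // counts from the PR build
);

if (regressions.length > 0) {
  console.error(`New axe violations introduced on: ${regressions.join(', ')}`);
  process.exit(1); // block the merge
}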
 
 
>> To add on to that, it wasn't necessarily perfect. We used it through GitHub, so in GitHub it would stop you from clicking the merge button while there was an outstanding violation. It would look at how many violations a specific screen had before the change and after. Technically you could get a case where you fix a few and add a few elsewhere; I don't think that happened, but it was a possibility. The other thing, to answer the third part: I think that's come up a bit more in the design system work as we've expanded it, where there are more developers involved creating new components, or maybe they did a design and they're contributing that design to the central system as something new, and then our test might pop up and say hey, there are accessibility violations, you can't merge. It does happen that we have people who aren't too experienced with any of the ARIA attributes, or maybe aren't experienced with HTML and JavaScript and this is the first thing they're implementing, so it does require a bit of training. One thing that's really helpful is Intuit has a community group for accessibility called the Accessibility Champions that is run by Ted Drake, the accessibility lead for the company. So if our team was ever unable to come in and help and answer questions, there are other people around the company who have time to step in and help. Ultimately the change won't go in until we feel like everything's been addressed, but hopefully the developers who are contributing feel like they're getting the resources they need to actually address the problem up front.
So the goal is that everyone comes away from it having learned something, with a positive experience, as opposed to us just blocking them until they fix it and they're retrying until it works. It's very much a collaborative process.
>> Accessibility is a team sport, right?
>> Yeah.
>> Fantastic. So this question asks: Kendall mentioned that automation tools may catch just a fraction of the actual WCAG violations. What percentage of the effort up to now was put into automated testing versus manual testing, would you say?
>> Gosh, I guess [Away from mic]. For us, when we first added axe into our repo, we really wanted to focus on those violations, but as we add new components and features we're still checking keyboard nav and doing visual and screen reader testing. I don't know. Tyler, what do you think percentage-wise?
I guess we're really focused on both.
>> I would say up front almost all of our time was spent on fixing those automated violations because, as Kendall mentioned, axe doesn't really have false positives, so we knew zero was a target that was somewhat reasonable, and we knew that those were real issues that we needed to solve. In tandem with that, though, as we were fixing those issues we were getting more accustomed to what kinds of things can come up. We were also discovering new areas of the product, or things that could use a bit more of a manual touch to review all of it. I would say it's definitely shifted towards the opposite now. The project has zero violations now, and if a component is added or something, it might get three issues or so; those are pretty easy to fix. But we are still fixing more and more accessibility violations, and most of those are coming from some sort of manual review. We treat the automation as a strict baseline: we should be addressing all of those, and then beyond that we can improve. One good example of that is we just did a pass through a lot of the different interactive elements and asked, is there anything on this page that is describing this interactive element that isn't properly linked with an aria-describedby or something like that? Because it might have a description on it, but there might be visually three or four things describing it on the screen, and the automated tool is just checking that you have a description; we want to make sure it's including all of them. That's something you have to do a bit more manually, but because we had the baseline of making sure everything was described, we could start to identify, yeah, it's described, but really there's a section header that's also describing it that isn't included, so let's loop back and fix that. It started 100% automated and it's flipped to probably 80% manual now, because we can trust the automation.
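As a tiny illustration of that kind of manual fix (a made-up example, not a real TurboTax screen), linking every visible piece of descriptive text to the input it describes looks roughly like this:

import React from 'react';

// Sketch: the field is described by both the help text and the section note,
// not just the one element an automated check happened to require.
const DependentsField = () => (
  <section>
    <p id="dependents-note">We'll use this to check for credits you may qualify for.</p>
    <label htmlFor="dependents">Number of dependents</label>
    <span id="dependents-help">Count children and other qualifying relatives.</span>
    <input
      id="dependents"
      type="number"
      aria-describedby="dependents-help dependents-note"
    />
  </section>
);

export default DependentsField;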
>> And yeah, more of the subjective things that are harder to check through automation, like is your description accurate?
Does your button just say click me, because that's not helpful. Those things can't be checked with automation but we want to make sure that that's still being tested.
>> Absolutely. That's a great point. And I just wanted to call attention, I believe I starred this in the Q&A for our audience here, but Glenda "the good witch" Sims did chime in. She had a presentation, which you can catch on demand, where some recent data that Deque has pulled together shows that 57% of a site's total number of WCAG issues can be found with automation. So just the fact that you're covering your bases with automation goes a very long way, plus manual testing on top of that.
>> Yeah, that's great context. I didn't realize it had gotten up that high.
>> Yeah, same.
>> So I pinned her message in Q&A if anybody wants to check that out.
We have plenty more questions to get into and plenty of time so I'll keep rolling.
 
 
>> That's great.
>> Here's our next question. How did you track the violations and resolution of them and generate the accompanying charts?
>> Yeah. Originally it was very crude, actually. We used Jenkins for our build pipeline, and Jenkins has a built-in charting mechanism. It's not the most beautiful chart you've ever seen, and it could only do page-level granularity, not axe-violation-level granularity, but originally our charts were just through that because it was very easy to hook in. After that we switched over to a more specialized third-party charting application that was already in use inside the company for charting some traffic metrics and things like that. We really wanted a line chart, and a lot of the analytics stuff is very good for line charts. So that was kind of it. One thing that was tricky in getting all of that information displaying properly is that we were running axe on different pages as we were testing them, and each individual page could have multiple axe violations; you could have a page with 10, you could have a page with one. Initially, the way a lot of our reports worked was that each page just got marked as in violation or not. We found that wasn't completely accurate to what we were actually addressing, because we were focusing on trying to solve the most axe violations, not necessarily fix the most screens.
And so we did adjust our reporting to do that. Basically we'd compile every axe issue per page, tally that up, and at the end of every release build go ahead and publish that metric so it becomes available.
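In other words, the per-release metric is just a sum over the per-page results. A sketch of that tally step, with an assumed data shape loosely mirroring axe-core's output, might be:

// Sketch: collapse per-page axe results into a single number to publish per release.
interface PageResult {
  page: string;
  violations: { id: string; nodes: unknown[] }[];
}

function totalViolations(results: PageResult[]): number {
  return results.reduce((sum, page) => sum + page.violations.length, 0);
}

// e.g. publish totalViolations(allPageResults) to the charting tool after each release build.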
But I will say, as low fidelity as the Jenkins build output was, it still did the job and it took almost no development work, so we had something even at the start. There was nothing wrong with that.
>> Definitely.
This person says: I've used TurboTax for the last 10 years or so and sometimes I see different components/patterns on different pages. For example, one link opens a modal, another opens a side drawer. Is this because of remediation at work?
>> That's a good question. And I think probably the best answer is that the links work that way just because of different features that were added over time, not necessarily a lot of decisions being made right at this moment. I don't know if you would add anything, Kendall, to that.
>> No, it's definitely that we have a design system and we're trying to get everyone onto the design system, but there are a few different pages that are just a little more legacy that we're still trying to update. That's great feedback; we still have pages we're trying to update.
>> I will say, like any piece of software, there are pieces of the product that may not use our design system. Some of the states haven't necessarily been rolled over to it, since it's almost an ongoing process to keep everything up to date. You may encounter things on there, but people are still focusing on accessibility for those areas, even if they're a legacy system, so it's still an okay experience. If you have issues, definitely report them.
>> Yes, absolutely.
>> Things are always in motion with software, right?
>> Right.
>> Can you speak a little bit more to how you're using Storybook?
This question is asking: are you able to check accessibility in Storybook using axe?
>> Yeah, yes. We create components, and we will put a component in Storybook by itself, say a button, so we have a bunch of stories just related to buttons and it's really easy to test keyboard interaction. We utilize one of the plugins, the axe plugin for Storybook, to check that it's meeting those requirements, and it's been a great tool to get that component by itself on a page, really focus right there, see what issues that component is creating, and make sure it's robust enough for a lot of places on the platform.
>> To build on top of that, the way we actually run the tests is the project that I mentioned a couple of slides back called proof, the open source test runner that we built. We build all of our components in Storybook, we test them with axe using the axe plugin in Storybook, and then we run axe on CI using proof. And when the components actually make it into the product, that's when our existing test suites go on top and run axe again, making sure things hold at the integration level: we know the button is accessible on its own, but once a page is built with it, is it still accessible, and is everything labeled properly, and things like that. There are multiple levels to it, but Storybook is how we've been able to document component-level changes.
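For anyone unfamiliar with Storybook, the component-level stories being tested are small files along these lines; this is a generic sketch in Component Story Format with a hypothetical Button component, not Intuit's actual stories.

// Button.stories.tsx -- sketch: each exported story is one state of the component
import React from 'react';
import { Button } from './Button'; // hypothetical component

export default {
  title: 'Forms/Button',
  component: Button,
};

export const Primary = () => <Button variant="primary">Continue</Button>;
export const Disabled = () => <Button disabled>Continue</Button>;
// A test runner such as proof can then load each story and run axe against it.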
>> Fantastic. And just for our audience, you did provide a link in the slide deck to the proof open source project, is that correct?
>> Yeah, it's in the last slide in the provided slides.
>> Awesome. All right. I think we have time for one more question before we wrap. So you did mention if people run into bugs or issues that they should report them. This question is just asking what are some common bugs that users might report.
>> Yeah, that's a great question. Do you want to give an example of something you're working on recently, Kendall?
  
  
>> Yeah, a recent bug that came up was actually [Away from mic] it was with our progress bar. We have a progress bar in TurboTax and we have made it interactive, so you can go down and see buttons inside the progress bar. And it was, at the time, very hard to interact with. We had the progressbar role on it, which made it read-only to ARIA. So we had to take the progress bar off of the progressbar role, rethink it, and make it still sound like a progress bar to a screen reader, because it really was a progress bar, but still allow it to be interactive and not read-only. So being able to utilize ARIA labels to convey how many steps are in the progress bar, and showcase that without using that ARIA role, that was one of the recent bugs, I guess.
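One way to picture the resulting pattern (our own simplification with made-up step names, not the exact TurboTax markup) is a step navigation where each step is a real button and the position is conveyed with labels rather than a read-only progressbar role:

import React from 'react';

// Sketch: an interactive "progress" navigation. Each step is a focusable button,
// aria-current marks where the user is, and the nav label conveys overall progress.
const steps = ['Personal info', 'Income', 'Deductions', 'Review'];

const ProgressNav = ({ current }: { current: number }) => (
  <nav aria-label={`Your progress: step ${current + 1} of ${steps.length}`}>
    <ol>
      {steps.map((step, i) => (
        <li key={step}>
          <button aria-current={i === current ? 'step' : undefined}>{step}</button>
        </li>
      ))}
    </ol>
  </nav>
);

export default ProgressNav;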
 
>> Excellent. Thank you for that example, Kendall. And with that, we are at time. Tyler and Kendall, thank you so much for your awesome presentation. Thank you so much to our audience who has stuck around for our final, but definitely not least, presentation of the evening. I just put a link in chat to fill out a survey. We're raffling off a thousand of these tees, so please do fill out the survey about how axe-con went for you; we would be greatly appreciative of that. Thanks again to our presenters and our interpreter Daniel, wonderful. All right, everybody have a wonderful rest of your evening. Thank you so much.
 
Link to a11y snippets: https://marketplace.visualstudio.com/items?itemName=accessibility-snippets.accessibility-snippets&ssr=false
 
Link to A New Intuit Open Source Release Medium Article: https://medium.com/intuit-engineering/accessibility-snippets-ea82a7439bbe
 
Here's more information on Intuit's accessibility champion program: http://www.last-child.com/intuits-accessibility-champion-program/ and https://www.intuit.com/blog/technology/lessons-learned-from-an-intuit-accessibility-champion/
 
Did you see Deque's big data research shows 57% of a site's total # of WCAG issues can be found with axe-core? And the 15 most frequently failed SC account for over 94% of all issues found? Link to my presentation: https://axe-con.com/event/accessibility-testing-coverage-automation-and-intelligent-guided-testing/
 