Auditing Design Systems for Accessibility
Mar. 10th, 2021 03:22 pm

>> Hello to folks who are just joining us. This is Laura Gosselin from Deque. We are going to get started in a few minutes, so sit tight. Looks like we have folks joining us from the Netherlands, Baltimore, hi to folks in the UK. Hello and welcome to everyone from around the world. We're super excited to have you with us. We'll get started in a few minutes.
>> Okay, it's now 4:30 Eastern Time, so we are going to go ahead and get started. Hi everybody, my name is Laura Gosselin. I work at Deque and I'm going to be moderating today's session, Auditing Design Systems for Accessibility, brought to you by Anna Cook. She's a senior product designer. I'm going to take care of a few housekeeping things. First, just a reminder, the slides are at the bottom of today's event session page if you would like to follow along with accessible slides. Second, we have captions underneath the video today. And third, we are going to try to save the last 10 minutes or so for questions, so feel free to put those in the Slido Q&A feature and I'll be sure to ask them to Anna. With that, I'm going to go ahead and pass it over to Anna to kick us off.
>> Hi everyone, and welcome to Auditing Design Systems for Accessibility. My name is Anna Cook and I'm so happy to join you virtually today. I couldn't be more honored to join this group of speakers. I'm so excited to get started, but first I'll tell you just a little bit about myself. As I mentioned, my name is Anna. I use the pronouns she/her or they/them. I'm a senior product designer at Recurly. As a product designer, my job is to design applications and websites, test those designs with users, and prepare the most robust systems to be implemented in code. I'm also a student at the ATLAS Institute at CU Boulder. In my free time I'm an artist, gamer and writer. I have two adorable cats named Phoenix and Onyx, shown here. You may see or hear one or both of them running around in my background today. They like to speak when I speak sometimes, so just be aware. So, what are we going to focus on today?
Over the past eight years of my career, learning about inclusive design has been hindered by blockers and educational gaps. It's my intention to help unblock designers and tech folks who want to learn, and especially to show my fellow designers where we need to come in for accessibility considerations. Since we can't cover it all in 50 minutes, I will focus on what I believe to be the most pivotal accessibility work. In particular, we'll focus on learning how to find accessibility issues in your existing design system and documenting those issues in a way that makes them actionable for you and your team. Let's start with a little background on design systems. What exactly is a design system?
A design system is a single source of truth for a product, site or experience. It includes a shared purpose and a set of values for a design and development organization, as well as brand elements and component pieces. Design systems group related implementation elements together, define their purpose and clarify how they should be used. When done correctly, a design system should evolve and scale with an organization. These systems are important because they enable us to work at scale, do something once and apply it everywhere, and establish a more consistent brand. But most importantly to our conversation today, a design system is essential because it scales accessibility as well.

In this session's description I mentioned that each atom, molecule and organism we create in our system affects our ability to scale upward and meet disabled people's needs. My mention of atoms, molecules and organisms is a direct reference to Brad Frost's atomic design. When atomic design was released a little over four years ago, it marked a pivotal shift in design thinking in digital landscapes. Though many teams had already begun moving in this direction, atomic design made the case that digital design needed better systems. Four years later, Brad Frost's book is still quoted regularly when design teams make the case for design systems. For our session, we can use this methodology to help inform how to design accessibility at scale and audit our systems for accessibility compliance.

In atomic design, Brad Frost breaks down the scale of design systems using an atomic approach. Our smallest elements are atoms, which can be combined to create molecules, which can in turn be combined to form organisms. These organisms form and scale to create templates, which can then be applied to specific pages in our platforms. Each team has a slightly different understanding or jargon around this scale, but these elements tend to carry consistently through most digital design systems. You don't have to have a robust or perfect design system to use this methodology or audit for accessibility. What we are going to talk about today can be applied at any point in your design system's development, regardless of whether your team is just starting out or already has a well-documented, usable system.

Let's start by breaking down how atomic design tends to scale. I've created examples of interfaces similar to those I've audited at Recurly with my team. Before I move to atoms, let's talk about subatomic particles. Bear with me, I'm not a scientist and I'm definitely stretching this atomic design analogy just a little bit, but essentially these are the foundational elements that carry up through our design system. They aren't atoms, but they inform the creation of atoms. These subatomic particles tend to be things like brand colors or typefaces, and while these elements alone may not seem the most important for our system, they will affect all of the elements we create, from our smallest scale to our full-blown pages. I'll ask you to keep subatomics in the back of your mind as we move forward.

Moving into our atomic scale: atoms demonstrate all the base items at a glance, which can be a helpful reference to come back to as you develop and maintain your design system. Some systems have greater granularity than others, so it's important to be mindful that words like atoms in this context can sometimes mean different things to different people. For our purposes, atoms are elements such as form labels, inputs, buttons and other things that can't be broken down any further without ceasing to be functional.
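As a rough TypeScript sketch of that idea, subatomic particles can be captured as design tokens that atoms consume; the token names and hex values here are hypothetical, not an actual brand palette.

```typescript
// Hypothetical "subatomic" design tokens: brand colors and type styles.
// Every atom built on top of these inherits their accessibility properties.
export const colorTokens = {
  brandGreen: "#3aa54c",   // example value, not a real brand color
  textOnBrand: "#ffffff",
  textPrimary: "#1f2933",
} as const;

export const typeTokens = {
  label: { fontSizePx: 14, fontWeight: 600 },
  body: { fontSizePx: 16, fontWeight: 400 },
} as const;

// An "atom" (a button style) is defined purely in terms of tokens, so fixing a
// token later fixes every atom, molecule, and organism that uses it.
export const primaryButtonAtom = {
  background: colorTokens.brandGreen,
  color: colorTokens.textOnBrand,
  ...typeTokens.label,
};
```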
Using our example UI, I've put together an example set of atoms. But like atoms in the natural world, interface atoms don't exist in a vacuum and only really come to life with application.

In our next step on the atomic scale, molecules are relatively simple groups of UI elements, or atoms, functioning together as a unit. For example, a form label, dropdown selector and save button can now become a functional form field. When combined, these abstract items suddenly have a purpose. Now we can create a purposeful interface by combining these elements: selecting the button atom saves the dropdown input, and the label indicates the purpose of the input. This molecule can be dropped anywhere a dropdown input with a save action is needed.

Scaling up again, organisms are relatively complex UI components composed of groups of molecules and/or atoms and even other organisms. These organisms form distinct sections of an interface. Building up from molecules to organisms provides designers and developers with an essential sense of context. Organisms demonstrate those smaller, simpler components in action and serve as distinct patterns that can be used repeatedly. For our purposes, we've taken our dropdown-and-save form and applied it to a specific context, in this case a currency selection. This organism is also made up of other atoms and molecules and serves a more specific purpose.

Moving away from our analogy, we can use our atoms, molecules and organisms to create templates and pages. Templates are page-level objects that place components into a layout and articulate the design's underlying content structure. If you find you're reusing the structure of a page for a lot of different applications, you can use that template for other pages. A template displays all the necessary page components functioning together, providing context for our relatively abstract molecules and organisms. By defining a page's template, we can create a system that can account for a variety of dynamic content while providing needed guardrails for the types of content that populate certain design patterns.

Finally, pages are specific instances of templates that show what an experience looks like with real, representative content in place. In addition to demonstrating the final interface as your users will see it, pages are essential for testing the effectiveness of underlying design system elements. It is at the page level that we're able to take a look at how all the patterns hold up when real content is applied to the design system.

So we've taken our atoms, put them into a molecule, bundled the atoms and molecules to create an organism, and applied them to a template or page. These are the atomic design principles that have made our design systems thrive. A well-scaled design system can make it infinitely easier to ideate around a user's needs and make your workflow highly effective. Pretty easy, right?
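Here's a rough sketch of the same scale expressed as code composition, using plain DOM calls in TypeScript; the element names, labels and currency options are illustrative, not a real component API.

```typescript
// Atoms: the smallest functional pieces.
function labelAtom(text: string, forId: string): HTMLLabelElement {
  const label = document.createElement("label");
  label.htmlFor = forId;
  label.textContent = text;
  return label;
}

function dropdownAtom(id: string, options: string[]): HTMLSelectElement {
  const select = document.createElement("select");
  select.id = id;
  for (const value of options) {
    const option = document.createElement("option");
    option.value = value;
    option.textContent = value;
    select.appendChild(option);
  }
  return select;
}

function buttonAtom(text: string): HTMLButtonElement {
  const button = document.createElement("button");
  button.type = "submit";
  button.textContent = text;
  return button;
}

// Molecule: label + dropdown + save button working together as one form field.
function dropdownSaveMolecule(labelText: string, id: string, options: string[]): HTMLFormElement {
  const form = document.createElement("form");
  form.append(labelAtom(labelText, id), dropdownAtom(id, options), buttonAtom("Save"));
  return form;
}

// Organism: the molecule applied to a specific context (currency selection).
function currencySelectionOrganism(): HTMLElement {
  const section = document.createElement("section");
  section.append(dropdownSaveMolecule("Currency", "currency", ["USD", "EUR", "GBP"]));
  return section;
}
```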
Wait!
Hold up!
Atomic design is excellent, don't get me wrong. But did some of you notice some of the issues we created along the way? When we created this example interface, we weren't thinking about accessibility. We were just thinking about the design system. And doing that can have implications for the entire system. So let's take a step back to understand some of those implications.

I've taken our example page from earlier and marked it up with a set of things I noticed right off the bat that need attention. Because we used these atoms, molecules and organisms without thinking about accessibility, we have elements with accessibility concerns and issues baked in all over the page. We have multiple elements on the page with too little contrast, a lack of clarity about which form fields are required, no clear error handling, and more. There could be more than this list alone, but we can notice these accessibility issues in our system when we start to audit them. Auditing whole pages can be a little overwhelming, especially when issues are related to a component's inherent design. This initial sweep of the page has shown us a handful of issues, many of them related to others. So how about we break this back down using our atomic approach and focus on one of these atoms?

I'm going to take us back to the very beginning: before we made a molecule, we had our atoms. I'm going to untangle this page's accessibility by beginning with one atom, our button. Our example button design may have a few shortcomings, but to keep it simple, let's focus on how it didn't meet color contrast compliance. This relates back not only to our atom but to our subatomics, our text styles and color options. This is part of why I asked you to keep those in the back of your mind. At the very beginning we talked about the foundational elements and how they affect these designs. Now we can see the ramifications of using those foundations to create atoms without accessibility in mind.

For example, our button here uses a green color for its background with white text on top. Our text is 14 pixels at a semi-bold weight. If we test this using the Stark plugin or any color contrast checker, we can see the button doesn't meet color contrast requirements. This combination of colors has a ratio of 3.2:1, but with a 14 pixel semi-bold font style, the contrast needs to be 4.5:1 to pass the minimum required level. As we discussed, this combination of colors is going to affect more than this component. We know there's an issue with the component, and we know it's going to affect others because its problem is foundational. This is where our auditing really starts to come into play.

Issues like the ones we've encountered are why design needs to be thinking about accessibility, particularly with design systems. There's a misconception in design communities that accessibility tends to be mostly developers' responsibility, but developers can't fix accessibility issues when they're design-centric. In fact, a Deque case study from last year found that 67% of accessibility issues originate in design. Fixing accessibility issues can amplify the time and money spent if they're not addressed early and often. No pressure, but there's a lot of responsibility on our shoulders to create accessible designs. Prioritizing accessibility in these systems can save a lot of time and money. When applied at a broad scale through our design system, things like strong color contrast can create sweeping accessibility improvements in our products. Enough about why. Let's talk about how.
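To make that contrast check concrete, here is a small TypeScript sketch of the WCAG 2.x contrast ratio formula; the green hex value is a stand-in for the example button's background, not the actual brand color.

```typescript
// WCAG 2.x contrast ratio between two sRGB hex colors.
function channelToLinear(channel: number): number {
  const c = channel / 255;
  return c <= 0.03928 ? c / 12.92 : Math.pow((c + 0.055) / 1.055, 2.4);
}

function relativeLuminance(hex: string): number {
  const [r, g, b] = [1, 3, 5].map((i) => parseInt(hex.slice(i, i + 2), 16));
  return 0.2126 * channelToLinear(r) + 0.7152 * channelToLinear(g) + 0.0722 * channelToLinear(b);
}

function contrastRatio(foreground: string, background: string): number {
  const l1 = relativeLuminance(foreground);
  const l2 = relativeLuminance(background);
  return (Math.max(l1, l2) + 0.05) / (Math.min(l1, l2) + 0.05);
}

// 14px semi-bold text is not "large text" under WCAG, so it needs 4.5:1 at Level AA.
const ratio = contrastRatio("#ffffff", "#3aa54c"); // stand-in green, not the real token
console.log(ratio.toFixed(2), ratio >= 4.5 ? "passes AA" : "fails AA for normal text");
```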
The easiest time to review your designs for accessibility compliance is before they are developed, but that's not always how it goes. Many of us work on design systems with a combination of live components, in-flight components, and even components that are live but undocumented in our design files. The starting steps of an accessibility audit for your design system look the same as a standard component audit. We want to review the components that are being most actively used to create designs and document what exists. We don't have to start thinking about accessibility just yet. The intent is to document what exists and gather elements with similar purposes together.
For our purposes, I'll focus on some example buttons for our audit. Your team may want to do a holistic design system audit, and you may find that existing design components include atoms, molecules and organisms. It's important to capture all of these as well as noting their intended purposes. Now, you don't have to write up full documentation or anything like that just yet. Just a note will do. In the screenshot I've included here, we have a set of buttons from our Recurly experience as well as a note about their intended purposes. As a callout, we've described the intended purposes not in terms of colors like green or blue, but in terms of purposes like primary call to action or destructive state.

Similarly, we are going to want to capture all live components. That is to say, all components that exist in any existing pattern libraries, such as what I have pulled up here from Storybook, as well as components that may not exist in your design files or pattern libraries. This should also include things like different interaction states, such as hover or focus. Again, the purpose of capturing these components is to audit the system itself first, and what exists. We don't need to start looking specifically at accessibility issues in design or in code until we have a solid frame of reference for what users are currently interacting with.

The design system audit itself can take a little time and it's not always going to be perfect, so keep that in mind. A team member at Recurly went about this by capturing each live component via screenshot and then uploading it into an inventory file in Figma with the date, time and location. Each item that was captured was placed on a page named by purpose. In the screenshot here, I'm on the button page. What I've captured here looks pretty similar to what we had in our source design, but in other cases we found variants of components that we didn't know existed or didn't have in our design system libraries yet. Once we start capturing what is live, we can cross-reference it with our design team and start to see where gaps exist.

You might be thinking that just auditing your design system isn't going to help with accessibility, but as I mentioned earlier, the best starting point to scale accessibility is within an effective design system. We need our design system to be audited and scaled so it's empowered to scale accessibility. These practices should go hand in hand. Since everything we've discussed so far feeds directly into auditing accessibility, let's talk about what we need to be reviewing for accessibility compliance, as well as how.
What should we be auditing design-wise?
Well, as we talked about earlier, 67% of accessibility issues start in design. So maybe more than we realized. There are many items outlined in the Web Content Accessibility Guidelines, and many of them relate to design just as much as they do to development. Even though our earlier example was focused on a common example of accessibility in digital design, there's a lot more than color contrast that we need to keep in mind. Some other accessibility requirements that need to be outlined in design include color usage, content strategy, heading structure, link behaviors, hover and focus states, form behaviors, and everything else on this list, as well as more.

Once we have a set of components outlined by purpose in Figma, it's easier for us to look past the UI and see the other accessibility needs. Now we can look at our atoms, molecules and organisms according to their purpose and styling, which will help us address more than the surface issues. Because while UI accessibility does matter, one of the reasons designers tend to think our responsibility for accessibility is focused mostly on color and type is that we don't realize how closely tied UX is to accessibility. So let's look at some examples of how we can audit past our styling alone. This example alert is designed to have a set of different types of messages: success, warning, information and error. There are a couple of UX items we can investigate with this alert. For example, we can ask ourselves: should these icons have alternative text, or alt text?
Let's say these icons don't currently have alt text, but we think they should, because we want each icon to convey the alert's tone to users who may not see the color or may not see the icon. So in our accessibility audit, we can note that there's no alt text being used for these icons and relate it to Web Content Accessibility Guideline 1.1.1, Non-text Content. If we're using these icons elsewhere with similar purposes, this note in our audit will help us scale this UX element to many different instances. Or let's say our error alert shows multiple errors in one alert container. We will need to ask if this alert clearly identifies what errors need to be fixed, where the errors are, and how they can be fixed. If our alert doesn't do these things, we can note this issue and relate it to WCAG 3.3.1, Error Identification.

These are some of the questions that we as designers can and should be asking ourselves when looking at our existing components and building new ones. If we have these components lined up together with a reference for their context, it can be easier for us to understand potential accessibility gaps and add them to our audit. And by the way, this is why I tend to talk about accessibility the same way I talk about design in general. You may have noticed that as I talked through some of the items we would want to audit on a component like this, it felt quite similar to your standard design system thinking. That's because it is. Accessibility is just a form of design that requires discussion and testing the way all design does. It's about making sure our designs are accessible and meet disabled people's needs. When you're auditing for accessibility, you can use WCAG as a guide and tap into user feedback to see what needs improvement.

Another thing about our design system audit is that we should be reviewing both designs and code for accessibility. Now, I know some designers may cringe at the idea of reviewing code for accessibility issues, but we don't have to be developers to audit code. There are many tools that enable us to audit code on pages or specific components. And since we're here at Axe-Con, I'll call out the axe plugin. Using this tool, you can run both automated tests as well as guided manual tests to do a thorough review of your components in code. At this point I've talked a lot about what a design system is and what to look for in a design system accessibility audit, but I haven't given you insight into how to document an audit. I know I've alluded to adding things to our audit a few times, but what does that actually look like?
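As an aside, here is a rough sketch of what those two alert audit notes, the icon text alternative (1.1.1) and the error identification (3.3.1), could translate to in markup, using plain TypeScript DOM calls; the asset path and wording are hypothetical.

```typescript
// Sketch: an error alert whose icon has a text alternative (WCAG 1.1.1) and
// whose content identifies each error and where to fix it (WCAG 3.3.1).
interface FieldError {
  fieldId: string;   // id of the offending input
  label: string;     // human-readable field name
  message: string;   // what is wrong and how to fix it
}

function errorAlert(errors: FieldError[]): HTMLElement {
  const alert = document.createElement("div");
  alert.setAttribute("role", "alert"); // announced to screen reader users

  // The icon carries meaning (it signals an error), so it gets a text alternative.
  const icon = document.createElement("img");
  icon.src = "/icons/error.svg"; // hypothetical asset path
  icon.alt = "Error";
  alert.appendChild(icon);

  const heading = document.createElement("p");
  heading.textContent = `${errors.length} field(s) need attention:`;
  alert.appendChild(heading);

  // Each error names the field, explains the fix, and links to where it occurs.
  const list = document.createElement("ul");
  for (const error of errors) {
    const item = document.createElement("li");
    const link = document.createElement("a");
    link.href = `#${error.fieldId}`;
    link.textContent = `${error.label}: ${error.message}`;
    item.appendChild(link);
    list.appendChild(item);
  }
  alert.appendChild(list);
  return alert;
}
```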
Let me show you. By the way, this is where I diverge from my script, so you'll start to hear more ums as I go through this. I'm going to open up some links here and talk through what they are, and put my glasses on because I'm looking more closely. There's my mouse. I have three things pulled up here. The first is the Ant Design system. I pulled this up because it's a live component system that I can access and show you in context. We'll talk a little bit about this one in a moment. The next thing I pulled up is a spreadsheet, a Google Sheet in this case, and that is where I'm going to put my audit information.

We have a couple of things to keep in mind. Any accessibility audit, regardless of whether it's of your design system, code, a page or anything like that, requires you to start by building a background. Let me explain what that means. Essentially, it's what I've outlined here at a high level; you may find you want more or less detail. It's who's reviewing it, which is me; a summary of what we're reviewing; and what level of accessibility we're looking to meet, which for our purposes today is WCAG 2.1. Then the scope of review, in our case the button in the Ant Design system; the URL or set of URLs; the timeline, which is just today, but you may find it's a span of time; and then the review process, outlining it so people understand exactly what went into it, especially if they're going to be looking at this later and going, oh, okay, so this is what Anna did when she was reviewing this, these are the pages she reviewed, this is why this button looks this way and not that way. The background is super important, even if you're not personally referencing it a lot, just for the sake of knowing when the audit was done and how.

The last thing I'll mention before I actually show you the audit is WCAG. When I'm auditing, I personally am a huge fan of having WCAG's quick reference guide open. I like this tool a lot because you have your sidebar navigation with a bunch of anchor links, you can jump between sections, and you can link to specific items if you want to talk about them specifically. We'll talk about that in a moment and what that means. It's also super helpful when I'm doing manual auditing and I need to remember certain logistics and details, like, gosh, what is a good way to use color, for example. It's a super helpful tool for any audit. We may not use it too much today, but I always have it open when I'm auditing, and I use it pretty much three or four times a day anyway. Okay, taking a step back.
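Captured as a structure, that audit background might look something like this sketch; the field names are my own shorthand, not a prescribed format.

```typescript
// The "background" fields described above, captured as a type so every audit
// starts from the same frame of reference.
interface AuditBackground {
  reviewer: string;            // who performed the review
  summary: string;             // what is being reviewed and why
  conformanceTarget: "WCAG 2.1 A" | "WCAG 2.1 AA" | "WCAG 2.1 AAA";
  scope: string;               // e.g. "Button component, Ant Design system"
  urls: string[];              // pages or pattern-library entries reviewed
  timeline: { start: string; end: string };
  reviewProcess: string;       // tools used, manual steps, assistive tech tested
}
```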
We have what we're auditing today: the Ant Design system. What I'm going to do here is walk you through my process. Again, it's not a perfect process; feel free to adjust and tinker with it as you see fit. I'm going to use axe to get myself started. I'm in Chrome. I'm going to select inspect. My inspector opens and I'm going to select the axe DevTools tab. Maybe I'm not signed in. It should be signed in. Maybe not. It wouldn't be a conference if I didn't have some technical difficulties, of course. There we are. Perfect!
Going back so I can talk through that before I get flustered: I opened my inspector and selected axe. For our component audit we are going to select scan part of my page. Axe allows you to scan all of the page or part of the page, but since we're only reviewing a button component, we don't need to scan the full page. We could scan the full page, but we could get flagged for navigation issues or things like that. So just focusing on one part of the page is enough for us. I'm going to select what part of the page I want to audit and click scan.
I have five automated issues that come up. All of them are review issues, which means axe isn't sure whether they're real issues because of the way the element is coded or how it appears, and it needs me to confirm. So we have five issues, all of them related to color contrast. Axe will find things like a lack of landmarks, or tab index problems, or any of a myriad of accessibility issues, but for our purposes we'll focus on color contrast. I'm going to highlight the first issue that comes up, which is our primary button. Axe will highlight exactly what it's talking about and show me exactly what it means. Since I've already done this, I already know this issue does exist, so I'm going to save the result and come back.
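The extension being demoed here is driven by the open-source axe-core engine, so the same partial-page scan can also be scripted. A minimal sketch, assuming a hypothetical selector for the primary button; results.incomplete holds the "needs review" items, and results.violations the definite failures.

```typescript
import axe from "axe-core";

// Scan only one component (the primary button) instead of the whole page,
// limiting the run to the color-contrast rule for this example.
async function scanButton(): Promise<void> {
  const button = document.querySelector(".ant-btn-primary"); // example selector
  if (!button) return;

  const results = await axe.run(button, {
    runOnly: { type: "rule", values: ["color-contrast"] },
  });

  // "incomplete" results are the "needs review" items the extension shows;
  // "violations" are definite failures. Both can be saved or exported as JSON.
  console.log(
    JSON.stringify({ violations: results.violations, incomplete: results.incomplete }, null, 2)
  );
}

scanButton();
```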
I'll say it is an issue so we can talk about what that means. There are a few things that are really helpful here. I've saved the issue. If I wanted to, I could use axe to export that issue as a CSV or JSON file. Highlighting the issue again and selecting more info, axe will tell me exactly what it's being flagged for, what WCAG item it relates to, and the user impact of the issue. Now, again, I know this doesn't pass color contrast because I double-checked earlier, but let's talk about that. How do we enter that into the spreadsheet?
We have a few things. And again, axe is going to allow you to export an issue with a lot more than this if you want, but here are the basic elements I tend to use in an audit. We have our component, the primary button. The WCAG principle it's related to, which is Perceivable. The WCAG guideline it relates to, 1.4.3 Contrast (Minimum), Level AA. I've also linked to that quick reference guide here so my stakeholders can understand exactly what I'm referring to in this audit. Additionally, I've described the issue in detail and why it exists for our stakeholders. In this case, WCAG 2.1 Level AA requires a contrast ratio of 4.5:1 for normal text. The text on the primary button is 14 pixels with a normal weight, meaning it doesn't pass color contrast compliance: the foreground color is white, the background is light blue, and that means the ratio is only 4.24:1. The description can be as detailed as necessary, and it pairs nicely with the recommendation. The recommendation is more something I tend to include, but it helps stakeholders understand how actionable an item is. So I'll say: increase the contrast between foreground and background. Now, you can do that in a set of different ways and it really depends on what the design team wants to do, but you know that's what you need to do.

I've also got impact. I flagged this earlier because axe lets you see the user impact. Impact is a little bit of a sliding scale in terms of perception. Axe flags it according to user impact. You may choose to flag impact according to business or fiscal reasoning, or how often a component appears in your application, or any combination of those things along with user impact. It's really up to you, your organization and your team. For our purposes I marked our impact as serious. There's also critical, moderate, minor, best practice and unknown, unknown being things like: is this the right use of ARIA? Let me make sure with my team. I want to be sure. I've also included the URL and the date it was created, which is almost exactly today; I'm off by about two minutes.

Essentially, I would go through every issue, both automated and manual, add each issue as a row item in the spreadsheet, and document it in detail. Again, axe can also export all of these issues for you with more detail if you would like. I have found that to be helpful in the past. Doing it manually like this is also okay. What matters most is that you're creating a system that can be used and referenced later.

So that's a high-level view of what our audit looks like. It's obviously a lot to take in. I have a few things to call out about my approach before I wrap up. First off, I could talk about it all day, and I'm sure there are many places where I could share more of my thinking, but I shared this today because I want you to see my process and adapt it. In fact, I encourage you to diverge from my approach and find auditing that works best for you and your team. What matters most is that you're documenting issues in a way that makes them actionable, so that your organization or team can act accordingly. If you're using axe, you can export those findings as I'm showing here and fine-tune the details based on your organization's needs. Secondly, I want to mention that when you're sharing results, I recommend grouping themes of commonly occurring items together. This touches back on our atomic scale.
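One row of that spreadsheet, captured as a structure for illustration; the column names mirror the fields just described, and the example values are illustrative rather than exact.

```typescript
// One row of the audit spreadsheet, mirroring the columns walked through above.
type Impact = "critical" | "serious" | "moderate" | "minor" | "best practice" | "unknown";

interface AuditEntry {
  component: string;          // e.g. "Primary button"
  wcagPrinciple: "Perceivable" | "Operable" | "Understandable" | "Robust";
  wcagCriterion: string;      // e.g. "1.4.3 Contrast (Minimum), Level AA"
  criterionUrl: string;       // link to the WCAG quick reference entry
  description: string;        // what the issue is and why it exists
  recommendation: string;     // how it could be fixed
  impact: Impact;
  url: string;                // where the issue was found
  dateCreated: string;
}

// Example row for the primary-button contrast issue (values are illustrative).
const primaryButtonContrast: AuditEntry = {
  component: "Primary button",
  wcagPrinciple: "Perceivable",
  wcagCriterion: "1.4.3 Contrast (Minimum), Level AA",
  criterionUrl: "https://www.w3.org/WAI/WCAG21/quickref/#contrast-minimum",
  description:
    "14px normal-weight white text on a light blue background does not reach the 4.5:1 ratio required for normal text.",
  recommendation: "Increase the contrast between the foreground and background colors.",
  impact: "serious",
  url: "https://ant.design/components/button/",
  dateCreated: "2021-03-10",
};
```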
If that primary button blue contrast ratio issue was occurring on a lot of different items, what I could do is essentially say this is related to this set of issues. If we just change the hex code behind the token, or adjust it just a little bit, or change the font size a little bit for all of these, then we could fix this issue. So grouping issues, especially for your design system audit, is super, super important, because it means you're going to be able to have a much higher impact without having to do a lot of fine-tuning in a lot of different places. It's that power of the design system concentrated in one spot: a big change with a small amount of effort, having a huge impact. So here we're looking at that green color we had earlier and that white color combination. These are some items related to that in the audit at Recurly. You can see how easy it is now, when looking at them together, to fix them together.

Another mention I need to make here is that you will likely find you don't capture everything that needs to be captured when auditing your design system alone. And really it's because we tend to work in complex systems, and a design system is all about intent. There are always going to be circumstances where what you capture in your audit and fix doesn't fix your design system's application. So maybe your atoms are amazing and accessible, but your organisms need more attention because they're more unique to specific circumstances. This is part of why I went through our atomic scale, to help us understand that there's a difference between intent and application. Starting with the design system is going to get us super far, but it's not necessarily going to fix everything on your live site, because the application is different. Okay, so let's say you finish your audit or part of the audit. What should come next?
Yes, that spreadsheet with a list of items and specific details attached to each item is going to be super essential, but we want to empower our teams with those findings. A spreadsheet with tens or possibly hundreds of items absolutely helps when you go to fix something, but it's going to be intimidating if you hand a stakeholder a spreadsheet and go: here you go. What do we do with this?
Here's an example of an audit share I did with a client, where I outlined the most commonly occurring items and themes as well as the high-impact items. So when we wrap up our audit, we can make sure we present everything we discovered, but at a high level. That is, we need to outline what is most critical to fix and the key items, then share an impact framework to help identify which issues can be fixed most quickly with the highest impact. Putting these together along with our complete audit means we can share results in a digestible way and work with stakeholders to prioritize improvements. No matter how comfortable you are with accessibility, I would encourage you to stay curious. Many designers are not trained in accessibility, and that's on educational institutions, not you. The fact that you're interested and learning is a huge step, so if you're starting out, download the tools, open WCAG's quick reference and play around, document things you're finding, ask yourself questions, ask your team questions. Just playing around with these tools, scanning these guidelines and being curious is going to take you so far. Honestly, much of my accessibility knowledge has come from those practices.
Before we open it up for questions, I want to mention one last thing here. One of the most essential aspects of accessible design is designing with disabled people instead of for disabled people. WCAG is written with and by disabled people. So it's great to use in an audit, but is it everything?
No. Like all design, accessibility requires us to talk to people and learn about their unique circumstances, especially when they're using our sites and products. While I can't lay out a perfect inclusive research strategy in the time we have left, I want to call this out because I don't want anyone to think an audit is all there is to accessible design. It's a key component, but we need to continually hear and respect the needs of disabled users and people.
I encourage you to take the insights from your audits, find ways to ask users what they think, test your ideas, and pay particular attention to your marginalized users. Last and certainly not least, I would like to give a shout-out to my team at Recurly for helping me with this presentation. One of the things our product team believes is that striving for accessibility should be about innovating. We hope that by sharing this process with you, we can empower your teams to make amazing, inclusive and innovative work. With that, I'm ready to open our time for questions.
>> Awesome. Really great job, Anna. You had some amazing content in there. And the chat is just loving what you presented so far. And because of that we have some questions for you.
>> Great!
>> Let me pick out one here. So how do you and your design team define interaction expectations when the components could be used in a multitude of different use cases and contexts?
>> So, breaking down those interaction expectations and contexts, I think it's all about what that context does for our users. Recently we went through the process of defining a new component in our system, and we needed to define different contexts for it and those interactions. We would be specific about things, particularly when it came to different ways of communicating information. That is to say, let's go back to our alert, for example. We would want to define how it behaves interactively depending on what kind of message is there. We would define the core elements of the component and then diverge as necessary according to the variations of that component. So let's say we have our error variant versus a success variant: should we be using aria-live polite or aria-live assertive? I think assertive is the word I'm looking for here. So we would diverge as necessary to help define that context for each variation, and each variation is going to have some specifications according to its context. We'll also find that a component will have one sole element that we can treat a certain way, like how we should treat the close button on the alert component, for example. That can be defined once instead of diverged.
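A minimal sketch of that divergence in code: error alerts use an assertive live region, the other variants use a polite one, and the close button is defined once for all variants. The variant names and wording are illustrative.

```typescript
// Sketch: one alert component whose live-region behavior diverges by variant.
// Success and info updates can wait their turn (polite); errors interrupt (assertive).
type AlertVariant = "success" | "info" | "warning" | "error";

function createAlert(variant: AlertVariant, message: string): HTMLElement {
  const alert = document.createElement("div");

  if (variant === "error") {
    alert.setAttribute("role", "alert");  // implicit aria-live="assertive"
  } else {
    alert.setAttribute("role", "status"); // implicit aria-live="polite"
  }

  alert.textContent = message;

  // The close button is a "sole element" shared by every variant,
  // so it can be defined once rather than diverged per context.
  const close = document.createElement("button");
  close.type = "button";
  close.textContent = "Dismiss";
  close.addEventListener("click", () => alert.remove());
  alert.appendChild(close);

  return alert;
}
```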
>> Perfect. Great answer. Another question here, and I think this one is specific to design and accessibility. How do you make disabled buttons accessible?
They usually look faded and the colors do not pass color contrast.
>> This is a really good question, and it's like the eternal conversation with accessibility in design. Because technically, according to WCAG, disabled buttons don't have to meet color contrast compliance. However, disabled states have a myriad of usability issues in and of themselves. Yes, you can use disabled states and have them have lower contrast, but what are the ramifications of using that disabled state?
Why are we using it?
It's always a question of: are we using this because it's easier?
Are we showing users they can use this eventually?
Do they know how to use this eventually?
So the short answer is technically you can use disabled states. The long answer is disabled states have a huge set of usability concerns that aren't necessarily captured in WCAG and so as a general rule you want to avoid using them if you don't have to use them.
>> Great. Someone is asking if you could expand on how you prioritize which issues to fix first.
>> Yeah, absolutely. There are a couple of things I will look at when doing an audit. When I'm doing an accessibility audit, especially a component audit, I tend to look at core elements first. That is, items I know are going to be occurring throughout an experience most commonly. That's things like our links, heading structure, our buttons, our form fields. Those core elements that I know are going to be reused the most regularly, or that I'm seeing being reused, are ones I will mark as higher impact. I also look at things like that color contrast example: if it's being used in a lot of different places, I'll flag that issue and make it higher impact, because I know it's easier for us to fix while having a huge swath of things addressed at once. There are also things like: is this blocking your user from doing an essential task? I will label something as higher impact when, for example, a form field has to be filled in to register for our site, so we can't have the label be wrong or have it not read to screen readers, because otherwise the person doesn't know what's happening. So I'll look at those things as well.

Additionally, when assessing impact, I will look at what level of compliance we're wanting to meet. If my organization, or an organization I'm working with, wants to meet AA compliance and that's their threshold, I will tend to log things that are AAA compliance issues as best practices or minor issues, just because I still want them to know that it would be better to do things that way. Those are the things I tend to look for most. And again, if you find an issue like that registration form field, it's going to have huge fiscal impacts, user impacts, business impacts, and that means it's going to be critical to fix something like that. So it comes down to what you are assessing as high impact in terms of your user experience in general, too. And most importantly, if I get the chance to talk to users, I'm listening to what they have to say. For example, I recently had a set of screen reader users talk about the date picker as an issue for a component, and date pickers are notoriously tricky to use, so I put that down as a critical issue.
>> That was a really great answer. This is another good question. Do you include leadership buy-in when you're choosing that prioritization to correct what you find on the audit before doing it?
>> Yes. Usually what I will do is put the spreadsheet together, make sure leadership understands what I found and what I'm flagging as the most critical, and talk to them about what was found and how they want to assess that priority. Because I am an individual contributor, my insights into certain things might be different from someone who's at the C-suite or leadership level. So I always want to make sure they're bought in, not just because I have to, but because it helps them learn and it helps them ingrain this in their practice going forward.
>> Yeah, that's a really, really good point. Kind of switching gears here, someone is asking how often do you audit your product?
>> That's a great question, because I think auditing the way I described it today, like a full system audit, isn't something you necessarily find yourself doing all the time. But auditing components, or auditing pieces, or auditing pages, that's happening all the time. And it should be, because that's how accessibility should be. Anytime I'm looking at an existing component and asking how it's built, I'm auditing it, both automated and manual. If I'm looking at a page and making sure it's screen reader accessible, I'll be auditing that page. Personally, I find I'm doing a lot of auditing even just to answer my own questions as I go through and learn, but I also tend to flag things, especially if I find there's a recurring issue; I'll try to flag it and note it.
>> Great. Thanks for that. Looks like we got a question here. How do you close the loop?
After your audit is done and you find the defects, how do you make sure that the engineering team is fixing those, and that they're actually correctly fixing those?
>> That's a great question. It depends on how your team is structured, I think. You might have a dedicated design system team, like a single team that's focusing on it, and a team like that is more likely to be really dedicated to making sure fixes happen. Or you might find you have a spread of Scrum teams within a product organization, where each of them is contributing to the system and making sure a component is well built and accessible. Either way, you want to make sure that you're including clear documentation. Having accessibility considerations in your component documentation is key. I tend to reference IBM's Carbon a lot for this purpose because they do a good job of that, and there are a myriad of design systems that do a great job of it. I would reference something like that and make documentation that includes accessibility considerations. Having constant conversations, not to the point of everyone having meetings all day, but having a conversation, is huge here, because there are things I don't understand. Just the other day I talked to the developer on a project I was working on and asked: should we be using ARIA here, what ARIA should we be using, what do you think? It's UX in and of itself. It's going to require conversation to make sure it's being done right, and I would usually look at it that way. Also, having accessibility in your acceptance criteria if you're building something within a feature is super important. Making sure your QA team is aware of it and finding a way to test it yourself or with QA are going to be huge too.
>> Great. And I think that's a really good note to end on. We're also at the end of our time. I want to thank you, Anna, you did such a great job, and the chat has been super engaged. Anna, you're very active on Twitter, so if folks want to get their questions answered, it's probably a good idea to follow you there. I want to thank you again, and thanks to everybody who joined us here today. I hope you have a good rest of your Axe-Con.
>> Thanks everyone.