amazonv: (Default)
[personal profile] amazonv
After the Audit: Integrating Accessibility into the Testing Process
Type: Breakout
Track: Development
Accessibility isn’t a one-time project. It’s an ongoing initiative whose continued success is built on ensuring that existing and future features (and products) remain accessible. This talk discusses how to integrate accessibility into your testing workflow by focusing on realistic changes, along with techniques and approaches for post-accessibility audits.
 
I think we are ready to get started.  Again, if you're just joining, welcome.  If you are looking for the After the Audit:  Integrating Accessibility into the Testing Process session with Crystal Preston-Watson, you're in the right place.  My name is Liz.  I'm from the Deque team here.  I'll be moderating today's session.
I'm going to take care of a little bit of housekeeping before turning it over to Crystal.  First, just a reminder that today's session is recorded and will be hosted on demand and available for all registrants.  We do have an ASL interpreter, Holly.  Thank you for joining us in this session.  If you require live captions for today, those should be right below the session page, below this video stream.  And lastly, please enter any questions that you have for Crystal into the Q&A section, which is right next to the video stream.  We'll save about ten minutes for Q&A near the end.
And with that, I will go ahead and turn it over to you, Crystal to get started.
>> Thank you so much.  So hello, everyone.  As has been repeated, this is After the Audit:  Integrating Accessibility into the Testing Process.  I am Crystal Preston-Watson, my pronouns are she/her, and I am a quality engineer at Salesforce.  I am wearing an eye patch with a rhinestone X over my right eye and wearing a headband with roses.  And also on the slide there is an illustration of me.  There's no eye patch, there are just glasses, but the illustration is also wearing a rose headband, so we sort of match.  Outside of my work at Salesforce, I spend time talking to my cats, including Ms. Etta James.  You have to say the whole thing every time, all of it.  You can't just say Etta James.
I also talk to my husband sometimes.  I perform improv, and I game; Hitman is my game of choice right now.  And if you have more questions after the Q&A today, you can find me to ask those on Twitter.  My handle is @ScopicEngineer, my email is contact@crystalprestonwatson.com, and my website has every way to contact me at crystalprestonwatson.com.
Okay, so here is the agenda for my talk today.  There are four parts.  Part one is the end of the audit.  This is a small section on what accessibility audits are, for those who are new to the process.  Part two is learning about upskilling your team.  Part three is techniques in action, the kinds of things and steps your team can do during testing to support the efforts that took place during audits.  And part four is thoughts on automation, which is always fun to talk about as a tester.  So let's get started.
Part one: the end of the audit.
And there's a quote on the slide, which is "every new beginning comes from some other beginning's end," and that is from Semisonic's "Closing Time."  Fun fact: that song is actually about a baby being born and not a bar closing.  Who knew?  You learn something besides accessibility and testing today.
So during this first part, which is called the end of the audit, I want to go further into what happens during an accessibility audit, for those who might be new to the process and tuning in.  There's a lot of information to take in at the beginning, so it's important that we have a common baseline understanding of what this is all about.
 
 
So what is an accessibility audit?  Basically it's a way to appraise a site or application's level of accessibility against standards and guidelines to identify areas to improve accessibility.  Most audits will, and probably should, be a mix of manual testing with some assistance from automation tools.  An audit can be conducted by an internal team or an external consultant like Deque, but that can depend on things like the budget you have, the availability of team members, and other factors.
So the phases of an accessibility audit aren't going to look the same for every business and organization.  I've undergone a few audits, both as part of an internal team and at an organization that brought in external consultants.  So this is my perspective on what a typical audit will entail, the typical phases of an audit.  It might look different for other teams.  It might look different for your team.  But this is a rough estimate of what a typical audit would be.  So, one is a review of the features to be audited.  This is important, as you're not going to be able to test every single section and feature of your application.
So usually the audit is going to focus on a representative portion of your site or a specific feature.  You're not going to test every single thing; it's just not possible.  Two is testing, and that's going to consist of manual testing and automated tools and extensions.  The testing can vary; it usually makes up a good portion of how long an audit runs.  But after your testing, you're going to get a report of your results.
Sometimes those results will come at the very end of an audit, or they'll come on a regular cadence throughout testing.  As a personal preference, I like a regular cadence instead of getting one giant list of issues all at once before getting started on remediation, which is the next phase.
 
 
These next items I'm going to talk about, four through six, have a star next to them because these three points are dependent on the organization and the motivation for the audit.
Four is remediating issues.  This is where your teams are going to start to triage, organize and work on issues found during the audit.  And sometimes what ends up happening is that everything ends up in the backlog.  I'm being real.  It's like, you get the results and then they all kind of get put there.  So if items don't end up in the backlog and the work moves forward, five, you're going to have retesting and validation.  Again, this depends on your team and your organization and the motivation for why the audit was conducted.  But this time the work that went through a fix will be retested and either found to need more work or the fixes are validated.
And number six is an accessibility conformance report, ACR, or sometimes it's called a VPAT, a voluntary product accessibility template.  This is going to be typical of an audit that is conducted by an external vendor; not all audits will generate a VPAT or ACR.  Some companies just want the issues.
So again, as I said, this is a quick part.  This sums up what an accessibility audit is and what you might typically expect to have happen during one.  And I'm going to call back the Semisonic lyric: the end of the audit is the new beginning of integration into your existing process.  On the screen here is a curved path of stepping stones in probably a pond or, I guess, a swamp covered with weeds.  Because really, an accessibility audit is a stepping stone.  So that brings us into part two: upskilling your team.  The quote is "introduce you to some new things and upgrade you," and that is from "Upgrade U" by Beyoncé.  Fun fact: I don't have one, because it's Beyoncé and you probably already know everything, so yeah.
So an audit is a stepping stone.  And depending on what you do, it's a stone that also has momentum.  Bear with me, because I'm not a physicist.  So momentum is mass in motion: momentum equals mass times velocity.  To expand on that, if you have an object, and that object isn't moving, is at rest, it has zero momentum.  Let's do an example.  In one hand I have a stone and in the other hand I have a feather.  If I throw the feather up in the air and leave the stone just sitting in my hand, that feather has more momentum even though the stone has more mass.
So if you have an audit and you leave it at rest, you have no momentum.  The issues might end up getting fixed, but what about new features?  What do we do to keep on top of accessibility?  We use the momentum of the audit that happened to integrate accessibility and keep it a priority in the regular testing process.  And I'm going to go over a few things you can do and a few actions you can take to give velocity to your audit within your testing team.
 
 
So first off, you want to establish a baseline.  On the screen, next to the "establish a baseline" text, is a very cute image of a cactus with big googly eyes reading a book.  But the cactus cannot read.  Establish a baseline of knowledge.  It's important that everyone has a shared language when it comes to understanding and talking about what accessibility is, so everyone can communicate on why and how to go through testing.  You want to encourage teams to learn through courses, attend conferences like this one, and offer team training on core fundamentals.
Secondly, you want to play to people's strengths.  The image next to this point is a cute little cat that is staring off into the distance, but the shadow it's casting is of a lion, because this is how my cats think of themselves.  They think they're lions.  You want to empower the members of your team to make a difference, and that is about finding people for the tasks that fit their experience and their skill.  If someone's apprehensive about testing with a screen reader, then while they're learning that skill you can have that person be a champion of keyboard testing and teach it to others.
When people feel and know they're part of a solution, the more they are involved in the process of making integration a success.  Continuing on: an open and positive environment.  To the side of this point is an image of evergreen trees, a large sky, the reflection of the trees on a lake, and mountains in the background.  If you're doing a good job with a positive environment, you're going to end up with an open one as well.  Encourage people to ask questions that may make them feel vulnerable.  Something I've noticed when I've led accessibility testing initiatives is that people get really ashamed to tell me about the struggles they have with learning a screen reader.
And I understand that, because I'm someone who does use a screen reader.  They're like, oh no, I can't tell you.  She's going to think I'm a bad person.  One thing I really want to make them understand is that accessibility testing is not something you can learn overnight.  Just like anything else, you don't learn other testing techniques and methodologies overnight.  It's okay to struggle with this as you're learning.  I mean, I still struggle with my screen reader.  I use TalkBack, and there are new features added, and I'm having to go through and relearn things and teach myself about these new features.  So I'm not an expert, and you can't expect anyone to be an expert super fast.  It's a good thing to encourage people on your teams: hey, it's good to take your time.  We'll get there.
And the last point on the slide is: upskill the whole organization.  Now, this is one of my favorite images.  I know these are decorative, but they're cool and I want to talk about them.  It is a kangaroo with boxing gloves with a "yeah, I did this" kind of attitude, with a human boxer lying on the mat in defeat.
So it's not unusual for testing and QA to become the champions of accessibility for businesses and organizations.  But for accessibility to be a true priority and be done with quality, it has to be something that the whole organization takes up and is concerned with.  So though I'm talking in the context of testing departments and teams, this is something the whole organization needs to take on if it is truly going to be a success.  Otherwise it becomes a nice-to-have, the last thing you think of, and it just doesn't get done well.  And it puts a burden on the testing team to be the only people charged with ensuring accessibility.  And that doesn't make for a quality accessibility initiative in any way.
So let's go onto part three.  Techniques in action.
 
 
So there's a quote on this slide, and it says, "you don't have to speak, just seek and peep the technique," and this is from Eric B. & Rakim, "Don't Sweat the Technique."  This song was in the new Tom and Jerry movie.  That's what I was told when I was looking up fun facts for the song.  I have not seen that movie, so you can tell me where they use it if you know.  I would like to know what it's the background for.
Sorry, okay.  So, every little step you make, each step keeps up the momentum toward team and organizational integration.  Those steps are checklists, keyboard testing and automated tools.  Each of these is really a step toward the bigger idea of integrating and making accessibility a priority.
So on the slide we have accessibility checklists, and the image to go with this is two people who are really excited about their checklist.  One is holding a checklist that is all marked off and the other is holding a computer.  All of the images on these slides are illustrations.
When it comes to checklists, I know there are some thoughts, because some people are probably right now going, hmm, checklists?  Really?  Okay, I am not a fan of checklist accessibility at all, and that is not what I'm suggesting.  But I read an article by Karl Groves, who is fabulous, and it gave me a different perspective on how checklists can be used by a team that's new to accessibility testing.
One of the points Karl drives home is that checklist quality is critical.  So the way you can use a checklist is to make sure it's not static, not something your team just runs down: I did this, I did this, I did this other thing.  Instead, you're using regular review of the checklist as upskilling for your team, going through it and making sure it's adapted to that particular feature, that particular page of the website.  What you really want to do is make sure the checklist is actually checking the accessibility of that particular feature or part of the application.
So that means that you do need to make these living, breathing documents.  Yeah, that's probably a good way to put it.  So, checklists are not inherently a bad thing if you are making sure the quality is good and you are constantly reviewing them.
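To make the "living document" idea concrete, one option is to model a checklist as data that gets filtered per feature and reviewed over time, rather than keeping it as a static page of checkboxes.  This is a minimal sketch; the item wording and the tags are invented for illustration.

```python
# A checklist modeled as data so it can be reviewed, extended, and adapted
# to the feature under test. Items and tags are illustrative only.
CHECKLIST = [
    {"item": "All images have meaningful alt text", "tags": {"images"}},
    {"item": "Form inputs have programmatic labels", "tags": {"forms"}},
    {"item": "Video content has accurate captions", "tags": {"video"}},
    {"item": "Page is fully operable with the keyboard alone", "tags": {"all"}},
]

def checklist_for(feature_tags):
    """Return only the checklist items relevant to the feature under test."""
    return [entry["item"] for entry in CHECKLIST
            if "all" in entry["tags"] or entry["tags"] & set(feature_tags)]
```

A team testing a signup form, for example, would run `checklist_for({"forms"})` and get only the label and keyboard items, instead of wading through captioning items that don't apply.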
Next is keyboard testing, and this image is an illustration of someone typing on what is clearly a MacBook Pro.  So, keyboard testing.  Keyboard testing has the benefits of being pretty much free, easy to implement, and a low barrier to entry.  Most people know how to use a keyboard.  I'm not going to assume, though, because as my grandma used to say, when you assume, you make an "a" out of me and you.  It's one of the most important tests you can do for accessibility.  When I'm asked what's a good way to test with assistive technology, the first thing I tell people to do is get comfortable navigating with the keyboard alone, without a mouse.  Many of the issues you'll find with a screen reader are going to be found just by doing keyboard navigation.  It's really effective testing, and it's something that can be done while your team members get comfortable using assistive technology like screen readers.
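Keyboard testing itself is manual, but some red flags can be spotted in static markup before anyone even starts tabbing.  The sketch below, using only Python's standard-library HTML parser, flags two common ones: clickable elements a keyboard user can never reach, and positive `tabindex` values that scramble the tab order.  It is a heuristic only, not a substitute for actually navigating the page.

```python
from html.parser import HTMLParser

# Elements that are natively keyboard-focusable.
FOCUSABLE_TAGS = {"a", "button", "input", "select", "textarea"}

class KeyboardScan(HTMLParser):
    """Flags common keyboard-accessibility red flags in static HTML."""
    def __init__(self):
        super().__init__()
        self.flags = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # A click handler on a non-focusable element with no tabindex means
        # keyboard users cannot activate it at all.
        if "onclick" in attrs and tag not in FOCUSABLE_TAGS and "tabindex" not in attrs:
            self.flags.append(f"<{tag}> has onclick but is not keyboard-focusable")
        # A positive tabindex overrides the natural tab order.
        tabindex = attrs.get("tabindex") or ""
        if tabindex.lstrip("+").isdigit() and int(tabindex) > 0:
            self.flags.append(f"<{tag}> uses positive tabindex={tabindex}")

def keyboard_red_flags(html):
    scanner = KeyboardScan()
    scanner.feed(html)
    return scanner.flags
```

Running it on `'<div onclick="openMenu()">Menu</div>'` would surface the unreachable click target; a plain `<button onclick="save()">` passes, since buttons are natively focusable.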

 
And finally on this slide is automation tools.  Automation tools are things like Lighthouse, color checkers, and Deque's axe extensions.  These are things that can discover many critical issues, and they're a way for new members especially to run the tools and get used to seeing what these guidelines are and how they show up on a site.  So again, these are some of the things you can have your teams use, in conjunction with everything else, as your team is learning about accessibility and starting to integrate it into their regular process of testing.
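To show the kind of rule these tools automate, here is a toy version of one classic check: every `<img>` needs an `alt` attribute.  This is a deliberately tiny sketch with Python's standard library, not how axe or Lighthouse are implemented.

```python
from html.parser import HTMLParser

class AltTextCheck(HTMLParser):
    """Counts <img> elements that are missing an alt attribute entirely."""
    def __init__(self):
        super().__init__()
        self.violations = 0

    def handle_starttag(self, tag, attrs):
        # An empty alt="" is valid (it marks a decorative image);
        # a missing alt attribute is a failure.
        if tag == "img" and "alt" not in dict(attrs):
            self.violations += 1

def missing_alt_count(html):
    checker = AltTextCheck()
    checker.feed(html)
    return checker.violations
```

Note what this also demonstrates about the limits of automation: the tool can verify the attribute exists, but only a human can judge whether the alt text is actually meaningful.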
All right.  So part four: thoughts on automation.  This slide has a quote, and it says, "these points of data make a beautiful line, and we're out of beta, we're releasing on time."  This is from Jonathan Coulton's "Still Alive," which is one of the theme songs from Portal.  I didn't finish either of those games, which is not the fun fact.  The fun fact is that the cake is not a lie.  And I'm not going to say more than that, because if you haven't finished the game, like me, and haven't had it spoiled even though it's been a while, you may want to learn that on your own.
So on this slide, it says 20 to 40 percent of accessibility issues are found with automated testing.  And one of the significant issues facing many test teams with accessibility is the pressure to automate everything, fast.
I mean, it's something that plagues teams dealing with non-accessibility issues too.  But as you might have heard in other talks at this conference, and as this slide says, accessibility automation can't find everything.  I know some people say up to 50%.  I say it's 20 to 40.  It could be 50 percent in the right hands, but I tend to think it's usually between 20 and 40 percent, especially when you have a team that's new to doing accessibility in general.
 
 
I'm not going to fight about percentages.  The thing to really drive home is that automation can't find everything.  It's definitely something that is needed; it's an important part of your strategy.  But it's not the end-all be-all.  I'm going to describe this slide, because on it is a giant robot holding a woman who is looking out of a spyglass.  I wanted to describe it because it reminds me of The Iron Giant, and the giant was voiced by Vin Diesel.  Let's go a little bit more into automation.
On this slide there's an image of the test automation pyramid.  There are three sections to the pyramid: the base is unit testing, the middle is integration, and the top is UI.
The test automation pyramid is a way to approach test automation for your application.  It's focused around time and repeatability of tests and what you want to automate.  So pretty much what it says is that you want the bulk of your automation to be unit tests.  The next largest group of tests will be your integration tests, your services and APIs, and then at the very top there should be a very limited number of UI tests.
So what I've come up with is a different sort of accessibility test automation pyramid.  Here it is.
Let me describe this.  This pyramid is inverted.  I realized I drew this really wrong, but it's cool.  So the pyramid is upside down, and the biggest part of this pyramid is unit tests.  Then there's integration, which is the second-biggest part, and then UI is still the smallest part.
Where the test automation pyramid was focused around time and repeatability, the accessibility test automation pyramid is inverted because it focuses on the human experience.  I want to give testers more time to focus on important manual testing issues that can't be found with automation, and also to keep in mind that the people using these sites and applications are not robots.  That is something that needs to be kept in mind.  It's something I keep in mind every day that I work: what I do with the things that I find and don't find has a major impact on the access that others need and want.  So that's why you really can't just rely on automation alone.  I should also note, on this slide, you're not going to have very many integration tests when it comes to accessibility.  Mainly, if you're going to do automation, you really want to focus on unit tests.  There might be some UI tests that come about, and when you have some of those, that's fine.  But for the most part, a lot of those UI tests either cover issues that could have been found in unit testing or issues that really need to be found during manual testing.
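As a sketch of what a unit-level accessibility check can look like, here is a plain function standing in for a real component template, with a test asserting it renders an accessible name.  The component, its API, and the attribute choices are hypothetical, but the idea is the point: this kind of failure is caught at the unit level and never needs a UI test.

```python
def icon_button(icon_name, label):
    # Hypothetical component function standing in for a real template or
    # component: an icon-only button gets its accessible name from
    # aria-label, because there is no visible text for a screen reader
    # to announce, and the decorative icon is hidden from assistive tech.
    return (f'<button type="button" aria-label="{label}">'
            f'<svg class="icon-{icon_name}" aria-hidden="true"></svg></button>')

def test_icon_button_has_accessible_name():
    html = icon_button("search", "Search")
    assert 'aria-label="Search"' in html
    assert 'aria-hidden="true"' in html

test_icon_button_has_accessible_name()
```

In a real codebase this would render the actual component and could run an automated rule set against the output, but the shape of the test stays this small.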
 
 
Really, when I have this automation test pyramid inverted, it should just say unit tests for the most part and a tiny bit of UI.
Okay.  This is the regression slide.  It pretty much just says regression; it spells out "regression" in balloons.  And at the very end, the last letters are separated by a gap, because I think it's implying there's a regression in the balloons.  I don't know; I just selected it because it said regression.  So, accessibility automation really shines when it comes to keeping regressions out of your application.
So one of the things that can be tricky is making sure that the issues you find in your audit are making it into your regression testing.  It doesn't do any good if you go through the trouble of an audit, find bugs, and then never monitor or evaluate the status of your application after the audit.  I'm going to be completely honest: it is not unusual for multiple audits to be done that keep finding the exact same bugs, every audit, because there was no monitoring of issues that were previously fixed, and they regressed.
So if you neglect doing regression testing with your accessibility issues, then you're probably going to have some regressions.
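One lightweight way to wire audit findings into regression testing is to keep the fixed issues as a recorded baseline and fail the build whenever a previously fixed issue shows up again.  The issue IDs and the shape of the check below are invented for illustration; in practice they would come from your audit report or bug tracker.

```python
# Issues confirmed fixed after the audit, kept under version control so
# every test run can re-check them. IDs are illustrative.
FIXED_IN_AUDIT = {"AUDIT-101", "AUDIT-115", "AUDIT-203"}

def regressions(current_failures):
    """Issues that were fixed after the audit but are now failing again."""
    return sorted(FIXED_IN_AUDIT & set(current_failures))
```

If a test run re-detects `AUDIT-115`, `regressions(["AUDIT-115", "AUDIT-999"])` returns `["AUDIT-115"]`, which a CI job can treat as a hard failure, so the next audit does not rediscover the same bug.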
So we're coming towards the end here.  This slide is really just about an article that I really like to point people towards.  The title of the article is "Building the most inaccessible site possible with a perfect Lighthouse score" by Manuel Matuzović.  It drives home that when the focus of testing is just automation, there are a lot of issues that can come about.  If you haven't read this article, please Google "building the most inaccessible site possible with a perfect lighthouse score" and do so.
Okay, so we're at the second to the last slide.  Takeaways.
 
 
I didn't write any sort of prompts for this slide, so I'm going to improv it.  The takeaway is this: if you're going to get an audit, don't let it sit there.  Make sure that you're utilizing the audits that your organization, your business, has to drive momentum not only in your testing but in the organization as a whole.  And if you don't have, or don't yet have, the possibility of really upskilling and shifting accessibility in the whole organization, you can at least do it for your testing team.  I really just want to make sure that people understand, again, accessibility really is vital.  It's important.  It's not just about buttons and things like that.  It's about that, but it's also about giving people access to the applications and websites that really, really do affect their lives.  I want to drive home that accessibility needs to be a priority and needs to be done with quality.  Let's not just look for accessibility and then let it languish.  Let's use that momentum and let it drive a holistic approach to making quality, accessible and inclusive applications and software.
And that was my talk.  Just one more slide.  My last slide is the same as an earlier one: my name, Crystal Preston-Watson, quality engineer, Salesforce.org.  You have another illustration of me.  This time the illustration has an eye patch, matching my look today, and still a rose headband.  You can find me on Twitter at @ScopicEngineer, by email at contact@crystalprestonwatson.com, and my website is crystalprestonwatson.com.  Thank you so much.
 
 
>> Thank you, Crystal.  That was amazing.  So many questions coming in.  Thank you, audience, for being engaged here.
All right.  How do you handle feeling like an imposter when you become known as an accessibility advocate within your organization?  I struggle feeling confident in my knowledge.
>> I definitely get that, because I felt like an imposter when I first started taking up the cause of accessibility.  And the thing is that you don't need to feel like that.  Again, you don't have to have all the answers.  You don't need to know the answer to every single question someone poses to you about ARIA or how to do this with an accordion and things like that.  Just by advocating the importance of thinking about accessibility and finding out ways to make things accessible, what you are doing is valid.  I don't have all the answers.  I Google a lot of stuff.  When people ask me about JAWS, well, I don't use JAWS that much in my personal life, so when I am using it on my PC, I still swear at it.  [Laughter].
So don't let that stop you while you're learning things.  There's always going to be someone who knows more than you, and things you still need to know.  Things change, things fade away.  Don't let that stop you from championing and really pushing forward the importance and the priority of accessibility.
>> Absolutely.  Thank you so much.  Okay, next question here.  What do you consider core accessibility fundamentals when establishing a baseline of knowledge for a cross-disciplinary team?
>> Yeah, so with that, it's really just understanding what is meant by web accessibility, and understanding different disabilities, because there are a lot of people who don't understand what disability is.  They think they have an understanding, but a lot of times they don't.  So, understanding the nature of disability.  It's understanding common baseline topics like alt text.  That's something that everyone needs to understand, and that's not just designers or UX.  That's your developers.  That's your social media and marketing people, things like that.  There's understanding how people gain access to a site, understanding their assistive technology.  And honestly, it doesn't hurt for everyone to understand the basics of role-based accessibility: what does the tester or developer do?  Not doing deep dives, but it's good to understand what a UX person needs to be concerned about, what a developer needs to be concerned about, what a stakeholder or exec needs to be concerned about.  It's not deep dives.  It's going to be that basic web accessibility 101 kind of material that you would find.
>> Right.  How would you recommend that people get some of that learning?  Just the basics?  How did you get started?
>> How I got started is, you know, I have a visual disability.  So it was twofold: I got a ticket to test with JAWS, which I had never done before, and I learned more from there.  Deque has trainings, and other external vendors have trainings like that.  There are a lot of websites, blogs, and conferences like this one, which is awesome being free so people can access that knowledge.  I mean, there are a lot of great talks I'm going to have to go back and watch.  So yeah, there's a lot out there.  And make sure you diversify where you get things from, all sources.
>> Yep, for sure.  Okay, next question here from Alice.  Do you have a different process when auditing designs that are still living in Figma, Sketch, or XD and have not been programmed yet?
>> So with that, I don't have a lot of experience.  What I usually do is work with a UX designer and kind of talk things through.  I've never really worked on an official audit of designs.  It's usually a lot of knowledge sharing on my part and helping the designers, because I'm not a designer.  I started as a quality engineer and I've done front-end development; that's my strong point.  But I'm also someone with a background in journalism, someone who knows how to find resources and information.  So it's a lot of finding information, reviewing, just looking at things and giving my suggestions.  And that depends on the tool, Sketch or Figma.  I don't know Figma that well, but I have experience with it.
>> I believe you said you weren't too big of a fan of checklists.  But there is a question here: how do you create a good checklist?
>> So with that, what I'm going to say is that there are plenty of checklists out there you can find.  You can Google "accessibility checklist."  That's a very basic starting point.  But you really do need to take into account the context of your features and your application.  Because if your checklist has everything about captioning and audio accessibility, and your application has no audio or video component, what are you checking?  What are you testing for?
So it really needs to be something that fits you.  I'm not saying find a list and just start using that list.  It takes a review: okay, here are resources, things that are in some checklists that we found; let's put together a checklist that would benefit us.  And that's going to be a living document, again.  As I said, it's going to keep growing, and you're going to have to keep up with it.  If not, then it becomes checklist accessibility, and it really is of no use.
>> All right.  Thank you.  Is there a distinction or difference between an accessibility review and an audit?
>> I think when it comes to a review, there's really not the same immediacy of, we're going to look for things and then we want to fix those issues.  It may just be semantics.  With audits, there does feel like pressure: we're spending money, we have budget, some exec was like, hey, you'd better show what it's for.  But I think a review can still have a good outcome.  I guess that was a long way to say, it comes down to the intentions of the people conducting these things and what the outcome is.  You can have an audit that just sits there, where all the stuff goes into the backlog and never sees the light of day.  You can have an accessibility review that really does bring about action and change.
>> Right.  Awesome.  We have time for one more question here, maybe two.  What level of compliance should the QA team be aiming for?  Are some guideline violations acceptable, or should there be no violations at all?  Interested to hear your take on prioritization of issues found by QA.
>> It would be nice to have no issues, everything fixed.  That's not going to happen, really.  You want to make sure that anything that's a blocker, anything blocking important functionality for your users, like buying things or putting things in a cart, those are the things you focus on.  Everything is important, but if it's blocking a basic action on your site, you need to make that a priority.  You can approach accessibility issues like any issue when it comes to testing: if it's blocking your clients from accessing your site, no matter what it is, you focus on that.  Usually that falls in line with conformance to WCAG and other guidelines.  Sometimes it doesn't; sometimes it means that instead of an AA kind of compliance issue, it's a single-A issue that takes precedence, because it is a major issue for your customers.
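The triage rule described in that answer, blockers first regardless of conformance level, then by WCAG level, can be sketched as a sort key.  The issue records here are invented for illustration.

```python
def triage_key(issue):
    # Blockers sort first regardless of WCAG level; within each group,
    # single-A conformance issues come before AA, then AAA.
    level_order = {"A": 0, "AA": 1, "AAA": 2}
    return (not issue["blocks_user"], level_order[issue["wcag_level"]])

# Hypothetical issues from an audit report:
issues = [
    {"id": "missing-label", "wcag_level": "AA", "blocks_user": False},
    {"id": "cart-keyboard-trap", "wcag_level": "A", "blocks_user": True},
    {"id": "link-purpose", "wcag_level": "A", "blocks_user": False},
]
prioritized = sorted(issues, key=triage_key)
```

Here the keyboard trap in the cart sorts to the top even though other issues share its WCAG level, because it blocks a basic user action.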
>> Right.  Got it.  All right.  And I think we're at time.   So thank you so much, Crystal for all of that information and thank you for all of the participants who joined and hopefully learned a lot.  We appreciate your time and hope everybody enjoys the rest of AxeCon.  
 
 
 
 
https://www.matuzo.at/blog/building-the-most-inaccessible-site-possible-with-a-perfect-lighthouse-score/
 
 
For designing UI/UX for accessibility I recommend this amazing new course at Udemy. https://www.udemy.com/course/the-ux-designers-accessibility-guide/
 
https://www.w3.org/WAI/eval/report-tool/#!/#%2F

I saw that someone asked about what screen reader/browser combos to test. When I was trying to determine that at my org, I found this useful article that lists the most commonly-used combinations: https://webaim.org/projects/screenreadersurvey8/



 