{
    "version" : "https://jsonfeed.org/version/1",
    "content" : "event",
    "type" : "single",
    "title" : "USWDS Monthly Call - June 2023 |Digital.gov",
    "description": "USWDS Monthly Call - June 2023",
    "home_page_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/","feed_url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/event/2023/06/15/uswds-monthly-call-june-2023/index.json","item" : [
    {"kicker" : "USWDS","title" :"USWDS Monthly Call - June 2023","summary" : "Developing and evaluating content for manual accessibility and user experience (UX) tests","date" : "2023-06-15T14:00:00-05:00","date_modified" : "2025-01-27T19:42:55-05:00","start_date" : "2023-06-15T14:00:00-05:00","end_date" : "2023-06-15T15:00:00-05:00",
      "event_organizer" : "Digital.gov","host" : "U.S. Web Design System","registration_url" : "https://gsa.zoomgov.com/meeting/register/vJItceGprT0tHyOpdv40WB6ome7t20p6EHQ","captions" : "https://www.streamtext.net/player?event=","youtube_id" : "UxC-a48Vn7w","topics" : {
        
            "accessibility" : "Accessibility",
            "user-experience" : "User experience"
            },"primary_image" : { "uid" : "2023-uswds-monthly-call-june-title-card", "alt" :
  "Title card image of USWDS logo, a multi-colored pentagon shape consisting of five triangles, centered on a black background. In blue text, the first line says, U.S. Web Design System (USWDS). Below in white text, the second line has the event name, USWDS monthly call June 2023. Below in blue text is date followed by the date of the event in white, June 15, 2023. The next line in blue text is Time followed by the time of the event in white, 2:00 pm ET.", "width" :
  "1200", "height" :
  "628", "credit" :
  "", "caption" :
  "", "format" :
  "png" },"content" :"\n\u003ca\n    href=\"https://s3.amazonaws.com/digitalgov/static/uswds-monthly-call-june-2023.pptx\"\u003eView the slides (PowerPoint presentation, 2.0 MB, 79 pages)\u003c/a\u003e\n\n\n\u003cdiv class=\"usa-accordion accordion\"\u003e\u003ch3 class=\"usa-accordion__heading\"\u003e\n    \u003cbutton\n      class=\"usa-accordion__button\"\n      title=\"View \"\n      aria-expanded=\"false\"\n      aria-controls=\"accordion-1\"\n    \u003e\n      \u003cspan class=\"icon\"\u003e\n          \u003csvg\n            class=\"usa-icon dg-icon dg-icon--standard margin-bottom-05\"\n            aria-hidden=\"true\"\n            focusable=\"false\"\n          \u003e\n            \n            \u003cuse xlink:href=\"/preview/gsa/digitalgov.gov/bc-archive-content-3/uswds/img/sprite.svg#content_copy\"\u003e\u003c/use\u003e\n          \u003c/svg\u003e\n        \u003c/span\u003e\u003cspan class=\"src\"\u003e\n        \u003cstrong class=\"kicker\"\u003eSlide by Slide\u003c/strong\u003eUSWDS Monthly Call - Presentation Script for June 2023\n        \u003c/span\n      \u003e\n    \u003c/button\u003e\n  \u003c/h3\u003e\u003cdiv\n      id=\"accordion-1\"\n      class=\"accordion-body usa-accordion__content usa-prose\"\n    \u003e\u003cp\u003e\u003cstrong\u003eSlide 1:\u003c/strong\u003e Hi there and welcome to the U.S. Web Design System monthly call for June 2023, home to Pride month, Fathers Day, Flag Day, Kamehameha Day, and the Summer Solstice — as well as Juneteenth, this coming Monday, with the USWDS logo in red, white, and blue.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 2:\u003c/strong\u003e My name is Dan Williams, he/him, and I\u0026rsquo;m the USWDS product lead — and here on-screen is my avatar: dark hair, blue sweater, collared shirt. Today my physical self is wearing a blue T-shirt and a brown sweater. We\u0026rsquo;re back to the 40s in the morning out here on the west coast! 
I\u0026rsquo;d like to mention that when we introduce ourselves in these calls, you\u0026rsquo;ll hear things like self-descriptions and pronouns — these help everyone share the same context and know a bit more about who we are, whether or not you can see us.\u003c/p\u003e\n\u003cp\u003eFirst, I\u0026rsquo;d like to mention that we\u0026rsquo;re recording this monthly call, so please refrain from turning on your camera. We will manually turn off any cameras to ensure the recording doesn\u0026rsquo;t show us on camera. Unfortunately, while we are recording this call, we currently aren\u0026rsquo;t able to share the video publicly.\u003c/p\u003e\n\u003cp\u003eI’d also like to remind you that by voluntarily attending this Digital.gov event, you agree to abide by Digital.gov’s community guidelines at \u003ca href=\"https://digital.gov/communities/community-guidelines/\"\u003edigital.gov/communities/community-guidelines/\u003c/a\u003e — you can leave the meeting at any time if you do not agree to abide by these guidelines. We’ve posted a link to the community guidelines in the chat.\u003c/p\u003e\n\u003cp\u003eIf you are in the Zoom app, you can use integrated live captioning by selecting the “CC” button on the bottom of the screen. If you prefer live captioning in a separate window, we\u0026rsquo;ve posted a link to the live captioning in the chat.\u003c/p\u003e\n\u003cp\u003eWe\u0026rsquo;ll be posting other links and references into the chat as we go along, and I encourage you to ask questions in the chat at any time. If any member of our team can answer your question in the chat, we\u0026rsquo;ll do so, otherwise there\u0026rsquo;ll be some time for questions and answers at the end of the hour. Also, be sure to introduce yourself in the chat as well — it\u0026rsquo;s nice to know who\u0026rsquo;s here. 
It\u0026rsquo;s good to have you here today.\u003c/p\u003e\n\u003cp\u003eFor those of you who find the chat distracting, you’re welcome to close or hide the chat window during the main presentation. You can reopen it later during the Q\u0026amp;A session at the end of this call.\u003c/p\u003e\n\u003cp\u003eSo thanks! And, with that, let\u0026rsquo;s get started!\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 3:\u003c/strong\u003e So what\u0026rsquo;s our agenda for today? A straightforward call today!\u003c/p\u003e\n\u003cp\u003eFirst we\u0026rsquo;ll review a few of the new features in our last release: USWDS 3.5.0.\u003c/p\u003e\n\u003cp\u003eThen we\u0026rsquo;ll spend the rest of our time talking about the progress we\u0026rsquo;ve made on critical checklists — our new accessibility documentation initiative — and where we\u0026rsquo;re going from here.\u003c/p\u003e\n\u003cp\u003eAnd then we\u0026rsquo;ll have some time left at the end for Q\u0026amp;A. So let\u0026rsquo;s get right into it.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 4:\u003c/strong\u003e Last month we were at the cusp of releasing \u003ca href=\"https://github.com/uswds/uswds/releases/tag/v3.5.0\"\u003eUSWDS 3.5.0\u003c/a\u003e, and this month it\u0026rsquo;s out and it\u0026rsquo;s a big one, with over 25 new features and bug fixes. I\u0026rsquo;d like to recap a few of the bigger updates in 3.5.0 then highlight a couple of the updates we weren\u0026rsquo;t able to get to last time.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 5:\u003c/strong\u003e So, there are a number of key improvements in USWDS 3.5.0 that we outlined last month:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eConsistency and legibility of disabled form elements.\u003c/strong\u003e We updated and normalized disabled styling across our form elements, making disabled styles more distinct from active styles, and increasing the visibility of disabled elements. 
We also applied consistent styling to forced colors and high contrast mode, and are phasing out class-based disabled styling.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eFile input experience for voice and screen readers.\u003c/strong\u003e We made some important improvements to the file input component that improve its ability to interact as expected with voice command, and to better announce its status to screen readers.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eSpace bar trigger to links styled as buttons.\u003c/strong\u003e With USWDS 3.5.0, if it looks like a button you can trigger it as you would a button, whether it\u0026rsquo;s a button element or a link styled as usa-button.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eUpdated Identifier accessibility link.\u003c/strong\u003e We also updated our Identifier\u0026rsquo;s Accessibility Statement link. To align the component more closely with policy we\u0026rsquo;re updating the text of the link from Accessibility Support to Accessibility Statement.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eAnd in USWDS 3.5.0, we\u0026rsquo;re updating our form guidance to finally suggest labeling both required fields and optional fields.\u003c/strong\u003e\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eSo in addition to those changes, we have a few more notable updates in this release.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 6:\u003c/strong\u003e Here\u0026rsquo;s some of the other good stuff in USWDS 3.5.0:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eStopped using font smoothing.\u003c/strong\u003e Font smoothing was a technique that resulted in thinner fonts when displaying light text on dark backgrounds. 
It\u0026rsquo;s a technique that\u0026rsquo;s no longer in favor and can compromise legibility and readability, so we\u0026rsquo;ve removed it from the limited cases where we used it.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eLabeled external links for screen readers.\u003c/strong\u003e Now screen readers will announce links that use the external link icon as external links. It will also announce links that open in a new tab.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eUpdated individual Sass map settings without affecting defaults.\u003c/strong\u003e Previously, changing a USWDS setting that was part of a Sass map — like utility output settings — meant you had to explicitly set every other value in the map as well, or else they\u0026rsquo;d get set to false. Now you can update only the map setting you want to change, and every other value in the map will retain its default settings.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eAdded two new settings to customize accordion background colors.\u003c/strong\u003e Now you can change the background color of accordions site-wide using settings.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eRemoved unused utility builder comments from compiled CSS.\u003c/strong\u003e Depending on how you compile your Sass, these comments could take up dozens of K. Now they\u0026rsquo;re gone!\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eFixed an issue with large file previews in File Upload.\u003c/strong\u003e Adding large files into the file upload could result in an infinite spinner for the file\u0026rsquo;s preview. 
Now the component can better handle large files.\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eAnd finally, we fixed a bug that prevented links that start with a number from scrolling when clicked in In-page navigation.\u003c/strong\u003e Now you can use numbers at the beginning of page headings and the in-page navigation will be able to properly link and scroll to them.\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 7:\u003c/strong\u003e And that\u0026rsquo;s a lot of what\u0026rsquo;s new in USWDS 3.5.0. It\u0026rsquo;s out now, so check it out!\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 8:\u003c/strong\u003e So, today we\u0026rsquo;re going to talk about a priority of ours over the last few weeks: content testing critical checklists. We\u0026rsquo;ll talk a bit about what critical checklists are, why we\u0026rsquo;re developing them, our approach to content for these checklists, how we\u0026rsquo;ve approached content testing, what we learned, and where we\u0026rsquo;re going with this important addition to the design system.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 9:\u003c/strong\u003e Last month we introduced the idea of critical checklists as a follow-up to the top tasks research project we discussed a couple months ago. One of the top tasks we identified in that research was that teams come to our website to know about the Section 508 and WCAG conformance of our components.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 10:\u003c/strong\u003e What we\u0026rsquo;ve decided to develop is something we\u0026rsquo;re calling component-based critical checklists. 
That is, for each component, we\u0026rsquo;re planning to publish something of an accessibility-focused used-car inspection checklist, outlining the checks we\u0026rsquo;ve performed on the component and result of those checks.\u003c/p\u003e\n\u003cp\u003eAs we\u0026rsquo;ll discuss a little later today, these checklists will provide an opportunity not only to be transparent about accessibility for the components we deliver, but also to provide a guide for design system users to check their own implementations.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 11:\u003c/strong\u003e It\u0026rsquo;s been a quick turnaround from last month\u0026rsquo;s call. When a month begins on a Thursday, we know that the third Thursday is going to come around faster than we think.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 12:\u003c/strong\u003e But it\u0026rsquo;s a good challenge! What can we accomplish in a couple weeks? How can we show some progress on an important initiative we just announced in last month\u0026rsquo;s call?\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 13:\u003c/strong\u003e What we tried to do was to get folks involved early, to try to perform some usability testing right at the start of the development cycle — and that meant leading with content testing. Before we had layouts, before we had even a prototype, we wanted to get some feedback about what content is the right fit for our idea.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 14:\u003c/strong\u003e To help talk about what we did, I\u0026rsquo;d like to introduce a couple members of the USWDS Core Team. First, Amy Cole, a contractor, and the accessibility expert on our team. Amy, can you introduce yourself and give a little self-description for anyone audio-only today?\u003c/p\u003e\n\u003cp\u003eAmy: Hi everybody. Thanks Dan. 
I am Amy Cole, I\u0026rsquo;m a caucasian woman with shoulder length curly brown hair, and I\u0026rsquo;m wearing a navy blue shirt, and my pronouns are she/her.\u003c/p\u003e\n\u003cp\u003eDan: Thanks Amy!\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 15:\u003c/strong\u003e I\u0026rsquo;d also like to introduce Jacline Contrino, a contractor, and the UX Researcher on our team. Jacline, can you introduce yourself and give a little self-description for anyone audio-only today?\u003c/p\u003e\n\u003cp\u003eJacline: Hi. Sure, I\u0026rsquo;m Jacline. I\u0026rsquo;m also a white woman with shoulder length brown curly hair, but I\u0026rsquo;m wearing a black shirt and big hoop earrings today. And I use she/her pronouns.\u003c/p\u003e\n\u003cp\u003eDan: Thanks Jacline! We\u0026rsquo;ll hear from Jacline in just a bit, but before we do, I\u0026rsquo;d like to pass it to Amy, so she can talk a bit more about content development for critical checklists.\u003c/p\u003e\n\u003cp\u003eAmy: Thanks for the introduction, Dan. I appreciate the time everyone is taking to join us today and I look forward to telling you about the plans we have for USWDS component accessibility.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 16:\u003c/strong\u003e When it comes to accessibility testing, and critical checklists, the first thing we asked ourselves is what\u0026rsquo;s our opportunity? \u003cstrong\u003eHow can we be useful?\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 17:\u003c/strong\u003e As someone who has worked on a small state-funded web design team, I understand that many of you who work on a government project team have multiple roles and responsibilities. For example, designers are sometimes asked to do development work, developers to be designers, or to pitch in with writing. We often wear multiple hats. We can’t all be accessibility or Section 508 specialists, but we can all contribute to accessibility testing. 
That\u0026rsquo;s where USWDS can help.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 18:\u003c/strong\u003e We take our role as accessibility supporters seriously. Our aim is to help scale specialized skills and expertise so that even the smallest team can benefit from the collective expertise of designers, developers, and accessibility specialists across government. It\u0026rsquo;s our responsibility to help ensure that your websites and services meet the needs of your users and the letter of the law. Even if your team does have Section 508 support — and especially if you do not — you may rely on USWDS engineering, usability, and accessibility guidance to help direct your work.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 19:\u003c/strong\u003e We see an opportunity to use our component accessibility assessments as a starting point for anyone assessing the accessibility of their own sites and services. We\u0026rsquo;re thinking of critical checklists as \u0026ldquo;what any component needs to do to be accessible and usable,\u0026rdquo; starting with the requirements of WCAG 2.1 single-A and AA success criteria. These critical checklists will document our own USWDS component accessibility and guide project-specific accessibility revalidation, to ensure that the components in your project are at least as usable and accessible as the components we ship.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 20:\u003c/strong\u003e These checklists are designed to be usable by folks with a wide range of technical skills, including those whose job responsibilities may not have previously included accessibility awareness. When we all participate in accessibility testing, we can all make decisions with accessibility in mind throughout the design and development process. 
These checklists will be written to help people new to accessibility testing to learn by doing.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 21:\u003c/strong\u003e Lastly, we recognize that our federal design system is held to higher accessibility standards than other design systems. Section 508 is the law, and we need to work hard to provide accessible components, templates, patterns, and guidance — and grow the behaviors and practices that result in better Section 508 conformance across government.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 22:\u003c/strong\u003e \u003cstrong\u003eHow’d we get started understanding critical checklists?\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 23:\u003c/strong\u003e Was anyone else doing this? In an effort to understand critical checklists in the design system environment, we performed a landscape analysis and reviewed about 30 design systems through the lens of accessibility. What are others doing? What accessibility resources did they offer? What could we emulate? What could we improve upon? 
What gaps did we find?\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 24:\u003c/strong\u003e We found different approaches to accessibility.\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eMany, but unfortunately not all, design systems we checked at least mentioned accessibility\u003c/li\u003e\n\u003cli\u003eSome included the WCAG success criteria, but didn\u0026rsquo;t always connect them to specific heuristics checked\u003c/li\u003e\n\u003cli\u003eMost accessibility resources were written using technical jargon and may not be suitable for audiences new to accessibility\u003c/li\u003e\n\u003cli\u003eNone had component-level checklists for keyboard, zoom, and screen readers\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 25:\u003c/strong\u003e There were a few interesting elements of accessibility documentation that we decided we might want to explore later:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eSome did include scripts for what a screen reader should announce per component\u003c/li\u003e\n\u003cli\u003eSome included component diagrams with accessibility-related callouts\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 26:\u003c/strong\u003e This analysis showed that there was an opportunity to fill a guidance gap, and that there was an additional opportunity to model a level of accessibility transparency that is rare at best in the design system documentation we encountered. If you\u0026rsquo;ve found great component-based accessibility documentation, please let us know, either in the chat, in the public Slack, or in our new Accessibility discussion channel in GitHub discussions, that Dan will talk about later in the call.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 27:\u003c/strong\u003e We don\u0026rsquo;t know everything! And we won\u0026rsquo;t ever know everything. More than anything else, we hope the design system can be a place where we can share and scale expertise. 
To do that effectively, we rely on the teams that use the design system and the folks who participate in these monthly calls to build on this body of knowledge. Beyond the landscape analysis, your questions also are helping us understand the kind of content we need to develop for our critical checklists. We read your emails, actively engage with your requests in Slack, and review your GitHub issues and pull requests. We expect that as we make progress on critical checklists, they\u0026rsquo;ll improve with your continued feedback.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 28:\u003c/strong\u003e And finally, as we considered the content we wanted to develop for critical checklists, we looked at other government accessibility resources. \u003ca href=\"https://ictbaseline.access-board.gov/\"\u003eSection 508\u0026rsquo;s Information and Communication Technology testing baseline\u003c/a\u003e is a tremendous resource, and \u003ca href=\"https://www.dhs.gov/trusted-tester\"\u003eDHS\u0026rsquo;s trusted tester\u003c/a\u003e program is an inspiration as well. Digital.gov has their \u003ca href=\"https://accessibility.digital.gov/\"\u003eAccessibility for Teams\u003c/a\u003e site and the folks at 18F have good practical accessibility training for their peers and partners. The \u003ca href=\"https://design.va.gov/\"\u003eVA Design System has select component checklists available\u003c/a\u003e. We want to draw from important federal resources without necessarily duplicating them. We see an opportunity to take these resources and focus them toward our components through a plain-language lens.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 29:\u003c/strong\u003e \u003cstrong\u003eWhat content did we test?\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 30:\u003c/strong\u003e In general, our approach was to provide context and instructions for anyone new to the manual testing process. 
We wrote the checklist items in a plain-language question-or-prompt format. We tried to keep jargon to a minimum and used conversational language.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 31:\u003c/strong\u003e We decided to err on the side of more content in our first content draft, to see what resonated with our users.\u003c/p\u003e\n\u003cp\u003eOur first content draft was focused on the Accordion component because it\u0026rsquo;s a popular component and relatively straightforward. Our draft content consisted of nine sections, just as the large gold rectangle on this slide is divided into a stack of nine smaller rectangles. The nine sections were:\u003c/p\u003e\n\u003col\u003e\n\u003cli\u003eOverall accessibility results\u003c/li\u003e\n\u003cli\u003eRelevant WCAG success criteria\u003c/li\u003e\n\u003cli\u003eOur testing protocols for USWDS components\u003c/li\u003e\n\u003cli\u003eAccessibility best practices for the component\u003c/li\u003e\n\u003cli\u003eKeyboard testing checklist\u003c/li\u003e\n\u003cli\u003eZoom magnification testing checklist\u003c/li\u003e\n\u003cli\u003eScreen reader testing checklist\u003c/li\u003e\n\u003cli\u003eMobile testing checklist\u003c/li\u003e\n\u003cli\u003eSupport\u003c/li\u003e\n\u003c/ol\u003e\n\u003cp\u003eAnd as we move through the content sections in the next few slides, each section in the rectangle will get a gold highlight.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 32:\u003c/strong\u003e \u003cstrong\u003eSection 1: Overall accessibility results.\u003c/strong\u003e At the top of the page we had a single topline result: This component has achieved level 2.1 of WCAG conformance standards.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 33:\u003c/strong\u003e \u003cstrong\u003eSection 2: Relevant WCAG success criteria.\u003c/strong\u003e Below the topline results we included a section of the relevant WCAG success criteria that apply to the component. 
For Accordion, this was a list of six criteria, including the criteria text and a link to the original criteria on the W3.org website. We included success criteria that directly applied to Accordion, level A, AA, or AAA.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 34:\u003c/strong\u003e \u003cstrong\u003eSection 3: Our testing protocols for USWDS components.\u003c/strong\u003e The third section outlined the steps USWDS takes to test accessibility on its components. These include:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eAutomated tests used in our continuous integration\u003c/li\u003e\n\u003cli\u003eManual high contrast mode testing in Windows \u0026amp; Chrome\u003c/li\u003e\n\u003cli\u003eBrowser based accessibility tools for image accessibility\u003c/li\u003e\n\u003cli\u003eManual keyboard and voice assistant and screen reader testing\u003c/li\u003e\n\u003cli\u003eUsing necessary ARIA tags\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 35:\u003c/strong\u003e \u003cstrong\u003eSection 4: Accessibility best practices for the component.\u003c/strong\u003e Next, we included the accessibility best practices from the main Accordion component page. 
To this, we added a couple smaller sections on audience considerations for using the accordion component, and a disclaimer on what USWDS is not able to test, including unsupported browsers, and all configurations of platforms, browsers, and contrast modes.\u003c/p\u003e\n\u003cp\u003eFinally, this section also included the guidance that teams should be testing their own implementation of the accordion component.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 36:\u003c/strong\u003e \u003cstrong\u003eSections 5–8: Manual testing checklists.\u003c/strong\u003e By the fifth section, we arrived at manual testing guidance, which included sections for keyboard control tests, zoom magnification tests, screen reader tests, and mobile tests.\u003c/p\u003e\n\u003cp\u003eEach section had started with some general guidance about how to approach that category of testing, followed by 3-10 checklist-style tests. For example, for the Zoom magnification section, we included the following elements we\u0026rsquo;ll see on the next couple slides.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 37:\u003c/strong\u003e \u003cstrong\u003eHow to perform magnification testing\u003c/strong\u003e\u003c/p\u003e\n\u003col\u003e\n\u003cli\u003eNavigate to a page on your website where the accordion component is used.\u003c/li\u003e\n\u003cli\u003eEnlarge the view of your screen to 200% by going into the browser settings or clicking “ctrl + scroll wheel” (Windows) or “command +” (Mac) until you see 200% in a pop-up window on the top right of your screen.\u003c/li\u003e\n\u003cli\u003eTest the functionality and visibility of the accordion using a mouse and the Magnification testing checklist.\u003c/li\u003e\n\u003c/ol\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 38:\u003c/strong\u003e \u003cstrong\u003eZoom magnification testing checklist\u003c/strong\u003e\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eAre you able to see all the content in the accordion without horizontal 
scrolling?\u003c/li\u003e\n\u003cli\u003eIs any content cut off?\u003c/li\u003e\n\u003cli\u003eDoes any content in the accordion overlap?\u003c/li\u003e\n\u003cli\u003eDo you see anything covering the accordion (pop ups, images, etc.)?\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 39:\u003c/strong\u003e \u003cstrong\u003eSection 9: Support.\u003c/strong\u003e The content ended with short support information and the date we last updated the page.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 40:\u003c/strong\u003e Upon reflection, there was a lot of content in this test! But as we will learn, it gave us a good opportunity to see what was most of interest to our users. I’ll pass things over to Jacline now and she can share how we approached content testing for this first draft of critical checklists.\u003c/p\u003e\n\u003cp\u003eJacline: Thanks Amy! We just heard how we developed our content and the type of content we were testing.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 41:\u003c/strong\u003e Now let\u0026rsquo;s get into the content testing process and talk about our goals: \u003cstrong\u003eWhat did we test and why?\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 42:\u003c/strong\u003e It can be really important and not too time consuming to test written content before the design phase. In the spirit of ‘testing early and often’ we wanted to get our ideas in front of USWDS users before any code was written or content published on our site. So, after we created the Critical Checklist content, we wanted to see if we were on the right track with what we wrote.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 43:\u003c/strong\u003e Before we talk about what we did, though, let’s take a step back and talk about what some of the typical options are when you need to test the effectiveness of written content. 
There are a few approaches you could take in order to assess users’ comprehension, perceptions, and feelings about the content.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 44:\u003c/strong\u003e First, let’s talk about good old fashioned \u003cstrong\u003esemi-structured interviews\u003c/strong\u003e. Semi-structured interviews can be flexible and good for getting general information about content. These usually involve showing participants the content and asking follow up questions. Some things you can learn in semi-structured interviews are:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eWhat users expect from the content before you share it with them (in other words, discover their mental models)\u003c/li\u003e\n\u003cli\u003eYou can learn what points of confusion they have with the content, or conversely, what’s particularly helpful and why\u003c/li\u003e\n\u003cli\u003eAnd you can gauge how well people understand the content by asking them to summarize it in their own words, or asking them what questions they have about the content.\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eSpeaking of comprehension, there are a couple of other tests that are particularly useful for testing comprehension of written content: cloze tests and recall-based tests.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 45:\u003c/strong\u003e First, let’s talk about \u003cstrong\u003ecloze tests\u003c/strong\u003e. These are kind of like research fill-in-the-blanks. You present the content to the participant, but remove, say, every fifth word. Participants are asked to fill in the blanks with their best guess as to what the words should be. Then, you calculate the score for the percentage of correct words guessed. A 60% or above score means the text is reasonably comprehensible. 
These tests can be pretty high effort for users (so be mindful not to have them guess too many words — aim for around 25), but cloze tests can be useful for testing content that is highly complex.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 46:\u003c/strong\u003e \u003cstrong\u003eRecall-based testing\u003c/strong\u003e can test not only how understandable the content is, but also how memorable it is. It involves showing participants the text that you are testing, and then asking them fact-based questions to sort of “quiz” how well they understood and remembered what they read. For example, if you want to test content that outlines benefits eligibility, after showing them the text, you could ask participants about what factors make them eligible or ineligible to receive certain benefits to see if they understood that content.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 47:\u003c/strong\u003e There are also \u003cstrong\u003ehighlighter tests\u003c/strong\u003e. These involve asking the user to highlight material that evoke certain feelings, such as confidence. For example, green highlighting for the content that makes them feel more confident, and maybe pink for the content that makes them feel less confident. You could also use this method to discover other things, such as content usefulness versus confusing or unnecessary content. Highlighter tests can be great when you’re testing longer passages since they are easy to do and don’t require a high cognitive load for the participant.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 48:\u003c/strong\u003e Finally, another technique that’s especially useful for testing the performance of content is A/B testing, or comparative testing. A/B tests help you compare 2 versions of the content to see which one performs better at helping your user do the thing the content is intended to help them do. 
This can be done in a low-fidelity way (like showing users 2 versions of the content, alternating which is shown first between participants to avoid bias) and seeing how they react to or act on the content. You can also do it in a high-fidelity environment, such as a live site, where half of your audience sees one version and the other half sees another, and then compare relevant performance metrics.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 49:\u003c/strong\u003e Regardless of which approach you use, as a general best practice, always be sure to test your content with the actual or representative users the content is intended for, and within the context in which it might be used. Participants’ motivation and background knowledge are really important in content testing, so using the wrong participants risks invalidating your study.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 50:\u003c/strong\u003e So, there are a lot of methods you can choose from to test written content. We considered a few of these when planning our own content test of the Critical Checklists, and ultimately decided to go with semi-structured interviews. We wanted the flexibility for users’ reactions and questions to drive our conversations, rather than sticking to the strict, limited-scope testing protocols that cloze tests, recall-based tests, and A/B testing can require. We strongly considered using the highlighter method as well, but again, we decided to prioritize flexible conversation. We were also working on a very tight timeline, and other methods like A/B testing and cloze tests require a little more lead time to set up. 
Interview sessions are faster and easier to plan and implement, and usually yield worthwhile insights to boot.\u003c/p\u003e\n\u003cp\u003eOur primary goals with the content test were pretty varied.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 51:\u003c/strong\u003e \u003cstrong\u003eMental models:\u003c/strong\u003e We wanted to learn what users think about accessibility for accordions, and what language they use to describe it.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 52:\u003c/strong\u003e \u003cstrong\u003eExpectations:\u003c/strong\u003e We wanted to discover what designers and developers are looking for when it comes to accessibility guidance for components, and learn to what extent the content we are testing meets their needs and expectations.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 53:\u003c/strong\u003e \u003cstrong\u003eComprehension:\u003c/strong\u003e We wanted to gauge how well users understand the content we’ve created. Are there any parts that are confusing or unclear?\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 54:\u003c/strong\u003e \u003cstrong\u003eGaps:\u003c/strong\u003e We wanted to find out how complete the content is. What’s missing? 
Did we forget something important?\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 55:\u003c/strong\u003e \u003cstrong\u003eActionability:\u003c/strong\u003e Finally, we wanted to get a sense of how well the content enables users to take action on the guidance, particularly with the checklists that outline ‘how you should test’ your own implementations of our components.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 56:\u003c/strong\u003e \u003cstrong\u003eHow did we do it?\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 57:\u003c/strong\u003e This was a fairly lightweight study that we were able to get done (from planning to execution to findings) within about 3 weeks.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 58:\u003c/strong\u003e We reached out to our community of testers for this study to invite them to participate. For those who were interested, we scheduled 30-minute testing sessions. We informed participants ahead of time of what to expect from the session and how their information would be protected. We also asked for permission to record the session and reiterated that participation was completely voluntary. Following ethical research practices is very important to us.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 59:\u003c/strong\u003e As mentioned before, the sessions were semi-structured interviews and we carried them out with 4 USWDS users individually. In the session, before we presented the content to participants, we first asked them what they expected to find in an Accordion Accessibility page on our website. This would help us discover if the way they thought about accordion accessibility support documentation lined up with how we were thinking about it, and where there might be any gaps.\u003c/p\u003e\n\u003cp\u003eThen, we briefly showed them the document with the content we were testing. 
We gave them 30 seconds to scan the document because we wanted to get their knee-jerk first impressions before they spent too much time with it. We also wanted to know if anything right off the bat stood out to them as particularly confusing, or particularly helpful.\u003c/p\u003e\n\u003cp\u003eThen, we gave them about 7 minutes to thoroughly read through the entire document. Afterwards, we asked them follow-up questions to see what might be missing or how it could be improved to better meet their needs and expectations.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 60:\u003c/strong\u003e Some of the questions we asked were:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eWould you please summarize the information on that page in your own words?\u003c/li\u003e\n\u003cli\u003eWhat was easy or difficult to understand? Why?\u003c/li\u003e\n\u003cli\u003eWhat questions do you have after reading that content?\u003c/li\u003e\n\u003cli\u003eWhat is missing that you’d like to see? Also, what might you change and why?\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003eThese questions helped us tease out where we might need to make improvements to our Critical Checklists.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 61:\u003c/strong\u003e If there’s anything we’d change the next time we run a test like this, one thing comes to mind.\nOne of the questions we asked folks was to summarize the information on the page in their own words. We found that this was not very effective with such a long document containing a lot of complex, detailed information. It forced participants to answer at a generalized high level, so it might have been more helpful to ask users to summarize just one section of the content. We ended up leaving this question out in later sessions, and instead asked participants “what questions do you have after reading that information?”\u003c/p\u003e\n\u003cp\u003eOverall, the sessions went really well and we got some excellent feedback. 
It’s always good to get more input on your ideas before building them out too much!\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 62:\u003c/strong\u003e And speaking of input, I can’t emphasize enough how much we appreciate feedback from our community — it really helps us make USWDS better. If any of you would like to sign up to possibly be a tester for future studies we run, please fill out this \u003ca href=\"https://forms.office.com/r/DrYNkBJ3pu\"\u003eform\u003c/a\u003e we are putting in the chat. You can opt out at any time; just let us know. We do ask that only federal employees sign up — unfortunately we are unable to include contractors or the public in testing at this time. Again, we rely on our community’s willingness to help us improve, so thank you!\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 63:\u003c/strong\u003e \u003cstrong\u003eOk, so what did we learn from our content test?\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 64:\u003c/strong\u003e We discovered that we are on the right track with the content, mostly.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 65:\u003c/strong\u003e Overall, it meets or exceeds user expectations — the quote you see on this slide shows one user saying, \u0026ldquo;\u003cem\u003eI think that\u0026rsquo;s it\u0026rsquo;s hard to think of all the pieces of accessibility… and I think this does a really good job (at a) high level, considering all of the pieces and also encouraging testing. Even showing or telling them how to test, (the) tools to use. It\u0026rsquo;s very well put together\u003c/em\u003e.”\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 66:\u003c/strong\u003e So, participants were really excited about it and felt that it’d help them feel more confident that the accordion component checks all the accessibility boxes while also empowering them to test their own implementations. 
The content was also fairly complete — there were a few suggestions for content to add, but overall, the document covered most of what users seemed to need.\u003c/p\u003e\n\u003cp\u003eWe also learned that the information was easily understandable to all participants. There weren’t too many points of confusion except for a couple of areas that needed some rewording or clarification. The tone was also on point — users felt that it was written in plain language and not too technical (except for the WCAG criteria, but we have some ideas to address that), and the content flows in a way that makes logical sense to people.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 67:\u003c/strong\u003e So what can we improve? One opportunity for us: participants felt that the content was extensive and could be overwhelming. One person said that they could see how users might get “lost” in it. So that’s something we’re going to be mindful of as we move forward into designing what the content might look like on the site.\nAdditionally, there are opportunities for clarifying content. For example, users weren’t clear on who was responsible for adding ARIA attributes to the code: were the attributes already built into the component, or are they something teams need to wire up in the code themselves?\u003c/p\u003e\n\u003cp\u003eAnd, speaking of ARIA, we had a couple of users express some hesitancy or wariness around ARIA. In our guidance, we encourage users to follow best practices for using ARIA. Users thought it could open a can of worms since it is easy to make errors. As one user put it, “no ARIA is better than bad ARIA,” and commented that we might want to put some disclaimers or words of caution around using ARIA.\u003c/p\u003e\n\u003cp\u003eWe also observed that some users want more basic, design-specific guidance, particularly on color contrast, font sizes, and text length. 
USWDS provides that information in other places on our site, so it may be worthwhile to link to those pages and provide that guidance at the point of need.\u003c/p\u003e\n\u003cp\u003eFinally, we learned that there is an appetite for more detailed information on testing results and research that USWDS has done on specific components. Some users wished they could click into the details of research findings, so talking to participants reiterated the importance of transparency in the research we’ve done.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 68:\u003c/strong\u003e \u003cstrong\u003eWhat are we going to do next?\u003c/strong\u003e\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 69:\u003c/strong\u003e We got some great feedback through these content test sessions. We were thrilled that, overall, people were excited and happy with the content of these Critical Checklists, but we are taking the areas of opportunity into careful consideration as we work towards the next iteration of these checklists. All the raw material is there, so now how might we package it so that it meets the identified needs of users? How might we present the information on the site so that users can easily interact with it and get what they need out of it?\u003c/p\u003e\n\u003cp\u003eThese are the questions that are guiding our next phase. We will soon begin prototyping our ideas for what these Critical Checklists might look like and how they might be structured on our site, and we hope to test those ideas again with our community of testers. We hope to roll out these checklists in the coming weeks, so stay tuned!\u003c/p\u003e\n\u003cp\u003eDan: Thanks, Jacline! This is Dan again. 
We learned a lot by pushing ourselves — perhaps — a little bit out of our comfort zone and testing something while it was still very new.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 70:\u003c/strong\u003e What we learned is that there\u0026rsquo;s huge value in getting an early outside perspective from your audience, if only to serve as a gut check that you\u0026rsquo;re on the right track. But we got more than that. From my perspective, there were a few clear takeaways from what we observed.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 71:\u003c/strong\u003e First is that while all accessibility information is welcome, the key content on this page is \u003cstrong\u003ewhat tests we performed, their results, and when we performed them\u003c/strong\u003e. This is largely the same content as what teams can do to check component accessibility on their sites.\u003c/p\u003e\n\u003cp\u003eThis needs to be as clear and obvious as possible, and the primary focus of the page. While we may still have a number of content sections on the page, the testing sections need to be right up top.\u003c/p\u003e\n\u003cp\u003eAnd we want to tie these tests to the WCAG success criteria. The success criteria on their own have limited value — there needs to be a clear connection between the test and the conformance criteria. The tests themselves need to convey that we\u0026rsquo;re comprehensively testing against all relevant criteria.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 72:\u003c/strong\u003e Second is that we should be focusing on behavior- and outcome-driven tests. That is, tests that look not at the underlying code of the implementation, but at the component\u0026rsquo;s behavior in the browser or in the screen reader. Implementation methods may change, and we already document our implementation specifics on the component page. These critical checklist tests should be focused on the user experience. 
For instance, our tests should ask whether the screen reader announces that an accordion is open, not whether the markup includes \u003cstrong\u003earia-expanded=\u0026quot;true\u0026quot;\u003c/strong\u003e.\nThis may also help reduce confusion, as teams won\u0026rsquo;t need to wonder whether they should be adding this or that ARIA label or JavaScript to their component. And even as component markup may change — for instance if and when we implement accordion with the \u003cstrong\u003esummary\u003c/strong\u003e and \u003cstrong\u003edetails\u003c/strong\u003e elements, and potentially remove ARIA — the accessibility-related behavior may not change. In the context of these checklists, folks can just be checking for the correct behaviors.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 73:\u003c/strong\u003e As we focus on \u0026ldquo;what tests we performed, their results, and when we performed them,\u0026rdquo; we need to keep the content relevant to the related component. If there\u0026rsquo;s information that\u0026rsquo;s more general, or applies to every component, we should create a common home for that content that we can link to instead of including it in every critical checklist. For instance, guidance on how to set up a screen reader is really important, but we want to be able to link to it, not always include it directly in this content.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 74:\u003c/strong\u003e As we think about building this guidance and publishing it, we need to understand the relationship between components and subcomponents, or when the accessibility of part of a component might be covered in a checklist for a different component. For example, the button in the search component has the same accessibility requirements as the button component. 
This relationship should influence how we document the accessibility of the search component, and it should also influence how we sequence these checklists as we write and develop them. We want to use something of an atomic design model, working on critical checklists for the smaller components before we attempt checklists for more complex components that may be dependent on the smaller ones.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 75:\u003c/strong\u003e All of this is to say that it\u0026rsquo;s good we\u0026rsquo;re thinking about all these things early so we can develop a repeatable structure. We have nearly 50 components, and creating all these checklists is going to take some time. We can\u0026rsquo;t spend four weeks on each one of them!\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 76:\u003c/strong\u003e Currently, we\u0026rsquo;re trying to schedule a six-month rollout for these critical checklists. Six months seems like a lot of time, but it also seems like just a little. I tend to think this is kind of an ambitious timeline!\u003c/p\u003e\n\u003cp\u003eSo we\u0026rsquo;re going to work quickly to iterate on what we\u0026rsquo;ve learned and move toward a production-ready template. We\u0026rsquo;ll probably want to test a prototype soon, but we\u0026rsquo;ll want to move fast. That six-month clock is ticking.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 77:\u003c/strong\u003e So one last thing I\u0026rsquo;d like to mention is that as we\u0026rsquo;re moving fast on this accessibility content, we\u0026rsquo;d like to stay extra-connected to accessibility-related discussions, and encourage them as much as possible. To that end, we\u0026rsquo;ve set up a special accessibility channel in GitHub Discussions. 
Since not every question you might have qualifies as an issue or a pull request, we hope that a discussion channel bridges the gap between the types of live conversations we have in the public Slack and the asynchronous, but still collaborative, back-and-forth we might have in a GitHub issue.\u003c/p\u003e\n\u003cp\u003eSo if you have a question about accessibility, or an opinion you\u0026rsquo;d like to share, check out our new \u003ca href=\"https://github.com/uswds/uswds/discussions/categories/accessibility\"\u003eGitHub Accessibility Discussions\u003c/a\u003e. We\u0026rsquo;re posting the link in the chat.\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 78:\u003c/strong\u003e Q\u0026amp;A\u003c/p\u003e\n\u003cp\u003e\u003cstrong\u003eSlide 79:\u003c/strong\u003e Thanks for joining today’s USWDS monthly call. We\u0026rsquo;ll be taking July off for a much-needed break from the monthly call, then we\u0026rsquo;ll be back in August. Please look out for an event feedback survey from Digital.gov. You\u0026rsquo;ll get this in your email, and there\u0026rsquo;s also a link in the chat. Your feedback makes a difference to us, so we\u0026rsquo;d appreciate the extra time it takes you to provide it.\u003c/p\u003e\n\u003cp\u003eAnd if you have a question we weren\u0026rsquo;t able to answer in the call, or that you thought of later, please head into our public Slack and ask it there. We\u0026rsquo;ll be around after the call to answer questions.\u003c/p\u003e\n\u003cp\u003eHave a great day, and a great July, and we\u0026rsquo;ll see you in August!\u003c/p\u003e\n\u003c/div\u003e\u003c/div\u003e\n\n\u003cp\u003eJoin us as we talk about the easy-to-follow, manual accessibility and user experience (UX) tests we\u0026rsquo;re developing for \u003ca href=\"https://designsystem.digital.gov/components/\"\u003eU.S. Web Design System (USWDS) components\u003c/a\u003e. 
We\u0026rsquo;re calling these tests \u003ccode\u003eCritical Checklists\u003c/code\u003e.\u003c/p\u003e\n\u003cp\u003eIn this session, you’ll learn how we\u0026rsquo;re approaching \u003ccode\u003eCritical Checklists\u003c/code\u003e:\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003eOur vision for \u003ccode\u003eCritical Checklists\u003c/code\u003e\u003c/li\u003e\n\u003cli\u003eHow \u003ccode\u003eCritical Checklists\u003c/code\u003e address \u003ca href=\"https://www.w3.org/TR/WCAG21/\"\u003eWeb Content Accessibility Guidelines (WCAG) 2.1\u003c/a\u003e requirements\u003c/li\u003e\n\u003cli\u003eHow we\u0026rsquo;re evaluating checklist content\u003c/li\u003e\n\u003cli\u003eHow \u003ccode\u003eCritical Checklists\u003c/code\u003e can support your own accessibility testing\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cstrong\u003eThis event is best suited for:\u003c/strong\u003e Designers, developers, and accessibility specialists at all levels.\u003c/p\u003e\n\u003ch2 id=\"speakers\"\u003eSpeakers\u003c/h2\u003e\n\u003cul\u003e\n\u003cli\u003e\u003cstrong\u003eDan Williams\u003c/strong\u003e — Product Lead, USWDS\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eAmy Cole\u003c/strong\u003e — Accessibility Specialist, USWDS\u003c/li\u003e\n\u003cli\u003e\u003cstrong\u003eJacline Contrino\u003c/strong\u003e — UX Researcher, USWDS\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"join-our-communities-of-practice\"\u003eJoin our Communities of Practice\u003c/h2\u003e\n\u003cul\u003e\n\u003cli\u003e\u003ca href=\"https://designsystem.digital.gov/about/community/\"\u003eUSWDS\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://www.section508.gov/manage/join-the-508-community/\"\u003eSection 508 IT Accessibility\u003c/a\u003e\u003c/li\u003e\n\u003c/ul\u003e\n\u003ch2 id=\"resources\"\u003eResources\u003c/h2\u003e\n\u003cul\u003e\n\u003cli\u003e\u003ca href=\"https://digital.gov/topics/accessibility/\"\u003eAccessibility resources — 
Digital.gov\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://designsystem.digital.gov/documentation/accessibility/\"\u003eAccessibility: Usability for every ability — USWDS\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://www.section508.gov/create/\"\u003eCreate Accessible Digital Products — Section508.gov\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://www.section508.gov/tools/program-manager-listing/\"\u003eFind your federal agency’s Section 508 Program Manager\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://digital.gov/resources/how-test-websites-for-accessibility/\"\u003eHow to Test Websites for Accessibility (with video)\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://www.youtube.com/watch?v=CL6lOwJEMGQ\u0026amp;list=PLd9b-GuOJ3nFHykZgRBZ7_bzwfZ526rxm\u0026amp;index=22\"\u003eAccessible Digital Content: Tips and Tricks (Digital.gov video playlist)\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://accessibility.digital.gov/\"\u003eAccessibility for Teams\u003c/a\u003e\u003c/li\u003e\n\u003c/ul\u003e\n\u003cp\u003e\u003cem\u003eThis event is part of a monthly series that takes place on the third Thursday of each month. Don’t forget to set a placeholder on your personal calendar for our future events this year.\u003c/em\u003e\u003c/p\u003e\n\u003ch2 id=\"about-the-uswds\"\u003eAbout the USWDS\u003c/h2\u003e\n\u003cp\u003e\u003ca href=\"https://designsystem.digital.gov/\"\u003eThe U.S. Web Design System\u003c/a\u003e is a toolkit of principles, guidance, and code to help government teams design and build accessible, mobile-friendly websites backed by user research and modern best practices.\u003c/p\u003e\n\u003cul\u003e\n\u003cli\u003e\u003ca href=\"https://designsystem.digital.gov/\"\u003eThe U.S. 
Web Design System\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://github.com/uswds/uswds/issues\"\u003eContribute on GitHub\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"mailto:uswds@support.digitalgov.gov\"\u003eEmail Us\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://digital.gov/communities/uswds/\"\u003eJoin our community\u003c/a\u003e\u003c/li\u003e\n\u003cli\u003e\u003ca href=\"https://twitter.com/uswds\"\u003eFollow @uswds on Twitter\u003c/a\u003e\u003c/li\u003e\n\u003c/ul\u003e\n",
      "branch" : "bc-archive-content-3",
      "filename" :"2023-06-08-uswds-monthly-call-june-2023.md",
      
      "filepath" :"events/2023/06/2023-06-08-uswds-monthly-call-june-2023.md",
      "filepathURL" :"https://github.com/GSA/digitalgov.gov/blob/bc-archive-content-3/content/events/2023/06/2023-06-08-uswds-monthly-call-june-2023.md",
      "editpathURL" :"https://github.com/GSA/digitalgov.gov/edit/bc-archive-content-3/content/events/2023/06/2023-06-08-uswds-monthly-call-june-2023.md","slug" : "uswds-monthly-call-june-2023","url" : "/preview/gsa/digitalgov.gov/bc-archive-content-3/event/2023/06/15/uswds-monthly-call-june-2023/"
    }
  ]
}
