The following protocol describes steps you can take to identify and document accessibility features and barriers in new technologies. Not all of these tasks will apply in every scenario, and some situations may require testing that is not described here. Applicability and order of testing tasks may depend on your project’s requirements or other factors.

For support with conducting these tests, contact the IT Accessibility Group.

Request and Review Accessibility Documentation from Vendor

Outcome: VPAT/ACR or other accessibility documentation received and reviewed; accessibility issues documented

  • Request and obtain accessibility documentation from the vendor
  • Review documentation with the requesting unit's representative
  • Document accessibility issues disclosed in the VPAT
  • Record any additional accessibility information or disclosures the vendor provides
  • Engage with the vendor on remediation efforts, timelines, and contract language

Note: This should be done prior to purchase or acquisition. See “Purchasing and Procuring Accessible Technology” for more information.

Identify and Assemble Testing Team

Outcome: Appropriate stakeholders convene to complete testing.

The testing team should include each of the following roles: accountable technology or service owner, local technical support staff, subject matter expert, knowledgeable end user, and accessibility specialist. One individual may fill multiple roles.

This group should define the scope of the testing, identify and compile tasks for testing, determine success criteria, recruit test users, and perform any other testing tasks.

Define Scope

Outcome: Testing tasks are identified; list is compiled.

  • Identify essential tasks, processes, dependencies, sequence
  • Describe desired user outcomes
  • Develop any task lists, worksheets, or forms, noting success/failure criteria, relevant WCAG guidelines, or other criteria

ALT Text

Outcome: Non-text elements that contain content include alternate descriptive text that conveys information and purpose to the user.

  • All images and non-text elements that present content include descriptive alternative text (e.g., alt, aria-label, aria-labelledby, aria-describedby)
  • Images that are purely decorative include an empty alt attribute (alt="") or role="presentation"
  • Images that accompany text instruction (e.g., an arrow adjacent to the text of a button) are marked as decorative in the same way
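The checks above can be partially automated. The following is a minimal sketch, assuming the application's HTML output can be captured and audited offline; it uses only Python's standard-library html.parser and flags any image that carries neither alt text, an ARIA labelling attribute, nor a decorative marker:

```python
from html.parser import HTMLParser

class AltTextAuditor(HTMLParser):
    """Collects <img> tags that lack a text alternative or decorative marker."""

    def __init__(self):
        super().__init__()
        self.flagged = []  # (line, column) positions of images with no alternative

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        attrs = dict(attrs)
        # alt (including alt="") or an ARIA labelling attribute names the image;
        # role="presentation" marks it as purely decorative.
        has_alternative = (
            attrs.get("alt") is not None
            or "aria-label" in attrs
            or "aria-labelledby" in attrs
            or attrs.get("role") == "presentation"
        )
        if not has_alternative:
            self.flagged.append(self.getpos())

auditor = AltTextAuditor()
auditor.feed('<p><img src="chart.png"><img src="logo.png" alt="UI logo"></p>')
print(auditor.flagged)  # only the first image is flagged
```

A pass like this only confirms that an alternative exists; whether the text actually conveys the image's information and purpose still requires human review.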

Keyboard Testing

Outcome: User can complete tasks independently using only the keyboard

  • User can locate, focus, and activate all interactive controls (SC 2.1.1)
  • No keyboard traps prevent the user from moving focus away from a component (SC 2.1.2)
  • User can bypass long or redundant navigation blocks (SC 2.4.1)
  • Tab/keyboard order follows logical reading order (SC 1.3.2, SC 2.4.3)
  • Keyboard focus is visually trackable at all times (SC 2.4.7)
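Most keyboard testing is hands-on, but one common cause of broken tab order can be detected statically: positive tabindex values, which override the document's natural focus sequence (relevant to SC 2.4.3). A minimal sketch, assuming HTML output can be audited offline:

```python
from html.parser import HTMLParser

class TabIndexAuditor(HTMLParser):
    """Flags positive tabindex values, which override the natural tab
    order and frequently break logical keyboard navigation."""

    def __init__(self):
        super().__init__()
        self.flagged = []  # (tag, tabindex) pairs where tabindex > 0

    def handle_starttag(self, tag, attrs):
        value = dict(attrs).get("tabindex")
        if value is None:
            return
        try:
            if int(value) > 0:
                self.flagged.append((tag, int(value)))
        except ValueError:
            pass  # malformed tabindex value; leave for manual review

auditor = TabIndexAuditor()
auditor.feed('<a href="/" tabindex="3">Home</a><button tabindex="0">OK</button>')
print(auditor.flagged)  # [('a', 3)]
```

tabindex="0" and tabindex="-1" are legitimate; only positive values are flagged, since they reorder focus ahead of everything else on the page.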

Visual Testing

Outcome: User can complete tasks independently with or without a mouse using visual cues

  • Color is not the only means of distinguishing an object or state (SC 1.4.1)
  • Foreground elements contrast sufficiently with backgrounds (SC 1.4.3)
  • Content can be magnified to 200% without loss of content or functionality (SC 1.4.4)
  • Cursor position is visually apparent (SC 2.4.7)
  • Magnification does not cause horizontal scrolling (SC 1.4.10)
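The contrast check (SC 1.4.3) is fully computable. The sketch below implements the WCAG relative-luminance and contrast-ratio formulas for sRGB colors given as 0-255 channel values; the example colors are illustrative:

```python
def relative_luminance(rgb):
    """WCAG relative luminance for an sRGB color with 0-255 channels."""
    def linearize(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (linearize(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio: (L_lighter + 0.05) / (L_darker + 0.05)."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background: the maximum possible ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
# SC 1.4.3 requires at least 4.5:1 for normal-size text; #767676 on
# white is just above that threshold.
print(contrast_ratio((118, 118, 118), (255, 255, 255)) >= 4.5)  # True
```

Large-scale text has a lower threshold (3:1), so the acceptance check should vary with text size and weight.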

Input Assistance

Outcome: User can identify and correct errors before submitting input

  • Input errors and solutions are communicated to user before user input is submitted (SC 3.3.1, SC 3.3.3)
  • Interactive components include clear, visible guidance, cues, and labels (SC 3.3.2)
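Part of the SC 3.3.2 check, confirming that every form field has a programmatic label, can be automated. A minimal sketch, assuming HTML output can be audited offline; it accepts a <label for=...> association or an ARIA labelling attribute as a name:

```python
from html.parser import HTMLParser

class LabelAuditor(HTMLParser):
    """Checks that each form field is named by a <label for=...>,
    an aria-label, or an aria-labelledby attribute (SC 3.3.2).
    Simplification: fields without an id collapse onto one key."""

    def __init__(self):
        super().__init__()
        self.fields = {}        # field id -> has its own ARIA name
        self.label_targets = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("input", "select", "textarea"):
            named = "aria-label" in attrs or "aria-labelledby" in attrs
            self.fields[attrs.get("id")] = named
        elif tag == "label" and "for" in attrs:
            self.label_targets.add(attrs["for"])

    def unlabeled(self):
        return [i for i, named in self.fields.items()
                if not named and i not in self.label_targets]

auditor = LabelAuditor()
auditor.feed('<label for="email">Email</label><input id="email"><input id="phone">')
print(auditor.unlabeled())  # ['phone']
```

As with the other automated checks, this confirms only that a label exists; whether the label text is clear guidance remains a manual judgment.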

Navigability Testing

Outcome: User can traverse content with substantially equivalent ease

  • All user content is contained in an appropriate landmark
  • User can bypass long or redundant navigation blocks (SC 2.4.1)
  • Tab/keyboard order follows logical reading order (SC 1.3.2, SC 2.4.3)
  • Change of context does not unexpectedly move user out of expected focus position (SC 3.2.2)
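The landmark check above can be started with a simple tally. A minimal sketch, assuming HTML output can be audited offline; it counts HTML5 landmark elements so a reviewer can confirm content is organized into landmarks and that exactly one main region exists (note that role attributes and named section/form landmarks are not covered by this simplification):

```python
from html.parser import HTMLParser

# Elements with implicit landmark roles; section and form become
# landmarks only when they have an accessible name, so they are
# deliberately omitted from this rough tally.
LANDMARKS = {"main", "nav", "header", "footer", "aside"}

class LandmarkAuditor(HTMLParser):
    """Tallies landmark elements found in a page."""

    def __init__(self):
        super().__init__()
        self.counts = {}

    def handle_starttag(self, tag, attrs):
        if tag in LANDMARKS:
            self.counts[tag] = self.counts.get(tag, 0) + 1

auditor = LandmarkAuditor()
auditor.feed("<header></header><nav></nav><main><p>Hi</p></main><footer></footer>")
print(auditor.counts)
print(auditor.counts.get("main", 0) == 1)  # exactly one main landmark
```

Whether all user content actually falls inside a landmark still requires inspecting the page structure, since this tally does not track nesting.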

Automated Testing (where applicable)

Outcome: Application passes automated checks without violations or warnings; items requiring manual review are documented

  • Select automated testing tools (e.g., Siteimprove, Acrobat Accessibility Checker, Microsoft Office Accessibility Checker)
  • Test individual application views and states with appropriate testing tool
  • Flag items that automated tools cannot evaluate for the manual checks described in this protocol
  • Validate code for accuracy, completion (SC 4.1.1)
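One concrete parsing check behind SC 4.1.1 is detecting duplicate id values, which silently break aria-labelledby, aria-describedby, and label/for associations. A minimal sketch, assuming HTML output can be audited offline:

```python
from collections import Counter
from html.parser import HTMLParser

class DuplicateIdAuditor(HTMLParser):
    """Flags id values that appear more than once in a page; duplicate
    ids break ARIA references and label/for associations (SC 4.1.1)."""

    def __init__(self):
        super().__init__()
        self.ids = Counter()

    def handle_starttag(self, tag, attrs):
        value = dict(attrs).get("id")
        if value:
            self.ids[value] += 1

    def duplicates(self):
        return sorted(i for i, n in self.ids.items() if n > 1)

auditor = DuplicateIdAuditor()
auditor.feed('<div id="menu"></div><ul id="menu"></ul><p id="intro"></p>')
print(auditor.duplicates())  # ['menu']
```

Commercial scanners such as Siteimprove perform this check among many others; a small script like this is useful for spot-checking individual views or dynamically generated states.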

Assistive Technology Testing

Outcome: User can complete tasks independently using specific assistive technology. To ensure compatibility, AT tests should be conducted separately for each assistive technology.

  • Identify testing environment: user agent, operating system, AT product 
  • Text alternatives for image or non-text elements (SC 1.1.1)
  • Structure allows user to maintain orientation (SC 1.3.1)
  • Content presents in logical reading order (SC 1.3.2)
  • Identification, focus and activation of any interactive controls (SC 2.1.1)
  • Clear guidance, cues, labeling of interactive components (SC 3.3.2)
  • Error prevention, alerts, and corrections (SC 3.3.4)

Post-Testing Review

Outcome: Testing results documented and shared with appropriate stakeholders.

  • Document automated and manual results
  • Adjust for essential processes, unknowns, intangibles
  • Present results to project team
  • Develop or modify EEAAP as necessary to reflect test results
  • Engage with stakeholders on remediation, accommodation, or other next steps

EEAAP

Testing results should inform the creation of an Equally Effective Alternative Access Plan (EEAAP) or similar remediation effort. Visit the UI's EEAAP website for more information (https://itaccessibility.uiowa.edu/EEAAP).