Why are we developing our own software for professional accessibility testing?

by Annett Farnetani

I’m often asked how we came to develop our own software for manual accessibility testing. We are a digital agency specializing in accessibility, and we have been testing digital products for almost 15 years. Initially, we had no intention of creating an accessibility testing platform.

It all started with a table

Like almost all colleagues, we initially used a spreadsheet program (in our case LibreOffice) for our accessibility tests. With each test, new experiences and requirements arose, and we invested time in improving the file. We quickly combined spreadsheets and text documents to balance out their respective disadvantages. In addition, we integrated more and more automation features to take repetitive tasks off our hands and eliminate potential sources of error.

The larger the accessibility tests became and the higher the requirements for our reports, the more time we spent managing the spreadsheets and text documents. We were able to make this work because our team consists of diverse experts, so there was always someone with the technical skills to implement new requirements. However, throughout the entire time we used spreadsheets as a testing tool, a nagging feeling remained. Especially when setting up and completing a test, we repeatedly thought that the process should be easier and more intuitive. About three years ago, we finally realized that the effort was no longer proportional to the benefit and decided that we had to fundamentally change our approach.

What does an accessibility testing tool or software need to be able to do?

We set out in search of an alternative. First, we put together our key requirements to find a suitable solution.

Time savings

One of the most important goals was, and still is, to spend less time managing and adjusting the testing tool, and more time actually testing: from setting up a new test to conducting it, all the way through to completing the report.

Flexibility through different test catalogs

Our experience has shown that, in addition to established guidelines and standards (e.g. Web Content Accessibility Guidelines (WCAG) 2.1 AA or EN 301 549 – web), we also need customized or proprietary test catalogs.

Collaboration within the team

We work collaboratively as a team and simultaneously on projects, and this must also be possible for accessibility testing – naturally in a team of people with and without disabilities.

“Helpful” reports

It may sound odd, but many accessibility reports are not very helpful. Their structure and chosen metrics are often not designed from the client’s perspective. No report alone, no test alone makes something accessible, so we need the ability to create optimal, user-centered reports that make working on accessibility easier.

Structured testing according to WCAG-EM

It must be possible to carry out a structured evaluation according to WCAG-EM (Website Accessibility Conformance Evaluation Methodology). Two key capabilities for this are the ability to assemble samples and to work through the test catalog in a structured manner.
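As an illustration of these two capabilities (this is not CAAT’s actual data model; all names here are hypothetical), a WCAG-EM style evaluation can be thought of as a structured sample of pages that is worked through against every criterion of the selected catalog:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of a WCAG-EM style evaluation: a sample of
# representative pages is checked against each criterion of the
# chosen test catalog, and open checks can be listed at any time.

@dataclass(frozen=True)
class Criterion:
    id: str       # e.g. "1.1.1"
    title: str    # e.g. "Non-text Content"

@dataclass
class Evaluation:
    catalog: list          # e.g. WCAG 2.1 AA or a custom catalog
    sample: list           # URLs making up the structured sample
    results: dict = field(default_factory=dict)  # (page, criterion id) -> verdict

    def record(self, page, criterion_id, verdict):
        """Note the verdict for one criterion on one page."""
        self.results[(page, criterion_id)] = verdict

    def open_checks(self):
        """Yield every (page, criterion id) pair not yet evaluated,
        so the catalog can be worked through in a structured manner."""
        for page in self.sample:
            for c in self.catalog:
                if (page, c.id) not in self.results:
                    yield page, c.id

catalog = [Criterion("1.1.1", "Non-text Content"),
           Criterion("2.4.2", "Page Titled")]
ev = Evaluation(catalog=catalog, sample=["/home", "/contact"])
ev.record("/home", "1.1.1", "pass")
print(len(list(ev.open_checks())))  # 3 of 4 checks still open
```

The point of the sketch is simply that sample and catalog are first-class objects: swapping in a different catalog changes what is tested without changing how the evaluation is organized.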

Issue-oriented notation

Issue-oriented notation is a direct result of our many years of working with implementation teams (both external and in-house). For every team that receives an accessibility report and wants to make improvements, the first step is to break the report down into tasks (issues) and distribute them within the team or enter them into the ticketing system. With issue-oriented notation, the tasks are already divided. This way, the implementation team is supported and follow-up questions are reduced.
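To make the idea concrete (again a hypothetical sketch, not CAAT’s actual export format): when each finding is recorded as a self-contained issue, handing the report to an implementation team becomes a one-to-one mapping onto tickets rather than a manual splitting exercise:

```python
from dataclasses import dataclass

# Hypothetical illustration of issue-oriented notation: each finding
# already carries criterion, location, description, and severity,
# so exporting it into a ticketing system is a direct conversion.

@dataclass
class Issue:
    criterion: str    # e.g. "1.4.3 Contrast (Minimum)"
    location: str     # where the problem occurs
    description: str  # what is wrong and how to fix it
    severity: str     # e.g. "high", "medium", "low"

def to_ticket(issue):
    """Convert one issue into a generic ticket payload."""
    return {
        "title": f"[{issue.severity}] {issue.criterion}: {issue.location}",
        "body": issue.description,
        "labels": ["accessibility", issue.severity],
    }

issue = Issue(
    criterion="1.4.3 Contrast (Minimum)",
    location="footer links",
    description="Link text #999 on #fff has a contrast ratio below 4.5:1.",
    severity="high",
)
print(to_ticket(issue)["title"])
# [high] 1.4.3 Contrast (Minimum): footer links
```

Because the division into tasks happens while testing, not afterwards, the implementation team can import findings directly and follow-up questions are reduced.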

Joy of Use

The new testing tool must ensure a smooth workflow. Testing remains demanding work but should also be enjoyable. To achieve this, the tool needs to support both the testing process and the tester, without giving the impression of working against them.

Alternative testing tools

With these seven criteria in mind, we evaluated the following alternatives three years ago:

Option 1: Office products (e.g. LibreOffice, Excel, Word)

We had already worked with this software for years. It is by far the most flexible solution, provided the technical skills are available, but it can quickly become unstable and requires regular technical maintenance. You have full freedom in structuring the report, yet collaboration within the team is difficult.

Option 2: Ticketing system or test platform software

With both types of systems, the problem is that the software is neither designed for accessibility testing nor built to be accessible itself. Support for different guidelines and for creating samples is incomplete. With ticketing systems, the lack of a reporting function is an additional drawback.

Option 3: Dedicated software for accessibility testing

We had actually assumed that we would go with this option and only need to choose between the limited number of providers. We looked at several commercial testing tools, such as those from Deque Systems, TPGi, and also Accessibility Insights by Microsoft. The one thing missing across all of them was flexibility in testing methods and catalogs. You inevitably have to adopt the “way of testing” defined by the product. Custom test catalogs or adjustments are not possible.

In addition, at that time, many companies focused exclusively on WCAG and Section 508; EN 301 549 was not a consideration. We also evaluated the WCAG-EM Report Tool, which stands somewhat on its own. On the positive side, it is open source. However, since the tool is specifically designed to generate a conformance report, it fulfills only a few of our requirements for effective team testing. Testing is limited to WCAG, with no option to select other standards.

Option 4: Our own software for accessibility testing

This option offers the greatest freedom, but also demands the greatest effort: all desired features can be implemented, at the cost of developing and maintaining our own software.

CAAT – our own software

The decision to develop our own software was both difficult and straightforward. None of the other solutions came close to meeting our requirements. The path toward creating our own software was only possible because, as accessibility specialists and a digital agency, we were able to combine the necessary expertise.

And so we began developing CAAT – Computer Aided Accessibility Testing. We knew that our resources were relatively limited, which meant we had to stay lean and agile. To achieve this, we had to identify the most important features and create a simple and effective user interface. After an extended planning phase, the first usable version was in fact built in just two weeks. Many of the decisions made then proved solid and still exist in similar form in CAAT today.

CAAT meets all the criteria we set out and now offers a range of additional features. One of CAAT’s core principles, however, has remained the independent development of a testing platform without being tied to a fixed test catalog. Our goal is to provide maximum flexibility.


As of today

Since 2021, we have been offering CAAT as commercial software. CAAT is developed by testers for testers: we use it every day ourselves and are critical users. However, what has surprised us most is how much interest our clients and partners have shown in CAAT’s continuous development, giving us valuable insights into what they expect from professional accessibility testing software.

Our wish list for new features has grown very long and already goes far beyond the original set of criteria. We have implemented many of them and plan to tackle many more.

With every new customer, we gain the opportunity to invest more time in CAAT, making work easier for professional testers and offering something truly special:

Have fun testing.