This post explains some of the potential changes to our testing workflow, describes the shorter set of questions that we will continue to answer manually, and asks for your feedback.
As we incorporate DAISY's automated tools into our testing, some of our workflow will likely change. I expect that once we become more proficient at reading the Ace reports and learn to work with SMART, these tools will help us optimize our time and allow us to focus on the human experience of accessing specific features of an EPUB file. While we will be able to identify code-related issues more quickly, we will certainly still do some manual testing: tests performed by humans with experience using the tools add significant value.
For the next week, please allocate some of your hours to Ace and SMART. Prioritize this task and don't worry about finishing reports on new EPUB titles for this Friday.
As you explore these tools, please share your observations in the forum. Do you find the Ace reports useful? Would you say they can guide your exploration of the EPUB file? Ace can identify code-related issues, but it is only the first step. It is still necessary to conduct some manual testing for headings, table of contents navigation, external links, different types of text, etc. While some of this can be identified more quickly using SMART (it would be great if two people on the team took this on), the idea is that each tester will still continue using their regular reading applications for this part of the test. This will also add information about how different types of content in the file are presented in various reading applications.
To come up with a shorter list of questions for human testing, I cross-referenced the questions from the latest version of our original list against DAISY's rules for HTML and EPUB, which encompass all the issues that Ace can find. Farrah provided information about the manual checks that SMART usually requires.
Before we begin recording testing findings for more titles, it would be useful to discuss the list of questions here. In the shared EPUB Testing folder in Dropbox, you will find a file titled "EPUB testing questions (experience).xlsx". It contains two tabs: the first is the shorter list of questions for testing, and the second contains the original list of questions we were using, for your reference. The two tabs contain the same columns: column A contains the questions; column B indicates inclusion in the shorter list; column C shows the automated tool that checks for each feature; and column D has comments.
The questions that each tester answers in Excel must add value to the process. The shorter list is still quite long. Is there anything missing from it that requires usability testing? Are any of the questions redundant?
Please post your comments here; all feedback about the questions, tools, workflow, or anything else is welcome.
EPUB testing discussion for the SDPP-D Grant project 2018-19