To analyse my artefact I recorded myself using the Web Speech API, talking to the browser to make the pages navigate as they would on a hand-held device. Below are the videos of me doing this.
Video 1: Navigation
In this video I'm talking to the browser to make it navigate across the page using tags within the HTML code (the scrollable navigation from the previous artefact). A rough sketch of the idea follows.
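The core of this feature is a SpeechRecognition instance that listens for a section name and scrolls the matching element into view. Below is a minimal sketch of how that could work; the section ids are hypothetical placeholders, and the artefact's real HTML tags will differ.

    // Minimal voice-navigation sketch using the Web Speech API.
    // Chrome exposes the constructor under a webkit prefix.
    const SpeechRecognition =
      window.SpeechRecognition || window.webkitSpeechRecognition;
    const recognition = new SpeechRecognition();
    recognition.continuous = true; // keep listening between commands
    recognition.lang = 'en-GB';

    recognition.onresult = (event) => {
      // Transcript of the most recent recognised phrase.
      const command = event.results[event.results.length - 1][0]
        .transcript.trim().toLowerCase();

      // If the spoken command matches a section id, scroll to it.
      // Ids like "intro" or "contact" are assumed for illustration.
      const target = document.getElementById(command);
      if (target) {
        target.scrollIntoView({ behavior: 'smooth' });
      }
    };

    recognition.start();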
Video 2: Paragraph
This video extends the Speech API work by reading a requested paragraph back to the end-user in an automated voice.
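The read-back half of this uses the synthesis side of the Web Speech API. A minimal sketch, assuming the page's paragraphs are plain <p> elements (the artefact's real markup may differ):

    // Read paragraph N of the page aloud with speech synthesis.
    function speakParagraph(number) {
      const paragraphs = document.querySelectorAll('p');
      const paragraph = paragraphs[number - 1]; // "paragraph 1" -> index 0
      if (!paragraph) return;

      const utterance = new SpeechSynthesisUtterance(paragraph.textContent);
      utterance.lang = 'en-GB';
      window.speechSynthesis.speak(utterance);
    }

Inside the recognition handler, a command such as "paragraph 2" could be matched with command.match(/paragraph (\d+)/) and passed to speakParagraph; how the artefact actually parses the number isn't shown here.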
Video 3: Scroll
This video uses some extra JavaScript to let users scroll up and down using their voice only.
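Voice scrolling only needs the recognised transcript and window.scrollBy. A minimal sketch; the 400px step and the exact command words are assumptions, not the artefact's values:

    // Scroll the page in response to a recognised voice command.
    function handleScrollCommand(command) {
      if (command.includes('scroll down')) {
        window.scrollBy({ top: 400, behavior: 'smooth' });
      } else if (command.includes('scroll up')) {
        window.scrollBy({ top: -400, behavior: 'smooth' });
      }
    }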
Video 4: Example
This video mixes all of the above features, combining navigation with paragraph readings.
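Putting the pieces together amounts to routing each transcript through the commands above. A sketch of one way to do it, reusing the recognition, speakParagraph and handleScrollCommand names from the earlier sketches (the artefact's actual routing may differ):

    // Route one recognised phrase to the matching feature.
    recognition.onresult = (event) => {
      const command = event.results[event.results.length - 1][0]
        .transcript.trim().toLowerCase();

      const match = command.match(/paragraph (\d+)/); // e.g. "paragraph 2"
      if (match) {
        speakParagraph(Number(match[1]));   // Video 2: read-back
      } else if (command.startsWith('scroll')) {
        handleScrollCommand(command);       // Video 3: scrolling
      } else if (document.getElementById(command)) {
        document.getElementById(command)    // Video 1: navigation
          .scrollIntoView({ behavior: 'smooth' });
      }
    };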