Experiments with blind users, March 2018

To expand on the work we performed during our visit to Bristol, we ran the same experiments with a small group of blind users from around the Lincoln area. The purpose of these tests was two-fold: to see whether the performance of the blind users is in line with that of their blindfolded counterparts from the previous round of experiments, and to gather their opinions on the system. The latter was of more interest to us, since this was the first opportunity we had to see whether the system and our design choices, as well as what we have planned, would be helpful to a blind person in a real-world context. We received positive feedback about the system in its current form, and the participants were particularly optimistic about having a system that could help them navigate and find objects more independently.

Processing the results from the target search experiment, July 2017

In January 2017, we conducted a set of experiments with our system at the University of Bristol to determine whether our audio interface is capable of directing a user to point a camera towards a virtual target. The interface uses spatialised sound signals to convey the target's pan angle and varies the signal's pitch (i.e. high/low notes) to convey its tilt angle. The experiments were a success: we gathered hundreds of samples, and the results so far look positive. Keep an eye out for the published results, and feel free to contact the project team if you'd like access to the dataset we generated.
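
To give a rough sense of the sonification scheme described above, here is a minimal sketch of one plausible mapping, not the project's actual implementation: pan angle drives constant-power stereo panning, and tilt angle drives pitch. The angle ranges, frequency band, and function names are all assumptions made for the example.

    import math

    # Assumed ranges for the example; the real system's calibration may differ.
    PAN_RANGE_DEG = (-90.0, 90.0)    # target left/right of the camera centre
    TILT_RANGE_DEG = (-45.0, 45.0)   # target below/above the camera centre
    PITCH_RANGE_HZ = (220.0, 880.0)  # low note = tilt down, high note = tilt up

    def pan_to_stereo_gains(pan_deg):
        """Constant-power panning: map pan angle to (left, right) channel gains."""
        lo, hi = PAN_RANGE_DEG
        t = (min(max(pan_deg, lo), hi) - lo) / (hi - lo)  # 0 = hard left, 1 = hard right
        theta = t * math.pi / 2
        return math.cos(theta), math.sin(theta)

    def tilt_to_pitch_hz(tilt_deg):
        """Map tilt angle linearly onto the pitch band: higher target, higher note."""
        lo, hi = TILT_RANGE_DEG
        t = (min(max(tilt_deg, lo), hi) - lo) / (hi - lo)
        f_lo, f_hi = PITCH_RANGE_HZ
        return f_lo + t * (f_hi - f_lo)

    # Example: a target up and to the right of where the camera is pointing.
    left, right = pan_to_stereo_gains(30.0)
    print(f"gains L={left:.2f} R={right:.2f}, pitch={tilt_to_pitch_hz(20.0):.0f} Hz")

Constant-power panning is used here because it keeps the perceived loudness roughly steady as the target moves between the ears, so only the direction cue changes, not the volume.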

Presentation at the AAAI Spring Symposium, March 2017

In March, my supervisor and I were invited to present our work at the AAAI Spring Symposium Series hosted at Stanford University, USA. The campus and its surroundings are very impressive; it's easy to see why Silicon Valley grew up around the Bay Area. As for the symposium, it was a great experience and we received very positive feedback from the delegates.