Anomia
The Project
Language is a complex cognitive process that engages not only linguistic abilities but also attentional and socio-affective skills, particularly in a communication context. These abilities may be impaired or reduced, for example after a brain injury (aphasia). Yet language is still usually assessed with standardized tests administered outside any real communication context (pen-and-paper or computer-based tests).
Immersive virtual reality appears to be a relevant tool for measuring language in settings closer to real-life conditions, and its potential benefits for assessment are worth exploring. The project consists in implementing a new assessment in a VR application and presenting it to neurotypical adults and people with aphasia, in order to see to what extent their performance differs depending on the assessment condition.

The Market
The scene is made up of six stands: a fruit and vegetable stand, a tools stand, a musical instruments stand, a kitchen utensils stand, a bathroom items stand, and finally, at the end, a checkout where all previously purchased items are gathered.
Interactable items
To purchase an item, participants point their controller at it. If they do not know the object, they can request additional information about it, such as its name, an explanatory video (for tools), or a melody (for musical instruments).
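As a purely illustrative sketch (not the project's actual code), the per-item information could be organized along these lines, with the optional video and melody fields reserved for tools and musical instruments; all names and file paths below are hypothetical.

from dataclasses import dataclass
from typing import Optional

@dataclass
class MarketItem:
    # One purchasable object in the virtual market.
    name: str                          # target word the participant must produce
    category: str                      # e.g. "tool", "instrument", "fruit_vegetable"
    video_clip: Optional[str] = None   # explanatory video, used for tools
    melody_clip: Optional[str] = None  # short melody, used for musical instruments

# Hypothetical catalogue entries
catalogue = [
    MarketItem(name="hammer", category="tool", video_clip="videos/hammer.mp4"),
    MarketItem(name="violin", category="instrument", melody_clip="audio/violin.ogg"),
    MarketItem(name="carrot", category="fruit_vegetable"),
]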


Smart avatars
After selecting the object, the participant must say its correct name into the VR headset's built-in microphone.
The avatar then analyzes the utterance with a speech recognition model (IBM Watson) that we trained specifically for the study.
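As a rough sketch of how such a check could be performed with IBM Watson's Speech to Text service through its Python SDK (the actual application presumably runs this step inside the VR runtime), the API key, service URL, base model, customization ID, and file names below are placeholders:

from ibm_watson import SpeechToTextV1
from ibm_cloud_sdk_core.authenticators import IAMAuthenticator

# Placeholder credentials and endpoint
authenticator = IAMAuthenticator("YOUR_API_KEY")
stt = SpeechToTextV1(authenticator=authenticator)
stt.set_service_url("https://api.eu-de.speech-to-text.watson.cloud.ibm.com")

def name_is_correct(audio_path: str, expected_name: str) -> bool:
    # Transcribe the recorded utterance and compare it with the item's name.
    with open(audio_path, "rb") as audio:
        result = stt.recognize(
            audio=audio,
            content_type="audio/wav",
            model="fr-FR_BroadbandModel",              # assumed base model
            language_customization_id="CUSTOM_MODEL",  # custom model trained for the study
        ).get_result()
    if not result["results"]:
        return False
    transcript = result["results"][0]["alternatives"][0]["transcript"]
    return expected_name.lower() in transcript.lower()

# Example: check the utterance recorded after the participant selected the violin
# print(name_is_correct("utterance.wav", "violin"))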
Custom objects
For this study, about 136 unique items were modeled in 3D. Here are some of them.

Gallery



