WIP: Aphasia Therapy and VR - Proof of Concept
Updated: Nov 18, 2020
Virtual technologies have benefited the medical field, with breakthroughs in surgical sciences and diagnostics. In the avenue of intervention, this paper explores the potential of creating simulated real-world communication challenges for patients with aphasia, applying current models of therapeutic intervention to an immersive experience.
The scope of the work presented outlines the approach to developing an intervention simulator for patients with aphasia using real-world exercises, similar to the example below.
The user is presented with a menu showing a single word: Coffee. The host asks, "What would you like to order?" The user must say "coffee."
The coffee is then revealed on the table to the user.
Using research and the guidance of a clinician, one of the aforementioned exercises will be recreated in a virtual simulation.
The premise is that the user/patient will appear in a virtual coffee shop as an avatar. The host is there to assist. The user is presented with a menu bearing the word coffee. When the user says the word correctly, the host uses a trigger method and a coffee appears on the table.
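The "say the word, reveal the object" flow above can be sketched as a simple state machine. This is a hedged, engine-agnostic Python illustration of the logic, not actual UE4 Blueprint code; the class name `OrderExercise` and the `on_user_speech` callback are assumptions made for this sketch.

```python
# Engine-agnostic sketch of the ordering exercise: the host prompts,
# the user speaks, and the object is revealed once the target word is said.
# In the actual project this logic lives in UE4 Blueprints.

class OrderExercise:
    def __init__(self, target_word):
        self.target_word = target_word.lower()
        self.object_revealed = False

    def host_prompt(self):
        # The host asks the scripted question.
        return "What would you like to order?"

    def on_user_speech(self, utterance):
        # Check whether the user's utterance contains the target word;
        # if so, mark the object as revealed.
        if self.target_word in utterance.lower().split():
            self.object_revealed = True
        return self.object_revealed


exercise = OrderExercise("coffee")
print(exercise.host_prompt())
exercise.on_user_speech("I would like a coffee")
print(exercise.object_revealed)  # True once the word is said correctly
```

In the proof of concept the "recognition" step is performed by the human clinician listening to the patient; an automated check like the one above would only be a possible future extension.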
On the more technical side, the experience is being developed in Unreal Engine 4 (UE4) because of its ability to handle real-time rendering and high-quality imagery. Textures, materials, lighting, and post-processing effects are also vital to creating a believable experience.
Interactivity is the driving point of this experience. Choosing UE4 allows for faster development within its built-in systems. UE4's visual node-based scripting language, Blueprints, allows creative artists to build interactions quickly.
The interaction will consist of:
Virtual Reality Capabilities
Encompasses the Head-Mounted Display (HMD) and the touch controllers of the HTC Vive or Oculus Rift
Allows users to enter the VR world
Touch controllers allow for virtual hands to interact with the environment
The User is presented with a simple visual interface
Hosts will have majority control of the experience
A simple on-trigger system lets the Host press a button to reveal the object
Teleportation system for Users to interact with other areas in the scene.
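The Host-controlled reveal in the list above can be modeled as a tiny trigger/callback system. The following is a minimal Python sketch of the idea only; `HostTrigger` and `spawn_coffee` are hypothetical names for this illustration, not UE4 API, where the equivalent would be a Blueprint event bound to a controller button.

```python
# Minimal sketch of the Host's on-trigger reveal: pressing the button
# fires bound callbacks, one of which "spawns" the object in the scene.

class HostTrigger:
    def __init__(self):
        self._callbacks = []

    def bind(self, callback):
        # Register an action to run when the Host presses the button.
        self._callbacks.append(callback)

    def press(self):
        # Simulates the Host pressing the reveal button.
        for callback in self._callbacks:
            callback()


scene_objects = []

def spawn_coffee():
    # Stand-in for spawning the coffee actor on the table.
    scene_objects.append("coffee")

trigger = HostTrigger()
trigger.bind(spawn_coffee)
trigger.press()
print(scene_objects)  # ['coffee']
```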
Multiplayer functions allow for more user participation. In this case, the Host is the clinician while the User is the patient. For full immersion, an avatar system and audio would also need to be present.
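The Host/User role split described above can be pictured as a small session structure. This is a speculative Python sketch of the role model only; the role names come from the text, while `Session` and `Participant` are illustrative names, not part of UE4's multiplayer framework.

```python
# Sketch of a two-role session: the clinician joins as Host (controls the
# experience), the patient joins as User (the participant in the exercise).

from dataclasses import dataclass, field

@dataclass
class Participant:
    name: str
    role: str  # "Host" (clinician) or "User" (patient)

@dataclass
class Session:
    participants: list = field(default_factory=list)

    def join(self, name, role):
        self.participants.append(Participant(name, role))

    def host(self):
        # The Host controls the experience (e.g., the reveal trigger).
        return next(p for p in self.participants if p.role == "Host")


session = Session()
session.join("Clinician", "Host")
session.join("Patient", "User")
print(session.host().name)  # Clinician
```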
For more realistic immersion, users need a way to represent themselves in collaborative virtual reality applications. An avatar system would allow users to see one another in the environment, essentially as virtual representations of themselves.
There are a number of ways to achieve this aesthetically in UE4. One is to generate 3D models and create human-like avatars representing the characters in the experience. In this proof of concept, the user will not be aware of their own physical appearance.
Audio is also vital in any virtual simulation, as poor audio can break immersion. Hearing is one of the senses VR stimulates, alongside vision and sometimes touch. Audio is especially important for communication: Hosts will be able to speak with Users through the microphone built into the HMD.
Proof of Concept Demo Video
Special thanks to Justin as my user.
UE4's collaboration system allows multiplayer functionality across multiple platforms, which means I could use my Oculus Rift while my partner used his HTC Vive.