【Google develops a 3D audio laser system to bring VR interaction to visually impaired people】

Google is exploring how spatial audio cues can be used to navigate and interact with virtual environments, drawing on other technologies that already give visually impaired people more tools and help. One example is the alternative text on website images; another is Google TalkBack. With these technologies as inspiration, Google created an audio tool that makes VR more accessible to this group. To enable navigation in a pitch-black room, Google built a 3D audio laser system that includes a laser pointer extending from the Vive controller, which helps users select and play audio labels. It is already possible to navigate and interact in VR purely through auditory cues, and further development is planned. http://www.looooker.com/?p=50146

Daydream Labs: Accessibility in VR

Oct 27, 2017 | Category: Google

Virtual reality offers the ability to explore new worlds and have adventures without leaving home. We love the sense of freedom that VR offers, but it’s a technology that still relies mainly on visual cues—which makes it inaccessible to people with visual impairments. To bring these incredible experiences to visually impaired people, the technology needs to offer new tools. So we’ve been exploring how spatial audio cues can be used for navigating and interacting with virtual environments.

Compared to VR, other technologies offer more tools and help to visually-impaired people. One example is the alternative text found on website images, and another is Google TalkBack, which adds spoken, audible and haptic feedback to help visually-impaired people interact with their devices. With these technologies as inspiration, we created an audio tool aimed at making VR more accessible.

Accessibility in VR

Using an HTC Vive, we built a prototype of a 1:1 scale virtual room, recorded the name of every object in the room, and linked these audio labels to the individual objects—including the floor, walls and other features. Then, we made the user’s field of vision entirely black to simulate complete blindness. To enable navigation in the pitch-black room, we created a 3D audio laser system that includes a laser pointer extending from the Vive controller to select and play the audio labels, and an audio location control (touchpad click) to provide distance and direction to the last object aimed at by the laser pointer.
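The selection half of this system can be sketched roughly as follows. This is a minimal illustration under assumptions, not the code from the Daydream Labs prototype: `LabeledObject`, `pick_target`, `on_laser_select`, and the `controller` and `audio` objects are hypothetical stand-ins, and in practice a game engine (for example Unity with the SteamVR and Google VR audio plugins) would supply the raycast and the spatialized playback.

```python
import math
from dataclasses import dataclass
from typing import Iterable, Optional, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class LabeledObject:
    name: str        # e.g. "window", "floor", "toy laser gun"
    position: Vec3   # object position in room coordinates (metres)
    audio_clip: str  # path to the recorded audio label for this object

def pick_target(origin: Vec3, direction: Vec3,
                objects: Iterable[LabeledObject],
                max_angle_deg: float = 5.0) -> Optional[LabeledObject]:
    """Crude stand-in for an engine raycast: pick the labeled object whose
    position lies closest to the laser ray, within a small selection cone.
    `direction` is assumed to be a unit vector."""
    best, best_angle = None, math.radians(max_angle_deg)
    for obj in objects:
        to_obj = [p - o for p, o in zip(obj.position, origin)]
        dist = math.sqrt(sum(c * c for c in to_obj))
        if dist == 0.0:
            continue
        cos_a = sum(d * c for d, c in zip(direction, to_obj)) / dist
        angle = math.acos(max(-1.0, min(1.0, cos_a)))
        if angle < best_angle:
            best, best_angle = obj, angle
    return best

def on_laser_select(controller, objects, audio) -> None:
    """Laser-pointer selection: speak the audio label of the aimed-at object
    and remember it as the target for the audio location cue."""
    hit = pick_target(controller.position, controller.forward, objects)
    if hit is not None:
        audio.play_at(hit.audio_clip, hit.position)  # spatialized playback
        controller.last_target = hit
```

A selection cone rather than an exact ray hit is one plausible way to keep small objects selectable by audio alone; the real prototype may have used the engine's collision-based raycast instead.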

When a person aims the laser pointer at a virtual object and presses the audio location control, the VR system plays a short impulse-response tone at the location of the controller. The sound is then replayed several more times as it quickly progresses toward the location of the virtual object. Because all audio is processed using the Google VR Spatial Audio plugin, each tone carries enough information to convey the distance and relative location of the object in the virtual space.
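The location cue described above could be approximated as in the sketch below, which continues the earlier example: a hypothetical `play_location_cue` handler replays a short impulse tone at positions stepping from the controller toward the last selected object, relying on a spatial-audio renderer (such as the Google VR Spatial Audio plugin mentioned above) to convey direction and distance. The step count, timing, clip name, and `audio.play_at` interface are assumptions, not values from the prototype.

```python
import time

def lerp(a, b, t):
    """Linear interpolation between two 3D points."""
    return tuple(x + (y - x) * t for x, y in zip(a, b))

def play_location_cue(audio, controller, steps=4, interval_s=0.15):
    """Touchpad-click handler: play a short impulse tone at the controller,
    then repeat it at points moving toward the last object the laser hit,
    so the listener hears both the direction and the distance to it."""
    target = getattr(controller, "last_target", None)
    if target is None:
        return  # no object has been selected with the laser pointer yet
    for i in range(steps + 1):
        t = i / steps  # 0.0 at the controller, 1.0 at the target object
        pos = lerp(controller.position, target.position, t)
        audio.play_at("impulse_tone.wav", pos)  # spatialized, binaural tone
        time.sleep(interval_s)
```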

To test our prototype, we challenged participants to find and pick up a toy laser gun within the virtual room, navigate to the window, and finally shoot at a duck moving outside the window. We ran six non-visually-impaired people through the prototype, and all of them completed the challenge successfully. After the task was completed, four of them went through the experience again, this time with the blackout removed so they could see the room. Because they had navigated the room by sound, we found that they were already familiar with their surroundings.

It’s a small step, but this experiment demonstrated that it’s possible to navigate and interact with a room in VR using only auditory cues. We hope others will also continue to explore ways to make VR accessible for everyone. There’s much more to do in this area!

You can find more details in our published technical disclosure.

Original link: http://www.googlebig.com/2017/10/27/daydream-labs-accessibility-in-vr/

 

 

