In some embodiments, the computing device 110 includes peripheral devices for making physical contact with the user to make the virtual reality experience more realistic. The computing device may include gloves for providing the user with tactile sensations that correspond to virtual reality content. For example, the virtual reality content may include images of another user or object within the virtual reality environment, and when the user reaches out to touch the object, the computing device 110 provides pressure or vibrations that make it feel like the user is making physical contact with the object.
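The contact behavior described above can be sketched as a simple contact test that drives the glove's actuators. This is an illustrative sketch only; the function and parameter names are assumptions, not part of the embodiment, and a real implementation would use the glove vendor's haptic API.

```python
import math

# Hypothetical sketch: when the user's hand enters a virtual object's
# bounds, drive the glove's actuators with a vibration intensity that
# grows as the hand presses "deeper" into the object.

def haptic_intensity(hand_pos, obj_center, obj_radius):
    """Return a 0.0-1.0 vibration intensity for a spherical object."""
    dist = math.dist(hand_pos, obj_center)
    if dist >= obj_radius:
        # hand is outside the object: no tactile feedback
        return 0.0
    # deeper penetration -> stronger feedback, clamped to 1.0
    return min(1.0, (obj_radius - dist) / obj_radius)

print(haptic_intensity((0.0, 0.0, 1.0), (0.0, 0.0, 0.0), 0.5))   # no contact -> 0.0
print(haptic_intensity((0.0, 0.0, 0.25), (0.0, 0.0, 0.0), 0.5))  # contact -> 0.5
```

In practice the intensity value would be sent to the glove's vibration motors each frame, so that feedback appears the moment the user's hand overlaps the virtual object and strengthens as the hand presses further in.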
In some embodiments, the computing device may include other sensor input devices (not shown) that gather data from sources other than directly from the user. These sensors may include, by way of non-limiting example, distance sensors, motion sensors, location sensors, global positioning system (GPS) receivers, accelerometers, gyroscopes, or the like. These and other input devices are often connected to the processing unit 120 through a user input interface 160 that is coupled to the system bus, but they may be connected by other interface and bus structures, such as a parallel port, game port, or universal serial bus (USB).
The computing device 110 may use data from the sensor input devices to track the orientation of a user or parts of the user. For example, the computing device 110 may include one or more accelerometers or gyroscopes used to detect a change in orientation of the user's head and/or hands. The computing device 110 may then alter the information displayed within the virtual reality environment accordingly. Audio information may similarly be modified based on the orientation of the head and/or hands of the user.
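The orientation tracking and audio adjustment described above can be sketched as follows. This is a minimal illustration, not the embodiment's actual method: the function names, the fixed 100 Hz sample rate, and the simple stereo-pan model are all assumptions. It integrates a gyroscope's yaw-rate samples to track head orientation, then derives a stereo pan for a sound source from the difference between the source's bearing and the head yaw.

```python
import math

# Illustrative sketch (names and rates are assumptions): track head yaw
# from gyroscope samples, then pan audio based on where a sound source
# sits relative to the current head orientation.

def integrate_yaw(yaw_rad, yaw_rate_rad_s, dt_s):
    """Advance the head yaw estimate by one gyroscope sample."""
    return (yaw_rad + yaw_rate_rad_s * dt_s) % (2 * math.pi)

def stereo_pan(source_bearing_rad, head_yaw_rad):
    """Pan in [-1, 1]: -1 = fully left, +1 = fully right."""
    # wrap the relative bearing into (-pi, pi] before panning
    rel = math.atan2(math.sin(source_bearing_rad - head_yaw_rad),
                     math.cos(source_bearing_rad - head_yaw_rad))
    return math.sin(rel)

yaw = 0.0
for _ in range(90):  # 90 samples at 100 Hz while turning at 100 deg/s
    yaw = integrate_yaw(yaw, math.radians(100), 0.01)
# the head has turned ~90 degrees to the right, so a source that is
# straight ahead in world coordinates now sits to the user's left
print(round(stereo_pan(0.0, yaw), 2))
```

The same relative-bearing computation could drive the visual update: the rendered view is rotated by the accumulated head yaw, so the displayed scene and the audio panning stay consistent with the tracked orientation of the user's head.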