Introduction
This chapter continues our discussion of the hardware commonly used in 3D UIs. We examine a variety of input devices that are used in both immersive and desktop applications and their effect on 3D UIs. We also examine how to choose input devices for different 3D applications by looking at some important device attributes, taxonomies, and evidence from empirical studies.
Roadmap For 3D UIs
Discussion of a variety of different input devices used in 3D interfaces and how they affect interaction techniques. These devices are broken up into the following categories:
- desktop input devices
- input device characteristics
- tracking devices
- 3D mice
- special-purpose input devices
- direct human input
Desktop Input Devices
Many input devices are used in desktop 3D UIs. Many of these devices have been used and designed for traditional 2D desktop applications such as word processing, spreadsheets, and drawing. However, with appropriate mappings, these devices also work well in 3D UIs and in 3D applications such as modeling and computer games.
Typically used in desktop 3D UIs
- Rely on active sensing
- Can be used in more immersive settings with appropriate mappings
Types
- Keyboards
- 2D mice and trackballs
- Pen and touch-based tablets
- Joysticks
- Desktop 6-DOF input devices
Keyboard
- Contains a set of discrete components (buttons)
- Often used in 3D modeling and computer games
Traditional keyboards are not well suited to immersive VR and mobile AR
- The device is not portable enough
Several keyboard designs exist for entering alphanumeric characters in VR and AR
2D mice and trackballs
Two-dimensional mice and trackballs are other classic examples of desktop input devices made popular by the windows, icons, menus, and pointers (WIMP) interface metaphor (van Dam 1997). The mouse is one of the most widely used devices in traditional 2D input tasks and comes in many different varieties. The trackball is basically an upside-down mouse. Instead of moving the whole device to move the pointer, the user manipulates a rotatable ball embedded in the device.
Two essential components:
- Continuous 2D locator
- Set of discrete components
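As a rough sketch (not from the text), the continuous 2D locator of a mouse or trackball is a *relative* device: it reports motion deltas, which the system integrates into a cursor position. The function name and screen size below are illustrative assumptions.

```python
# Sketch of a relative 2D locator (mouse/trackball): the device reports
# motion deltas, and the system integrates them into a cursor position,
# clamped to the screen bounds. Names and sizes are illustrative.

def move_cursor(cursor, delta, screen=(1920, 1080)):
    """Apply a relative (dx, dy) motion report, clamped to the screen."""
    x = min(max(cursor[0] + delta[0], 0), screen[0] - 1)
    y = min(max(cursor[1] + delta[1], 0), screen[1] - 1)
    return (x, y)

pos = (100, 100)
pos = move_cursor(pos, (25, -10))   # mouse moved right and slightly up
print(pos)  # (125, 90)
```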
Pen and touch-based tablets
These devices have a manually continuous component (i.e., a 2D locator) for controlling a cursor and generating 2D pixel coordinate values when the stylus is moving on or hovering over the tablet surface.
- The stylus can move on or hover over the surface
- One or more fingers for multi-touch
- Most devices have buttons to generate discrete events
Absolute devices – report where the stylus or touch point lies in the fixed reference frame of the tablet surface
Smaller devices can be used in immersive VR and mobile AR settings
Larger devices can be used in desktop and display-wall settings
Pen and paper style interface (pen and tablet metaphor)
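To contrast with a relative device like the mouse, a sketch of how an absolute tablet report might be mapped to the screen: each sample already contains the stylus position in the tablet's fixed frame, so the mapping is a direct scale. The tablet resolution used here is an assumption, not from the text.

```python
# Sketch of an absolute 2D locator (pen/touch tablet): each report gives
# the stylus position in the tablet's own fixed reference frame, which is
# scaled directly to a screen pixel. Dimensions are illustrative.

def tablet_to_screen(tx, ty, tablet=(21600, 13500), screen=(1920, 1080)):
    """Map an absolute tablet coordinate to a screen pixel."""
    sx = int(tx / tablet[0] * (screen[0] - 1))
    sy = int(ty / tablet[1] * (screen[1] - 1))
    return (sx, sy)
```

Unlike the mouse sketch above, no previous cursor state is needed: the same stylus position always yields the same pixel.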
Joysticks
Joysticks are another example of input devices traditionally used on the desktop and with a long history as a computer input peripheral. These devices are similar to mice and pen-based tablets in that they have a combination of a manually continuous 2D locator and a set of discrete components such as buttons and other switches. However, there is an important distinction between the mouse and the joystick. With a mouse, the cursor stops moving as soon as the mouse stops moving. With a joystick, the cursor typically continues moving in the direction the joystick is pointing.
Types
- Isotonic: the handle moves freely and must be returned to the neutral position to stop the cursor (typically mapped to rate control)
- Isometric: have a large spring constant, so the handle cannot be perceptibly moved (output varies with the force the user applies)
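The mouse-versus-joystick distinction above can be sketched as two transfer functions: position control (displacement maps to displacement) versus rate control (deflection maps to velocity, so the cursor keeps moving until the stick returns to neutral). The gain and timestep values are illustrative assumptions.

```python
# Sketch contrasting position control (mouse-style) with rate control
# (joystick-style). With rate control the cursor keeps moving as long as
# the stick stays deflected; gain/dt values are illustrative assumptions.

def position_control(cursor, delta):
    """Mouse: cursor displacement tracks device displacement."""
    return cursor + delta

def rate_control(cursor, deflection, gain=5.0, dt=0.1):
    """Joystick: deflection sets cursor *velocity*; neutral (0) stops it."""
    return cursor + gain * deflection * dt

c = 0.0
for _ in range(10):          # stick held at full deflection for 10 ticks
    c = rate_control(c, 1.0)
print(c)  # 5.0 -- the cursor kept moving the whole time
```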
Desktop 6-DOF Input Devices
A derivative of the joystick that uses isometric forces to collect 3D position and orientation data:
- Push/pull for translation
- Twist/tilt for orientation
These devices are designed specifically for desktop 3D interaction.
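A minimal sketch of how such a device's isometric readings might drive a 3D view, assuming the driver exposes raw force and torque tuples (the names, gains, and frame time below are assumptions, not a real device API):

```python
# Sketch: push/pull forces map to per-frame translation increments,
# twist/tilt torques to per-frame rotation increments. The force/torque
# representation and gain values are illustrative assumptions.

def puck_to_motion(force, torque, t_gain=0.01, r_gain=0.005, dt=0.016):
    """Return a (dx, dy, dz) translation increment and a
    (yaw, pitch, roll) rotation increment for one frame."""
    translation = tuple(f * t_gain * dt for f in force)
    rotation = tuple(t * r_gain * dt for t in torque)
    return translation, rotation
```

Because the device itself barely moves, the output is naturally suited to rate control: holding a steady push produces a steady translation velocity.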
Input Device Characteristics
Degrees of freedom
- A 6-DOF device reports three position values (x, y, z) and three orientation values (yaw, pitch, roll)
Data type
- Discrete (one data value at a time)
- Continuous (multiple data values in sequence)
- Combination (both discrete and continuous modes)
Active devices
- Require the user to wear, hold, or manipulate the device to generate useful data
- Can generate both discrete and continuous data
Passive devices
- Do not require the user to hold or wear any input hardware to generate useful data
- Placed in a strategic location in the environment
- A passive sensor can be used actively (e.g., the user wearing a camera)
Logical device types
- Locators: determine position and/or orientation information
- Valuators: produce a real number value
- Choice devices: indicate a particular element of a set
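The three logical device types can be sketched as simple event records (the field names and types below are illustrative assumptions, not from the text):

```python
# Sketch of the three logical device types as event records: a locator
# reports position/orientation, a valuator a single real value, and a
# choice device one element of a fixed set. Field names are assumptions.

from dataclasses import dataclass
from typing import Tuple

@dataclass
class LocatorEvent:
    position: Tuple[float, float, float]      # (x, y, z)
    orientation: Tuple[float, float, float]   # (yaw, pitch, roll)

@dataclass
class ValuatorEvent:
    value: float                              # e.g., a dial reading in [0, 1]

@dataclass
class ChoiceEvent:
    item: str                                 # one element of a known set
```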
Tracking Devices
3D user interaction systems are based primarily on motion-tracking technologies, which obtain the necessary information from the user by analyzing their movements or gestures. Tracking is important for presenting the correct viewpoint and for coordinating the spatial and sound information presented to users, as well as the tasks or functions they can perform.
We examine three of the most common tracking devices:
- Motion Trackers
- Eye trackers
- Data Gloves
Motion Trackers
Motion tracking is the process of recording the movement of objects or people. It is used in military, entertainment, sports, and medical applications, as well as for the validation of computer vision and robotics. In filmmaking and video game development, it refers to recording the actions of human actors and using that information to animate digital character models in 2D or 3D computer animation. Several different motion-tracking technologies are currently in use, including:
- magnetic tracking
- mechanical tracking
- acoustic tracking
- inertial tracking
- optical tracking
- hybrid tracking
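One reason hybrid tracking exists is that the individual technologies have complementary failure modes: inertial sensors are fast but drift, while optical fixes are absolute but slower. A toy 1D sketch of blending the two (the drift rate and blend weight are illustrative assumptions, and real systems use more sophisticated filters such as Kalman filters):

```python
# Toy sketch of sensor fusion for hybrid tracking: blend a fast but
# drifting inertial estimate with an absolute optical fix. The drift
# rate and blend weight are illustrative assumptions.

def fuse(inertial, optical, alpha=0.9):
    """Blend two position estimates; alpha weights the inertial term."""
    return alpha * inertial + (1 - alpha) * optical

est = 0.0            # fused estimate of a 1D position
true_pos = 0.0       # what the optical tracker reports (no drift)
for _ in range(50):
    inertial = est + 0.1           # inertial integration drifts +0.1/step
    est = fuse(inertial, true_pos)  # optical fix pulls the estimate back
# Pure inertial tracking would have drifted to 5.0 by now; the optical
# correction keeps the fused estimate bounded.
```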
Eye trackers
Eye trackers are purely passive input devices used to determine where the user is looking. Eye-tracking technology is primarily based on computer vision techniques: the device tracks the user’s pupils using corneal reflections detected by a camera. Devices can be worn or embedded into a computer screen, making for a much less obtrusive interface. Other eye-tracking techniques include electrooculography, which measures the skin’s electric potential differences using electrodes placed around the eye, and embedding mechanical or optical reference objects in contact lenses that are worn directly on the eye.
Data Gloves
In some cases, it is useful to have detailed tracking information about the user’s hands, such as how the fingers are bending or if two fingers have made contact with each other. Data gloves are input devices that provide this information. Data gloves come in two basic varieties:
- Bend-sensing gloves
- Pinch gloves
3D Mice
In many cases, specifically with motion trackers, these tracking devices are combined with other physical device components such as buttons, sliders, knobs, and dials to create more functionally powerful input devices. We call these devices 3D mice and define them broadly as handheld or worn input devices that combine motion tracking with a set of physical device components.
The distinguishing characteristic of 3D mice, as opposed to regular 2D mice, is that the user physically moves them in 3D space to obtain position and/or orientation information instead of just moving the device along a flat surface. Therefore, users can hold the device or, in some cases, wear it. Additionally, with orientation information present, it is trivial to determine where the device is pointing (the device’s direction vector), a function used in many fundamental 3D interaction techniques.
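A sketch of computing that direction vector from the tracked orientation: rotate a canonical forward axis by the reported yaw and pitch. The choice of -Z as "forward" and the axis conventions below are assumptions, not from the text.

```python
# Sketch: derive a 3D mouse's pointing direction from its tracked yaw
# and pitch (radians), assuming -Z is the canonical forward axis. Axis
# conventions vary between systems; this one is illustrative.

import math

def direction_vector(yaw, pitch):
    """Unit vector the device points along for the assumed convention."""
    return (-math.sin(yaw) * math.cos(pitch),
            math.sin(pitch),
            -math.cos(yaw) * math.cos(pitch))

print(direction_vector(0.0, 0.0))  # (-0.0, 0.0, -1.0): straight ahead
```

This direction vector is the basis of pointing-based techniques such as ray-casting selection.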
There are two types of 3D mice:
- Handheld 3D Mice
- User-Worn 3D Mice
Special Purpose Input Devices
There are five notable special-purpose input devices:
- Shape tape, used for manipulating 3D curves
- Interaction Slippers, worn by the user
- The CavePainting Table, used in the CavePainting application
- Transparent palettes, used for both 2D and 3D interaction
- The CAT (Control Action Table), designed for surround-screen display environments
Direct Human Input
A powerful approach to interacting with 3D applications is to obtain data directly from signals generated by the human body. With this approach, the user actually becomes the input device. For example, a user could stand in front of a camera and perform different movements, which the computer would interpret as commands (Lucente et al. 1998). In this section, we specifically discuss speech, bioelectric, and brain-computer input and how they can be used in 3D UIs.