9.2 Inertial measurement units (IMUs) and sensor fusion

2 min read · August 7, 2024

Inertial measurement units (IMUs) are key to tracking motion in AR and VR. They use accelerometers, gyroscopes, and magnetometers to measure movement and orientation. These sensors work together to provide accurate data about a device's position and rotation.

Sensor fusion techniques combine data from multiple sensors to improve tracking accuracy. Methods like Kalman filters and complementary filters help reduce errors and noise. This fusion is crucial for creating smooth, responsive experiences in AR and VR applications.

Inertial Sensors

Accelerometer and Gyroscope

  • The accelerometer measures linear acceleration along three orthogonal axes (x, y, and z)
  • Provides information about the device's orientation relative to gravity
  • Can detect tilt, motion, and vibration
  • The gyroscope measures angular velocity around three orthogonal axes (pitch, yaw, and roll)
  • Detects rotational motion and helps determine the device's orientation
  • Gyroscope data is integrated over time to estimate the device's angular position, as in the sketch below
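
As a rough illustration of that integration step, the sketch below accumulates gyroscope angular-velocity samples into an orientation estimate. The sample rate and readings are made-up values, and a real implementation would integrate quaternions rather than raw Euler angles.

```python
import numpy as np

def integrate_gyro(angles, gyro_rates, dt):
    """Advance the orientation estimate by one time step.

    angles: current (roll, pitch, yaw) estimate in radians
    gyro_rates: measured angular velocity in rad/s
    dt: sample period in seconds
    """
    # Simple Euler integration; production systems use quaternions to
    # avoid gimbal lock and handle large rotations correctly.
    return angles + np.asarray(gyro_rates) * dt

dt = 0.01                                  # hypothetical 100 Hz gyroscope
orientation = np.zeros(3)                  # start at roll = pitch = yaw = 0
samples = [(0.02, -0.01, 0.15)] * 100      # made-up constant rotation, rad/s

for rates in samples:
    orientation = integrate_gyro(orientation, rates, dt)

print(np.degrees(orientation))             # sensor bias and noise accumulate as drift
```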

Magnetometer and Sensor Data

  • The magnetometer measures the strength and direction of the Earth's magnetic field
  • Helps determine the device's absolute orientation relative to magnetic north
  • Can be affected by magnetic interference from nearby objects (metal structures, electronic devices)
  • Angular velocity represents the rate of change of the device's orientation over time
    • Measured in degrees per second (°/s) or radians per second (rad/s)
  • Linear acceleration represents the device's acceleration along the three axes
    • Measured in meters per second squared (m/s²)
    • Includes the effect of gravity, which needs to be compensated for to obtain true linear acceleration (see the sketch after this list)
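
A minimal sketch of that gravity compensation, assuming the device orientation is already available as a body-to-world rotation matrix (supplied by hand here; in practice it comes from the fusion filter):

```python
import numpy as np

GRAVITY_WORLD = np.array([0.0, 0.0, 9.81])   # gravity in the world frame, m/s^2

def linear_acceleration(accel_body, R_world_from_body):
    """Remove gravity from a raw accelerometer reading.

    accel_body: raw measurement in the sensor (body) frame, m/s^2
    R_world_from_body: 3x3 rotation matrix taking body-frame vectors into
                       the world frame (assumed to come from the fusion filter).
    Sign conventions vary between sensors; this assumes a stationary,
    level device reads approximately (0, 0, +9.81).
    """
    accel_world = R_world_from_body @ np.asarray(accel_body)
    return accel_world - GRAVITY_WORLD

# Example: device lying flat and stationary -> true linear acceleration ~ 0
print(linear_acceleration([0.0, 0.0, 9.81], np.eye(3)))
```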

Sensor Fusion Techniques

Kalman Filter and Complementary Filter

  • The Kalman filter is a recursive algorithm that estimates the state of a system based on noisy sensor measurements
  • Combines data from multiple sensors to produce an optimal estimate of the system's state
  • Accounts for the uncertainties and errors in the sensor measurements
  • The complementary filter is a simpler alternative to the Kalman filter
  • Combines data from different sensors based on their frequency characteristics
  • Typically uses a high-pass filter for gyroscope data and a low-pass filter for accelerometer data, as in the sketch below
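
A minimal single-axis sketch of that idea: the gyroscope path is effectively high-passed (it dominates short-term changes) while the accelerometer path is low-passed (it anchors the long-term gravity direction). The weight alpha and the sample values are illustrative assumptions.

```python
import math

def complementary_filter(angle, gyro_rate, accel, dt, alpha=0.98):
    """Fuse one tilt angle (e.g. roll) from gyroscope and accelerometer.

    angle: previous fused estimate, rad
    gyro_rate: angular velocity about the roll axis, rad/s
    accel: (ax, ay, az) raw accelerometer reading, m/s^2
    alpha: weight on the gyroscope (high-pass) path; 1 - alpha goes to
           the accelerometer (low-pass) path.
    """
    gyro_angle = angle + gyro_rate * dt            # responsive, but drifts
    accel_angle = math.atan2(accel[1], accel[2])   # noisy, but drift-free
    return alpha * gyro_angle + (1 - alpha) * accel_angle

# One 100 Hz update with made-up readings
angle = complementary_filter(angle=0.0, gyro_rate=0.05,
                             accel=(0.0, 0.7, 9.77), dt=0.01)
print(math.degrees(angle))
```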

Sensor Fusion Algorithms and Drift Correction

  • Sensor fusion algorithms combine data from multiple sensors to provide a more accurate and reliable estimate of the device's orientation and motion
  • Common algorithms include the Extended Kalman Filter (EKF) and the Madgwick algorithm
  • Sensor fusion helps overcome the limitations of individual sensors (accelerometer, gyroscope, magnetometer)
  • Drift correction is a crucial aspect of sensor fusion
  • Gyroscope measurements are prone to drift over time due to integration errors
  • Accelerometer and magnetometer data can be used to correct the gyroscope drift
  • Sensor fusion algorithms continuously update the orientation estimate based on the corrected sensor data (see the drift-correction sketch below)
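
As a rough sketch of that drift-correction loop: the yaw estimate is driven by gyroscope integration but is continuously nudged toward the absolute heading derived from the magnetometer, so integration errors cannot grow without bound. The correction gain, the level-device assumption, and the sample data are simplifications for illustration.

```python
import math

def magnetometer_heading(mx, my):
    """Heading (yaw) from the horizontal magnetic field components,
    assuming the device is roughly level (tilt compensation omitted)."""
    return math.atan2(-my, mx)

def fuse_yaw(yaw, gyro_z, mag_xy, dt, k=0.02):
    """One fusion step: integrate the gyro, then apply a small correction
    toward the magnetometer heading (k is a hypothetical gain)."""
    yaw += gyro_z * dt                                      # drifts on its own
    error = magnetometer_heading(*mag_xy) - yaw
    error = math.atan2(math.sin(error), math.cos(error))    # wrap to [-pi, pi]
    return yaw + k * error                                  # drift correction

yaw = 0.0
for _ in range(500):                    # made-up data: biased gyro, fixed heading
    yaw = fuse_yaw(yaw, gyro_z=0.01, mag_xy=(0.3, 0.0), dt=0.01)
print(math.degrees(yaw))                # stays near 0 despite the gyro bias
```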

Key Terms to Review (20)

3-axis IMU: A 3-axis Inertial Measurement Unit (IMU) is a sensor device that measures and reports specific force, angular rate, and sometimes magnetic field surrounding the sensor in three-dimensional space. This device typically includes three accelerometers and three gyroscopes, allowing it to capture the motion and orientation of an object in real-time, which is essential for applications in navigation, robotics, and augmented reality systems.
6-axis IMU: A 6-axis inertial measurement unit (IMU) is a sensor device that measures acceleration and angular velocity in three-dimensional space, providing data on motion and orientation. It combines a 3-axis accelerometer and a 3-axis gyroscope, allowing it to capture both linear movement and rotational motion, making it essential for applications in augmented and virtual reality.
9-axis IMU: A 9-axis inertial measurement unit (IMU) is a device that combines three types of sensors: accelerometers, gyroscopes, and magnetometers, to measure motion and orientation in three-dimensional space. This integration allows for precise tracking of an object's acceleration, angular velocity, and magnetic field, making it essential for applications like augmented and virtual reality, robotics, and navigation systems.
Accelerometer: An accelerometer is a sensor that measures the acceleration it experiences relative to free fall, providing data about the motion and orientation of an object. This device is essential for detecting changes in velocity, allowing it to be integrated into various technologies, including mobile devices and inertial measurement units. Accelerometers play a key role in combining data from different sensors to improve accuracy in localization and tracking systems.
Accuracy: Accuracy refers to the degree to which a measurement or calculation reflects the true value or position of an object in a given system. In augmented and virtual reality, accuracy is crucial for creating realistic experiences, ensuring that user interactions align precisely with visual and auditory feedback.
Complementary filter: A complementary filter is a data processing technique used to combine multiple sensor readings, particularly from inertial measurement units (IMUs), to produce a more accurate estimate of orientation or position. This filter effectively merges the fast, short-term responsiveness of gyroscope data with the stable, long-term accuracy of accelerometer data, providing a balanced output that mitigates the drawbacks of each individual sensor.
Data fusion: Data fusion is the process of integrating multiple sources of data to produce more consistent, accurate, and useful information than that provided by any single source alone. This technique combines data from various sensors, such as inertial measurement units (IMUs), to enhance the reliability and precision of information used in applications like navigation and motion tracking.
Drift: Drift refers to the gradual deviation of a virtual object's position from its intended location within an augmented or virtual environment. This can occur due to inaccuracies in tracking or sensor data, leading to a disconnection between the user's perception and the actual spatial anchor points in the digital space. It is essential to understand how drift impacts user experience, particularly in applications involving world-locked content and sensor technology.
Environment Mapping: Environment mapping is a technique used in computer graphics to create a realistic representation of a scene by simulating the reflection and refraction of light on surfaces. This process involves capturing the surrounding environment and applying it to the surfaces of 3D objects, enhancing their realism in virtual spaces. By incorporating this technique, immersive experiences can be developed that react dynamically to user movements and interactions.
Gestural Recognition: Gestural recognition refers to the ability of a system to interpret and respond to human gestures, typically through the use of sensors and cameras. This technology is fundamental in enhancing user interaction with augmented and virtual reality environments, allowing for intuitive control and engagement. By recognizing hand movements, body posture, and facial expressions, gestural recognition enables a seamless experience in immersive applications.
Gyroscope: A gyroscope is a device that uses the principles of angular momentum to measure or maintain orientation and angular velocity. By detecting changes in motion, gyroscopes are essential in enhancing the accuracy of navigation and stabilization systems, particularly when integrated with other sensors.
IEEE 802.15.4: IEEE 802.15.4 is a standard for low-rate wireless personal area networks (LR-WPANs) that defines the physical layer and media access control for devices with low power and low data rate requirements. This standard is crucial in enabling communication for various applications, particularly in wireless sensor networks, smart devices, and the Internet of Things (IoT), and it connects seamlessly with both optical tracking systems and inertial measurement units.
Kalman Filter: The Kalman Filter is an algorithm that uses a series of measurements observed over time, containing noise and other inaccuracies, to produce estimates of unknown variables that tend to be more precise than those based on a single measurement. It plays a crucial role in sensor fusion by providing a method to combine various data sources effectively, enhancing the accuracy of localization and tracking in dynamic environments.
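
To make the recursion concrete, here is a minimal one-dimensional Kalman-filter sketch (a constant-value model with noisy measurements). The noise variances and data are illustrative assumptions; orientation tracking in practice uses the multivariate or extended form.

```python
def kalman_1d(measurements, process_var=1e-4, meas_var=0.25):
    """Minimal scalar Kalman filter: estimate a slowly varying quantity
    (e.g. one orientation angle) from noisy measurements."""
    x, p = 0.0, 1.0                 # initial estimate and its variance
    for z in measurements:
        p += process_var            # predict: uncertainty grows over time
        k = p / (p + meas_var)      # Kalman gain weighs prediction vs. measurement
        x += k * (z - x)            # update the estimate toward the measurement
        p *= (1.0 - k)              # updated uncertainty shrinks
    return x

# Noisy readings of a true value of 1.0 (made-up data)
print(kalman_1d([1.2, 0.8, 1.1, 0.9, 1.05]))
```
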
Latency: Latency refers to the time delay between an action and the corresponding response in a system, which is especially critical in augmented and virtual reality applications. High latency can lead to noticeable delays between user input and system output, causing a disconnect that may disrupt the immersive experience.
Magnetometer: A magnetometer is a scientific instrument used to measure the strength and direction of magnetic fields. In the context of localization and orientation tracking, it plays a crucial role in sensor fusion by providing accurate magnetic field data, which can help determine an object's position and movement relative to the Earth's magnetic field.
Motion tracking: Motion tracking is a technology that captures the movement of objects or users in real-time, translating those movements into data that can be used in virtual and augmented environments. This capability is essential for creating immersive experiences, as it allows the digital content to respond accurately to the user's actions and surroundings.
Noise Reduction: Noise reduction refers to techniques and processes used to minimize unwanted disturbances that can interfere with the accuracy of data captured by sensors, especially in inertial measurement units (IMUs). Effective noise reduction is crucial for enhancing the quality of data being processed, leading to more reliable sensor fusion outcomes, which ultimately supports better navigation and tracking in augmented and virtual reality systems.
ROS: ROS, or Robot Operating System, is an open-source framework that provides a collection of software libraries and tools for building robot applications. It serves as a middleware layer, facilitating communication between different components of robotic systems, including inertial measurement units (IMUs) and various sensors. By simplifying the integration of hardware and software, ROS enhances the capabilities of sensor fusion algorithms that rely on data from multiple sources to improve navigation and positioning.
Sensor calibration: Sensor calibration is the process of adjusting and fine-tuning a sensor's output to ensure that it accurately reflects the physical quantity it measures. This process is essential for improving the accuracy of measurements and enhancing the overall performance of systems that rely on sensors, especially in applications requiring precise localization and integration with inertial measurement units (IMUs). Accurate sensor calibration leads to better data fusion, enabling more reliable tracking and positioning in augmented and virtual reality environments.
Sensor integration: Sensor integration refers to the process of combining data from multiple sensors to produce a more accurate and comprehensive understanding of an environment or system. This technique is crucial for improving the performance of applications like augmented and virtual reality, where precise tracking and interaction with virtual elements rely on the seamless fusion of data from different sources, such as inertial measurement units (IMUs) and other sensors.