LiDAR Robot Navigation

LiDAR robots navigate using a combination of localization, mapping, and path planning. This article outlines these concepts and demonstrates how they work together, using an example in which a robot navigates to a goal within a row of plants.

LiDAR sensors have modest power demands, which helps extend a robot's battery life, and they deliver range data in a form that localization algorithms can consume with little preprocessing. This keeps the computational load of SLAM manageable on embedded hardware.

LiDAR Sensors

At the core of a lidar system is its sensor, which emits pulses of laser light into the surroundings. These pulses reflect off nearby objects, and the sensor measures how long each pulse takes to return, using that time of flight to calculate distance. The sensor is typically mounted on a rotating platform, allowing it to sweep the surrounding area rapidly, often at rates on the order of 10,000 samples per second.
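The time-of-flight principle can be sketched in a few lines of Python. This is an illustrative calculation, not any particular sensor's firmware; the pulse timing below is an invented example value.

```python
# Time-of-flight ranging sketch: distance is half the round-trip
# time multiplied by the speed of light (the pulse travels out
# and back).
SPEED_OF_LIGHT = 299_792_458.0  # metres per second

def tof_distance(round_trip_seconds: float) -> float:
    """Return the range in metres for one returned pulse."""
    return round_trip_seconds * SPEED_OF_LIGHT / 2.0

# A pulse returning after ~66.7 nanoseconds hit an object roughly
# 10 metres away.
print(round(tof_distance(66.7e-9), 2))  # prints 10.0
```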

LiDAR sensors are classified by the platform they are designed for: in the air or on land. Airborne lidar systems are typically attached to helicopters, aircraft, or unmanned aerial vehicles (UAVs). Terrestrial LiDAR systems are generally mounted on a robot or other ground platform.

To measure distances accurately, the sensor must always know the robot's exact location. This information is recorded by a combination of an inertial measurement unit (IMU), GPS, and time-keeping electronics. LiDAR systems use these sensors to determine the precise position of the scanner in space and time, and this information is then used to create a 3D representation of the surroundings.

LiDAR scanners can also distinguish different surface types, which is particularly useful when mapping environments with dense vegetation. For instance, when a pulse passes through a forest canopy, it commonly registers multiple returns: the first is typically attributed to the tops of the trees, while the last is attributed to the ground surface. If the sensor records each of these returns as a distinct measurement, it is referred to as discrete-return LiDAR.

Discrete-return scanning is also useful for analyzing surface structure. For instance, a forested area could yield a sequence of first, second, and third returns, with a final, strong pulse representing the bare ground. The ability to separate these returns and store them as a point cloud allows for the creation of detailed terrain models.
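Separating discrete returns can be sketched as follows. The pulse data here is invented for illustration; real point-cloud formats (e.g. LAS) store a return number and return count per point, but the idea is the same: first returns tend to be canopy, last returns tend to be ground.

```python
# Illustrative discrete-return separation (invented data).
# Each pulse records a list of return ranges, nearest first.
pulses = [
    [12.1, 14.8, 17.3],  # canopy, understorey, ground
    [16.9],              # open ground, single return
    [11.8, 17.1],        # canopy, ground
]

# First returns from multi-return pulses approximate the canopy;
# last returns from every pulse approximate the bare ground.
canopy = [p[0] for p in pulses if len(p) > 1]
ground = [p[-1] for p in pulses]

print(canopy)  # prints [12.1, 11.8]
print(ground)  # prints [17.3, 16.9, 17.1]
```

Feeding the `ground` points into a terrain model and the `canopy` points into a vegetation-height model is the essence of how discrete-return data supports detailed terrain mapping.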

Once a 3D model of the environment is built, the robot is equipped to navigate. This process involves localization and planning a path to a specific navigation goal. It also involves dynamic obstacle detection: the process that detects new obstacles not present in the original map and updates the plan of travel accordingly.

SLAM Algorithms

SLAM (simultaneous localization and mapping) is an algorithm that allows your robot to build a map of its surroundings and then determine its own position relative to that map. Engineers use this information for a variety of tasks, such as route planning and obstacle detection.

For SLAM to function, your robot needs a range-measurement device (such as a laser scanner or camera), a computer with the appropriate software to process the data, and ideally an inertial measurement unit (IMU) to provide basic information about its motion. With these components, the system can track the robot's precise location in an unknown environment.

A SLAM system is complex, and a myriad of back-end options exist. Whichever solution you choose, effective SLAM requires constant interaction between the range-measurement device, the software that extracts the data, and the robot or vehicle itself. It is a dynamic process with almost unlimited variability.

As the robot moves, it adds scans to its map. The SLAM algorithm compares each new scan with previous ones using a process called scan matching, which also allows loop closures to be identified. When a loop closure is detected, the SLAM algorithm updates its estimated robot trajectory.
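The idea behind scan matching can be shown with a deliberately tiny sketch: search for the translation that best aligns a new 2-D scan with a reference scan. Production systems use ICP or correlative matching with rotation as well; the points and search grid here are invented for illustration.

```python
# Toy scan matching: brute-force search over translations that
# minimise the sum of squared distances from each shifted scan
# point to its nearest reference point.
def score(scan, reference, dx, dy):
    total = 0.0
    for (x, y) in scan:
        total += min((x + dx - rx) ** 2 + (y + dy - ry) ** 2
                     for (rx, ry) in reference)
    return total

reference = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]   # previous scan of a wall
scan = [(0.5, 0.2), (1.5, 0.2), (2.5, 0.2)]        # same wall, robot moved

steps = [i / 10 for i in range(-10, 11)]           # 10 cm search grid
best = min(((dx, dy) for dx in steps for dy in steps),
           key=lambda t: score(scan, reference, *t))
print(best)  # prints (-0.5, -0.2): the robot moved +0.5 m, +0.2 m
```

The recovered translation is the negative of the robot's motion between scans, which is exactly the correction a SLAM back end feeds into its trajectory estimate.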

A further factor that complicates SLAM is that the surroundings change over time. For instance, if a robot travels down an empty aisle at one point and then encounters pallets in the same place later, it may be unable to reconcile the two observations in its map. Dynamic handling is crucial in such scenarios and is built into many modern LiDAR SLAM algorithms.

Despite these limitations, SLAM systems are extremely effective for navigation and 3D scanning. They are particularly useful in environments where the robot cannot rely on GNSS-based positioning, such as an indoor factory floor. It is important to remember that even a properly configured SLAM system can accumulate errors; to correct them, you must be able to spot them and understand their effect on the SLAM process.

Mapping

The mapping function builds a map of the robot's environment: everything within the sensor's field of view. This map is used to aid localization, route planning, and obstacle detection. This is an area in which 3D lidars are extremely useful, since they act like a 3D camera rather than a sensor with only a single scan plane.

Map building can be a lengthy process, but it pays off in the end. A complete and coherent map of the robot's environment allows it to move with high precision and to navigate around obstacles.

As a general rule of thumb, the higher the sensor's resolution, the more accurate the map will be. Not all robots require high-resolution maps, however: a floor-sweeping robot may not need the same level of detail as an industrial robot navigating a large factory.
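The resolution trade-off is easy to quantify for an occupancy-grid map, the most common map representation: memory grows with the square of the side length divided by the cell size. The map sizes and cell sizes below are invented example figures.

```python
# Sketch of how grid resolution trades memory for detail, assuming
# a square occupancy grid (one cell per square of side `resolution_m`).
def grid_cells(side_m: float, resolution_m: float) -> int:
    cells_per_side = int(round(side_m / resolution_m))
    return cells_per_side ** 2

# A 10 m room at coarse 10 cm cells vs a 100 m factory floor at
# fine 5 cm cells: a 400x difference in cell count.
print(grid_cells(10.0, 0.10))   # prints 10000
print(grid_cells(100.0, 0.05))  # prints 4000000
```

This is why a floor sweeper can get by with a coarse grid while an industrial robot mapping a large factory needs substantially more memory and processing per update.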

To this end, a number of different mapping algorithms are available for use with LiDAR sensors. One of the best known is Cartographer, which employs a two-phase pose-graph optimization technique to correct for drift and produce a consistent global map. It is particularly effective when paired with odometry information.

Another option is GraphSLAM, which uses a system of linear equations to model the constraints of a pose graph. The constraints are encoded in an information matrix (O) and an information vector (X), where the entries of O link pairs of poses that share a constraint. A GraphSLAM update is a series of additions and subtractions on these matrix elements, and the end result is that both O and X are updated to reflect the robot's latest observations.
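The additions and subtractions can be made concrete with a minimal 1-D sketch. This is a simplified illustration of the information-form update, not any particular library's API; the world size, distances, and prior below are all invented.

```python
# GraphSLAM-style information update on a 1-D world: each
# constraint "pose j is d metres past pose i" adds and subtracts
# entries in the O matrix and X vector. Solving O x = X then
# recovers the pose estimates.
def add_constraint(O, X, i, j, d):
    O[i][i] += 1; O[j][j] += 1
    O[i][j] -= 1; O[j][i] -= 1
    X[i] -= d;    X[j] += d

def solve(A, b):
    # Naive Gaussian elimination; fine for a tiny dense system.
    n = len(b)
    A = [row[:] for row in A]; b = b[:]
    for k in range(n):
        for r in range(k + 1, n):
            f = A[r][k] / A[k][k]
            for c in range(k, n):
                A[r][c] -= f * A[k][c]
            b[r] -= f * b[k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (b[r] - sum(A[r][c] * x[c]
                           for c in range(r + 1, n))) / A[r][r]
    return x

n = 3
O = [[0.0] * n for _ in range(n)]
X = [0.0] * n

O[0][0] += 1                     # prior anchoring pose 0 at the origin
add_constraint(O, X, 0, 1, 5.0)  # pose 1 is 5 m past pose 0
add_constraint(O, X, 1, 2, 3.0)  # pose 2 is 3 m past pose 1

poses = solve(O, X)              # recovers approximately [0, 5, 8]
```

A loop-closure constraint would be just one more `add_constraint` call linking two distant poses, after which re-solving redistributes the correction along the whole trajectory.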

Another useful approach, often implemented as EKF-SLAM, combines odometry with mapping using an extended Kalman filter (EKF). The EKF updates not only the uncertainty in the robot's current position but also the uncertainty of the features observed by the sensor. The mapping function can then use this information to improve the position estimate, allowing it to update the underlying map.
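The predict/correct cycle at the heart of this approach can be shown with a deliberately simplified 1-D Kalman filter (a full EKF adds landmark states and linearization, but the uncertainty bookkeeping is the same idea). All noise values and measurements here are invented.

```python
# Simplified 1-D Kalman filter sketch: the position estimate and
# its variance are updated first by odometry (predict), then by a
# range measurement (correct).
def predict(x, var, motion, motion_var):
    # Moving adds the motion but also adds uncertainty.
    return x + motion, var + motion_var

def correct(x, var, z, meas_var):
    # Measuring pulls the estimate toward z and shrinks variance.
    k = var / (var + meas_var)          # Kalman gain
    return x + k * (z - x), (1 - k) * var

x, var = 0.0, 1.0
x, var = predict(x, var, motion=2.0, motion_var=0.5)  # odometry: +2 m
x, var = correct(x, var, z=2.2, meas_var=0.5)         # sensor: 2.2 m
print(round(x, 3), round(var, 3))  # prints 2.15 0.375
```

Note that the corrected variance (0.375) is smaller than either the predicted variance (1.5) or the measurement variance (0.5): fusing the two sources leaves the robot more certain than either alone.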

Obstacle Detection

A robot must be able to perceive its surroundings so that it can avoid obstacles and reach its goal. It uses sensors such as digital cameras, infrared scanners, sonar, and laser radar to sense its environment, and inertial sensors to monitor its position, speed, and orientation. Together these sensors help it navigate safely and prevent collisions.

A key element of this process is obstacle detection, which involves using a range sensor to determine the distance between the robot and nearby obstacles. The sensor can be mounted on the robot, a vehicle, or a fixed pole.
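A minimal version of this check is a safety-distance filter over the range readings. The scan format (bearing to range), threshold, and readings below are invented for illustration; a real system would also account for sensor noise and the robot's footprint.

```python
# Toy obstacle check: flag any range reading closer than a safety
# distance and report the bearing of the nearest one.
SAFETY_M = 0.5

def nearest_obstacle(ranges_m):
    """ranges_m maps bearing (degrees) to measured range (metres)."""
    hits = {b: r for b, r in ranges_m.items() if r < SAFETY_M}
    if not hits:
        return None                     # path is clear
    bearing = min(hits, key=hits.get)   # closest offending reading
    return bearing, hits[bearing]

scan = {0: 2.1, 45: 0.4, 90: 1.8, 135: 0.35}
print(nearest_obstacle(scan))  # prints (135, 0.35)
```

A path planner would react to this result by replanning around the reported bearing, which is the "update the plan of travel" step described above.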
