---
name: slam-algorithms
description: Expert-level skill for SLAM (Simultaneous Localization and Mapping) algorithm selection, configuration, and tuning. Configures visual SLAM (ORB-SLAM3, RTAB-Map) and lidar SLAM (Cartographer, LIO-SAM), tunes parameters, evaluates accuracy, and optimizes real-time performance.
allowed-tools: Bash(*), Read, Write, Edit, Glob, Grep, WebFetch
metadata:
  author: babysitter-sdk
  version: "1.0.0"
  category: perception
  backlog-id: SK-007
---
# slam-algorithms

You are slam-algorithms - an expert-level skill specialized in selecting, configuring, and tuning SLAM (Simultaneous Localization and Mapping) algorithms.
## Overview

This skill supports AI-driven SLAM implementation, including:

- Configuring ORB-SLAM3 for monocular, stereo, and RGB-D cameras
- Setting up RTAB-Map for visual and lidar SLAM
- Configuring Google Cartographer for 2D and 3D SLAM
- Implementing LIO-SAM and LeGO-LOAM for lidar-inertial SLAM
- Tuning feature detection and matching parameters
- Configuring loop closure detection and optimization
- Setting up IMU preintegration and visual-inertial fusion
- Optimizing real-time performance
- Evaluating SLAM accuracy (ATE, RPE metrics)
- Configuring map saving and loading
## Prerequisites

- ROS/ROS2 with SLAM packages installed
- Camera calibration (intrinsics and extrinsics)
- IMU calibration when using VI-SLAM
- Adequate compute resources (a GPU is recommended for visual SLAM)
## Capabilities

### 1. ORB-SLAM3 Configuration

Configure ORB-SLAM3 for different sensor setups:
```yaml
# orb_slam3_config.yaml
%YAML:1.0

# Camera parameters (monocular/stereo)
Camera.type: "PinHole"
Camera.fx: 458.654
Camera.fy: 457.296
Camera.cx: 367.215
Camera.cy: 248.375
Camera.k1: -0.28340811
Camera.k2: 0.07395907
Camera.p1: 0.00019359
Camera.p2: 1.76187114e-05

# Camera resolution
Camera.width: 752
Camera.height: 480
Camera.fps: 20.0

# Stereo parameters
Camera.bf: 47.90639384423901  # baseline * fx

# RGB-D parameters
DepthMapFactor: 1.0
ThDepth: 35.0  # depth threshold

# ORB extractor
ORBextractor.nFeatures: 1200
ORBextractor.scaleFactor: 1.2
ORBextractor.nLevels: 8
ORBextractor.iniThFAST: 20
ORBextractor.minThFAST: 7

# IMU parameters (for VI-SLAM)
IMU.NoiseGyro: 1.7e-4
IMU.NoiseAcc: 2.0e-3
IMU.GyroWalk: 1.9e-5
IMU.AccWalk: 3.0e-3
IMU.Frequency: 200

# Viewer parameters
Viewer.KeyFrameSize: 0.05
Viewer.KeyFrameLineWidth: 1
Viewer.GraphLineWidth: 0.9
Viewer.PointSize: 2
Viewer.CameraSize: 0.08
Viewer.CameraLineWidth: 3
Viewer.ViewpointX: 0
Viewer.ViewpointY: -0.7
Viewer.ViewpointZ: -1.8
Viewer.ViewpointF: 500
```
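As a quick cross-check of the stereo/RGB-D settings above: `Camera.bf` is the stereo baseline expressed in pixel units (baseline × fx), and ORB-SLAM3 treats points closer than `bf * ThDepth / fx` meters as reliable "close" points. A minimal sketch using the values from this config (the baseline is recovered from `Camera.bf`, not independently measured):

```python
# Sanity-check sketch relating ORB-SLAM3's stereo parameters.
# Values are taken from the config above.
fx = 458.654
bf = 47.90639384423901   # Camera.bf = baseline * fx (pixel units)
ThDepth = 35.0

baseline_m = bf / fx                      # stereo baseline in meters
close_point_limit_m = bf * ThDepth / fx   # close/far point cutoff in meters

print(f"baseline: {baseline_m:.4f} m")
print(f"close-point limit: {close_point_limit_m:.2f} m")
```

If the printed baseline does not match the physically measured baseline of your rig, the calibration or `Camera.bf` entry is wrong.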
Launch ORB-SLAM3:

```bash
# Monocular
ros2 run orb_slam3_ros orb_slam3_mono \
  --ros-args -p vocabulary:=/path/to/ORBvoc.txt \
  -p settings:=/path/to/config.yaml \
  -r /camera/image_raw:=/robot/camera/image_raw

# Stereo
ros2 run orb_slam3_ros orb_slam3_stereo \
  --ros-args -p vocabulary:=/path/to/ORBvoc.txt \
  -p settings:=/path/to/stereo_config.yaml

# RGB-D
ros2 run orb_slam3_ros orb_slam3_rgbd \
  --ros-args -p vocabulary:=/path/to/ORBvoc.txt \
  -p settings:=/path/to/rgbd_config.yaml

# Stereo-inertial
ros2 run orb_slam3_ros orb_slam3_stereo_inertial \
  --ros-args -p vocabulary:=/path/to/ORBvoc.txt \
  -p settings:=/path/to/stereo_inertial_config.yaml
```
### 2. RTAB-Map Configuration

Configure RTAB-Map for RGB-D and lidar SLAM:
```yaml
# rtabmap_params.yaml
rtabmap:
  ros__parameters:
    # Database
    database_path: ""

    # Detection
    Rtabmap/DetectionRate: "1.0"
    Rtabmap/TimeThr: "0.0"

    # Memory
    Mem/IncrementalMemory: "true"
    Mem/STMSize: "30"
    Mem/RehearsalSimilarity: "0.6"

    # Visual features
    Vis/FeatureType: "6"  # ORB
    Vis/MaxFeatures: "500"
    Vis/MinInliers: "20"
    Vis/InlierDistance: "0.1"

    # Loop closure
    RGBD/LoopClosureReextractFeatures: "true"
    RGBD/OptimizeFromGraphEnd: "false"
    RGBD/ProximityBySpace: "true"

    # Lidar ICP
    Reg/Strategy: "1"  # 0=Vis, 1=ICP, 2=VisIcp
    Icp/PointToPlane: "true"
    Icp/Iterations: "30"
    Icp/VoxelSize: "0.05"
    Icp/MaxCorrespondenceDistance: "0.1"

    # Graph optimization
    Optimizer/Strategy: "1"  # g2o
    Optimizer/Iterations: "20"

    # Mapping
    Grid/CellSize: "0.05"
    Grid/RangeMax: "5.0"
    Grid/RayTracing: "true"
    Grid/3D: "true"

rgbd_odometry:
  ros__parameters:
    frame_id: "base_link"
    odom_frame_id: "odom"
    publish_tf: true
    Odom/Strategy: "0"  # Frame-to-Map
    Odom/ResetCountdown: "1"
    Vis/CorType: "0"  # feature matching
```
Launch RTAB-Map:

```python
from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    return LaunchDescription([
        # RGB-D odometry
        Node(
            package='rtabmap_odom',
            executable='rgbd_odometry',
            output='screen',
            parameters=[{
                'frame_id': 'base_link',
                'odom_frame_id': 'odom',
                'subscribe_rgbd': True,
                'approx_sync': True,
            }],
            remappings=[
                ('rgbd_image', '/camera/rgbd'),
            ]
        ),
        # RTAB-Map SLAM
        Node(
            package='rtabmap_slam',
            executable='rtabmap',
            output='screen',
            parameters=[{
                'subscribe_rgbd': True,
                'subscribe_scan': True,
                'approx_sync': True,
                'frame_id': 'base_link',
                'map_frame_id': 'map',
                'odom_frame_id': 'odom',
                'queue_size': 10,
            }],
            remappings=[
                ('rgbd_image', '/camera/rgbd'),
                ('scan', '/lidar/scan'),
            ]
        ),
        # Visualization
        Node(
            package='rtabmap_viz',
            executable='rtabmap_viz',
            output='screen',
            parameters=[{
                'subscribe_rgbd': True,
                'subscribe_scan': True,
            }],
        )
    ])
```
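One pitfall when moving between the YAML above and a Python launch file: RTAB-Map's own library parameters (the `Group/Name` keys such as `Vis/MaxFeatures`) must be passed as strings, while plain ROS parameters (`frame_id`, `queue_size`, ...) keep their native types. A small sketch of a normalizer (the helper name is hypothetical, not part of any RTAB-Map API):

```python
# Hypothetical helper: normalize a mixed parameter dict for RTAB-Map nodes.
# RTAB-Map internal parameters use the "Group/Name" convention and must be
# strings; ordinary ROS parameters keep native types.
def normalize_rtabmap_params(params):
    out = {}
    for key, value in params.items():
        if "/" in key:  # RTAB-Map library parameter
            out[key] = str(value).lower() if isinstance(value, bool) else str(value)
        else:           # standard ROS parameter, leave the type alone
            out[key] = value
    return out

print(normalize_rtabmap_params({
    "Vis/MaxFeatures": 500,
    "Mem/IncrementalMemory": True,
    "queue_size": 10,
}))
```

Passing `500` (int) instead of `"500"` for a `Group/Name` key is a common source of node startup errors.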
### 3. Google Cartographer Configuration

Configure Cartographer for 2D and 3D SLAM:
```lua
-- cartographer_2d.lua
include "map_builder.lua"
include "trajectory_builder.lua"

options = {
  map_builder = MAP_BUILDER,
  trajectory_builder = TRAJECTORY_BUILDER,
  map_frame = "map",
  tracking_frame = "imu_link",
  published_frame = "base_link",
  odom_frame = "odom",
  provide_odom_frame = true,
  publish_frame_projected_to_2d = false,
  use_pose_extrapolator = true,
  use_odometry = false,
  use_nav_sat = false,
  use_landmarks = false,
  num_laser_scans = 1,
  num_multi_echo_laser_scans = 0,
  num_subdivisions_per_laser_scan = 1,
  num_point_clouds = 0,
  lookup_transform_timeout_sec = 0.2,
  submap_publish_period_sec = 0.3,
  pose_publish_period_sec = 5e-3,
  trajectory_publish_period_sec = 30e-3,
  rangefinder_sampling_ratio = 1.,
  odometry_sampling_ratio = 1.,
  fixed_frame_pose_sampling_ratio = 1.,
  imu_sampling_ratio = 1.,
  landmarks_sampling_ratio = 1.,
}

MAP_BUILDER.use_trajectory_builder_2d = true

TRAJECTORY_BUILDER_2D.submaps.num_range_data = 35
TRAJECTORY_BUILDER_2D.min_range = 0.3
TRAJECTORY_BUILDER_2D.max_range = 30.
TRAJECTORY_BUILDER_2D.missing_data_ray_length = 1.
TRAJECTORY_BUILDER_2D.use_imu_data = true
TRAJECTORY_BUILDER_2D.use_online_correlative_scan_matching = true
TRAJECTORY_BUILDER_2D.real_time_correlative_scan_matcher.linear_search_window = 0.1
TRAJECTORY_BUILDER_2D.real_time_correlative_scan_matcher.translation_delta_cost_weight = 10.
TRAJECTORY_BUILDER_2D.real_time_correlative_scan_matcher.rotation_delta_cost_weight = 1e-1

POSE_GRAPH.optimization_problem.huber_scale = 5e2
POSE_GRAPH.optimize_every_n_nodes = 35
POSE_GRAPH.constraint_builder.sampling_ratio = 0.03
POSE_GRAPH.constraint_builder.max_constraint_distance = 15.
POSE_GRAPH.constraint_builder.min_score = 0.55
POSE_GRAPH.constraint_builder.global_localization_min_score = 0.6

return options
```
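A useful back-of-envelope check when tuning `linear_search_window` above: the real-time correlative scan matcher can only recover motion that falls inside the search window, so the window size together with the scan rate bounds how fast the robot may move between scans. The sketch below assumes a 40 Hz scan rate, which is not part of the config:

```python
# Back-of-envelope sketch (not a Cartographer API): the search window bounds
# the inter-scan motion the real-time correlative matcher can absorb.
linear_search_window_m = 0.1   # from the config above
scan_rate_hz = 40.0            # assumed lidar scan rate

max_speed_m_s = linear_search_window_m * scan_rate_hz
print(f"max inter-scan motion the matcher can absorb: {max_speed_m_s:.1f} m/s")
```

If the robot routinely exceeds this speed, widen the window (at a CPU cost) or rely more on IMU/odometry priors.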
For 3D SLAM with a 3D lidar:

```lua
-- cartographer_3d.lua
include "map_builder.lua"
include "trajectory_builder.lua"

options = {
  map_builder = MAP_BUILDER,
  trajectory_builder = TRAJECTORY_BUILDER,
  map_frame = "map",
  tracking_frame = "imu_link",
  published_frame = "base_link",
  odom_frame = "odom",
  provide_odom_frame = true,
  publish_frame_projected_to_2d = false,
  use_pose_extrapolator = true,
  use_odometry = false,
  use_nav_sat = false,
  use_landmarks = false,
  num_laser_scans = 0,
  num_multi_echo_laser_scans = 0,
  num_subdivisions_per_laser_scan = 1,
  num_point_clouds = 1,
  lookup_transform_timeout_sec = 0.2,
  submap_publish_period_sec = 0.3,
  pose_publish_period_sec = 5e-3,
  trajectory_publish_period_sec = 30e-3,
  rangefinder_sampling_ratio = 1.,
  odometry_sampling_ratio = 1.,
  fixed_frame_pose_sampling_ratio = 1.,
  imu_sampling_ratio = 1.,
  landmarks_sampling_ratio = 1.,
}

MAP_BUILDER.use_trajectory_builder_3d = true

TRAJECTORY_BUILDER_3D.num_accumulated_range_data = 1
TRAJECTORY_BUILDER_3D.min_range = 1.
TRAJECTORY_BUILDER_3D.max_range = 100.
TRAJECTORY_BUILDER_3D.voxel_filter_size = 0.15
TRAJECTORY_BUILDER_3D.high_resolution_adaptive_voxel_filter.max_length = 2.
TRAJECTORY_BUILDER_3D.low_resolution_adaptive_voxel_filter.max_length = 4.
TRAJECTORY_BUILDER_3D.use_online_correlative_scan_matching = false
TRAJECTORY_BUILDER_3D.ceres_scan_matcher.translation_weight = 5.
TRAJECTORY_BUILDER_3D.ceres_scan_matcher.rotation_weight = 4e2
TRAJECTORY_BUILDER_3D.submaps.high_resolution = 0.10
TRAJECTORY_BUILDER_3D.submaps.low_resolution = 0.45

return options
```
### 4. LIO-SAM Configuration

Configure LIO-SAM for lidar-inertial SLAM:
```yaml
# lio_sam_params.yaml
lio_sam:
  ros__parameters:
    # Topics
    pointCloudTopic: "points_raw"
    imuTopic: "imu_raw"
    odomTopic: "odometry/imu"
    gpsTopic: "gps/fix"

    # Frames
    lidarFrame: "base_link"
    baselinkFrame: "base_link"
    odometryFrame: "odom"
    mapFrame: "map"

    # GPS settings
    useImuHeadingInitialization: true
    useGpsElevation: false
    gpsCovThreshold: 2.0
    poseCovThreshold: 25.0

    # Export settings
    savePCD: false
    savePCDDirectory: "/Downloads/LOAM/"

    # Sensor settings
    sensor: velodyne  # velodyne, ouster, livox
    N_SCAN: 16
    Horizon_SCAN: 1800
    downsampleRate: 1
    lidarMinRange: 1.0
    lidarMaxRange: 100.0

    # IMU settings
    imuAccNoise: 3.9939570888238808e-03
    imuGyrNoise: 1.5636343949698187e-03
    imuAccBiasN: 6.4356659353532566e-05
    imuGyrBiasN: 3.5640318696367613e-05
    imuGravity: 9.80511
    imuRPYWeight: 0.01

    # Extrinsics
    extrinsicTrans: [0.0, 0.0, 0.0]
    extrinsicRot: [-1, 0, 0, 0, 1, 0, 0, 0, -1]

    # LOAM feature thresholds
    edgeThreshold: 1.0
    surfThreshold: 0.1
    edgeFeatureMinValidNum: 10
    surfFeatureMinValidNum: 100

    # Voxel filter parameters
    odometrySurfLeafSize: 0.4
    mappingCornerLeafSize: 0.2
    mappingSurfLeafSize: 0.4

    # Loop closure
    loopClosureEnableFlag: true
    loopClosureFrequency: 1.0
    surroundingKeyframeSize: 50
    historyKeyframeSearchRadius: 15.0
    historyKeyframeSearchTimeDiff: 30.0
    historyKeyframeSearchNum: 25
    historyKeyframeFitnessScore: 0.3

    # Optimization
    z_tollerance: 1000.0          # parameter names spelled as in LIO-SAM
    rotation_tollerance: 1000.0
    numberOfCores: 4
    mappingProcessInterval: 0.15
    surroundingKeyframeDensity: 2.0
    surroundingKeyframeSearchRadius: 50.0
```
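A bad `extrinsicRot` is one of the most common LIO-SAM failure modes, and it is easy to check before launch: a valid rotation matrix must be orthonormal with determinant +1. A self-contained sketch using the row-major matrix from the config above (pure Python, no dependencies):

```python
# Sanity-check sketch for extrinsicRot (flattened row-major 3x3).
rot = [-1, 0, 0, 0, 1, 0, 0, 0, -1]   # extrinsicRot from the config above
R = [rot[0:3], rot[3:6], rot[6:9]]

def matmul3(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def transpose3(A):
    return [[A[j][i] for j in range(3)] for i in range(3)]

def det3(A):
    return (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
          - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
          + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))

identity = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
assert matmul3(R, transpose3(R)) == identity, "extrinsicRot is not orthonormal"
assert det3(R) == 1, "extrinsicRot is not a proper rotation (det != +1)"
print("extrinsicRot is a valid rotation (a 180-degree turn about the y-axis)")
```

A determinant of -1 would indicate a reflection, typically from a sign error when transcribing the IMU axes.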
### 5. SLAM Accuracy Evaluation

Evaluate SLAM accuracy with the evo toolkit:
```bash
# Install evo
pip install evo

# Compute absolute trajectory error (ATE)
evo_ape tum groundtruth.txt estimated.txt -va --plot --save_results results/ate.zip

# Compute relative pose error (RPE)
evo_rpe tum groundtruth.txt estimated.txt -va --delta 1 --delta_unit m --plot

# Compare multiple trajectories
evo_traj tum groundtruth.txt orb_slam.txt rtabmap.txt cartographer.txt -va --plot

# Generate trajectory statistics
evo_res results/*.zip --use_filenames -p --save_table results/comparison.csv
```
Python evaluation:

```python
from evo.core import metrics, sync
from evo.tools import file_interface


def evaluate_slam_accuracy(groundtruth_file, estimated_file):
    """Evaluate SLAM accuracy with ATE and RPE metrics."""
    # Load trajectories (TUM format)
    traj_ref = file_interface.read_tum_trajectory_file(groundtruth_file)
    traj_est = file_interface.read_tum_trajectory_file(estimated_file)

    # Associate poses by timestamp and align the estimate to the reference
    traj_ref, traj_est = sync.associate_trajectories(traj_ref, traj_est)
    traj_est.align(traj_ref)

    # Compute ATE (absolute pose error on the translation part)
    ape_metric = metrics.APE(metrics.PoseRelation.translation_part)
    ape_metric.process_data((traj_ref, traj_est))
    ape_stats = ape_metric.get_all_statistics()
    print(f"ATE RMSE: {ape_stats['rmse']:.4f} m")
    print(f"ATE Mean: {ape_stats['mean']:.4f} m")
    print(f"ATE Std:  {ape_stats['std']:.4f} m")

    # Compute RPE (relative pose error per meter traveled)
    rpe_metric = metrics.RPE(metrics.PoseRelation.translation_part,
                             delta=1.0, delta_unit=metrics.Unit.meters)
    rpe_metric.process_data((traj_ref, traj_est))
    rpe_stats = rpe_metric.get_all_statistics()
    print(f"RPE RMSE: {rpe_stats['rmse']:.4f} m")

    return {
        'ate_rmse': ape_stats['rmse'],
        'ate_mean': ape_stats['mean'],
        'rpe_rmse': rpe_stats['rmse'],
    }
```
### 6. Map Saving and Loading

Save and load SLAM maps:
```bash
# RTAB-Map
# Save the map database
ros2 service call /rtabmap/pause std_srvs/srv/Empty
ros2 service call /rtabmap/backup std_srvs/srv/Empty

# Load a map for localization
ros2 run rtabmap_slam rtabmap \
  --ros-args -p database_path:=/path/to/map.db \
  -p Mem/IncrementalMemory:=false \
  -p Mem/InitWMWithAllNodes:=true

# Cartographer
# Save the map
ros2 service call /write_state cartographer_ros_msgs/srv/WriteState \
  "{filename: '/path/to/map.pbstream'}"

# Load the map
ros2 run cartographer_ros cartographer_node \
  -configuration_directory /path/to/config \
  -configuration_basename localization.lua \
  -load_state_filename /path/to/map.pbstream

# Export a 2D occupancy grid
ros2 run nav2_map_server map_saver_cli -f /path/to/map
```
## MCP Server Integration

This skill can leverage the following MCP servers for enhanced capabilities:

| Server | Description | Reference |
|---|---|---|
| ros-mcp-server | ROS topic/service access | GitHub |
| ROSBag MCP | Bag file analysis | arXiv |
## Best Practices

- **Calibrate properly** - accurate camera and IMU calibration is essential
- **Tune features** - adapt feature extraction to the environment (texture-poor, dynamic)
- **Use loop closure** - enable and tune it for large environments
- **Monitor in real time** - track tracking quality and relocalization events
- **Manage maps** - implement map saving for long-term autonomy
- **Fuse sensors** - combine vision and lidar for robust performance
## Process Integration

This skill integrates with the following processes:

- `visual-slam-implementation.js` - visual SLAM setup
- `lidar-mapping-localization.js` - lidar SLAM configuration
- `autonomous-exploration.js` - exploration SLAM
- `sensor-fusion-framework.js` - multi-sensor SLAM
## Output Format

When performing operations, provide structured output:

```json
{
  "operation": "configure-slam",
  "algorithm": "rtabmap",
  "sensorConfig": "rgbd+lidar",
  "status": "success",
  "configuration": {
    "featureType": "ORB",
    "loopClosure": true,
    "optimizationStrategy": "g2o"
  },
  "artifacts": [
    "config/rtabmap_params.yaml",
    "launch/slam.launch.py"
  ],
  "estimatedPerformance": {
    "updateRate": "10 Hz",
    "memoryUsage": "moderate"
  }
}
```
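Downstream tooling that consumes this output can verify the required top-level fields before acting on it. A minimal sketch (the helper and the required-field set are illustrative, not part of the skill framework):

```python
import json

# Top-level fields every structured output above should carry (illustrative set).
REQUIRED_FIELDS = {"operation", "algorithm", "sensorConfig", "status"}

def validate_output(payload: str) -> dict:
    """Parse a structured-output payload and check required fields."""
    doc = json.loads(payload)
    missing = REQUIRED_FIELDS - doc.keys()
    if missing:
        raise ValueError(f"missing fields: {sorted(missing)}")
    return doc

result = validate_output('{"operation": "configure-slam", "algorithm": "rtabmap", '
                         '"sensorConfig": "rgbd+lidar", "status": "success"}')
print(result["status"])
```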
## Constraints

- Validate sensor calibration before SLAM deployment
- Monitor CPU/GPU usage to maintain real-time performance
- Test loop closure in controlled environments first
- Verify map quality before autonomous navigation
- Account for environmental factors (lighting, texture, dynamic objects)