Add usage documentation

Rewrites README.md and adds documentation and tools for both RealSense and WMR devices.
Mateo de Mayo 2022-05-06 16:13:23 -03:00
parent 68c61eafbd
commit dd90a07092
5 changed files with 747 additions and 61 deletions

README.md

@@ -1,75 +1,155 @@
[![pipeline status](https://gitlab.com/VladyslavUsenko/basalt/badges/master/pipeline.svg)](https://gitlab.com/VladyslavUsenko/basalt/commits/master)
# Basalt for Monado
## Basalt
For more information see https://vision.in.tum.de/research/vslam/basalt
This is a fork of [Basalt](https://gitlab.com/VladyslavUsenko/basalt) with some
modifications so that it can be used from Monado for SLAM tracking. Many thanks
to the Basalt authors.
![teaser](doc/img/teaser.png)
Follow this file for instructions on how to get Basalt up and running with
Monado. This README tries to be as concise as possible, but there are many
details that need attention, so please do not skip any section; otherwise it is
likely that things won't work. Having said that, this guide has only been
tested on a limited number of setups, so please report any changes you had to
make to get it working on different ones.
This project contains tools for:
* Camera, IMU and motion capture calibration.
* Visual-inertial odometry and mapping.
* Simulated environment to test different components of the system.
Some reusable components of the system are available as a separate [header-only library](https://gitlab.com/VladyslavUsenko/basalt-headers) ([Documentation](https://vladyslavusenko.gitlab.io/basalt-headers/)).
There is also a [Github mirror](https://github.com/VladyslavUsenko/basalt-mirror) of this project to enable easy forking.
## Related Publications
Visual-Inertial Odometry and Mapping:
* **Visual-Inertial Mapping with Non-Linear Factor Recovery**, V. Usenko, N. Demmel, D. Schubert, J. Stückler, D. Cremers, In IEEE Robotics and Automation Letters (RA-L) [[DOI:10.1109/LRA.2019.2961227]](https://doi.org/10.1109/LRA.2019.2961227) [[arXiv:1904.06504]](https://arxiv.org/abs/1904.06504).
Calibration (explains implemented camera models):
* **The Double Sphere Camera Model**, V. Usenko and N. Demmel and D. Cremers, In 2018 International Conference on 3D Vision (3DV), [[DOI:10.1109/3DV.2018.00069]](https://doi.org/10.1109/3DV.2018.00069), [[arXiv:1807.08957]](https://arxiv.org/abs/1807.08957).
Calibration (demonstrates how these tools can be used for dataset calibration):
* **The TUM VI Benchmark for Evaluating Visual-Inertial Odometry**, D. Schubert, T. Goll, N. Demmel, V. Usenko, J. Stückler, D. Cremers, In 2018 International Conference on Intelligent Robots and Systems (IROS), [[DOI:10.1109/IROS.2018.8593419]](https://doi.org/10.1109/IROS.2018.8593419), [[arXiv:1804.06120]](https://arxiv.org/abs/1804.06120).
Calibration (describes B-spline trajectory representation used in camera-IMU calibration):
* **Efficient Derivative Computation for Cumulative B-Splines on Lie Groups**, C. Sommer, V. Usenko, D. Schubert, N. Demmel, D. Cremers, In 2020 Conference on Computer Vision and Pattern Recognition (CVPR), [[DOI:10.1109/CVPR42600.2020.01116]](https://doi.org/10.1109/CVPR42600.2020.01116), [[arXiv:1911.08860]](https://arxiv.org/abs/1911.08860).
Optimization (describes square-root optimization and marginalization used in VIO/VO):
* **Square Root Marginalization for Sliding-Window Bundle Adjustment**, N. Demmel, D. Schubert, C. Sommer, D. Cremers, V. Usenko, In 2021 International Conference on Computer Vision (ICCV), [[arXiv:2109.02182]](https://arxiv.org/abs/2109.02182)
## Index
- [Basalt for Monado](#basalt-for-monado)
  - [Index](#index)
  - [Installation](#installation)
    - [Build and Install Directories](#build-and-install-directories)
    - [Dependencies](#dependencies)
    - [Build Basalt](#build-basalt)
    - [Running Basalt](#running-basalt)
    - [Monado Specifics](#monado-specifics)
  - [Notes on Basalt Usage](#notes-on-basalt-usage)
  - [Using Real Hardware](#using-real-hardware)
## Installation
### APT installation for Ubuntu 20.04 and 18.04 (Fast)
Set up keys, add the repository to the sources list, update the Ubuntu package index and install Basalt:
```
sudo apt-key adv --keyserver hkp://keyserver.ubuntu.com:80 --recv-keys 0AD9A3000D97B6C9
sudo sh -c 'echo "deb [arch=amd64] http://packages.usenko.eu/ubuntu $(lsb_release -sc) $(lsb_release -sc)/main" > /etc/apt/sources.list.d/basalt.list'
sudo apt-get update
sudo apt-get dist-upgrade
sudo apt-get install basalt
```

This was tested on both Ubuntu 20.04 and 18.04; be sure to open an issue if the
steps don't work for you.
### Build and Install Directories
To avoid cluttering your system directories, let's set two environment variables,
`$bsltdeps` and `$bsltinstall`, that point to existing empty build and install
directories respectively. These directories will contain everything produced in
this guide besides the installed apt dependencies.
```bash
# Change the paths accordingly
export bsltinstall=/home/mateo/Documents/apps/bsltinstall
export bsltdeps=/home/mateo/Documents/apps/bsltdeps
```
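Both directories need to exist before continuing, for example:

```bash
mkdir -p "$bsltinstall" "$bsltdeps"
```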
### Source installation for Ubuntu >= 18.04 and MacOS >= 10.14 Mojave
Clone the source code for the project and build it. For MacOS you should have [Homebrew](https://brew.sh/) installed.
```
git clone --recursive https://gitlab.com/VladyslavUsenko/basalt.git
cd basalt
./scripts/install_deps.sh
mkdir build
cd build
cmake .. -DCMAKE_BUILD_TYPE=RelWithDebInfo
make -j8
```

Let's extend our system paths with those install directories.
```bash
export PATH=$bsltinstall/bin:$PATH
export PKG_CONFIG_PATH=$bsltinstall/lib/pkgconfig:$PKG_CONFIG_PATH # for compile time pkg-config
export LD_LIBRARY_PATH=$bsltinstall/lib/:$LD_LIBRARY_PATH # for runtime ld
export LIBRARY_PATH=$bsltinstall/lib/:$LIBRARY_PATH # for compile time gcc
```
## Usage
* [Camera, IMU and Mocap calibration. (TUM-VI, Euroc, UZH-FPV and Kalibr datasets)](doc/Calibration.md)
* [Visual-inertial odometry and mapping. (TUM-VI and Euroc datasets)](doc/VioMapping.md)
* [Visual odometry (no IMU). (KITTI dataset)](doc/Vo.md)
* [Simulation tools to test different components of the system.](doc/Simulation.md)
* [Batch evaluation tutorial (ICCV'21 experiments)](doc/BatchEvaluation.md)
### Dependencies
## Device support
* [Tutorial on Camera-IMU and Motion capture calibration with Realsense T265.](doc/Realsense.md)
Most dependencies will be built automatically by Basalt; however, there are some
known issues you might need to deal with (click to open the ones that might
affect you).
## Development
* [Development environment setup.](doc/DevSetup.md)
<details>
<summary>Issues with GCC 11</summary>
## Licence
The code is provided under a BSD 3-clause license. See the LICENSE file for details.
Note also the different licenses of thirdparty submodules.
If you are using GCC 11 you might also get some issues with Pangolin, as there is a
[name clash with Pangolin's `_serialize()` name](https://github.com/stevenlovegrove/Pangolin/issues/657);
it [should be fixed](https://gcc.gnu.org/bugzilla/show_bug.cgi?id=100438#c12)
in newer versions of GCC 11. To fix it yourself, you can cherry-pick
[these commits](https://github.com/stevenlovegrove/Pangolin/pull/658/commits),
or use a different GCC version (see
[this Discord thread](https://discord.com/channels/556527313823596604/556527314670714901/904339906288050196)
in the Monado server for more info).
</details>
Some improvements are ported back from the fork
[granite](https://github.com/DLR-RM/granite) (MIT license).
### Build Basalt
```bash
cd $bsltdeps
git clone --recursive git@gitlab.freedesktop.org:mateosss/basalt.git
./basalt/scripts/install_deps.sh
sed -i "s#/home/mateo/Documents/apps/bsltdeps/#$bsltdeps/#" basalt/data/monado/*.toml
cd basalt && mkdir build && cd build
cmake .. -DCMAKE_INSTALL_PREFIX=$bsltinstall -DCMAKE_BUILD_TYPE=RelWithDebInfo -DBUILD_TESTS=off -DBASALT_INSTANTIATIONS_DOUBLE=off
make install -j12
```
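To sanity-check the install (assuming the path exports from the previous sections are in effect), you can verify that the freshly built binaries are the ones your shell picks up:

```bash
# Both should resolve to paths inside $bsltinstall/bin
which basalt_vio
which basalt_calibrate
```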
### Running Basalt
This step is optional, but you can try Basalt without Monado with one of the following methods:
- Through an EuRoC dataset (be sure to [download
  one](http://robotics.ethz.ch/~asl-datasets/ijrr_euroc_mav_dataset/vicon_room1/)
  first; the command is also expanded into a copy-pasteable block after this list):
  `basalt_vio --dataset-path /path/to/euroc/V1_01_easy --cam-calib
  $bsltdeps/basalt/data/euroc_ds_calib.json --dataset-type euroc --config-path
  $bsltdeps/basalt/data/euroc_config.json --marg-data ~/Desktop/euroc_marg_data
  --show-gui 1`
- With a RealSense T265 (you'll need to get a `t265_calib.json` yourself as
  detailed in the [RealSense documentation](doc/monado/Realsense.md#configuring-basalt),
  but meanwhile you can try with [this
  file](https://gitlab.com/VladyslavUsenko/basalt/-/issues/52) instead):
  `basalt_rs_t265_vio --cam-calib $bsltdeps/basalt/data/t265_calib.json
  --config-path $bsltdeps/basalt/data/euroc_config.json`
- With a RealSense D455 (and maybe this also works for a D435):
`basalt_rs_t265_vio --is-d455 --cam-calib
$bsltdeps/basalt/data/d455_calib.json --config-path
$bsltdeps/basalt/data/euroc_config.json`
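For convenience, here is the EuRoC method from the list above as a copy-pasteable block (dataset location and output paths are just examples):

```bash
# Download and extract one of the EuRoC datasets
wget http://robotics.ethz.ch/~asl-datasets/ijrr_euroc_mav_dataset/vicon_room1/V1_01_easy/V1_01_easy.zip
unzip V1_01_easy.zip -d V1_01_easy

# Run Basalt's VIO on it with the calibration and config shipped in this repository
basalt_vio --dataset-path ./V1_01_easy \
  --cam-calib $bsltdeps/basalt/data/euroc_ds_calib.json \
  --dataset-type euroc \
  --config-path $bsltdeps/basalt/data/euroc_config.json \
  --marg-data ~/Desktop/euroc_marg_data \
  --show-gui 1
```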
### Monado Specifics
You'll need to compile Monado with the same Eigen used in Basalt, and with the
same flags. For that, set these options with CMake (or the equivalent flags for Meson):
`-DEIGEN3_INCLUDE_DIR=$bsltdeps/basalt/thirdparty/basalt-headers/thirdparty/eigen
-DCMAKE_C_FLAGS="-march=native" -DCMAKE_CXX_FLAGS="-march=native"`. Otherwise,
Monado will automatically use your system's Eigen, and having mismatched Eigen
versions/flags can cause a lot of headaches.
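With CMake this could look as follows (only a sketch: the Monado checkout location and install prefix are assumptions, the Eigen include directory and `-march=native` flags are the important part):

```bash
cd /path/to/monado
cmake -B build \
  -DCMAKE_INSTALL_PREFIX=$bsltinstall \
  -DEIGEN3_INCLUDE_DIR=$bsltdeps/basalt/thirdparty/basalt-headers/thirdparty/eigen \
  -DCMAKE_C_FLAGS="-march=native" \
  -DCMAKE_CXX_FLAGS="-march=native"
cmake --build build
cmake --install build
```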
Run an OpenXR app like `hello_xr` with the following environment variables set:
```bash
export EUROC_PATH=/path/to/euroc/V1_01_easy/ # Set euroc dataset path. You can get a dataset from http://robotics.ethz.ch/~asl-datasets/ijrr_euroc_mav_dataset/vicon_room1/V1_01_easy/V1_01_easy.zip
export EUROC_LOG=debug
export EUROC_HMD=false # if false, a fake controller will be tracked, else a fake HMD
export SLAM_LOG=debug
export SLAM_CONFIG=$bsltdeps/basalt/data/monado/euroc.toml # Point to Basalt config file for Euroc
export OXR_DEBUG_GUI=1 # We will need the debug ui to start streaming the dataset
```
Finally, run the XR app, press start in the EuRoC player debug UI, and you
should see a controller being tracked by Basalt from the EuRoC dataset.
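A minimal launch sequence could look like this (assumptions: Monado is installed and set up as the active OpenXR runtime, and `hello_xr` is built from the OpenXR-SDK-Source samples):

```bash
# Terminal 1: start the Monado service with the environment variables above exported
monado-service

# Terminal 2: run any OpenXR client, e.g. hello_xr with its Vulkan backend
hello_xr -g Vulkan
```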
## Notes on Basalt Usage
- Tracking is not perfect; [this video](https://youtu.be/mIgRHmxbaC8) and
  [this one](https://youtu.be/gxu3Ve8VCnI) show how it looks, as well as the
  problems it has (difficulties with rotation-only movements, wiggliness during
  fast movements, etc.).
- This fork only works with stereo-IMU setups, but adapting Basalt to work with
  other configurations should be feasible (see
  [granite](https://github.com/DLR-RM/granite)).
- Basalt is _fast_. While the standard sampling rate is stereo 640x480 at 30fps,
  I've been able to make it work at 848x480 at 60fps without problems on a
  laptop.
- Some things that might cause crashes:
- Using images with bad exposure and gain values, or being in a dark room.
- Shaking causes drift that can diverge if maintained for long periods of
time.
- Continuously making sudden 90 degree rotations in which the new scene does not share
features with the previous scene.
- Moving too fast and/or making rotation only movements over extended periods
of time.
## Using Real Hardware
Monado has a couple of drivers supporting SLAM tracking (and thus Basalt). Here is how to set them up:
- [RealSense Driver](doc/monado/Realsense.md)
- [WMR Driver](doc/monado/WMR.md)


@@ -0,0 +1,223 @@
{
"CalibrationInformation": {
"Cameras": [{
"Intrinsics": {
"ModelParameterCount": 15,
"ModelParameters": [0.5067707896232605, 0.51088905334472656, 0.42040637135505676, 0.56076663732528687, 0.6257319450378418, 0.46612036228179932, 0.0041795829311013222, 0.89431935548782349, 0.54253977537155151, 0.06621214747428894, 0.0027008021716028452, -0.00058499001897871494, -4.2882973502855748e-05, -0.00018502399325370789, 2.7941114902496338],
"ModelType": "CALIBRATION_LensDistortionModelRational6KT"
},
"Location": "CALIBRATION_CameraLocationHT0",
"Purpose": "CALIBRATION_CameraPurposeHeadTracking",
"MetricRadius": 2.7941114902496338,
"Rt": {
"Rotation": [1, 0, 0, 0, 1, 0, 0, 0, 1],
"Translation": [0, 0, 0]
},
"SensorHeight": 480,
"SensorWidth": 640,
"Shutter": "CALIBRATION_ShutterTypeUndefined",
"ThermalAdjustmentParams": {
"Params": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
}
}, {
"Intrinsics": {
"ModelParameterCount": 15,
"ModelParameters": [0.50554186105728149, 0.52156180143356323, 0.42205807566642761, 0.56302011013031006, 0.55718272924423218, 0.22437196969985962, 0.0068156048655509949, 0.83317267894744873, 0.26174271106719971, 0.043505862355232239, 0.00676964595913887, -0.0012049071956425905, 0.0001220563062815927, 0.00011782468209275976, 2.7899987697601318],
"ModelType": "CALIBRATION_LensDistortionModelRational6KT"
},
"Location": "CALIBRATION_CameraLocationHT1",
"Purpose": "CALIBRATION_CameraPurposeHeadTracking",
"MetricRadius": 2.7899987697601318,
"Rt": {
"Rotation": [0.68708890676498413, -0.0066884770058095455, -0.726542592048645, -0.004866383969783783, 0.99989283084869385, -0.013807035051286221, 0.72655707597732544, 0.013022295199334621, 0.68698269128799438],
"Translation": [-0.096558056771755219, -0.00065802858443930745, -0.041434925049543381]
},
"SensorHeight": 480,
"SensorWidth": 640,
"Shutter": "CALIBRATION_ShutterTypeUndefined",
"ThermalAdjustmentParams": {
"Params": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
}
}, {
"Intrinsics": {
"ModelParameterCount": 15,
"ModelParameters": [0.50590163469314575, 0.501025915145874, 0.43491056561470032, 0.51977533102035522, 1.3840854167938232, -1.6851117610931396, 6.9683127403259277, 1.4408130645751953, -1.7367498874664307, 6.9408659934997559, 0.015325400047004223, -9.7191208624280989e-05, -0.00023150799097493291, 0.00017367169493809342, 1.3999999761581421],
"ModelType": "CALIBRATION_LensDistortionModelRational6KT"
},
"Location": "CALIBRATION_CameraLocationDO0",
"Purpose": "CALIBRATION_CameraPurposeDisplayObserver",
"MetricRadius": 1.3999999761581421,
"Rt": {
"Rotation": [1, 0, 0, 0, 1, 0, 0, 0, 1],
"Translation": [0, 0, 0]
},
"SensorHeight": 2048,
"SensorWidth": 2448,
"Shutter": "CALIBRATION_ShutterTypeUndefined",
"ThermalAdjustmentParams": {
"Params": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
}
}, {
"Intrinsics": {
"ModelParameterCount": 15,
"ModelParameters": [0.505301833152771, 0.50203627347946167, 0.43675145506858826, 0.52202123403549194, 9.5410280227661133, -19.506465911865234, 31.453699111938477, 9.64380931854248, -19.527170181274414, 31.339166641235352, 0.016977680847048759, 0.0032255123369395733, -0.00076866301242262125, 0.000584927445743233, 1.3999999761581421],
"ModelType": "CALIBRATION_LensDistortionModelRational6KT"
},
"Location": "CALIBRATION_CameraLocationDO1",
"Purpose": "CALIBRATION_CameraPurposeDisplayObserver",
"MetricRadius": 1.3999999761581421,
"Rt": {
"Rotation": [0.99999630451202393, 0.0025376041885465384, -0.00095876428531482816, -0.0025385897606611252, 0.99999624490737915, -0.0010279536945745349, 0.00095615215832367539, 0.0010303838644176722, 0.99999898672103882],
"Translation": [-0.0659172385931015, 7.41809417377226e-05, -0.00013815540296491235]
},
"SensorHeight": 2048,
"SensorWidth": 2448,
"Shutter": "CALIBRATION_ShutterTypeUndefined",
"ThermalAdjustmentParams": {
"Params": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]
}
}],
"Displays": [{
"Affine": [907.99212646484375, -0.55761176347732544, 785.1776123046875, -0, 908.129150390625, 796.96160888671875, 0, 0, 1],
"AssignedEye": "CALIBRATION_DisplayEyeLeft",
"DisplayHeight": 1600,
"DisplayWidth": 2880,
"DistortionRed": {
"ModelParameterCount": 5,
"ModelParameters": [787.31629583849372, 795.24291853725526, 3.9138434409665894e-07, 2.8758159734654908e-13, 4.9106499642880268e-19],
"ModelType": "CALIBRATION_DisplayDistortionModelPolynomial3K"
},
"DistortionGreen": {
"ModelParameterCount": 5,
"ModelParameters": [786.6702072553893, 794.69238050015588, 4.2007202329361826e-07, 1.8878438698927907e-13, 7.0250600169383861e-19],
"ModelType": "CALIBRATION_DisplayDistortionModelPolynomial3K"
},
"DistortionBlue": {
"ModelParameterCount": 5,
"ModelParameters": [785.89252155554175, 797.67656573162583, 5.0202131325863806e-07, -1.2813487061581019e-13, 1.2345214819185544e-18],
"ModelType": "CALIBRATION_DisplayDistortionModelPolynomial3K"
},
"EyePositionCorrectionModel": {
"ModelParameterCount": 0,
"ModelParameters": [],
"ModelType": "CALIBRATION_EyePositionCorrectionModelIpdTranslational"
},
"HorizontalFieldOfView": 0,
"VerticalFieldOfView": 0,
"VisibleAreaCenter": {
"X": 785.1776315693005,
"Y": 796.96162857569425
},
"VisibleAreaRadius": 820,
"VBlankToPhotonLatency": 0,
"Rt": {
"Rotation": [0.915799617767334, 0.0045000603422522545, -0.40161022543907166, 0.13639208674430847, 0.93702936172485352, 0.32151702046394348, 0.377767413854599, -0.3492216169834137, 0.85751736164093018],
"Translation": [-0.021123100072145462, 0.018961008638143539, 0.079663842916488647]
}
}, {
"Affine": [907.9639892578125, -0.49046245217323303, 2103.386474609375, -0, 907.97808837890625, 801.86834716796875, 0, 0, 1],
"AssignedEye": "CALIBRATION_DisplayEyeRight",
"DisplayHeight": 1600,
"DisplayWidth": 2880,
"DistortionRed": {
"ModelParameterCount": 5,
"ModelParameters": [2101.5711896433636, 800.96838064339022, 3.9383231143483177e-07, 2.4224056018878508e-13, 5.8448606879296118e-19],
"ModelType": "CALIBRATION_DisplayDistortionModelPolynomial3K"
},
"DistortionGreen": {
"ModelParameterCount": 5,
"ModelParameters": [2101.1059339164444, 799.96334797691441, 4.2454822843541872e-07, 1.412440585952293e-13, 7.9387878585863295e-19],
"ModelType": "CALIBRATION_DisplayDistortionModelPolynomial3K"
},
"DistortionBlue": {
"ModelParameterCount": 5,
"ModelParameters": [2100.8534549194974, 802.57825131804134, 5.1161917803099739e-07, -1.9534633908166077e-13, 1.3528407038619122e-18],
"ModelType": "CALIBRATION_DisplayDistortionModelPolynomial3K"
},
"EyePositionCorrectionModel": {
"ModelParameterCount": 0,
"ModelParameters": [],
"ModelType": "CALIBRATION_EyePositionCorrectionModelIpdTranslational"
},
"HorizontalFieldOfView": 0,
"VerticalFieldOfView": 0,
"VisibleAreaCenter": {
"X": 2103.3864389371893,
"Y": 801.86837363199038
},
"VisibleAreaRadius": 820,
"VBlankToPhotonLatency": 0,
"Rt": {
"Rotation": [0.9165765643119812, 0.0027716662734746933, -0.39984956383705139, 0.13801105320453644, 0.93633246421813965, 0.322853684425354, 0.37528696656227112, -0.35110378265380859, 0.85783785581588745],
"Translation": [-0.086859606206417084, 0.018304631114006042, 0.080222859978675842]
}
}],
"InertialSensors": [{
"BiasTemperatureModel": [0.016492774710059166, 0, 0, 0, -0.01642640121281147, 0, 0, 0, -0.0045625129714608192, 0, 0, 0],
"BiasUncertainty": [9.9999997473787516e-05, 9.9999997473787516e-05, 9.9999997473787516e-05],
"Id": "CALIBRATION_InertialSensorId_ICM20602",
"MixingMatrixTemperatureModel": [1.0000957250595093, 0, 0, 0, -0.00082343036774545908, 0, 0, 0, -0.0015050634974613786, 0, 0, 0, -0.00082384591223672032, 0, 0, 0, 0.99958902597427368, 0, 0, 0, -4.7339185584860388e-06, 0, 0, 0, -0.0015064212493598461, 0, 0, 0, -4.7357993935293052e-06, 0, 0, 0, 0.99919366836547852, 0, 0, 0],
"ModelTypeMask": 16,
"Noise": [0.00095000001601874828, 0.00095000001601874828, 0.00095000001601874828, 0, 0, 0],
"Rt": {
"Rotation": [0.91831707954406738, 0.0026458536740392447, -0.39583677053451538, 0.13994280993938446, 0.93323278427124023, 0.33089667558670044, 0.3702833354473114, -0.35926258563995361, 0.85663330554962158],
"Translation": [0, 0, 0]
},
"SecondOrderScaling": [0, 0, 0, 0, 0, 0, 0, 0, 0],
"SensorType": "CALIBRATION_InertialSensorType_Gyro",
"TemperatureBounds": [5, 60],
"TemperatureC": 0
}, {
"BiasTemperatureModel": [-0.18563582003116608, 0, 0, 0, 0.057621683925390244, 0, 0, 0, -0.13435612618923187, 0, 0, 0],
"BiasUncertainty": [0.0099999997764825821, 0.0099999997764825821, 0.0099999997764825821],
"Id": "CALIBRATION_InertialSensorId_ICM20602",
"MixingMatrixTemperatureModel": [0.99813401699066162, 0, 0, 0, 0.00030715670436620712, 0, 0, 0, -0.00016754241369199008, 0, 0, 0, 0.00030630044057033956, 0, 0, 0, 1.0009244680404663, 0, 0, 0, 0.00029482797253876925, 0, 0, 0, -0.0001675997773418203, 0, 0, 0, 0.000295753387035802, 0, 0, 0, 0.99779242277145386, 0, 0, 0],
"ModelTypeMask": 56,
"Noise": [0.010700000450015068, 0.010700000450015068, 0.010700000450015068, 0, 0, 0],
"Rt": {
"Rotation": [0.918566882610321, 0.0016602915711700916, -0.39526209235191345, 0.13899369537830353, 0.93476790189743042, 0.32693997025489807, 0.37002113461494446, -0.35525518655776978, 0.85841602087020874],
"Translation": [-0.083942756056785583, -0.0025952591095119715, 0.0026445253752171993]
},
"SecondOrderScaling": [0, 0, 0, 0, 0, 0, 0, 0, 0],
"SensorType": "CALIBRATION_InertialSensorType_Accelerometer",
"TemperatureBounds": [5, 60],
"TemperatureC": 0
}, {
"BiasTemperatureModel": [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0],
"BiasUncertainty": [0, 0, 0],
"Id": "CALIBRATION_InertialSensorId_AK09916",
"MixingMatrixTemperatureModel": [1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0],
"ModelTypeMask": 0,
"Noise": [0.699999988079071, 0.699999988079071, 0.699999988079071, 0, 0, 0],
"Rt": {
"Rotation": [0.918566882610321, 0.0016602915711700916, -0.39526209235191345, 0.13899369537830353, 0.93476790189743042, 0.32693997025489807, 0.37002113461494446, -0.35525518655776978, 0.85841602087020874],
"Translation": [0, 0, 0]
},
"SecondOrderScaling": [0, 0, 0, 0, 0, 0, 0, 0, 0],
"SensorType": "CALIBRATION_InertialSensorType_Magnetometer",
"TemperatureBounds": [0, 0],
"TemperatureC": 0
}],
"Metadata": {
"SerialId": "XQ8198CM61077ZA",
"FactoryCalDate": "6/24/2019 6:38:33 AM GMT",
"Version": {
"Major": 1,
"Minor": 2
},
"DeviceName": "GOERTEK-BUILD-MP"
},
"TemperatureData": {
"ACCEL_0": {
"Average": 36.477929875996843,
"Min": 34.54,
"Max": 37.92
},
"GYRO_0": {
"Average": 36.477925294725082,
"Min": 34.54,
"Max": 37.92
}
}
}
}


@@ -0,0 +1,207 @@
#!/usr/bin/env python3
# Run with ./wmr2bslt_calib.py your_wmrcalib.json > your_calib.json

import json
import argparse
import numpy as np
from numpy.linalg import inv
from math import sqrt


def get(j, name):
    assert name in ["HT0", "HT1", "Gyro", "Accelerometer"]
    is_imu = name in ["Gyro", "Accelerometer"]
    calib = j["CalibrationInformation"]
    sensors = calib["InertialSensors" if is_imu else "Cameras"]
    name_key = "SensorType" if is_imu else "Location"
    sensor = next(filter(lambda s: s[name_key].endswith(name), sensors))
    return sensor


def rt2mat(rt):
    R33 = np.array(rt["Rotation"]).reshape(3, 3)
    t31 = np.array(rt["Translation"]).reshape(3, 1)
    T34 = np.hstack((R33, t31))
    T44 = np.vstack((T34, [0, 0, 0, 1]))
    return T44


def rmat2quat(r):
    w = sqrt(1 + r[0, 0] + r[1, 1] + r[2, 2]) / 2
    w4 = 4 * w
    x = (r[2, 1] - r[1, 2]) / w4
    y = (r[0, 2] - r[2, 0]) / w4
    z = (r[1, 0] - r[0, 1]) / w4
    return np.array([x, y, z, w])


def extrinsics(j, cam):
    # NOTE: The `Rt` field seems to be a transform from the sensor to HT0 (i.e.,
    # from HT0 space to sensor space). For basalt we need the transforms
    # expressed w.r.t IMU origin.
    # NOTE: The gyro and magnetometer translations are 0, probably because an
    # HMD is a rigid body. Therefore the accelerometer is considered as the IMU
    # origin.
    imu = get(j, "Accelerometer")
    T_i_c0 = rt2mat(imu["Rt"])
    T = None
    if cam == "HT0":
        T = T_i_c0
    elif cam == "HT1":
        cam1 = get(j, "HT1")
        T_c1_c0 = rt2mat(cam1["Rt"])
        T_c0_c1 = inv(T_c1_c0)
        T_i_c1 = T_i_c0 @ T_c0_c1
        T = T_i_c1
    else:
        assert False
    q = rmat2quat(T[0:3, 0:3])
    p = T[0:3, 3]
    return {
        "px": p[0],
        "py": p[1],
        "pz": p[2],
        "qx": q[0],
        "qy": q[1],
        "qz": q[2],
        "qw": q[3],
    }


def resolution(j, cam):
    camera = get(j, cam)
    width = camera["SensorWidth"]
    height = camera["SensorHeight"]
    return [width, height]


def intrinsics(j, cam):
    # https://github.com/microsoft/Azure-Kinect-Sensor-SDK/blob/2feb3425259bf803749065bb6d628c6c180f8e77/include/k4a/k4atypes.h#L1024-L1046
    camera = get(j, cam)
    model_params = camera["Intrinsics"]["ModelParameters"]
    assert (
        camera["Intrinsics"]["ModelType"]
        == "CALIBRATION_LensDistortionModelRational6KT"
    )
    width = camera["SensorWidth"]
    height = camera["SensorHeight"]
    return {
        "camera_type": "pinhole-radtan8",
        "intrinsics": {
            "fx": model_params[2] * width,
            "fy": model_params[3] * height,
            "cx": model_params[0] * width,
            "cy": model_params[1] * height,
            "k1": model_params[4],
            "k2": model_params[5],
            "p1": model_params[13],
            "p2": model_params[12],
            "k3": model_params[6],
            "k4": model_params[7],
            "k5": model_params[8],
            "k6": model_params[9],
        },
    }


def calib_accel_bias(j):
    # https://github.com/microsoft/Azure-Kinect-Sensor-SDK/blob/2feb3425259bf803749065bb6d628c6c180f8e77/include/k4ainternal/calibration.h#L48-L77
    # https://vladyslavusenko.gitlab.io/basalt-headers/classbasalt_1_1CalibAccelBias.html#details
    # https://gitlab.com/VladyslavUsenko/basalt-headers/-/issues/8
    accel = get(j, "Accelerometer")
    bias = accel["BiasTemperatureModel"]
    align = accel["MixingMatrixTemperatureModel"]
    return [
        -bias[0 * 4],
        -bias[1 * 4],
        -bias[2 * 4],
        align[0 * 4] - 1,  # [0, 0]
        align[3 * 4],  # [1, 0]
        align[6 * 4],  # [2, 0]
        align[4 * 4] - 1,  # [1, 1]
        align[7 * 4],  # [2, 1]
        align[8 * 4] - 1,  # [2, 2]
    ]


def calib_gyro_bias(j):
    # https://github.com/microsoft/Azure-Kinect-Sensor-SDK/blob/2feb3425259bf803749065bb6d628c6c180f8e77/include/k4ainternal/calibration.h#L48-L77
    # https://vladyslavusenko.gitlab.io/basalt-headers/classbasalt_1_1CalibGyroBias.html#details
    gyro = get(j, "Gyro")
    bias = gyro["BiasTemperatureModel"]
    align = gyro["MixingMatrixTemperatureModel"]
    return [
        -bias[0 * 4],
        -bias[1 * 4],
        -bias[2 * 4],
        align[0 * 4] - 1,  # [0, 0]
        align[3 * 4],  # [1, 0]
        align[6 * 4],  # [2, 0]
        align[1 * 4],  # [0, 1]
        align[4 * 4] - 1,  # [1, 1]
        align[7 * 4],  # [2, 1]
        align[2 * 4],  # [0, 2]
        align[5 * 4],  # [1, 2]
        align[8 * 4] - 1,  # [2, 2]
    ]


def noise_std(j, name):
    imu = get(j, name)
    return imu["Noise"][0:3]


def bias_std(j, name):
    imu = get(j, name)
    return list(map(sqrt, imu["BiasUncertainty"]))


def main():
    parser = argparse.ArgumentParser()
    parser.add_argument("wmr_json_file", help="Input WMR json calibration file")
    args = parser.parse_args()
    in_fn = args.wmr_json_file
    with open(in_fn) as f:
        j = json.load(f)

    # We get 250 packets with 4 samples each per second, totalling 1000 samples per second.
    # But in monado we just average those 4 samples to reduce the noise. So we have 250hz.
    IMU_UPDATE_RATE = 250

    # This is a very rough offset in pixels between the two cameras. I manually
    # measured it for some particular point in some particular pair of images
    # for my Odyssey+. In reality this offset changes based on distance to the
    # point, nonetheless it helps to get some features tracked in the right
    # camera.
    VIEW_OFFSET = 247

    out_calib = {
        "value0": {
            "T_imu_cam": [extrinsics(j, "HT0"), extrinsics(j, "HT1")],
            "intrinsics": [intrinsics(j, "HT0"), intrinsics(j, "HT1")],
            "resolution": [resolution(j, "HT0"), resolution(j, "HT1")],
            "calib_accel_bias": calib_accel_bias(j),
            "calib_gyro_bias": calib_gyro_bias(j),
            "imu_update_rate": IMU_UPDATE_RATE,
            "accel_noise_std": noise_std(j, "Accelerometer"),
            "gyro_noise_std": noise_std(j, "Gyro"),
            "accel_bias_std": bias_std(j, "Accelerometer"),
            "gyro_bias_std": bias_std(j, "Gyro"),
            "cam_time_offset_ns": 0,
            "view_offset": VIEW_OFFSET,
            "vignette": [],
        }
    }
    print(json.dumps(out_calib, indent=4))


if __name__ == "__main__":
    main()

doc/monado/Realsense.md

@@ -0,0 +1,137 @@
# Using a RealSense Camera
After making sure that everything works by running the EuRoC datasets, it should
be possible to use the `realsense` driver from Monado to track any RealSense
camera that has an IMU and one or more cameras with SLAM.
However, this was only tested on a D455, so if you are having problems with
another device, please open an issue. Also, open an issue if you manage to make
it work with other devices so that I can add it to this README.
## Index
- [Using a RealSense Camera](#using-a-realsense-camera)
  - [Index](#index)
  - [Overview of the Setup (D455)](#overview-of-the-setup-d455)
    - [SLAM-Tracked RealSense Driver](#slam-tracked-realsense-driver)
    - [RealSense-Tracked Qwerty Driver](#realsense-tracked-qwerty-driver)
  - [Non-D455 RealSense Devices](#non-d455-realsense-devices)
    - [Configuring the RealSense Pipeline](#configuring-the-realsense-pipeline)
    - [Configuring Basalt](#configuring-basalt)
## Overview of the Setup (D455)
Let's first assume you have a RealSense D455, which is the one that works with
the defaults. Even if you have another RealSense device, follow this section; you
might at least get something working, although not at its best.
### SLAM-Tracked RealSense Driver
Set these environment variables (they are collected into a single block after this list):
- `export RS_HDEV_LOG=debug`: Make our RealSense device logs more verbose.
- `export RS_SOURCE_INDEX=0`: Indicate that we want to use the first RealSense device connected as the data source.
- `export RS_TRACKING=2`: Only try to use "host-slam". See other options
  [here](https://gitlab.freedesktop.org/mateosss/monado/-/blob/64e70e76ad6d47e4bd1a0dfa164bff8597a50ce8/src/xrt/drivers/realsense/rs_prober.c#L33-39).
- `export SLAM_CONFIG=$bsltdeps/basalt/data/monado/d455.toml`:
  Configuration file for Basalt and the D455.
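Or, as a single block to copy (same values as in the list above):

```bash
export RS_HDEV_LOG=debug
export RS_SOURCE_INDEX=0
export RS_TRACKING=2
export SLAM_CONFIG=$bsltdeps/basalt/data/monado/d455.toml
```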
### RealSense-Tracked Qwerty Driver
You now have a RealSense device that you can use to track another device. For
example, let's track a keyboard-and-mouse-controlled HMD provided by the
`qwerty` driver.
Set these environment variables to enable the qwerty driver and stream to the
SLAM system on start:
```bash
export QWERTY_ENABLE=true QWERTY_COMBINE=true SLAM_SUBMIT_FROM_START=true
```
And then modify your tracking overrides in your Monado configuration file
(`~/.config/monado/config_v0.json`) by updating the JSON object with:
```js
{
    "tracking": {
        "tracking_overrides": [
            {
                "target_device_serial": "Qwerty HMD", // Or "Qwerty Left Controller"
                "tracker_device_serial": "Intel RealSense Host-SLAM",
                "type": "direct",
                "offset": {
                    "orientation": { "x": 0, "y": 0, "z": 0, "w": 1 },
                    "position": { "x": 0, "y": 0, "z": 0 }
                },
                "xrt_input_name": "XRT_INPUT_GENERIC_TRACKER_POSE"
            }
        ],
    }
}
```
And that's it! You can now start an OpenXR application with Monado and get your
view tracked with your D455 camera.
## Non-D455 RealSense Devices
While I was unable to test other devices because I don't have access to them, it
should be possible to make them work by:
### Configuring the RealSense Pipeline
[These
fields](https://gitlab.freedesktop.org/mateosss/monado/-/blob/9e1b7e2203ef49abb939cc8fc92afa16fcc9cb3a/src/xrt/drivers/realsense/rs_hdev.c#L118-129)
determine your RealSense streaming configuration, and
[these](https://gitlab.freedesktop.org/mateosss/monado/-/blob/b26a6023226a4623381215fc159da3b4bcb27c9b/src/xrt/drivers/realsense/rs_hdev.c#L47-61)
are their current defaults that work on a D455. You can change those fields by
setting any of them in your `config_v0.json` inside a `config_realsense_hdev`
field. Also note that as we already set `RS_HDEV_LOG=debug`, you should see the
values they are currently taking at the start of Monado.
For example, let's say you have a RealSense device with two fisheye cameras
that support streaming 640x360 at 30fps (a T265, I think); then a configuration
like this should work:
```js
"config_realsense_hdev": {
    "stereo": true,
    "video_format": 9, // 9 gets casted to RS2_FORMAT_Y8 (see https://git.io/Jzkfw), grayscale
    "video_width": 640, // I am assuming the T265 supports 640x360 streams at 30fps
    "video_height": 360,
    "video_fps": 30,
    "gyro_fps": 0, // 0 indicates any
    "accel_fps": 0,
    "stream_type": 4, // 4 gets casted to RS2_STREAM_FISHEYE (see https://git.io/Jzkvq)
    "stream1_index": -1, // If there were more than one possible stream with these properties select them, -1 is for auto
    "stream2_index": -1,
}
```
The particular values you can set here depend heavily on your camera. I
recommend checking the values reported by the [rs-sensor-control
example](https://dev.intelrealsense.com/docs/rs-sensor-control) from the
RealSense API.
### Configuring Basalt
As you might've noticed, we set `SLAM_CONFIG` to
`$bsltdeps/basalt/data/monado/d455.toml` which is [this](data/monado/d455.toml)
config file that I added for the D455. This file points to a [calibration
file](data/d455_calib.json) and a [VIO configuration
file](data/euroc_config.json).
For the tracking to be as good as possible, you should set the
intrinsics/extrinsics of your device in a similar calibration file and point to
it from the config file referenced by `SLAM_CONFIG`. You can obtain that
information from the previously mentioned
[rs-sensor-control](https://dev.intelrealsense.com/docs/rs-sensor-control)
utility. Issues like [this one
(T265)](https://gitlab.com/VladyslavUsenko/basalt/-/issues/52) and [this one
(D435)](https://gitlab.com/VladyslavUsenko/basalt/-/issues/50) provide
configuration files tried by other users. Additionally, Basalt provides custom
[calibration
tools](https://gitlab.com/VladyslavUsenko/basalt/-/blob/master/doc/Calibration.md)
that work for any camera-IMU setup, as well as tools like
[`basalt_rs_t265_record`](https://gitlab.freedesktop.org/mateosss/basalt/-/blob/5a365bf6fb14ce5b044b76f742337e1d6865557e/src/rs_t265_record.cpp#L207) or the `euroc_recorder` in Monado
that can help create an initial calibration file for RealSense devices.
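As a sketch, setting this up for a non-D455 device could look like the following (file names are placeholders; the real work is filling in your device's calibration values):

```bash
# Start from the D455 files shipped in this repository
cp $bsltdeps/basalt/data/monado/d455.toml $bsltdeps/basalt/data/monado/mydevice.toml
cp $bsltdeps/basalt/data/d455_calib.json $bsltdeps/basalt/data/mydevice_calib.json

# Fill mydevice_calib.json with your intrinsics/extrinsics (e.g. from rs-sensor-control),
# make mydevice.toml reference it, and select it before starting Monado:
export SLAM_CONFIG=$bsltdeps/basalt/data/monado/mydevice.toml
```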

doc/monado/WMR.md

@@ -0,0 +1,39 @@
# Windows Mixed Reality Headsets
We'll need to make a Basalt config file for your headset; let's say it's a
Reverb G2.

First, let's get your WMR device's JSON config block. To get that JSON, set the
environment variable `WMR_LOG=debug` and run Monado with your WMR headset connected.
The headset JSON is printed on start after the line `DEBUG [wmr_read_config] JSON config:`.
Copy it to a file called `reverbg2_wmrcalib.json`.
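One way to capture that output (the `monado-service` invocation is just an example; any way of running Monado with `WMR_LOG=debug` set works):

```bash
# Save Monado's log, then copy the JSON block printed after
# "DEBUG [wmr_read_config] JSON config:" into reverbg2_wmrcalib.json
WMR_LOG=debug monado-service 2>&1 | tee monado_wmr.log
```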
Now let's convert this WMR json to a Basalt calibration file with:
```bash
$bsltdeps/basalt/data/monado/wmr-tools/wmr2bslt_calib.py reverbg2_wmrcalib.json > $bsltdeps/basalt/data/reverbg2_calib.json
```
Finally, we'll need to create the main config file for Basalt that references
this calibration file we just created. For that let's copy the config that is
already present for the Odyssey+:
```bash
cp $bsltdeps/basalt/data/monado/odysseyplus_rt8.toml $bsltdeps/basalt/data/monado/reverbg2.toml
```
And edit the `cam-calib` field in the `reverbg2.toml` file to point to your `reverbg2_calib.json` file.
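For example, assuming the copied file has a plain top-level `cam-calib` entry (check the file first; this is only a sketch and the pattern may need adjusting):

```bash
sed -i "s#^cam-calib.*#cam-calib = \"$bsltdeps/basalt/data/reverbg2_calib.json\"#" \
  $bsltdeps/basalt/data/monado/reverbg2.toml
```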
And that's it. Now you just need to reference this `reverbg2.toml` in the
`SLAM_CONFIG` environment variable before launching Monado (`export
SLAM_CONFIG=$bsltdeps/basalt/data/monado/reverbg2.toml`), and Basalt will use the
appropriate calibration for your headset.
By default, the UI box `SLAM Tracker` has the option `Submit data to SLAM`
disabled so that you first manually configure the exposure and gain values in
the `WMR Camera` box. You can enable it yourself in the UI or enable it at start
by setting the environment variable `SLAM_SUBMIT_FROM_START=true`.
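Putting it together, a typical environment for launching Monado with this headset could look like:

```bash
export SLAM_CONFIG=$bsltdeps/basalt/data/monado/reverbg2.toml
export SLAM_SUBMIT_FROM_START=true  # optional: skip manually enabling "Submit data to SLAM"
```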
## Video Walkthrough
Here is a 15 minute walkthrough with some tips for using a WMR headset with Monado and Basalt that should help complement the guide found in the [README.md](README.md) file: <https://www.youtube.com/watch?v=jyQKjyRVMS4>