small fixes for tutorial

Vladyslav Usenko 2019-10-01 14:40:10 +02:00
parent ea4f9a63cf
commit c9ab4a3a7b
3 changed files with 10 additions and 32 deletions

View File

@@ -101,7 +101,7 @@ For the motion capture recording we use [ros_vrpn_client](https://github.com/eth
**Important for recording the dataset:**
* Set the `skip_frames` slider to 1 to use the full framerate.
* Reduce the exposure time to reduce the motion blur.
* Move the setup such that all axis of accelerometer and gyro are exited. This means moving with acceleration along X, Y and Z axis and rotating around those axes.
* Move the setup such that all axes of accelerometer and gyro are excited. This means moving with acceleration along X, Y and Z axes and rotating around those axes.
* Do not forget to simultaneously record motion capture data.
* Rename the dataset to `imu_calib`.
@@ -131,9 +131,9 @@ To perform the calibration follow these steps:
## Generating Time-Aligned Ground Truth
Since motion capture system and the PC where the dataset was recorded might not have the same clock we need to perform the time synchronization. Additionally we need to transform the coordinate frame of the GT data to the IMU frame (originally it is in the coordinate frame attached to the markers).
The raw motion capture data is stored in the `mav/mocap0/` folder. We can find the time offset by minimizing the error between gyro measurements and rotational velocities computed from the motion capture data. The resulting trajectory (time aligned and transformed to the IMU frame) will be written to `mav/gt/` and automatically loaded when available.
The raw motion capture data is stored in the `mav/mocap0/` folder. We can find the time offset by minimizing the error between gyro measurements and rotational velocities computed from the motion capture data. If you press the `save_aligned_dataset` button, the resulting trajectory (time aligned and transformed to the IMU frame) will be written to `mav/gt/data.csv` and automatically loaded when available.
```
basalt_time_alignment --dataset-path ~/t265_calib_data/sequence0 --dataset-type euroc --calibration ~/t265_calib_results/calibration.json --mocap-calibration ~/t265_calib_results/mocap_calibration.json --save-gt
basalt_time_alignment --dataset-path ~/t265_calib_data/sequence0 --dataset-type euroc --calibration ~/t265_calib_results/calibration.json --mocap-calibration ~/t265_calib_results/mocap_calibration.json
```
You should be able to see that, despite some noise, rotational velocity computed from the motion capture data aligns well with gyro measurements.
![t265_time_align_gyro](/doc/img/t265_time_align_gyro.png)
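The time-offset search described in the tutorial text above can be pictured as a plain grid search: compute the rotational speed from the gyro and from consecutive mocap poses, then find the shift that makes the two curves agree best. The sketch below is only an illustration of that idea, not Basalt's actual `basalt_time_alignment` implementation; the sample struct and helper names are made up for the example.

```cpp
// Illustrative sketch only: brute-force grid search over candidate time
// offsets, minimizing the squared difference between gyro rotational speed
// and rotational speed derived from mocap poses. Not Basalt code.
#include <cstdint>
#include <limits>
#include <vector>

struct RotSpeedSample {
  int64_t t_ns;  // timestamp in nanoseconds
  double omega;  // rotational speed magnitude [rad/s]
};

// Linear interpolation of rotational speed at t_ns. Assumes at least two
// samples, sorted by timestamp, with t_ns inside the sampled interval.
double interpolate(const std::vector<RotSpeedSample>& s, int64_t t_ns) {
  size_t i = 1;
  while (i + 1 < s.size() && s[i].t_ns < t_ns) ++i;
  const double a =
      double(t_ns - s[i - 1].t_ns) / double(s[i].t_ns - s[i - 1].t_ns);
  return (1.0 - a) * s[i - 1].omega + a * s[i].omega;
}

// Grid search over [-max_offset_ns, max_offset_ns] with the given step size.
int64_t find_best_offset_ns(const std::vector<RotSpeedSample>& gyro,
                            const std::vector<RotSpeedSample>& mocap,
                            int64_t max_offset_ns, int64_t step_ns) {
  int64_t best_offset = 0;
  double best_error = std::numeric_limits<double>::max();

  for (int64_t offset = -max_offset_ns; offset <= max_offset_ns;
       offset += step_ns) {
    double error = 0;
    int num = 0;
    for (const RotSpeedSample& g : gyro) {
      const int64_t t_mocap = g.t_ns + offset;
      // Only compare where the shifted gyro sample falls inside mocap data.
      if (t_mocap <= mocap.front().t_ns || t_mocap >= mocap.back().t_ns)
        continue;
      const double diff = g.omega - interpolate(mocap, t_mocap);
      error += diff * diff;
      ++num;
    }
    if (num > 0 && error / num < best_error) {
      best_error = error / num;
      best_offset = offset;
    }
  }
  return best_offset;
}
```

Comparing only rotational speed magnitudes sidesteps the unknown rotation between the marker frame and the IMU frame, which is why the offset can be estimated before the frames are aligned; a subsequent refinement step can then work with the full rotational velocity vectors.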
@@ -151,7 +151,7 @@ After the system processes the whole sequence you can use `align_se3` button to
## Running Visual-Inertial Odometry Live
It is also possible to run the odometry live with the camera If no calibration files are provided the factory calibration will be used.
It is also possible to run the odometry live with the camera. If no calibration files are provided the factory calibration will be used.
```
basalt_rs_t265_vio --cam-calib ~/t265_calib_results/calibration.json --config-path /usr/etc/basalt/euroc_config.json
```

View File

@@ -229,6 +229,8 @@ void CamCalib::computeVign() {
ve.save_vign_png(cache_path);
calib_opt->setVignette(ve.get_vign_param());
std::cout << "Saved vignette png files to " << cache_path << std::endl;
}
void CamCalib::setNumCameras(size_t n) {

View File

@@ -66,8 +66,6 @@ int main(int argc, char **argv) {
std::string output_gyro_path;
std::string output_mocap_path;
bool save_new_gt = false;
double max_offset_s = 10.0;
bool show_gui = true;
@@ -97,8 +95,6 @@ int main(int argc, char **argv) {
"Maximum offset for a grid search in seconds.");
app.add_flag("--show-gui", show_gui, "Show GUI for debugging");
app.add_flag("--save-gt", save_new_gt,
"Save time aligned data to mav0/gt/ folder");
try {
app.parse(argc, argv);
@@ -106,6 +102,10 @@ int main(int argc, char **argv) {
return app.exit(e);
}
if (!dataset_path.empty() && dataset_path[dataset_path.length() - 1] != '/') {
dataset_path += '/';
}
basalt::VioDatasetPtr vio_dataset;
const bool use_calib =
@@ -355,30 +355,6 @@ int main(int argc, char **argv) {
}
}
if (save_new_gt) {
fs::create_directory(dataset_path + "/mav0/gt/");
std::ofstream f(dataset_path + "/mav0/gt/data.csv");
for (size_t i = 0; i < vio_dataset->get_gt_timestamps().size(); i++) {
const int64_t corrected_time = vio_dataset->get_gt_timestamps()[i] +
vio_dataset->get_mocap_to_imu_offset_ns() +
best_offset_refined_ns;
if (corrected_time >= min_time && corrected_time <= max_time) {
const Sophus::SE3d &p = vio_dataset->get_gt_pose_data()[i];
f << corrected_time << ',' << p.translation().x() << ','
<< p.translation().y() << ',' << p.translation().z() << ','
<< p.unit_quaternion().w() << ',' << p.unit_quaternion().x() << ','
<< p.unit_quaternion().y() << ',' << p.unit_quaternion().z()
<< std::endl;
}
}
f.close();
}
if (show_gui) {
static constexpr int UI_WIDTH = 280;
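For reference, reading back the ground-truth file written by the removed code in the last hunk (and now produced via the `save_aligned_dataset` button mentioned in the tutorial) could look like the sketch below. It assumes the column order used by that writer (timestamp in ns, position x/y/z, quaternion w/x/y/z); the struct and loader names are hypothetical and not part of Basalt.

```cpp
// Illustrative sketch only: parse a mav0/gt/data.csv with the layout produced
// by the writer shown above. Struct and function names are made up.
#include <cstdint>
#include <fstream>
#include <sstream>
#include <string>
#include <vector>

struct GtPose {
  int64_t t_ns;           // corrected timestamp in nanoseconds
  double px, py, pz;      // translation
  double qw, qx, qy, qz;  // unit quaternion (w, x, y, z)
};

std::vector<GtPose> load_gt_csv(const std::string& path) {
  std::vector<GtPose> poses;
  std::ifstream f(path);
  std::string line;
  while (std::getline(f, line)) {
    if (line.empty() || line[0] == '#') continue;  // skip comments/header
    std::stringstream ss(line);
    GtPose p;
    char c;  // consumes the ',' separators
    if (ss >> p.t_ns >> c >> p.px >> c >> p.py >> c >> p.pz >> c >> p.qw >>
        c >> p.qx >> c >> p.qy >> c >> p.qz) {
      poses.push_back(p);
    }
  }
  return poses;
}
```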