Merge branch 'master' of gitlab.vision.in.tum.de:slam/basalt
Commit f71f252794
@@ -57,11 +57,13 @@ NOTE: It is possible to compile the code on Ubuntu 16.04, but you need to instal

 ## Usage
 * [Camera, IMU and Mocap calibration. (TUM-VI, Euroc, UZH-FPV and Kalibr datasets)](doc/Calibration.md)
-* [Tutorial on Camera-IMU and Motion capture calibration with Realsense T265.](doc/Realsense.md)
 * [Visual-inertial odometry and mapping. (TUM-VI and Euroc datasets)](doc/VioMapping.md)
 * [Visual odometry (no IMU). (KITTI dataset)](doc/Vo.md)
 * [Simulation tools to test different components of the system.](doc/Simulation.md)

+## Device support
+* [Tutorial on Camera-IMU and Motion capture calibration with Realsense T265.](doc/Realsense.md)
+
 ## Development
 * [Development environment setup.](doc/DevSetup.md)
@@ -1,6 +1,6 @@
 # Tutorial on Camera-IMU and Motion Capture Calibration with Realsense T265

-![Realsense](doc/img/realsense_setup.jpg)
+![Realsense](/doc/img/realsense_setup.jpg)

 In this tutorial we explain how to perform photometric and geometric calibration of the multi-camera setup and then calibrate the transformations between cameras, IMU and the motion capture marker setup. To make sure the calibration is successful we recommend to rigidly attach markers to the camera as shown on the figure above.
@@ -27,7 +27,7 @@ basalt_rs_t265_record --dataset-path ~/t265_calib_data/ --manual-exposure
 * `--dataset-path` specifies the location where the recorded dataset will be stored. In this case it will be stored in `~/t265_calib_data/<current_timestamp>/`.
 * `--manual-exposure` disables the autoexposure. In this tutorial the autoexposure is disabled for all calibration sequences, but for the VIO sequence (sequence0) we enable it.

-![t265_record](doc/img/t265_record.png)
+![t265_record](/doc/img/t265_record.png)

 The GUI elements have the following meaning:
 * `webp_quality` compression quality. The highest value (101) means lossless compression. For photometric calibration it is important not to have any compression artifacts, so we record these calibration sequences with lossless compression.
@@ -59,7 +59,7 @@ Run the response function calibration:
 basalt_response_calib.py -d ~/t265_calib_data/response_calib
 ```
 You should see the response function and the irradiance image similar to the one shown below. For the details of the algorithm see Section 2.3.1 of [[arXiv:1607.02555]](https://arxiv.org/abs/1607.02555). The results suggest that the response function used in the camera is linear.
-![t265_inv_resp_irradiance](doc/img/t265_inv_resp_irradiance.png)
+![t265_inv_resp_irradiance](/doc/img/t265_inv_resp_irradiance.png)

 ## Multi-Camera Geometric and Vignette Calibration
 For the camera calibration we need to record a dataset with a static aprilgrid pattern.
@@ -84,7 +84,7 @@ To perform the calibration follow these steps:
 * `init_cam_extr` initialize transformations between multiple cameras.
 * `init_opt` initialize optimization.
 * `opt_until_converge` optimize until convergence.
-* `opt_cam_poses` some initial poses computed from the initialized intrinsics can be far from optimum and not converge to the right minimum. To improve the final result we can re-initialize poses with optimized intrinsics.
+* `init_cam_poses` some initial poses computed from the initialized intrinsics can be far from optimum and not converge to the right minimum. To improve the final result we can re-initialize poses with optimized intrinsics.
 * `init_opt` initialize optimization with new initial poses.
 * `opt_until_converge` optimize until convergence.
 * `compute_vign` after optimizing geometric models compute the vignetting of the cameras.
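The `opt_until_converge` steps above follow the usual pattern of iterating an optimizer until the cost stops improving. A minimal generic sketch of that idea (not Basalt's actual optimizer API; the function name and tolerances are hypothetical):

```cpp
#include <cmath>
#include <functional>

// Run one-iteration `step` repeatedly until the relative cost decrease
// falls below `rel_tol` or `max_iters` is reached. Returns the number of
// iterations performed. Purely illustrative of the convergence check.
inline int optimize_until_converge(const std::function<double()> &step,
                                   double rel_tol = 1e-6,
                                   int max_iters = 100) {
  double prev_cost = step();
  for (int it = 1; it < max_iters; ++it) {
    const double cost = step();
    // Converged when the cost change is tiny relative to the cost scale.
    if (std::abs(prev_cost - cost) <= rel_tol * std::max(1.0, prev_cost))
      return it + 1;
    prev_cost = cost;
  }
  return max_iters;
}
```

Re-initializing poses with optimized intrinsics (`init_cam_poses`) and re-running this loop is what lets the calibration escape a poor initial minimum.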
@@ -96,12 +96,12 @@ To perform the calibration follow these steps:
 ## IMU and Motion Capture Calibration
 After calibrating cameras we can proceed to geometric and time calibration of the cameras, IMU and motion capture system. Setting up the motion capture system is specific for your setup.

-For the motion capture recording we use [ros_vrpn_client](https://github.com/ethz-asl/ros_vrpn_client) with [basalt_capture_mocap.py](scripts/basalt_capture_mocap.py). We record the data to the `mocap0` folder and then move it to the `mav0` directory of the camera dataset. This script is provided as an example. Motion capture setup is different in every particular case.
+For the motion capture recording we use [ros_vrpn_client](https://github.com/ethz-asl/ros_vrpn_client) with [basalt_capture_mocap.py](/scripts/basalt_capture_mocap.py). We record the data to the `mocap0` folder and then move it to the `mav0` directory of the camera dataset. This script is provided as an example. Motion capture setup is different in every particular case.

 **Important for recording the dataset:**
 * Set the `skip_frames` slider to 1 to use the full framerate.
 * Reduce the exposure time to reduce the motion blur.
-* Move the setup such that all axis of accelerometer and gyro are exited. This means moving with acceleration along X, Y and Z axis and rotating around those axes.
+* Move the setup such that all axes of accelerometer and gyro are excited. This means moving with acceleration along X, Y and Z axes and rotating around those axes.
 * Do not forget to simultaneously record motion capture data.
 * Rename the dataset to `imu_calib`.
@@ -131,9 +131,9 @@ To perform the calibration follow these steps:
 ## Generating Time-Aligned Ground Truth
 Since motion capture system and the PC where the dataset was recorded might not have the same clock we need to perform the time synchronization. Additionally we need to transform the coordinate frame of the GT data to the IMU frame (originally it is in the coordinate frame attached to the markers).

-The raw motion capture data is stored in the `mav/mocap0/` folder. We can find the time offset by minimizing the error between gyro measurements and rotational velocities computed from the motion capture data. The resulting trajectory (time aligned and transformed to the IMU frame) will be written to `mav/gt/` and automatically loaded when available.
+The raw motion capture data is stored in the `mav/mocap0/` folder. We can find the time offset by minimizing the error between gyro measurements and rotational velocities computed from the motion capture data. If you press the `save_aligned_dataset` button the resulting trajectory (time aligned and transformed to the IMU frame) will be written to `mav/gt/data.csv` and automatically loaded when available.
 ```
-basalt_time_alignment --dataset-path ~/t265_calib_data/sequence0 --dataset-type euroc --calibration ~/t265_calib_results/calibration.json --mocap-calibration ~/t265_calib_results/mocap_calibration.json --save-gt
+basalt_time_alignment --dataset-path ~/t265_calib_data/sequence0 --dataset-type euroc --calibration ~/t265_calib_results/calibration.json --mocap-calibration ~/t265_calib_results/mocap_calibration.json
 ```
 You should be able to see that, despite some noise, rotational velocity computed from the motion capture data aligns well with gyro measurements.
 ![t265_time_align_gyro](/doc/img/t265_time_align_gyro.png)
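The offset search described in this hunk can be sketched as a grid search: slide one sampled signal against the other and keep the shift with the smallest mean squared error. Basalt compares gyro measurements against rotational velocities derived from mocap poses; the sketch below uses plain scalar sequences as a hypothetical simplification.

```cpp
#include <cstddef>
#include <limits>
#include <vector>

// Grid search over integer sample offsets in [-max_offset, max_offset].
// Returns the offset that minimizes the mean squared difference between
// gyro[i] and mocap[i + offset] over the overlapping region.
inline long best_offset(const std::vector<double> &gyro,
                        const std::vector<double> &mocap, long max_offset) {
  long best = 0;
  double best_err = std::numeric_limits<double>::max();
  for (long off = -max_offset; off <= max_offset; ++off) {
    double err = 0;
    std::size_t count = 0;
    for (std::size_t i = 0; i < gyro.size(); ++i) {
      const long j = static_cast<long>(i) + off;
      if (j < 0 || j >= static_cast<long>(mocap.size())) continue;
      const double d = gyro[i] - mocap[static_cast<std::size_t>(j)];
      err += d * d;
      ++count;
    }
    if (count > 0 && err / count < best_err) {
      best_err = err / count;
      best = off;
    }
  }
  return best;
}
```

The real tool refines this coarse grid result further (`best_offset_refined_ns` in the source) and works in nanoseconds rather than sample indices.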
@@ -141,6 +141,8 @@ You should be able to see that, despite some noise, rotational velocity computed
 You can also switch to the error function plot and see that there is a clear minimum corresponding to the computed time offset.
 ![t265_time_align_error](/doc/img/t265_time_align_error.png)

+**Note:** If you want to run the time alignment again you should delete the `~/t265_calib_data/sequence0/mav/gt` folder first. If GT data already exist you will see the `save_aligned_dataset(disabled)` button which will **NOT** overwrite it.
+
 ## Running Visual-Inertial Odometry
 Now we can run the visual-inertial odometry on the recorded dataset:
 ```
@@ -151,7 +153,7 @@ After the system processes the whole sequence you can use `align_se3` button to


 ## Running Visual-Inertial Odometry Live
-It is also possible to run the odometry live with the camera If no calibration files are provided the factory calibration will be used.
+It is also possible to run the odometry live with the camera. If no calibration files are provided the factory calibration will be used.
 ```
 basalt_rs_t265_vio --cam-calib ~/t265_calib_results/calibration.json --config-path /usr/etc/basalt/euroc_config.json
 ```
@@ -148,7 +148,8 @@ typedef std::shared_ptr<DatasetIoInterface> DatasetIoInterfacePtr;

 class DatasetIoFactory {
  public:
-  static DatasetIoInterfacePtr getDatasetIo(const std::string &dataset_type);
+  static DatasetIoInterfacePtr getDatasetIo(const std::string &dataset_type,
+                                            bool load_mocap_as_gt = false);
 };

 }  // namespace basalt
@@ -148,7 +148,7 @@ class EurocVioDataset : public VioDataset {

 class EurocIO : public DatasetIoInterface {
  public:
-  EurocIO() {}
+  EurocIO(bool load_mocap_as_gt) : load_mocap_as_gt(load_mocap_as_gt) {}

   void read(const std::string &path) {
     if (!fs::exists(path))
@@ -163,9 +163,10 @@ class EurocIO : public DatasetIoInterface {

     read_imu_data(path + "/mav0/imu0/");

-    if (file_exists(path + "/mav0/state_groundtruth_estimate0/data.csv")) {
+    if (!load_mocap_as_gt &&
+        file_exists(path + "/mav0/state_groundtruth_estimate0/data.csv")) {
       read_gt_data_state(path + "/mav0/state_groundtruth_estimate0/");
-    } else if (file_exists(path + "/mav0/gt/data.csv")) {
+    } else if (!load_mocap_as_gt && file_exists(path + "/mav0/gt/data.csv")) {
       read_gt_data_pose(path + "/mav0/gt/");
     } else if (file_exists(path + "/mav0/mocap0/data.csv")) {
       read_gt_data_pose(path + "/mav0/mocap0/");
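The patched `read` above establishes a priority order among ground-truth sources, with `load_mocap_as_gt` forcing the raw mocap data. That selection logic can be isolated as a small pure function (names are hypothetical stand-ins; `existing` plays the role of the `file_exists` checks):

```cpp
#include <set>
#include <string>

// Pick the ground-truth folder the same way the patched EurocIO::read does:
// prefer state_groundtruth_estimate0, then the aligned gt folder, then raw
// mocap0 -- unless load_mocap_as_gt forces the raw mocap data.
inline std::string pick_gt_source(const std::set<std::string> &existing,
                                  bool load_mocap_as_gt) {
  if (!load_mocap_as_gt && existing.count("state_groundtruth_estimate0"))
    return "state_groundtruth_estimate0";
  if (!load_mocap_as_gt && existing.count("gt")) return "gt";
  if (existing.count("mocap0")) return "mocap0";
  return "";  // no ground truth available
}
```

This is why the time-alignment tool passes `load_mocap_as_gt = true`: it needs the unaligned `mocap0` poses even when an aligned `gt` folder already exists.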
@@ -304,6 +305,7 @@ class EurocIO : public DatasetIoInterface {
   }

   std::shared_ptr<EurocVioDataset> data;
+  bool load_mocap_as_gt;
 };  // namespace basalt

 }  // namespace basalt
@@ -229,6 +229,8 @@ void CamCalib::computeVign() {
   ve.save_vign_png(cache_path);

   calib_opt->setVignette(ve.get_vign_param());
+
+  std::cout << "Saved vignette png files to " << cache_path << std::endl;
 }

 void CamCalib::setNumCameras(size_t n) {
@@ -42,10 +42,10 @@ OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
 namespace basalt {

 DatasetIoInterfacePtr DatasetIoFactory::getDatasetIo(
-    const std::string &dataset_type) {
+    const std::string &dataset_type, bool load_mocap_as_gt) {
   if (dataset_type == "euroc") {
     // return DatasetIoInterfacePtr();
-    return DatasetIoInterfacePtr(new EurocIO);
+    return DatasetIoInterfacePtr(new EurocIO(load_mocap_as_gt));
   } else if (dataset_type == "bag") {
     return DatasetIoInterfacePtr(new RosbagIO);
   } else if (dataset_type == "uzh") {
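The change above threads a flag through a string-keyed factory, forwarding it only to the one reader that understands it. A self-contained sketch of the same pattern (all types here are hypothetical stand-ins, not Basalt's classes):

```cpp
#include <memory>
#include <string>

struct Reader {
  virtual ~Reader() = default;
  virtual bool mocap_as_gt() const { return false; }
};

// Only this reader cares about the flag, mirroring how EurocIO gained a
// load_mocap_as_gt constructor argument while RosbagIO did not.
struct EurocReader : Reader {
  explicit EurocReader(bool load_mocap_as_gt) : flag(load_mocap_as_gt) {}
  bool mocap_as_gt() const override { return flag; }
  bool flag;
};

struct BagReader : Reader {};

// Factory with a defaulted flag, so existing call sites stay valid.
inline std::shared_ptr<Reader> make_reader(const std::string &type,
                                           bool load_mocap_as_gt = false) {
  if (type == "euroc") return std::make_shared<EurocReader>(load_mocap_as_gt);
  if (type == "bag") return std::make_shared<BagReader>();
  return nullptr;  // unknown dataset type
}
```

Defaulting the new parameter to `false` is what keeps the other `getDatasetIo(dataset_type)` call sites compiling unchanged.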
@@ -1,3 +1,37 @@
+/**
+BSD 3-Clause License
+
+This file is part of the Basalt project.
+https://gitlab.com/VladyslavUsenko/basalt.git
+
+Copyright (c) 2019, Vladyslav Usenko, Michael Loipführer and Nikolaus Demmel.
+All rights reserved.
+
+Redistribution and use in source and binary forms, with or without
+modification, are permitted provided that the following conditions are met:
+
+* Redistributions of source code must retain the above copyright notice, this
+  list of conditions and the following disclaimer.
+
+* Redistributions in binary form must reproduce the above copyright notice,
+  this list of conditions and the following disclaimer in the documentation
+  and/or other materials provided with the distribution.
+
+* Neither the name of the copyright holder nor the names of its
+  contributors may be used to endorse or promote products derived from
+  this software without specific prior written permission.
+
+THIS SOFTWARE IS PROVIDED BY THE COPYRIGHT HOLDERS AND CONTRIBUTORS "AS IS"
+AND ANY EXPRESS OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE
+IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE
+DISCLAIMED. IN NO EVENT SHALL THE COPYRIGHT HOLDER OR CONTRIBUTORS BE LIABLE
+FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL
+DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR
+SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER
+CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY,
+OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE
+OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.
+*/
+
 #include <pangolin/display/image_view.h>
 #include <pangolin/gl/gldraw.h>
@@ -66,8 +100,6 @@ int main(int argc, char **argv) {
   std::string output_gyro_path;
   std::string output_mocap_path;

-  bool save_new_gt = false;
-
   double max_offset_s = 10.0;

   bool show_gui = true;
@@ -97,8 +129,6 @@ int main(int argc, char **argv) {
                  "Maximum offset for a grid search in seconds.");

   app.add_flag("--show-gui", show_gui, "Show GUI for debugging");
-  app.add_flag("--save-gt", save_new_gt,
-               "Save time aligned data to mav0/gt/ folder");

   try {
     app.parse(argc, argv);
@@ -106,6 +136,10 @@ int main(int argc, char **argv) {
     return app.exit(e);
   }

+  if (!dataset_path.empty() && dataset_path[dataset_path.length() - 1] != '/') {
+    dataset_path += '/';
+  }
+
   basalt::VioDatasetPtr vio_dataset;

   const bool use_calib =
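The added trailing-slash normalization exists because later code concatenates paths like `dataset_path + "mav0/gt/data.csv"`. A minimal sketch of the same check as a standalone helper (the function name is hypothetical; the patch inlines this in `main`):

```cpp
#include <string>

// Ensure the path ends with exactly one '/' so that string concatenation
// of sub-paths works whether or not the user typed the trailing slash.
inline std::string with_trailing_slash(std::string path) {
  if (!path.empty() && path.back() != '/') path += '/';
  return path;
}
```

Without this, `--dataset-path ~/t265_calib_data/sequence0` (no slash) would produce paths like `sequence0mav0/gt/data.csv` and the disabled-button check below would silently fail.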
@@ -137,7 +171,7 @@ int main(int argc, char **argv) {
   }

   basalt::DatasetIoInterfacePtr dataset_io =
-      basalt::DatasetIoFactory::getDatasetIo(dataset_type);
+      basalt::DatasetIoFactory::getDatasetIo(dataset_type, true);

   dataset_io->read(dataset_path);
   vio_dataset = dataset_io->get_data();
@@ -355,30 +389,6 @@ int main(int argc, char **argv) {
     }
   }

-  if (save_new_gt) {
-    fs::create_directory(dataset_path + "/mav0/gt/");
-
-    std::ofstream f(dataset_path + "/mav0/gt/data.csv");
-
-    for (size_t i = 0; i < vio_dataset->get_gt_timestamps().size(); i++) {
-      const int64_t corrected_time = vio_dataset->get_gt_timestamps()[i] +
-                                     vio_dataset->get_mocap_to_imu_offset_ns() +
-                                     best_offset_refined_ns;
-
-      if (corrected_time >= min_time && corrected_time <= max_time) {
-        const Sophus::SE3d &p = vio_dataset->get_gt_pose_data()[i];
-
-        f << corrected_time << ',' << p.translation().x() << ','
-          << p.translation().y() << ',' << p.translation().z() << ','
-          << p.unit_quaternion().w() << ',' << p.unit_quaternion().x() << ','
-          << p.unit_quaternion().y() << ',' << p.unit_quaternion().z()
-          << std::endl;
-      }
-    }
-
-    f.close();
-  }
-
  if (show_gui) {
    static constexpr int UI_WIDTH = 280;
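The removed `--save-gt` block wrote one EuRoC-style CSV row per pose: nanosecond timestamp, position x/y/z, then orientation quaternion w/x/y/z (the same format the GUI's `save_aligned_dataset` button now produces). The row formatting in isolation, with plain doubles as a hypothetical stand-in for `Sophus::SE3d`:

```cpp
#include <cstdint>
#include <sstream>
#include <string>

// Format a single ground-truth row: timestamp_ns, translation xyz,
// unit quaternion in w,x,y,z order -- matching the removed f << ... chain.
inline std::string gt_csv_row(std::int64_t t_ns, double px, double py,
                              double pz, double qw, double qx, double qy,
                              double qz) {
  std::ostringstream os;
  os << t_ns << ',' << px << ',' << py << ',' << pz << ',' << qw << ',' << qx
     << ',' << qy << ',' << qz;
  return os.str();
}
```

Note the w-first quaternion order; mixing it up with x-first conventions is a classic source of silently wrong ground truth.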
@@ -404,11 +414,18 @@ int main(int argc, char **argv) {

     pangolin::Var<bool> show_error("ui.show_error", false, false, true);

+    std::string save_button_name = "ui.save_aligned_dataset";
+    // Disable save_aligned_dataset button if GT data already exists
+    if (fs::exists(fs::path(dataset_path + "mav0/gt/data.csv"))) {
+      save_button_name += "(disabled)";
+    }
+
     pangolin::Var<std::function<void(void)>> save_aligned_dataset(
-        "ui.save_aligned_dataset", [&]() {
+        save_button_name, [&]() {
           if (fs::exists(fs::path(dataset_path + "mav0/gt/data.csv"))) {
-            std::cout << "Aligned grount truth data already exists, skipping."
-                      << std::endl;
+            std::cout << "Aligned ground-truth data already exists, skipping. "
+                         "If you want to run the calibration again delete "
+                      << dataset_path << "mav0/gt/ folder." << std::endl;
             return;
           }
           std::cout << "Saving aligned dataset in "
@@ -1 +1 @@
-Subproject commit ee71c0172400419853f675d385ee7e06242facaf
+Subproject commit ff561d2369d8dd159ff1e7316e451bec41544ebe