Differentiable Composite Neural Signed Distance Fields for Robot Navigation in Dynamic Indoor Environments (ICRA'25)
git clone --recursive https://github.com/stalhabukhari/comp-sdf-dyn-nav.git
conda create -n sdf-nav python=3.9
conda install pytorch==1.13.1 torchvision==0.14.1 pytorch-cuda=11.7 -c pytorch -c nvidia
pip install -r requirements.txt
# yolov5
pip install -r yolov5/requirements.txt
# CUDA-based farthest point sampling (optional)
pip install Pointnet2_PyTorch/pointnet2_ops_lib/
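A quick optional sanity check (not part of the repository) to confirm the installed PyTorch build can see the GPU:
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"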
Configure the iGibson environment: https://github.com/stalhabukhari/iGibson
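A rough sketch of the source install, assuming the fork follows upstream iGibson's procedure (defer to that repository's README for the authoritative steps):
git clone --recursive https://github.com/stalhabukhari/iGibson.git
pip install -e iGibson/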
Train SDF models using the following repositories (an illustrative sketch of composing the two fields follows this list):
- Object-level SDF (DeepSDF): https://github.com/stalhabukhari/DeepSDF
- Scene-level SDF (iSDF): https://github.com/stalhabukhari/iSDF
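For intuition only, the sketch below shows how object-level and scene-level SDF predictions can be merged into a single differentiable field via a pointwise minimum. The networks, poses, and shapes here are hypothetical stand-ins, not the repository's actual API; trained models come from the DeepSDF/iSDF pipelines above.

import torch
import torch.nn as nn

# Stand-in SDF networks (hypothetical; real ones come from DeepSDF / iSDF training).
# Each maps (N, 3) query points in its own frame to (N,) signed distances.
scene_sdf = nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1))
object_sdfs = [nn.Sequential(nn.Linear(3, 64), nn.ReLU(), nn.Linear(64, 1))]
object_poses = [torch.eye(4)]  # world-from-object transforms (hypothetical)

def composite_sdf(points_world: torch.Tensor) -> torch.Tensor:
    """Differentiable composite SDF: pointwise min over scene and object fields."""
    dists = [scene_sdf(points_world).squeeze(-1)]
    for sdf, T_wo in zip(object_sdfs, object_poses):
        # Map world-frame queries into the object's canonical frame.
        T_ow = torch.linalg.inv(T_wo)
        pts_obj = points_world @ T_ow[:3, :3].T + T_ow[:3, 3]
        dists.append(sdf(pts_obj).squeeze(-1))
    # The minimum keeps the nearest surface and remains differentiable a.e.
    return torch.stack(dists, dim=0).min(dim=0).values

pts = torch.rand(8, 3, requires_grad=True)
d = composite_sdf(pts)
d.sum().backward()                 # gradients w.r.t. query points via autograd
print(d.shape, pts.grad.shape)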
Pretrained models are available via the Google Drive link at the top.
Simulations can be executed via: python sim_<method>.py --cfg <path-to-config>
Examples:
CFG=configs/robs-cfgs/robs.yaml
# dual mode
python sim_dual_mode.py --cfg $CFG --dynamic
# robot sdf
python sim_robot_sdf.py --cfg $CFG --dynamic
# scene sdf
python sim_scene_sdf.py --cfg $CFG --dynamic
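As a conceptual aside (not the repository's planner), a differentiable composite SDF lets a robot descend a collision cost by gradient steps. A minimal sketch, reusing the hypothetical composite_sdf stand-in from the earlier snippet:

import torch

def avoidance_gradient(robot_pts: torch.Tensor, margin: float = 0.15) -> torch.Tensor:
    """Push robot body points away from surfaces closer than a safety margin."""
    pts = robot_pts.detach().requires_grad_(True)
    d = composite_sdf(pts)                           # signed distances at body points
    cost = torch.clamp(margin - d, min=0.0).sum()    # penalize points inside the margin
    cost.backward()
    return -pts.grad                                 # direction of decreasing collision cost

robot_pts = torch.rand(16, 3)                        # sampled robot-surface points (hypothetical)
robot_pts = robot_pts + 0.05 * avoidance_gradient(robot_pts)   # one illustrative update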
@inproceedings{bukhari25icra,
  title={Differentiable Composite Neural Signed Distance Fields for Robot Navigation in Dynamic Indoor Environments},
  author={Bukhari, S. Talha and Lawson, Daniel and Qureshi, Ahmed H.},
  booktitle={2025 International Conference on Robotics and Automation (ICRA)},
  year={2025},
  organization={IEEE}
}
We thank the authors of the following repositories, from which we adapt code:
- https://github.com/facebookresearch/iSDF
- https://github.com/facebookresearch/DeepSDF
- https://github.com/ultralytics/yolov5
- https://github.com/erikwijmans/Pointnet2_PyTorch
Code is released under the MIT License. See the LICENSE file for more details.