# 3EED: Ground Everything Everywhere in 3D — Dataset Card
A cross-platform, multi-modal 3D visual grounding dataset spanning vehicle, drone, and quadruped platforms, with synchronized RGB, LiDAR, and language annotations. This page documents how to obtain and organize the dataset from HuggingFace and how to connect it with the training/evaluation code in the 3EED repository.
## 1. What’s Included

- Platforms: `vehicle`, `drone`, `quad` (quadruped)
- Modalities: LiDAR point clouds, RGB images, language referring expressions, metadata
- Splits: train/val files per platform under `splits/`
- Task: 3D visual grounding (language → 3D box)
## 2. Download

You can download the dataset in any of the following ways.

**HuggingFace CLI:**

```bash
pip install -U "huggingface_hub[cli]"
huggingface-cli download 3EED/3EED --repo-type dataset --local-dir ./3eed_dataset
```

**Python:**

```python
from huggingface_hub import snapshot_download

snapshot_download(repo_id="3EED/3EED", repo_type="dataset", local_dir="./3eed_dataset")
```

**Git (LFS):**

```bash
git lfs install
git clone https://huggingface.co/datasets/3EED/3EED 3eed_dataset
```
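
If you only need a subset (for example, a single platform), `snapshot_download` also accepts glob filters. A minimal sketch, assuming the folder names match the layout shown in Section 3:

```python
from huggingface_hub import snapshot_download

# Fetch only the drone platform plus the split files; the folder names
# ("drone/", "splits/") are taken from the directory layout in Section 3.
snapshot_download(
    repo_id="3EED/3EED",
    repo_type="dataset",
    local_dir="./3eed_dataset",
    allow_patterns=["drone/*", "splits/*"],
)
```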
## 3. Directory Structure

Place or verify the files under `data/3eed/` in your project. A minimal expected layout (paths shown relative to the repo root):
```
data/3eed/
├── drone/                  # Drone platform data
│   ├── scene-0001/
│   │   ├── 0000_0/
│   │   │   ├── image.jpg
│   │   │   ├── lidar.bin
│   │   │   └── meta_info.json
│   │   └── ...
│   └── ...
├── quad/                   # Quadruped platform data
│   ├── scene-0001/
│   └── ...
├── waymo/                  # Vehicle platform data
│   ├── scene-0001/
│   └── ...
└── splits/                 # Train/val split files
    ├── drone_train.txt
    ├── drone_val.txt
    ├── quad_train.txt
    ├── quad_val.txt
    ├── waymo_train.txt
    └── waymo_val.txt
```
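
Before training, it can be worth confirming the layout programmatically. A minimal sketch that only checks for the folders and split files shown above:

```python
from pathlib import Path

ROOT = Path("data/3eed")

# Check the platform folders and split files shown in the tree above.
for platform in ("drone", "quad", "waymo"):
    assert (ROOT / platform).is_dir(), f"missing platform folder: {platform}/"
    for split in ("train", "val"):
        split_file = ROOT / "splits" / f"{platform}_{split}.txt"
        assert split_file.is_file(), f"missing split file: {split_file}"

print("3EED layout looks complete.")
```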
## 4. Connect to the Codebase

Clone the code repository:

```bash
git clone https://github.com/iris0329/3eed
cd 3eed
```

Link (or copy) the downloaded dataset to `data/3eed/`. Note that the symlink target is resolved relative to `data/`, so this assumes `3eed_dataset/` sits next to the code repository:

```bash
mkdir -p data
ln -s ../3eed_dataset data/3eed
```
Now you can follow the Installation, Custom CUDA Operators, Training, and Evaluation sections in the GitHub README.

Train on all platforms:

```bash
bash scripts/train_3eed.sh
```

Train on a single platform:

```bash
bash scripts/train_waymo.sh
bash scripts/train_drone.sh
bash scripts/train_quad.sh
```

Evaluate:

```bash
bash scripts/val_3eed.sh
bash scripts/val_waymo.sh
bash scripts/val_drone.sh
bash scripts/val_quad.sh
```

Remember to set the correct `--checkpoint_path` inside the evaluation scripts.
## 5. Data Splits

We provide official splits under `data/3eed/splits/`:

- `*_train.txt`: training scene/frame indices for each platform
- `*_val.txt`: validation scene/frame indices for each platform
Please keep these files unchanged for fair comparison with the baselines and reported results.
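
A minimal sketch of iterating over a split. It assumes each line of a split file names a frame directory relative to its platform folder (e.g., `scene-0001/0000_0`, matching the tree in Section 3) and that `lidar.bin` stores raw float32 values; confirm both against the dataloader in the code repository:

```python
import json
from pathlib import Path

import numpy as np

ROOT = Path("data/3eed")

# ASSUMPTION: each split line names a frame directory such as
# "scene-0001/0000_0"; verify against the 3eed repo's dataloader.
for line in (ROOT / "splits" / "drone_val.txt").read_text().splitlines():
    frame_dir = ROOT / "drone" / line.strip()
    meta = json.loads((frame_dir / "meta_info.json").read_text())
    # ASSUMPTION: raw float32 storage; the per-point channel layout
    # (x, y, z, ...) should be confirmed from meta_info.json or the code.
    points = np.fromfile(frame_dir / "lidar.bin", dtype=np.float32)
    print(frame_dir.name, points.size)
```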
## 6. Usage Tips

- Storage: the combined LiDAR + RGB data is large; ensure sufficient disk space, and use Git LFS partial clone or the `allow_patterns` filter from Section 2 if you only need a subset.
- I/O throughput: for faster training and evaluation, place frequently used scenes on a fast local SSD or use caching (see the staging sketch after this list).
- Reproducibility: use the exact environment files and scripts from the code repo; the provided scripts control whether you train on the union of all platforms or on a single platform.
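
As one way to follow the I/O tip above, here is a minimal staging sketch; both paths are hypothetical placeholders for your own storage layout:

```python
import shutil
from pathlib import Path

SRC = Path("/mnt/shared/3eed_dataset")  # hypothetical slow network storage
DST = Path("/local_ssd/3eed")           # hypothetical fast local SSD

# Stage one platform plus the split files onto the local SSD; afterwards,
# point data/3eed at DST (e.g., ln -s /local_ssd/3eed data/3eed).
for sub in ("drone", "splits"):
    shutil.copytree(SRC / sub, DST / sub, dirs_exist_ok=True)
```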
## 7. License
- Dataset license: Apache-2.0 (see the header of this page).
- The code repository uses Apache-2.0; refer to the LICENSE in the GitHub repo.
If you plan to use, redistribute, or modify the dataset, please review the dataset license and any upstream source licenses (e.g., Waymo Open Dataset, M3ED).
## 8. Citation

If you find 3EED helpful, please cite:

```bibtex
@inproceedings{li2025_3eed,
  title     = {3EED: Ground Everything Everywhere in 3D},
  author    = {Rong Li and Yuhao Dong and Tianshuai Hu and Ao Liang and
               Youquan Liu and Dongyue Lu and Liang Pan and Lingdong Kong and
               Junwei Liang and Ziwei Liu},
  booktitle = {Advances in Neural Information Processing Systems (NeurIPS)
               Datasets and Benchmarks Track},
  year      = {2025}
}
```
## 9. Acknowledgements

We acknowledge the following upstream sources that make this dataset possible:
- Waymo Open Dataset (vehicle platform)
- M3ED (drone and quadruped platforms)
For baseline implementations and evaluation code, please refer to the GitHub repository.