GarmentPile:
Point-Level Visual Affordance Guided Retrieval and Adaptation for Cluttered Garments Manipulation
Garment-Pile Simulation Scene
Our project is built upon Isaac Sim 2023.1.1. Please refer to the official guide to download it.
After downloading, please move the folder into the path '~/.local/share/ov/pkg/' and rename it to 'isaac-sim-2023.1.1' so that it matches the path configuration of this repo.
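The move-and-rename step above can be sketched as the commands below. The source path is an assumption (adjust it to wherever your Isaac Sim download actually landed); only the destination path comes from this README.

```shell
# SRC is an assumed download location -- change it to match your system.
SRC="$HOME/Downloads/isaac_sim-2023.1.1"
# DST is the path this repo's configuration expects.
DST="$HOME/.local/share/ov/pkg/isaac-sim-2023.1.1"

mkdir -p "$(dirname "$DST")"
# Move only if the download is actually there; otherwise remind the user.
[ -d "$SRC" ] && mv "$SRC" "$DST" || echo "adjust SRC to your download path"
```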
Some modifications need to be made to Isaac Sim's meta-file. Please refer to this document.
- Clone the repo first.

```bash
git clone https://github.com/AlwaySleepy/Garment-Pile.git
```

- Download Garment Assets
Here we use garment assets from GarmentLab. Please refer to the Google_Drive_link to download the 'Garment' folder and unzip it into 'Assets/'.
- Isaac Sim Env Preparation
For convenience, we recommend creating an alias for the python.sh file in Isaac Sim 2023.1.1.
```bash
# 1. open the .bashrc file
sudo vim ~/.bashrc
# 2. add the following line to the end of the file
alias isaac_pile=~/.local/share/ov/pkg/isaac-sim-2023.1.1/python.sh
# 3. save the file and exit.
# 4. refresh for the configuration to take effect.
source ~/.bashrc
```

Install the necessary packages into the Isaac Sim env.
```bash
isaac_pile -m pip install termcolor plyfile
```

- Model Training Env Preparation
Create a new conda environment.

```bash
conda create -n garmentpile python=3.10
```

Install the necessary packages into the model training env.
```bash
conda activate garmentpile
# CUDA version should be 11.8 or lower, not 12.X
pip install torch==2.1.2 torchvision==0.16.2 torchaudio==2.1.2 --index-url https://download.pytorch.org/whl/cu118
pip install -r requirements.txt
```

```
📁 ProjectRoot
├── 📁 .vscode                # VS Code configuration files
├── 📁 Assets                 # Assets used in Isaac Sim
├── 📁 Env_Config             # Isaac Sim env configuration, including camera, robot, garment, etc.
├── 📁 Env_Data_Collection    # Used for training data collection
├── 📁 Env_Eval               # Standalone environment with pre-trained model
├── 📁 Env_Finetune           # Used for finetuning the model
├── 📁 Model_Train            # Model training code
└── 📁 Repo_Image             # Repo images
```

In our project, we provide three garment-pile scenes: washing machine, sofa, and basket.
You can directly run the three environments via the files in the 'Env_Eval' folder.
The retrieve, pick, and place procedures all rely on the pre-trained model.
[ATTENTION!] If assets fail to load in simulation, please open "Env_Config/Config/xx_config.py" and check the asset loading paths.
```bash
# washmachine
isaac_pile Env_Eval/washmachine.py
# sofa
isaac_pile Env_Eval/sofa.py
# basket
isaac_pile Env_Eval/basket.py
```

Run the following commands to generate retrieval data:
```bash
# washmachine
bash Env_Data_Collection/auto_washmachine_retrieve.sh
# sofa
bash Env_Data_Collection/auto_sofa_retrieve.sh
# basket
bash Env_Data_Collection/auto_basket_retrieve.sh
```

Run the following commands to generate stir data:
```bash
# washmachine
bash Env_Data_Collection/auto_washmachine_stir.sh
# sofa
bash Env_Data_Collection/auto_sofa_stir.sh
# basket
bash Env_Data_Collection/auto_basket_stir.sh
```

There are some flags (such as rgb_flag, random_flag, etc.) you can set manually in the .sh files. Please check the .sh files for more information.
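For orientation, the sketch below illustrates how such flags are typically wired into a collection loop. Only the names rgb_flag and random_flag come from this README; the episode loop, values, and argument wiring are assumptions, not the repo's actual scripts.

```shell
# Hypothetical sketch of a data-collection loop. Only rgb_flag / random_flag
# are names mentioned in this README -- everything else is illustrative.
rgb_flag=1      # assumed: whether to save RGB observations
random_flag=0   # assumed: whether to randomize garment placement
num_episodes=3  # assumed: how many episodes to collect

for i in $(seq 1 "$num_episodes"); do
    echo "episode $i: rgb_flag=$rgb_flag random_flag=$random_flag"
    # A real script would invoke something like:
    # isaac_pile Env_Data_Collection/<scene>_retrieve.py ...
done
```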
Training data are all collected in the 'Data' folder.
```bash
# activate the conda env
conda activate garmentpile
# run any .py file in the 'Model_Train' folder; remember to log in to wandb first.
# e.g.
python Model_Train/WM_Model_train.py
```

We provide the washmachine place-model finetuning code as an example in the 'Env_Finetune' folder.
You can run the .sh file directly to see the finetuning procedure.
If you find this paper useful, please consider starring ⭐ this repo and citing 📝 our paper:
```bibtex
@InProceedings{Wu_2025_CVPR,
    author    = {Wu, Ruihai and Zhu, Ziyu and Wang, Yuran and Chen, Yue and Wang, Jiarui and Dong, Hao},
    title     = {Point-Level Visual Affordance Guided Retrieval and Adaptation for Cluttered Garments Manipulation},
    booktitle = {Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
    year      = {2025},
}
```


