PanoAffordanceNet: Towards Holistic Affordance Grounding in 360° Indoor Environments

Guoliang Zhu1, Wanjun Jia1, Caoyang Shao1, Yuheng Zhang1, Zhiyong Li1,2, Kailun Yang1,2,†

1 School of Artificial Intelligence and Robotics, Hunan University, China
2 National Engineering Research Center of Robot Visual Perception and Control Technology, Hunan University, China
† Corresponding author: kailun.yang@hnu.edu.cn

🖼️ Teaser

PanoAffordanceNet Teaser

📖 Introduction

This work initiates the study of Holistic Affordance Grounding in 360° Indoor Environments. Embodied agents require global awareness of their 360° action space, yet current affordance research remains limited to object-centric, perspective views. To bridge this gap, we introduce a new task of holistic affordance grounding in 360° indoor environments, shifting the paradigm from isolated object-level understanding toward holistic scene-level reasoning. We further propose PanoAffordanceNet as a solid baseline for scene-level perception in embodied intelligence.


🚀 Coming Soon

  • Release the 360-AGD dataset.
  • Release PanoAffordanceNet model architecture and training code.

✉️ Contact

For any inquiries or potential collaborations, please open an issue or contact: zhuzhuxia@hnu.edu.cn
