Lower performance on custom synthetic dataset #401

@jorou125

Description

Hi, thank you for sharing your work and providing support. I was able to run FoundationPose on the provided demo data and to reproduce the performance reported in the paper on the YCBInEOAT dataset.

I am now trying to run FoundationPose on a custom dataset. To do so, I am generating simple synthetic scenes in Blender using the YCB models. When running FoundationPose on the generated scenes, I get much lower performance: an average AUC (ADD) of 73%, compared to the 96% obtained on the provided demo data. This decline corresponds to translation errors of a few centimetres rather than a few millimetres. I used the same object models as the demo data, so I doubt the models themselves are the problem. Here is my synthetic scene and my debug folder:

https://drive.google.com/drive/folders/1f9UaLt_BMojAE-Xi5PyJ_GPHzoQRBWoW?usp=sharing

I would greatly appreciate it if you could take the time to look at the debug folder and see what I have missed that causes this significant decline in performance. Here are the things I have checked that don't seem to be the problem:

  • The texture of the object is correctly applied.
  • The scale of the object is correct.
  • Camera intrinsics are on the correct scale.
  • I've tried with different objects.
  • I have checked the quality of the depth data; it seems correct, but that is the part I am the least certain about.
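Since the depth data is the part I am least certain about, here is a minimal sanity check I could run (a sketch only: it assumes the depth map is already loaded as a float array and uses hypothetical intrinsics values). It back-projects the depth map into a point cloud so the scale can be compared against the known object size; if the depth were accidentally in millimetres instead of metres, the translation error would be off by exactly this kind of magnitude:

```python
import numpy as np

def backproject(depth, K):
    """Back-project a depth map (assumed metres) into an Nx3 point cloud
    in the camera frame using pinhole intrinsics K."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - K[0, 2]) * z / K[0, 0]
    y = (v - K[1, 2]) * z / K[1, 1]
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]  # drop invalid (zero-depth) pixels

# Toy example with made-up intrinsics: a flat surface 0.5 m from the camera.
K = np.array([[600.0,   0.0, 320.0],
              [  0.0, 600.0, 240.0],
              [  0.0,   0.0,   1.0]])
depth = np.full((480, 640), 0.5)
pts = backproject(depth, K)

# If the depth were mistakenly in millimetres, the median z here would be
# ~500 rather than ~0.5 -- a common cause of cm-scale translation errors.
print(round(float(np.median(pts[:, 2])), 3))
```

On real data, one would load the rendered depth PNG, apply the dataset's depth scale factor, and check that the median z of the object region matches the known camera-to-object distance in the Blender scene.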

I hope you can help me figure out the cause of this decline in performance.

Have a good day!
