Hello
I was able to run the run_demo.py script, and it achieves accuracy similar to that reported in the paper. However, I cannot reach a comparable inference speed. My graphics card is an RTX 4050 Laptop GPU (6 GB VRAM). While I understand that some slowdown is to be expected with less VRAM, I am only getting about 7 FPS. Watching nvidia-smi, I noticed that during the initial pose estimation the GPU runs at full capacity (and pose estimation takes about as long as reported in the paper), but once it switches to pose tracking the GPU is barely used, hovering between 15% and 40% utilization. Could this be the cause of the low inference speed?
This is my nvidia-smi during pose estimation:
This is my nvidia-smi during pose tracking:
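To check whether the bottleneck is GPU compute or host-side overhead, I have been timing the tracking loop with a small helper like the one below (a sketch; `step_fn` stands in for whatever runs one tracking iteration in run_demo.py):

```python
import time

def measure_fps(step_fn, n_frames=100):
    """Run step_fn n_frames times and return the average frames per second."""
    start = time.perf_counter()
    for _ in range(n_frames):
        step_fn()
    elapsed = time.perf_counter() - start
    return n_frames / elapsed

# Example with a dummy step that sleeps ~10 ms per "frame":
fps = measure_fps(lambda: time.sleep(0.01), n_frames=20)
print(f"{fps:.1f} FPS")
```

If per-frame wall time is much larger than the GPU kernel time nvidia-smi suggests, that would point to CPU-side work (preprocessing, data transfer) rather than the GPU itself.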