
How can I verify that torchcodec is using the GPU/NVDEC? #672


Closed
FredrikNoren opened this issue May 8, 2025 · 3 comments

Comments

@FredrikNoren

I saw the ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i test/resources/nasa_13013.mp4 -f null - command in the docs, which I can use to verify that hardware decoding works on the FFmpeg side. Is there any way to verify that it also works on the torchcodec side?

@traversaro
Contributor

Probably not a perfect method, but I typically check for GPU usage via nvidia-smi or nvtop; if there is GPU activity while torchcodec is running, I assume that everything is actually happening on the GPU.
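
For example, a rough way to script that check from Python (just a sketch; it assumes a reasonably recent NVIDIA driver whose nvidia-smi supports the utilization.decoder query field) is to poll NVDEC utilization while decoding:

```python
# Rough sketch: poll NVDEC (video decoder) utilization while torchcodec runs.
# Assumes `nvidia-smi --query-gpu=utilization.decoder` is supported by your
# driver; `nvidia-smi dmon` or nvtop in another terminal work just as well.
import subprocess

def nvdec_utilization() -> str:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=utilization.decoder", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()  # e.g. "34 %" while decoding, "0 %" when idle

print("NVDEC utilization:", nvdec_utilization())
```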

@NicolasHug
Member

nvidia-smi dmon also has some interesting stats.

For TorchCodec-specific usage, you are guaranteed that decoding will happen on the GPU as long as you pass a CUDA device as the device parameter. There is no automatic CPU fallback: if you ask for a CUDA device and decoding somehow can't run on the GPU, you get a loud error.
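
For example, a minimal sketch (based on the VideoDecoder API from the torchcodec docs, using the sample video referenced above) that confirms the decoded frames actually live on the GPU:

```python
# Minimal sketch: decode on CUDA and check which device the frames live on.
# Uses the sample video mentioned above; adjust the path for your setup.
from torchcodec.decoders import VideoDecoder

decoder = VideoDecoder("test/resources/nasa_13013.mp4", device="cuda")
frame = decoder[0]  # decoded frame as a uint8 tensor

print(frame.device)  # expected: cuda:0
assert frame.is_cuda, "frame was not decoded onto the GPU"
```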

Hope this helps!

@FredrikNoren
Author

@NicolasHug Thanks! Yup, that's super helpful.
