[Libav-user] 970530 - how to decode by GPU in ffmpeg
hamidi at gmail.com
Tue Aug 21 11:16:26 EEST 2018
Application: WinForms application
We're using FFmpeg to decode frames sent to us over the network. The program is
written in C# and uses FFmpegInvoke together with the required DLLs
(avcodec-56.dll, avutil-54.dll) to decode the frames. We need to use the
GPU instead of the CPU for this purpose. Right now decoding runs on the CPU
without any problem.

My question is: how can I tell FFmpeg to use the GPU instead of the CPU for
decoding? Is there any sample code for this purpose?