[Libav-user] 970530 - how to decode by GPU in ffmpeg

hamidi hamidi at gmail.com
Tue Aug 21 11:16:26 EEST 2018


platform: Windows
application: WinForm application
language: C#
decoder: ffmpeg

Hi
We're using ffmpeg to decode frames that are sent to us over the network. The
program is written in C# and uses FFmpegInvoke together with the required
DLLs, such as avcodec-56.dll and avutil-54.dll, to decode the frames.
Currently decoding is done on the CPU and works without any problem, but we
need to use the GPU instead.
My question is: how can I tell ffmpeg to use the GPU instead of the CPU for
decoding?
Is there any sample code for this purpose?
Thanks
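
For reference, a minimal sketch of the GPU decoding path in the FFmpeg C API
is shown below; the same calls are exposed to C# through the FFmpegInvoke
bindings mentioned above. It assumes FFmpeg 3.2 or newer, i.e. newer
libraries than the avcodec-56.dll/avutil-54.dll builds mentioned here, since
the hardware device context API (av_hwdevice_ctx_create) does not exist in
those versions. With recent FFmpeg, attaching hw_device_ctx to the codec
context is usually enough for the decoder to pick a hardware path where one
is available; older releases additionally need a custom get_format callback.

/*
 * Sketch: hardware-accelerated decoding via the FFmpeg C API.
 * Assumes FFmpeg 3.2+ (hw device context API); not available in
 * avcodec-56/avutil-54.
 */
#include <libavcodec/avcodec.h>
#include <libavutil/hwcontext.h>

static AVBufferRef *hw_device_ctx = NULL;

/* Open a decoder bound to a hardware device
 * (e.g. DXVA2 or D3D11VA on Windows). */
static AVCodecContext *open_hw_decoder(enum AVCodecID codec_id,
                                       enum AVHWDeviceType hw_type)
{
    const AVCodec *codec = avcodec_find_decoder(codec_id);
    if (!codec)
        return NULL;

    AVCodecContext *ctx = avcodec_alloc_context3(codec);
    if (!ctx)
        return NULL;

    /* Create the hardware device and attach it to the codec context. */
    if (av_hwdevice_ctx_create(&hw_device_ctx, hw_type, NULL, NULL, 0) < 0) {
        avcodec_free_context(&ctx);
        return NULL;
    }
    ctx->hw_device_ctx = av_buffer_ref(hw_device_ctx);

    if (avcodec_open2(ctx, codec, NULL) < 0) {
        avcodec_free_context(&ctx);
        return NULL;
    }
    return ctx;
}

/* Decode one packet; the frame comes back in GPU memory and is
 * downloaded into a software frame for further processing. */
static int decode_packet(AVCodecContext *ctx, AVPacket *pkt, AVFrame *sw_frame)
{
    AVFrame *hw_frame = av_frame_alloc();
    int ret;

    if (!hw_frame)
        return AVERROR(ENOMEM);

    ret = avcodec_send_packet(ctx, pkt);
    if (ret < 0)
        goto end;

    ret = avcodec_receive_frame(ctx, hw_frame);
    if (ret < 0)
        goto end;

    /* Copy the decoded surface from the GPU into system memory. */
    ret = av_hwframe_transfer_data(sw_frame, hw_frame, 0);

end:
    av_frame_free(&hw_frame);
    return ret;
}

On Windows the device type would typically be AV_HWDEVICE_TYPE_DXVA2 or
AV_HWDEVICE_TYPE_D3D11VA. If av_hwdevice_ctx_create fails (no suitable GPU or
driver), the existing software decoding path can be kept as a fallback.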
