[FFmpeg-devel] WHIP - Webrtc Http Ingest Protocol

Sergio Garcia Murillo sergio.garcia.murillo at gmail.com
Wed Sep 16 14:08:37 EEST 2020

On 11/09/2020 22:51, Kieran Kunhya wrote:
>> Because when you have a WebRTC server/service, you need WebRTC ingest.
>> You may be surprised by the amount of mixed RTC/streaming services that
>> are being implemented as we speak since COVID.
> Why is that the case?

Mainly because if you already have WebRTC implemented in your server for 
receiving and delivering media from and to web browsers, it is much 
easier to implement and maintain a WebRTC ingest than to add SRT. 
Basically, you don't have to build anything new, nor maintain two 
different input paths with their own bugs, monitoring, stats, etc.

Also, WebRTC offers some end-to-end capabilities that are very difficult 
(if at all possible) to preserve if you mix protocols. I don't want to 
get into a protocol comparison, but keeping good quality at the lowest 
delay is not feasible if you mix protocols.

That's without taking into account the extra CPU cost of protocol 
conversion (it may be minimal, but when you handle hundreds of 
connections on a server, everything adds up). With WebRTC you just need 
to forward RTP packets on the server with minimal state. If we have to 
transcode (for example AAC to Opus coming from RTMP), things get much 
worse.
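To illustrate the "forward with minimal state" point: an SFU-style relay 
only has to read the 12-byte fixed RTP header (RFC 3550) and, at most, 
rewrite the SSRC for the outgoing stream; the payload is untouched. A 
minimal sketch (not from any patch, names are mine):

```python
import struct

def parse_rtp_header(packet: bytes) -> dict:
    """Parse the 12-byte fixed RTP header (RFC 3550)."""
    if len(packet) < 12:
        raise ValueError("packet too short for an RTP header")
    b0, b1, seq, ts, ssrc = struct.unpack("!BBHII", packet[:12])
    return {
        "version": b0 >> 6,            # always 2 for RTP
        "padding": bool(b0 & 0x20),
        "marker": bool(b1 & 0x80),
        "payload_type": b1 & 0x7F,
        "sequence": seq,
        "timestamp": ts,
        "ssrc": ssrc,
    }

def forward(packet: bytes, new_ssrc: int) -> bytes:
    """Relay a packet, rewriting only the SSRC (bytes 8..11) --
    roughly the only per-stream state a simple forwarder touches."""
    return packet[:8] + struct.pack("!I", new_ssrc) + packet[12:]
```

No decode, no re-encode, no jitter buffer on the media path: that is 
where the CPU saving over an AAC-to-Opus transcode comes from.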

Again, this might not be useful for the non-WebRTC community, but I can 
tell you that it would help A LOT the ones that are doing WebRTC.

Best regards

