[FFmpeg-user] ffserver for live streaming guidance request

En Figureo Canal figureo56.com at gmail.com
Mon Jun 15 03:27:31 CEST 2015


Hello everyone

I now have another question, this time regarding ffserver.

I'm currently testing a live stream from a file and from a webcam (just for
testing purposes).
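
(The webcam test uses roughly the command below, pushing into the same
livechannel.ffm feed defined further down; the v4l2 and ALSA device names
are just what I assume on my box and may differ:)

ffmpeg -f v4l2 -framerate 30 -video_size 640x480 -i /dev/video0 \
       -f alsa -i default \
       -vcodec libx264 -acodec libmp3lame http://ip:8090/livechannel.ffm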

To provide different types of streams for different devices, I created a feed
that is used as the source for all the streams I'm testing:

<Feed livechannel.ffm>
        File /root/radio56.ffm
        FileMaxSize 200K
</Feed>
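
(For completeness: the global section of my ffserver.conf is not shown above.
It is essentially the stock sample, roughly as below; the port numbers are
just the ones I happen to use, matching the 8090 in the commands later on:)

HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxHTTPConnections 2000
MaxClients 1000
MaxBandwidth 1000
RTSPPort 5454
CustomLog -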

<Stream channel56.flv>

        Feed livechannel.ffm
        Format flv

        VideoCodec libx264
        VideoFrameRate 30
        VideoBitRate 500
        VideoSize 640x360
#        AVOptionVideo crf 30
        AVOptionVideo preset ultrafast
        VideoBufferSize 10000
        # for more info on crf/preset options, type: x264 --help
        AVOptionVideo flags +global_header

        AudioCodec libmp3lame
        AudioBitRate 96
        AudioChannels 2
        AudioSampleRate 44100
        AVOptionAudio flags +global_header

        NoDefaults

        MaxTime 0

</Stream>

<Stream channel56.sdp>        # Output stream URL definition
    Feed livechannel.ffm
    Format rtp

    # Audio settings
    AudioCodec libmp3lame
    AudioBitRate 96
    AudioChannels 2
    AudioSampleRate 44100

    # Video settings
    VideoCodec libx264
    VideoSize 560x320         # Video resolution
    VideoFrameRate 25         # Video FPS
    AVOptionVideo flags +global_header   # Parameters passed to the encoder
                                          # (same as ffmpeg command-line parameters)
    AVOptionVideo qmin 10
    AVOptionVideo qmax 42
    AVOptionAudio flags +global_header

    VideoBitRate 400          # Video bitrate
</Stream>

<Stream audio56.mp3>
        Feed livechannel.ffm
        Format mp2

        AudioBitRate 128
        AudioChannels 2
        AudioSampleRate 44100
        AudioCodec libmp3lame

        MaxTime 0
        NoVideo
</Stream>
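
(To check each output I plan to play it back with ffplay, along these lines;
the URL forms are my best guess from the ffserver docs, with the HTTP and
RTSP ports taken from the global section sketched above:)

ffplay http://ip:8090/channel56.flv
ffplay rtsp://ip:5454/channel56.sdp
ffplay http://ip:8090/audio56.mp3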

Now, when I stream to the server like this:

ffmpeg -re -i videos/masha-3.flv -vcodec libx264 -pix_fmt yuv422p \
    -acodec libmp3lame -maxrate 600k -bufsize 600k http://ip:8090/livechannel.ffm

I get this output:

Output #0, ffm, to 'http://ip:8090/livechannel.ffm':
  Metadata:
    creation_time   : now
    encoder         : Lavf56.31.100
    Stream #0:0: Audio: mp3 (libmp3lame), 44100 Hz, stereo, s32p, 96 kb/s
    Metadata:
      encoder         : Lavc56.35.101 libmp3lame
    Stream #0:1: Video: h264 (libx264), yuv422p, 640x360, q=-1--1, 500 kb/s, 25 fps, 1000k tbn, 30 tbc
    Metadata:
      encoder         : Lavc56.35.101 libx264
    Stream #0:2: Video: h264 (libx264), yuv422p, 560x320, q=10-42, 400 kb/s, 25 fps, 1000k tbn, 25 tbc
    Metadata:
      encoder         : Lavc56.35.101 libx264
    Stream #0:3: Audio: mp3 (libmp3lame), 44100 Hz, stereo, s32p, 128 kb/s
    Metadata:
      encoder         : Lavc56.35.101 libmp3lame
Stream mapping:
  Stream #0:1 -> #0:0 (mp3 (native) -> mp3 (libmp3lame))
  Stream #0:0 -> #0:1 (h264 (native) -> h264 (libx264))
  Stream #0:0 -> #0:2 (h264 (native) -> h264 (libx264))
  Stream #0:1 -> #0:3 (mp3 (native) -> mp3 (libmp3lame))
Press [q] to stop, [?] for help
frame=   37 fps=8.9 q=29.0 Lq=30.0 size=     184kB time=00:00:01.23 bitrate=1222.2kbits/s dup=18 drop=0

I noticed this:

  Stream #0:1 -> #0:0 (mp3 (native) -> mp3 (libmp3lame))
  Stream #0:0 -> #0:1 (h264 (native) -> h264 (libx264))
  Stream #0:0 -> #0:2 (h264 (native) -> h264 (libx264))
  Stream #0:1 -> #0:3 (mp3 (native) -> mp3 (libmp3lame))

After playing with nginx-rtmp I noticed it doesn't demand specifications for
the feed the way ffserver does (see the snippet below). Do I need to match
the stream and the feed with the same specs?
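
(For comparison, the nginx-rtmp setup I tried is just a bare application
block, roughly this, with no per-stream encoding settings at all:)

rtmp {
    server {
        listen 1935;
        application live {
            live on;
        }
    }
}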

Does ffmpeg send 4 different feeds, one for each stream I specified?

Another question: how can I play the RTSP stream on a website?

Thanks for your time, and sorry if these are too many questions.

Have a great day!

