Playing RTMP streams with ffplay & iOS VideoKit

Hello developers,

This is a new blog post about playing RTMP streams with ffplay & VideoKit. We will show you how to set the parameters of an RTMP stream with ffplay and with our framework VideoKit, using the VKDECODER_OPT_KEY_PASS_THROUGH decoder option that was newly added in version 1.10.

Before starting, please note that we tested our RTMP links with ffplay while writing this article. If you have not heard of ffplay until now, we can briefly say that it is a basic console-based media player built on top of the ffmpeg libraries, and it is also developed by the ffmpeg developers. By the way, all the commands below that begin with ffplay can also be called with the ffmpeg command.

First, let me give a brief explanation of RTMP streaming (from Wikipedia):

Real Time Messaging Protocol (RTMP) was initially a proprietary protocol developed by Macromedia for streaming audio, video and data over the Internet, between a Flash player and a server. Macromedia is now owned by Adobe, which has released an incomplete version of the specification of the protocol for public use.

The RTMP protocol has multiple variations:

The “plain” protocol, which works on top of TCP and uses port number 1935 by default.
RTMPS, which is RTMP over a TLS/SSL connection.
RTMPE, which is RTMP encrypted using Adobe’s own security mechanism. While the details of the implementation are proprietary, the mechanism uses industry-standard cryptography primitives.[1]
RTMPT, which is encapsulated within HTTP requests to traverse firewalls. RTMPT is frequently found utilizing cleartext requests on TCP ports 80 and 443 to bypass most corporate traffic filtering. The encapsulated session may carry plain RTMP, RTMPS, or RTMPE packets within.

RTMP is a TCP-based protocol which maintains persistent connections and allows low-latency communication. To deliver streams smoothly and transmit as much information as possible, it splits streams into fragments, and their size is negotiated dynamically between the client and server, while sometimes it is kept unchanged: the default fragment sizes are 64 bytes for audio data, and 128 bytes for video data and most other data types. Fragments from different streams may then be interleaved, and multiplexed over a single connection. With longer data chunks the protocol thus carries only a one-byte header per fragment, so incurring very little overhead. However, in practice individual fragments are not typically interleaved. Instead, the interleaving and multiplexing is done at the packet level, with RTMP packets across several different active channels being interleaved in such a way as to ensure that each channel meets its bandwidth, latency, and other quality-of-service requirements. Packets interleaved in this fashion are treated as indivisible, and are not interleaved on the fragment level.
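As a rough illustration of the fragmentation described above, the sketch below (in Python, purely illustrative — real RTMP chunking also negotiates chunk sizes and adds per-chunk headers) shows how a message payload splits into fixed-size fragments:

```python
def split_into_fragments(payload: bytes, fragment_size: int = 128) -> list:
    """Split a message payload into fixed-size fragments.
    128 bytes is the RTMP default for video and most other data
    types; audio uses 64 bytes by default."""
    return [payload[i:i + fragment_size]
            for i in range(0, len(payload), fragment_size)]

# A 300-byte video payload becomes fragments of 128, 128 and 44 bytes;
# fragments from different streams can then be interleaved on the wire.
fragments = split_into_fragments(b"\x00" * 300)
```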

For now, we will talk only about the plain RTMP protocol; the other variants need OpenSSL to be supported, and we will talk about ffmpeg & OpenSSL in another blog post.

RTMP streaming is also provided by another open-source library, libRTMP, which has been included with the ffmpeg library since version 0.6; with the latest version, 2.2, all RTMP protocol variants are supported by ffmpeg by default.

We now have sufficient technical information & theory about RTMP streaming, so we can move on to more practical usage and examine what ffplay expects when trying to play an RTMP stream.

Now, let’s look at the RTMP parameters:

./ffplay --help

Below is the part of the help output related to the RTMP protocol. As you can see, there are many settable parameters, but we don’t need to set all of them; in fact, there are many RTMP streams that don’t need any parameters at all.

The RTMP parameters and their definitions are listed below:

rtmp AVOptions:
-rtmp_app ED.... Name of application to connect to on the RTMP server
-rtmp_buffer ED.... Set buffer time in milliseconds. The default is 3000. (from 0 to INT_MAX) (default 3000)
-rtmp_conn ED.... Append arbitrary AMF data to the Connect message
-rtmp_flashver ED.... Version of the Flash plugin used to run the SWF player.
-rtmp_live .D.... Specify that the media is a live stream. (from INT_MIN to INT_MAX) (default -2)
any .D.... both
live .D.... live stream
recorded .D.... recorded stream
-rtmp_pageurl .D.... URL of the web page in which the media was embedded. By default no value will be sent.
-rtmp_playpath ED.... Stream identifier to play or to publish
-rtmp_subscribe .D.... Name of live stream to subscribe to. Defaults to rtmp_playpath.
-rtmp_swfhash .D.... SHA256 hash of the decompressed SWF file (32 bytes).
-rtmp_swfsize .D.... Size of the decompressed SWF file, required for SWFVerification. (from 0 to INT_MAX) (default 0)
-rtmp_swfurl ED.... URL of the SWF player. By default no value will be sent
-rtmp_swfverify .D.... URL to player swf file, compute hash/size automatically.
-rtmp_tcurl ED.... URL of the target stream. Defaults to proto://host[:port]/app.
-rtmp_listen .D.... Listen for incoming rtmp connections (from INT_MIN to INT_MAX) (default 0)
-listen .D.... Listen for incoming rtmp connections (from INT_MIN to INT_MAX) (default 0)
-timeout .D.... Maximum timeout (in seconds) to wait for incoming connections. -1 is infinite. Implies -rtmp_listen 1 (from INT_MIN to INT_MAX) (default -1)

OK, let’s begin with a stream that needs no parameters at all (please note that the test streams are working as this post is written but may not work in the future):

./ffplay rtmp://cdn.m.yupptv.tv:1935/liveorigin/maamusic1

The output,
ffplay version 2.1.3 Copyright (c) 2003-2013 the FFmpeg developers
built on Jan 31 2014 08:30:56 with Apple LLVM version 5.0 (clang-500.2.79) (based on LLVM 3.3svn)
configuration: --disable-yasm --disable-encoders
libavutil 52. 48.101 / 52. 48.101
libavcodec 55. 39.101 / 55. 39.101
libavformat 55. 19.104 / 55. 19.104
libavdevice 55. 5.100 / 55. 5.100
libavfilter 3. 90.100 / 3. 90.100
libswscale 2. 5.101 / 2. 5.101
libswresample 0. 17.104 / 0. 17.104
[flv @ 0x7fa546801000] Stream discovered after head already parsed
[flv @ 0x7fa546801000] negative cts, previous timestamps might be wrong
Last message repeated 2 times
[flv @ 0x7fa546801000] negative cts, previous timestamps might be wrong
Last message repeated 2 times
Input #0, flv, from 'rtmp://cdn.m.yupptv.tv:1935/liveorigin/maamusic1':
Metadata:
author :
copyright :
description :
keywords :
rating :
title :
presetname : Custom
creationdate : Sat May 31 03:42:54 2014
:
videodevice : Osprey-210 Video Device 1
avclevel : 30
avcprofile : 66
videokeyframe_frequency: 2
audiodevice : Osprey-210 Audio Device 1
audiochannels : 2
audioinputvolume: 75
encoder : Lavf55.12.100
Duration: 00:00:00.00, start: 0.000000, bitrate: N/A
Stream #0:0: Video: h264 (Constrained Baseline), yuv420p, 320x180 [SAR 1:1 DAR 16:9], 400 kb/s, 15 tbr, 1k tbn, 30 tbc
Stream #0:1: Audio: aac, 44100 Hz, stereo, fltp, 32 kb/s
Stream #0:2: Data: none
1.77 A-V: 0.009 fd= 3 aq= 8KB vq= 18KB sq= 0B f=0/1

Success, pretty easy huh :)

Now, let’s try some RTMP streams that we found on the internet.

Here, there are a lot of RTMP stream examples, but all of them are in a similar format; let’s try one of them:

stream_url: ‘rtmp://hd6.lsops.net/live/ playpath=alalam_ar_1428 swfUrl=”http://static.ls-cdn.com/player/5.10/livestation-player.swf” swfVfy=true live=true’

The parameters seem very straightforward, so we can call ffplay with the parameters below:

./ffplay rtmp://hd3.lsops.net/live -rtmp_playpath aljazeer_ar_838 -rtmp_swfurl http://static.ls-cdn.com/player/5.10/livestation-player.swf -rtmp_live 1

There is only one parameter we can’t set, which is swfVfy. Actually, ffplay has a parameter named rtmp_swfverify; however, that parameter expects a URL to the player SWF file. As the two don’t completely match each other, we can simply ignore it.

Now, let’s try and see the result,

ffplay version 2.1.3 Copyright (c) 2003-2013 the FFmpeg developers
built on Jan 31 2014 08:30:56 with Apple LLVM version 5.0 (clang-500.2.79) (based on LLVM 3.3svn)
configuration: --disable-yasm --disable-encoders
libavutil 52. 48.101 / 52. 48.101
libavcodec 55. 39.101 / 55. 39.101
libavformat 55. 19.104 / 55. 19.104
libavdevice 55. 5.100 / 55. 5.100
libavfilter 3. 90.100 / 3. 90.100
libswscale 2. 5.101 / 2. 5.101
libswresample 0. 17.104 / 0. 17.104
[rtmp @ 0x7fea19f1d360] Unknown connect error (unsupported authentication method?)
[rtmp @ 0x7fea19f1d360] Server error: Connection failed: Application folder ([install-location]/applications/) is missing.
rtmp://hd3.lsops.net/live: Unknown error occurred

What happened? Why did we get an error like this? Let’s dive into the code and try to understand what ffplay expects from us.

When I search for the rtmp_playpath keyword in the ffmpeg source, two source files come up: librtmp.c & rtmpproto.c. Luckily, there are only two files to inspect, and librtmp.c contains some connection-related methods. So, let’s look at the librtmp.c file.

Below is the definition of rtmp_open (this method should be called first to open the connection):

/**
* Open RTMP connection and verify that the stream can be played.
*
* URL syntax: rtmp://server[:port][/app][/playpath][ keyword=value]...
* where 'app' is first one or two directories in the path
* (e.g. /ondemand/, /flash/live/, etc.)
* and 'playpath' is a file name (the rest of the path,
* may be prefixed with "mp4:")
*
* Additional RTMP library options may be appended as
* space-separated key-value pairs.
*/
static int rtmp_open(URLContext *s, const char *uri, int flags)
{
    ...
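To make the URL syntax from the rtmp_open comment concrete, here is a rough sketch (in Python, purely illustrative, and assuming the simple case where 'app' is a single directory) of how such a URL splits into server, app and playpath:

```python
from urllib.parse import urlparse

def split_rtmp_url(url: str):
    """Split rtmp://server[:port]/app/playpath into its parts,
    assuming the simple case where 'app' is one directory deep
    (rtmp_open also allows two-directory apps like /flash/live/)."""
    parsed = urlparse(url)
    parts = parsed.path.strip("/").split("/")
    app = parts[0] if parts and parts[0] else None
    playpath = "/".join(parts[1:]) or None
    return parsed.netloc, app, playpath

# The full URL form used later in this post splits into
# server "hd3.lsops.net", app "live", playpath "aljazeer_ar_838".
print(split_rtmp_url("rtmp://hd3.lsops.net/live/aljazeer_ar_838"))
```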

Now that we see the URL format that ffplay expects, we need to rearrange our URL accordingly.

We set the server as “rtmp://hd3.lsops.net/live”, which is wrong according to the definition above, since “live” is actually the app name rather than part of the server address.
Let’s define our format as below and try to play it with ffplay again:

./ffplay rtmp://hd3.lsops.net -rtmp_app live -rtmp_playpath aljazeer_ar_838 -rtmp_swfurl http://static.ls-cdn.com/player/5.10/livestation-player.swf -rtmp_live 1

Bingo! It works!

We also want to try the same stream using one full, long URL path, without setting the rtmp_app and rtmp_playpath parameters, as below:

rtmp://hd3.lsops.net/live/aljazeer_ar_838

Let’s test it with ffplay:

./ffplay rtmp://hd3.lsops.net/live/aljazeer_ar_838 -rtmp_swfurl http://static.ls-cdn.com/player/5.10/livestation-player.swf -rtmp_live 1

YES! This format also works, as mentioned in the definition of the rtmp_open method.

Now that we are able to set parameters for ffplay, it’s VideoKit’s turn :)

This part is very easy, as we have already figured out how to set the parameters in the right way.
Actually, we don’t add any RTMP-specific parameters to VideoKit: there are so many of them, and they may change in the next release of ffmpeg, which would require updating VideoKit for every change in the RTMP protocol. Then how do we support all the parameters in VideoKit?

It’s simple: we just support playing in ffplay style, exactly as we did in the terminal :)
Let me explain with a sample:

Channel *c15 = [Channel channelWithName:@"Aflam Live TV" addr:@"rtmp://95.211.148.203/live/aflam4youddd?id=152675 -rtmp_swfurl http://mips.tv/content/scripts/eplayer.swf -rtmp_live live -rtmp_pageurl http://mips.tv/embedplayer/aflam4youddd/1/600/380 -rtmp_conn S:OK" description:@"Pass through params - ffplay style" localFile:NO options:[NSDictionary dictionaryWithObject:@"1" forKey:VKDECODER_OPT_KEY_PASS_THROUGH]];

(Above code is taken from ChannelsManager.m in sample project)

We added a new decoder option, VKDECODER_OPT_KEY_PASS_THROUGH, which lets all the parameters pass through to the ffmpeg connection-opening method below (format_opts is an AVDictionary):

avformat_open_input(&_avFmtCtx, input, NULL, &format_opts)

This way, there will be no need to update VideoKit for changes made in the RTMP code.
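Conceptually, pass-through mode only needs to split the ffplay-style address string into the bare stream URL and a set of key/value options before handing them to avformat_open_input. The sketch below (in Python, purely illustrative — not VideoKit’s actual implementation, and the example.com address is hypothetical) models that splitting:

```python
def split_passthrough_address(addr: str):
    """Split an ffplay-style address such as
    'rtmp://host/app/path -rtmp_live live -rtmp_pageurl http://...'
    into the bare URL and a dictionary of the '-key value' options."""
    tokens = addr.split()
    url, options = tokens[0], {}
    i = 1
    while i < len(tokens) - 1:
        if tokens[i].startswith("-"):
            # '-rtmp_live live' becomes options['rtmp_live'] = 'live'
            options[tokens[i].lstrip("-")] = tokens[i + 1]
            i += 2
        else:
            i += 1
    return url, options

# Hypothetical address in the same style as the VideoKit sample above.
url, opts = split_passthrough_address(
    "rtmp://example.com/live/stream -rtmp_live live "
    "-rtmp_pageurl http://example.com/embed")
```

Each resulting key/value pair would then be stored in the AVDictionary passed to avformat_open_input, just like ffplay does with its command-line options.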

That’s all for RTMP streaming with ffplay and iOS VideoKit. Feel free to ask your questions!

Bye for now, until the next post!
