2004-11-01 14:11

by Vasil Kolev

And now about the live and dead video…

First, equipment:
A camera with firewire (IEEE1394) output (I can’t remember the exact model);
Laptop with a firewire port, Centrino/1.4GHz, and an ethernet port;
A server with good connectivity (marla);
A pipe from the laptop to the server with enough bandwidth (the sum of the streams + ~25%), so it can keep up with the transmission, the errors, and the bursts that follow them.
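To put a rough number on that (the bitrates here are invented for the example, not the ones we actually used): two video streams at 512 and 128 kbps plus one shared 64 kbps audio stream add up to 704 kbps, and with ~25% headroom on top you need around 880 kbps of reliable uplink.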

Why firewire? Because the quality is better, it doesn’t eat much CPU (which can’t be said about v4l), and it’s easier to use.

Software:

dvgrab from Debian, with one patch by me so that it doesn’t write to disk;
ffmpeg and ffserver, version 0.4.8 (the newer ones suck), for the encoding and broadcasting, with a small patch that solves a problem when sending over the network (described here);
wget for archiving;

Principle of work – ffmpeg sends a few (in our case three) different streams (in our case two video and one audio) to the server, which assembles them and sends them to the different clients on request.
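Schematically, the whole chain looked roughly like this:

camera -(firewire/DV)-> dvgrab -> FIFO -> ffmpeg -(HTTP feed)-> ffserver -> clients (and wget for the archive)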

The situation:

The laptop sat next to the camera, received the raw video, encoded it into the streams, and sent them to the server, where people watched them. From the same server, the current stream was downloaded with wget for archival (that was done on zadnik.org, to lower the load on marla’s drive a bit; thanks, Velin :) ).
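The archiving itself is nothing more than pointing wget at one of the streams, roughly like this (the stream name here is made up – the real ones are in the ffserver config):

wget -O archive.mpg http://marla.ludost.net:8090/stream-high.mpg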

Configuration, commands, etc.:

Because dvgrab has its own opinion on a lot of questions, we created a named pipe (FIFO) with the command

mknod av.dv p

(mkfifo av.dv does the same thing), and ran dvgrab in the following way:

dvgrab --format raw --frames 0 --buffers 1500 --size 0 > av.dv

The options in short: raw format (i.e. no re-encoding by dvgrab), frames is after how many frames to start writing a new file (i.e. 0 means write only to stdout), size is the same thing but in megabytes, and buffers is how many frames to buffer – 1500 is about 60 seconds at DV’s 25 fps (PAL), which was enough in most situations.
From dvgrab the raw stream went to ffmpeg through the following command:

ffmpeg -i av.dv http://marla.ludost.net:8090/feed1.ffm

Nothing complex – ffmpeg takes the parameters of the streams from the ffserver configuration.
(I’m not really sure what the exact problem with dvgrab … | ffmpeg … was, but there was one, so we used a named pipe.)
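Putting it all together, everything on the laptop boiled down to something like this (dvgrab goes in the background, so that both ends of the FIFO end up open):

mknod av.dv p
dvgrab --format raw --frames 0 --buffers 1500 --size 0 > av.dv &
ffmpeg -i av.dv http://marla.ludost.net:8090/feed1.ffm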

On the server there was an ffserver running, whose configuration can be seen at https://vasil.ludost.net/ffserver.conf. ffmpeg used the options inside to know what streams to send. There isn’t anything special in it, except that you should write it in a way that minimizes the number of streams (for example, all three of our streams use the same audio stream, mp3/mono/64kbps). I recommend running it with a higher priority if the machine is loaded. Also, this configuration doesn’t work if you want to view the video from Windows with WMP – for that you’ll need an ASF stream.
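For orientation, a minimal sketch of what such a configuration looks like (the names, sizes and bitrates below are made up; the real one is at the URL above):

Port 8090
BindAddress 0.0.0.0
MaxClients 100
MaxBandwidth 10000

<Feed feed1.ffm>
File /tmp/feed1.ffm
FileMaxSize 20M
</Feed>

<Stream high.mpg>
Feed feed1.ffm
Format mpeg
VideoBitRate 512
VideoSize 352x288
AudioBitRate 64
AudioChannels 1
</Stream>

<Stream low.mpg>
Feed feed1.ffm
Format mpeg
VideoBitRate 128
VideoSize 176x144
AudioBitRate 64
AudioChannels 1
</Stream>

The two streams share identical audio settings, which is the “same audio stream” trick mentioned above.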

Specific notes:

Test everything a day or two before the real live broadcast (unlike with other services, the problems here get discovered by the users very easily).
Make sure that the pipe used for sending the master stream is good enough (and keep in touch with the people responsible for it).
Make sure that the processor of the machine that does the encoding can withstand the load, and turn off all of its power-saving features.
Reconfiguring the streaming server in the middle of a live feed means a short interruption of the broadcast (the server doesn’t have a reload, only a restart).
It’s possible to use ffmpeg to read directly from firewire, if the camera is supported by the kernel (which wasn’t our case, so we used dvgrab; I don’t think many cameras are supported anyway). A problem then would be the lack of buffering in ffmpeg – errors on the link would hit the stream directly.
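For reference, the direct variant with a supported camera and the dv1394 kernel driver would look something like this (untested by me, and the device path varies between setups):

ffmpeg -f dv1394 -i /dev/dv1394 http://marla.ludost.net:8090/feed1.ffm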

Things for the future:

A local copy of the raw material (we weren’t able to make one this time because of the lack of disk space).
A nicely made filler to be shown when there isn’t a live feed.
An instantly accessible archive.
A way to transmit/broadcast over RTSP (maybe through Darwin Streaming Server?).
A better encoder, maybe based on MPlayer.
An ASF stream, so it’s viewable from Windows (WMP) machines.
Tools for testing whether the link can sustain a given bandwidth for a long period (see the note after this list).
Replication of the streaming server, in case we generate more than 70 Mbps of traffic.
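On the link-testing point, something off-the-shelf like iperf would probably do (just a thought – we didn’t try it for this): run iperf -s on the server, then from the laptop something like

iperf -c marla.ludost.net -u -b 1M -t 3600

to see whether 1 Mbps of UDP survives for an hour without losses.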
