
FlightGear screen streamer

Note: This was done in 2007 and is probably outdated now.

What it is

The FlightGear screen streamer is something I did to get FG to output its screen over the network, which can be used for making/capturing video or simply streaming it on the fly. It is a patch to the FG source code.


Many people make videos of FG, and there are many ways to do it, such as filming the display with a video camera, or outputting the computer screen via a video output.

Another common way is to use a screen capturing program, such as xvidcap or recordmydesktop (on Linux). However, this can be slow, depending on your display card, X, X drivers, etc.

Some people have been using FG's jpg-httpd to repeatedly capture the FG screen and encode the captures into a video. This works, but IMHO it introduces too much overhead. You can think of the screen streamer as a continuous version of jpg-httpd.

The idea of the screen streamer is that FG outputs the screen data (more) directly, which is *in theory* more efficient than capturing it at the display (X) level.

As a side note, there are other GL-oriented video capturing tools (for Linux, but they might also work on other platforms), such as yukon and captury (its website seems down, but it's in Debian testing and up). These work by overriding (kinda like LD_PRELOAD) the X and GL libraries to capture and save the data, which is probably the fastest capturing you can get. Though I haven't tried them myself yet.

How it is done

At the moment, the screen streamer acts as an FG IO plugin. When it is enabled, it starts listening as a very basic HTTP server. Once a client has connected with the appropriate parameters, it starts capturing and responds with the screen data. Currently the data can be either raw RGB or multipart jpeg (useful for live streaming in a web browser).

There are two possible capture methods. For both PLIB and OSG, it can capture the screen data using glReadPixels. This can be really fast on some cards/drivers, but unfortunately slow on others.

For OSG, the default capture method is to use (OSG's) render-to-texture, which should be much more efficient than glReadPixels.

In theory render-to-texture could also be implemented for FG PLIB; I just haven't looked into it :)

Once the data is being streamed over the network, the receiving client can do whatever it wants with it, like encoding the data into a video, or transcoding it into another format and broadcasting it over the network.
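For the multipart jpeg format, a client needs to split the stream back into individual JPEG images. The exact MIME boundary string the streamer uses is implementation-specific, so rather than guess at it, this rough sketch just scans for the JPEG start-of-image (FF D8) and end-of-image (FF D9) markers, which works regardless of the part headers:

```python
SOI = b"\xff\xd8"  # JPEG start-of-image marker
EOI = b"\xff\xd9"  # JPEG end-of-image marker

def split_jpegs(data):
    """Return a list of complete JPEG byte strings found in `data`,
    ignoring any multipart boundary/header bytes in between."""
    images = []
    pos = 0
    while True:
        start = data.find(SOI, pos)
        if start == -1:
            break
        end = data.find(EOI, start + 2)
        if end == -1:
            break  # incomplete trailing image; wait for more data
        images.append(data[start:end + 2])
        pos = end + 2
    return images
```

Each returned byte string can then be written to a .jpg file or fed to a decoder; a real client would keep a rolling buffer and carry the incomplete tail over to the next network read.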

See it in action

I have set up an MPcam (multiplayer camera) on FGMap, so not only can you find out where our FlightGear pilots are flying on the map, you can also watch them live.

I have an instance of FG running on my server, logged onto the FG MP server. Then, using ffmpeg and ffserver, I have it grab the screen via the screen streamer and transcode it on the fly into Flash video, which can be viewed in the Flash player on the FGMap page. The FGMap page also lets you control the camera, like zooming in and out, and selecting which target (pilot) to follow. This is done via a CGI that updates properties of the running FG using the --httpd interface.

The MPcam FG uses the aircraft mibs.

Watch MPcam flash (You might need a reasonably new version of flash)

Watch MPcam via multipart-jpeg (For those who use Firefox and do not want to use flash)

(If you don't see "obscam" online on the map, then my MPcam is not running)

After the page is loaded, click on the video icon near the top right corner. You should then see the live camera. You can also do basic control using the buttons below the camera. Hover over those buttons with your mouse to see the tooltip.

Please also bear in mind that there is only ONE instance of FG running, and so multiple people are watching and controlling the same single MPcam. So try not to fight too much :)

For those who are interested, here is my ffserver config file. My ffmpeg command line is:
ffmpeg -f rawvideo -s 240x192 -pix_fmt rgb24 -r 7 -i "http://localhost:20000/?fps=7"


Last updated: Saturday, 08-Oct-2016 20:24:13 AEDT

(Doh, the current patch broke with latest OSG. Stay tuned while I fix it soon.)

Both FG PLIB and OSG work.

However, the OSG screenstreamer only captures what's in the 3D scene graph; it does not capture any UIs such as menus and dialogs. This can actually be useful, though.

I've tested the patch on Linux and Windows.

Video encoding and streaming tested with ffmpeg/ffserver, mplayer/mencoder, and VLC. See "How to use it" for more details.

The patch

Patch for CVS PLIB FlightGear source

Patch for CVS OSG FlightGear source

By git

You can also get my screen streamer via my own branch in the FG git repository. The branch for the FG OSG screenstreamer is osg.pigeon.ss, and for FG PLIB it is plib.pigeon.ss.

How to use it

Pasted directly from the header comments in FG OSG's screenstreamer.cxx:
 * FG Screen Streamer
 * Pigeon <pigeon@pigeond.net>
 * Streaming FG screen directly over (a very basic) HTTP.
 * Command line arguments:
 * $ fgfs --screenstreamer=<bind address>,<port>
 * To use the default, which will listen on localhost:20000
 * $ fgfs --screenstreamer=, 
 * To listen on all interfaces on port 20000:
 * $ fgfs --screenstreamer=*,
 * URL query string parameters:
 * fps=<frame rate per second>
 * format=rgb|mpjpeg
 * capturer=gl|osg
 * w=<width>
 * h=<height>
 * scale=1|2|4
 * quality=<jpeg quality> (format mpjpeg only)
 * noflip
 * fps: frame rate per second, how many frames it should stream per second,
 *      default is 10.
 * format: format of the output stream, which can be:
 * rgb    - the default if no format given. Raw rgb data.
 * mpjpeg - multipart jpeg, useful for playback directly on a webpage, also can
 *          be used directly in some video players. Only available if you
 *          compile with SimGear jpeg factory support (which guarantees
 *          libjpeg).
 * capturer: gl or osg, osg default if available, otherwise gl.
 *           gl capturer does not allow window resizing during a capture.
 *           Window resizing will terminate any existing streaming.
 *           gl capturer captures at FG window's size only.
 *           osg capturer supports arbitrary capture/stream width and height,
 *           and allows window resizing during streaming.
 * w, h: arbitrary width and height, only works with the OSG capturer at the
 *       moment. Ignored with the GL capturer.
 * scale: the down scale factor, 1, 2 or 4, default is 1 (not scaled down).
 *        e.g. if FG window size is 800x600, scale=2 gives 400x300 output
 *        frames, and scale=4 gives 200x150 output frames. This is ignored with
 *        the OSG capturer if w and h are specified.
 * quality: the jpeg compress quality, for format mpjpeg, default is 75.
 * noflip: do not call the flip routine, output image will be inverted, but in
 *         theory faster for FG.
 * URL examples:
 * http://localhost:20000/?fps=20&format=mpjpeg&scale=2
 * http://localhost:20000/?capturer=osg&w=320&h=240
 * http://localhost:20000/?capturer=gl&scale=2&noflip
 * Usage examples, assuming FG window size is 800x600.
 * Direct playback examples:
 * $ ffplay -f rawvideo -pix_fmt rgb24 -s 800x600 "http://localhost:20000/"
 * $ ffplay -f rawvideo -pix_fmt rgb24 -s 400x300
 *     "http://localhost:20000/?fps=20&scale=2"
 * $ ffplay -f mjpeg "http://localhost:20000/?scale=2&format=mpjpeg"
 * $ mplayer -nocache -demuxer rawvideo
 *     -rawvideo w=800:h=600:format=rgb24:fps=20 http://localhost:20000/?fps=20
 * $ vlc --mjpeg-fps 10 --no-drop-late-frames
 *     "http://localhost:20000/?format=mpjpeg&fps=10"
 * (vlc svn 0.9.0 only)
 * $ vlc --demux rawvid --rawvid-chroma RV24
 *     --rawvid-width 800 --rawvid-height 600
 *     --no-drop-late-frames "http://localhost:20000/"
 * Save and play back later (file extension is important for ffplay and mplayer)
 * $ wget -O fg.mjpg "http://localhost:20000/?format=mpjpeg&fps=20"
 * $ ffplay fg.mjpg
 * $ mplayer -fps 20 fg.mjpg
 * $ vlc --mjpeg-fps=20 fg.mjpg
 * Save and play at the same time:
 * $ wget -O - "http://localhost:20000/?fps=20&scale=2&format=mpjpeg" |
 *     tee fg.mjpg | ffplay -f mjpeg -
 * Direct video encoding:
 * $ ffmpeg -r 20 -f rawvideo -pix_fmt rgb24 -s 400x300 -i
 *     "http://localhost:20000/?fps=20&scale=2" -vcodec mpeg4
 *     -b 256 -an -vtag DIVX output.avi
 * $ mencoder -nocache -demuxer rawvideo
 *     -rawvideo w=640:h=480:fps=25:format=rgb24
 *     -ovc lavc -lavcopts vcodec=mpeg4:vbitrate=480
 *     -o output.avi -ffourcc DIVX
 *     "http://localhost:20000/?fps=25&capturer=osg&w=640&h=480"
 * Playing in a webpage directly (only works in Mozilla/Firefox/etc):
 * Put the following tag in your HTML page
 * <img src="http://localhost:20000/?format=mpjpeg">
 * And you may want to change localhost to something else.
 * Watching, while encoding and streaming over HTTP, with VLC:
 * vlc "http://localhost:20000/?fps=10&scale=2"
 *   --no-sout-transcode-hurry-up --no-sout-ffmpeg-hurry-up
 *   --sout-transcode-fps 10 --no-drop-late-frames
 *   --demux rawvid --rawvid-width 400 --rawvid-height 300 --rawvid-fps 10
 *   --rawvid-chroma RV24 --sout
 *   "#duplicate{dst=display,dst='transcode{vcodec=mp4v,vb=256}:
 *   standard{access=http,mux=asf,dst=}'}"
 * Notes:
 * - Under Linux at least, ffplay/ffmpeg and mplayer/mencoder (-nocache is
 *   important!) seem to work pretty well. vlc also works, but seems to
 *   complain more about late frames every now and then.
 * - An output width and height that are multiples of 2 are recommended. Hence
 *   if you're using scale=2, you should make sure FG's window width and height
 *   are multiples of 4. And for scale=4, multiples of 8.
 * - If you seriously want to capture audio as well, you'll have to record it
 *   from the audio device. Under Linux you can do so with ffmpeg by doing:
 *   $ ffmpeg -r 20 -f rawvideo -pix_fmt rgb24 -s 400x300 -i
 *       "http://localhost:20000/?fps=20&scale=2" -vcodec mpeg4
 *       -f audio_device -i /dev/dsp -ab 128k -acodec mp3 output.avi
 *   Note that with this method the video and audio are almost certain to be
 *   a bit out of sync.
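When picking fps and scale values, it helps to estimate the raw rgb24 bandwidth up front, since the frame sizes in the examples above (800x600, 400x300, etc.) add up quickly. Each rgb24 frame is width x height x 3 bytes, so a small helper like this (illustrative numbers, not measurements from the streamer) gives the expected load:

```python
def stream_stats(width, height, scale=1, fps=10):
    """Estimate output dimensions and bandwidth for the raw rgb24 format.

    Returns (out_width, out_height, bytes_per_frame, bytes_per_second),
    given the FG window size, a scale factor (1, 2 or 4), and the frame rate.
    """
    out_w, out_h = width // scale, height // scale
    bytes_per_frame = out_w * out_h * 3  # rgb24: 3 bytes per pixel
    return out_w, out_h, bytes_per_frame, bytes_per_frame * fps
```

For example, an 800x600 window with scale=2 and fps=10 gives 400x300 frames of 360000 bytes each, i.e. about 3.6 MB/s on the wire, which is why the mpjpeg format or a lower scale is preferable for anything beyond localhost.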
