Video Streaming for ARM

UV4L was originally conceived as a modular collection of Video4Linux2-compliant, cross-platform, user space drivers for real or virtual video input and output devices (with absolutely no external difference from kernel drivers), and other pluggable back-ends or front-ends.

It has evolved over the years and now includes a full-featured Streaming Server component with a customisable web UI, providing a set of standard, modern and unique solutions for encrypted, live, bidirectional data, audio and video streaming, mirroring or conferencing over the web and for the IoT. Recent releases also provide a RESTful API for developers who want to implement their own custom applications.

HDMI Capturing

The Video4Linux2 uv4l-raspicam driver for the Raspberry Pi has been extended to support the TC358743 HDMI-to-MIPI converter chip. This chipset is often found on the B101 capture boards made by Auvidea.

Four HD Cameras for the ISS

With the SpaceX Dragon 3 capsule that was recently berthed to the ISS, NASA deployed the HDEV (High Definition Earth Viewing) experiment to space. It consists of four off-the-shelf HD video cameras in a common housing, together with a video encoder and router.

Today the ISS’s robotic arm extracted the box from the Dragon’s unpressurized cargo hold and mounted it outside the Columbus module.

Live stream showing video of all four cameras in a predefined sequence.

The four cameras are mounted so that one camera points forward into the station’s velocity vector, two point to the back and one down towards the Earth.
The housing insulates the cameras from the extreme temperatures and vacuum of space but provides no significant shielding against radiation. That is deliberate: the main purpose of the experiment is to find out how non-radiation-hardened cameras, especially their sensors, fare in this environment.

The video signal is encoded into an H.264 stream for the downlink and broadcast live on a Ustream channel. The stream shows all four installed cameras in a preprogrammed sequence.

This is an interesting contribution to the race towards live satellite maps that several companies have now joined, as the ability to use off-the-shelf cameras would significantly reduce the costs of such ventures.

How to get the RTSP streaming URL of a YouTube Video


You can get low-quality streaming URLs of a YouTube video by parsing:

e.g.: (XML)

One or more tags named “<media:content” with the attribute “type=’video/3gpp'” will contain a URL for an RTSP stream of the video.

For the above video that would be rtsp://r2—
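As a sketch, that extraction can be scripted with grep. The XML line below is a fabricated stand-in for the relevant part of the feed, not a real YouTube response:

```shell
# Extract the rtsp:// URL from a <media:content type="video/3gpp"> tag.
# $feed is a made-up sample; in practice you would first fetch the video's
# XML feed (e.g. with curl) and pipe it through the same grep.
feed='<media:content url="rtsp://v3.cache.example.com/video.3gp" type="video/3gpp" medium="video"/>'
echo "$feed" | grep -o 'rtsp://[^"]*'
```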

Building The live555 Streaming Media Framework on Windows with Visual Studio 2012/13

The live555 Streaming Media framework allows you to stream content over RTP and comes with an RTSP server. Below are some hints on how to build it on 64-bit Windows 7 with Visual Studio Express 2012; these should also work with VS 2013.

On the command line

More recent versions of the Visual Studio IDE do not support building makefile-based projects, so you need to build from the command line. Instructions on generating makefiles for the Windows command line can be found in the official documentation.

You can still build the project in the IDE. Check the second part of the post for instructions on how to modify the project for that.

  • From the Windows SDK’s include directory copy “NtWin32.Mak” and “Win32.Mak” to the VS include directory, e.g. “C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC\include”.
  • Edit win32config and make “TOOLS32” point to the location of your build tools; for me that was “C:\Program Files (x86)\Microsoft Visual Studio 11.0\VC”. Change “msvcirt.lib” to “msvcrt.lib” in the “LINK_OPTS_0” param, and set the “LINKS” param to ‘ “$(TOOLS32)\bin\$(link)” -out:’ (without the ticks).
  • Get the sources from here: and unpack them somewhere. You will also need the Windows Platform SDK.
  • From the Start menu launch the “Developer Command Prompt for VS2012”; this will give you a console that is configured for building with VS. Go to the directory where you unpacked the sources and execute “genWindowsMakefiles”.
  • Now run “nmake -f file.mak” in all subdirs, e.g. “nmake -f groupsock.mak” in the groupsock dir. You should proceed in the following order:
  • liveMedia
  • groupsock
  • UsageEnvironment
  • BasicUsageEnvironment
  • testProgs
  • mediaServer
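The per-directory nmake calls can be scripted. As a sketch, the loop below merely prints the required commands in dependency order; redirect its output to a .bat file and run that from the Developer Command Prompt (the loop itself is POSIX shell, the generated lines are for cmd.exe):

```shell
# Print the nmake invocations for each live555 subdirectory in build order;
# redirect to a .bat file and run it from the VS2012 Developer Command Prompt.
for dir in liveMedia groupsock UsageEnvironment BasicUsageEnvironment testProgs mediaServer; do
  printf 'cd %s && nmake -f %s.mak && cd ..\n' "$dir" "$dir"
done
```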

In the IDE

To get the project to build in the VS IDE you need to patch two files:
In “DelayQueue.cpp” after "const DelayInterval ETERNITY(INT_MAX, MILLION-1);" add

const DelayInterval DELAY_MINUTE = 60*DELAY_SECOND;
const DelayInterval DELAY_HOUR = 60*DELAY_MINUTE;
const DelayInterval DELAY_DAY = 24*DELAY_HOUR;

In “DelayQueue.hh” replace
extern DelayInterval const DELAY_ZERO;
extern DelayInterval const DELAY_SECOND;
DelayInterval const DELAY_MINUTE = 60*DELAY_SECOND;
DelayInterval const DELAY_HOUR = 60*DELAY_MINUTE;
DelayInterval const DELAY_DAY = 24*DELAY_HOUR;

with

extern DelayInterval const DELAY_ZERO;
extern DelayInterval const DELAY_SECOND;
extern DelayInterval const DELAY_MINUTE;
extern DelayInterval const DELAY_HOUR;
extern DelayInterval const DELAY_DAY;

Don’t forget to add the live555 libs as dependencies to the project. You will also need “advapi32.lib”.

Raspberry Pi – Webcam Streaming

Streaming an external cam

Several options exist to stream the picture of a webcam or the Raspberry Pi cam from the Pi. The first is using an MJPEG stream. This is the most compatible option, as many applications and even browsers can display such a stream.

The second one is H.264. Although H.264 can be encoded on the Pi’s GPU, it has a very high latency, at least five seconds in my experience.

And last but not least, you can simply pipe the video stream over netcat to transmit it to another client.
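The netcat variant is a plain pipe of the raw H.264 elementary stream; the IP address, port and player flags below are assumptions for illustration (mplayer is just one player that understands a raw H.264 stream), so adjust them to your setup:

```shell
# On the Pi: capture raw H.264 and push it to the client over TCP.
# 192.168.1.20:5001 is a placeholder for your client machine.
raspivid -t 0 -w 1280 -h 720 -o - | nc 192.168.1.20 5001

# On the client: listen on the port and feed the raw stream to a player.
nc -l -p 5001 | mplayer -fps 25 -cache 1024 -demuxer h264es -
```

Note that BSD-flavoured netcat omits the -p, i.e. “nc -l 5001”. This cannot be verified without the camera hardware attached.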

1. Motion

“Motion” can serve up an MJPEG stream. Apart from that it has several other features, listed below, e.g. simple motion detection. It may run in the background as a Linux daemon. Here’s a guide on how to get Motion going with a PS3 Eye Cam.

  • Taking snapshots of movement
  • Watch multiple video devices at the same time
  • Watch multiple inputs on one capture card at the same time
  • Live streaming webcam (using multipart/x-mixed-replace)
  • Real time creation of mpeg movies using libraries from ffmpeg
  • Take automated snapshots on regular intervals
  • Take automated snapshots at irregular intervals using cron
  • Execute external commands when detecting movement (and e.g. send SMS or email)
  • Motion tracking (camera follow motion – special hardware required)
  • Feed events to a MySQL or PostgreSQL database.
  • Feed video back to a video4linux loopback for real time viewing
  • Lots of user contributed related projects with web interfaces etc.
  • User configurable and user defined on screen display.
  • Control via browser (older versions used xml-rpc)
  • Automatic noise and threshold control
  • Motion is a daemon with low CPU consumption and small memory footprint.
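As a minimal sketch, a Motion configuration for plain MJPEG streaming could look like the following. The option names are those of the classic Motion 3.x config files (newer versions rename the stream options from webcam_* to stream_*), so check the documentation of your installed version:

```
# /etc/motion/motion.conf – minimal sketch (Motion 3.x option names assumed)
daemon on                # run in the background as a Linux daemon
videodevice /dev/video0  # V4L2 device of the webcam
width 640
height 480
framerate 15
webcam_port 8081         # port of the MJPEG live stream
webcam_localhost off     # allow clients other than localhost to connect
output_normal off        # don't save an image for every detected motion
```

The stream would then be reachable at http://[Raspberry IP]:8081/.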

2. With MJPEG streamer

A nice guide on how to build and run the open source MJPEG streamer on the Pi by Miguel Grinberg.

3. As an H.264 Stream with VLC via RTSP

Install VLC Player on the Pi. VLC will act as the streaming server.

sudo apt-get install vlc

Run raspivid and pipe the video stream into VLC for streaming.

raspivid -o - -t 0 -n -vf -hf -w 1280 -h 720 -fps 25 -g 100 | cvlc -vvv stream:///dev/stdin --sout '#rtp{sdp=rtsp://:8554/,mux=ts}' :demux=h264

You can now display the stream in VLC using rtsp://[Raspberry IP]:8554/ as your address.

iOS Screen Mirrored On A Mac

Reflection for Mac allows you to mirror an iPad 2 or iPhone 4S screen on a Mac for presentations etc. By default the image is scaled to 1280×720 px, but it’s also possible to use the native device resolution. A detailed review of the application can be found here (in German).

MPEG continues with royalty-free MPEG video codec

“From the press release: ‘In recognition of the growing importance that the Internet plays in the generation and consumption of video content, MPEG intends to develop a new video compression standard in line with the expected usage models of the Internet. The new standard is intended to achieve substantially better compression performance than that offered by MPEG-2 and possibly comparable to that offered by the AVC Baseline Profile. MPEG will issue a call for proposals on video compression technology at the end of its upcoming meeting in March 2011 that is expected to lead to a standard falling under ISO/IEC “Type-1 licensing”, i.e. intended to be “royalty free.”‘”