Broadcasting Live Video
One of the Flash Media Server’s more interesting capabilities is its ability to broadcast live video streams.
The process of broadcasting live video is much the same as the process of streaming on-demand (prerecorded) Flash video: 1) you provide a video stream to Flash Media Server or a Flash Video Streaming Service (FVSS); 2) the media is streamed to people who visit your site; and 3) those visitors view the stream using Macromedia Flash Player.
However, the way you provide the video stream for live video is very different: you need a source of live video, and that video feed must be encoded in real time as it is captured. Unlike the procedure for on-demand video, with live video the capturing, encoding, and publishing steps occur simultaneously.
Checklist for Capturing and Publishing Live Video
To capture, encode, and publish live video, follow the six steps outlined below:
- Using the Flash authoring tool, create a SWF file that captures and transmits live video, also known as a broadcast application. See Encoding Live Video at the Adobe Flash Developer Center website.
- Create a live streaming application on your Flash Media Server or sign up for a live video streaming account with an FVSS.
- Create another SWF file for playing your streaming video, and publish this SWF file to your website. See the article "Authoring Flash Video with Dreamweaver."
- Connect a digital video camera or web camera to your computer, and then open the broadcast application that you created.
- Publish the live stream from your camera to your Flash Media Server or FVSS. For more information, see About Streaming Video.
- When visitors to your site browse to the SWF file on your website, Flash Player displays the live video stream for them.
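The broadcast application in the steps above can be sketched in ActionScript 2. This is a minimal illustration, not a complete application: the server URI (rtmp://yourserver/live), the stream name ("livestream"), and the Video instance name (videoPreview) are placeholders you would replace with your own values.

```actionscript
// Minimal broadcast application sketch (ActionScript 2).
// Assumes a Video object named videoPreview is on the stage.
var cam:Camera = Camera.get();          // default capture device
var mic:Microphone = Microphone.get();  // default microphone

var nc:NetConnection = new NetConnection();
nc.onStatus = function(info:Object):Void {
    if (info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        ns.attachVideo(cam);   // route camera frames into the stream
        ns.attachAudio(mic);   // route microphone audio into the stream
        ns.publish("livestream", "live"); // encode and publish in real time
    }
};
nc.connect("rtmp://yourserver/live");

videoPreview.attachVideo(cam); // show a local preview of the camera
```

Passing "record" or "append" instead of "live" as the second argument to publish() tells the server to also save the stream to an FLV file (new or appended, respectively) while it broadcasts.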
Encoding Live Video
The source for a live Flash video stream can be any camera connected to a computer, from a webcam plugged into your laptop's USB port to a digital video camera attached to a high-end video capture card. Note, however, that only streaming delivery can carry live video; progressive download cannot.
Flash Player 6 and later versions include an audio and video encoder that lets you capture audio and video directly from any camera or microphone connected to your computer. You run a particular kind of SWF file, the broadcast application, on the computer that the camera is connected to. The broadcast application contains all the settings that are required to encode the video in real time. Many of these settings are conceptually the same as those I've discussed previously, such as bit rate and keyframes.
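In a broadcast application, those encoding settings are applied to the Camera object before publishing. The values below are illustrative only, not recommendations; the keyframe-interval call is available only in later Flash Player versions.

```actionscript
// Illustrative capture/encoding settings for a broadcast SWF.
var cam:Camera = Camera.get();
cam.setMode(320, 240, 15);   // capture size and requested frame rate
cam.setQuality(16384, 0);    // cap bandwidth at ~16 KB/s (~128 kbps);
                             // quality 0 = vary picture quality as needed
                             // to stay within that bandwidth
cam.setKeyFrameInterval(15); // force a keyframe every 15 frames
                             // (supported in later players only)
```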
Capture cards and other live video encoding hardware improve the quality of live video by sending a better video signal to the broadcast application and moving much of the encoding processing from Flash Player to the hardware. However, you will not get this quality improvement with the basic live video streaming I describe here.
The broadcast application sends the live video to your Flash Media Server or to an FVSS, which republishes the live video to be viewed by anyone who connects to the stream.
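On the viewing side, connecting to that republished stream is a mirror image of publishing it. Again a hedged sketch: the server URI and stream name are placeholders and must match what the broadcaster publishes, and liveVideo stands for a Video object placed on the stage.

```actionscript
// Viewer-side sketch: subscribe to the live stream (ActionScript 2).
var nc:NetConnection = new NetConnection();
nc.onStatus = function(info:Object):Void {
    if (info.code == "NetConnection.Connect.Success") {
        var ns:NetStream = new NetStream(nc);
        liveVideo.attachVideo(ns); // display incoming frames on the stage
        ns.play("livestream");     // a live stream name, not an FLV file
    }
};
nc.connect("rtmp://yourserver/live");
```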
The live publishing modes in Flash Media Server are live, record, and append. In addition to live video streaming, the Flash Media Server also supports other advanced features—including video chat, video messaging, webcasting, video conferencing, stop-motion capture, and more. For information on using live video with these advanced streaming features, visit the Flash Communication Server homepage on macromedia.com.
The details of creating a broadcast application are outside the scope of this overview. However, Flash Media Server includes several sample applications and components for capturing and broadcasting live video. For information on specific settings to use, see the excellent (if a bit outdated) article Encoding Best Practices for Live Video at the Adobe Developer Center.
Finally, in addition to the broadcast application, you must create a web page that enables visitors to view the broadcast.