Developing Android Applications with Adobe AIR [72]
You need to resolve the path to where the video is located before playing it. In this example, there is a directory called myVideos on the SD card and a video called myVideo.mp4 inside it:
var videosPath:File = File.documentsDirectory.resolvePath("myVideos");
var videoName:String = "myVideo.mp4";
stream.play(videosPath.resolvePath(videoName).url);
For more information on accessing the filesystem, refer to Chapter 6.
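The snippet above assumes a NetStream named stream has already been set up. As a rough sketch (the directory and file names are the placeholders from the example above), the full setup for playing a local file might look like this:

```actionscript
import flash.filesystem.File;
import flash.media.Video;
import flash.net.NetConnection;
import flash.net.NetStream;

// connect with null for local or progressive playback
var connection:NetConnection = new NetConnection();
connection.connect(null);

var stream:NetStream = new NetStream(connection);
// a client object is required or onMetaData callbacks throw errors
stream.client = {onMetaData: function(info:Object):void {}};

var video:Video = new Video();
video.attachNetStream(stream);
addChild(video);

// resolve the file on the SD card and play it via its url property
var videosPath:File = File.documentsDirectory.resolvePath("myVideos");
stream.play(videosPath.resolvePath("myVideo.mp4").url);
```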
Browsing for video
You cannot use CameraRoll to browse for videos, but you can use the filesystem.
You could create a custom video player for the user to play videos installed on the device or on the SD card. The browseForOpen method opens a dialog for the user to browse the filesystem for videos:
import flash.events.Event;
import flash.filesystem.File;
import flash.net.FileFilter;
import flash.media.Video;
var video:Video;
var filter:FileFilter = new FileFilter("video", "*.mp4;*.flv;*.mov;*.f4v");
var file:File = new File();
file.addEventListener(Event.SELECT, fileSelected);
file.browseForOpen("open", [filter]);
WARNING
At the time of this writing, it seems that only the FLV format is recognized when browsing the filesystem using AIR.
A list of the video files found appears. The following code is executed when the user selects one of the files. The selected file is available as event.target in the Event.SELECT handler and is played using its url property. Note how the video is sized and displayed in the onMetaData function. We will cover this technique next:
import flash.net.NetConnection;
import flash.net.NetStream;
function fileSelected(event:Event):void {
    video = new Video();
    var connection:NetConnection = new NetConnection();
    connection.connect(null);
    var stream:NetStream = new NetStream(connection);
    var client:Object = new Object();
    client.onMetaData = onMetaData;
    stream.client = client;
    video.attachNetStream(stream);
    stream.play(event.target.url);
}
function onMetaData(info:Object):void {
    video.width = info.width;
    video.height = info.height;
    addChild(video);
}
Metadata
The client property of NetStream is used to listen to onMetaData. In this example, we use the video stream width and height, received in the metadata, to scale the Video object. Other useful information is the duration, the frame rate, and the codec:
// define the Stream client to receive callbacks
var client:Object = new Object();
client.onMetaData = onMetaData;
stream.client = client;
// attach the stream to the video
video.attachNetStream(stream);
stream.play("someVideo.flv");
// size the video object based on the metadata information
function onMetaData(info:Object):void {
    video.width = info.width;
    video.height = info.height;
    addChild(video);
    trace(info.duration);
    trace(info.framerate);
    trace(info.codec);
    for (var prop:String in info) {
        trace(prop, info[prop]);
    }
}
Cue points
The FLVPlayback component gives us the ability to add cue points to a video. The component watches the current time code and compares it against a dictionary of cue points. When it finds a match, it dispatches an event with the cue point information.
The cue points come in two forms. Navigation cue points are used as markers for chapters or time-specific commentary. Event cue points are used to trigger events such as calling an ActionScript function. The cue point object looks like this:
var cuePoint:Object = {time:5, name:"cue1", type:"actionscript",
parameters:{prop:value}};
This component is not available in AIR for Android. If you want something similar, you need to write the functionality yourself. Cue points can be a nice way to bridge your video to your AIR content if you keep them to a minimum. Use them sparingly, as they have an impact on performance.
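A minimal do-it-yourself version can compare the stream time against a sorted list of cue points on a timer. This is only a sketch, assuming a NetStream named stream as in the earlier examples; the cue point names and the 500 ms polling interval are arbitrary choices:

```actionscript
import flash.events.TimerEvent;
import flash.utils.Timer;

// cue points must be sorted by time for the simple scan below
var cuePoints:Array = [
    {time:5,  name:"cue1", type:"actionscript", parameters:{prop:"value"}},
    {time:10, name:"cue2", type:"navigation"}
];
var nextCue:int = 0;

// poll the stream time; a coarse interval keeps the performance cost low
var timer:Timer = new Timer(500);
timer.addEventListener(TimerEvent.TIMER, checkCuePoints);
timer.start();

function checkCuePoints(event:TimerEvent):void {
    while (nextCue < cuePoints.length
           && stream.time >= cuePoints[nextCue].time) {
        onCuePoint(cuePoints[nextCue]);
        nextCue++;
    }
}

function onCuePoint(cue:Object):void {
    trace("cue point reached:", cue.name);
}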
Cue points can be embedded dynamically server-side if you are recording the file on Flash Media Server.
Buffering
The moov atom, the video metadata that holds index information, needs to be placed at the beginning of the file for progressive playback. Otherwise, the whole file needs to be completely loaded into memory before playback can begin
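While a progressive file loads, you can listen for NetStatusEvent to know when the buffer fills or empties, and tune bufferTime accordingly. A minimal sketch, again assuming a NetStream named stream as in the earlier examples:

```actionscript
import flash.events.NetStatusEvent;

// seconds of video to buffer before playback starts or resumes
stream.bufferTime = 5;

stream.addEventListener(NetStatusEvent.NET_STATUS, onNetStatus);

function onNetStatus(event:NetStatusEvent):void {
    switch (event.info.code) {
        case "NetStream.Buffer.Full":
            trace("buffer full, playing");
            break;
        case "NetStream.Buffer.Empty":
            trace("buffer empty, waiting");
            break;
    }
}
```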