Experiments in Streaming Content in Java ME (3)

Back to RTPSourceStream and StreamingDataSource

With the protocol handler in place, let's revisit the RTPSourceStream and StreamingDataSource classes from earlier, which so far contained only placeholder methods. The StreamingDataSource is simple to code:
 
import java.io.IOException;
import javax.microedition.media.Control;
import javax.microedition.media.protocol.DataSource;
import javax.microedition.media.protocol.SourceStream;

public class StreamingDataSource extends DataSource {

  // the full URL-like locator to the destination
  private String locator;

  // the internal streams that connect to the source;
  // in this case, there is only one
  private SourceStream[] streams;

  // is this connected to its source?
  private boolean connected = false;

  public StreamingDataSource(String locator) {
    super(locator);
    setLocator(locator);
  }

  public void setLocator(String locator) { this.locator = locator; }

  public String getLocator() { return locator; }

  public void connect() throws IOException {
    // if already connected, return
    if (connected) return;

    // if locator is null, then can't actually connect
    if (locator == null)
      throw new IOException("locator is null");

    // now populate the sourcestream array
    streams = new RTPSourceStream[1];

    // with a new RTPSourceStream
    streams[0] = new RTPSourceStream(locator);

    // set flag
    connected = true;
  }

  public void disconnect() {
    // if there are any streams
    if (streams != null) {
      // close the individual stream
      try {
        ((RTPSourceStream)streams[0]).close();
      } catch (IOException ioex) {} // silent
    }
    // and set the flag
    connected = false;
  }

  public void start() throws IOException {
    if (!connected) return;
    // start the underlying stream
    ((RTPSourceStream)streams[0]).start();
  }

  public void stop() throws IOException {
    if (!connected) return;
    // stop the underlying stream by closing it
    ((RTPSourceStream)streams[0]).close();
  }

  public String getContentType() {
    // for the purposes of this article, it is only video/mpeg
    return "video/mpeg";
  }

  public Control[] getControls() { return new Control[0]; }

  public Control getControl(String controlType) { return null; }

  public SourceStream[] getStreams() { return streams; }
}
The main work takes place in the connect() method, which creates a new RTPSourceStream with the requested address. Notice that the getContentType() method returns video/mpeg as the default content type; change this to a content type supported by your system. Of course, it should not be hard-coded at all; it should be based on the actual media types the source can supply.
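For example, here is a minimal sketch of how getContentType() could instead derive the type from the locator's file extension. The guessContentType() helper and its extension-to-MIME mapping are hypothetical illustrations, not part of the original classes:

// A minimal sketch (not part of the original code) of deriving the content
// type from the locator's file extension instead of hard-coding it.
// The extension-to-MIME mapping shown here is only an assumption.
private String guessContentType(String locator) {
    if (locator == null) return null;
    String lower = locator.toLowerCase();
    if (lower.endsWith(".mp4") || lower.endsWith(".mpg")) return "video/mpeg";
    if (lower.endsWith(".3gp")) return "video/3gpp";
    if (lower.endsWith(".amr")) return "audio/amr";
    // fall back to the article's default when the extension is unknown
    return "video/mpeg";
}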
The next listing shows the complete RTPSourceStream class, which, along with RTSPProtocolHandler, does the bulk of the work of connecting to the server and getting the RTP packets from it:
 
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import javax.microedition.io.Datagram;
import javax.microedition.io.Connector;
import javax.microedition.media.Control;
import javax.microedition.io.SocketConnection;
import javax.microedition.io.DatagramConnection;
import javax.microedition.media.protocol.SourceStream;
import javax.microedition.media.protocol.ContentDescriptor;

public class RTPSourceStream implements SourceStream {

    private RTSPProtocolHandler handler;
    private InputStream is;
    private OutputStream os;
    private DatagramConnection socket;

    public RTPSourceStream(String address) throws IOException {
        // create the protocol handler and set it up so that the
        // application is ready to read data

        // create a SocketConnection to the remote host
        // (in this case I have set it up so that it's localhost; you can
        // change it to wherever your server resides)
        SocketConnection sc =
          (SocketConnection)Connector.open("socket://localhost:554");

        // open the input and output streams
        is = sc.openInputStream();
        os = sc.openOutputStream();

        // and initialize the handler
        handler = new RTSPProtocolHandler(address, is, os);

        // send the basic signals to get it ready
        handler.doDescribe();
        handler.doSetup();
    }

    public void start() throws IOException {
        // open a local datagram socket on port 8080 to read data from
        socket = (DatagramConnection)Connector.open("datagram://:8080");

        // and send the PLAY command
        handler.doPlay();
    }

    public void close() throws IOException {
        if (handler != null) handler.doTeardown();
        is.close();
        os.close();
    }

    public int read(byte[] buffer, int offset, int length)
      throws IOException {

        // create a byte array which will be used to read the datagram
        byte[] fullPkt = new byte[length];

        // the new Datagram
        Datagram packet = socket.newDatagram(fullPkt, length);

        // receive it
        socket.receive(packet);

        // extract the actual RTP packet's media data
        RTPPacket rtpPacket = getRTPPacket(packet, packet.getData());
        byte[] data = rtpPacket.getData();

        // copy the media data into the requested buffer
        System.arraycopy(data, 0, buffer, offset, data.length);

        // debug
        System.err.println(rtpPacket + " with media length: " + data.length);

        // and return its length
        return data.length;
    }

    // extracts the RTP packet from each datagram packet received
    private RTPPacket getRTPPacket(Datagram packet, byte[] buf) {

        // SSRC
        long SSRC = 0;
        // the payload type
        byte PT = 0;
        // the time stamp
        int timeStamp = 0;
        // the sequence number of this packet
        short seqNo = 0;

        // see http://www.networksorcery.com/enp/protocol/rtp.htm
        // for a detailed description of the packet and its data
        PT =
          (byte)((buf[1] & 0xff) & 0x7f);
        seqNo =
          (short)(((buf[2] & 0xff) << 8) | (buf[3] & 0xff));
        timeStamp =
          (((buf[4] & 0xff) << 24) | ((buf[5] & 0xff) << 16) |
            ((buf[6] & 0xff) << 8) | (buf[7] & 0xff));
        SSRC =
          (((buf[8] & 0xff) << 24) | ((buf[9] & 0xff) << 16) |
            ((buf[10] & 0xff) << 8) | (buf[11] & 0xff));

        // create an RTPPacket based on these values
        RTPPacket rtpPkt = new RTPPacket();

        // the sequence number
        rtpPkt.setSequenceNumber(seqNo);

        // the timestamp
        rtpPkt.setTimeStamp(timeStamp);

        // the SSRC
        rtpPkt.setSSRC(SSRC);

        // the payload type
        rtpPkt.setPayloadType(PT);

        // the actual payload (the media data) is after the 12-byte header,
        // which is constant
        byte payload[] = new byte[packet.getLength() - 12];
        for (int i = 0; i < payload.length; i++) payload[i] = buf[i + 12];

        // set the payload on the RTP packet
        rtpPkt.setData(payload);

        // and return the packet
        return rtpPkt;
    }

    public long seek(long where) throws IOException {
        throw new IOException("cannot seek");
    }

    public long tell() { return -1; }

    public int getSeekType() { return NOT_SEEKABLE; }

    public Control[] getControls() { return null; }

    public Control getControl(String controlType) { return null; }

    public long getContentLength() { return -1; }

    public int getTransferSize() { return -1; }

    public ContentDescriptor getContentDescriptor() {
        return new ContentDescriptor("audio/rtp");
    }
}
The constructor for RTPSourceStream creates a SocketConnection to the remote server (hard-coded to the local server and default RTSP port here, but you can change this to accept any server or port). It then opens the input and output streams, which it uses to create the RTSPProtocolHandler. Finally, using this handler, it sends the DESCRIBE and SETUP commands to get the server ready to send the packets. The actual delivery doesn't start until the start() method is called by the StreamingDataSource; start() opens a local datagram port (hard-coded to 8080 in this case) for receiving the packets and sends the PLAY command to start the flow. The actual reading of the packets is done in the read() method, which receives the individual datagrams, strips them into RTPPacket instances (with the getRTPPacket() method), and copies the media data into the buffer supplied by the caller.
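To make the call sequence concrete, a consumer of these classes might drive them roughly as follows. This is a hypothetical fragment, not part of the article's code; the buffer size and the fixed packet count are assumptions, and it would run inside a method that declares IOException:

// Hypothetical fragment showing the connect/start/read sequence described above.
StreamingDataSource ds =
    new StreamingDataSource("rtsp://localhost:554/sample_100kbit.mp4");
ds.connect();                              // DESCRIBE + SETUP via RTSPProtocolHandler
ds.start();                                // PLAY; the datagram socket is now open
SourceStream stream = ds.getStreams()[0];
byte[] buffer = new byte[2048];            // assumed to be large enough for one RTP packet
for (int i = 0; i < 100; i++) {            // read a fixed number of packets for illustration
    int mediaLength = stream.read(buffer, 0, buffer.length);
    // the first mediaLength bytes of buffer now hold one packet's media payload
}
ds.disconnect();                           // TEARDOWN and close the connection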
A MIDlet to see if it works

With all the classes in place, let's write a simple MIDlet to test them. It first creates a Player instance that uses the StreamingDataSource to connect to the server and then gets media packets from it. The Player interface is defined by MMAPI and allows you to control the playback (or recording) of media. Instances of this interface are created using the Manager class from the MMAPI javax.microedition.media package (see the MMAPI tutorial). The following shows this rudimentary MIDlet:
 
import javax.microedition.media.Player;
import javax.microedition.midlet.MIDlet;
import javax.microedition.media.Manager;

public class StreamingMIDlet extends MIDlet {

  public void startApp() {
    try {
      // create Player instance, realize it and then try to start it
      Player player =
        Manager.createPlayer(
          new StreamingDataSource(
            "rtsp://localhost:554/sample_100kbit.mp4"));
      player.realize();
      player.start();
    } catch (Exception e) {
      e.printStackTrace();
    }
  }

  public void pauseApp() {}

  public void destroyApp(boolean unconditional) {}
}
So what should happen when you run this MIDlet in the Wireless Toolkit? I have purposely left out any code to display the resulting video on screen. When I run it in the toolkit, I know that I am receiving the packets because I see the debug statements shown in Figure 2.
 

 
Figure 2. Running StreamingMIDlet output
The RTP packets sent by the server are being received: the StreamingDataSource, along with the RTSPProtocolHandler and RTPSourceStream, is doing its job of making the streaming server send these packets. This is confirmed by the streaming server's admin console, shown in Figure 3.
 

 
Figure 3. Darwin's admin console shows that the file is being streamed.
Unfortunately, the player constructed by the Wireless Toolkit tries to read the entire content in one go. Even if I were to make a StreamingVideoControl, it would not display the video until it had read the whole file, thereby defeating the purpose of the streaming aspect of this whole experiment. So what needs to be done to achieve the full streaming experience?
Ideally, MMAPI should provide a means for developers to register their choice of Player for the playback of certain media. This could easily be achieved by adding a new method to the Manager class for registering (or overriding) MIME types or protocols with developer-made Player instances. For example, say I create a Player implementation that reads streaming data, called StreamingMPEGPlayer. With the Manager class, I should be able to say Manager.registerPlayer("video/mpeg", StreamingMPEGPlayer.class) or Manager.registerPlayer("rtsp", StreamingMPEGPlayer.class). MMAPI should then simply load this developer-made Player instance and use it to read data from the developer-made DataSource.
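To make the idea concrete, here is a minimal sketch of the kind of registry such a Manager.registerPlayer() method could maintain internally. PlayerRegistry is purely hypothetical and not part of MMAPI:

import java.util.Hashtable;

// Hypothetical sketch only: MMAPI's Manager does not expose anything like this.
// It illustrates the registry a registerPlayer() facility could keep, mapping a
// MIME type (e.g. "video/mpeg") or protocol (e.g. "rtsp") to a Player class.
public class PlayerRegistry {

    private static final Hashtable PLAYERS = new Hashtable();

    // register (or override) the Player implementation for a type or protocol
    public static void registerPlayer(String typeOrProtocol, Class playerClass) {
        PLAYERS.put(typeOrProtocol, playerClass);
    }

    // look up the registered Player class, or null if none is registered
    public static Class lookup(String typeOrProtocol) {
        return (Class) PLAYERS.get(typeOrProtocol);
    }
}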
In a nutshell, you need to be able to create an independent media player and register it as the instance of choice for playing the desired content. Unfortunately, this is not possible with the current MMAPI implementation, and this is the data-consumption conundrum I talked about earlier.
Of course, if you can test this code in a toolkit that does not need to read the complete data before displaying it (or for audio files, playing them), then you have achieved the aim of streaming data using the existing MMAPI implementation.
This experiment should prove that you can stream data with the current MMAPI implementation, but you may not be able to manipulate it in a useful manner until you have better control over the Manager and Player instances. I look forward to your comments and experiments using this code.
 
 