Upload live Android webcam video to an RTP/RTSP server

Date: 2013-11-29 09:00:45

Tags: android rtsp rtsp-client

I have done some research already, but I am still missing information about what I would like to achieve.

So I would like to write an app where the user can record a video and upload it immediately (live) to an RTP/RTSP server. The server side will not be a problem. What I am unclear about is how to achieve this on the phone side.

My research so far suggests that I have to write the recorded video to a local socket instead of a file, because a 3gp file written to disk cannot be accessed until it is finalized (when recording stops and the header information about video length and other details has been written).

As the socket receives the continuous data, I will need to wrap it into RTP packets and send it to the remote server. I may also have to do basic encoding first (which is less important for now).

Does anyone know whether this theory is correct so far? I would also appreciate pointers to code snippets taking a similar approach, especially for sending the video to the server on the fly. I am not yet sure how to do that.

Many thanks and best regards.

7 Answers:

Answer 0 (score: 10)

Your overall approach sounds right, but there are a couple of things you will need to consider.

"So I would like to write an app where the user can record a video and upload it immediately (live) to an RTP/RTSP server."

  • I assume you want to upload to an RTSP server so that it can redistribute the content to multiple clients?
  • How will you handle the signaling/setup of the RTP session with the RTSP server? You need to notify the RTSP server somehow that a user is about to upload live media so that it can open the appropriate RTP/RTCP sockets, etc. (see the sample exchange after this list).
  • How will you handle authentication? Multiple client devices?
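
To make the signaling question concrete, a record-mode RTSP exchange (essentially the handshake the RtspSocket code in answer 1 below performs) looks roughly like this; the URL, session id and content length are made up:

ANNOUNCE rtsp://server.example.com/live/test RTSP/1.0
Cseq: 1
Content-Type: application/sdp
Content-Length: 289

(SDP body describing the audio/video tracks the phone will push)

SETUP rtsp://server.example.com/live/test/trackid=1 RTSP/1.0
Cseq: 2
Transport: RTP/AVP/TCP;unicast;mode=record;interleaved=0-1

RECORD rtsp://server.example.com/live/test RTSP/1.0
Cseq: 3
Session: 12345678
Range: npt=0.000-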
"My research so far suggests that I have to write the recorded video to a local socket instead of a file, because a 3gp file written to disk cannot be accessed until it is finalized (when recording stops and the header information about video length and other details has been written)."

Sending the frames live via RTP/RTCP is the correct approach. As the capture device captures each frame, you need to encode/compress it and send it over the socket. 3gp, like mp4, is a container format used for file storage. For live capture there is no need to write to a file. The only place that makes sense is e.g. in HTTP Live Streaming or DASH approaches, where the media is written to transport stream segments or mp4 files before being served over HTTP.

"As the socket receives the continuous data, I will need to wrap it into RTP packets and send it to the remote server. I may also have to do basic encoding first (which is less important for now)."

I disagree: encoding is very important. Without it you may never manage to send the video at all, and you will have to deal with issues such as the cost (over mobile networks) of sending large amounts of media, depending on resolution and frame rate.
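
To put rough numbers on it (my own back-of-the-envelope estimate, assuming 640×480 YUV420 at 30 fps): one raw frame is 640 × 480 × 1.5 bytes, roughly 460 KB, so the uncompressed stream runs at about 13.8 MB/s (over 110 Mbit/s), far beyond any mobile uplink, whereas the same stream encoded as H.264 typically fits in 0.5-2 Mbit/s.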

"Does anyone know whether this theory is correct so far? I would also appreciate pointers to code snippets taking a similar approach, especially for sending the video to the server on the fly. I am not yet sure how to do that."

Take a look at the spydroid open source project as a starting point. It contains many of the necessary steps, including how to configure the encoder, packetize into RTP, send RTCP, as well as some RTSP server functionality. Spydroid sets up an RTSP server, so media is encoded and sent once an RTSP client such as VLC is used to set up an RTSP session. Since your application is driven by the phone user wanting to send media to a server, you may need to consider another approach to kick off the sending, e.g. having the phone send some kind of message to the server to set up the RTSP session as in spydroid.

Answer 1 (score: 2)

A year ago I created an Android application that can stream its camera/microphone to a Wowza media server using rtsp over tcp.

The general approach is to create a unix socket, get its file descriptor and hand it to the Android media recorder component. Then you instruct the media recorder to record the camera video in mp4/h264 format to that file descriptor. Your application then reads the client end of the socket, parses the mp4 to strip the header, pulls the frames out of it, and wraps them into the RTSP stream (a sketch of this socket trick follows below).
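
A minimal sketch of that local-socket trick, assuming the standard android.net.LocalServerSocket and android.media.MediaRecorder APIs (the socket name and encoder settings here are arbitrary):

LocalServerSocket server = new LocalServerSocket("camera-stream");
LocalSocket receiver = new LocalSocket();
receiver.connect(new LocalSocketAddress("camera-stream"));
LocalSocket sender = server.accept();

MediaRecorder recorder = new MediaRecorder();
recorder.setCamera(camera); // an opened, unlocked android.hardware.Camera
recorder.setVideoSource(MediaRecorder.VideoSource.CAMERA);
recorder.setOutputFormat(MediaRecorder.OutputFormat.MPEG_4);
recorder.setVideoEncoder(MediaRecorder.VideoEncoder.H264);
recorder.setOutputFile(sender.getFileDescriptor()); // write to the socket, not a file
recorder.prepare();
recorder.start();

InputStream in = receiver.getInputStream(); // parse mp4 boxes / H.264 NAL units from here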

Something similar can be done for sound (AAC, usually). Of course, you have to handle the timestamps yourself, and the trickiest part of the whole approach is video/audio synchronization.
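
As a reminder of how the timestamps work (these are general RTP rules, not code from my app): RTP timestamps advance in units of the payload's clock rate rather than wall-clock milliseconds, so the two streams tick at different rates and must be mapped onto a common timeline:

// H.264 uses a 90 kHz RTP clock; AAC uses its sampling rate as the clock.
long videoRtpTimestamp = 0, audioRtpTimestamp = 0;
final long videoTicksPerFrame = 90000 / 30; // 3000 ticks per frame at 30 fps
final long audioTicksPerFrame = 1024;       // one AAC frame is 1024 samples
videoRtpTimestamp += videoTicksPerFrame;    // advance once per video frame sent
audioRtpTimestamp += audioTicksPerFrame;    // advance once per AAC frame sent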

So here is the first part, something that could be called an rtsp socket. It negotiates with the media server in its connect method, and after that you can write the stream itself into it. More on that later.

package com.example.android.streaming.streaming.rtsp;

import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.io.UnsupportedEncodingException;
import java.math.BigInteger;
import java.net.InetSocketAddress;
import java.net.Socket;
import java.net.SocketException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Locale;
import java.util.concurrent.ConcurrentHashMap;

import android.util.Base64;
import android.util.Log;

import com.example.android.streaming.StreamingApp;
import com.example.android.streaming.streaming.Session;
import com.example.android.streaming.BuildConfig;

public class RtspSocket extends Socket {
    public static final int RTSP_HEADER_LENGTH = 4;
    public static final int RTP_HEADER_LENGTH = 12;
    public static final int MTU = 1400;

    public static final int PAYLOAD_OFFSET = RTSP_HEADER_LENGTH + RTP_HEADER_LENGTH;
    public static final int RTP_OFFSET = RTSP_HEADER_LENGTH;

    private ConcurrentHashMap<String, String> headerMap = new ConcurrentHashMap<String, String>();

    static private final String kCRLF = "\r\n";

    // RTSP request format strings
    static private final String kOptions = "OPTIONS %s RTSP/1.0\r\n";
    static private final String kDescribe = "DESCRIBE %s RTSP/1.0\r\n";
    static private final String kAnnounce = "ANNOUNCE %s RTSP/1.0\r\n";
    static private final String kSetupPublish = "SETUP %s/trackid=%d RTSP/1.0\r\n";
    @SuppressWarnings("unused")
    static private final String kSetupPlay = "SETUP %s/trackid=%d RTSP/1.0\r\n";
    static private final String kRecord = "RECORD %s RTSP/1.0\r\n";
    static private final String kPlay = "PLAY %s RTSP/1.0\r\n";
    static private final String kTeardown = "TEARDOWN %s RTSP/1.0\r\n";

    // RTSP header format strings
    static private final String kCseq = "Cseq: %d\r\n";
    static private final String kContentLength = "Content-Length: %d\r\n";
    static private final String kContentType = "Content-Type: %s\r\n";
    static private final String kTransport = "Transport: RTP/AVP/%s;unicast;mode=%s;%s\r\n";
    static private final String kSession = "Session: %s\r\n";
    static private final String kRange = "range: %s\r\n";
    static private final String kAccept = "Accept: %s\r\n";
    static private final String kAuthBasic = "Authorization: Basic %s\r\n";
    static private final String kAuthDigest = "Authorization: Digest username=\"%s\",realm=\"%s\",nonce=\"%s\",uri=\"%s\",response=\"%s\"\r\n";

    // RTSP header keys
    static private final String kSessionKey = "Session";
    static private final String kWWWAuthKey = "WWW-Authenticate";

    static private final int RTSP_MAX_HEADER = 4095;
    static private final int RTSP_MAX_BODY = 4095;

    byte header[] = new byte[RTSP_MAX_HEADER + 1];

    static private final int RTSP_RESP_ERR = -6;
    // static private final int RTSP_RESP_ERR_SESSION = -7;
    static public final int RTSP_OK = 200;
    static private final int RTSP_BAD_USER_PASS = 401;

    static private final int SOCK_ERR_READ = -5;

    /* Number of channels including control ones. */
    private int channelCount = 0;

    /* RTSP negotiation cmd seq counter */
    private int seq = 0;

    private String authentication = null;
    private String session = null;

    private String path = null;
    private String url = null;
    private String user = null;
    private String pass = null;
    private String sdp = null;

    private byte[] buffer = new byte[MTU];

    public RtspSocket() {
        super();
        try {
            setTcpNoDelay(true);
            setSoTimeout(60000);
        } catch (SocketException e) {
            Log.e(StreamingApp.TAG, "Failed to set socket params.");
        }
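        /* Pre-set the RTP version bits (binary 10 = version 2) in the packet header template. */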
        buffer[RTSP_HEADER_LENGTH] = (byte) Integer.parseInt("10000000", 2);
    }

    public byte[] getBuffer() {
        return buffer;
    }

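    /* Write n into buffer[begin..end) in network (big-endian) byte order. */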
    public static final void setLong(byte[] buffer, long n, int begin, int end) {
        for (end--; end >= begin; end--) {
            buffer[end] = (byte) (n % 256);
            n >>= 8;
        }
    }

    public void setSequence(int seq) {
        setLong(buffer, seq, RTP_OFFSET + 2, RTP_OFFSET + 4);
    }

    public void setSSRC(int ssrc) {
        setLong(buffer, ssrc, RTP_OFFSET + 8, RTP_OFFSET + 12);
    }

    public void setPayload(int payload) {
        buffer[RTP_OFFSET + 1] = (byte) (payload & 0x7f);
    }

    public void setRtpTimestamp(long timestamp) {
        setLong(buffer, timestamp, RTP_OFFSET + 4, RTP_OFFSET + 8);
    }

    /** Sends the RTP packet over the network */
    private void send(int length, int stream) throws IOException {
        buffer[0] = '$';
        buffer[1] = (byte) stream;
        setLong(buffer, length, 2, 4);
        OutputStream s = getOutputStream();
        s.write(buffer, 0, length + RTSP_HEADER_LENGTH);
        s.flush();
    }

    public void sendReport(int length, int ssrc, int stream) throws IOException {
        /* RTCP SR packet type is 200 (0xC8); write the full byte, since setPayload() would mask it to 7 bits. */
        buffer[RTP_OFFSET + 1] = (byte) 200;
        setLong(buffer, ssrc, RTP_OFFSET + 4, RTP_OFFSET + 8);
        send(length + RTP_HEADER_LENGTH, stream);
    }

    public void sendData(int length, int ssrc, int seq, int payload, int stream, boolean last) throws IOException {
        setSSRC(ssrc);
        setSequence(seq);
        setPayload(payload);
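        /* Set the RTP marker bit on the last packet of a frame. */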
        buffer[RTP_OFFSET + 1] |= (((last ? 1 : 0) & 0x01) << 7);
        send(length + RTP_HEADER_LENGTH, stream);
    }

    public int getChannelCount() {
        return channelCount;
    }

    private void write(String request) throws IOException {
        try {
            OutputStream out = getOutputStream();
            out.write(request.getBytes("US-ASCII"));
        } catch (IOException e) {
            throw new IOException("Error writing to socket.");
        }
    }

    private String read() throws IOException {
        String response = null;
        try {
            InputStream in = getInputStream();
            int i = 0, len = 0, crlf_count = 0;
            boolean parsedHeader = false;

            for (; i < RTSP_MAX_BODY && !parsedHeader && len > -1; i++) {
                len = in.read(header, i, 1);
                if (header[i] == '\r' || header[i] == '\n') {
                    crlf_count++;
                    if (crlf_count == 4)
                        parsedHeader = true;
                } else {
                    crlf_count = 0;
                }
            }
            if (len != -1) {
                len = i;
                header[len] = '\0';
                response = new String(header, 0, len, "US-ASCII");
            }
        } catch (IOException e) {
            throw new IOException("Connection timed out. Check your network settings.");
        }
        return response;
    }

    private int parseResponse(String response) {
        String[] lines = response.split(kCRLF);
        String[] items = response.split(" ");
        String tempString, key, value;

        headerMap.clear();
        if (items.length < 2)
            return RTSP_RESP_ERR;
        int responseCode = RTSP_RESP_ERR;
        try {
            responseCode = Integer.parseInt(items[1]);
        } catch (Exception e) {
            Log.w(StreamingApp.TAG, e.getMessage());
            Log.w(StreamingApp.TAG, response);
        }
        if (responseCode == RTSP_RESP_ERR)
            return responseCode;

        // Parse response header into key value pairs.
        for (int i = 1; i < lines.length; i++) {
            tempString = lines[i];

            if (tempString.length() == 0)
                break;

            int idx = tempString.indexOf(":");

            if (idx == -1)
                continue;

            key = tempString.substring(0, idx);
            value = tempString.substring(idx + 1);
            headerMap.put(key, value);
        }

        tempString = headerMap.get(kSessionKey);
        if (tempString != null) {
            // Parse session
            items = tempString.split(";");
            tempString = items[0];
            session = tempString.trim();
        }

        return responseCode;
    }

    private void generateBasicAuth() throws UnsupportedEncodingException {
        String userpass = String.format("%s:%s", user, pass);
        authentication = String.format(kAuthBasic, Base64.encodeToString(userpass.getBytes("US-ASCII"), Base64.DEFAULT));
    }

    /* MD5 digest as a 32-character lower-case hex string (leading zeros preserved). */
    public static String md5(String s) {
        try {
            MessageDigest digest = MessageDigest.getInstance("MD5");
            byte[] hash = digest.digest(s.getBytes("US-ASCII"));
            StringBuilder hex = new StringBuilder(hash.length * 2);
            for (byte b : hash)
                hex.append(String.format("%02x", b));
            return hex.toString();
        } catch (NoSuchAlgorithmException e) {
            e.printStackTrace();
        } catch (UnsupportedEncodingException e) {
            e.printStackTrace();
        }
        return "";
    }

    private String md5HexDigest(String input) {
        return md5(input);
    }

    private void generateDigestAuth(String method) {
        String nonce, realm;
        String ha1, ha2, response;

        // WWW-Authenticate: Digest realm="Streaming Server",
        // nonce="206351b944cb28fe37a0794848c2e36f"
        String wwwauth = headerMap.get(kWWWAuthKey);
        int idx = wwwauth.indexOf("Digest");
        String authReq = wwwauth.substring(idx + "Digest".length() + 1);

        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, String.format("Auth Req: %s", authReq));

        String[] split = authReq.split(",");
        realm = split[0];
        nonce = split[1];

        split = realm.split("=");
        realm = split[1];
        realm = realm.substring(1, 1 + realm.length() - 2);

        split = nonce.split("=");
        nonce = split[1];
        nonce = nonce.substring(1, 1 + nonce.length() - 2);

        if (BuildConfig.DEBUG) {
            Log.d(StreamingApp.TAG, String.format("realm=%s", realm));
            Log.d(StreamingApp.TAG, String.format("nonce=%s", nonce));
        }

        ha1 = md5HexDigest(String.format("%s:%s:%s", user, realm, pass));
        ha2 = md5HexDigest(String.format("%s:%s", method, url));
        response = md5HexDigest(String.format("%s:%s:%s", ha1, nonce, ha2));
        authentication = String.format(kAuthDigest, user, realm, nonce, url, response);
    }

    private int options() throws IOException {
        seq++;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kOptions, url));
        request.append(String.format(kCseq, seq));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- OPTIONS Request ---\n\n" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- OPTIONS Response ---\n\n" + response);
        return parseResponse(response);
    }

    @SuppressWarnings("unused")
    private int describe() throws IOException {
        seq++;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kDescribe, url));
        request.append(String.format(kAccept, "application/sdp"));
        request.append(String.format(kCseq, seq));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- DESCRIBE Request ---\n\n" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- DESCRIBE Response ---\n\n" + response);
        return parseResponse(response);
    }

    private int recurseDepth = 0;

    private int announce() throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kAnnounce, url));
        request.append(String.format(kCseq, seq));
        if (authentication != null)
            request.append(authentication);
        request.append(String.format(kContentLength, sdp.length()));
        request.append(String.format(kContentType, "application/sdp"));
        request.append(kCRLF);
        if (sdp.length() > 0)
            request.append(sdp);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- ANNOUNCE Request ---\n\n" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- ANNOUNCE Response ---\n\n" + response);

        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;

                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("ANNOUNCE");
                }

                ret = announce();
                recurseDepth--;
            }
        }
        return ret;
    }

    private int setup(int trackId) throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kSetupPublish, url, trackId));
        request.append(String.format(kCseq, seq));
        if (authentication != null)
            request.append(authentication);

        /* One channel for rtp (data) and one for rtcp (control) */
        String tempString = String.format(Locale.getDefault(), "interleaved=%d-%d", channelCount++, channelCount++);

        request.append(String.format(kTransport, "TCP", "record", tempString));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- SETUP Request ---\n\n" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- SETUP Response ---\n\n" + response);

        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;

                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("SETUP");
                }

                ret = setup(trackId);
                authentication = null;
                recurseDepth--;
            }
        }
        return ret;
    }

    private int record() throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kRecord, url));
        request.append(String.format(kCseq, seq));
        request.append(String.format(kRange, "npt=0.000-"));
        if (authentication != null)
            request.append(authentication);
        if (session != null)
            request.append(String.format(kSession, session));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- RECORD Request ---\n\n" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- RECORD Response ---\n\n" + response);
        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;

                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("RECORD");
                }

                ret = record();
                authentication = null;
                recurseDepth--;
            }
        }
        return ret;
    }

    @SuppressWarnings("unused")
    private int play() throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kPlay, url));
        request.append(String.format(kCseq, seq));
        request.append(String.format(kRange, "npt=0.000-"));
        if (authentication != null)
            request.append(authentication);
        if (session != null)
            request.append(String.format(kSession, session));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- PLAY Request ---\n\n" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- PLAY Response ---\n\n" + response);
        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;

                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("PLAY");
                }

                ret = play();
                authentication = null;
                recurseDepth--;
            }
        }
        return ret;
    }

    private int teardown() throws IOException {
        seq++;
        recurseDepth = 0;
        StringBuilder request = new StringBuilder();
        request.append(String.format(kTeardown, url));
        request.append(String.format(kCseq, seq));
        if (authentication != null)
            request.append(authentication);
        if (session != null)
            request.append(String.format(kSession, session));
        request.append(kCRLF);
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- TEARDOWN Request ---\n\n" + request);
        write(request.toString());
        String response = read();
        if (response == null)
            return SOCK_ERR_READ;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "--- TEARDOWN Response ---\n\n" + response);
        int ret = parseResponse(response);
        if (ret == RTSP_BAD_USER_PASS && recurseDepth == 0) {
            String wwwauth = headerMap.get(kWWWAuthKey);
            if (wwwauth != null) {
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, String.format("WWW Auth Value: %s", wwwauth));
                int idx = wwwauth.indexOf("Basic");
                recurseDepth++;

                if (idx != -1) {
                    generateBasicAuth();
                } else {
                    // We are assuming Digest here.
                    generateDigestAuth("TEARDOWN");
                }

                ret = teardown();
                authentication = null;
                recurseDepth--;
            }
        }
        return ret;
    }

    public void connect(String dest, int port, Session session) throws IOException {
        int trackId = 1;
        int responseCode;

        if (isConnected())
            return;

        if (!session.hasAudioTrack() && !session.hasVideoTrack())
            throw new IOException("No tracks found in session.");

        InetSocketAddress addr = null;
        try {
            addr = new InetSocketAddress(dest, port);
        } catch (Exception e) {
            throw new IOException("Failed to resolve rtsp server address.");
        }

        this.sdp = session.getSDP();
        this.user = session.getUser();
        this.pass = session.getPass();
        this.path = session.getPath();
        this.url = String.format("rtsp://%s:%d%s", dest, addr.getPort(), this.path);

        try {
            super.connect(addr);
        } catch (IOException e) {
            throw new IOException("Failed to connect rtsp server.");
        }

        responseCode = announce();
        if (responseCode != RTSP_OK) {
            close();
            throw new IOException("RTSP announce failed: " + responseCode);
        }

        responseCode = options();
        if (responseCode != RTSP_OK) {
            close();
            throw new IOException("RTSP options failed: " + responseCode);
        }

        /* Setup audio */
        if (session.hasAudioTrack()) {
            session.getAudioTrack().setStreamId(channelCount);
            responseCode = setup(trackId++);
            if (responseCode != RTSP_OK) {
                close();
                throw new IOException("RTSP video failed: " + responseCode);
            }
        }

        /* Setup video */
        if (session.hasVideoTrack()) {
            session.getVideoTrack().setStreamId(channelCount);
            responseCode = setup(trackId++);
            if (responseCode != RTSP_OK) {
                close();
                throw new IOException("RTSP audio setup failed: " + responseCode);
            }
        }

        responseCode = record();
        if (responseCode != RTSP_OK) {
            close();
            throw new IOException("RTSP record failed: " + responseCode);
        }
    }

    public void close() throws IOException {
        if (!isConnected())
            return;
        teardown();
        super.close();
    }
}

Answer 2 (score: 1)

I tried to achieve the same result (but gave up due to lack of experience). My approach was to use ffmpeg and/or avlib, because it already has a working rtmp stack. So in theory you only need to route the video stream to an ffmpeg process, which will stream it to the server.
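
For example (a hypothetical invocation, not from my attempt; it assumes an ffmpeg build with libx264 and the RTSP output muxer, and the URL is made up), a captured file or pipe could be pushed with:

ffmpeg -re -i input.mp4 -c:v libx264 -preset ultrafast -tune zerolatency -f rtsp rtsp://server.example.com:8554/live/test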

Answer 3 (score: 1)

Is there a reason to use 3gp on the client side? With mp4 (with the MOOV atom set in the header) you can read the temporary file in chunks and send them to the server, possibly with a slight time lag; it all depends on your connection speed. Your rtsp server should be able to re-encode the mp4 to 3gp for low-bandwidth viewing. A sketch of the chunked read follows below.
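
A rough sketch of that chunked read (my own illustration: the path, chunk size and stream names are arbitrary, and it only works if the muxer writes the moov atom at the front of the file):

private void streamWhileRecording(OutputStream upload, AtomicBoolean recording)
        throws IOException, InterruptedException {
    byte[] chunk = new byte[8192];
    RandomAccessFile file = new RandomAccessFile("/sdcard/capture.mp4", "r");
    long offset = 0;
    while (recording.get() || offset < file.length()) {
        if (offset < file.length()) {
            file.seek(offset);
            int n = file.read(chunk);
            if (n > 0) {
                upload.write(chunk, 0, n); // relay the newly appended bytes
                offset += n;
            }
        } else {
            Thread.sleep(50); // wait for the recorder to append more data
        }
    }
    file.close();
}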

Answer 4 (score: 1)

At this point, if I had to take the camera (raw stream) and make it immediately available to a set of clients, I would go the Google Hangouts route and use WebRTC. See Ondello's 'platform section' for toolsets/SDKs. During your evaluation you should weigh the relative merits of WebRTC vs RTSP.

IMO, being stateful, RTSP is going to be painful behind firewalls and NAT. AFAIK, using RTP over 3G/4G in a third-party app is a bit risky.

That said, I put an old android/rtp/rtsp/sdp project on git that uses libs from netty and 'efflux'. I believe that project tried to retrieve the audio track from inside the container of YouTube videos and play it (the video track was ignored and never pulled over the network), all of which were encoded for RTSP. I think there were some packet and frame header issues; I got tired of RTSP and abandoned it.

If you have to pursue RTP/RTSP, some of the packet- and frame-level details the other posters mentioned are right there in the android classes and the test cases that ship with efflux.

Answer 5 (score: 1)

Here is the rtsp session class. It uses the rtsp socket to talk to the media server. Its purpose is also to hold the session parameters, such as which streams it can send (video and/or audio), the queues, and some audio/video synchronization code.

The listener interface used:

package com.example.android.streaming.streaming.rtsp;

public interface PacketListener {
    public void onPacketReceived(Packet p);
}

The session itself:

package com.example.android.streaming.streaming;

import static java.util.EnumSet.of;

import java.io.IOException;
import java.util.EnumSet;
import java.util.concurrent.BlockingDeque;
import java.util.concurrent.LinkedBlockingDeque;
import java.util.concurrent.atomic.AtomicBoolean;
import java.util.concurrent.locks.Condition;
import java.util.concurrent.locks.ReentrantLock;

import android.app.Activity;
import android.content.SharedPreferences;
import android.hardware.Camera;
import android.hardware.Camera.CameraInfo;
import android.os.SystemClock;
import android.preference.PreferenceManager;
import android.util.Log;
import android.view.SurfaceHolder;

import com.example.android.streaming.BuildConfig;
import com.example.android.streaming.StreamingApp;
import com.example.android.streaming.streaming.audio.AACStream;
import com.example.android.streaming.streaming.rtsp.Packet;
import com.example.android.streaming.streaming.rtsp.Packet.PacketType;
import com.example.android.streaming.streaming.rtsp.PacketListener;
import com.example.android.streaming.streaming.rtsp.RtspSocket;
import com.example.android.streaming.streaming.video.H264Stream;
import com.example.android.streaming.streaming.video.VideoConfig;
import com.example.android.streaming.streaming.video.VideoStream;

public class Session implements PacketListener, Runnable {
    public final static int MESSAGE_START = 0x03;
    public final static int MESSAGE_STOP = 0x04;
    public final static int VIDEO_H264 = 0x01;
    public final static int AUDIO_AAC = 0x05;

    public final static int VIDEO_TRACK = 1;
    public final static int AUDIO_TRACK = 0;

    private static VideoConfig defaultVideoQuality = VideoConfig.defaultVideoQualiy.clone();
    private static int defaultVideoEncoder = VIDEO_H264, defaultAudioEncoder = AUDIO_AAC;

    private static Session sessionUsingTheCamera = null;
    private static Session sessionUsingTheCamcorder = null;

    private static int startedStreamCount = 0;

    private int sessionTrackCount = 0;

    private static SurfaceHolder surfaceHolder;
    private Stream[] streamList = new Stream[2];
    protected RtspSocket socket = null;
    private Activity context = null;

    private String host = null;
    private String path = null;
    private String user = null;
    private String pass = null;
    private int port;

    public interface SessionListener {
        public void startSession(Session session);

        public void stopSession(Session session);
    };

    public Session(Activity context, String host, int port, String path, String user, String pass) {
        this.context = context;
        this.host = host;
        this.port = port;
        this.path = path;
        this.user = user;
        this.pass = pass;
    }

    public boolean isConnected() {
        return socket != null && socket.isConnected();
    }

    /**
     * Connect to rtsp server and start new session. This should be called when
     * all the streams are added so that proper sdp can be generated.
     */
    public void connect() throws IOException {
        try {
            socket = new RtspSocket();
            socket.connect(host, port, this);
        } catch (IOException e) {
            socket = null;
            throw e;
        }
    }

    public void close() throws IOException {
        if (socket != null) {
            socket.close();
            socket = null;
        }
    }

    public static void setDefaultVideoQuality(VideoConfig quality) {
        defaultVideoQuality = quality;
    }

    public static void setDefaultAudioEncoder(int encoder) {
        defaultAudioEncoder = encoder;
    }

    public static void setDefaultVideoEncoder(int encoder) {
        defaultVideoEncoder = encoder;
    }

    public static void setSurfaceHolder(SurfaceHolder sh) {
        surfaceHolder = sh;
    }

    public boolean hasVideoTrack() {
        return getVideoTrack() != null;
    }

    public MediaStream getVideoTrack() {
        return (MediaStream) streamList[VIDEO_TRACK];
    }

    public void addVideoTrack(Camera camera, CameraInfo info) throws IllegalStateException, IOException {
        addVideoTrack(camera, info, defaultVideoEncoder, defaultVideoQuality, false);
    }

    public synchronized void addVideoTrack(Camera camera, CameraInfo info, int encoder, VideoConfig quality,
            boolean flash) throws IllegalStateException, IOException {
        if (isCameraInUse())
            throw new IllegalStateException("Camera already in use by another client.");
        Stream stream = null;
        VideoConfig.merge(quality, defaultVideoQuality);

        switch (encoder) {
        case VIDEO_H264:
            if (BuildConfig.DEBUG)
                Log.d(StreamingApp.TAG, "Video streaming: H.264");
            SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context.getApplicationContext());
            stream = new H264Stream(camera, info, this, prefs);
            break;
        }

        if (stream != null) {
            if (BuildConfig.DEBUG)
                Log.d(StreamingApp.TAG, "Quality is: " + quality.resX + "x" + quality.resY + "px " + quality.framerate
                        + "fps, " + quality.bitrate + "bps");
            ((VideoStream) stream).setVideoQuality(quality);
            ((VideoStream) stream).setPreviewDisplay(surfaceHolder.getSurface());
            streamList[VIDEO_TRACK] = stream;
            sessionUsingTheCamera = this;
            sessionTrackCount++;
        }
    }

    public boolean hasAudioTrack() {
        return getAudioTrack() != null;
    }

    public MediaStream getAudioTrack() {
        return (MediaStream) streamList[AUDIO_TRACK];
    }

    public void addAudioTrack() throws IOException {
        addAudioTrack(defaultAudioEncoder);
    }

    public synchronized void addAudioTrack(int encoder) throws IOException {
        if (sessionUsingTheCamcorder != null)
            throw new IllegalStateException("Audio device is already in use by another client.");
        Stream stream = null;

        switch (encoder) {
        case AUDIO_AAC:
            if (android.os.Build.VERSION.SDK_INT < 14)
                throw new IllegalStateException("This device does not support AAC.");
            if (BuildConfig.DEBUG)
                Log.d(StreamingApp.TAG, "Audio streaming: AAC");
            SharedPreferences prefs = PreferenceManager.getDefaultSharedPreferences(context.getApplicationContext());
            stream = new AACStream(this, prefs);
            break;
        }

        if (stream != null) {
            streamList[AUDIO_TRACK] = stream;
            sessionUsingTheCamcorder = this;
            sessionTrackCount++;
        }
    }

    public synchronized String getSDP() throws IllegalStateException, IOException {
        StringBuilder sdp = new StringBuilder();
        sdp.append("v=0\r\n");

        /*
         * The RFC 4566 (5.2) suggests to use an NTP timestamp here but we will
         * simply use a UNIX timestamp.
         */
        //sdp.append("o=- " + timestamp + " " + timestamp + " IN IP4 127.0.0.1\r\n");
        sdp.append("o=- 0 0 IN IP4 127.0.0.1\r\n");
        sdp.append("s=Vedroid\r\n");
        sdp.append("c=IN IP4 " + host + "\r\n");
        sdp.append("i=N/A\r\n");
        sdp.append("t=0 0\r\n");
        sdp.append("a=tool:Vedroid RTP\r\n");
        int payload = 96;
        int trackId = 1;
        for (int i = 0; i < streamList.length; i++) {
            if (streamList[i] != null) {
                streamList[i].setPayloadType(payload++);
                sdp.append(streamList[i].generateSDP());
                sdp.append("a=control:trackid=" + trackId++ + "\r\n");
            }
        }
        return sdp.toString();
    }

    public String getDest() {
        return host;
    }

    public int getTrackCount() {
        return sessionTrackCount;
    }

    public static boolean isCameraInUse() {
        return sessionUsingTheCamera != null;
    }

    /** Indicates whether or not the microphone is being used in a session. **/
    public static boolean isMicrophoneInUse() {
        return sessionUsingTheCamcorder != null;
    }

    private SessionListener listener = null;

    public synchronized void prepare(int trackId) throws IllegalStateException, IOException {
        Stream stream = streamList[trackId];
        if (stream != null && !stream.isStreaming())
            stream.prepare();
    }

    public synchronized void start(int trackId) throws IllegalStateException, IOException {
        Stream stream = streamList[trackId];
        if (stream != null && !stream.isStreaming()) {
            stream.start();
            if (BuildConfig.DEBUG)
                Log.d(StreamingApp.TAG, "Started " + (trackId == VIDEO_TRACK ? "video" : "audio") + " channel.");
            //            if (++startedStreamCount == 1 && listener != null)
            //                listener.startSession(this);
        }
    }

    public void startAll(SessionListener listener) throws IllegalStateException, IOException {
        this.listener = listener;
        startThread();

        for (int i = 0; i < streamList.length; i++)
            prepare(i);

        /*
         * Important to start video capture before audio capture. This makes
         * audio/video de-sync smaller.
         */
        for (int i = 0; i < streamList.length; i++)
            start(streamList.length - i - 1);
    }

    public synchronized void stopAll() {
        for (int i = 0; i < streamList.length; i++) {
            if (streamList[i] != null && streamList[i].isStreaming()) {
                streamList[i].stop();
                if (BuildConfig.DEBUG)
                    Log.d(StreamingApp.TAG, "Stopped " + (i == VIDEO_TRACK ? "video" : "audio") + " channel.");
                if (--startedStreamCount == 0 && listener != null)
                    listener.stopSession(this);
            }
        }
        stopThread();
        this.listener = null;
        if (BuildConfig.DEBUG)
            Log.d(StreamingApp.TAG, "Session stopped.");
    }

    public synchronized void flush() {
        for (int i = 0; i < streamList.length; i++) {
            if (streamList[i] != null) {
                streamList[i].release();
                if (i == VIDEO_TRACK)
                    sessionUsingTheCamera = null;
                else
                    sessionUsingTheCamcorder = null;
                streamList[i] = null;
            }
        }
    }

    public String getPath() {
        return path;
    }

    public String getUser() {
        return user;
    }

    public String getPass() {
        return pass;
    }

    private final static int MAX_QUEUE_SIZE = 1000;

    private BlockingDeque<Packet> audioQueue = new LinkedBlockingDeque<Packet>(MAX_QUEUE_SIZE);
    private BlockingDeque<Packet> videoQueue = new LinkedBlockingDeque<Packet>(MAX_QUEUE_SIZE);

    private void sendPacket(Packet p) {
        try {
            MediaStream channel = (p.type == PacketType.AudioPacketType ? getAudioTrack() : getVideoTrack());
            p.packetizer.send(p, socket, channel.getPayloadType(), channel.getStreamId());
            getPacketQueue(p.type).remove(p);
        } catch (IOException e) {
            Log.e(StreamingApp.TAG, "Failed to send packet: " + e.getMessage());
        }
    }

    private final ReentrantLock queueLock = new ReentrantLock();
    private final Condition morePackets = queueLock.newCondition();
    private AtomicBoolean stopped = new AtomicBoolean(true);
    private Thread t = null;

    private final void wakeupThread() {
        queueLock.lock();
        try {
            morePackets.signalAll();
        } finally {
            queueLock.unlock();
        }
    }

    public void startThread() {
        if (t == null) {
            t = new Thread(this);
            stopped.set(false);
            t.start();
        }
    }

    public void stopThread() {
        stopped.set(true);
        if (t != null) {
            t.interrupt();
            try {
                wakeupThread();
                t.join();
            } catch (InterruptedException e) {
            }
            t = null;
        }
        audioQueue.clear();
        videoQueue.clear();
    }

    private long getStreamEndSampleTimestamp(BlockingDeque<Packet> queue) {
        long sample = 0;
        try {
            sample = queue.getLast().getSampleTimestamp() + queue.getLast().getFrameLen();
        } catch (Exception e) {
        }
        return sample;
    }

    private PacketType syncType = PacketType.AnyPacketType;
    private boolean aligned = false;

    private final BlockingDeque<Packet> getPacketQueue(PacketType type) {
        return (type == PacketType.AudioPacketType ? audioQueue : videoQueue);
    }

    private void setPacketTimestamp(Packet p) {
        /* Don't sync on SEI packet. */
        if (!aligned && p.type != syncType) {
            long shift = getStreamEndSampleTimestamp(getPacketQueue(syncType));
            Log.w(StreamingApp.TAG, "Set shift +" + shift + "ms to "
                    + (p.type == PacketType.VideoPacketType ? "video" : "audio") + " stream ("
                    + (getPacketQueue(syncType).size() + 1) + ") packets.");
            p.setTimestamp(p.getDuration(shift));
            p.setSampleTimestamp(shift);
            if (listener != null)
                listener.startSession(this);
            aligned = true;
        } else {
            p.setTimestamp(p.packetizer.getTimestamp());
            p.setSampleTimestamp(p.packetizer.getSampleTimestamp());
        }

        p.packetizer.setSampleTimestamp(p.getSampleTimestamp() + p.getFrameLen());
        p.packetizer.setTimestamp(p.getTimestamp() + p.getDuration());

        //        if (BuildConfig.DEBUG) {
        //            Log.d(StreamingApp.TAG, (p.type == PacketType.VideoPacketType ? "Video" : "Audio") + " packet timestamp: "
        //                    + p.getTimestamp() + "; sampleTimestamp: " + p.getSampleTimestamp());
        //        }
    }

    /*
     * Drop first frames if len is less than this. First sync frame will have
     * frame len >= 10 ms.
     */
    private final static int MinimalSyncFrameLength = 15;

    @Override
    public void onPacketReceived(Packet p) {
        queueLock.lock();
        try {
            /*
             * We always synchronize on video stream. Some devices have video
             * coming faster than audio, this is ok. Audio stream time stamps
             * will be adjusted. Other devices that have audio come first will
             * see all audio packets dropped until first video packet comes.
             * Then upon first video packet we again adjust the audio stream by
             * time stamp of the last video packet in the queue.
             */
            if (syncType == PacketType.AnyPacketType && p.type == PacketType.VideoPacketType
                    && p.getFrameLen() >= MinimalSyncFrameLength)
                syncType = p.type;

            if (syncType == PacketType.VideoPacketType) {
                setPacketTimestamp(p);
                if (getPacketQueue(p.type).size() > MAX_QUEUE_SIZE - 1) {
                    Log.w(StreamingApp.TAG, "Queue (" + p.type + ") is full, dropping packet.");
                } else {
                    /*
                     * Wakeup sending thread only if channels synchronization is
                     * already done.
                     */
                    getPacketQueue(p.type).add(p);
                    if (aligned)
                        morePackets.signalAll();
                }
            }
        } finally {
            queueLock.unlock();
        }
    }

    private boolean hasMorePackets(EnumSet<Packet.PacketType> mask) {
        boolean gotPackets;

        if (mask.contains(PacketType.AudioPacketType) && mask.contains(PacketType.VideoPacketType)) {
            gotPackets = (audioQueue.size() > 0 && videoQueue.size() > 0) && aligned;
        } else {
            if (mask.contains(PacketType.AudioPacketType))
                gotPackets = (audioQueue.size() > 0);
            else if (mask.contains(PacketType.VideoPacketType))
                gotPackets = (videoQueue.size() > 0);
            else
                gotPackets = (videoQueue.size() > 0 || audioQueue.size() > 0);
        }
        return gotPackets;
    }

    private void waitPackets(EnumSet<Packet.PacketType> mask) {
        queueLock.lock();
        try {
            do {
                if (!stopped.get() && !hasMorePackets(mask)) {
                    try {
                        morePackets.await();
                    } catch (InterruptedException e) {
                    }
                }
            } while (!stopped.get() && !hasMorePackets(mask));
        } finally {
            queueLock.unlock();
        }
    }

    private void sendPackets() {
        boolean send;
        Packet a, v;

        /*
         * Wait for any type of packet and send asap. With time stamps correctly
         * set, the real send moment is not important and may be quite
         * different. Media server will only check for time stamps.
         */
        waitPackets(of(PacketType.AnyPacketType));

        v = videoQueue.peek();
        if (v != null) {
            sendPacket(v);

            do {
                a = audioQueue.peek();
                if ((send = (a != null && a.getSampleTimestamp() <= v.getSampleTimestamp())))
                    sendPacket(a);
            } while (!stopped.get() && send);
        } else {
            a = audioQueue.peek();
            if (a != null)
                sendPacket(a);
        }
    }

    @Override
    public void run() {
        Log.w(StreamingApp.TAG, "Session thread started.");

        /*
         * Wait for both types of front packets to come and synchronize on each
         * other.
         */
        waitPackets(of(PacketType.AudioPacketType, PacketType.VideoPacketType));

        while (!stopped.get())
            sendPackets();

        Log.w(StreamingApp.TAG, "Flushing session queues.");
        Log.w(StreamingApp.TAG, "    " + audioQueue.size() + " audio packets.");
        Log.w(StreamingApp.TAG, "    " + videoQueue.size() + " video packets.");

        long start = SystemClock.elapsedRealtime();
        while (audioQueue.size() > 0 || videoQueue.size() > 0)
            sendPackets();

        Log.w(StreamingApp.TAG, "Session thread stopped.");
        Log.w(StreamingApp.TAG, "Queues flush took " + (SystemClock.elapsedRealtime() - start) + " ms.");
    }
}
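
To tie the two classes together, usage would look roughly like this (a hedged sketch: host, port, path and credentials are made up, and the camera, surface holder and listener setup are elided):

Session session = new Session(activity, "media.example.com", 554, "/live/test", "user", "pass");
Session.setSurfaceHolder(holder);          // camera preview surface
session.addVideoTrack(camera, cameraInfo); // H.264 video track
session.addAudioTrack();                   // AAC audio track (API 14+)
session.connect();                         // ANNOUNCE, SETUP per track, RECORD
session.startAll(listener);                // start capture and the sending thread
// ... stream for a while ...
session.stopAll();
session.close();                           // TEARDOWN and close the socket
session.flush();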

Answer 6 (score: 0)

Check this answer: Play video over WIFI?

Then, if you want to watch the live stream on an Android phone, include the vlc plugin in your application and connect via the Real Time Streaming Protocol (rtsp):

Intent i = new Intent("org.videolan.vlc.VLCApplication.gui.video.VideoPlayerActivity");
i.setAction(Intent.ACTION_VIEW);
i.setData(Uri.parse("rtsp://10.0.0.179:8086/")); 
startActivity(i);

If you already have VLC installed on your Android phone, you can hand the stream off with an intent, passing the IP address and port number as shown above.