Streaming low-latency RTSP video to Android using ffmpeg

Asked: 2014-10-19 23:28:24

Tags: android video ffmpeg stream rtsp

I am trying to stream live webcam video from an Ubuntu 12.04 PC to an Android device running KitKat. So far I have written an ffserver configuration file that receives the ffm feed and broadcasts it over the RTSP protocol. I can watch the stream with ffplay on another computer in the same LAN.
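
For reference, that check on the second computer amounts to pointing ffplay at the published stream; the address below is an assumption taken from the Android code later in the question, and the stream name from the ffserver config, so both may need adjusting to the actual setup:

ffplay rtsp://192.168.1.54:4424/test1.sdp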

How can I watch the stream on the Android device? The following code works fine when the webcam image is streamed with VLC, but not with ffmpeg:

import android.app.Activity;
import android.content.Context;
import android.media.MediaPlayer;
import android.net.Uri;
import android.os.Bundle;
import android.util.Log;
import android.view.SurfaceHolder;
import android.view.SurfaceView;
import android.view.Window;
import android.view.WindowManager;

public class MainActivity extends Activity implements MediaPlayer.OnPreparedListener,
        SurfaceHolder.Callback {

    private static final String TAG = "MainActivity";
    final static String RTSP_URL = "rtsp://192.168.1.54:4424/test.sdp";

    private MediaPlayer _mediaPlayer;
    private SurfaceHolder _surfaceHolder;

    @Override
    protected void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        // Set up a full-screen black window.
        requestWindowFeature(Window.FEATURE_NO_TITLE);
        Window window = getWindow();
        window.setFlags(
                WindowManager.LayoutParams.FLAG_FULLSCREEN,
                WindowManager.LayoutParams.FLAG_FULLSCREEN);
        window.setBackgroundDrawableResource(android.R.color.black);
        window.addFlags(WindowManager.LayoutParams.FLAG_KEEP_SCREEN_ON);
        setContentView(R.layout.activity_main);

        // Configure the view that renders the live video.
        // R.id.videoView is a plain SurfaceView element in the layout XML file.
        SurfaceView videoView = (SurfaceView) findViewById(R.id.videoView);
        _surfaceHolder = videoView.getHolder();
        _surfaceHolder.addCallback(this);
        _surfaceHolder.setFixedSize(320, 240);
    }

    @Override
    public void surfaceCreated(SurfaceHolder surfaceHolder) {
        _mediaPlayer = new MediaPlayer();
        _mediaPlayer.setDisplay(_surfaceHolder);
        Context context = getApplicationContext();
        Uri source = Uri.parse(RTSP_URL);
        try {
            // Point the player at the RTSP stream.
            _mediaPlayer.setDataSource(context, source);

            // Begin the asynchronous setup of the video stream.
            _mediaPlayer.setOnPreparedListener(this);
            _mediaPlayer.prepareAsync();
        } catch (Exception e) {
            // Log the failure instead of silently swallowing it.
            Log.e(TAG, "Failed to set up MediaPlayer for " + RTSP_URL, e);
        }
    }

    @Override
    public void surfaceChanged(SurfaceHolder surfaceHolder, int format, int width, int height) {
        // Nothing to do: the surface size is fixed in onCreate().
    }

    @Override
    public void surfaceDestroyed(SurfaceHolder surfaceHolder) {
        // Release the player when the rendering surface goes away.
        if (_mediaPlayer != null) {
            _mediaPlayer.release();
            _mediaPlayer = null;
        }
    }

    @Override
    public void onPrepared(MediaPlayer mediaPlayer) {
        _mediaPlayer.start();
    }
}

My ffserver.config file:

HTTPPort 8090
RTSPBindAddress 0.0.0.0
RTSPPort 4424
MaxBandwidth 10000
CustomLog -

<Feed feed1.ffm>
    File /tmp/feed1.ffm
    FileMaxSize 20M
    ACL allow 127.0.0.1
</Feed>
<Stream test1.sdp>
    Feed feed1.ffm
    Format rtp
    VideoCodec libx264
    VideoSize 640x480
    AVOptionVideo flags +global_header
    AVOptionVideo me_range 16
    AVOptionVideo qdiff 4
    AVOptionVideo qmin 10
    AVOptionVideo qmax 51
    NoAudio
    ACL allow localhost
    ACL allow 192.168.0.0 192.168.255.255
</Stream>

I start the stream with this command: ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -b:v 600k http://localhost:8090/feed1.ffm

1 Answer:

Answer 0 (score: 0)

This failure is most likely caused by different encoding parameters between VLC and FFmpeg: VLC may be using encoding parameters that Android supports, while FFmpeg may be using unsupported ones (most likely the AVC profile and level). Try forcing the Baseline or Main profile and the YUV 4:2:0 pixel format via the FFmpeg command-line options and ffserver.config.
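
A minimal sketch of what that could look like on the command-line side, reusing the command from the question with the profile and pixel format forced (the level value here is an assumption; the highest level a given Android device supports varies):

ffmpeg -f v4l2 -i /dev/video0 -c:v libx264 -profile:v baseline -level 3.0 -pix_fmt yuv420p -b:v 600k http://localhost:8090/feed1.ffm

Whether the same constraints also need to be mirrored in the <Stream> section of ffserver.config (for example via additional AVOptionVideo lines) depends on the ffserver version, so treat that part as something to verify; the profile and pixel format that actually reach the client can be inspected by running ffprobe against the RTSP URL.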
