MJPEG streaming and decoding

Date: 2023-02-28

Problem description

I want to receive JPEG images from an IP camera (over RTSP). For this, I tried cvCreateFileCapture_FFMPEG in OpenCV, but ffmpeg seems to have a problem with the MJPEG format of the stream (it automatically tries to detect the stream info), and I end up with the following error:

mjpeg: unsupported coding type

I then decided to use live555 for streaming. So far, I can successfully establish the stream and capture (non-decoded) images through openRTSP.

The question is how I can do this in my application, e.g. in OpenCV. How can I use openRTSP in OpenCV to get images and save them in JPEG format?

I have heard that the data from openRTSP can be sent to a buffer (or a named pipe) and then read into OpenCV's IplImage, but I don't know how to do this.

I would really appreciate any help/suggestions with this problem. I need an answer to either of the following questions:

  1. How can I disable ffmpeg's automatic stream-info detection and specify my own format (mjpeg), or
  2. How can I use openRTSP in OpenCV?

Regards,

Recommended answer

Is this an Axis IP camera? Either way, most IP cameras that provide an MPEG4 RTSP stream can be decoded in OpenCV using cvCreateFileCapture_FFMPEG. However, ffmpeg's MJPEG decoder has a widely known, unresolved issue. I am sure you would have received an error similar to

[ingenient @ 0x97d20c0]Could not find codec parameters (Video: mjpeg)
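
For reference, here is a minimal sketch (not from the original answer, with a placeholder camera URL) of how such an MPEG4 RTSP stream is typically opened through the OpenCV 1.x C API; cvCreateFileCapture hands the URL to ffmpeg internally:

/* Minimal sketch: open an MPEG4 RTSP stream through OpenCV/ffmpeg and display it.
 * The URL is a placeholder -- substitute your camera's actual RTSP path. */
#include <stdio.h>
#include "highgui.h"
#include "cv.h"

int main( void )
{
    CvCapture *capture = cvCreateFileCapture( "rtsp://user:pass@192.168.0.10/mpeg4/media.amp" );
    if( !capture ) {
        fprintf( stderr, "Could not open RTSP stream\n" );
        return 1;
    }

    cvNamedWindow( "rtsp", CV_WINDOW_AUTOSIZE );

    IplImage *frame;
    while( (frame = cvQueryFrame( capture )) != NULL ) {
        cvShowImage( "rtsp", frame );
        if( cvWaitKey( 10 ) == 'q' )   /* press 'q' to quit */
            break;
    }

    cvReleaseCapture( &capture );
    cvDestroyWindow( "rtsp" );
    return 0;
}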

Option 1: Using opencv, libcurl and libjpeg

To view the mjpeg stream in opencv, take a look at the following implementation:

http://www.eecs.ucf.edu/~rpatrick/code/onelinksys.c or http://cse.unl.edu/~rpatrick/code/onelinksys.c
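
For orientation, a rough sketch of the idea behind this option (this is not the linked onelinksys.c, and it assumes an OpenCV build that provides cvDecodeImage): fetch a single JPEG from the camera's HTTP snapshot URL into memory with libcurl and decode it, using OpenCV's decoder in place of raw libjpeg. The snapshot URL and file names are placeholders:

/* Rough sketch: fetch one JPEG over HTTP with libcurl, decode the in-memory bytes.
 * compile with gcc `pkg-config --cflags --libs opencv libcurl` mjpeg_snapshot.c -o mjpeg_snapshot */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <curl/curl.h>
#include "highgui.h"
#include "cv.h"

struct buf { unsigned char *data; size_t len; };

/* libcurl write callback: append received bytes to a growable buffer */
static size_t on_data( char *ptr, size_t size, size_t nmemb, void *userdata )
{
    struct buf *b = (struct buf *) userdata;
    size_t n = size * nmemb;
    b->data = realloc( b->data, b->len + n );
    memcpy( b->data + b->len, ptr, n );
    b->len += n;
    return n;
}

int main( void )
{
    struct buf b = { NULL, 0 };

    curl_global_init( CURL_GLOBAL_ALL );
    CURL *curl = curl_easy_init();
    curl_easy_setopt( curl, CURLOPT_URL, "http://192.168.0.10/jpg/image.jpg" ); /* placeholder */
    curl_easy_setopt( curl, CURLOPT_WRITEFUNCTION, on_data );
    curl_easy_setopt( curl, CURLOPT_WRITEDATA, &b );
    if( curl_easy_perform( curl ) != CURLE_OK ) { fprintf( stderr, "fetch failed\n" ); return 1; }
    curl_easy_cleanup( curl );
    curl_global_cleanup();

    /* wrap the raw JPEG bytes in a CvMat header and let highgui decode them */
    CvMat raw = cvMat( 1, (int) b.len, CV_8UC1, b.data );
    IplImage *img = cvDecodeImage( &raw, CV_LOAD_IMAGE_COLOR );
    if( !img ) { fprintf( stderr, "decode failed\n" ); return 1; }

    cvNamedWindow( "snapshot", CV_WINDOW_AUTOSIZE );
    cvShowImage( "snapshot", img );
    cvWaitKey( 0 );

    cvReleaseImage( &img );
    free( b.data );
    return 0;
}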

Option 2: Using gstreamer (without opencv)

I would recommend looking at gstreamer if your goal is just to view or save jpeg images.

To view the MJPEG stream, one may execute a media pipeline string as follows:

gst-launch -v souphttpsrc location="http://[ip]:[port]/[dir]/xxx.cgi" do-timestamp=true is_live=true ! multipartdemux ! jpegdec ! ffmpegcolorspace ! autovideosink

For RTSP:

gst-launch -v rtspsrc location="rtsp://[user]:[pass]@[ip]:[port]/[dir]/xxx.amp" debug=1 ! rtpmp4vdepay ! mpeg4videoparse ! ffdec_mpeg4 ! ffmpegcolorspace ! autovideosink

To work with the C API, see

http://wiki.maemo.org/Documentation/Maemo_5_Developer_Guide/Using_Camera_Components

For a simple example, take a look at my other post on rtsp for constructing a gstreamer C API media pipeline (this is the same as the gst-launch string, but implemented as a C API):

Playing RTSP with python-gstreamer
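
For orientation only, a bare-bones sketch of the same viewing pipeline driven from the GStreamer C API via gst_parse_launch (assuming GStreamer 0.10-era elements as in the gst-launch strings above; the camera URL is a placeholder):

/* Bare-bones sketch: build the MJPEG viewing pipeline from C with gst_parse_launch.
 * compile with gcc `pkg-config --cflags --libs gstreamer-0.10` gst_mjpeg_view.c -o gst_mjpeg_view */
#include <gst/gst.h>

int main( int argc, char *argv[] )
{
    gst_init( &argc, &argv );

    GError *err = NULL;
    GstElement *pipeline = gst_parse_launch(
        "souphttpsrc location=http://192.168.0.10/video.cgi do-timestamp=true is-live=true "
        "! multipartdemux ! jpegdec ! ffmpegcolorspace ! autovideosink", &err );
    if( !pipeline ) {
        g_printerr( "Failed to build pipeline: %s\n", err->message );
        g_error_free( err );
        return 1;
    }

    /* start playback and spin the GLib main loop until the process is interrupted */
    gst_element_set_state( pipeline, GST_STATE_PLAYING );
    GMainLoop *loop = g_main_loop_new( NULL, FALSE );
    g_main_loop_run( loop );

    gst_element_set_state( pipeline, GST_STATE_NULL );
    gst_object_unref( pipeline );
    g_main_loop_unref( loop );
    return 0;
}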

To save the MJPEG stream as multiple images, use the following pipeline (let us put in a vertical-flip BIN and connect its PADS to the previous and next BINs to make it fancier):

gst-launch souphttpsrc location="http://[ip]:[port]/[dir]/xxx.cgi" do-timestamp=true is_live=true ! multipartdemux ! jpegdec !  videoflip method=vertical-flip ! jpegenc !  multifilesink location=image-out-%05d.jpg

It may also be worth taking a look at gst-opencv.

Update:

Option 3: Using gstreamer, a named pipe and opencv

On Linux, one may take the mjpeg stream, convert it to mpeg4 and feed it to a named pipe, then read the data from the named pipe in opencv.

Step 1. Create a named pipe:

mkfifo stream_fifo

Step 2. Create opencvvideo_test.c:

// compile with gcc -ggdb `pkg-config --cflags --libs opencv` opencvvideo_test.c -o opencvvideo_test
#include <stdio.h>
#include <assert.h>
#include "highgui.h"
#include "cv.h"


int main( int argc, char** argv ){

    IplImage  *frame;
    int        key = 0;

    /* supply the video file or named pipe to play */
    assert( argc == 2 );

    /* open the video source */
    CvCapture *capture = cvCreateFileCapture( argv[1] ); //cvCaptureFromAVI( argv[1] );

    /* always check */
    if( !capture ) return 1;

    /* get fps, needed to set the delay */
    int fps = (int) cvGetCaptureProperty( capture, CV_CAP_PROP_FPS );
    if( fps <= 0 ) fps = 30;   /* pipes often report no fps, fall back to 30 */

    int frameH = (int) cvGetCaptureProperty( capture, CV_CAP_PROP_FRAME_HEIGHT );
    int frameW = (int) cvGetCaptureProperty( capture, CV_CAP_PROP_FRAME_WIDTH );

    /* display video */
    cvNamedWindow( "video", CV_WINDOW_AUTOSIZE );

    while( key != 'q' ) {

        double t1 = (double) cvGetTickCount();
        /* get a frame */
        frame = cvQueryFrame( capture );
        double t2 = (double) cvGetTickCount();
        printf( "time: %gms  fps: %.2g\n",
                (t2 - t1) / (cvGetTickFrequency() * 1000.),
                1000. / ((t2 - t1) / (cvGetTickFrequency() * 1000.)) );

        /* always check */
        if( !frame ) break;

        /* display frame */
        cvShowImage( "video", frame );

        /* quit if user presses 'q' */
        key = cvWaitKey( 1000 / fps );
    }

    /* free memory */
    cvReleaseCapture( &capture );
    cvDestroyWindow( "video" );

    return 0;
}

Step 3. Prepare to convert from MJPEG to MPEG4 using gstreamer (the rate of incoming frames is critical):

gst-launch -v souphttpsrc location="http://<ip>/cgi_bin/<mjpeg>.cgi" do-timestamp=true is_live=true ! multipartdemux ! jpegdec ! queue ! videoscale ! 'video/x-raw-yuv, width=640, height=480' ! queue ! videorate ! 'video/x-raw-yuv,framerate=30/1' ! queue ! ffmpegcolorspace ! 'video/x-raw-yuv,format=(fourcc)I420' ! ffenc_mpeg4 ! queue ! filesink location=stream_fifo

Step 4. Display the stream in OpenCV:

  ./opencvvideo_test stream_fifo
