Sending captured images from a Python server to a JavaScript client

Asked: 2014-03-21 09:03:27

Tags: javascript python html5 raspberry-pi live-streaming

I'm trying to build a server on a Raspberry Pi that sends live-stream image data to a browser. The server side is written in Python with Tornado, and the client side in HTML and JavaScript; both use WebSocket. (I'm a beginner in JavaScript.)

Here is the code.

Server side:

import tornado.web
import tornado.httpserver
from tornado.ioloop import IOLoop
from tornado.websocket import WebSocketHandler
import cv2.cv as cv

class WSHandler(WebSocketHandler):
    def initialize(self, camera):
        self.camera = camera
        cv.SetCaptureProperty(self.camera.capture, cv.CV_CAP_PROP_FRAME_WIDTH, 480)
        cv.SetCaptureProperty(self.camera.capture, cv.CV_CAP_PROP_FRAME_HEIGHT, 360)

    def open(self):
        print("connection opened")
        while True:
            self.loop()

    def loop(self):
        # grab a frame, JPEG-encode it and send it over the WebSocket as binary data
        img = self.camera.takeImage()
        self.write_message(img, binary=True)

class Camera():
    def __init__(self):
        self.capture = cv.CaptureFromCAM(0)

    def takeImage(self):
        img = cv.QueryFrame(self.capture)
        img = cv.EncodeImage(".jpg", img).tostring()
        return img

def main():
    camera = Camera()
    app = tornado.web.Application([
        (r"/camera", WSHandler, dict(camera=camera)),
    ])
    http_server = tornado.httpserver.HTTPServer(app)
    http_server.listen(8080)
    IOLoop.instance().start()

if __name__ == "__main__":
    main()
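
As a side note, the same capture-and-encode step could also be written against the newer cv2 API. This is only a sketch of my own (not part of the question's code), assuming OpenCV 2.4 where the capture property constants live under cv2.cv:

import cv2

class Camera():
    def __init__(self):
        # cv2.VideoCapture replaces cv.CaptureFromCAM
        self.capture = cv2.VideoCapture(0)
        self.capture.set(cv2.cv.CV_CAP_PROP_FRAME_WIDTH, 480)
        self.capture.set(cv2.cv.CV_CAP_PROP_FRAME_HEIGHT, 360)

    def takeImage(self):
        # grab one frame and JPEG-encode it to a byte string
        ok, frame = self.capture.read()
        if not ok:
            return None
        ok, buf = cv2.imencode(".jpg", frame)
        return buf.tostring()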

Client side:

JavaScript (client.js):

var canvas = document.getElementById("liveCanvas");
var context = canvas.getContext("2d");

var ws = new WebSocket("ws://localhost:8080/camera");
ws.onopen = function(){
        console.log("connection was established");
};
ws.onmessage = function(evt){   
    context.drawImage(evt.data,0,0);
};

HTML (index.html):

<html>
 <head>
  <title>livecamera</title>
 </head>
 <body>
  <canvas id="liveCanvas" width="480" height="360"></canvas>
  <script type="text/javascript" src="./client.js"></script>
 </body>
</html>

When I open this 'index.html' while the server is running, I get the following error:

Uncaught TypeError: Failed to execute 'drawImage' on 'CanvasRenderingContext2D': No function was found that matched the signature provided. 

I guess this is caused by the wrong format of the data the server sends.

My questions are: what data format should I use, how should the server send the data, and how should the client receive it?

1 Answer:

Answer 0 (score: 1):

I found a similar problem between C++ and JavaScript: Display image from blob using javascript and websockets

The server side stays the same as before.
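
One caveat, not from the original answer: the while True loop in open() never gives control back to Tornado's IOLoop, so other connections are starved. A minimal non-blocking sketch using tornado.ioloop.PeriodicCallback (assuming Tornado 3.x) could look like this:

from tornado.ioloop import PeriodicCallback
from tornado.websocket import WebSocketHandler

class WSHandler(WebSocketHandler):
    def initialize(self, camera):
        self.camera = camera

    def open(self):
        print("connection opened")
        # let the IOLoop call loop() roughly ten times per second
        self.timer = PeriodicCallback(self.loop, 100)
        self.timer.start()

    def on_close(self):
        # stop pushing frames once the client disconnects
        self.timer.stop()

    def loop(self):
        img = self.camera.takeImage()
        self.write_message(img, binary=True)

This sends only about ten frames per second, but the IOLoop stays free to service other clients.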

On the client side, 'ws.binaryType' must be set to 'arraybuffer' so that the binary data is received as an ArrayBuffer rather than a Blob. The data is then base64-encoded with the 'encode' function quoted from the link above.

Code:

JavaScript:

var img = document.getElementById("liveImg");
var arrayBuffer;

var ws = new WebSocket("ws://localhost:8080/camera");
ws.binaryType = 'arraybuffer';

ws.onopen = function(){
    console.log("connection was established");
};
ws.onmessage = function(evt){
    arrayBuffer = evt.data;
    img.src = "data:image/jpeg;base64," + encode(new Uint8Array(arrayBuffer));
};

function encode (input) {
    var keyStr = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/=";
    var output = "";
    var chr1, chr2, chr3, enc1, enc2, enc3, enc4;
    var i = 0;

    while (i < input.length) {
        chr1 = input[i++];
        chr2 = i < input.length ? input[i++] : Number.NaN; // Not sure if the index
        chr3 = i < input.length ? input[i++] : Number.NaN; // checks are needed here

        enc1 = chr1 >> 2;
        enc2 = ((chr1 & 3) << 4) | (chr2 >> 4);
        enc3 = ((chr2 & 15) << 2) | (chr3 >> 6);
        enc4 = chr3 & 63;

        if (isNaN(chr2)) {
            enc3 = enc4 = 64;
        } else if (isNaN(chr3)) {
            enc4 = 64;
        }
        output += keyStr.charAt(enc1) + keyStr.charAt(enc2) +
                  keyStr.charAt(enc3) + keyStr.charAt(enc4);
    }
    return output;
}

HTML

I replaced the canvas tag with an img tag:

<html>
 <head>
  <title>livecamera</title>
 </head>
 <body>
  <img id="liveImg" width="480" height="360">
  <script type="text/javascript" src="./client.js"></script>
 </body>
</html>