This feature is built on the opencv, numpy, zmq, and base64 libraries. It streams the frames captured by the camera to another machine in real time, where they can be displayed or handed to a host computer for machine-vision processing.
The hardware used in this tutorial is:
a PC and a Raspberry Pi fitted with a camera.
The code used in the experiment is as follows:
RPiCam:
<code>#!/usr/bin/env python3
# File name : server.py
# Description : for FPV video and OpenCV functions
# Website : www.adeept.com
# Author : William (based on Adrian Rosebrock's OpenCV code on pyimagesearch.com)
# Date : 2019/11/21
import cv2
import zmq
import base64
import picamera
from picamera.array import PiRGBArray

IP = '192.168.3.11'  # IP address of the PC running PC.py

# Set up the Pi camera: 640x480 at 20 fps.
camera = picamera.PiCamera()
camera.resolution = (640, 480)
camera.framerate = 20
rawCapture = PiRGBArray(camera, size=(640, 480))

# Connect a PAIR socket to the receiver on port 5555.
context = zmq.Context()
footage_socket = context.socket(zmq.PAIR)
footage_socket.connect('tcp://%s:5555' % IP)
print(IP)

for frame in camera.capture_continuous(rawCapture, format="bgr", use_video_port=True):
    frame_image = frame.array
    # Compress the frame to JPEG, then base64-encode it for the socket.
    encoded, buffer = cv2.imencode('.jpg', frame_image)
    jpg_as_text = base64.b64encode(buffer)
    footage_socket.send(jpg_as_text)
    # Clear the stream buffer before the next capture.
    rawCapture.truncate(0)</code>
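The PAIR socket pattern used above links exactly one sender to exactly one receiver: one side binds, the other connects. Its behaviour can be checked without a camera or a second machine by pairing two sockets over zmq's in-process transport (the `inproc://demo` endpoint name below is arbitrary, chosen for this sketch):

```python
import zmq

context = zmq.Context()

# PAIR sockets form an exclusive one-to-one link; one side binds, the other connects.
receiver = context.socket(zmq.PAIR)
receiver.bind('inproc://demo')

sender = context.socket(zmq.PAIR)
sender.connect('inproc://demo')

# Any bytes sent on one end arrive intact on the other.
sender.send(b'frame-bytes')
message = receiver.recv()
print(message)  # b'frame-bytes'

sender.close()
receiver.close()
context.term()
```

In the tutorial the roles are split across machines: the Pi connects out to the PC, which binds, so only the PC's IP address needs to be known in advance.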
PC:
<code>#!/usr/bin/python3
# File name : PC.py
# Author : William
# Date : 2019/12/23
import cv2
import zmq
import base64
import numpy as np

# Bind a PAIR socket on port 5555 and wait for the Pi to connect.
context = zmq.Context()
footage_socket = context.socket(zmq.PAIR)
footage_socket.bind('tcp://*:5555')

while True:
    # Each message is one base64-encoded JPEG frame.
    frame = footage_socket.recv_string()
    img = base64.b64decode(frame)
    npimg = np.frombuffer(img, dtype=np.uint8)
    # Decode the JPEG bytes back into a BGR image and display it.
    source = cv2.imdecode(npimg, 1)
    cv2.imshow("Stream", source)
    cv2.waitKey(1)</code>
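The transport encoding the two scripts share — raw JPEG bytes wrapped in base64 for the socket, then unwrapped back into a NumPy byte buffer for `cv2.imdecode` — can be exercised in isolation. A minimal sketch, using a dummy byte string as a stand-in for a real JPEG buffer:

```python
import base64
import numpy as np

# Stand-in for the buffer cv2.imencode('.jpg', ...) would produce.
buffer = np.frombuffer(b'\xff\xd8fake-jpeg-data\xff\xd9', dtype=np.uint8)

# Sender side: wrap the raw bytes as base64 text for the socket.
jpg_as_text = base64.b64encode(buffer)

# Receiver side: unwrap and rebuild the byte array for cv2.imdecode.
img = base64.b64decode(jpg_as_text)
npimg = np.frombuffer(img, dtype=np.uint8)

assert np.array_equal(npimg, buffer)  # the round trip is lossless
```

Base64 output is plain ASCII, which is why the Pi can `send()` the encoded bytes while the PC reads them with `recv_string()`; the JPEG compression step, not the base64 step, is what keeps each frame small enough to stream.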