Continuous webcam capture with OpenCV and Python multiprocessing

zour9fqk · posted 2022-11-15 in Python

I have been reading images from an OpenCV camera in Python and fetching the latest frame from the main program; this is necessary because of problematic hardware.
After messing around with threading and getting very poor efficiency (duh!), I want to switch to multiprocessing.
Here is the threaded version:

import cv2
from threading import Thread

class WebcamStream:
    # initialization method
    def __init__(self, stream_id=0):
        self.stream_id = stream_id  # default is 0 for main camera

        # opening video capture stream
        self.camera = cv2.VideoCapture(self.stream_id)
        self.camera.set(cv2.CAP_PROP_FRAME_WIDTH, 3840)
        self.camera.set(cv2.CAP_PROP_FRAME_HEIGHT, 2880)

        if not self.camera.isOpened():
            print("[Exiting]: Error accessing webcam stream.")
            exit(0)

        # reading a single frame from camera stream for initializing
        _, self.frame = self.camera.read()

        # self.stopped stays True until start() is called
        self.stopped = True

        # thread instantiation
        self.t = Thread(target=self.update, args=())
        self.t.daemon = True  # daemon threads run in background

    # method to start thread
    def start(self):
        self.stopped = False
        self.t.start()

    # method passed to thread to read next available frame
    def update(self):
        while True:
            if self.stopped:
                break
            _, self.frame = self.camera.read()
        self.camera.release()

    # method to return latest read frame
    def read(self):
        return self.frame

    # method to stop reading frames
    def stop(self):
        self.stopped = True

And then:

if __name__ == "__main__":
    main_camera_stream = WebcamStream(stream_id=0)
    main_camera_stream.start()
    frame = main_camera_stream.read()
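
For context, read() is meant to be polled repeatedly for the newest frame; a minimal sketch of such a polling loop (the display window and the key handling are illustrative assumptions, not part of the question):

try:
    while True:
        frame = main_camera_stream.read()  # always the latest frame grabbed by the thread
        cv2.imshow("latest frame", frame)
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
finally:
    main_camera_stream.stop()
    cv2.destroyAllWindows()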

Can someone help me translate this to multiprocessing?
Thank you!


z9smfwbn1#

I've written solutions to a few similar questions before, but it's been a while, so here we go:
I would use shared_memory as a buffer to read the frames into, which can then be read by another process. My first inclination is to initialize the camera and read frames in the child process, since this seems like a "set it and forget it" kind of thing.

import numpy as np
import cv2
from multiprocessing import Process, Queue
from multiprocessing.shared_memory import SharedMemory

def produce_frames(q):
    #get the first frame to calculate size of buffer
    cap = cv2.VideoCapture(0)
    success, frame = cap.read()
    shm = SharedMemory(create=True, size=frame.nbytes)
    framebuffer = np.ndarray(frame.shape, frame.dtype, buffer=shm.buf) #could also maybe use array.array instead of numpy, but I'm familiar with numpy
    framebuffer[:] = frame #in case you need to send the first frame to the main process
    q.put(shm) #send the buffer back to main
    q.put(frame.shape) #send the array details
    q.put(frame.dtype)
    try:
        while True:
            cap.read(framebuffer)
    except KeyboardInterrupt:
        pass
    finally:
        cap.release()
        shm.close() #call this in all processes where the shm exists
        shm.unlink() #call from only one process

def consume_frames(q):
    shm = q.get() #get the shared buffer
    shape = q.get()
    dtype = q.get()
    framebuffer = np.ndarray(shape, dtype, buffer=shm.buf) #reconstruct the array
    try:
        while True:
            cv2.imshow("window title", framebuffer)
            cv2.waitKey(100)
    except KeyboardInterrupt:
        pass
    finally:
        shm.close()

if __name__ == "__main__":
    q = Queue()
    producer = Process(target=produce_frames, args=(q,))
    producer.start()
    consume_frames(q)
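
One caveat about the sketch above: cap.read() writes straight into the shared buffer while the consumer may be reading it, so imshow can in principle display a partially updated frame. If that matters, a multiprocessing.Lock around the producer's write plus a copy on the consumer's side avoids the tearing. A minimal sketch of that variant (the lock argument and the framebuffer.copy() are my own additions, not part of the answer above):

import numpy as np
import cv2
from multiprocessing import Process, Queue, Lock
from multiprocessing.shared_memory import SharedMemory

def produce_frames(q, lock):
    # grab one frame first so we know how big the shared buffer must be
    cap = cv2.VideoCapture(0)
    success, frame = cap.read()
    shm = SharedMemory(create=True, size=frame.nbytes)
    framebuffer = np.ndarray(frame.shape, frame.dtype, buffer=shm.buf)
    framebuffer[:] = frame
    q.put(shm)          # send the buffer (re-attached by name on the other side)
    q.put(frame.shape)  # and the array details
    q.put(frame.dtype)
    try:
        while True:
            with lock:  # hold the lock only while overwriting the shared frame
                cap.read(framebuffer)
    except KeyboardInterrupt:
        pass
    finally:
        cap.release()
        shm.close()
        shm.unlink()

def consume_frames(q, lock):
    shm = q.get()
    shape = q.get()
    dtype = q.get()
    framebuffer = np.ndarray(shape, dtype, buffer=shm.buf)
    try:
        while True:
            with lock:  # copy out a consistent frame before displaying it
                frame = framebuffer.copy()
            cv2.imshow("window title", frame)
            cv2.waitKey(100)
    except KeyboardInterrupt:
        pass
    finally:
        shm.close()

if __name__ == "__main__":
    q = Queue()
    lock = Lock()
    producer = Process(target=produce_frames, args=(q, lock))
    producer.start()
    consume_frames(q, lock)

The trade-off is that the lock is held for the full duration of each cap.read() call, so a slow camera will briefly stall the consumer; for a preview refreshed every 100 ms, as in the waitKey(100) above, that is usually acceptable.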
