Capture video from camera on Raspberry Pi and filter in OpenGL before encoding

I would like a way to capture video from the camera interface in Raspberry Pi, run it through a filter written as OpenGL shaders, and then send it to the hardware encoder.

This blog post talks about applying OpenGL shader filters to the camera's output when using raspistill. This is the corresponding source code. In that case, however, the output does not go to the video encoder, and it runs only on stills, not on video. Also (not completely sure) I think this is tied to the preview; see these bits: "raspitex_state A pointer to the GL preview state" and state->ops.redraw = sobel_redraw.
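For context, here is a rough sketch (in C, as in the linked source) of how such a scene appears to plug into raspistill's GL preview layer, modelled on the sobel example. The RASPITEX_STATE / ops names are the ones quoted above; the exact signatures, the texelsize uniform, and the shader wiring are my assumptions, not verified against the tree:

    /* Hypothetical sketch modelled on the blog's sobel scene. */
    static const char *sobel_fshader =
        "#extension GL_OES_EGL_image_external : require\n"
        "precision mediump float;\n"
        "uniform samplerExternalOES tex;\n" /* camera frame, zero-copy */
        "uniform vec2 texelsize;\n"         /* 1/width, 1/height */
        "varying vec2 texcoord;\n"
        "void main(void) {\n"
        "  float tl = length(texture2D(tex, texcoord + texelsize*vec2(-1.,-1.)).rgb);\n"
        "  float cl = length(texture2D(tex, texcoord + texelsize*vec2(-1., 0.)).rgb);\n"
        "  float bl = length(texture2D(tex, texcoord + texelsize*vec2(-1., 1.)).rgb);\n"
        "  float tc = length(texture2D(tex, texcoord + texelsize*vec2( 0.,-1.)).rgb);\n"
        "  float bc = length(texture2D(tex, texcoord + texelsize*vec2( 0., 1.)).rgb);\n"
        "  float tr = length(texture2D(tex, texcoord + texelsize*vec2( 1.,-1.)).rgb);\n"
        "  float cr = length(texture2D(tex, texcoord + texelsize*vec2( 1., 0.)).rgb);\n"
        "  float br = length(texture2D(tex, texcoord + texelsize*vec2( 1., 1.)).rgb);\n"
        "  float gx = tr + 2.*cr + br - tl - 2.*cl - bl;\n" /* horizontal gradient */
        "  float gy = bl + 2.*bc + br - tl - 2.*tc - tr;\n" /* vertical gradient */
        "  gl_FragColor = vec4(vec3(sqrt(gx*gx + gy*gy)), 1.0);\n"
        "}\n";

    /* Scene registration: gl_init compiles the shader, redraw then runs
     * once per preview frame with the camera buffer bound as a texture. */
    int sobel_open(RASPITEX_STATE *state)
    {
        state->ops.gl_init = sobel_init;
        state->ops.redraw  = sobel_redraw;
        return 0;
    }

The shader itself is ordinary GLSL; the Pi-specific part is samplerExternalOES, which is how the camera buffer shows up in GL without a copy.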

The blog also talks about "fastpath"; can someone explain what that means in this context?

Best answer

The texture conversion will work on any MMAL opaque buffer, i.e. camera preview, stills (up to 2000x2000 resolution), or video. However, the example code only does the GL plumbing for the stills preview. I think someone posted a patch on the RPi forums to make it work with RaspiVid, so you might be able to use that.

Fastpath basically means not copying the buffer data to ARM memory and doing a software conversion. So, for the GL rendering it means just passing a handle to GL so the GPU driver can do this directly.
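To make "passing a handle" concrete, here is a minimal sketch of how raspitexutil appears to do it, using the Broadcom EGL_IMAGE_BRCM_MULTIMEDIA extension; the exact headers and call details are assumptions for illustration:

    /* Zero-copy handle pass: wrap an MMAL opaque buffer as an EGLImage
     * and attach it to an external GL texture.  No pixel data is copied
     * to ARM memory; the GPU driver resolves the handle directly. */
    #define EGL_EGLEXT_PROTOTYPES
    #define GL_GLEXT_PROTOTYPES
    #include <EGL/egl.h>
    #include <EGL/eglext.h>
    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>
    #include <interface/mmal/mmal.h>

    static EGLImageKHR wrap_camera_frame(EGLDisplay display, GLuint texture,
                                         MMAL_BUFFER_HEADER_T *buf)
    {
        /* For opaque buffers, buf->data is a GPU-side handle, not pixels. */
        EGLImageKHR image = eglCreateImageKHR(display, EGL_NO_CONTEXT,
                                              EGL_IMAGE_BRCM_MULTIMEDIA,
                                              (EGLClientBuffer) buf->data, NULL);

        /* Shaders sample this texture through a samplerExternalOES uniform. */
        glBindTexture(GL_TEXTURE_EXTERNAL_OES, texture);
        glEGLImageTargetTexture2DOES(GL_TEXTURE_EXTERNAL_OES,
                                     (GLeglImageOES) image);
        return image; /* release later with eglDestroyImageKHR */
    }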

Currently, there is no support/fastpath in the drivers for feeding the OpenGL rendered buffers into the video encoder. Instead, the slow and probably impractical path is to call glReadPixels, convert the buffer to YUV and pass the converted buffer to the encoder.
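For completeness, the slow path would look roughly like this; rgba_to_yuv420 is a hypothetical placeholder for a software colour-space conversion, and the encoder component and buffer-pool setup are elided:

    #include <stdlib.h>
    #include <stdint.h>
    #include <GLES2/gl2.h>
    #include <interface/mmal/mmal.h>

    /* Placeholder: software RGBA -> I420 conversion (hand-rolled loop,
     * libswscale, etc.).  A real version also needs a vertical flip,
     * since glReadPixels returns rows bottom-up. */
    void rgba_to_yuv420(const uint8_t *rgba, uint8_t *yuv,
                        int width, int height);

    static void encode_current_frame(int width, int height,
                                     MMAL_PORT_T *encoder_input,
                                     MMAL_BUFFER_HEADER_T *buf)
    {
        /* This read-back stalls the GL pipeline and copies every frame
         * into ARM memory, which is why the path is impractical. */
        uint8_t *rgba = malloc((size_t) width * height * 4);
        glReadPixels(0, 0, width, height, GL_RGBA, GL_UNSIGNED_BYTE, rgba);

        /* Convert and submit to the hardware encoder's input port;
         * buf should come from the encoder input port's buffer pool. */
        rgba_to_yuv420(rgba, buf->data, width, height);
        buf->length = (uint32_t) width * height * 3 / 2;
        mmal_port_send_buffer(encoder_input, buf);

        free(rgba);
    }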

A fastpath is certainly possible and I've done some work on porting this to the RPI drivers, but there's some other framework required and I won't get a chance to look at this until the New Year.
