video: add video compression support to tcpserversink sample #95862
base: main
Conversation
Good idea to split dependencies out of #92884 to help review.
Is the current PR expected to be reviewed now? I am asking just because I did not see …
You may be interested in this PR which introduces H.264 into the … Thank you for bringing this forward!
#if DT_HAS_CHOSEN(zephyr_videoenc)
		encode_frame(vbuf, &vbuf_out);

		printk("\rSending compressed frame %d (size=%d bytes)\n", i++, vbuf_out->bytesused);
Maybe add the format (here H264) in the print message, based on the CONFIG_...
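A minimal sketch of that suggestion, assuming hypothetical Kconfig symbol names (the sample's actual encoder selection symbols may differ):

```c
/* Pick a codec name for the log line from the build configuration.
 * CONFIG_VIDEO_ENCODE_H264 / CONFIG_VIDEO_ENCODE_MJPEG are placeholder
 * symbols used only for illustration here. */
#if defined(CONFIG_VIDEO_ENCODE_H264)
#define ENCODED_FORMAT_NAME "H264"
#elif defined(CONFIG_VIDEO_ENCODE_MJPEG)
#define ENCODED_FORMAT_NAME "MJPEG"
#else
#define ENCODED_FORMAT_NAME "compressed"
#endif

	printk("\rSending %s frame %d (size=%d bytes)\n",
	       ENCODED_FORMAT_NAME, i++, vbuf_out->bytesused);
```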
Add documentation for the zephyr,videoenc chosen node, used to select a hardware video encoder such as an H264 or MJPEG encoder. Signed-off-by: Hugues Fruchet <hugues.fruchet@foss.st.com>
Sync with video capture sample. Signed-off-by: Hugues Fruchet <hugues.fruchet@foss.st.com>
Add video compression support to lower the network bandwidth. To visualise the camera content on a host PC, use the GStreamer command line: $> gst-launch-1.0 tcpclientsrc host=<board ip address> port=5000 ! decodebin ! autovideosink sync=false Signed-off-by: Hugues Fruchet <hugues.fruchet@foss.st.com>
Add YUV420 semi-planar support (NV12). This is the video encoder's preferred pixel format. Signed-off-by: Hugues Fruchet <hugues.fruchet@foss.st.com>
Allow configuring the number of allocated capture frames. This allows a trade-off between framerate and memory usage: 2 buffers allow capturing while sending data (optimal framerate), while 1 buffer reduces memory usage at the cost of a lower capture framerate. Signed-off-by: Hugues Fruchet <hugues.fruchet@foss.st.com>
Add configuration files for the stm32n6570_dk board. This enables streaming over Ethernet of the images captured by the MB1854 camera module, compressed into a 1920x1088 H264 video bitstream. Signed-off-by: Hugues Fruchet <hugues.fruchet@foss.st.com>
This PR adds support for camera streaming with the camera frames compressed into a video format such as H264, instead of sending large raw uncompressed frames, drastically reducing the network bandwidth needed for this use case.
This video compression mode is enabled when a video encoder is found in device-tree, such as the VENC video encoder driver of the STM32N6 platform [1].
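For reference, the sample keys off the zephyr,videoenc chosen node (see the DT_HAS_CHOSEN(zephyr_videoenc) check quoted above); a minimal board overlay could look like the sketch below, where the &venc node label is an assumption and must match the encoder node defined by the board or SoC devicetree:

```dts
/* Sketch only: select a hardware video encoder for the sample.
 * The "venc" label is hypothetical; use the actual encoder node label. */
/ {
	chosen {
		zephyr,videoenc = &venc;
	};
};
```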
On the stm32n6570_dk board, the MB1854 IMX335 camera module must be plugged into the CSI-2 camera connector, and an RJ45 Ethernet cable must be plugged into the CN6 Ethernet connector. For an optimal image experience, it is advised to embed the STM32 image signal processing middleware [2].
To build the sample (flash using ST-Link and boot with the FSBL [3]):
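A sketch of the build and flash commands, assuming the sample path samples/drivers/video/tcpserversink; the exact board target (for example an FSBL variant qualifier) may need to be adjusted for this platform:

```
west build -b stm32n6570_dk samples/drivers/video/tcpserversink
west flash
```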
The default configuration captures and streams 1080p camera content from the STM32N6, which can be received, decoded, and displayed on a host PC using a GStreamer command line:
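This is the same pipeline as given in the commit message above:

```
gst-launch-1.0 tcpclientsrc host=<board ip address> port=5000 ! decodebin ! autovideosink sync=false
```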
[1] #92884
[2] https://github.com/stm32-hotspot/zephyr-stm32-mw-isp
[3] https://docs.zephyrproject.org/latest/boards/st/stm32n6570_dk/doc/index.html