Feature/http websocket video streaming - Add HTTP/HTTPS MJPEG and WebSocket Video Streaming Support#13594
Conversation
> Pretty cool, although I think you're missing some more places where the WebSockets lib needs to be added. One example being the Vagrantfile.
Thanks for catching that! You're absolutely right: I've now added qtwebsockets to all the additional Qt installation locations. This ensures consistency across all development environments (CI workflows, Vagrant, and manual setups). The changes have been pushed.
> How can this be tested without needing to buy some sort of camera that supports this? With other gstreamer-based feeds we can simulate streams using gstreamer to validate things work.
@DonLakeFlyer, I've included synthetic test servers that follow the ADSB simulator pattern and don't require a camera or video files to run.

**How to test without a camera:** The test servers generate synthetic test patterns for validation. The project's README also contains GStreamer CLI alternatives if you prefer command-line tools for simulation.

**Real-world testing:** For more comprehensive, real-world testing, you can use PixEagle, which works well with webcams, video files, or the simulator sources mentioned above.
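For example, a synthetic MJPEG feed can be approximated from the command line with a GStreamer test pipeline. This is an illustrative sketch, not the README's exact command; note that `tcpserversink` serves raw multipart data rather than a full HTTP response, so for QGC's `souphttpsrc` the included Flask test server is the more faithful endpoint:

```shell
# Illustrative only: encode a live synthetic test pattern as multipart JPEG.
gst-launch-1.0 videotestsrc is-live=true pattern=smpte \
    ! video/x-raw,width=640,height=480,framerate=30/1 \
    ! jpegenc quality=85 \
    ! multipartmux boundary=frame \
    ! tcpserversink host=127.0.0.1 port=5077
```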
> @alireza787b Interesting feature, we'll get back to it when one of us has more time to test it out.
Add two new video source types to QGroundControl:

1. HTTP MJPEG: Uses a GStreamer souphttpsrc + multipartdemux pipeline for standard HTTP MJPEG streams (IP cameras, PixEagle, etc.)
2. WebSocket: Uses GStreamer appsrc with Qt6 QWebSocket for binary JPEG streaming over WebSocket connections (PixEagle ws/video_feed)

Key implementation details:
- 14 new configurable settings (timeouts, buffer size, keepalive, reconnect delay, heartbeat, adaptive quality parameters)
- Settings captured on the main thread before GStreamer worker dispatch to avoid cross-thread SettingsManager access
- multipartdemux auto-detects the boundary from the Content-Type header (no hardcoded boundary string)
- WebSocket source uses a QAtomicInt guard on appsrc access for thread safety between the Qt main thread and the GStreamer worker
- The GStreamer App component is OPTIONAL: builds without it lose WebSocket support, but HTTP MJPEG still works
- Qt6::WebSockets is linked only when GStreamer App is available (not a global dependency)
- No copyright headers (per QGC coding standards post-Jan 2026)
- Includes Python test servers for validation

Rebased cleanly on current master with all audit fixes applied.

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
c4b0c05 to 295a1c1 Compare

Build Results: all builds passed.
Pre-commit hooks: 32 passed, 78 failed, 10 skipped.
Test Results:
- linux-sanitizers: 56 passed, 0 skipped
- linux_gcc_64: 56 passed, 0 skipped
- Total: 112 passed, 0 skipped

Code Coverage: N/A (no baseline available for comparison)
Artifact Sizes: no baseline available for comparison
Updated: 2026-03-06 14:48:20 UTC • Triggered by: MacOS
Use pkg_check_modules directly with IMPORTED_GLOBAL instead of FindGStreamer.cmake component to avoid directory-scoped IMPORTED target visibility issues with Qt autogen targets. Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Pull request overview
Adds new network-based video sources to QGroundControl’s video pipeline (HTTP/HTTPS MJPEG and WebSocket-fed JPEG via GStreamer appsrc), along with new Video settings and test servers to validate the streams.
Changes:
- Introduces HTTP MJPEG (souphttpsrc → multipartdemux → jpegparse) and WebSocket (appsrc → jpegdec) source creation in the GStreamer receiver.
- Adds new Video settings/Facts and updates the Video settings UI to configure HTTP/WebSocket URLs and some connection parameters.
- Adds Python HTTP MJPEG and WebSocket test servers plus a small README/requirements set.
Reviewed changes
Copilot reviewed 14 out of 14 changed files in this pull request and generated 9 comments.
Show a summary per file
| File | Description |
|---|---|
| test/VideoStreaming/http_mjpeg_server.py | Adds a local Flask-based MJPEG test server for manual validation. |
| test/VideoStreaming/websocket_video_server.py | Adds a local WebSocket JPEG-frame test server for manual validation. |
| test/VideoStreaming/requirements.txt | Adds Python dependencies for the test servers. |
| test/VideoStreaming/README.md | Documents how to run the test servers and expected protocol. |
| src/VideoManager/VideoReceiver/GStreamer/QGCWebSocketVideoSource.h | Declares a Qt WebSocket-to-GStreamer appsrc bridge. |
| src/VideoManager/VideoReceiver/GStreamer/QGCWebSocketVideoSource.cc | Implements WebSocket connection/reconnect/heartbeat and pushing frames to appsrc. |
| src/VideoManager/VideoReceiver/GStreamer/GstVideoReceiver.h | Adds HTTP/WebSocket source helpers and stream settings structs. |
| src/VideoManager/VideoReceiver/GStreamer/GstVideoReceiver.cc | Routes http(s)/ws(s) URIs to new source builders and captures settings. |
| src/VideoManager/VideoReceiver/GStreamer/CMakeLists.txt | Adds optional gstapp + Qt6 WebSockets build wiring for WebSocket support. |
| src/VideoManager/VideoManager.cc | Wires new video sources into URI selection and timeout behavior. |
| src/UI/AppSettings/VideoSettings.qml | Adds URL inputs and basic HTTP/WebSocket settings groups to the UI. |
| src/Settings/VideoSettings.h | Adds new setting Facts and new source string constants. |
| src/Settings/VideoSettings.cc | Registers new sources and setting Facts; adds streamConfigured logic for new URLs. |
| src/Settings/Video.SettingsGroup.json | Adds new HTTP/WebSocket settings metadata entries. |
```cpp
signals:
    void connected();
    void disconnected();
    void errorOccurred(const QString &error);

private slots:
    void _onConnected();
    void _onDisconnected();
    void _onBinaryMessageReceived(const QByteArray &message);
    void _onTextMessageReceived(const QString &message);
    void _onError();
    void _onSslErrors(const QList<QSslError> &errors);
    void _sendHeartbeat();
```
QGCWebSocketVideoSource.h uses QSslError/QString/QByteArray/QList in signals/slots but doesn’t include the required headers (e.g., <QtNetwork/QSslError>, <QtCore/QString>, <QtCore/QByteArray>, <QtCore/QList>). This can break compilation depending on include order; add the proper includes (or a complete-type include for QSslError).
Fixed. Added the required includes (QSslError, QString, QByteArray, QList) and sorted them alphabetically per QGC convention.
```cpp
GST_BUFFER_PTS(buffer) = gst_util_uint64_scale(_framesReceived, GST_SECOND, 30);
GST_BUFFER_DTS(buffer) = GST_BUFFER_PTS(buffer);
GST_BUFFER_DURATION(buffer) = gst_util_uint64_scale(1, GST_SECOND, 30);
```
Appsrc is configured with do-timestamp=TRUE, but _pushFrameToAppsrc sets PTS/DTS/duration using a hard-coded 30 FPS. This will produce incorrect timestamps whenever the server FPS differs and can conflict with appsrc’s timestamping. Prefer leaving timestamps unset (GST_CLOCK_TIME_NONE) and letting appsrc timestamp, or compute timestamps from actual arrival time / negotiated FPS.
Suggested change:

```cpp
// Let appsrc timestamp the buffer (do-timestamp=TRUE) instead of forcing a hard-coded 30 FPS.
GST_BUFFER_PTS(buffer) = GST_CLOCK_TIME_NONE;
GST_BUFFER_DTS(buffer) = GST_CLOCK_TIME_NONE;
GST_BUFFER_DURATION(buffer) = GST_CLOCK_TIME_NONE;
```
Agreed. Switched to GST_CLOCK_TIME_NONE for PTS/DTS/DURATION and relying on do-timestamp=TRUE to timestamp from the pipeline clock. This is the correct pattern for a live source with variable frame rate.
```cpp
void QGCWebSocketVideoSource::_onBinaryMessageReceived(const QByteArray &message)
{
    if (message.isEmpty()) {
        return;
    }

    _pushFrameToAppsrc(message);
    _expectingBinaryFrame = false;
}

void QGCWebSocketVideoSource::_onTextMessageReceived(const QString &message)
{
    // Text messages are JSON metadata from PixEagle-compatible servers
    // Format: {"type":"frame","size":N,"quality":Q}
    if (message.contains(QStringLiteral("frame"))) {
        _expectingBinaryFrame = true;
    }
}
```
_onTextMessageReceived sets _expectingBinaryFrame but _onBinaryMessageReceived ignores it and always pushes any binary message into jpegdec. Since protocol metadata (including declared size) is effectively ignored (_expectedFrameSize is never used), non-frame binary messages or size mismatches can be fed to the decoder. Consider enforcing the expected state/size before pushing to appsrc.
Removed the dead protocol state (_expectingBinaryFrame, _expectedFrameSize). Replaced with a lightweight JPEG SOI marker check (0xFFD8) on incoming binary messages — protocol-agnostic and prevents non-JPEG data from reaching jpegdec.
```cmake
find_package(PkgConfig QUIET)
if(PKG_CONFIG_FOUND)
    pkg_check_modules(GSTREAMER_APP QUIET IMPORTED_TARGET gstreamer-app-1.0)
endif()
if(TARGET PkgConfig::GSTREAMER_APP)
    set_target_properties(PkgConfig::GSTREAMER_APP PROPERTIES IMPORTED_GLOBAL TRUE)
    target_link_libraries(${CMAKE_PROJECT_NAME} PRIVATE PkgConfig::GSTREAMER_APP)
    target_compile_definitions(${CMAKE_PROJECT_NAME} PRIVATE QGC_GST_APP_AVAILABLE)

    target_sources(${CMAKE_PROJECT_NAME}
        PRIVATE
            QGCWebSocketVideoSource.cc
            QGCWebSocketVideoSource.h
    )

    find_package(Qt6 ${QGC_QT_MINIMUM_VERSION}...${QGC_QT_MAXIMUM_VERSION}
        COMPONENTS WebSockets
    )
    if(TARGET Qt6::WebSockets)
        target_link_libraries(${CMAKE_PROJECT_NAME} PRIVATE Qt6::WebSockets)
    else()
        message(WARNING "Qt6::WebSockets not found - WebSocket video streaming will not be available")
    endif()
endif()
```
CMake adds QGCWebSocketVideoSource.* and defines QGC_GST_APP_AVAILABLE based only on gstreamer-app being found, but the source unconditionally includes QtWebSockets headers. If gstapp is present but Qt6::WebSockets is not, this will fail to compile/link. Gate adding these sources/defines on BOTH dependencies (gstapp + Qt6::WebSockets), or otherwise disable WebSocket support cleanly.
Fixed. WebSocket sources and QGC_GST_APP_AVAILABLE are now only added when both gstreamer-app-1.0 and Qt6::WebSockets are found. If either is missing, a status message explains which dependency is unavailable.
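A sketch of that gating (illustrative CMake, not the exact committed diff; target and variable names mirror the snippet above):

```cmake
# Only enable WebSocket video support when BOTH dependencies resolve.
find_package(PkgConfig QUIET)
if(PKG_CONFIG_FOUND)
    pkg_check_modules(GSTREAMER_APP QUIET IMPORTED_TARGET gstreamer-app-1.0)
endif()
find_package(Qt6 COMPONENTS WebSockets QUIET)

if(TARGET PkgConfig::GSTREAMER_APP AND TARGET Qt6::WebSockets)
    target_sources(${CMAKE_PROJECT_NAME} PRIVATE QGCWebSocketVideoSource.cc QGCWebSocketVideoSource.h)
    target_compile_definitions(${CMAKE_PROJECT_NAME} PRIVATE QGC_GST_APP_AVAILABLE)
    target_link_libraries(${CMAKE_PROJECT_NAME} PRIVATE PkgConfig::GSTREAMER_APP Qt6::WebSockets)
else()
    message(STATUS "WebSocket video disabled: requires both gstreamer-app-1.0 and Qt6::WebSockets")
endif()
```

This way the sources that include QtWebSockets headers are never compiled unless the Qt module is actually linkable.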
```qml
SettingsGroupLayout {
    Layout.fillWidth: true
    heading: qsTr("HTTP Stream Settings")
    visible: _isHTTP

    LabelledFactTextField {
        Layout.fillWidth: true
        label: qsTr("Connection Timeout")
        fact: _videoSettings.httpTimeout
    }

    LabelledFactTextField {
        Layout.fillWidth: true
        label: qsTr("Retry Attempts")
        fact: _videoSettings.httpRetryAttempts
    }

    FactCheckBoxSlider {
        Layout.fillWidth: true
        text: qsTr("Keep-Alive")
        fact: _videoSettings.httpKeepAlive
    }
}

SettingsGroupLayout {
    Layout.fillWidth: true
    heading: qsTr("WebSocket Stream Settings")
    visible: _isWebSocket

    LabelledFactTextField {
        Layout.fillWidth: true
        label: qsTr("Connection Timeout")
        fact: _videoSettings.websocketTimeout
    }

    LabelledFactTextField {
        Layout.fillWidth: true
        label: qsTr("Reconnect Delay")
        fact: _videoSettings.websocketReconnectDelay
    }

    LabelledFactTextField {
        Layout.fillWidth: true
        label: qsTr("Heartbeat Interval")
        fact: _videoSettings.websocketHeartbeat
    }
}
```
VideoSettings.qml exposes only a subset of the newly-added HTTP/WebSocket tuning Facts (e.g., httpBufferSize/httpUserAgent/adaptiveQuality/minQuality/maxQuality/websocketBufferFrames are defined in settings but have no UI controls here). Either add the missing controls or remove/keep them internal to avoid confusing “hidden” settings.
Removed the unused settings (httpBufferSize, httpUserAgent, adaptiveQuality, minQuality, maxQuality, websocketBufferFrames). PixEagle's adaptive quality runs server-side (bandwidth EWMA + encoding time + CPU monitoring via AdaptiveQualityEngine), so no client-side quality controls are needed. Buffer size and user-agent remain as hardcoded struct defaults in the GStreamer receiver. Will re-add as user-facing settings if/when client-initiated quality negotiation is implemented.
```python
async def video_handler(websocket):
    """Handle a single WebSocket video client."""
    print(f"Client connected: {websocket.remote_address}")
    frame_number = 0
    fps = 30
    quality = 85
    frame_interval = 1.0 / fps

    try:
        while True:
            start = time.monotonic()

            frame = generate_test_frame(640, 480, frame_number, fps)
            _, jpeg = cv2.imencode('.jpg', frame, [cv2.IMWRITE_JPEG_QUALITY, quality])
            jpeg_bytes = jpeg.tobytes()

            # Send JSON metadata first
            metadata = json.dumps({
                "type": "frame",
                "size": len(jpeg_bytes),
                "quality": quality,
                "frame": frame_number,
                "timestamp": time.time()
            })
            await websocket.send(metadata)

            # Send binary JPEG frame
            await websocket.send(jpeg_bytes)

            frame_number += 1

            elapsed = time.monotonic() - start
            remaining = frame_interval - elapsed
            if remaining > 0:
                await asyncio.sleep(remaining)

    except websockets.exceptions.ConnectionClosed:
        print(f"Client disconnected: {websocket.remote_address}")


async def main(port):
    print(f"Starting WebSocket video server on ws://0.0.0.0:{port}/ws/video_feed")
    async with websockets.serve(video_handler, "0.0.0.0", port):
        await asyncio.Future()


if __name__ == '__main__':
    parser = argparse.ArgumentParser(description='WebSocket Video Test Server')
    parser.add_argument('--port', type=int, default=5078, help='Server port (default: 5078)')
    parser.add_argument('--fps', type=int, default=30, help='Frames per second (default: 30)')
    args = parser.parse_args()

    asyncio.run(main(args.port))
```
--fps is accepted on the command line but ignored: video_handler hard-codes fps=30 and main() doesn’t pass args.fps through. This makes the test server behavior not match the documented usage. Thread the selected FPS into the handler (e.g., via a closure/global) and use it for frame_interval and frame generation.
Fixed. The --fps argument is now threaded through to the handler via functools.partial.
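A minimal sketch of that pattern, with hypothetical helper names (the real handler also encodes and sends frames; `websockets.serve()` invokes the handler with a single connection argument, so any extra parameter must be pre-bound):

```python
import asyncio
import functools

async def video_handler(websocket, fps):
    """Sketch: fps is pre-bound via functools.partial; the server only
    passes the connection object. Returns the pacing interval here for
    illustration; the real handler loops sending frames instead."""
    frame_interval = 1.0 / fps  # bound fps drives pacing, not a hard-coded 30
    return frame_interval

def make_handler(fps):
    # websockets.serve(handler, ...) calls handler(websocket), so bind fps now:
    # async with websockets.serve(make_handler(args.fps), "0.0.0.0", args.port): ...
    return functools.partial(video_handler, fps=fps)

if __name__ == "__main__":
    handler = make_handler(fps=15)
    # Simulate the server invoking the handler with a dummy connection object.
    interval = asyncio.run(handler(object()))
    print(interval)  # 1/15 of a second between frames
```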
src/Settings/VideoSettings.h Outdated
```cpp
DEFINE_SETTINGFACT(adaptiveQuality)
DEFINE_SETTINGFACT(minQuality)
DEFINE_SETTINGFACT(maxQuality)
DEFINE_SETTINGFACT(websocketBufferFrames)
```
These WebSocket adaptive-quality related Facts are introduced but aren’t referenced anywhere outside settings (no implementation or UI). If adaptive quality isn’t implemented yet, consider removing these Facts until they’re wired up, or add the missing logic/UI so they have an effect.
Suggested change: delete these four DEFINE_SETTINGFACT lines.
Removed. Adaptive quality is handled entirely server-side by PixEagle's AdaptiveQualityEngine — no client-side settings needed.
```cpp
g_object_set(_pipeline,
             "message-forward", TRUE,
             nullptr);

_captureStreamSettings();

_source = _makeSource(_uri);
```
_captureStreamSettings() reads SettingsManager/Fact values from the GStreamer worker thread (start() runs on the worker when _needDispatch() is true). Settings/Facts are QObjects owned by the main thread, so this is unsafe cross-thread access. Capture these settings on the caller (main) thread and pass plain values into the worker, or fetch them via a blocking queued invoke to the main thread.
Fixed. _captureStreamSettings() now executes before _needDispatch() dispatches to the worker thread, ensuring Fact values are read on the main thread. The worker then uses the captured plain-value structs (_httpSettings, _wsSettings) safely.
```json
{
    "name": "httpUrl",
    "shortDesc": "HTTP Video URL",
    "longDesc": "HTTP/HTTPS URL for MJPEG video stream. Format: http://host:port/path (e.g., http://192.168.1.100:5077/video_feed for PixEagle, or http://camera-ip/mjpeg for IP cameras).",
    "type": "string",
    "default": ""
},
```
PR description says the default “Video Display Fit” is changed to “Fit Width”, but the metadata here still indicates enumValues 0=Fit Width and default is 1 (Fit Height). If the default is intended to be Fit Width, update the default accordingly (and ensure any related code/UI matches).
The JSON default for videoFit was never changed in this PR — it remains 1 (Fit Height) as before. Corrected the PR description to remove the incorrect claim. No code change needed.
Addressing all 9 review comments plus additional improvements:

Thread safety: settings captured on the main thread before worker dispatch; _wsSource lifecycle fixed.
Build correctness: WebSocket sources gated on both gstreamer-app-1.0 and Qt6::WebSockets.
A/V correctness: GST_CLOCK_TIME_NONE with do-timestamp=TRUE instead of a hardcoded 30 FPS.
Code quality: missing Qt headers added; JPEG SOI validation; _createAndConnectWebSocket() extracted.
YAGNI removals: unused adaptive-quality and buffer/user-agent settings dropped.

Note: adaptive quality is handled server-side by PixEagle's AdaptiveQualityEngine.
- Fix cross-thread safety: capture stream settings on the main thread before dispatching to the GStreamer worker (_captureStreamSettings reads Fact QObjects owned by the main thread)
- Fix CMake: gate WebSocket sources on both gstreamer-app-1.0 AND Qt6::WebSockets to prevent build failure when only one is available
- Fix timestamps: use GST_CLOCK_TIME_NONE with do-timestamp=TRUE instead of hardcoded 30 FPS (correct pattern for a live source with variable FPS)
- Add missing Qt headers in QGCWebSocketVideoSource.h (QSslError, QString, QByteArray, QList)
- Replace dead protocol state (_expectingBinaryFrame/_expectedFrameSize) with JPEG SOI marker validation (0xFFD8)
- Extract _createAndConnectWebSocket() to eliminate duplication between start() and _reconnect()
- Fix _wsSource lifecycle: use a single queued lambda for stop+deleteLater to prevent a race; null the pointer after ownership transfer to the GStreamer bin
- Remove unused settings: adaptiveQuality, minQuality, maxQuality, websocketBufferFrames, httpBufferSize, httpUserAgent (adaptive quality is handled server-side by PixEagle's AdaptiveQualityEngine)
- Fix test server --fps passthrough via functools.partial
This PR implements HTTP/HTTPS MJPEG and WebSocket video streaming for QGroundControl, enabling users to stream video from modern web-based video sources including HTTP servers, WebSocket endpoints, and cloud-based streaming services.
Features
HTTP/HTTPS MJPEG Streaming:
WebSocket Video Streaming:
Settings & UI:
Use Cases:
Technical Implementation
Architecture:
- HTTP MJPEG: souphttpsrc → queue → multipartdemux → jpegdec → [QGC pipeline]
- WebSocket: appsrc (Qt-fed) → jpegdec → [QGC pipeline]

WebSocket Threading Model:
- QGCWebSocketVideoSource is created in the GstVideoWorker thread, then moved to the main thread via moveToThread() for the Qt event loop
- Cross-thread calls use QMetaObject::invokeMethod with Qt::QueuedConnection
- Teardown uses deleteLater() to prevent cross-thread deletion crashes

GStreamer Integration:
- GST_CLOCK_TIME_NONE + do-timestamp=TRUE for automatic relative timestamps

WebSocket Protocol Design:
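The metadata/binary pairing used by the test server (a JSON text message announcing each frame, then the binary JPEG) can be sketched client-side as follows. This is an illustrative parser only, not QGC code; the merged QGC change validates just the JPEG SOI marker, while the size enforcement shown here follows the reviewer's fuller suggestion:

```python
import json

def handle_message(message, state):
    """Pair a JSON metadata text message with the binary JPEG that follows.

    `state` tracks the last announced frame size. Returns the JPEG bytes when
    a complete, valid frame arrives, else None.
    """
    if isinstance(message, str):
        meta = json.loads(message)
        if meta.get("type") == "frame":
            state["expected_size"] = meta.get("size")
        return None
    # Binary payload: accept only JPEG data (SOI marker 0xFFD8) of the
    # announced size; anything else is dropped before decoding.
    if message[:2] == b"\xff\xd8" and len(message) == state.get("expected_size"):
        return message
    return None

state = {}
handle_message('{"type": "frame", "size": 4, "quality": 85}', state)
frame = handle_message(b"\xff\xd8\xff\xe0", state)  # returns the JPEG bytes
```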
Cross-Platform Compatibility:
- find_package for Linux/macOS

Files Modified
- CMakeLists.txt - Added WebSockets to Qt6 required components
- src/Settings/Video.SettingsGroup.json - 14 new settings
- src/Settings/VideoSettings.{h,cc} - Setting facts and validation logic
- src/VideoManager/VideoManager.cc - Stream source routing for HTTP/WebSocket
- src/VideoManager/VideoReceiver/GStreamer/CMakeLists.txt - GStreamer App component, Qt6::WebSockets linking, gstapp-1.0 library
- src/VideoManager/VideoReceiver/GStreamer/GstVideoReceiver.{h,cc} - HTTP and WebSocket pipeline implementation
- src/UI/AppSettings/VideoSettings.qml - UI controls for new settings

Files Added
- src/VideoManager/VideoReceiver/GStreamer/QGCWebSocketVideoSource.h - WebSocket video source class header (~130 lines)
- src/VideoManager/VideoReceiver/GStreamer/QGCWebSocketVideoSource.cc - Implementation (~435 lines)

Commits
Future Extensibility
This implementation provides a foundation for future video streaming enhancements:
Test Steps
Prerequisites
Option 1: PixEagle Drone Simulator (example test server)
- HTTP: http://127.0.0.1:5077/video_feed
- WebSocket: ws://127.0.0.1:5077/ws/video_feed

Option 2: Any Standard HTTP MJPEG Server
Option 3: Custom WebSocket Video Server
Build Requirements:
Test Case 1: HTTP MJPEG Streaming
(http://127.0.0.1:5077/video_feed or your HTTP MJPEG source)

Test Case 2: WebSocket Streaming
(ws://127.0.0.1:5077/ws/video_feed or your WebSocket server)

Test Case 3: Video Display Fit Options
Test Case 4: Adaptive Quality (WebSocket Only)
qgc.videomanager.websocket category
Test Case 6: Thread Safety & Resource Management
Test Case 7: Network Video Sources (Real-World)
Test Case 8: Cross-Platform CI Verification
Checklist:
Related Issue
This PR implements WebSocket and HTTP video streaming capabilities for modern network-based video sources. It addresses the need for flexible video streaming options beyond traditional UDP/RTSP/TCP sources, enabling QGroundControl to work with cloud-based services, IP cameras, and modern web-based video streaming protocols.
Note: This is an enhancement/feature addition implementing new functionality. No specific issue ID as this capability was not previously tracked.
Developer Information:
Alireza Ghaderi (@alireza787b)
Contact: p30planets@gmail.com
By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.