path: root/src/gstreamer
Age  Commit message  Author
2020-08-25  meson: Remove -Wno-unused-parameter  (Laurent Pinchart)
We build libcamera with -Wno-unused-parameter and this doesn't cause many issues internally. However, it prevents catching unused parameters in inline functions defined in public headers. This can lead to compilation warnings for applications compiled without -Wno-unused-parameter. To catch those issues, remove -Wno-unused-parameter and fix all the related warnings with [[maybe_unused]].

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
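As an illustration, the fix pattern looks like this (class and parameter names are hypothetical):

    class EventHandler
    {
    public:
        /* The parameter is kept for interface compatibility but unused here. */
        virtual void handleEvent([[maybe_unused]] int event)
        {
        }
    };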
2020-08-25  libcamera: Remove void specifier for functions that take no arguments  (Laurent Pinchart)
In C++, unlike in C, a function that takes no arguments doesn't need to specify void in the argument list. Drop the unnecessary specifiers.

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
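For example:

    /* C-style declaration; the void is redundant in C++. */
    int count(void);

    /* Equivalent, preferred C++ spelling. */
    int count();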
2020-08-05  libcamera: camera: Rename name() to id()  (Niklas Söderlund)
Rename Camera::name() to Camera::id() to better describe what it represents: a unique and stable ID for the camera. While at it, improve the documentation for the camera ID to describe that it needs to be stable for a camera between resets of the system.

Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-06-18  gst: Replace explicit DRM FourCCs with libcamera formats  (Laurent Pinchart)
Use the new pixel format constants to replace usage of macros from drm_fourcc.h.

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
2020-05-13  licenses: License all meson files under CC0-1.0  (Laurent Pinchart)
In an attempt to clarify the license terms of all files in the libcamera project, the build system files deserve particular attention. While they describe how the binaries are created, they are not themselves transformed into any part of binary distributions of the software, and thus don't influence the copyright on the binary packages. They are however subject to copyright, and thus influence the distribution terms of the source packages.

Most of the meson.build files would not meet the threshold of originality criteria required for copyright protection. Some of the more complex meson.build files may be eligible for copyright protection. To avoid any ambiguity and uncertainty, state our intent to not assert copyrights on the build system files by putting them in the public domain with the CC0-1.0 license.

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Giulio Benetti <giulio.benetti@micronovasrl.com>
Acked-by: Jacopo Mondi <jacopo@jmondi.org>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Acked-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Acked-by: Paul Elder <paul.elder@ideasonboard.com>
Acked-by: Show Liu <show.liu@linaro.org>
2020-03-18  libcamera: framebuffer_allocator: Lift camera restrictions on allocator  (Laurent Pinchart)
The Camera class currently requires the allocator to have no allocated buffer before the camera is reconfigured, and the allocator to be destroyed before the camera is released. There's no basis for these restrictions anymore, remove them.

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
2020-03-18  libcamera: PixelFormat: Make constructor explicit  (Laurent Pinchart)
To achieve the goal of preventing unwanted conversion between a DRM and a V4L2 FourCC, make the PixelFormat constructor that takes an integer value explicit. All users of pixel formats flagged by the compiler are fixed.

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
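A minimal sketch of the effect (simplified class, not the full libcamera definition):

    #include <cstdint>

    class PixelFormat
    {
    public:
        explicit PixelFormat(uint32_t fourcc) : fourcc_(fourcc) {}

    private:
        uint32_t fourcc_;
    };

    void configure(PixelFormat format);

    void test()
    {
        configure(PixelFormat(0x34324742)); /* OK: explicit construction. */
        /* configure(0x34324742);              error: implicit conversion rejected. */
    }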
2020-03-18  libcamera: Use PixelFormat instead of unsigned int where appropriate  (Niklas Söderlund)
Use PixelFormat instead of unsigned int wherever a pixel format is to be used. PixelFormat is defined as an unsigned int but is about to be turned into a class to add functionality. There is no functional change in this patch.

Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: Fix GLib detection  (Laurent Pinchart)
Commit 17cccc68a88f ("Add GStreamer plugin and element skeleton") gained a last minute fix for a clang compilation error with GLib prior to v2.63.0. The fix wasn't properly tested, and failed to check the GLib dependency correctly. This resulted in compilation of the GStreamer element always being disabled. Fix this by changing the GLib package name from 'glib' to 'glib-2.0'.

Fixes: 17cccc68a88f ("Add GStreamer plugin and element skeleton")
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
2020-03-07  gst: Turn the top-level plugin file gstlibcamera.c into a C++ file  (Laurent Pinchart)
The top-level plugin file gstlibcamera.c is the only C source file in the whole libcamera GStreamer element. To avoid specifying both C and C++ compiler arguments in the future, turn it into a C++ file.

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
2020-03-07  gst: libcamerasrc: Prevent src task deadlock on exhausted buffer pool  (Jakub Adam)
Allow GstLibcameraPool to notify the source when a new buffer has become available in a previously exhausted buffer pool. This can be used to resume a src task that got paused because it couldn't acquire a buffer. Without this change, the src task will never resume from pause once the pool gets exhausted.

To trigger the deadlock (it doesn't happen every time), run:

    gst-launch-1.0 libcamerasrc ! queue ! glimagesink

Signed-off-by: Jakub Adam <jakub.adam@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: utils: Factor out the task resume helper  (Jakub Adam)
Task resume will be added to the core GStreamer API in the future, and we will need to call this helper from another location in the following patches.

Signed-off-by: Jakub Adam <jakub.adam@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
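A sketch of what such a helper can look like, built from public GstTask macros (the helper name is illustrative):

    static void
    gst_libcamera_resume_task(GstTask *task)
    {
        /* We only want to resume the task if it's paused. */
        GST_OBJECT_LOCK(task);
        if (GST_TASK_STATE(task) == GST_TASK_PAUSED) {
            GST_TASK_STATE(task) = GST_TASK_STARTED;
            GST_TASK_SIGNAL(task);
        }
        GST_OBJECT_UNLOCK(task);
    }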
2020-03-07  gst: libcamerasrc: Add a TODO comment  (Nicolas Dufresne)
This is to guide upcoming contributors through what is left to do to get to a production-ready element.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerasrc: Implement timestamp support  (Nicolas Dufresne)
This is an experimental patch adding timestamp support to the libcamerasrc element. The patch currently assumes that the driver timestamps are relative to the system monotonic clock. Without a reference clock source, the timestamps are otherwise unusable, and without timestamps only minor use cases can be supported.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
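A sketch of the usual pattern for live sources, assuming the driver stamps frames with the same monotonic clock that backs the pipeline clock (the function name is illustrative):

    static GstClockTime
    gst_libcamera_compute_pts(GstElement *element, GstClockTime driver_ts)
    {
        /* Buffer timestamps are expressed relative to the element's
         * base time, which is itself a monotonic clock reading. */
        GstClockTime base_time = gst_element_get_base_time(element);

        if (!GST_CLOCK_TIME_IS_VALID(base_time) || driver_ts < base_time)
            return 0;

        return driver_ts - base_time;
    }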
2020-03-07  gst: libcamerasrc: Implement initial streaming  (Nicolas Dufresne)
With this patch, the element is now able to push buffers to the next element in the graph. The buffers are currently missing any metadata like timestamps and sequence numbers. These will be added in the next commit.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: pad: Add methods to store and retrieve pending buffers  (Nicolas Dufresne)
These will be useful for streaming. The requestComplete callback will store the buffers on each pad so that the _run() function can pick them up and push them through the pads from a streaming thread.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: Add getters for Stream and FrameBuffer  (Nicolas Dufresne)
This adds getters on pad/pool/allocator so that we can retrieve the Stream or FrameBuffer.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerasrc: Allocate and release buffers  (Nicolas Dufresne)
Set up the allocation and the release of buffers in the element. We have one pooling GstAllocator that wraps the FrameBufferAllocator and tracks the lifetime of FrameBuffer objects. Then, for each pad we have a GstBufferPool object which is only used to avoid re-allocating the GstBuffer structure every time we push a buffer.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerapad: Allow storing a pool  (Nicolas Dufresne)
This adds get/set helpers to store a pool on the pad.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: Add a pool and an allocator implementation  (Nicolas Dufresne)
This is needed to track the lifetime of the FrameBufferAllocator in relation to the GstBuffer/GstMemory objects travelling inside GStreamer.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerasrc: Push segment event  (Nicolas Dufresne)
Now that we have stream-start and caps, we can push a segment event to announce what time our buffers will correlate to. For live sources, this is just an open segment in time format.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
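A minimal sketch of such a segment push (srcpad is assumed to be the element's source pad):

    /* Open-ended segment in time format, as used by live sources. */
    GstSegment segment;

    gst_segment_init(&segment, GST_FORMAT_TIME);
    gst_pad_push_event(srcpad, gst_event_new_segment(&segment));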
2020-03-07  gst: libcamerasrc: Implement minimal caps negotiation  (Nicolas Dufresne)
This is not expected to work in every possible case, but should be sufficient as an initial implementation. It turns the StreamFormats into caps and queries downstream caps with that as a filter. The result is the subset of caps that can be used. We then keep the first structure in that result and fixate it, using the default values found in StreamConfiguration in case a range is available. We then validate this configuration and turn the potentially modified configuration into caps that we push downstream.

Note that we trust the order in StreamFormats as being sorted best first, but this is not currently the case in libcamera. A todo has been added at the head of this file as a reminder to fix that in the core.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
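A sketch of the negotiation core, assuming a helper that converts StreamFormats to caps (the helper name, formats and srcpad are illustrative):

    /* Query what downstream supports, filtered by our own caps. */
    GstCaps *filter = gst_libcamera_stream_formats_to_caps(formats);
    GstCaps *caps = gst_pad_peer_query_caps(srcpad, filter);
    gst_caps_unref(filter);

    /* Keep only the first structure and fixate any remaining ranges. */
    caps = gst_caps_truncate(caps);
    caps = gst_caps_fixate(caps);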
2020-03-07  gst: utils: Add StreamConfiguration helpers  (Nicolas Dufresne)
This adds helpers to deal with the conversion from StreamConfiguration to caps and vice-versa. This is needed to implement caps negotiation.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerasrc: Send stream start event  (Nicolas Dufresne)
Prior to sending caps, we need to send a stream-start event. This requires generating a stream id and a group id. The stream id is random for live sources and the group id is shared across all pads.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
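A sketch of the event sequence (self and srcpad are assumed to be the element and its pad):

    /* Passing NULL as the stream id makes GStreamer generate a random
     * one, which is what live sources want; the group id is shared
     * across all pads. */
    gchar *stream_id = gst_pad_create_stream_id(srcpad, GST_ELEMENT(self), NULL);
    GstEvent *event = gst_event_new_stream_start(stream_id);

    gst_event_set_group_id(event, gst_util_group_id_next());
    gst_pad_push_event(srcpad, event);
    g_free(stream_id);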
2020-03-07  gst: libcamerasrc: Store the srcpad in a vector  (Nicolas Dufresne)
This will allow implementing generic algorithms even though we cannot request pads yet.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerapad: Add a method to access the role  (Nicolas Dufresne)
Each pad can have a different role. Users will have to request and configure their pads' roles before moving to a higher state.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerasrc: Add a task for the streaming thread  (Nicolas Dufresne)
Use a GstTask as our internal streaming thread. Unlike GstBaseSrc, we will be running a streaming thread at the element level rather than per pad. This is needed to combine buffer requests for multiple pads.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerasrc: Implement selection and acquisition  (Nicolas Dufresne)
This adds code to select and acquire a camera. With this, it is now possible to run a pipeline like:

    gst-launch-1.0 libcamerasrc ! fakesink

though no buffers will be streamed yet. To do so, we implement the change_state() virtual method to trigger actions on specific state transitions. Note that we also return GST_STATE_CHANGE_NO_PREROLL in the GST_STATE_CHANGE_READY_TO_PAUSED and GST_STATE_CHANGE_PLAYING_TO_PAUSED transitions, as this is required for all live sources.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerasrc: Add a debug category  (Nicolas Dufresne)
This will allow selecting libcamerasrc traces with the following environment:

    GST_DEBUG=libcamerasrc:7

Or all libcamera GStreamer element traces using:

    GST_DEBUG="libcamera*:7"

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
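The usual GStreamer pattern for this looks like the following sketch (the category variable and function names are illustrative):

    /* Declared once in the element's source file. */
    GST_DEBUG_CATEGORY_STATIC(source_debug);
    #define GST_CAT_DEFAULT source_debug

    static void
    gst_libcamera_src_init_debug(void)
    {
        /* Registers the "libcamerasrc" category selected via GST_DEBUG. */
        GST_DEBUG_CATEGORY_INIT(source_debug, "libcamerasrc", 0,
                                "libcamera Source");
        /* Log sites then use the default category, e.g.
         * GST_DEBUG_OBJECT(self, "Streaming thread has started"); */
    }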
2020-03-07  gst: libcamerasrc: Add camera-name property  (Nicolas Dufresne)
This property will be used to select the camera to use by name.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: libcamerasrc: Allocate and add static pad  (Nicolas Dufresne)
This pad will always be present and will allow simple pipelines to be used to stream from the camera.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: Add pads to the source  (Nicolas Dufresne)
This simply adds the boilerplate for pads on the source element. The design is that we have one pad, called "src", that will always be present, and more pads can be requested in READY state or below. Initially, pads have one property, "stream-role", that lets you decide which role each pad will have.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: utils: Add simple scoped lockers for GMutex and GRecMutex  (Nicolas Dufresne)
While GLib already has locker implementations usable with g_autoptr(), the recursive mutex locker was only introduced in a recent GLib version. Implement simple lockers for GMutex and GRecMutex in order to make locking simpler and safer.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
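A minimal sketch of such a scoped locker (simplified; the real helpers may differ):

    #include <glib.h>

    class GMutexLocker
    {
    public:
        GMutexLocker(GMutex *mutex)
            : mutex_(mutex)
        {
            /* Lock on construction, unlock when the scope exits. */
            g_mutex_lock(mutex_);
        }

        ~GMutexLocker()
        {
            g_mutex_unlock(mutex_);
        }

    private:
        GMutex *mutex_;
    };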
2020-03-07  gst: Add initial device provider  (Nicolas Dufresne)
This feature is used with GstDeviceMonitor in order to enumerate and monitor devices to be used with the source element. The resulting GstDevice implementation is also used by applications to abstract the configuration of the source element.

Implementation notes:
- libcamera does not support polling yet
- The device ID isn't unique in libcamera yet
- The "name" property does not exist in libcamerasrc yet

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2020-03-07  gst: Add utility to convert StreamFormats to GstCaps  (Nicolas Dufresne)
This transforms the basic information found in StreamFormats to GstCaps. This can be handy to reply to early caps queries or inside a device provider. Note that we ignore generated ranges, as they are harmful to caps negotiation. We also don't simplify the caps for readability reasons, so some of the discrete values may be included in a range.

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
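A sketch of the discrete-size part of such a conversion (formats, pixelformat and gst_format are assumptions; gst_format stands for the GStreamer format string mapped from the libcamera PixelFormat):

    GstCaps *caps = gst_caps_new_empty();

    for (const libcamera::Size &size : formats.sizes(pixelformat)) {
        /* One caps structure per discrete size advertised by the stream. */
        GstStructure *s = gst_structure_new("video/x-raw",
                                            "format", G_TYPE_STRING, gst_format,
                                            "width", G_TYPE_INT, (gint)size.width,
                                            "height", G_TYPE_INT, (gint)size.height,
                                            NULL);
        gst_caps_append_structure(caps, s);
    }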
2020-03-07  Add GStreamer plugin and element skeleton  (Nicolas Dufresne)
This implements the GStreamer plugin interface and adds the libcamerasrc element feature to it. This is just enough to allow plugin introspection.

    gst-inspect-1.0 build/src/gstreamer/libgstlibcamera.so
    Plugin Details:
      Name                     libcamera
      Description              libcamera capture plugin
      Filename                 build/src/gstreamer/libgstlibcamera.so
      Version                  0.0.0+1042-6c9f16d3-dirty
      License                  LGPL
      Source module            libcamera
      Binary package           libcamera
      Origin URL               https://libcamera.org

      libcamerasrc: libcamera Source

      1 features:

    GST_PLUGIN_PATH=$(pwd)/build/src/gstreamer gst-inspect-1.0 libcamerasrc
    Factory Details:
      Rank                     primary (256)
      Long-name                libcamera Source
      Klass                    Source/Video
      Description              Linux Camera source using libcamera
      Author                   Nicolas Dufresne <nicolas.dufresne@collabora.com>

    Plugin Details:
      Name                     libcamera
      Description              libcamera capture plugin
      Filename                 /home/nicolas/Sources/libcamera/build/src/gstreamer/libgstlibcamera.so
      Version                  0.0.0+1042-6c9f16d3-dirty
      License                  LGPL
      Source module            libcamera
      Binary package           libcamera
      Origin URL               https://libcamera.org

    GObject
     +----GInitiallyUnowned
           +----GstObject
                 +----GstElement
                       +----GstLibcameraSrc

    Pad Templates:
      none

    Element has no clocking capabilities.
    Element has no URI handling capabilities.

    Pads:
      none

    Element Properties:
      name                : The name of the object
                            flags: readable, writable, 0x2000
                            String. Default: "libcamerasrc0"
      parent              : The parent of the object
                            flags: readable, writable, 0x2000
                            Object of type "GstObject"

Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
[Silence -Wunused-function warning for older GLib versions]
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
/* SPDX-License-Identifier: LGPL-2.1-or-later */
/*
 * Copyright (C) 2019-2023, Raspberry Pi Ltd
 *
 * pipeline_base.cpp - Pipeline handler base class for Raspberry Pi devices
 */

#include "pipeline_base.h"

#include <chrono>

#include <linux/media-bus-format.h>
#include <linux/videodev2.h>

#include <libcamera/base/file.h>
#include <libcamera/base/utils.h>

#include <libcamera/formats.h>
#include <libcamera/logging.h>
#include <libcamera/property_ids.h>

#include "libcamera/internal/camera_lens.h"
#include "libcamera/internal/ipa_manager.h"
#include "libcamera/internal/v4l2_subdevice.h"

using namespace std::chrono_literals;

namespace libcamera {

using namespace RPi;

LOG_DEFINE_CATEGORY(RPI)

using StreamFlag = RPi::Stream::StreamFlag;

namespace {

constexpr unsigned int defaultRawBitDepth = 12;

bool isRaw(const PixelFormat &pixFmt)
{
	/* This test works for both Bayer and raw mono formats. */
	return BayerFormat::fromPixelFormat(pixFmt).isValid();
}

PixelFormat mbusCodeToPixelFormat(unsigned int mbus_code,
				  BayerFormat::Packing packingReq)
{
	BayerFormat bayer = BayerFormat::fromMbusCode(mbus_code);

	ASSERT(bayer.isValid());

	bayer.packing = packingReq;
	PixelFormat pix = bayer.toPixelFormat();

	/*
	 * Not all formats (e.g. 8-bit or 16-bit Bayer formats) can have packed
	 * variants. So if the PixelFormat returns as invalid, use the non-packed
	 * conversion instead.
	 */
	if (!pix.isValid()) {
		bayer.packing = BayerFormat::Packing::None;
		pix = bayer.toPixelFormat();
	}

	return pix;
}

SensorFormats populateSensorFormats(std::unique_ptr<CameraSensor> &sensor)
{
	SensorFormats formats;

	for (auto const mbusCode : sensor->mbusCodes())
		formats.emplace(mbusCode, sensor->sizes(mbusCode));

	return formats;
}

bool isMonoSensor(std::unique_ptr<CameraSensor> &sensor)
{
	unsigned int mbusCode = sensor->mbusCodes()[0];
	const BayerFormat &bayer = BayerFormat::fromMbusCode(mbusCode);

	return bayer.order == BayerFormat::Order::MONO;
}

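/*
 * Score how closely an actual dimension matches the requested one. Lower
 * scores are better: modes larger than the request are penalised less
 * heavily than smaller ones, and any non-exact match is penalised further.
 */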
double scoreFormat(double desired, double actual)
{
	double score = desired - actual;
	/* Modes larger than the request are penalised less than smaller ones. */
	if (score < 0.0)
		score = (-score) / 8;
	/* Penalise non-exact matches. */
	if (actual != desired)
		score *= 2;

	return score;
}

V4L2SubdeviceFormat findBestFormat(const SensorFormats &formatsMap, const Size &req, unsigned int bitDepth)
{
	double bestScore = std::numeric_limits<double>::max(), score;
	V4L2SubdeviceFormat bestFormat;
	bestFormat.colorSpace = ColorSpace::Raw;

	constexpr float penaltyAr = 1500.0;
	constexpr float penaltyBitDepth = 500.0;

	/* Calculate the closest/best mode from the user requested size. */
	for (const auto &iter : formatsMap) {
		const unsigned int mbusCode = iter.first;
		const PixelFormat format = mbusCodeToPixelFormat(mbusCode,
								 BayerFormat::Packing::None);
		const PixelFormatInfo &info = PixelFormatInfo::info(format);

		for (const Size &size : iter.second) {
			double reqAr = static_cast<double>(req.width) / req.height;
			double fmtAr = static_cast<double>(size.width) / size.height;

			/* Score the dimensions for closeness. */
			score = scoreFormat(req.width, size.width);
			score += scoreFormat(req.height, size.height);
			score += penaltyAr * scoreFormat(reqAr, fmtAr);

			/* Add any penalties... this is not an exact science! */
			score += utils::abs_diff(info.bitsPerPixel, bitDepth) * penaltyBitDepth;

			if (score <= bestScore) {
				bestScore = score;
				bestFormat.mbus_code = mbusCode;
				bestFormat.size = size;
			}

			LOG(RPI, Debug) << "Format: " << size
					<< " fmt " << format
					<< " Score: " << score
					<< " (best " << bestScore << ")";
		}
	}

	return bestFormat;
}

const std::vector<ColorSpace> validColorSpaces = {
	ColorSpace::Sycc,
	ColorSpace::Smpte170m,
	ColorSpace::Rec709
};

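/*
 * Return the first entry of validColorSpaces whose primaries and transfer
 * function match the requested colour space, ignoring the YCbCr encoding
 * and range, or std::nullopt if there is none.
 */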
std::optional<ColorSpace> findValidColorSpace(const ColorSpace &colourSpace)
{
	for (auto cs : validColorSpaces) {
		if (colourSpace.primaries == cs.primaries &&
		    colourSpace.transferFunction == cs.transferFunction)
			return cs;
	}

	return std::nullopt;
}

bool isRgb(const PixelFormat &pixFmt)
{
	const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);
	return info.colourEncoding == PixelFormatInfo::ColourEncodingRGB;
}

bool isYuv(const PixelFormat &pixFmt)
{
	/* The code below would return true for raw mono streams, so weed those out first. */
	if (isRaw(pixFmt))
		return false;

	const PixelFormatInfo &info = PixelFormatInfo::info(pixFmt);
	return info.colourEncoding == PixelFormatInfo::ColourEncodingYUV;
}

} /* namespace */

/*
 * Raspberry Pi drivers expect the following colour spaces:
 * - V4L2_COLORSPACE_RAW for raw streams.
 * - One of V4L2_COLORSPACE_JPEG, V4L2_COLORSPACE_SMPTE170M, V4L2_COLORSPACE_REC709 for
 *   non-raw streams. Other fields such as transfer function, YCbCr encoding and
 *   quantisation are not used.
 *
 * The libcamera colour spaces that we wish to use corresponding to these are therefore:
 * - ColorSpace::Raw for V4L2_COLORSPACE_RAW
 * - ColorSpace::Sycc for V4L2_COLORSPACE_JPEG
 * - ColorSpace::Smpte170m for V4L2_COLORSPACE_SMPTE170M
 * - ColorSpace::Rec709 for V4L2_COLORSPACE_REC709
 */
CameraConfiguration::Status RPiCameraConfiguration::validateColorSpaces([[maybe_unused]] ColorSpaceFlags flags)
{
	Status status = Valid;
	yuvColorSpace_.reset();

	for (auto &cfg : config_) {
		/* First fix up raw streams to have the "raw" colour space. */
		if (isRaw(cfg.pixelFormat)) {
			/* If there was no value here, that doesn't count as "adjusted". */
			if (cfg.colorSpace && cfg.colorSpace != ColorSpace::Raw)
				status = Adjusted;
			cfg.colorSpace = ColorSpace::Raw;
			continue;
		}

		/* Next we need to find our shared colour space. The first valid one will do. */
		if (cfg.colorSpace && !yuvColorSpace_)
			yuvColorSpace_ = findValidColorSpace(cfg.colorSpace.value());
	}

	/* If no colour space was given anywhere, choose sYCC. */
	if (!yuvColorSpace_)
		yuvColorSpace_ = ColorSpace::Sycc;

	/* Note the version of this that any RGB streams will have to use. */
	rgbColorSpace_ = yuvColorSpace_;
	rgbColorSpace_->ycbcrEncoding = ColorSpace::YcbcrEncoding::None;
	rgbColorSpace_->range = ColorSpace::Range::Full;

	/* Go through the streams again and force everyone to the same colour space. */
	for (auto &cfg : config_) {
		if (cfg.colorSpace == ColorSpace::Raw)
			continue;

		if (isYuv(cfg.pixelFormat) && cfg.colorSpace != yuvColorSpace_) {
			/* Again, no value means "not adjusted". */
			if (cfg.colorSpace)
				status = Adjusted;
			cfg.colorSpace = yuvColorSpace_;
		}
		if (isRgb(cfg.pixelFormat) && cfg.colorSpace != rgbColorSpace_) {
			/* Be nice, and let the YUV version count as non-adjusted too. */
			if (cfg.colorSpace && cfg.colorSpace != yuvColorSpace_)
				status = Adjusted;
			cfg.colorSpace = rgbColorSpace_;
		}
	}

	return status;
}

CameraConfiguration::Status RPiCameraConfiguration::validate()
{
	Status status = Valid;

	if (config_.empty())
		return Invalid;

	status = validateColorSpaces(ColorSpaceFlag::StreamsShareColorSpace);

	/*
	 * Validate the requested transform against the sensor capabilities and
	 * rotation and store the final combined transform that configure() will
	 * need to apply to the sensor to save us working it out again.
	 */
	Transform requestedTransform = transform;
	combinedTransform_ = data_->sensor_->validateTransform(&transform);
	if (transform != requestedTransform)
		status = Adjusted;

	std::vector<CameraData::StreamParams> rawStreams, outStreams;
	for (const auto &[index, cfg] : utils::enumerate(config_)) {
		if (isRaw(cfg.pixelFormat))
			rawStreams.emplace_back(index, &cfg);
		else
			outStreams.emplace_back(index, &cfg);
	}

	/* Sort the streams so the highest resolution is first. */
	std::sort(rawStreams.begin(), rawStreams.end(),
		  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });

	std::sort(outStreams.begin(), outStreams.end(),
		  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });

	/* Do any platform specific fixups. */
	status = data_->platformValidate(rawStreams, outStreams);
	if (status == Invalid)
		return Invalid;

	/* Further fixups on the RAW streams. */
	for (auto &raw : rawStreams) {
		StreamConfiguration &cfg = config_.at(raw.index);
		V4L2DeviceFormat rawFormat;

		const PixelFormatInfo &info = PixelFormatInfo::info(cfg.pixelFormat);
		unsigned int bitDepth = info.isValid() ? info.bitsPerPixel : defaultRawBitDepth;
		V4L2SubdeviceFormat sensorFormat = findBestFormat(data_->sensorFormats_, cfg.size, bitDepth);

		rawFormat.size = sensorFormat.size;
		rawFormat.fourcc = raw.dev->toV4L2PixelFormat(cfg.pixelFormat);

		int ret = raw.dev->tryFormat(&rawFormat);
		if (ret)
			return Invalid;
		/*
		 * Some sensors change their Bayer order when they are h-flipped
		 * or v-flipped, according to the transform. If this one does, we
		 * must advertise the transformed Bayer order in the raw stream.
		 * Note how we must fetch the "native" (i.e. untransformed) Bayer
		 * order, because the sensor may currently be flipped!
		 */
		V4L2PixelFormat fourcc = rawFormat.fourcc;
		if (data_->flipsAlterBayerOrder_) {
			BayerFormat bayer = BayerFormat::fromV4L2PixelFormat(fourcc);
			bayer.order = data_->nativeBayerOrder_;
			bayer = bayer.transform(combinedTransform_);
			fourcc = bayer.toV4L2PixelFormat();
		}

		PixelFormat inputPixFormat = fourcc.toPixelFormat();
		if (raw.cfg->size != rawFormat.size || raw.cfg->pixelFormat != inputPixFormat) {
			raw.cfg->size = rawFormat.size;
			raw.cfg->pixelFormat = inputPixFormat;
			status = Adjusted;
		}

		raw.cfg->stride = rawFormat.planes[0].bpl;
		raw.cfg->frameSize = rawFormat.planes[0].size;
	}

	/* Further fixups on the ISP output streams. */
	for (auto &out : outStreams) {
		StreamConfiguration &cfg = config_.at(out.index);
		PixelFormat &cfgPixFmt = cfg.pixelFormat;
		V4L2VideoDevice::Formats fmts = out.dev->formats();

		if (fmts.find(out.dev->toV4L2PixelFormat(cfgPixFmt)) == fmts.end()) {
			/* If we cannot find a native format, use a default one. */
			cfgPixFmt = formats::NV12;
			status = Adjusted;
		}

		V4L2DeviceFormat format;
		format.fourcc = out.dev->toV4L2PixelFormat(cfg.pixelFormat);
		format.size = cfg.size;
		/* We want to send the associated YCbCr info through to the driver. */
		format.colorSpace = yuvColorSpace_;

		LOG(RPI, Debug)
			<< "Try color space " << ColorSpace::toString(cfg.colorSpace);

		int ret = out.dev->tryFormat(&format);
		if (ret)
			return Invalid;

		/*
		 * But for RGB streams, the YCbCr info gets overwritten on the way back
		 * so we must check against what the stream cfg says, not what we actually
		 * requested (which carefully included the YCbCr info)!
		 */
		if (cfg.colorSpace != format.colorSpace) {
			status = Adjusted;
			LOG(RPI, Debug)
				<< "Color space changed from "
				<< ColorSpace::toString(cfg.colorSpace) << " to "
				<< ColorSpace::toString(format.colorSpace);
		}

		cfg.colorSpace = format.colorSpace;
		cfg.stride = format.planes[0].bpl;
		cfg.frameSize = format.planes[0].size;
	}

	return status;
}

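/*
 * Translate a sensor subdevice format into a V4L2 video device format for
 * the given capture device, applying the requested Bayer packing where a
 * packed variant of the format exists.
 */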
V4L2DeviceFormat PipelineHandlerBase::toV4L2DeviceFormat(const V4L2VideoDevice *dev,
							 const V4L2SubdeviceFormat &format,
							 BayerFormat::Packing packingReq)
{
	unsigned int mbus_code = format.mbus_code;
	const PixelFormat pix = mbusCodeToPixelFormat(mbus_code, packingReq);
	V4L2DeviceFormat deviceFormat;

	deviceFormat.fourcc = dev->toV4L2PixelFormat(pix);
	deviceFormat.size = format.size;
	deviceFormat.colorSpace = format.colorSpace;
	return deviceFormat;
}

std::unique_ptr<CameraConfiguration>
PipelineHandlerBase::generateConfiguration(Camera *camera, const StreamRoles &roles)
{
	CameraData *data = cameraData(camera);
	std::unique_ptr<CameraConfiguration> config =
		std::make_unique<RPiCameraConfiguration>(data);
	V4L2SubdeviceFormat sensorFormat;
	unsigned int bufferCount;
	PixelFormat pixelFormat;
	V4L2VideoDevice::Formats fmts;
	Size size;
	std::optional<ColorSpace> colorSpace;

	if (roles.empty())
		return config;

	Size sensorSize = data->sensor_->resolution();
	for (const StreamRole role : roles) {
		switch (role) {
		case StreamRole::Raw:
			size = sensorSize;
			sensorFormat = findBestFormat(data->sensorFormats_, size, defaultRawBitDepth);
			pixelFormat = mbusCodeToPixelFormat(sensorFormat.mbus_code,
							    BayerFormat::Packing::CSI2);
			ASSERT(pixelFormat.isValid());
			colorSpace = ColorSpace::Raw;
			bufferCount = 2;
			break;

		case StreamRole::StillCapture:
			fmts = data->ispFormats();
			pixelFormat = formats::NV12;
			/*
			 * Still image codecs usually expect the sYCC color space.
			 * Even RGB codecs will be fine as the RGB we get with the
			 * sYCC color space is the same as sRGB.
			 */
			colorSpace = ColorSpace::Sycc;
			/* Return the largest sensor resolution. */
			size = sensorSize;
			bufferCount = 1;
			break;

		case StreamRole::VideoRecording:
			/*
			 * The colour denoise algorithm requires the analysis
			 * image, produced by the second ISP output, to be in
			 * YUV420 format. Select this format as the default, to
			 * maximize chances that it will be picked by
			 * applications and enable usage of the colour denoise
			 * algorithm.
			 */
			fmts = data->ispFormats();
			pixelFormat = formats::YUV420;
			/*
			 * Choose a color space appropriate for video recording.
			 * Rec.709 will be a good default for HD resolutions.
			 */
			colorSpace = ColorSpace::Rec709;
			size = { 1920, 1080 };
			bufferCount = 4;
			break;

		case StreamRole::Viewfinder:
			fmts = data->ispFormats();
			pixelFormat = formats::ARGB8888;
			colorSpace = ColorSpace::Sycc;
			size = { 800, 600 };
			bufferCount = 4;
			break;

		default:
			LOG(RPI, Error) << "Requested stream role not supported: "
					<< role;
			return nullptr;
		}

		std::map<PixelFormat, std::vector<SizeRange>> deviceFormats;
		if (role == StreamRole::Raw) {
			/* Translate the MBUS codes to a PixelFormat. */
			for (const auto &format : data->sensorFormats_) {
				PixelFormat pf = mbusCodeToPixelFormat(format.first,
								       BayerFormat::Packing::CSI2);
				if (pf.isValid())
					deviceFormats.emplace(std::piecewise_construct, std::forward_as_tuple(pf),
							      std::forward_as_tuple(format.second.begin(), format.second.end()));
			}
		} else {
			/*
			 * Translate the V4L2PixelFormat to PixelFormat. Note that we
			 * limit the recommended largest ISP output size to match the
			 * sensor resolution.
			 */
			for (const auto &format : fmts) {
				PixelFormat pf = format.first.toPixelFormat();
				if (pf.isValid()) {
					const SizeRange &ispSizes = format.second[0];
					deviceFormats[pf].emplace_back(ispSizes.min, sensorSize,
								       ispSizes.hStep, ispSizes.vStep);
				}
			}
		}

		/* Add the stream format based on the device node used for the use case. */
		StreamFormats formats(deviceFormats);
		StreamConfiguration cfg(formats);
		cfg.size = size;
		cfg.pixelFormat = pixelFormat;
		cfg.colorSpace = colorSpace;
		cfg.bufferCount = bufferCount;
		config->addConfiguration(cfg);
	}

	config->validate();

	return config;
}

int PipelineHandlerBase::configure(Camera *camera, CameraConfiguration *config)
{
	CameraData *data = cameraData(camera);
	int ret;

	/* Start by freeing all buffers and reset the stream states. */
	data->freeBuffers();
	for (auto const stream : data->streams_)
		stream->clearFlags(StreamFlag::External);

	std::vector<CameraData::StreamParams> rawStreams, ispStreams;
	std::optional<BayerFormat::Packing> packing;
	unsigned int bitDepth = defaultRawBitDepth;

	for (unsigned i = 0; i < config->size(); i++) {
		StreamConfiguration *cfg = &config->at(i);

		if (isRaw(cfg->pixelFormat))
			rawStreams.emplace_back(i, cfg);
		else
			ispStreams.emplace_back(i, cfg);
	}

	/* Sort the streams so the highest resolution is first. */
	std::sort(rawStreams.begin(), rawStreams.end(),
		  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });

	std::sort(ispStreams.begin(), ispStreams.end(),
		  [](auto &l, auto &r) { return l.cfg->size > r.cfg->size; });

	/*
	 * Calculate the best sensor mode we can use based on the user's request,
	 * and apply it to the sensor with the cached transform, if any.
	 *
	 * If we have been given a RAW stream, use that size for setting up the sensor.
	 */
	if (!rawStreams.empty()) {
		BayerFormat bayerFormat = BayerFormat::fromPixelFormat(rawStreams[0].cfg->pixelFormat);
		/* Replace the user requested packing/bit-depth. */
		packing = bayerFormat.packing;
		bitDepth = bayerFormat.bitDepth;
	}

	V4L2SubdeviceFormat sensorFormat = findBestFormat(data->sensorFormats_,
							  rawStreams.empty() ? ispStreams[0].cfg->size
									     : rawStreams[0].cfg->size,
							  bitDepth);
	/* Apply any cached transform. */
	const RPiCameraConfiguration *rpiConfig = static_cast<const RPiCameraConfiguration *>(config);

	/* Then apply the format on the sensor. */
	ret = data->sensor_->setFormat(&sensorFormat, rpiConfig->combinedTransform_);
	if (ret)
		return ret;

	/*
	 * Platform specific internal stream configuration. This also assigns
	 * external streams which get configured below.
	 */
	ret = data->platformConfigure(sensorFormat, packing, rawStreams, ispStreams);
	if (ret)
		return ret;

	ipa::RPi::ConfigResult result;
	ret = data->configureIPA(config, &result);
	if (ret) {
		LOG(RPI, Error) << "Failed to configure the IPA: " << ret;
		return ret;
	}

	/*
	 * Set the scaler crop to the value we are using (scaled to native sensor
	 * coordinates).
	 */
	data->scalerCrop_ = data->scaleIspCrop(data->ispCrop_);

	/*
	 * Update the ScalerCropMaximum to the correct value for this camera mode.
	 * For us, it's the same as the "analogue crop".
	 *
	 * \todo Make this property the ScalerCrop maximum value when dynamic
	 * controls are available and set it at validate() time
	 */
	data->properties_.set(properties::ScalerCropMaximum, data->sensorInfo_.analogCrop);

	/* Store the mode sensitivity for the application. */
	data->properties_.set(properties::SensorSensitivity, result.modeSensitivity);

	/* Update the controls that the Raspberry Pi IPA can handle. */
	ControlInfoMap::Map ctrlMap;
	for (auto const &c : result.controlInfo)
		ctrlMap.emplace(c.first, c.second);

	/* Add the ScalerCrop control limits based on the current mode. */
	Rectangle ispMinCrop = data->scaleIspCrop(Rectangle(data->ispMinCropSize_));
	ctrlMap[&controls::ScalerCrop] = ControlInfo(ispMinCrop, data->sensorInfo_.analogCrop, data->scalerCrop_);

	data->controlInfo_ = ControlInfoMap(std::move(ctrlMap), result.controlInfo.idmap());

	/* Setup the Video Mux/Bridge entities. */
	for (auto &[device, link] : data->bridgeDevices_) {
		/*
		 * Start by disabling all the sink pad links on the devices in the
		 * cascade, with the exception of the link connecting the device.
		 */
		for (const MediaPad *p : device->entity()->pads()) {
			if (!(p->flags() & MEDIA_PAD_FL_SINK))
				continue;

			for (MediaLink *l : p->links()) {
				if (l != link)
					l->setEnabled(false);
			}
		}

		/*
		 * Next, enable the entity -> entity links, and setup the pad format.
		 *
		 * \todo Some bridge devices may change the media bus code, so we
		 * ought to read the source pad format and propagate it to the sink pad.
		 */
		link->setEnabled(true);
		const MediaPad *sinkPad = link->sink();
		ret = device->setFormat(sinkPad->index(), &sensorFormat);
		if (ret) {
			LOG(RPI, Error) << "Failed to set format on " << device->entity()->name()
					<< " pad " << sinkPad->index()
					<< " with format  " << sensorFormat
					<< ": " << ret;
			return ret;
		}

		LOG(RPI, Debug) << "Configured media link on device " << device->entity()->name()
				<< " on pad " << sinkPad->index();
	}

	return 0;
}

int PipelineHandlerBase::exportFrameBuffers([[maybe_unused]] Camera *camera, libcamera::Stream *stream,
					    std::vector<std::unique_ptr<FrameBuffer>> *buffers)
{
	RPi::Stream *s = static_cast<RPi::Stream *>(stream);
	unsigned int count = stream->configuration().bufferCount;
	int ret = s->dev()->exportBuffers(count, buffers);

	s->setExportedBuffers(buffers);

	return ret;
}

int PipelineHandlerBase::start(Camera *camera, const ControlList *controls)
{
	CameraData *data = cameraData(camera);
	int ret;

	/* Check if a ScalerCrop control was specified. */
	if (controls)
		data->applyScalerCrop(*controls);

	/* Start the IPA. */
	ipa::RPi::StartResult result;
	data->ipa_->start(controls ? *controls : ControlList{ controls::controls },
			  &result);

	/* Apply any gain/exposure settings that the IPA may have passed back. */
	if (!result.controls.empty())
		data->setSensorControls(result.controls);

	/* Configure the number of dropped frames required on startup. */
	data->dropFrameCount_ = data->config_.disableStartupFrameDrops
			      ? 0 : result.dropFrameCount;

	for (auto const stream : data->streams_)
		stream->resetBuffers();

	if (!data->buffersAllocated_) {
		/* Allocate buffers for internal pipeline usage. */
		ret = prepareBuffers(camera);
		if (ret) {
			LOG(RPI, Error) << "Failed to allocate buffers";
			data->freeBuffers();
			stop(camera);
			return ret;
		}
		data->buffersAllocated_ = true;
	}

	/* We need to set the dropFrameCount_ before queueing buffers. */
	ret = queueAllBuffers(camera);
	if (ret) {
		LOG(RPI, Error) << "Failed to queue buffers";
		stop(camera);
		return ret;
	}

	/*
	 * Reset the delayed controls with the gain and exposure values set by
	 * the IPA.
	 */
	data->delayedCtrls_->reset(0);
	data->state_ = CameraData::State::Idle;

	/* Enable SOF event generation. */
	data->frontendDevice()->setFrameStartEnabled(true);

	data->platformStart();

	/* Start all streams. */
	for (auto const stream : data->streams_) {
		ret = stream->dev()->streamOn();
		if (ret) {
			stop(camera);
			return ret;
		}
	}

	return 0;
}

void PipelineHandlerBase::stopDevice(Camera *camera)
{
	CameraData *data = cameraData(camera);

	data->state_ = CameraData::State::Stopped;
	data->platformStop();

	for (auto const stream : data->streams_)
		stream->dev()->streamOff();

	/* Disable SOF event generation. */
	data->frontendDevice()->setFrameStartEnabled(false);

	data->clearIncompleteRequests();

	/* Stop the IPA. */
	data->ipa_->stop();
}

void PipelineHandlerBase::releaseDevice(Camera *camera)
{
	CameraData *data = cameraData(camera);
	data->freeBuffers();
}

int PipelineHandlerBase::queueRequestDevice(Camera *camera, Request *request)
{
	CameraData *data = cameraData(camera);

	if (!data->isRunning())
		return -EINVAL;

	LOG(RPI, Debug) << "queueRequestDevice: New request.";

	/* Push all buffers supplied in the Request to the respective streams. */
	for (auto stream : data->streams_) {
		if (!(stream->getFlags() & StreamFlag::External))
			continue;

		FrameBuffer *buffer = request->findBuffer(stream);
		if (buffer && !stream->getBufferId(buffer)) {
			/*
			 * This buffer is not recognised, so it must have been allocated
			 * outside the v4l2 device. Store it in the stream buffer list
			 * so we can track it.
			 */
			stream->setExternalBuffer(buffer);
		}

		/*
		 * If no buffer is provided by the request for this stream, we
		 * queue a nullptr to the stream to signify that it must use an
		 * internally allocated buffer for this capture request. This
		 * buffer will not be given back to the application, but is used
		 * to support the internal pipeline flow.
		 *
		 * The below queueBuffer() call will do nothing if there are not
		 * enough internal buffers allocated, but this will be handled by
		 * queuing the request for buffers in the RPiStream object.
		 */
		int ret = stream->queueBuffer(buffer);
		if (ret)
			return ret;
	}

	/* Push the request to the back of the queue. */
	data->requestQueue_.push(request);
	data->handleState();

	return 0;
}

int PipelineHandlerBase::registerCamera(std::unique_ptr<RPi::CameraData> &cameraData,
					MediaDevice *frontend, const std::string &frontendName,
					MediaDevice *backend, MediaEntity *sensorEntity)
{
	CameraData *data = cameraData.get();
	int ret;

	data->sensor_ = std::make_unique<CameraSensor>(sensorEntity);
	if (!data->sensor_)
		return -EINVAL;

	if (data->sensor_->init())
		return -EINVAL;

	data->sensorFormats_ = populateSensorFormats(data->sensor_);

	/*
	 * Enumerate all the Video Mux/Bridge devices across the sensor -> frontend
	 * chain. There may be a cascade of devices in this chain!
	 */
	MediaLink *link = sensorEntity->getPadByIndex(0)->links()[0];
	data->enumerateVideoDevices(link, frontendName);

	ipa::RPi::InitResult result;
	if (data->loadIPA(&result)) {
		LOG(RPI, Error) << "Failed to load a suitable IPA library";
		return -EINVAL;
	}

	/*
	 * Setup our delayed control writer with the sensor default
	 * gain and exposure delays. Mark VBLANK for priority write.
	 */
	std::unordered_map<uint32_t, RPi::DelayedControls::ControlParams> params = {
		{ V4L2_CID_ANALOGUE_GAIN, { result.sensorConfig.gainDelay, false } },
		{ V4L2_CID_EXPOSURE, { result.sensorConfig.exposureDelay, false } },
		{ V4L2_CID_HBLANK, { result.sensorConfig.hblankDelay, false } },
		{ V4L2_CID_VBLANK, { result.sensorConfig.vblankDelay, true } }
	};
	data->delayedCtrls_ = std::make_unique<RPi::DelayedControls>(data->sensor_->device(), params);
	data->sensorMetadata_ = result.sensorConfig.sensorMetadata;

	/* Register initial controls that the Raspberry Pi IPA can handle. */
	data->controlInfo_ = std::move(result.controlInfo);

	/* Initialize the camera properties. */
	data->properties_ = data->sensor_->properties();

	/*
	 * The V4L2_CID_NOTIFY_GAINS control, if present, is used to inform the
	 * sensor of the colour gains. It is defined to be a linear gain where
	 * the default value represents a gain of exactly one.
	 */
	auto it = data->sensor_->controls().find(V4L2_CID_NOTIFY_GAINS);
	if (it != data->sensor_->controls().end())
		data->notifyGainsUnity_ = it->second.def().get<int32_t>();

	/*
	 * Set a default value for the ScalerCropMaximum property to show
	 * that we support its use, however, initialise it to zero because
	 * it's not meaningful until a camera mode has been chosen.
	 */
	data->properties_.set(properties::ScalerCropMaximum, Rectangle{});

	/*
	 * We cache two things about the sensor in relation to transforms
	 * (meaning horizontal and vertical flips): if they affect the Bayer
	 * ordering, and what the "native" Bayer order is, when no transforms
	 * are applied.
	 *
	 * We note that the sensor's cached list of supported formats is
	 * already in the "native" order, with any flips having been undone.
	 */
	const V4L2Subdevice *sensor = data->sensor_->device();
	const struct v4l2_query_ext_ctrl *hflipCtrl = sensor->controlInfo(V4L2_CID_HFLIP);
	if (hflipCtrl) {
		/* We assume it will support vflips too... */
		data->flipsAlterBayerOrder_ = hflipCtrl->flags & V4L2_CTRL_FLAG_MODIFY_LAYOUT;
	}

	/* Look for a valid Bayer format. */
	BayerFormat bayerFormat;
	for (const auto &iter : data->sensorFormats_) {
		bayerFormat = BayerFormat::fromMbusCode(iter.first);
		if (bayerFormat.isValid())
			break;
	}

	if (!bayerFormat.isValid()) {
		LOG(RPI, Error) << "No Bayer format found";
		return -EINVAL;
	}
	data->nativeBayerOrder_ = bayerFormat.order;

	ret = platformRegister(cameraData, frontend, backend);
	if (ret)
		return ret;