Diffstat (limited to 'Documentation/guides/pipeline-handler.rst')
-rw-r--r--  Documentation/guides/pipeline-handler.rst | 115
1 file changed, 83 insertions(+), 32 deletions(-)
diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
index 10b9c75c..fe752975 100644
--- a/Documentation/guides/pipeline-handler.rst
+++ b/Documentation/guides/pipeline-handler.rst
@@ -1,5 +1,7 @@
.. SPDX-License-Identifier: CC-BY-SA-4.0
+.. include:: ../documentation-contents.rst
+
Pipeline Handler Writers Guide
==============================
@@ -151,13 +153,14 @@ integrates with the libcamera build system, and a *vivid.cpp* file that matches
the name of the pipeline.
In the *meson.build* file, add the *vivid.cpp* file as a build source for
-libcamera by adding it to the global meson ``libcamera_sources`` variable:
+libcamera by adding it to the global meson ``libcamera_internal_sources``
+variable:
.. code-block:: none
# SPDX-License-Identifier: CC0-1.0
- libcamera_sources += files([
+ libcamera_internal_sources += files([
'vivid.cpp',
])
@@ -183,7 +186,7 @@ to the libcamera build options in the top level ``meson_options.txt``.
option('pipelines',
type : 'array',
- choices : ['ipu3', 'rkisp1', 'rpi/vc4', 'simple', 'uvcvideo', 'vimc', 'vivid'],
+ choices : ['ipu3', 'rkisp1', 'rpi/pisp', 'rpi/vc4', 'simple', 'uvcvideo', 'vimc', 'vivid'],
description : 'Select which pipeline handlers to include')
@@ -258,7 +261,7 @@ implementations for the overridden class members.
return false;
}
- REGISTER_PIPELINE_HANDLER(PipelineHandlerVivid)
+ REGISTER_PIPELINE_HANDLER(PipelineHandlerVivid, "vivid")
} /* namespace libcamera */
@@ -266,6 +269,8 @@ Note that you must register the ``PipelineHandler`` subclass with the pipeline
handler factory using the `REGISTER_PIPELINE_HANDLER`_ macro which
registers it and creates a global symbol to reference the class and make it
available to try and match devices.
+String "vivid" is the name assigned to the pipeline, matching the pipeline
+subdirectory name in the source tree.
.. _REGISTER_PIPELINE_HANDLER: https://libcamera.org/api-html/pipeline__handler_8h.html
@@ -516,14 +521,14 @@ handler and camera manager using `registerCamera`_.
Finally, with a successful construction, we return 'true', indicating that the
PipelineHandler successfully matched and constructed a device.
-.. _Camera::create: https://libcamera.org/api-html/classlibcamera_1_1Camera.html#a453740e0d2a2f495048ae307a85a2574
+.. _Camera::create: https://libcamera.org/internal-api-html/classlibcamera_1_1Camera.html#adf5e6c22411f953bfaa1ae21155d6c31
.. _registerCamera: https://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#adf02a7f1bbd87aca73c0e8d8e0e6c98b
.. code-block:: cpp
std::set<Stream *> streams{ &data->stream_ };
- std::shared_ptr<Camera> camera = Camera::create(this, data->video_->deviceName(), streams);
- registerCamera(std::move(camera), std::move(data));
+ const std::string &id = data->video_->deviceName();
+ std::shared_ptr<Camera> camera = Camera::create(std::move(data), id, streams);
+ registerCamera(std::move(camera));
return true;
@@ -549,8 +554,7 @@ Our match function should now look like the following:
/* Create and register the camera. */
std::set<Stream *> streams{ &data->stream_ };
const std::string &id = data->video_->deviceName();
- std::shared_ptr<Camera> camera = Camera::create(data.release(), id, streams);
+ std::shared_ptr<Camera> camera = Camera::create(std::move(data), id, streams);
registerCamera(std::move(camera));
return true;
@@ -587,12 +591,12 @@ immutable properties of the ``Camera`` device.
The libcamera controls and properties are defined in YAML form which is
processed to automatically generate documentation and interfaces. Controls are
-defined by the src/libcamera/`control_ids.yaml`_ file and camera properties
-are defined by src/libcamera/`properties_ids.yaml`_.
+defined by the src/libcamera/`control_ids_core.yaml`_ file and camera properties
+are defined by src/libcamera/`property_ids_core.yaml`_.
.. _controls framework: https://libcamera.org/api-html/controls_8h.html
-.. _control_ids.yaml: https://libcamera.org/api-html/control__ids_8h.html
-.. _properties_ids.yaml: https://libcamera.org/api-html/property__ids_8h.html
+.. _control_ids_core.yaml: https://libcamera.org/api-html/control__ids_8h.html
+.. _property_ids_core.yaml: https://libcamera.org/api-html/property__ids_8h.html
Pipeline handlers can optionally register the list of controls an application
can set as well as a list of immutable camera properties. Being both
@@ -651,7 +655,7 @@ inline in our VividCameraData init:
ctrls.emplace(id, info);
}
- controlInfo_ = std::move(ctrls);
+ controlInfo_ = ControlInfoMap(std::move(ctrls), controls::controls);
The ``properties_`` field is a list of ``ControlId`` instances
associated with immutable values, which represent static characteristics that can
@@ -672,6 +676,58 @@ handling controls:
#include <libcamera/controls.h>
#include <libcamera/control_ids.h>
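
The ``properties_`` list described above can be populated in the same ``init()``
function. A minimal sketch, assuming the property helpers from
``libcamera/property_ids.h`` and illustrative values for a virtual device:

.. code-block:: cpp

   properties_.set(properties::Location, properties::CameraLocationExternal);
   properties_.set(properties::Model, "Virtual Video Device");
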
+Vendor-specific controls and properties
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Vendor-specific controls and properties must be defined in a separate YAML file
+and included in the build by adding a mapping from the pipeline handler to that
+file in include/libcamera/meson.build. These YAML files live in the
+src/libcamera directory.
+
+For example, adding a Raspberry Pi vendor control file for the PiSP pipeline
+handler is done with the following mapping:
+
+.. code-block:: meson
+
+   controls_map = {
+       'controls': {
+           'draft': 'control_ids_draft.yaml',
+           'libcamera': 'control_ids_core.yaml',
+           'rpi/pisp': 'control_ids_rpi.yaml',
+       },
+
+       'properties': {
+           'draft': 'property_ids_draft.yaml',
+           'libcamera': 'property_ids_core.yaml',
+       }
+   }
+
+The pipeline handler named above must match the pipeline handler option string
+specified in the meson build configuration.
+
+The YAML file defining vendor-specific controls and properties must contain a
+``vendor: <vendor_string>`` tag. Every unique vendor tag must reserve a unique,
+non-overlapping range of control IDs in src/libcamera/control_ranges.yaml.
+
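+A trimmed sketch of what such a range reservation looks like (the offsets shown
+here are illustrative):
+
+.. code-block:: yaml
+
+   ranges:
+     # Core libcamera controls
+     libcamera: 0
+     # Draft controls
+     draft: 10000
+     # Raspberry Pi vendor controls
+     rpi: 20000
+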
+For example, the following block defines a vendor-specific control with the
+``rpi`` vendor tag:
+
+.. code-block:: yaml
+
+   vendor: rpi
+   controls:
+     - PispConfigDumpFile:
+         type: string
+         description: |
+           Triggers the Raspberry Pi PiSP pipeline handler to generate a JSON
+           formatted dump of the Backend configuration to the filename given
+           by the value of the control.
+
+The controls will be generated in the vendor-specific namespace
+``libcamera::controls::rpi``. Additionally, a
+``#define LIBCAMERA_HAS_RPI_VENDOR_CONTROLS`` will be available to allow
+applications to test for the availability of these controls.
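+
+For example, an application could guard its use of the control with that macro,
+as in the following sketch (the ``configureRequest`` helper and the file path
+are illustrative assumptions; the guarded code belongs wherever the application
+fills in its capture requests):
+
+.. code-block:: cpp
+
+   #include <libcamera/control_ids.h>
+   #include <libcamera/request.h>
+
+   void configureRequest(libcamera::Request *request)
+   {
+   #ifdef LIBCAMERA_HAS_RPI_VENDOR_CONTROLS
+           /* Ask the PiSP pipeline handler to dump its backend configuration. */
+           request->controls().set(libcamera::controls::rpi::PispConfigDumpFile,
+                                   std::string("/tmp/pisp-config.json"));
+   #endif
+   }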
+
Generating a default configuration
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -743,8 +799,7 @@ derived class, and assign it to a base class pointer.
.. code-block:: cpp
- VividCameraData *data = cameraData(camera);
- CameraConfiguration *config = new VividCameraConfiguration();
+ auto config = std::make_unique<VividCameraConfiguration>();
A ``CameraConfiguration`` is specific to each pipeline, so you can only create
it from the pipeline handler code path. Applications can also generate an empty
@@ -772,9 +827,7 @@ To generate a ``StreamConfiguration``, you need a list of pixel formats and
frame sizes which are supported as outputs of the stream. You can fetch a map of
the ``V4LPixelFormat`` and ``SizeRange`` supported by the underlying output
device, but the pipeline handler needs to convert this to a
-``libcamera::PixelFormat`` type to pass to applications. We do this here using
-``std::transform`` to convert the formats and populate a new ``PixelFormat`` map
-as shown below.
+``libcamera::PixelFormat`` type to pass to applications.
Continue adding the following code example to our ``generateConfiguration``
implementation.
@@ -784,14 +837,12 @@ implementation.
std::map<V4L2PixelFormat, std::vector<SizeRange>> v4l2Formats =
data->video_->formats();
std::map<PixelFormat, std::vector<SizeRange>> deviceFormats;
- std::transform(v4l2Formats.begin(), v4l2Formats.end(),
- std::inserter(deviceFormats, deviceFormats.begin()),
- [&](const decltype(v4l2Formats)::value_type &format) {
- return decltype(deviceFormats)::value_type{
- format.first.toPixelFormat(),
- format.second
- };
- });
+
+ for (auto &[v4l2PixelFormat, sizes] : v4l2Formats) {
+ PixelFormat pixelFormat = v4l2PixelFormat.toPixelFormat();
+ if (pixelFormat.isValid())
+ deviceFormats.try_emplace(pixelFormat, std::move(sizes));
+ }
The `StreamFormats`_ class holds information about the pixel formats and frame
sizes that a stream can support. The class groups size information by the pixel
@@ -881,9 +932,9 @@ Add the following function implementation to your file:
StreamConfiguration &cfg = config_[0];
- const std::vector<libcamera::PixelFormat> formats = cfg.formats().pixelformats();
+ const std::vector<libcamera::PixelFormat> &formats = cfg.formats().pixelformats();
if (std::find(formats.begin(), formats.end(), cfg.pixelFormat) == formats.end()) {
- cfg.pixelFormat = cfg.formats().pixelformats()[0];
+ cfg.pixelFormat = formats[0];
LOG(VIVID, Debug) << "Adjusting format to " << cfg.pixelFormat.toString();
status = Adjusted;
}
@@ -1293,7 +1344,7 @@ before being set.
continue;
}
- int32_t value = lroundf(it.second.get<float>() * 128 + offset);
+ int32_t value = std::lround(it.second.get<float>() * 128 + offset);
controls.set(cid, std::clamp(value, 0, 255));
}
@@ -1357,7 +1408,7 @@ value translation operations:
.. code-block:: cpp
- #include <math.h>
+ #include <cmath>
Frame completion and event handling
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -1370,7 +1421,7 @@ emitted triggers the execution of the connected slots. A detailed description
of the libcamera implementation is available in the `libcamera Signal and Slot`_
classes documentation.
-.. _Qt Signals and Slots: https://doc.qt.io/qt-5/signalsandslots.html
+.. _Qt Signals and Slots: https://doc.qt.io/qt-6/signalsandslots.html
.. _libcamera Signal and Slot: https://libcamera.org/api-html/classlibcamera_1_1Signal.html#details
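
For illustration, a pipeline handler typically connects its capture device's
buffer completion signal to a member function slot of its camera data, as in
this sketch based on the vivid example used throughout this guide:

.. code-block:: cpp

   /* In match(), before data is handed over to Camera::create(). */
   data->video_->bufferReady.connect(data.get(), &VividCameraData::bufferReady);
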
In order to notify applications about the availability of new frames and data,