Diffstat (limited to 'Documentation/guides')
 -rw-r--r--  Documentation/guides/application-developer.rst |  4
 -rw-r--r--  Documentation/guides/introduction.rst          |  2
 -rw-r--r--  Documentation/guides/ipa.rst                   | 40
 -rw-r--r--  Documentation/guides/pipeline-handler.rst      | 87
 4 files changed, 101 insertions(+), 32 deletions(-)
diff --git a/Documentation/guides/application-developer.rst b/Documentation/guides/application-developer.rst
index 1b2d7727..9a9905b1 100644
--- a/Documentation/guides/application-developer.rst
+++ b/Documentation/guides/application-developer.rst
@@ -5,7 +5,7 @@ Using libcamera in a C++ application
This tutorial shows how to create a C++ application that uses libcamera to
interface with a camera on a system, capture frames from it for 3 seconds, and
-write metadata about the frames to standard out.
+write metadata about the frames to standard output.
Application skeleton
--------------------
@@ -348,7 +348,7 @@ The libcamera library uses the concept of `signals and slots` (similar to `Qt
Signals and Slots`_) to connect events with callbacks to handle them.
.. _signals and slots: https://libcamera.org/api-html/classlibcamera_1_1Signal.html#details
-.. _Qt Signals and Slots: https://doc.qt.io/qt-5/signalsandslots.html
+.. _Qt Signals and Slots: https://doc.qt.io/qt-6/signalsandslots.html
The ``Camera`` device emits two signals that applications can connect to in
order to execute callbacks on frame completion events.
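For illustration, a minimal connection to the ``requestCompleted`` signal could
look like the sketch below, modelled on the tutorial's ``requestComplete``
callback; the callback body here is only a placeholder.

.. code-block:: cpp

   #include <libcamera/libcamera.h>

   using namespace libcamera;

   /* Runs in the CameraManager's thread each time a capture request completes. */
   static void requestComplete(Request *request)
   {
       /* Requests cancelled on shutdown carry no valid metadata or buffers. */
       if (request->status() == Request::RequestCancelled)
           return;

       /* Inspect request->metadata() and request->buffers() here. */
   }

   /* Once the camera has been acquired and configured: */
   camera->requestCompleted.connect(requestComplete);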
diff --git a/Documentation/guides/introduction.rst b/Documentation/guides/introduction.rst
index 2d1760c1..700ec2d3 100644
--- a/Documentation/guides/introduction.rst
+++ b/Documentation/guides/introduction.rst
@@ -288,7 +288,7 @@ with dedicated pipeline handlers:
- Intel IPU3 (ipu3)
- Rockchip RK3399 (rkisp1)
- - RaspberryPi 3 and 4 (raspberrypi)
+ - RaspberryPi 3 and 4 (rpi/vc4)
Furthermore, generic platform support is provided for the following:
diff --git a/Documentation/guides/ipa.rst b/Documentation/guides/ipa.rst
index fc031745..25deadef 100644
--- a/Documentation/guides/ipa.rst
+++ b/Documentation/guides/ipa.rst
@@ -19,6 +19,16 @@ connect to, in order to receive data from the IPA asynchronously. In addition,
it contains any custom data structures that the pipeline handler and IPA may
pass to each other.
+It is possible to use the same IPA interface with multiple pipeline handlers
+on different hardware platforms. Generally in such cases, these platforms would
+have a common hardware ISP pipeline. For instance, the rkisp1 pipeline handler
+supports both the RK3399 and the i.MX8MP as they integrate the same ISP.
+However, the i.MX8MP has a more complex camera pipeline, which may call for a
+dedicated pipeline handler in the future. As the ISP is the same as for RK3399,
+the same IPA interface could be used for both pipeline handlers. The build files
+provide a mapping from pipeline handler to the IPA interface name as detailed in
+:ref:`compiling-section`.
+
The IPA protocol refers to the agreement between the pipeline handler and the
IPA regarding the expected response(s) from the IPA for given calls to the IPA.
This protocol doesn't need to be declared anywhere in code, but it shall be
@@ -43,7 +53,7 @@ interface definition is thus written by the pipeline handler author, based on
how they design the interactions between the pipeline handler and the IPA.
The entire IPA interface, including the functions, signals, and any custom
-structs shall be defined in a file named {pipeline_name}.mojom under
+structs shall be defined in a file named {interface_name}.mojom under
include/libcamera/ipa/.
.. _mojo Interface Definition Language: https://chromium.googlesource.com/chromium/src.git/+/master/mojo/public/tools/bindings/README.md
@@ -150,7 +160,7 @@ and the Event IPA interface, which describes the signals received by the
pipeline handler that the IPA can emit. Both must be defined. This section
focuses on the Main IPA interface.
-The main interface must be named as IPA{pipeline_name}Interface.
+The main interface must be named as IPA{interface_name}Interface.
The functions that the pipeline handler can call from the IPA may be
synchronous or asynchronous. Synchronous functions do not return until the IPA
@@ -243,7 +253,7 @@ then it may be empty. These emissions are meant to notify the pipeline handler
of some event, such as request data is ready, and *must not* be used to drive
the camera pipeline from the IPA.
-The event interface must be named as IPA{pipeline_name}EventInterface.
+The event interface must be named as IPA{interface_name}EventInterface.
Functions defined in the event interface are implicitly asynchronous.
Thus they cannot return any value. Specifying the [async] tag is not
@@ -266,38 +276,42 @@ The following is an example of an event interface definition:
setStaggered(libcamera.ControlList controls);
};
+.. _compiling-section:
+
Compiling the IPA interface
---------------------------
-After the IPA interface is defined in include/libcamera/ipa/{pipeline_name}.mojom,
+After the IPA interface is defined in include/libcamera/ipa/{interface_name}.mojom,
an entry for it must be added in meson so that it can be compiled. The filename
-must be added to the ipa_mojom_files object in include/libcamera/ipa/meson.build.
+must be added to the pipeline_ipa_mojom_mapping variable in
+include/libcamera/ipa/meson.build. This variable maps the pipeline handler name
+to its IPA interface file.
For example, adding the raspberrypi.mojom file to meson:
.. code-block:: none
- ipa_mojom_files = [
- 'raspberrypi.mojom',
+ pipeline_ipa_mojom_mapping = [
+ 'rpi/vc4': 'raspberrypi.mojom',
]
This will cause the mojo data definition file to be compiled. Specifically, it
generates five files:
- a header describing the custom data structures, and the complete IPA
- interface (at {$build_dir}/include/libcamera/ipa/{pipeline}_ipa_interface.h)
+ interface (at {$build_dir}/include/libcamera/ipa/{interface}_ipa_interface.h)
- a serializer implementing de/serialization for the custom data structures (at
- {$build_dir}/include/libcamera/ipa/{pipeline}_ipa_serializer.h)
+ {$build_dir}/include/libcamera/ipa/{interface}_ipa_serializer.h)
- a proxy header describing a specialized IPA proxy (at
- {$build_dir}/include/libcamera/ipa/{pipeline}_ipa_proxy.h)
+ {$build_dir}/include/libcamera/ipa/{interface}_ipa_proxy.h)
- a proxy source implementing the IPA proxy (at
- {$build_dir}/src/libcamera/proxy/{pipeline}_ipa_proxy.cpp)
+ {$build_dir}/src/libcamera/proxy/{interface}_ipa_proxy.cpp)
- a proxy worker source implementing the other end of the IPA proxy (at
- {$build_dir}/src/libcamera/proxy/worker/{pipeline}_ipa_proxy_worker.cpp)
+ {$build_dir}/src/libcamera/proxy/worker/{interface}_ipa_proxy_worker.cpp)
The IPA proxy serves as the layer between the pipeline handler and the IPA, and
handles threading vs isolation transparently. The pipeline handler and the IPA
@@ -312,7 +326,7 @@ file, the following header must be included:
.. code-block:: C++
- #include <libcamera/ipa/{pipeline_name}_ipa_interface.h>
+ #include <libcamera/ipa/{interface_name}_ipa_interface.h>
The POD types of the structs simply become their C++ counterparts, e.g. uint32
in mojo will become uint32_t in C++. mojo map becomes C++ std::map, and mojo
diff --git a/Documentation/guides/pipeline-handler.rst b/Documentation/guides/pipeline-handler.rst
index 2d55666d..7e45cdb8 100644
--- a/Documentation/guides/pipeline-handler.rst
+++ b/Documentation/guides/pipeline-handler.rst
@@ -183,7 +183,7 @@ to the libcamera build options in the top level ``meson_options.txt``.
option('pipelines',
type : 'array',
- choices : ['ipu3', 'raspberrypi', 'rkisp1', 'simple', 'uvcvideo', 'vimc', 'vivid'],
+ choices : ['ipu3', 'rkisp1', 'rpi/vc4', 'simple', 'uvcvideo', 'vimc', 'vivid'],
description : 'Select which pipeline handlers to include')
@@ -203,7 +203,7 @@ implementations for the overridden class members.
PipelineHandlerVivid(CameraManager *manager);
CameraConfiguration *generateConfiguration(Camera *camera,
- const StreamRoles &roles) override;
+ Span<const StreamRole> roles) override;
int configure(Camera *camera, CameraConfiguration *config) override;
int exportFrameBuffers(Camera *camera, Stream *stream,
@@ -223,7 +223,7 @@ implementations for the overridden class members.
}
CameraConfiguration *PipelineHandlerVivid::generateConfiguration(Camera *camera,
- const StreamRoles &roles)
+ Span<const StreamRole> roles)
{
return nullptr;
}
@@ -258,7 +258,7 @@ implementations for the overridden class members.
return false;
}
- REGISTER_PIPELINE_HANDLER(PipelineHandlerVivid)
+ REGISTER_PIPELINE_HANDLER(PipelineHandlerVivid, "vivid")
} /* namespace libcamera */
@@ -266,6 +266,8 @@ Note that you must register the ``PipelineHandler`` subclass with the pipeline
handler factory using the `REGISTER_PIPELINE_HANDLER`_ macro which
registers it and creates a global symbol to reference the class and make it
available to try and match devices.
+String "vivid" is the name assigned to the pipeline, matching the pipeline
+subdirectory name in the source tree.
.. _REGISTER_PIPELINE_HANDLER: https://libcamera.org/api-html/pipeline__handler_8h.html
@@ -289,7 +291,7 @@ features:
.. code-block:: cpp
#include <libcamera/base/log.h>
-
+
#include "libcamera/internal/pipeline_handler.h"
Run the following commands:
@@ -587,12 +589,12 @@ immutable properties of the ``Camera`` device.
The libcamera controls and properties are defined in YAML form which is
processed to automatically generate documentation and interfaces. Controls are
-defined by the src/libcamera/`control_ids.yaml`_ file and camera properties
-are defined by src/libcamera/`properties_ids.yaml`_.
+defined by the src/libcamera/`control_ids_core.yaml`_ file and camera properties
+are defined by src/libcamera/`properties_ids_core.yaml`_.
.. _controls framework: https://libcamera.org/api-html/controls_8h.html
-.. _control_ids.yaml: https://libcamera.org/api-html/control__ids_8h.html
-.. _properties_ids.yaml: https://libcamera.org/api-html/property__ids_8h.html
+.. _control_ids_core.yaml: https://libcamera.org/api-html/control__ids_8h.html
+.. _properties_ids_core.yaml: https://libcamera.org/api-html/property__ids_8h.html
Pipeline handlers can optionally register the list of controls an application
can set as well as a list of immutable camera properties. Being both
@@ -651,7 +653,7 @@ inline in our VividCameraData init:
ctrls.emplace(id, info);
}
- controlInfo_ = std::move(ctrls);
+ controlInfo_ = ControlInfoMap(std::move(ctrls), controls::controls);
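For context, the second argument associates the control information with the
global libcamera ``ControlIdMap``, so that controls can also be resolved by
numerical ID. A minimal sketch of the construction (assuming
<libcamera/controls.h> and <libcamera/control_ids.h> are included; the
Brightness limits shown are illustrative):

.. code-block:: cpp

   ControlInfoMap::Map ctrls;

   /* Describe the limits and default of each control the camera supports. */
   ctrls.emplace(&controls::Brightness,
                 ControlInfo{ { -1.0f }, { 1.0f }, { 0.0f } });

   /* The ControlIdMap argument lets the map look up controls by numerical ID. */
   controlInfo_ = ControlInfoMap(std::move(ctrls), controls::controls);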
The ``properties_`` field is a list of ``ControlId`` instances
associated with immutable values, which represent static characteristics that can
@@ -672,6 +674,58 @@ handling controls:
#include <libcamera/controls.h>
#include <libcamera/control_ids.h>
+Vendor-specific controls and properties
+~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
+
+Vendor-specific controls and properties must be defined in a separate YAML file
+and included in the build by defining the pipeline handler to file mapping in
+include/libcamera/meson.build. These YAML files live in the src/libcamera
+directory.
+
+For example, adding a Raspberry Pi vendor control file for the PiSP pipeline
+handler is done with the following mapping:
+
+.. code-block:: meson
+
+ controls_map = {
+ 'controls': {
+ 'draft': 'control_ids_draft.yaml',
+ 'libcamera': 'control_ids_core.yaml',
+ 'rpi/pisp': 'control_ids_rpi.yaml',
+ },
+
+ 'properties': {
+ 'draft': 'property_ids_draft.yaml',
+ 'libcamera': 'property_ids_core.yaml',
+ }
+ }
+
+The pipeline handler named above must match the pipeline handler option string
+specified in the meson build configuration.
+
+Vendor-specific controls and properties must contain a `vendor: <vendor_string>`
+tag in the YAML file. Every unique vendor tag must define a unique and
+non-overlapping range of reserved control IDs in src/libcamera/control_ranges.yaml.
+
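A sketch of the corresponding entry in src/libcamera/control_ranges.yaml,
reserving a base offset for each vendor tag (the offsets shown here are
illustrative):

.. code-block:: yaml

   ranges:
     # Core libcamera controls
     libcamera: 0
     # Draft libcamera controls
     draft: 10000
     # Raspberry Pi vendor controls
     rpi: 20000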
+For example, the following block defines a vendor-specific control with the
+`rpi` vendor tag:
+
+.. code-block:: yaml
+
+ vendor: rpi
+ controls:
+ - PispConfigDumpFile:
+ type: string
+ description: |
+ Triggers the Raspberry Pi PiSP pipeline handler to generate a JSON
+ formatted dump of the Backend configuration to the filename given by the
+ value of the control.
+
+The controls will be generated in the vendor-specific namespace
+`libcamera::controls::rpi`. Additionally, a `#define
+LIBCAMERA_HAS_RPI_VENDOR_CONTROLS` will be available to allow applications to
+test for the availability of these controls.
+
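As an illustration of how an application might consume these, the following
sketch guards use of the control behind the generated define; the control name
comes from the example above, while the helper function and output path are
purely illustrative.

.. code-block:: cpp

   #include <string>

   #include <libcamera/control_ids.h>
   #include <libcamera/controls.h>

   void requestConfigDump(libcamera::ControlList &controls)
   {
   #ifdef LIBCAMERA_HAS_RPI_VENDOR_CONTROLS
       /* Only set the vendor control when this libcamera build provides it. */
       controls.set(libcamera::controls::rpi::PispConfigDumpFile,
                    std::string("/tmp/pisp-config.json"));
   #endif
   }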
Generating a default configuration
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
@@ -729,7 +783,7 @@ function.
.. _Camera::generateConfiguration(): https://libcamera.org/api-html/classlibcamera_1_1Camera.html#a25c80eb7fc9b1cf32692ce0c7f09991d
.. _PipelineHandler::generateConfiguration(): https://libcamera.org/api-html/classlibcamera_1_1PipelineHandler.html#a7932e87735695500ce1f8c7ae449b65b
-Configurations are generated by receiving a list of ``StreamRoles`` instances,
+Configurations are generated by receiving a list of ``StreamRole`` instances,
which libcamera uses as predefined ways an application intends to use a camera
(You can read the full list in the `StreamRole API`_ documentation). These are
optional hints on how an application intends to use a stream, and a pipeline
@@ -971,7 +1025,8 @@ with the fourcc and size attributes to apply directly to the capture device
node. The fourcc attribute is a `V4L2PixelFormat`_ and differs from the
``libcamera::PixelFormat``. Converting the format requires knowledge of the
plane configuration for multiplanar formats, so you must explicitly convert it
-using the helper ``V4L2PixelFormat::fromPixelFormat()``.
+using the helper ``V4L2VideoDevice::toV4L2PixelFormat()`` provided by the
+V4L2VideoDevice instance that the format will be applied to.
.. _V4L2DeviceFormat: https://libcamera.org/api-html/classlibcamera_1_1V4L2DeviceFormat.html
.. _V4L2PixelFormat: https://libcamera.org/api-html/classlibcamera_1_1V4L2PixelFormat.html
@@ -981,7 +1036,7 @@ Add the following code beneath the code from above:
.. code-block:: cpp
V4L2DeviceFormat format = {};
- format.fourcc = V4L2PixelFormat::fromPixelFormat(cfg.pixelFormat);
+ format.fourcc = data->video_->toV4L2PixelFormat(cfg.pixelFormat);
format.size = cfg.size;
Set the video device format defined above using the
@@ -1001,7 +1056,7 @@ Continue the implementation with the following code:
return ret;
if (format.size != cfg.size ||
- format.fourcc != V4L2PixelFormat::fromPixelFormat(cfg.pixelFormat))
+ format.fourcc != data->video_->toV4L2PixelFormat(cfg.pixelFormat))
return -EINVAL;
Finally, store and set stream-specific data reflecting the state of the stream.
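In this guide that typically amounts to recording the stream and the stride
negotiated with the video device in the configuration entry, roughly as in the
sketch below (assuming the ``stream_`` member and the ``format`` variable used
above):

.. code-block:: cpp

   /* Associate the configuration entry with the pipeline's stream... */
   cfg.setStream(&data->stream_);
   /* ...and record the line stride reported by the video device. */
   cfg.stride = format.planes[0].bpl;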
@@ -1369,7 +1424,7 @@ emitted triggers the execution of the connected slots. A detailed description
of the libcamera implementation is available in the `libcamera Signal and Slot`_
classes documentation.
-.. _Qt Signals and Slots: https://doc.qt.io/qt-5/signalsandslots.html
+.. _Qt Signals and Slots: https://doc.qt.io/qt-6/signalsandslots.html
.. _libcamera Signal and Slot: https://libcamera.org/api-html/classlibcamera_1_1Signal.html#details
In order to notify applications about the availability of new frames and data,
@@ -1408,7 +1463,7 @@ function to the V4L2 device buffer signal.
video_->bufferReady.connect(this, &VividCameraData::bufferReady);
Create the matching ``VividCameraData::bufferReady`` function after your
-VividCameradata::init() impelementation.
+VividCameraData::init() implementation.
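A sketch of such a function, assuming ``VividCameraData`` derives from
``Camera::Private`` as elsewhere in this guide so that the pipeline handler is
reachable through ``pipe()``:

.. code-block:: cpp

   void VividCameraData::bufferReady(FrameBuffer *buffer)
   {
       /* Retrieve the request this buffer was queued for. */
       Request *request = buffer->request();

       /* Mark the buffer, and then the whole request, as completed. */
       pipe()->completeBuffer(request, buffer);
       pipe()->completeRequest(request);
   }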
The ``bufferReady`` function obtains the request from the buffer using the
``request`` function, and notifies the ``Camera`` that the buffer and