|
A bug in libcxx [0], used by clang version 11.0.2, is triggered when
building libcamera for Android SDK30. It has been fixed and integrated
upstream with [1].
As a workaround, directly cast libcamera::utils::Duration objects to
std::chrono::duration when dividing.
Alternatives evaluated:
Considered: Enable public inheritance of std::chrono::duration and
override operator/ in the class.
Outcome: Does not fix the original compiler error.
Considered: Enable public inheritance of std::chrono::duration and
override operator/ in the libcamera namespace.
Outcome: new compiler error:
ld.lld: error: duplicate symbol: libcamera::operator/
(libcamera::utils::Duration const&, libcamera::utils::Duration const&)
Considered: Use private inheritance of std::chrono::duration and
re-implement a pass-through version of each std::chrono::duration
operator within libcamera::utils::Duration and use template
metaprogramming to fix the division operator.
Outcome: Testing shows that this would introduce substantial
limitations, i.e. requiring the Duration object to be on the LHS of any
arithmetic operation with other numeric types. This also substantially
increases implementation complexity.
Considered: Extract double values from libcamera::utils::Duration
objects and use those to divide.
Outcome: This creates substantial readability and unit-safety issues.
[0] https://github.com/llvm/llvm-project/issues/40475
[1] https://github.com/llvm/llvm-project/commit/efa6d803c624f9251d0ab7881122501bb9d27368
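For illustration, a minimal standalone sketch of the workaround, using
a stand-in class in place of libcamera::utils::Duration (assumed, as in
libcamera, to derive from std::chrono::duration<double, std::nano>) so
it builds without libcamera headers:
  #include <chrono>
  #include <iostream>

  /* Stand-in for libcamera::utils::Duration, which derives from
   * std::chrono::duration<double, std::nano>. */
  struct Duration : std::chrono::duration<double, std::nano> {
          using Base = std::chrono::duration<double, std::nano>;
          using Base::Base;
  };

  int main()
  {
          Duration exposure(30000.0);   /* 30 us, expressed in ns */
          Duration lineLength(10000.0); /* 10 us, expressed in ns */

          /* Dividing the derived type directly trips the libcxx bug with
           * clang 11.0.2; casting both operands to the std::chrono base
           * type sidesteps it and yields a plain double ratio. */
          double lines = static_cast<Duration::Base>(exposure) /
                         static_cast<Duration::Base>(lineLength);

          std::cout << "exposure spans " << lines << " lines\n";
          return 0;
  }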
Bug: https://bugs.libcamera.org/show_bug.cgi?id=156
Signed-off-by: Nicholas Roth <nicholas@rothemail.net>
|
|
Currently, libcamera does not have information for the ov8858 sensor
used in the PinePhone Pro, a phone designed to run Linux.
This commit adds the sensor metadata, in particular the fact that
sensor gain is reported and set in discrete 1/16 increments.
For more information, see "5.8 manual exposure compensation/ manual
gain compensation" in [0] and the driver in [1].
[0] http://www.ahdsensor.com/uploadfile/202008/55322e75316871.pdf
[1] https://github.com/megous/linux/blob/orange-pi-5.19/drivers/media/i2c/ov8858.c
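As a sketch of the arithmetic this implies (register names and the
actual libcamera sensor-helper API are intentionally omitted; the
conversion below only illustrates the 1/16-step model):
  #include <cmath>
  #include <cstdint>
  #include <iostream>

  /* Gains are quantized to 1/16 steps: the code is the gain times 16. */
  static uint32_t gainToCode(double gain)
  {
          return static_cast<uint32_t>(std::round(gain * 16.0));
  }

  static double codeToGain(uint32_t code)
  {
          return code / 16.0;
  }

  int main()
  {
          double requested = 2.3;
          uint32_t code = gainToCode(requested); /* 37 */
          std::cout << "requested x" << requested << " -> code " << code
                    << " -> applied x" << codeToGain(code) << "\n"; /* x2.3125 */
          return 0;
  }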
Signed-off-by: Nicholas Roth <nicholas@rothemail.net>
|
|
When neither the sensor nor the ISP supports flips/transposition,
software-rotate the image at display time using SDL2.
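A minimal sketch of the SDL2 call involved (the renderer and texture
setup are omitted, and the 180-degree angle is only an example; the
actual cam sdl_sink integration differs):
  #include <SDL2/SDL.h>

  /* Software-rotate the frame at display time. SDL2 rotates the texture
   * when copying it to the render target, so no sensor or ISP support
   * for flips/transposition is needed. */
  void renderRotated(SDL_Renderer *renderer, SDL_Texture *texture)
  {
          SDL_RenderClear(renderer);
          /* Angle in degrees, rotation around the texture centre (nullptr). */
          SDL_RenderCopyEx(renderer, texture, nullptr, nullptr,
                           180.0, nullptr, SDL_FLIP_NONE);
          SDL_RenderPresent(renderer);
  }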
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Augment the CameraSensor::setFormat() function to configure horizontal
and vertical flips before applying the image format on the sensor.
Applying flips before format is crucial as they might change the Bayer
pattern ordering.
To allow users of the CameraSensor class to easily pass the requested
Transform, add a 'transform' member to the V4L2SubdeviceFormat class,
initialized to Transform::Identity by default.
Moving the handling of H/V flips to the CameraSensor class makes it
possible to remove quite some boilerplate code from the IPU3 and
RaspberryPi pipeline handlers.
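A hedged usage sketch (a fragment, not standalone code; the field and
type names follow the commit text and libcamera's Transform type, but
details may differ):
  /* The caller requests a horizontal flip; CameraSensor applies it
   * before setting the format, so the returned Bayer order already
   * accounts for the flip. */
  V4L2SubdeviceFormat format{};
  format.mbus_code = MEDIA_BUS_FMT_SRGGB10_1X10;
  format.size = Size(1920, 1080);
  format.transform = Transform::HFlip;

  int ret = sensor->setFormat(&format);
  if (ret)
          return ret;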
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The two pipeline handlers that currently support Transform do so by
applying H/V flips on the image sensor, instead of rotating on the ISP.
As the image sensor performs the actual rotation, centralize the code
that validates a requested Transform against the camera sensor rotation
and capabilities in the CameraSensor class.
The implementation in the IPU3 pipeline handler was copied from the
RaspberryPi implementation, and is now centralized in CameraSensor to
make it easier for other platforms that do not rotate on the ISP to
implement support for Transform using the CameraSensor class.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
During the camera sensor driver validation, verify if the sensor
supports horizontal and vertical flips and store a flag as
CameraSensor::supportFlips_ class member.
The flag will be later inspected when applying flips.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The current documentation of the BayerFormat::transform() function
reports examples on the Bayer components ordering transformation for
horizontal flip (mirroring) but not for vertical flip or for the
combination of the two.
It might be useful to complete the documentation to ease understanding
of the effect of the transform() function on a sensor's Bayer pattern.
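For reference, the orderings such examples would cover can be derived
on a 2x2 tile; a small standalone illustration of the pattern
arithmetic (not the BayerFormat API):
  #include <array>
  #include <iostream>
  #include <string>
  #include <utility>

  /* Starting from RGGB: hflip -> GRBG, vflip -> GBRG, both (180 deg) -> BGGR. */
  int main()
  {
          /* Row-major 2x2 tile: { r0c0, r0c1, r1c0, r1c1 } */
          std::array<char, 4> rggb = { 'R', 'G', 'G', 'B' };

          auto hflip = [](std::array<char, 4> t) {
                  std::swap(t[0], t[1]); /* mirror each row */
                  std::swap(t[2], t[3]);
                  return t;
          };
          auto vflip = [](std::array<char, 4> t) {
                  std::swap(t[0], t[2]); /* swap the rows */
                  std::swap(t[1], t[3]);
                  return t;
          };
          auto str = [](const std::array<char, 4> &t) {
                  return std::string(t.begin(), t.end());
          };

          std::cout << "identity: " << str(rggb) << "\n";               /* RGGB */
          std::cout << "hflip:    " << str(hflip(rggb)) << "\n";        /* GRBG */
          std::cout << "vflip:    " << str(vflip(rggb)) << "\n";        /* GBRG */
          std::cout << "both:     " << str(vflip(hflip(rggb))) << "\n"; /* BGGR */
          return 0;
  }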
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Previously the code used to clear the camera's h and v flip bits when
enumerating the supported formats so as to obtain any Bayer formats in
the sensor's native (untransformed) orientation. However, this fails
when the camera is already in use elsewhere.
Instead, we query the current state of the flip bits and transform the
formats - which we obtain in their flipped orientation - back into
their native orientation to be stored.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
This makes it easier to perform transformations on Bayer type mbus
codes by converting them to a BayerFormat, doing the transform, and
then converting them back again.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Add support for parsing array controls to the cam capture script.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Implement support for raw Bayer capture during configuration generation,
validation and camera configuration.
While at it, fix a typo in a comment.
Signed-off-by: Florian Sylvestre <fsylvestre@baylibre.com>
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Implement support for raw Bayer capture at runtime, from start() to
stop(). Support for raw formats in the camera configuration is split
out to a subsequent change to ease review.
In raw mode, the ISP is bypassed. There is no need to provide parameter
buffers, and the ISP will not generate statistics. This requires
multiple changes in the buffer handling:
- The params and stats buffers don't need to be allocated, and the
corresponding video nodes don't need to be started or stopped.
- The IPA module fillParamsBuffer() operation must not be called in
queueRequestDevice(). As a result, the IPA module doesn't emit the
paramsBufferReady signal, and the main and self path video buffers
must thus be queued directly in queueRequestDevice().
- The tryCompleteRequest() function must not wait until the params
buffer has been dequeued.
- When the frame buffer has been captured, the IPA module
processStatsBuffer() operation must be called directly to fill request
metadata.
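The buffer-handling changes listed above boil down to a branch of the
following shape in queueRequestDevice(); this is only a sketch, with
hypothetical names (isRaw, mainPathVideo, mainPathStream, ipa) standing
in for the real pipeline handler internals:
  /* Sketch of the raw-mode branch; all names below are hypothetical
   * stand-ins, not the actual pipeline handler code. */
  void queueRequestSketch(Request *request, bool isRaw,
                          V4L2VideoDevice *mainPathVideo,
                          Stream *mainPathStream, IPAProxyRkISP1 *ipa)
  {
          if (isRaw) {
                  /* ISP bypassed: no params buffer to fill, queue the
                   * capture buffer directly. */
                  mainPathVideo->queueBuffer(request->findBuffer(mainPathStream));
                  return;
          }

          /* Normal path: the IPA fills the params buffer first, and the
           * video buffers are queued when paramsBufferReady is emitted. */
          ipa->fillParamsBuffer(request);
  }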
Signed-off-by: Florian Sylvestre <fsylvestre@baylibre.com>
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Unlike RkISP1Path::generateConfiguration(), the validate() function
doesn't take the camera sensor resolution into account but only
considers the absolute minimum and maximum sizes supported by the ISP to
validate the stream size. Fix it by using the same logic as when
generating the configuration.
Instead of passing the sensor resolution to the validate() function,
pass the CameraSensor pointer to prepare for subsequent changes that
will require access to more camera sensor data. While at it, update the
generateConfiguration() function similarly for the same reason.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Query the driver for the output formats and sizes that it supports,
instead of hardcoding them. This allows future-proofing for formats that
are supported by some but not all versions of the driver.
As the rkisp1 driver currently does not support VIDIOC_ENUM_FRAMESIZES,
fall back to the hardcoded list of supported formats and frame sizes. This
feature will be added to the driver in parallel, though we cannot
guarantee that users will have a new enough kernel for it.
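A hedged sketch of the query-then-fallback mechanism using the raw V4L2
ioctl (the pipeline handler goes through libcamera's V4L2VideoDevice
wrapper instead; this only illustrates the idea):
  #include <cerrno>
  #include <cstdint>
  #include <vector>

  #include <linux/videodev2.h>
  #include <sys/ioctl.h>

  /* Enumerate discrete frame sizes for a pixel format, falling back to a
   * hardcoded list when the driver doesn't implement the ioctl. 'fd' is
   * an open V4L2 video device. */
  std::vector<v4l2_frmsize_discrete>
  enumSizes(int fd, uint32_t pixelFormat,
            const std::vector<v4l2_frmsize_discrete> &fallback)
  {
          std::vector<v4l2_frmsize_discrete> sizes;

          for (uint32_t index = 0; ; index++) {
                  struct v4l2_frmsizeenum frmSize = {};
                  frmSize.index = index;
                  frmSize.pixel_format = pixelFormat;

                  if (ioctl(fd, VIDIOC_ENUM_FRAMESIZES, &frmSize) < 0) {
                          /* Older rkisp1 drivers don't implement the ioctl. */
                          if (index == 0)
                                  return fallback;
                          break;
                  }

                  if (frmSize.type == V4L2_FRMSIZE_TYPE_DISCRETE)
                          sizes.push_back(frmSize.discrete);
          }

          return sizes;
  }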
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Support raw capture by allowing manual control of the exposure time and
analogue gain.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Add support for manual gain and exposure in the rkisp1 IPA.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The RkISP1 can capture raw frames by bypassing the ISP. In that mode,
the ISP will not produce statistics nor consume parameters. Most
algorithms will thus be unable to run, with one exception: the AGC will
still be able to configure the sensor exposure time and analog gain in
manual mode.
To prepare for this, add the ability to disable algorithms for the
duration of the capture session based on the camera configuration.
Individual algorithms report whether they support raw formats at
construction time, and the IPA module disables algorithms in configure()
based on the stream configurations.
Disabled algorithms are skipped during the capture session in the
processStatsBuffer() operation. As the ISP doesn't produce statistics,
don't try to access the stats buffer. There is no need for similar logic
in fillParamsBuffer() as that operation won't be called for raw capture.
All algorithms report not supporting raw capture by default. Raw support
in AGC will be added separately.
The feature is implemented in the RkISP1 module without any support from
libipa at this point to avoid designing a generic API based on a single
user. This may be changed in the future.
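A simplified sketch of the mechanism (the real code lives in the libipa
Algorithm base class and the module's configure(); the types and
members here are illustrative):
  #include <memory>
  #include <vector>

  /* Illustrative stand-in for the libipa Algorithm base class. */
  class Algorithm
  {
  public:
          Algorithm(bool supportsRaw)
                  : supportsRaw_(supportsRaw), disabled_(false) {}
          virtual ~Algorithm() = default;

          const bool supportsRaw_; /* Declared at construction time. */
          bool disabled_;          /* Decided per capture session. */
  };

  class Agc : public Algorithm
  {
  public:
          Agc() : Algorithm(true) {} /* Can still drive manual exposure/gain. */
  };

  class Awb : public Algorithm
  {
  public:
          Awb() : Algorithm(false) {} /* Needs ISP statistics; unusable in raw. */
  };

  /* In configure(): disable every algorithm that can't run in a raw session.
   * processStatsBuffer() then skips disabled algorithms and never touches the
   * (non-existent) stats buffer when the ISP is bypassed. */
  void configureAlgorithms(std::vector<std::unique_ptr<Algorithm>> &algos,
                           bool rawSession)
  {
          for (auto &algo : algos)
                  algo->disabled_ = rawSession && !algo->supportsRaw_;
  }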
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The StreamRole enum has enumerators such as 'Raw' that are too generic
to be in the global libcamera namespace. Turn it into a scoped enum to
avoid namespace clashes, and update users accordingly.
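In practice the change looks like this for applications (enumerator
names as found in libcamera; shown only to illustrate the scoping):
  #include <vector>

  #include <libcamera/stream.h>

  using namespace libcamera;

  std::vector<StreamRole> captureRoles()
  {
          /* Previously the enumerator was unscoped ('Raw'); the scoped
           * enum now requires qualification, avoiding clashes in the
           * libcamera namespace. */
          return { StreamRole::Raw, StreamRole::Viewfinder };
  }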
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
libcamera prints stream role values in log messages. To be more
user-friendly, add an overload of operator<<() to print the role name
as a string instead of a numerical value.
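A minimal sketch of such an overload (the exact strings printed by
libcamera may differ):
  #include <ostream>

  #include <libcamera/stream.h>

  std::ostream &operator<<(std::ostream &out, libcamera::StreamRole role)
  {
          using libcamera::StreamRole;

          switch (role) {
          case StreamRole::Raw:
                  return out << "Raw";
          case StreamRole::StillCapture:
                  return out << "StillCapture";
          case StreamRole::VideoRecording:
                  return out << "VideoRecording";
          case StreamRole::Viewfinder:
                  return out << "Viewfinder";
          }
          return out << "Unknown";
  }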
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Add a tuning script for rkisp1 that uses libtuning. So far it only
supports LSC.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a tuning script for raspberrypi for alsc only, that uses libtuning.
Since there will also be a tuning script for raspberrypi that has more
modules, put the libtuning alsc module definition in a separate file so
that it can be reused later.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a generator to libtuning for writing tuning output to a yaml file.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a parser to libtuning for parsing configuration files in yaml
format.
At the moment it doesn't parse anything and simply returns an empty
config. This is fine for the time being, as its only user is the
rkisp1 tuning script, whose only module, LSC, doesn't consume anything
from the configuration file. When a module that requires the yaml
parser comes around, the parser can be implemented then.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a generator to libtuning for writing tuning output to a json file
formatted the same way that raspberrypi's ctt formats them.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a parser to libtuning for parsing configuration files that are the
same format as raspberrypi's ctt's configuration files.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add an LSC module for RkISP1.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add an ALSC module for Raspberry Pi.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a base LSC module to libtuning's collection of modules. It is based
on raspberrypi's ctt's ALSC, but customizable for different lens shading
table sizes, among other things. On its own it is insufficient as a
module, but it provides utilities that are useful for, and will
simplify, implementing LSC modules.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Implement the extensible components of libtuning. This includes:
- Parsers, for supporting different types of input config file formats
- Generators, for supporting different types of output tuning file
formats
- Modules, for supporting different tuning modules for different
algorithms and platforms
No parsers, generators, or modules are actually implemented. Only the
base classes are implemented.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Implement math helpers for libtuning. This includes:
- Average, a wrapper class for numpy averaging functions
- Gradient, a class that represents gradients, for distributing and
mapping
- Smoothing, a wrapper class for cv2 smoothing functions
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Implement the core of libtuning, our new tuning tool infrastructure. It
leverages components from raspberrypi's ctt that could be reused for
tuning tools for other platforms.
The core components include:
- The Image class
- libtuning (entry point and other core functions)
- macbeth-related tools, including the macbeth reference image
- utils
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Building libcamera as a subproject is failing when tracepoints are
enabled due to incorrectly managing the relative paths between the
source and build directory while generating tracepoint headers.
The previously used
path = output.replace('include/', '', 1)
logic is not sufficient to determine the proper path when libcamera is
built as a subproject, as it does not correctly handle the relative
paths, causing the path to be processed as:
'subprojects/libcamera/include/libcamera/internal/tracepoints.h'.replace('include/', '', 1)
which evaluates to
'subprojects/libcamera/libcamera/internal/tracepoints.h'
so the tracepoints.h header file will try to include:
#define TRACEPOINT_INCLUDE "subprojects/libcamera/libcamera/internal/tracepoints.h"
which will fail.
Fix it by using Python's pathlib to calculate the relative path of the
output file with respect to the "include" directory of libcamera.
This has been tested with PipeWire. For non-subproject builds it should
generate the exact same path that was previously generated.
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
[Kieran: Commit message expanded/reworded]
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use the vimc lens to test the sensor's ability to discover an ancillary
lens.
Tested with the recent kernel patch for vimc lens:
https://lore.kernel.org/linux-media/20220415023855.2568366-1-yunkec@google.com/
Signed-off-by: Yunke Cao <yunkec@google.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Currently, the Android HAL does not work on rkisp1-based devices because
required FrameDurationLimits metadata is missing from the IPA
implementation.
This change sets FrameDurationLimits for rkisp1 based on the existing
ipu3 implementation, using the sensor's reported range of vertical
blanking intervals with the minimum reported horizontal blanking
interval.
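The arithmetic behind those limits, as a hedged sketch (the real IPA
derives these values from the sensor info and V4L2 blanking controls;
the names here are illustrative):
  #include <cstdint>

  /* Frame duration in microseconds for a given vertical blanking, using
   * the minimum horizontal blanking. The minimum frame duration uses the
   * minimum vertical blanking, the maximum uses the maximum. */
  constexpr double frameDurationUs(uint32_t width, uint32_t height,
                                   uint32_t hBlankMin, uint32_t vBlank,
                                   uint64_t pixelRate)
  {
          const double lineDurationUs = (width + hBlankMin) * 1.0e6 / pixelRate;
          return (height + vBlank) * lineDurationUs;
  }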
Signed-off-by: Nicholas Roth <nicholas@rothemail.net>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The RkISP1 pipeline handler and IPA module allow the CameraSensorInfo
to be empty, probably to accommodate a sensor used on a test platform
that does not meet the mandatory libcamera requirements.
As the \todo item in the IPA notes, the received CameraSensorInfo might
be empty and should be checked before being accessed, but such a
requirement is currently not enforced in the code.
This allows us to assume that all the test platforms in use have by now
successfully moved their sensor drivers to comply with the minimum
requirements and provide a populated CameraSensorInfo to the IPA.
As the safety check is not enforced, and as we don't want faulty
sensors to send an empty CameraSensorInfo to the IPA, remove the \todo
item from the IPA and fail hard in the pipeline handler if the sensor
does not comply with the libcamera requirements.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The RkISP1 implementation of IPA::configure() still uses the legacy
interface where sensor controls (and eventually lens controls) are
passed from the pipeline handler to the IPA in a map.
Since the introduction of mojom-based IPA interface definition, it is
possible to define custom data types and use them in the interface
definition between the pipeline handler and the IPA.
Align the RkISP1 IPA::configure() implementation with the one in the
IPU3 IPA module by using a custom data type instead of relying on a map
to pass controls to the IPA.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The CameraSensor class validates that the sensor driver in use supports
the controls required for IPA modules to work correctly.
For in-tree IPA modules, whose pipeline handlers already use
CameraSensor, there's no need to validate such controls again.
Remove controls validation from the IPU3 and RkISP1 IPA modules and rely
on CameraSensor doing that at initialization time.
The list of mandatory controls is expanded to add V4L2_CID_ANALOGUE_GAIN
without which IPA modules cannot function.
The new requirement only applies to RAW sensors; platforms like UVC and
Simple are not impacted by this change.
While at it, expand the sensor driver requirements documentation to
include V4L2_CID_ANALOGUE_GAIN in the list of mandatory controls a
sensor driver has to support.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The BufferMask enum provides a way of identifying which stream a frame buffer
belongs to. This enum is defined in the raspberrypi.mojom interface file.
However, the IPA does not need these enum definitions to mmap buffers that it
uses.
Move this enum out of the raspberrypi.mojom interface file and put it into
the RPi namespace visible only to the pipeline handler. This removes the
need to include the auto-generated IPA interface header in the RPi::Stream
definition.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
No existing Algorithm used the base pause(), resume() functions
or the paused_ flag, nor is there a need for a generic pause API.
Remove these. The AGC and AWB algorithms now have methods named
disableAuto(), enableAuto() which better describe their functionality.
Signed-off-by: Nick Hollinghurst <nick.hollinghurst@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Report the sensor timestamp in metadata. Use the timestamp from the
first buffer. Accuracy could be improved by using the frame start event
from the CSI-2 receiver, but the kernel driver doesn't support it yet.
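A hedged sketch of how the metadata is set from the first completed
buffer (a common libcamera pipeline handler pattern; the surrounding
code in this handler differs):
  #include <libcamera/control_ids.h>
  #include <libcamera/framebuffer.h>
  #include <libcamera/request.h>

  using namespace libcamera;

  /* On completion of the request's first buffer, report its timestamp as
   * the sensor timestamp. This approximates the frame start time; a CSI-2
   * frame start event would be more accurate once the driver supports it. */
  void reportSensorTimestamp(Request *request, const FrameBuffer *buffer)
  {
          request->metadata().set(controls::SensorTimestamp,
                                  buffer->metadata().timestamp);
  }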
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Add to the formats map all the supported ISI video capture
stream formats.
This makes it possible to populate the list of stream formats for all
the non-RAW use cases, as the ISI can perform colorspace conversion
between YUV and RGB.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The libcamerasrc element looks for the availability of the
FrameDurationLimits control by looking it up by numeric control id.
The ControlInfoMap::find(unsigned int id) function searches for the
control's numerical identifier in the ControlInfoMap::idMap_ class
member, which might not be initialized if the pipeline handler does
not register any control, causing an invalid memory access.
Avoid looking up the control by numerical id and use the ControlId
instance instead to prevent that.
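In code, the fix amounts to a change of the following shape (a sketch,
not the element's actual diff):
  /* Look up by ControlId instance rather than by numeric id, so an empty
   * idMap_ cannot be dereferenced. camera->controls() is the ControlInfoMap
   * advertised by the camera. */
  const ControlInfoMap &infoMap = camera->controls();

  /* Before (broken when idMap_ is unpopulated):
   *     auto it = infoMap.find(controls::FRAME_DURATION_LIMITS); */
  auto it = infoMap.find(&controls::FrameDurationLimits);
  if (it != infoMap.end()) {
          /* FrameDurationLimits is supported; expose framerate caps. */
  }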
Fixes: ccfe0a1af77c ("gstreamer: Provide framerate support for libcamerasrc")
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The parameter 'request' is only used in an assert(). As assert() is
compiled out in release (NDEBUG) builds, the parameter is unused there,
resulting in unused-parameter warnings for non-debug builds only.
Fix this by flagging the parameter as 'maybe_unused'.
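The resulting pattern, in brief:
  #include <cassert>

  struct Request; /* Stand-in for the real type. */

  /* assert() is compiled out under NDEBUG, so without the attribute
   * 'request' would be unused in release builds and trigger
   * unused-parameter warnings. */
  void completeRequest([[maybe_unused]] Request *request)
  {
          assert(request);
          /* ... */
  }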
Signed-off-by: Christian Rauch <Rauch.Christian@gmx.de>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
This release contains all the work merged to libcamera over the last 5
weeks, including the following summary highlights:
Highlights:
Core:
* New pipeline handler for the IMX ISI
* Fixed memory leak in the logging infrastructure
* Fixed meson support for 0.56
* Additional Thread Safety annotations added throughout
* Add a release method to pipeline handlers to support
freeing resources when a camera is released, but not
deleted.
* Group test applications under src/apps
* Make libdl optional to support Android builds
Application layers:
* Added DNG File output to cam
* Fixes for building against Android
* gstreamer framerate control and negotiation
IPA:
* Support setting metadata directly from (libipa) algorithms
* Set AGC and AWB metadata for both RKISP1 and IPU3.
* Support for enum serialization and Flags
* Support multiple lens shading tables for different colour
temperatures on RKISP1/i.MX8MP.
Raspberry Pi IPA:
* Full line length control
* Better HBLANK synchronisation and full line length control
* Support ov9281 as ov9281_mono
* Update colour temperature whenever manual gains change
abi-compliance-checker tells me that this release is 100% abi compatible
with v0.0.1.
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a pipeline handler for the ISI capture interface found on
several versions of the i.MX8 SoC generation.
The pipeline handler supports capturing multiple streams from the same
camera in YUV, RGB or RAW formats. The number of streams is limited by
the number of ISI pipelines, and is currently hardcoded to 2 as the code
has been tested on the i.MX8MP only. Further development will make this
dynamic to support other SoCs.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Implement the PipelineHandlerRPi::releaseDevice method which allows
us to free any allocated buffers when a camera is released.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This notifies pipeline handlers when a camera is released, in case
they want to free any resources or memory buffers.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This reverts commit 30d704732badc675f72fe73d14749669cb645c23.
It turns out that this commit causes some regressions and is in fact
unnecessary because the related commit "libcamera: v4l2_videodevice:
Guard against releasing unallocated buffers"
(a2bdff6d0b67475492ac7cf9318866b6d89a28fd) fixes the problem
completely (if the buffers were never allocated, the video device
avoids trying to free them even if the pipeline handler asks).
The reason for the regressions is that in this new (broken) scheme we
would never call clearBuffers() on all the streams if the internal
buffers were never allocated (i.e. buffersAllocated_ is never
set). This causes the stream's bufferMap_ list to get longer and
longer if there are multiple back-to-back calls to configure, and
dev_->importBuffers() will ultimately fail.
So either we need to think more carefully about how to stop the
pipeline handler from freeing buffers that it doesn't own, or we simply
leave things as they are, since the other commit resolves the problem
on its own. In the interim, reverting this commit certainly seems like
the best solution.
Fixes: 30d704732bad ("pipeline: raspberrypi: Do not unconditionally free buffers on close")
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The V4L2VideoDevice ensures that all sequence numbers exposed to
applications commence at zero from the libcamera perspective. This is
the behaviour expected from kernel drivers, but it is not always the
case.
This is handled internally to ensure consistency, and a warning is
printed if the device does not start from zero. It was expected that
the warning would help highlight where kernel drivers should be fixed,
but it has led to several false positive reports of failures where
people were concerned that this warning was the cause of unrelated
issues.
Lower the log level of this message to 'Info' to reduce its apparent
severity. Info is likely more appropriate than Debug, as it continues
to facilitate awareness of kernel drivers that could be improved while
not appearing to be a fault.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add rudimentary LSC tables for imx219. These were generated with the
rkisp1 tuning script from libtuning [1], using an imx8mp (debix) and a white
computer monitor, at only a single color temperature of 5800K.
[1] https://lists.libcamera.org/pipermail/libcamera-devel/2022-October/035017.html
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|