Age | Commit message | Author |
|
The Array2D class is a very thin wrapper around std::vector that can be
used almost identically in the code, but it carries its 2D size with
it so that we aren't passing it around all the time.
All the std::vectors that were X * Y in size (X and Y being the ALSC
grid size) have been replaced. The sparse matrices that are XY * 4 in
size have not been replaced, as they are somewhat different, are used
differently, would require more code changes, and would actually make
things more confusing if everything looked like an Array2D but was not
the same.
There should be no change in algorithm behaviour at all.
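To illustrate the idea, a minimal sketch of such a wrapper (the exact
interface of the real class may differ) could look like this:

    #include <cstddef>
    #include <vector>

    // Illustrative sketch only: a thin 2D container that owns its own
    // dimensions, so width/height need not be passed around separately.
    template<typename T>
    class Array2D
    {
    public:
        void resize(int w, int h, T const &value = T())
        {
            width_ = w;
            height_ = h;
            data_.resize(w * h, value);
        }

        T &operator[](int index) { return data_[index]; }     /* flat access */
        T &at(int x, int y) { return data_[y * width_ + x]; } /* 2D access */
        int width() const { return width_; }
        int height() const { return height_; }
        std::size_t size() const { return data_.size(); }

    private:
        std::vector<T> data_;
        int width_ = 0;
        int height_ = 0;
    };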
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Remove any hard-coded assumptions about the target hardware platform
from the ALSC algorithm. Instead, use the "target" string provided by
the camera tuning config and generalised statistics structures to
determining parameters such as grid and region sizes.
The ALSC calculations use run-time allocated arrays/vectors on every
frame. Allocating these might add a non-trivial run-time penalty.
Replace these dynamic allocations with a set of reusable vectors
pre-allocated during the init phase.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a new Controller::HardwareConfig structure that captures the
hardware statistics grid/histogram sizes and pipeline widths. This
ensures there is a single centralised place for these parameters.
Add a getHardwareConfig() helper function to retrieve these values for a
given hardware target.
Update the statistics populating routine in the IPA to use the values
from this structure instead of the hardcoded numbers.
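As a rough sketch of the idea (field names and values below are
illustrative placeholders, not the actual structure):

    #include <map>
    #include <string>

    struct Size {
        unsigned int width;
        unsigned int height;
    };

    // Sketch of a centralised hardware description.
    struct HardwareConfig {
        Size agcRegions;
        Size awbRegions;
        Size focusRegions;
        unsigned int numHistogramBins;
        unsigned int pipelineWidth;
    };

    // Look up the description for a given tuning "target" string.
    const HardwareConfig &getHardwareConfig(const std::string &target)
    {
        static const std::map<std::string, HardwareConfig> configs = {
            /* placeholder values, for illustration only */
            { "bcm2835", { { 16, 12 }, { 16, 12 }, { 4, 3 }, 128, 13 } },
        };
        return configs.at(target);
    }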
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The target string may be used by algorithms to determine the hardware
platform they are running on.
Store the target string provided by the camera tuning files in the
controller state. Add a getTarget() member function to retrieve this
string.
Validate the correct hardware target ("bcm2835") during the IPA
initialisation phase.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Fix a bug in the default frame durations calculation where the min/max
values are swapped round. This is a rarely travelled code path, so it has
not actually caused any reported failure.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Make a copy of the CameraMode structure on a switch mode call. This
replaces the existing lastSensitivity_ field.
Limit the AGC gain calculations to the minimum value given by the
CameraMode structure. The maximum value remains unclipped as any gain
over the sensor maximum will be made up by digital gain.
Rename clipShutter to limitShutter for consistency, and have the latter
limit the shutter speed to both upper and lower bounds.
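Sketched in outline (the time type and names below are stand-ins, not
the actual AGC code):

    #include <algorithm>

    using Duration = double; /* stand-in for the real duration type */

    struct ModeLimits {
        Duration minShutter;
        Duration maxShutter;
        double minAnalogueGain;
    };

    // limitShutter clamps to both bounds; the gain is only clamped from
    // below, since gain above the sensor maximum is made up digitally.
    Duration limitShutter(Duration shutter, const ModeLimits &mode)
    {
        return std::clamp(shutter, mode.minShutter, mode.maxShutter);
    }

    double limitGain(double gain, const ModeLimits &mode)
    {
        return std::max(gain, mode.minAnalogueGain);
    }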
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use the new analogue gain and shutter speed limit fields in the IPA
code when reporting back the control value limits and calculating the
analogue gain code to use. This also replaces the now unused (and
removed) maxSensorGainCode_ field.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add fields in the CameraMode structure to capture the mode specific
limits for analogue gain and shutter speed. For convenience, also add
fields for minimum and maximum frame durations.
Populate these new fields when setting up the CameraMode structure.
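Roughly speaking, the additions amount to something like the following
(field names and types are illustrative, not the exact definition):

    // Illustrative sketch of the new mode-specific limit fields.
    struct CameraMode {
        /* ... existing geometry, bit depth, etc. ... */
        double minAnalogueGain;  /* analogue gain limits for this mode */
        double maxAnalogueGain;
        double minShutter;       /* shutter speed limits */
        double maxShutter;
        double minFrameDuration; /* convenience frame duration limits */
        double maxFrameDuration;
    };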
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When an executable using libcamera calls exec(3) while a camera is in
use, file descriptors corresponding to the V4L2 video devices are kept
open as they have been created without O_CLOEXEC. This results in the
video devices staying busy, preventing the new executable from using
them:
[91] ERROR V4L2 v4l2_videodevice.cpp:1047 /dev/video0[149:cap]: Unable to set format: Resource busy
Fix this by opening video devices with O_CLOEXEC, which is generally a
good idea in libraries.
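For illustration, the difference is only in the open flags (a sketch,
not the actual V4L2 device code):

    #include <fcntl.h>

    // Without O_CLOEXEC the descriptor survives exec(3) and keeps the
    // device busy; with it, the kernel closes the fd automatically.
    int openVideoDevice(const char *node)
    {
        return open(node, O_RDWR | O_NONBLOCK | O_CLOEXEC);
    }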
Signed-off-by: Elias Naur <mail@eliasnaur.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Invalid or incorrectly reset requests can cause undefined behaviour
in the pipeline handlers due to unexpected request state.
If the status has not been reset to Request::RequestPending, the request
is either not new, or has not been correctly processed through
Request::reuse().
This can be caught early by validating the status of the request when it
is queued to a camera.
Reject invalid requests before processing them in the pipeline handlers.
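Conceptually, the check added at queue time amounts to something like
this sketch (with minimal stand-in types, not the real classes):

    #include <cerrno>

    // Minimal stand-in for the real Request class.
    struct Request {
        enum Status { RequestPending, RequestComplete, RequestCancelled };
        Status status() const { return status_; }
        Status status_ = RequestPending;
    };

    // Reject requests that were neither freshly constructed nor passed
    // through Request::reuse() before being queued again.
    int queueRequest(Request *request)
    {
        if (request->status() != Request::RequestPending)
            return -EINVAL; /* reject before the pipeline handler sees it */

        /* ... hand the request to the pipeline handler ... */
        return 0;
    }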
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use the Configuration section to report which dependency is used to
handle IPA module signatures.
In the event that it is not found, report directly in the configuration
that modules are Isolated.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Provide a CameraSensorHelper for the OV2685, along with the
corresponding camera sensor properties.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Provide a CameraSensorHelper for the OV5647 as used in the Raspberry Pi
Camera Module v1.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Christopher Obbard <chris.obbard@collabora.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The demosaic code first expands the buffer datatype to uint16, and then
shifts the data left so that the 8, 10 and 12 bitspp formats all become
16 bitspp.
It then, eventually, uses np.einsum to calculate averages, but this
averaging sums multiple uint16 values together and stores the result in
uint16 storage. As the first step shifted the values left, possibly
producing values close to the maximum of the uint16 range, the sums of
course overflow. This leads to rather bad looking images.
Fix this by dropping the original shift. It serves no purpose, and is
probably a remnant of some early testing code. This way the largest
numbers being summed are 12-bit values, and as we fetch values from a
3x3 window, the maximum number of 12-bit values summed for a single RGB
plane is 5 (for green). The sum of five 12-bit values is well below the
16-bit maximum.
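The failure mode is easy to reproduce in isolation (an illustrative
stand-alone example, not the tuning-tool code itself):

    #include <cstdint>
    #include <iostream>

    int main()
    {
        // Values shifted up to the full 16-bit range: accumulating five
        // of them in uint16 storage wraps around, as the numpy sums did.
        uint16_t shifted = 0;
        for (int i = 0; i < 5; i++)
            shifted += 4095 << 4; /* 65520 each; the true sum is 327600 */

        // Unshifted 12-bit values stay comfortably below 65535.
        uint16_t raw = 0;
        for (int i = 0; i < 5; i++)
            raw += 4095; /* 20475 total */

        std::cout << shifted << " vs " << raw << "\n";
        return 0;
    }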
Signed-off-by: Tomi Valkeinen <tomi.valkeinen@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When compiled with LTO (the default on Ubuntu), the global static
objects camHelpers and algorithms cause a crash in raspberrypi_ipa_proxy
at runtime as they're not allocated by the time the registration
routines execute.
This is a fairly crude fix which just converts the global static objects
into local static objects inside an equivalently named function.
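The pattern is the usual construct-on-first-use idiom, roughly (names
abbreviated for illustration):

    #include <string>
    #include <vector>

    struct CamHelper {
        std::string name;
    };

    // Before: a global static whose construction order relative to the
    // registration code is not guaranteed under LTO.
    //
    //     static std::vector<CamHelper *> camHelpers;
    //
    // After: a function-local static, constructed on first use, so it is
    // guaranteed to exist by the time the registration routines run.
    std::vector<CamHelper *> &camHelpers()
    {
        static std::vector<CamHelper *> helpers;
        return helpers;
    }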
Signed-off-by: Dave Jones <dave.jones@canonical.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Logger::create() is not currently thread-safe and causes crashes,
noticeable on the Raspberry Pi 4. This adds a mutex around the creation
of categories.
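In outline (a sketch of the intent, not the actual Logger code):

    #include <memory>
    #include <mutex>
    #include <string>
    #include <vector>

    struct LogCategory {
        std::string name;
    };

    // Serialise category creation so two threads registering categories
    // concurrently cannot corrupt the list.
    class Logger
    {
    public:
        LogCategory *findOrCreateCategory(const std::string &name)
        {
            std::lock_guard<std::mutex> lock(mutex_);

            for (const auto &cat : categories_)
                if (cat->name == name)
                    return cat.get();

            categories_.push_back(std::make_unique<LogCategory>(LogCategory{ name }));
            return categories_.back().get();
        }

    private:
        std::mutex mutex_;
        std::vector<std::unique_ptr<LogCategory>> categories_;
    };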
Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The ConverterFactoryBase::create() function returns a nullptr when no
converter is found. The only caller, SimpleCameraData::init(), checks if
the converter is valid with isValid(), but doesn't check if the pointer
is null, which can lead to a crash.
We could check both pointer validity and converter validity in the
caller, but to limit the complexity in callers, it is better to check
the converter validity in the create() function and return a null
pointer when no valid converter is found.
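Schematically, the intent is (a sketch only, with the factory lookup
elided):

    #include <memory>

    struct Converter {
        virtual ~Converter() = default;
        virtual bool isValid() const = 0;
    };

    // Fold the validity check into the factory so callers only ever see
    // either a valid converter or a null pointer.
    std::unique_ptr<Converter> createConverter()
    {
        std::unique_ptr<Converter> converter; /* = factory lookup, elided */

        if (!converter || !converter->isValid())
            return nullptr;

        return converter;
    }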
Signed-off-by: Suhrid Subramaniam <suhrid.subramaniam@mediatek.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Some updates to the tuning for the imx296 sensors.
For the colour variant:
* Minor change to the AWB curve, making things a little less green.
* Updated CCMs that reduce colour saturation to a more accurate level.
Thanks to Dr. Rolf Henkel for these measurements and calculations.
* Sharpening has been toned down quite a lot.
* rpi.focus algorithm added so that the focus measure can be accessed.
The sharpening and focus changes are applied to the mono version of
the sensor too, as we expect similar characteristics.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The maxVal variable in the computeInitialY function needs to be a
uint64_t, otherwise the subsequent multiplications in the function
can overflow on relatively high resolution images (when the counts in
the regions go over 16 bits).
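As a rough illustration of the arithmetic: a region count above 2^16
multiplied by a 16-bit value already exceeds 32 bits, so the accumulator
has to be widened (a sketch, not the actual computeInitialY code):

    #include <cstdint>

    // Widen before multiplying so the product cannot wrap.
    uint64_t regionWeight(uint32_t counted, uint32_t sum)
    {
        return static_cast<uint64_t>(counted) * sum;
    }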
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
On Raspberry Pi Compute Module platforms, it is possible to attach a
single camera device only to the secondary Unicam port. The current
logic of PipelineHandlerRPi::match() will return a failure during
enumeration of the first Unicam media device (due to no sensor being
attached, or a sensor failure), and thus the second Unicam media device
will never be enumerated.
Fix this by looping over all Unicam instances in PipelineHandlerRPi::match()
until a camera is correctly registered, or returning a failure otherwise.
Reported-on: https://github.com/raspberrypi/libcamera/issues/44
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a new parameter to the pipeline handler config file named
"unicam_timeout_value_ms" to allow users to override the automatically
computed Unicam timeout value.
This value is given in milliseconds, and setting a value of 0 (the
default value) disables the override.
An example use of this parameter would be an application that configures
a RAW stream and provides buffers for the stream on every request. If the
application holds off sending requests for a particular reason (e.g. a
timelapse use case), then we will possibly hit the watchdog timeout, as
it is only a small multiple of the frame length. This override allows an
application to select a larger value, with the knowledge that it may
space requests further apart than the calculated timeout value.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The existing mechanism of setting a timeout value simply uses the
maximum possible frame length advertised by the sensor mode. This can be
problematic: the IMX477 sensor, for example, can use a frame length of
over 600 seconds. However, for typical usage the frame length will never
go over several hundred milliseconds, making such a timeout very
impractical.
Store a list of the last 10 frame length values requested by the AGC. On
startup, and at the end of every frame, take the maximum frame length
value from this list and return that to the pipeline handler through the
setCameraTimeoutValue() signal. This allows the timeout value to better
track the actual sensor usage.
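The bookkeeping is essentially a small history of recent values with a
max taken over it (a sketch with assumed names, not the actual AGC code):

    #include <algorithm>
    #include <deque>

    // Remember the last 10 frame lengths requested by AGC and report the
    // largest of them as the basis for the timeout.
    class FrameLengthTracker
    {
    public:
        void add(double frameLengthMs)
        {
            history_.push_back(frameLengthMs);
            if (history_.size() > 10)
                history_.pop_front();
        }

        // Assumes at least one value has been added.
        double maxFrameLength() const
        {
            return *std::max_element(history_.begin(), history_.end());
        }

    private:
        std::deque<double> history_;
    };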
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add an explicit helper function setCameraTimeout() in the pipeline
handler to set the Unicam timeout value. This function is signalled from
the IPA to set up an appropriate timeout. This replaces the
maxSensorFrameLengthMs value parameter returned from
IPARPi::start().
Adjust the timeout to be 5x the maximum frame duration reported by the
IPA.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The RkISP1 implementation of the LensShadingCorrection algorithm has been
made adaptive to the scene color temperature in commit 14c869c00fdd ("ipa:
rkisp1: Take into account color temperature during LSC algorithm").
The LSC algorithm interpolates the correction factors using the
table's reference color temperatures. When calculating the interpolation
coefficients, an unintended integer division makes both coefficient
zeros resulting in a completely black image.
Fix this by type casting to double one of the division operands.
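The fix is the classic one-operand cast so the division happens in
floating point; schematically (not the exact LSC code, and assuming
ctLow <= ct <= ctHigh):

    // With all operands integral, (ct - ctLow) / (ctHigh - ctLow)
    // truncates to zero; casting one operand keeps the fraction.
    double interpCoefficient(unsigned int ct, unsigned int ctLow,
                             unsigned int ctHigh)
    {
        return static_cast<double>(ct - ctLow) / (ctHigh - ctLow);
    }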
Fixes: 14c869c00fdd ("ipa: rkisp1: Take into account color temperature during LSC algorithm")
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Currently the createPipelineHandlers() function connects itself to the
`devicesAdded` signal at the end of each call. As the Signal object
supports multiple non-unique listeners, the function would be called
exponentially often with each new event emitted on `devicesAdded`
(i.e. when udev reports a new camera being plugged in).
Fix it by connecting the createPipelineHandlers() slot to the
`devicesAdded` signal in CameraManager::Private::init() instead. This
prevents the slot from getting connected multiple times to the
`devicesAdded` signal.
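In outline, the change moves a one-time connection into init(); the
sketch below uses a minimal stand-in for the real Signal class:

    #include <functional>
    #include <vector>

    // Minimal stand-in for illustration only.
    struct Signal {
        std::vector<std::function<void()>> slots;
        void connect(std::function<void()> fn) { slots.push_back(std::move(fn)); }
        void emit() { for (const auto &fn : slots) fn(); }
    };

    struct CameraManagerPrivate {
        Signal devicesAdded;

        void init()
        {
            /* Connect exactly once, at initialisation time... */
            devicesAdded.connect([this] { createPipelineHandlers(); });
        }

        void createPipelineHandlers()
        {
            /* ...instead of re-connecting here on every call, which made
               the number of connections grow with every devicesAdded
               emission. */
        }
    };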
Signed-off-by: Sophie Friedrich <dev@flowerpot.me>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Currently it is not possible to display debug output from an isolated IPA
module. The standard descriptors are all closed and any specified log
file is explicitly deactivated for the IPA module. Since libcamera and the
isolated IPA module are separate processes, they cannot write to the same
file. However, if syslog is used, then this would be possible.
If syslog is specified as the log target, this setting is therefore left
in place for the isolated IPA module.
Signed-off-by: Matthias Fend <matthias.fend@emfend.at>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Tested-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
Instead of having a bool return type and an out parameter, use
std::optional<libcamera::StreamRole> as the return type of
StreamKeyValueParser::parseRole().
While at it, re-word an existing comment to make it clearer.
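The new shape of the function is roughly as follows (a simplified
sketch with a local role type, not the real StreamKeyValueParser code):

    #include <optional>
    #include <string>

    enum class StreamRole { Raw, StillCapture, VideoRecording, Viewfinder };

    // The role is now the return value itself; "no/unknown role" becomes
    // std::nullopt instead of a bool plus an out parameter.
    std::optional<StreamRole> parseRole(const std::string &value)
    {
        if (value == "viewfinder")
            return StreamRole::Viewfinder;
        if (value == "video")
            return StreamRole::VideoRecording;
        if (value == "still")
            return StreamRole::StillCapture;
        if (value == "raw")
            return StreamRole::Raw;
        return std::nullopt;
    }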
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Commit 613d5402673e ("pipeline: raspberrypi: Fix handling of colour
spaces") adjusts the colorspace to ColorSpace::Raw for raw streams.
However, if no colorspace is requested for a raw stream (nullopt), we
should still set its colorspace to ColorSpace::Raw.
Fixes: 613d5402673e ("pipeline: raspberrypi: Fix handling of colour spaces")
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
This patch adds a JEA implementation to replace libjpeg on the CrOS
platform, where a hardware accelerator is available.
Signed-off-by: Harvey Yang <chenghaoyang@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
To prepare for support of the JEA encoder in a following commit, which
will need to access the buffer_handle_t of the destination buffer, pass
the StreamBuffer to the Encoder::encode() function. As the StreamBuffer
contains the source FrameBuffer and the destination Span, drop them from
the function arguments and access them directly from the StreamBuffer.
Signed-off-by: Harvey Yang <chenghaoyang@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Han-Lin Chen <hanlinchen@chromium.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
To allow finer control over which JPEG sources are built for each
platform, this patch adds a meson.build file in the src/android/jpeg
directory.
Signed-off-by: Harvey Yang <chenghaoyang@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Han-Lin Chen <hanlinchen@chromium.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
HALFrameBuffer is derived from FrameBuffer with access to
buffer_handle_t, which is needed for JEA usage.
Signed-off-by: Harvey Yang <chenghaoyang@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Han-Lin Chen <hanlinchen@chromium.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Commit 6f6e1bf704fe ("libcamera: camera_sensor: Apply flips at
setFormat()") extended the CameraSensor::setFormat() function
to apply vertical/horizontal flips on the sensor based on the
supplied Transform. To pass the Transform to the function the
V4L2SubdeviceFormat structure has been augmented with a Transform
member.
However, as the newly added Transform is not used at all in the
V4L2Subdevice class, it should not be part of V4L2SubdeviceFormat.
Fix that by removing the transform field from V4L2SubdeviceFormat
and passing it as an explicit parameter to CameraSensor::setFormat().
Fixes: 6f6e1bf704fe ("libcamera: camera_sensor: Apply flips at setFormat()")
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Commit 1a614866a29c ("libcamera: camera_sensor: Validate Transform") has
removed usage of the RPiCameraData::supportsFlips_ but hasn't removed
the field itself, nor its initialization. Drop those as they're unused.
Fixes: 1a614866a29c ("libcamera: camera_sensor: Validate Transform")
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Commit 1a614866a29c ("libcamera: camera_sensor: Validate Transform") has
removed usage of the IPU3CameraData::rotationTransform_ but hasn't
removed the field itself, nor its initialization. Drop those as they're
unused.
Fixes: 1a614866a29c ("libcamera: camera_sensor: Validate Transform")
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Fix a typo introduced in a comment when refactoring transformation
handling in the CameraSensor class.
Fixes: 1a614866a29c ("libcamera: camera_sensor: Validate Transform")
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The CameraSensor::validateSensorDriver() function prints a Warning
message when the camera sensor doesn't support flips. We don't mandate
flip support and can run without it without any problem, so a warning is
too harsh. Demote it to a Debug message.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The VC4 ISP uses a pipeline bit-depth of 13 bits. The AGC algorithm needs to
know this bit-depth when computing the Y value for the image.
Instead of hardcoding the VC4 bit-depth in the AGC source code, normalise all
region sums to 16 bits when filling the Statistics structure. AWB and ALSC are
agnostic about pipeline depth, so do not need changing.
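The normalisation itself is just a shift by the difference between the
pipeline depth and 16 bits (a sketch; the statistics-filling code does
more than this):

    #include <cstdint>

    // Scale a region sum from the native pipeline bit-depth up to a
    // common 16-bit range, assuming pipelineDepth <= 16.
    uint64_t normaliseSum(uint64_t sum, unsigned int pipelineDepth)
    {
        return sum << (16 - pipelineDepth); /* e.g. 13-bit VC4 sums << 3 */
    }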
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Repurpose the StatisticsPtr type from being a shared_ptr<bcm2835_isp_stats> to
shared_ptr<RPiController::Statistics>. This removes any hardware specific header
files and structures from the algorithms source code.
Add a new function in the Raspberry Pi IPA to populate the generic statistics
structure from the values provided by the hardware in the bcm2835_isp_stats
structure.
Update the Lux, AWB, AGC, ALSC, Contrast, and Focus algorithms to use the
generic statistics structure appropriately in their calculations. Additionally,
remove references to any hardware specific headers and defines in these source
files.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: Nick Hollinghurst <nick.hollinghurst@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Release the statistics buffer after running through the AWB calculations.
Only the "counted" statistics are copied out to a local structure, so keeping
the statistics buffer allows the algorithm to see the "uncounted" statistics as
well.
This is currently handled by hard-coding the total number of statistics
regions based on the definition of the bcm2835_isp_stats structure.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a default constructor to the RPiController::Histogram class that creates
an empty histogram. Since this is a cumulative histogram, push a value of 0 into
the first (and only) bin to signify this.
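That is, a default histogram is a single cumulative bin holding zero;
in outline (a sketch of the idea, not the full class):

    #include <cstddef>
    #include <cstdint>
    #include <vector>

    // The class stores a cumulative histogram, so "empty" is represented
    // by one entry whose cumulative count is 0.
    class Histogram
    {
    public:
        Histogram() { cumulative_.push_back(0); }

        std::size_t bins() const { return cumulative_.size() - 1; }
        uint64_t total() const { return cumulative_.back(); }

    private:
        std::vector<uint64_t> cumulative_;
    };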
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
At present, the controller algorithms access the bcm2835_isp_stats structure,
which is hardware specific. It would be desirable to abstract out the statistics
structure to remove hardware specific headers from the algorithms source files.
Define a new templated RegionStats class that encompasses region based
statistics generated by the ISP. For the VC4 ISP, this can be used to hold
RGB sums and focus FoM values.
Define a new Statistics structure that holds all the VC4 ISP statistics output.
This includes AGC histograms, AGC/AWB region sums and focus FoM regions.
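A rough shape of such a container (illustrative only; the real class
carries more metadata and accessors):

    #include <cstdint>
    #include <vector>

    // Region-based statistics, templated on what each region holds
    // (e.g. RGB sums for AGC/AWB, or a focus FoM for the focus regions).
    template<typename T>
    class RegionStats
    {
    public:
        struct Region {
            T value{};
            uint32_t counted = 0;   /* pixels included in the sums */
            uint32_t uncounted = 0; /* pixels excluded, e.g. saturated */
        };

        void init(unsigned int width, unsigned int height)
        {
            width_ = width;
            height_ = height;
            regions_.assign(width * height, Region{});
        }

        Region &get(unsigned int index) { return regions_[index]; }
        unsigned int numRegions() const { return width_ * height_; }

    private:
        std::vector<Region> regions_;
        unsigned int width_ = 0;
        unsigned int height_ = 0;
    };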
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a new pipeline config parameter "disable_startup_frame_drops" to
disable any startup drop frames, overriding the IPA request.
When this parameter is set, it allows the pipeline handler to run with
no internally allocated Unicam buffers ("min_unicam_buffers").
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add the ability to read the platform configuration parameters from a
config file provided by the user through the LIBCAMERA_RPI_CONFIG_FILE
environment variable. Use the PipelineHandler::configurationFile()
helper to determine the full path of the file.
Provide an example configuration file named example.yaml. Currently two
parameters are available through the config file:
"min_unicam_buffers" - The minimum number of internal Unicam buffers to
allocate.
"min_total_unicam_buffers" - The minimum number of internal + external
Unicam buffers that must be allocated.
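For illustration, such a file might look something like this (the
surrounding layout and the values are assumptions for the sake of the
example; only the two parameter names above come from this change):

    # Hypothetical sketch of example.yaml
    "version": 1.0
    "target": "bcm2835"

    "pipeline_handler":
        # The minimum number of internal Unicam buffers to allocate.
        "min_unicam_buffers": 2
        # The minimum number of internal + external Unicam buffers.
        "min_total_unicam_buffers": 4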
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Reorder the code such that the IPA requested startup drop frames count is
available before the pipeline handler allocates any stream buffers.
This will be used in a subsequent change to stop Unicam buffer allocations if
there are no startup drop frames required and the application has configured a
raw stream and always provides buffers for it.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a configuration structure to store platform specific parameters used by
the pipeline handler. Currently, these only store Unicam buffer counts,
replacing the hardcoded static values in the source code.
In subsequent commits, more parameters will be added to the configuration
structure, and parameters will be read in through a config file.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a new helper function PipelineHandler::configurationFile() that returns
the full path of a named configuration file. This configuration file may be read
by pipeline handlers for platform specific configuration parameters on
initialisation.
The mechanism for searching for the configuration file is similar to the IPA
configuration file:
- In the source tree if libcamera is not installed
- Otherwise in standard system locations (etc and share directories).
When stored in the source tree, configuration files shall be located in a 'data'
subdirectory of their respective pipeline handler directory.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a pipeline_data_dir variable to the meson build files. This variable
points to the location of pipeline handler specific configuration files
on the filesystem.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
We implement a custom validateColorSpaces method that forces all
(non-raw) streams to the same colour space, whilst distinguishing RGB
streams from YUV ones, as the former must have their YCbCr encoding and
range overwritten.
When we apply the colour space, we always send the full YUV version as
that gets converted correctly to what our hardware drivers expect. The
code is also careful to check what comes back, as the YCbCr information
gets overwritten again on the way back.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Add support for Raspberry Pi Camera 3 modules (Sony IMX708 camera sensor) to the
Raspberry Pi IPA. These modules are available with either a normal or a
wide angle lens, each with or without an IR cut filter, giving a total of
4 variants. Provide IQ tuning files for all four variants.
The IMX708 camera helper additionally parses PDAF and HDR histogram data that
is provided in the embedded data stream from Unicam.
Signed-off-by: Nick Hollinghurst <nick.hollinghurst@raspberrypi.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|