Meson uses tags to sort installed files in categories, and makes it
possible to install a subset of the files using the '--tags' argument to
'meson install'. This is typically used by distributions to split the
runtime, development and documentation files into separate packages.
By default, meson tries to guess the correct tag for installed files,
but can't always do so properly. Mark the install targets whose tag
meson can't guess with an explicit install_tag.
As the feature was introduced in meson 0.60, bump the minimum required
meson version. The latest LTS releases of all major distributions that
libcamera currently targets ship a recent enough meson version.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
Commit a3178dd0391f ("ipa: rkisp1: agc: drop hard-coded analogue gain range")
removed both minimum and maximum limits for the analogue gain value.
However, as some sensors can potentially have a minimum gain lower than
1.0, restore the check for the minimum limit.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
As the sensor's analogue gain range is known (read out in
IPARkISP1::configure()), drop the limiting hard-coded range.
This enables better performance in low-light conditions for sensors with
a higher maximum gain (e.g. the imx327).
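As a rough illustration (the names and structure are illustrative, not the
actual IPA code), the limits read from the sensor at configure() time replace
the former hard-coded constants when clamping the AGC output:

    #include <algorithm>

    /* Gain limits read from the sensor (e.g. V4L2_CID_ANALOGUE_GAIN) in configure(). */
    struct SensorGainLimits {
        double min;
        double max;
    };

    /* Clamp the gain computed by the AGC to the sensor's real range. */
    double clampAnalogueGain(double gain, const SensorGainLimits &limits)
    {
        return std::clamp(gain, limits.min, limits.max);
    }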
Signed-off-by: Benjamin Bara <benjamin.bara@skidata.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The current IPA build files require a flat directory structure for the
IPAs. Modify the build files to remove this restriction and allow a
directory structure such as:
src/ipa
|- raspberrypi
|   |- common
|   |- cam_helper
|   |- controller
|   |- vc4
|- rkisp1
|- ipu3
where each subdir (e.g. raspberrypi/common, raspberrypi/cam_helper) has
its own meson.build file. Such a directory structure will be introduced
for the Raspberry Pi IPA in a future commit.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add configuration files to the RkISP1 data directory for the imx258 and
ov8858 sensors found on Pine64 PinePhone Pro devices.
The tuning files contain LSC tables extracted from the Rockchip Android
BSP that the PinePhone Pro ships with.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Tested-by: Robert Mader <robert.mader@collabora.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add configuration files to the RkISP1 data directory for the ov2685 and
ov5695 sensors found on the Google DRU "Scarlet" tablet.
The tuning files contain LSC tables extracted from the ChromeOS camera
HAL configuration files, available at
src/overlays/overlay-scarlet/media-libs/cros-camera-hal-configs-scarlet/files/IQ/
at revision "stabilize-14790.B".
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The RkISP1 implementation of the LensShadingCorrection algorithm has been
made adaptive to the scene color temperature in commit 14c869c00fdd ("ipa:
rkisp1: Take into account color temperature during LSC algorithm").
The LSC algorithm interpolates the correction factors using the
table's reference color temperatures. When calculating the interpolation
coefficients, an unintended integer division makes both coefficients
zero, resulting in a completely black image.
Fix this by casting one of the division operands to double.
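A minimal illustration of the problem and the fix, with illustrative names:

    #include <cstdint>

    /* Interpolation coefficient between two reference colour temperatures. */
    double interpolationCoefficient(uint32_t ct, uint32_t ctLow, uint32_t ctHigh)
    {
        /* Buggy: the all-integer expression truncates to 0 for ctLow < ct < ctHigh. */
        /* return (ct - ctLow) / (ctHigh - ctLow); */

        /* Fixed: cast one operand so the division happens in floating point. */
        return static_cast<double>(ct - ctLow) / (ctHigh - ctLow);
    }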
Fixes: 14c869c00fdd ("ipa: rkisp1: Take into account color temperature during LSC algorithm")
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The Omnivision OV4689 sensor driver exposes a maximum analogue gain of
16x. Raise kMaxAnalogueGain to 16.0 so that the full gain range can be
used.
Signed-off-by: Mikhail Rudenko <mike.rudenko@gmail.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
Add a minimal tuning file for the Omnivision OV4689, specifying the
black level subtraction value.
Signed-off-by: Mikhail Rudenko <mike.rudenko@gmail.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
To make things easier for consumers.
Related: https://gitlab.freedesktop.org/pipewire/pipewire/-/merge_requests/1450
Signed-off-by: Robert Mader <robert.mader@collabora.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Support raw capture by allowing manual control of the exposure time and
analogue gain.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Add support for manual gain and exposure in the rkisp1 IPA.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The RkISP1 can capture raw frames by bypassing the ISP. In that mode,
the ISP will neither produce statistics nor consume parameters. Most
algorithms will thus be unable to run, with one exception: the AGC will
still be able to configure the sensor exposure time and analogue gain in
manual mode.
To prepare for this, add the ability to disable algorithms for the
duration of the capture session based on the camera configuration.
Individual algorithms report whether they support raw formats at
construction time, and the IPA module disables algorithms in configure()
based on the stream configurations.
Disabled algorithms are skipped during the capture session in the
processStatsBuffer() operation. As the ISP doesn't produce statistics,
don't try to access the stats buffer. There is no need for similar logic
in fillParamsBuffer() as that operation won't be called for raw capture.
All algorithms report not supporting raw capture by default. Raw support
in AGC will be added separately.
The feature is implemented in the RkISP1 module without any support from
libipa at this point to avoid designing a generic API based on a single
user. This may be changed in the future.
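A sketch of the mechanism (names are indicative only, not the exact class
layout): each algorithm declares at construction time whether it supports raw
formats, configure() disables the others when a raw stream is configured, and
the per-frame loop then skips disabled algorithms.

    #include <memory>
    #include <vector>

    class Algorithm
    {
    public:
        Algorithm(bool supportsRaw = false) : supportsRaw_(supportsRaw) {}
        virtual ~Algorithm() = default;

        bool supportsRaw_;
        bool disabled_ = false;
    };

    class Agc : public Algorithm
    {
    public:
        Agc() : Algorithm(true) {}  /* AGC keeps driving the sensor in raw mode */
    };

    void configureAlgorithms(std::vector<std::unique_ptr<Algorithm>> &algorithms,
                             bool rawCapture)
    {
        for (auto &algo : algorithms)
            algo->disabled_ = rawCapture && !algo->supportsRaw_;
    }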
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Currently, the Android HAL does not work on rkisp1-based devices because
the required FrameDurationLimits metadata is missing from the IPA
implementation.
This change sets FrameDurationLimits for rkisp1 based on the existing
ipu3 implementation, using the sensor's reported range of vertical
blanking intervals with the minimum reported horizontal blanking
interval.
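As a rough sketch of the calculation (an assumed helper, not the actual IPA
code), the frame duration is the line duration multiplied by the number of
lines including vertical blanking, so the limits follow from the sensor's
vertical blanking range and its minimum horizontal blanking:

    #include <cstdint>
    #include <utility>

    /*
     * Compute the frame duration limits in microseconds. lineLength is the total
     * line length in pixels (active width plus minimum horizontal blanking) and
     * pixelRate is expressed in pixels per second.
     */
    std::pair<double, double> frameDurationLimits(uint32_t height, uint32_t vblankMin,
                                                  uint32_t vblankMax, uint32_t lineLength,
                                                  uint64_t pixelRate)
    {
        double lineDuration = lineLength / static_cast<double>(pixelRate) * 1e6;

        return { (height + vblankMin) * lineDuration,
                 (height + vblankMax) * lineDuration };
    }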
Signed-off-by: Nicholas Roth <nicholas@rothemail.net>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The RkISP1 pipeline handler and IPA module allow the CameraSensorInfo to
be empty, probably to accommodate a sensor used in a test platform that
does not fulfil the mandatory libcamera requirements.
As the \todo item in the IPA notes, the received CameraSensorInfo may be
empty and should be checked before being accessed, but this requirement
is currently not enforced in the code.
All the test platforms in use can now be assumed to have moved their
sensor drivers to comply with the minimum requirements and to provide a
populated CameraSensorInfo to the IPA.
As the safety check is not enforced, and as we don't want faulty sensors
to send an empty CameraSensorInfo to the IPA, remove the \todo item in
the IPA and fail hard in the pipeline handler if the sensor does not
comply with the libcamera requirements.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The RkISP1 implementation of IPA::configure() still uses the legacy
interface where sensor controls (and eventually lens controls) are
passed from the pipeline handler to the IPA in a map.
Since the introduction of the mojom-based IPA interface definition, it is
possible to define custom data types and use them in the interface
definition between the pipeline handler and the IPA.
Align the RkISP1 IPA::configure() implementation with the one in the
IPU3 IPA module by using a custom data type instead of relying on a map
to pass controls to the IPA.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The CameraSensor class validates that the sensor driver in use supports
the controls required for IPA modules to work correctly.
For in-tree IPA modules, whose pipeline handlers already use
CameraSensor, there's no need to validate such controls again.
Remove the controls validation from the IPU3 and RkISP1 IPA modules and
rely on CameraSensor doing it at initialization time.
The list of mandatory controls is expanded to add V4L2_CID_ANALOGUE_GAIN,
without which IPA modules cannot function.
The new requirement only applies to RAW sensors; platforms like UVC and
Simple are not impacted by this change.
While at it, expand the sensor driver requirements documentation to
include V4L2_CID_ANALOGUE_GAIN in the list of mandatory controls a sensor
driver has to support.
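A sketch of the kind of check performed at initialization time (the exact list
of mandatory controls in CameraSensor is longer; this only shows the idea):

    #include <array>
    #include <cstdint>
    #include <functional>
    #include <linux/v4l2-controls.h>

    /* Illustrative subset of controls a RAW sensor driver must expose. */
    constexpr std::array<uint32_t, 4> kMandatoryRawControls = {
        V4L2_CID_ANALOGUE_GAIN,
        V4L2_CID_EXPOSURE,
        V4L2_CID_HBLANK,
        V4L2_CID_VBLANK,
    };

    /* Fail sensor initialization if any mandatory control is missing. */
    bool validateSensorControls(const std::function<bool(uint32_t)> &hasControl)
    {
        for (uint32_t id : kMandatoryRawControls) {
            if (!hasControl(id))
                return false;
        }

        return true;
    }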
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add rudimentary LSC tables for imx219. These were generated with the
rkisp1 tuning script from libtuning [1], using an imx8mp (debix) and a white
computer monitor, at only a single color temperature of 5800K.
[1] https://lists.libcamera.org/pipermail/libcamera-devel/2022-October/035017.html
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Add sets of coefficients to the YAML tuning file to allow using a
different set depending on the image color temperature (provided by the
AWB algorithm).
During processing, the LSC algorithm computes the coefficients by linear
interpolation between the two closest sets.
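A minimal sketch of that interpolation (assumed names, not the actual
implementation):

    #include <cstddef>
    #include <vector>

    /* Blend the two LSC tables whose reference temperatures bracket ct. */
    std::vector<double> interpolateLscTable(const std::vector<double> &tableLow,
                                            const std::vector<double> &tableHigh,
                                            double ctLow, double ctHigh, double ct)
    {
        double alpha = (ct - ctLow) / (ctHigh - ctLow);

        std::vector<double> result(tableLow.size());
        for (std::size_t i = 0; i < result.size(); i++)
            result[i] = (1.0 - alpha) * tableLow[i] + alpha * tableHigh[i];

        return result;
    }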
Signed-off-by: Florian Sylvestre <fsylvestre@baylibre.com>
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The limits for the shutter speed and analogue gain are stored in
IPASessionConfiguration::agc. While they're related to the AGC, they are
properties of the sensor, and are stored in the session configuration by
the IPA module, not the AGC algorithm. Move them to the
IPASessionConfiguration::sensor structure where they belong.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Reorder functions in the base ipa::Algorithm and its derived classes to
match the calling order: queueRequest(), prepare() and process(). This
makes the code flow easier to read. No functional change intended.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
LSC gradient parameters are currently computed during the prepare() phase.
Because these parameters only need to be computed once and remain constant
for all subsequent frames, move the computation to the configure() function.
Signed-off-by: Florian Sylvestre <fsylvestre@baylibre.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Fill the frame metadata in the AGC and AWB algorithm's prepare()
function. Additional metadata for other algorithms will be addressed
later.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Extend the Algorithm::process() function with a metadata control list,
to be filled by individual algorithms with frame metadata. Update the
rkisp1 and ipu3 IPA modules accordingly, and drop the dead code in the
IPARkISP1::prepareMetadata() function while at it.
This only creates the infrastructure, filling metadata in individual
algorithms will be handled separately.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
There's no need to print the exposure and gain control ranges as an Info
message. Downgrade it to Debug. While at it, print the ranges using the
"[min, max]" syntax.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Add fields for minimum and maximum line length (in units of pixels) to
the IPACameraSensorInfo structure. This replaces the existing lineLength
field.
Update the ipu3, raspberrypi and rkisp1 IPAs to use
IPACameraSensorInfo::minLineLength instead of
IPACameraSensorInfo::lineLength, as logically we will always want to use
the fastest sensor readout by default.
Since the IPAs now use minLineLength for their calculations, set the
starting value of the V4L2_CID_HBLANK control to its minimum in
CameraSensor::init().
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: Dave Stevenson <dave.stevenson@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
utils::defopt causes compilation issues on gcc 8.0.0 to gcc 8.3.0,
likely due to bug https://gcc.gnu.org/bugzilla/show_bug.cgi?id=86521
that was fixed in gcc 8.4.0. gcc 8.3.0 may be considered old (libcamera
requires gcc-8 or newer), but it is shipped by Debian 10 that has LTS
support until mid-2024.
As no workaround has been found to fix compilation on gcc 8.3.0 while
still retaining the functionality of utils::defopt, stop using it in the
RkISP1 IPA module.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The REGISTER_CAMERA_SENSOR_HELPER() macro defines a class type that
inherits from the CameraSensorHelperFactory class, and implements a
constructor and createInstance() function. Replace the code generation
through macro with the C++ equivalent, a class template, as done by the
Algorithm factory.
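A condensed sketch of the pattern (simplified from the real classes): the
factory base registers itself on construction, and the class template provides
createInstance() for each helper type.

    #include <memory>
    #include <string>

    class CameraSensorHelper
    {
    public:
        virtual ~CameraSensorHelper() = default;
    };

    class CameraSensorHelperFactoryBase
    {
    public:
        CameraSensorHelperFactoryBase(const std::string &name) : name_(name)
        {
            /* A real implementation registers 'this' in a global factory list here. */
        }
        virtual ~CameraSensorHelperFactoryBase() = default;

        virtual std::unique_ptr<CameraSensorHelper> createInstance() const = 0;

        const std::string &name() const { return name_; }

    private:
        std::string name_;
    };

    template<typename Helper>
    class CameraSensorHelperFactory final : public CameraSensorHelperFactoryBase
    {
    public:
        using CameraSensorHelperFactoryBase::CameraSensorHelperFactoryBase;

        std::unique_ptr<CameraSensorHelper> createInstance() const override
        {
            return std::make_unique<Helper>();
        }
    };

    /* Registration becomes a static instance of the template. */
    class CameraSensorHelperImx219 : public CameraSensorHelper {};
    static CameraSensorHelperFactory<CameraSensorHelperImx219> imx219Factory("imx219");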
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Xavier Roumegue <xavier.roumegue@oss.nxp.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Multiple algorithms have an initialized_ flag that they set to true at
the end of the init() function, and check at the beginning of prepare()
to skip preparation. This serves no real purpose, as the flag can only
be false if init() fails, in which case the IPA module initialization as
a whole will fail.
Drop the initialized_ flags.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The red and blue gains are computed by dividing the green mean by the
red and blue means respectively. An offset of 1 is added to the divisors
to avoid divisions by zero. This introduces a bias in the gain values.
Fix it by clamping the divisors to a minimum of 1.0 instead of adding an
offset.
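In other words (illustrative names), the divisor is clamped rather than offset:

    #include <algorithm>

    /* Biased: double redGain = greenMean / (redMean + 1.0); */

    /* Unbiased, while still avoiding a division by zero: */
    double computeRedGain(double greenMean, double redMean)
    {
        return greenMean / std::max(redMean, 1.0);
    }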
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
When the RGB means are too small, gains and color temperature can't be
meaningfully calculated. Freeze the AWB in that case, using the
previously calculated values.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The gain values are currently clamped to the range [0.0, 3.996] used by
the hardware. A zero value makes little sense, as it would completely
remove the contribution of the corresponding color channel from the AWB
accumulators, but worse, would lead to divisions by zero when
calculating the raw means in subsequent iterations. Prevent this by
setting the minimum gain value to 1/256.
While at it, clamp the gain values before filtering them, to improve the
stability of the control loop.
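A sketch of the resulting order of operations (the filter form and constant
names are assumptions for illustration; 3.996 is the hardware limit mentioned
above):

    #include <algorithm>

    constexpr double kMinGain = 1.0 / 256; /* avoids zero gains and later divisions by zero */
    constexpr double kMaxGain = 3.996;     /* hardware limit */

    /* Clamp the new gain estimate first, then apply the smoothing filter. */
    double filterGain(double previousGain, double estimatedGain, double speed)
    {
        double clamped = std::clamp(estimatedGain, kMinGain, kMaxGain);

        return previousGain * (1.0 - speed) + clamped * speed;
    }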
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Due to hardware rounding errors in the YCbCr means, the calculated RGB
means may be negative. This would lead to negative gains, messing up the
calculation. Prevent this by clamping the means to positive values.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Extend the debug message in Awb::process() to log the means and color
temperature in addition to the gains. This is useful for debugging the
algorithm behaviour. While at it, set the showpoint flag to print a
fixed number of digits after the decimal point, making logs more
readable.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The RkISP1 hardware actually supports two modes for the color means, RGB
and YCbCr. The variables where the means are stored are identically named
regardless of which color means mode has been selected.
Since the gains are computed from RGB means, a conversion needs to be
done when the mode is YCbCr, which is unnecessary when RGB mode is
selected.
Add support for the RGB means mode too, by checking at runtime which mode
is selected. The default is still set to YCbCr mode for now.
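For reference, a sketch of the kind of conversion needed in YCbCr mode; BT.601
full-range coefficients are shown purely for illustration, the coefficients
actually used by the hardware and the IPA may differ:

    struct RgbMeans {
        double red;
        double green;
        double blue;
    };

    /* Convert YCbCr means to RGB means so the gains can be computed uniformly. */
    RgbMeans ycbcrToRgb(double y, double cb, double cr)
    {
        return {
            y + 1.402 * (cr - 128.0),
            y - 0.344 * (cb - 128.0) - 0.714 * (cr - 128.0),
            y + 1.772 * (cb - 128.0),
        };
    }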
Cc: Quentin Schulz <foss+libcamera@0leil.net>
Signed-off-by: Quentin Schulz <quentin.schulz@theobroma-systems.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The color temperature doesn't need floating point precision, and is
calculated by Awb::estimateCCT() as an unsigned integer. Store it with
the same data type in the frame context.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The AWB statistics are computed after the ISP applies the colour gains.
This means that the red, green and blue means do not match the data
coming directly from the sensor, but are multiplied by the colour gains
that were used for the frame on which the statistics have been computed.
The AWB algorithm needs to take this into account when calculating the
colour gains for the next frame. Do so by dividing the means by the
gains that were applied to the frame, retrieved from the frame context.
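Conceptually (illustrative names), the statistics are divided by the gains
stored in the frame context before new gains are derived; the earlier clamping
of the gains above zero makes the division safe:

    struct ColourMeans {
        double red;
        double green;
        double blue;
    };

    struct ColourGains {
        double red;
        double green;
        double blue;
    };

    /* Undo the colour gains that were applied to the frame the statistics describe. */
    ColourMeans correctMeans(const ColourMeans &measured, const ColourGains &applied)
    {
        return {
            measured.red / applied.red,
            measured.green / applied.green,
            measured.blue / applied.blue,
        };
    }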
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Now that data used by algorithms has been partitioned between the active
state and frame context, we have a better view of the role of each of
those structures. Document them appropriately.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Rework the algorithm's usage of the active state, to store the value of
controls for the last queued request in the queueRequest() function, and
store a copy of the values in the corresponding frame context. The
latter is used in the prepare() function to populate the ISP parameters
with values corresponding to the right frame.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Rework the algorithm's usage of the active state, to store the value of
controls for the last queued request in the queueRequest() function, and
store a copy of the values in the corresponding frame context. The
latter is used in the prepare() function to populate the ISP parameters
with values corresponding to the right frame.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Rework the algorithm's usage of the active state, to store the value of
controls for the last queued request in the queueRequest() function, and
store a copy of the values in the corresponding frame context. The
latter is used in the prepare() function to populate the ISP parameters
with values corresponding to the right frame.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Rework the algorithm's usage of the active state and frame context to
store data in the right place.
The active state stores two distinct categories of information:
- The consolidated value of all algorithm controls. Requests passed to
the queueRequest() function store values for controls that the
application wants to modify for that particular frame, and the
queueRequest() function updates the active state with those values.
The active state thus contains a consolidated view of the value of all
controls handled by the algorithm.
- The value of parameters computed by the algorithm when running in auto
mode. Algorithms running in auto mode compute new parameters every
time statistics buffers are received (either synchronously, or
possibly in a background thread). The latest computed value of those
parameters is stored in the active state in the process() function.
The frame context also stores two categories of information:
- The value of the controls to be applied to the frame. These values are
typically set in the queueRequest() function, from the consolidated
control values stored in the active state. The frame context thus
stores values for all controls related to the algorithm, not limited
to the controls specified in the corresponding request, but
consolidated from all requests that have been queued so far.
For controls that can be specified manually or computed by the
algorithm depending on the operation mode (such as the colour gains),
the control value will be stored in the frame context in
queueRequest() only when operating in manual mode. When operating in
auto mode, the values are computed by the algorithm and stored in the
frame context in prepare(), just before being stored in the ISP
parameters buffer.
The queueRequest() function can also store ancillary data in the frame
context, such as flags to indicate if (and what) control values have
changed compared to the previous request.
- Status information computed by the algorithm for a frame. For
instance, the colour temperature estimated by the algorithm from ISP
statistics calculated on a frame is stored in the frame context for
that frame in the process() function.
The active state and frame context thus both contain identical members
for most control values, but store values that have a different meaning.
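To make the two roles concrete, here is a very small sketch (the structure and
member names are illustrative, not the actual IPA context definitions):

    /* Consolidated control values and the latest auto-computed results. */
    struct ActiveStateAwb {
        bool autoEnabled;          /* consolidated from all queued requests */
        double manualRedGain;      /* last manually requested gains */
        double manualBlueGain;
        double autoRedGain;        /* latest values computed in process() */
        double autoBlueGain;
    };

    /* Per-frame snapshot, filled in queueRequest()/prepare() and read back later. */
    struct FrameContextAwb {
        bool autoEnabled;          /* mode that applies to this frame */
        double redGain;            /* gains actually applied to this frame */
        double blueGain;
        unsigned int temperatureK; /* estimated from this frame's statistics */
    };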
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Rework the algorithm's usage of the active state to store the value of
controls for the last queued request in the queueRequest() function, and
store a copy of the values in the corresponding frame context.
The frame context is used in the prepare() function to populate the ISP
parameters with values corresponding to the right frame.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Now that the Algorithm::prepare() function takes a frame number, we can
use it to replace the IPAActiveState::frameCount member.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Establish a queue of FrameContexts using the new FCQueue and use it to
supply the FrameContext to the algorithms.
The RkISP1 algorithms do not use this themselves yet, but will be able
to do so after the introduction of this patch.
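A rough sketch of such a queue (simplified; the real FCQueue has additional
bookkeeping, and the context type is assumed to carry its frame number): a
fixed-size ring indexed by frame number.

    #include <array>
    #include <cstddef>
    #include <cstdint>

    template<typename FrameContext, std::size_t N = 16>
    class FrameContextQueue
    {
    public:
        /* Initialise the context for a newly queued frame. */
        FrameContext &alloc(uint32_t frame)
        {
            FrameContext &ctx = contexts_[frame % N];
            ctx = FrameContext{};
            ctx.frame = frame;
            return ctx;
        }

        /* Retrieve the context associated with a frame number. */
        FrameContext &get(uint32_t frame)
        {
            return contexts_[frame % N];
        }

    private:
        std::array<FrameContext, N> contexts_;
    };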
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Inherit from the base FrameContext class in the RkISP1 IPAFrameContext.
As the IPAFrameContext is currently unused, this change is a no-op, but
it prepares the RkISP1 IPA module for frame context queue support.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The RkISP1 IPA module creates a single instance of its IPAFrameContext
structure, effectively using it more as an active state than a per-frame
context. To prepare for the introduction of a real per-frame context,
move all the members of the IPAFrameContext structure to a new
IPAActiveState structure. The IPAFrameContext becomes effectively
unused at runtime, and will be populated back with per-frame data after
converting the RkISP1 IPA module to using a frame context queue.
The IPAActiveState structure will slowly morph into a different entity
as individual algorithms get ported to the frame context API later on.
While at it, fix a typo in the documentation of the
Agc::computeExposure() function that incorrectly refers to the frame
context instead of the global context.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The documentation of the IPA context structures is separate from the
documentation of the structure members. Sort the documentation block to
group members with their structure.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The "autoExposure" class member is not used.
Remove it.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
IPA modules have access to the incoming Request's control list and need
to store the control values in the frame context at queueRequest() time.
Pass the frame context to the Algorithm::queueRequest() function.
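The resulting hook has roughly this shape (simplified and with
forward-declared types; not the verbatim interface):

    #include <cstdint>

    class IPAContext;
    class IPAFrameContext;
    class ControlList;

    class Algorithm
    {
    public:
        virtual ~Algorithm() = default;

        /* The frame context of the request being queued is now passed in. */
        virtual void queueRequest(IPAContext &context, uint32_t frame,
                                  IPAFrameContext &frameContext,
                                  const ControlList &controls) = 0;
    };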
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|