|
The IPAIPU3 interface currently uses event-type-based structures in
order to communicate with the pipeline handler (and vice versa).
Replace the event-based structures with dedicated functions associated
with each operation.
The translated naming scheme of actions to signals is:
ActionSetSensorControls => setSensorControls
ActionParamFilled => paramsBufferReady
ActionMetadataReady => metadataReady
The translated naming scheme of events to dedicated functions is:
EventProcessControls => queueRequest()
EventStatReady => processStatsBuffer()
EventFillParams => fillParamsBuffer()
The dedicated functions are called by the pipeline handler on the IPA
over IPC. These functions run asynchronously and, when they complete,
the IPA emits the corresponding signals listed in the translated naming
scheme above.
EventProcessControls is translated to queueRequest() to bring the IPU3
interface in line with the other IPA interfaces.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Enable the V4L2VideoDevice dequeue timeout for the Unicam Image node, and
connect the timeout signal to a slot in the pipeline handler. This slot will
log an error message informing the user of a possible hardware stall.
The timeout is calculated as 2x the maximum frame length possible for a given
mode, returned by the IPA.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Under the right circumstances, the ALSC calculations could spread colour
errors across the entire image, as lambda remains unbounded. This would
cause the corrected image chroma values to slowly drift to incorrect values.
Add a config parameter (alsc.lambda_bound) that provides an upper and
lower bound on the lambda value at every stage of the calculation. With
this change, the lambda values are now adjusted so that the average
across the entire grid is 1, instead of normalising to the minimum value.
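A minimal sketch of the idea, assuming a symmetric bound so that each
lambda is clamped to [1/bound, bound] before the grid is renormalised to
an average of 1; the names and exact bounding scheme are illustrative,
not the actual Raspberry Pi implementation:

    #include <algorithm>
    #include <numeric>
    #include <vector>

    void boundAndNormaliseLambdas(std::vector<double> &lambdas, double bound)
    {
        /* Clamp every lambda to the configured bound. */
        for (double &l : lambdas)
            l = std::clamp(l, 1.0 / bound, bound);

        /* Renormalise so the average over the whole grid is 1. */
        double average = std::accumulate(lambdas.begin(), lambdas.end(), 0.0) /
                         lambdas.size();
        for (double &l : lambdas)
            l /= average;
    }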
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Synchronise with other IPA interfaces (e.g. IPU3, RkISP1) that use
queueRequest() to pass the request controls to the IPA.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Queueing a request (i.e. passing its controls to the IPA) and filling
the parameters buffer are two separate operations. Treat them as such by
splitting them into two functions in the RkISP1 IPA interface.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The OV5640 is an OmniVision sensor with a linear gain model, expressed
in 1/16 steps.
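As an illustration of that model (hypothetical helper names, not the
actual CameraSensorHelper code), a gain code in 1/16 steps converts to
and from a linear gain as follows:

    #include <cmath>
    #include <cstdint>

    /* Real gain = code / 16, e.g. a code of 32 is 2x analogue gain. */
    double gainCodeToGain(uint32_t code)
    {
        return code / 16.0;
    }

    uint32_t gainToGainCode(double gain)
    {
        return static_cast<uint32_t>(std::lround(gain * 16.0));
    }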
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The IMX296 is a Sony sensor that expresses its gain in 0.1dB units. It
thus maps to the exponential gain model.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The IMX290 is a Sony sensor that expresses its gain in 0.3dB units. It
thus maps to the exponential gain model.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The CameraSensorHelper specifies two gain models, linear and
exponential. They are modelled after the MIPI CCS specification. Only
the linear model has been implemented; the exponential model was left
for later.
We now need to support sensors that configure their gain in a hardware
register with a value expressed in dB. This has similarities with the
MIPI CCS exponential gain model, but it only has an exponential factor,
while CCS also allows sensors to support a configurable linear factor.
The full CCS exponential model needs two values (for the linear and
exponential factors) to express a gain, while IPAs use a single linear
gain value internally. However, the exponential gain model example in
the CCS specification has a fixed linear factor, which may indicate that
it could be common for sensors that implement the exponential gain model
to only use the exponential factor. For this reason, implement the
exponential gain model with a fixed linear factor, but with a
sensor-specific coefficient for the exponential factor that allows
expressing the gain in dB (or other logarithmic units) instead of
limiting it to powers of 2 as in the MIPI CCS specification.
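A small sketch of such a model with the linear factor fixed to 1
(hypothetical helper names; the conversion is the standard decibel
relation, not necessarily the exact implementation): a gain register in
steps of stepDb decibels maps to a linear gain of 10^(code * stepDb / 20),
which is equivalent to 2^(m * code) with m = stepDb * log2(10) / 20.

    #include <cmath>
    #include <cstdint>

    double gainCodeToGain(uint32_t code, double stepDb)
    {
        return std::pow(10.0, code * stepDb / 20.0);
    }

    uint32_t gainToGainCode(double gain, double stepDb)
    {
        return static_cast<uint32_t>(std::lround(20.0 * std::log10(gain) / stepDb));
    }

    /*
     * For a sensor with 0.1 dB steps (such as the IMX296 above), a code
     * of 200 is 20 dB, i.e. a linear gain of 10x.
     */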
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
To prepare for gain models other than the linear model, store the gain
constants in a union with per-model members. Due to the lack of
designated initializer support in gcc with C++17, initializing a single
complex structure that includes a union would be difficult. Split the
gain model type into a separate variable to work around this issue.
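A minimal sketch of the resulting layout, with illustrative names rather
than the actual CameraSensorHelper definitions:

    #include <cstdint>

    struct AnalogueGainLinearConstants {
        int16_t m0, c0, m1, c1;
    };

    struct AnalogueGainExpConstants {
        double a;       /* linear factor */
        double m;       /* exponential coefficient */
    };

    union AnalogueGainConstants {
        AnalogueGainLinearConstants linear;
        AnalogueGainExpConstants exp;
    };

    enum AnalogueGainType {
        AnalogueGainLinear,
        AnalogueGainExponential,
    };

    /*
     * The model type lives outside the union, so each helper only needs
     * to initialize the type and the one union member it uses.
     */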
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Add a validation check for sensor controls, validateSensorControls(),
before they are queried in IPAIPU3::updateSessionConfiguration().
Fail IPAIPU3::configure() if the required sensor controls are not
found.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Drop the exposure and gain private members from IPAIPU3, because the
values are handled directly via IPAFrameContext.
Move the default vblank value from IPAIPU3 to the IPASessionConfiguration
structure, as it is a static default value that is not expected to change
during a session.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The exposure and gain limits are required for the AGC configuration
handled in IPAIPU3::updateSessionConfiguration(), which already happens.
Therefore the max/min private members for exposure/gain in the IPAIPU3
class serve no purpose other than setting the initial values of the
exposure_ and gain_ members.
Drop the max/min private members from the IPAIPU3 class and set the
initial gain_ and exposure_ in IPAIPU3::updateSessionConfiguration().
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The RkISP1 ISP calculates a mean value for Y, Cr and Cb for each frame.
There is an RGB mode which could theoretically give us the values for R,
G and B directly, but it seems to be failing right now.
Convert those values into R, G and B and estimate the gains to apply
using a grey world model.
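A rough sketch of the conversion and the grey world estimate, assuming
full-range BT.601 coefficients with Cb/Cr centred on 128; the
coefficients and names are illustrative, not the exact algorithm:

    #include <algorithm>

    struct AwbGains {
        double red;
        double blue;
    };

    AwbGains greyWorld(double yMean, double cbMean, double crMean)
    {
        double redMean = yMean + 1.402 * (crMean - 128.0);
        double greenMean = yMean - 0.344 * (cbMean - 128.0)
                         - 0.714 * (crMean - 128.0);
        double blueMean = yMean + 1.772 * (cbMean - 128.0);

        /* Grey world: scale red and blue so their means match green. */
        return { greenMean / std::max(redMean, 1.0),
                 greenMean / std::max(blueMean, 1.0) };
    }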
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Peter Griffin <peter.griffin@linaro.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
As for the IPU3, we can estimate the histogram of the luminance. The
RkISP1 can produce several histograms: R, G and B individually, Y only,
or a combination of RGB. The one we are interested in for AGC is the Y
histogram.
Use the hardware revision to determine the number of bins of the
produced histogram.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Peter Griffin <peter.griffin@linaro.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The Histogram constructor does not modify the data. Pass it a
Span<const uint32_t> instead of a Span<uint32_t>.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In order to have the proper pixel levels, apply a fixed black level
correction, based on the imx219 tuning file from the Raspberry Pi IPA.
The value is 4096 on 16 bits, while the RkISP1 pipeline operates on
12 bits, so scale it accordingly.
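A worked example of that scaling (names are illustrative): dropping the
4 extra bits divides the 16-bit value by 16.

    constexpr unsigned int blackLevel16 = 4096;
    constexpr unsigned int blackLevel12 = blackLevel16 >> (16 - 12); /* = 256 */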
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Peter Griffin <peter.griffin@linaro.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Introduce a frameCount variable in IPAFrameContext which increments
each time a request is queued. It is reset in the configure() call, when
the camera is started.
This will allow frameCount to be used by other algorithms, without
having to keep multiple private frame counters.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The IPARkISP1Interface currently uses event-type-based structures in
order to communicate with the pipeline handler (and vice versa).
Replace the event-based structures with dedicated functions associated
with each operation.
The translated naming scheme of actions to signals and events to
dedicated functions is:
ActionV4L2Set => setSensorControls
ActionParamFilled => paramsBufferReady
ActionMetadata => metadataReady
EventSignalStatBuffer => processStatsBuffer()
EventQueueRequest => queueRequest()
The name of IPARkISP1::metadataReady() will now conflict with the
metadataReady Signal introduced in this patch as part of the interface
change. Hence, rename IPARkISP1::metadataReady() to
IPARkISP1::prepareReady() to prevent the conflict.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The private members for the exposure and gain limits can be dropped
from IPARkISP1, since they are not used class-wide and can easily be
replaced by local counterparts.
Should they need to be widely available later, they should then sit in
IPASessionConfiguration.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The Sony IMX296 exists in two versions, colour (Bayer) and monochrome.
The tuning file corresponds to the monochrome version.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
|
|
The Sony IMX296 sensor has an exponential gain model, and adds a fixed
14.26µs offset to the exposure time expressed in line units.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Some sensors (namely the Sony IMX296, whose support will be added
shortly) require different conversion formulas between exposure time and
exposure lines. Make the Exposure() and ExposureLines() functions
virtual to allow this.
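A hypothetical sketch of what such virtual conversion hooks could look
like, with an IMX296-style override adding the fixed exposure offset;
names and signatures are illustrative only, not the actual CamHelper
API:

    #include <cstdint>

    class CamHelperSketch
    {
    public:
        virtual ~CamHelperSketch() = default;

        /* Default conversions assume exposure = lines * line length. */
        virtual double exposureUs(uint32_t lines, double lineLengthUs) const
        {
            return lines * lineLengthUs;
        }

        virtual uint32_t exposureLines(double exposureUs, double lineLengthUs) const
        {
            return static_cast<uint32_t>(exposureUs / lineLengthUs);
        }
    };

    class CamHelperImx296Sketch : public CamHelperSketch
    {
        static constexpr double offsetUs = 14.26;

    public:
        double exposureUs(uint32_t lines, double lineLengthUs) const override
        {
            return lines * lineLengthUs + offsetUs;
        }

        uint32_t exposureLines(double exposureUs, double lineLengthUs) const override
        {
            return static_cast<uint32_t>((exposureUs - offsetUs) / lineLengthUs);
        }
    };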
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AwbAuto mode is defined in all the JSON tuning files as "auto",
not "normal". The effect of this was that you couldn't switch back to
"auto" mode once you had switched away.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The configure() function has a local configuration variable referencing
context.configuration for the purpose of shortening lines. Use it
instead of context.configuration in the remaining locations, and
constify it while at it as the configuration isn't meant to be modified.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The frame count is used to skip the gain and exposure filtering when
starting. It thus needs to be reset when configuring the algorithm, to
avoid slower convergence when stopping and restarting.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The datasheet for the OV2740 gives 0x80 as 1x gain, so real gain
is GainCode / 128.
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Pick up the focus value from the AF algorithm and send lens controls
along in the frame context.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that we have added lens controls, rename the existing member of the
class to clarify that it relates to the sensor's controls.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add further members to the IPU3 IPA interface that will hold lens
controls passed in by configInfo.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Since the VCM of the Surface Go 2 (dw9719) can now be successfully
driven, this AF module can be used to control the VCM and determine the
focus value based on the IPU3 AF statistics.
Based on the values from the IPU3 AF buffer, the variance of each focus
step is determined, and a greedy approach is used to find the maximum
variance and an appropriate focus value.
The grid configuration is implemented as a context. Also, the grid
parameter AF_MIN_BLOCK_WIDTH is set to 4 (the default is 3) because,
with the default value, x_start ends up greater than 640 and the grid is
placed at an incorrect location in the image (the rightmost part of the
sensor).
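A simplified sketch of the search, not the actual algorithm: compute the
variance of the AF filter values at the current lens position, and keep
stepping the lens while the variance keeps increasing; names are
illustrative.

    #include <cstdint>
    #include <vector>

    double afVariance(const std::vector<uint32_t> &cells)
    {
        double mean = 0.0;
        for (uint32_t v : cells)
            mean += v;
        mean /= cells.size();

        double variance = 0.0;
        for (uint32_t v : cells)
            variance += (v - mean) * (v - mean);
        return variance / cells.size();
    }

    struct AfGreedySearch {
        double bestVariance = 0.0;
        uint32_t bestStep = 0;
        uint32_t currentStep = 0;

        /* Returns true once the variance drops, i.e. focus has been found. */
        bool process(const std::vector<uint32_t> &cells)
        {
            double variance = afVariance(cells);
            if (variance > bestVariance) {
                bestVariance = variance;
                bestStep = currentStep;
                currentStep++;    /* keep scanning towards higher variance */
                return false;
            }
            return true;          /* bestStep holds the chosen focus value */
        }
    };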
Signed-off-by: Kate Hsuan <hpa@redhat.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The gain values are coded as u3.13 fixed-point values, i.e. they cannot
be more than 8. Clamp the values in order to avoid any out-of-range
value which could make the IPU3 behave in a weird manner.
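Illustrative only (not the actual AWB code): encoding a gain as u3.13,
i.e. 3 integer and 13 fractional bits, clamped to the representable
range so it can never reach 8.

    #include <algorithm>
    #include <cmath>
    #include <cstdint>

    uint16_t gainToU3_13(double gain)
    {
        /* Largest representable value is 65535 / 8192 ~= 7.99988. */
        double clamped = std::clamp(gain, 0.0, 65535.0 / 8192.0);
        return static_cast<uint16_t>(std::lround(clamped * 8192.0));
    }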
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Instead of having a local cached value for the line duration, store it
in the IPASessionConfiguration::sensor structure.
While at it, configure the default analogue gain and shutter speed to
controlled, fixed values.
The latter is set to 10ms as it will in most cases be close to the
required value, making the AGC converge faster.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The code that stores the effective sensor values during the
EventStatReady event uses overly long lines. Fix it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
When the current exposure value is calculated, it is cached and used by
filterExposure(). Use private filteredExposure_ and pass currentExposure
as a parameter.
In order to limit the use of filteredExposure_, return the value from
filterExposure().
While at it, remove a stale comment.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
With the controller algorithms running at 60fps, there are some dropped
frames when running at very high framerates. Reducing this to 30fps
eliminates all these drops without any noticeable change to the image
quality.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We now handle disabling ("pausing") AWB in the same way as AEC/AGC.
Instead of letting the pause flag be set so that the code never runs at
all, we instead fix the manual settings to the current values (but the
algorithm continues to be called).
The algorithm does not restart any calculations in this state, but
continues to add AWB metadata to every frame. Therefore other algorithms
that need this information (CCM and ALSC, for example) can still find
it.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Increase the maximum allowable gain from 6.0 to 8.0 in the normal and short
exposure profiles for all camera sensors. Increase this limit to 12.0 for the
long exposure profiles for sensors where this has been defined.
The 6.0x value was somewhat arbitrarily chosen, and does limit the total
exposure in dark conditions.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Limit the gain code to the maximum value reported by the sensor controls
before sending it to DelayedControls. The AGC algorithm will handle a
lower gain being used by the sensor, provided it knows the actual gain
used. This change ensures that DelayedControls will never report an
unclipped gain as having been used.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Use the new utils::abs_diff() function where appropriate to replace
manual implementations.
While at it, fix a header ordering issue in
src/libcamera/pipeline/raspberrypi/raspberrypi.cpp.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
For consistency with UniqueFD, rename the fd() function to get().
Renaming UniqueFD::get() to fd() would have been another option, but was
rejected to keep as close as possible to the std::shared_ptr<> and
std::unique_ptr<> APIs.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Now that we have a UniqueFD class, the name FileDescriptor is ambiguous.
Rename it to SharedFD.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The FileDescriptor class is a generic helper that matches the criteria
for the base library. Move it there.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The non-static class member "numCells_" is not initialized in the
constructor. Fix it.
Reported-by: Coverity CID 365801
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Report the value of the sensor's gain pertaining to the current request
completion, as that is the gain value the request completed with.
The Agc algorithm computes the gain value for the next incoming request,
hence that value should not be reported in the current request's
metadata.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When a new parameter buffer needs to be queued, we need to specify
which algorithms are enabled in the ISP. Add a simple prepare function
in AGC for that, which may later evolve to take exposure locking into
account. For that function to be called, we also need to add a loop over
the algorithms in IPARkISP1::queueRequest.
We no longer disable the AE algorithm based on controls::AeEnable, which
will be handled in a different manner later.
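A minimal, self-contained sketch of that pattern with invented types;
the real IPARkISP1 and Algorithm signatures differ.

    #include <memory>
    #include <vector>

    struct IPAContext {
        bool aeEnabled = false;
    };

    struct Algorithm {
        virtual ~Algorithm() = default;
        virtual void queueRequest(IPAContext &context) {}
        virtual void prepare(IPAContext &context) {}
    };

    struct Agc : Algorithm {
        void prepare(IPAContext &context) override
        {
            /* Enable the hardware AE block when filling a params buffer. */
            context.aeEnabled = true;
        }
    };

    int main()
    {
        std::vector<std::unique_ptr<Algorithm>> algorithms;
        algorithms.push_back(std::make_unique<Agc>());

        IPAContext context;

        /* queueRequest(): give every algorithm a look at the request. */
        for (auto &algo : algorithms)
            algo->queueRequest(context);

        /* When filling a params buffer, let each algorithm configure the ISP. */
        for (auto &algo : algorithms)
            algo->prepare(context);
    }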
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that we have IPAContext and Algorithm, we can implement a simple AGC
based on the IPU3 one. It is very similar, except that there is no
histogram used for an inter-quantile mean. The RkISP1 returns a 5x5
array (for V10) of luminance means. Estimating the relative luminance is
thus a simple mean of all the blocks already calculated by the ISP.
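A small sketch of that estimate, assuming a grid of 8-bit per-cell
luminance means; names are illustrative.

    #include <cstdint>
    #include <vector>

    double estimateRelativeLuminance(const std::vector<uint8_t> &cells)
    {
        double sum = 0.0;
        for (uint8_t mean : cells)
            sum += mean;

        /* Average of the per-cell means, normalised to [0, 1]. */
        return sum / cells.size() / 255.0;
    }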
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The ISP can use 25 or 81 cells depending on its revision. Remove the
cached value in IPARkISP1 and use IPASessionConfiguration to store it
and pass it to AGC later.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The pipeline handler populates a new sensorControls ControlList, to
hold the effective exposure and gain values for the current frame. This
is done when a statistics buffer is received.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that libipa offers a templated class for Algorithm, use it in
RkISP1.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|