|
Currently we have a single IPAFrameContext structure, but subsequently
we shall have a ring buffer (or similar container) holding an
IPAFrameContext for each frame.
It would be a hassle for the IPA to look up the frame context required
for each process() call once the contexts reside in a ring buffer.
Hence, prepare the libipa process() template to accept a particular
IPAFrameContext early on.
For this patch, pass the pointer as nullptr, so that the changes
compile and keep working as-is.
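A minimal sketch of the adjusted template, assuming the frame context
is passed as a pointer (type names are only illustrative of the libipa
layout):

    /* libipa Algorithm interface, sketched; Module supplies the IPA
     * specific context and statistics types. */
    template<typename Module>
    class Algorithm
    {
    public:
            virtual ~Algorithm() = default;

            /* Callers pass nullptr for now; a ring buffer of per-frame
             * contexts will provide real instances later. */
            virtual void process(typename Module::Context &context,
                                 typename Module::FrameContext *frameContext,
                                 const typename Module::Stats *stats) = 0;
    };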
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Currently, IPAFrameContext consolidates the values computed by the
active state of the algorithms, along with the values applied on
the sensor.
Moving ahead, we want to have a frame context associated with each
incoming request (or frame to be captured). This shouldn't necessarily
be tied to the "active state" of the algorithms, hence:
- Rename the current IPAFrameContext -> IPAActiveState
This now reflects the latest active state of the algorithms and has
nothing to do with any frame-related operations or values.
- Re-instate IPAFrameContext with a sub-structure 'sensor' currently
storing the exposure and gain values.
Adapt the various accesses to the frame context to the changes
described above.
Subsequently, the re-instated IPAFrameContext will be extended to
contain a frame number and a ControlList to remember the incoming
request controls provided by the application. A ring buffer will be
introduced to store these frame contexts for a certain number of
frames.
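A rough sketch of the resulting split, with member names assumed from
the description above:

    #include <cstdint>

    /* Latest values computed by the algorithms; not tied to any frame. */
    struct IPAActiveState {
            /* ... per-algorithm state (AGC, AWB, ...) ... */
    };

    /* Per-frame context. It will later gain a frame number and the
     * request controls, and be stored in a ring buffer. */
    struct IPAFrameContext {
            struct {
                    uint32_t exposure;      /* Exposure in lines */
                    double gain;            /* Analogue gain */
            } sensor;
    };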
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The OV5675 is an OmniVision sensor with a linear gain model, expressed
in 1/128 steps.
Cc: Quentin Schulz <foss+libcamera@0leil.net>
Signed-off-by: Quentin Schulz <quentin.schulz@theobroma-systems.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
These changes retrieve the correct sensitivity value for the mode
selected for the sensor. This value is known to the CamHelper, which
passes it to the pipeline handler so that it can be set correctly in
the camera properties.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The message that prints all controls in a request in
IPARPi::queueRequest() is noisy. Demote it from Info to Debug.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
|
|
An uninitialised frame-ignore counter (ignoreCounter_) prevents the AF
function from working, since the counter may start at a random negative
number. Set the counter to kIgnoreFrame when AF is in the prepare
stage.
Signed-off-by: Kate Hsuan <hpa@redhat.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Simplify the namespacing of the RPi components by placing them directly
in the ipa::RPi namespace. This also aligns the RPi IPA with the other
IPAs (ipa::ipu3 and ipa::rkisp1), which already have this applied.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Synchronise the names of the operations related to the parameters
buffer with the names used in other IPA interfaces.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
It is preferred that the interface definition represent the logical
order in which the operations will be called.
This patch has no functional changes.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Simplify the accumulation of the total and variance with a ternary
operator.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Kate Hsuan <hpa@redhat.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Convert the y_table_item_t data to a Span and use it for iteration when
estimating the variance of the table.
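A sketch of the Span-based iteration; the field names are only
illustrative of the IPU3 AF Y-table layout:

    #include <cstdint>

    #include <libcamera/base/span.h>

    struct y_table_item_t {
            uint16_t y1_avg;
            uint16_t y2_avg;
    };

    double afEstimateVariance(libcamera::Span<const y_table_item_t> yItems,
                              bool isY1)
    {
            double total = 0.0;
            for (const y_table_item_t &item : yItems)
                    total += isY1 ? item.y1_avg : item.y2_avg;
            const double mean = total / yItems.size();

            double var = 0.0;
            for (const y_table_item_t &item : yItems) {
                    const double diff = (isY1 ? item.y1_avg : item.y2_avg) - mean;
                    var += diff * diff;
            }

            return var / yItems.size();
    }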
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Kate Hsuan <hpa@redhat.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The AF statistics can be accessed directly from the mapped buffer.
Remove the redundant memcpy, and simplify the call to
afEstimateVariance().
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Kate Hsuan <hpa@redhat.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use our geometry classes Rectangle, Size and Point to identify the
region of interest for the autofocus, and center it on the BDS output.
This will facilitate custom ROIs being passed in through controls at a
later time.
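A sketch of the centring using the geometry classes; the ROI size here
is an arbitrary example:

    #include <libcamera/geometry.h>

    using namespace libcamera;

    /* Centre a fixed-size AF region of interest on the BDS output. */
    Rectangle centredAfRoi(const Size &bdsOutput)
    {
            const Size roiSize(200, 200);
            const int x = (bdsOutput.width - roiSize.width) / 2;
            const int y = (bdsOutput.height - roiSize.height) / 2;
            return Rectangle(x, y, roiSize);
    }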
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Kate Hsuan <hpa@redhat.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
A selection of constants is imported from ChromiumOS.
Move these out of the header and simplify their documentation.
Furthermore, add a direct reference to the location they were obtained
from.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Kate Hsuan <hpa@redhat.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Enforce that the selection of the block_{width,height}_log2 parameters
stays within the capabilities of the hardware.
While this selection is currently hardcoded to the minimum, providing
the restriction now allows for dynamic sizing in the future and
documents the restrictions directly in code, making use of the already
existing constants.
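A minimal sketch of the enforcement; the limit constants below are
hypothetical stand-ins for the existing hardware definitions:

    #include <algorithm>
    #include <cstdint>

    /* Hypothetical hardware limits on the grid block width (log2). */
    constexpr uint32_t kMinBlockWidthLog2 = 3;
    constexpr uint32_t kMaxBlockWidthLog2 = 6;

    uint32_t selectBlockWidthLog2(uint32_t requested)
    {
            /* Restrict the selection to what the hardware supports; the
             * current code always requests the minimum. */
            return std::clamp(requested, kMinBlockWidthLog2, kMaxBlockWidthLog2);
    }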
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Since we have moved away from switch/case on the operation ID,
there's little reason to split the operation in two functions.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Since we have moved away from switch/case on the operation ID,
there's little reason to split the operation in two functions.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The IPAIPU3 interface currently uses event-type based structures in
order to communicate with the pipeline handler (and vice versa).
Replace the event-based structures with dedicated functions associated
with each operation.
The translated naming scheme of actions to signals is:
ActionSetSensorControls => setSensorControls
ActionParamFilled => paramsBufferReady
ActionMetadataReady => metadataReady
The translated naming scheme of events to dedicated functions is:
EventProcessControls => queueRequest()
EventStatReady => processStatsBuffer()
EventFillParams => fillParamsBuffer()
The dedicated functions are called by the pipeline handler on the IPA
using IPC. These functions run asynchronously and, when completed, the
IPA emits the respective signals as stated above in the translated
naming scheme.
EventProcessControls is translated to queueRequest() to bring the IPU3
interface into symmetry with the other IPA interfaces.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Enable the V4L2VideoDevice dequeue timeout for the Unicam Image node, and
connect the timeout signal to a slot in the pipeline handler. This slot will
log an error message informing the user of a possible hardware stall.
The timeout is calculated as 2x the maximum frame length possible for a given
mode, returned by the IPA.
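A sketch of the timeout computation; the helper name and the way the
maximum frame length reaches the pipeline handler are assumptions:

    #include <libcamera/base/utils.h>

    /* Allow twice the longest possible frame for the current mode before
     * flagging a potential hardware stall. maxFrameLength is assumed to
     * be reported by the IPA for the selected sensor mode. */
    libcamera::utils::Duration
    dequeueTimeout(libcamera::utils::Duration maxFrameLength)
    {
            return 2 * maxFrameLength;
    }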
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Under the right circumstances, the alsc calculations could spread the
colour errors across the entire image as lambda remains unbounded. This
would cause the corrected image chroma values to slowly drift to
incorrect values.
This change adds a config parameter (alsc.lambda_bound) that provides an upper
and lower bound to the lambda value at every stage of the calculation. With this
change, we now adjust the lambda values so that the average across the entire
grid is 1 instead of normalising to the minimum value.
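A rough sketch of the normalisation and bounding step, assuming a flat
table of lambda values and a single bound parameter:

    #include <algorithm>
    #include <numeric>
    #include <vector>

    /* Rescale the lambdas so their average is 1, then clamp each value
     * to [1 - bound, 1 + bound] (bound comes from alsc.lambda_bound). */
    void normaliseLambdas(std::vector<double> &lambdas, double bound)
    {
            const double avg = std::accumulate(lambdas.begin(), lambdas.end(),
                                               0.0) / lambdas.size();

            for (double &lambda : lambdas)
                    lambda = std::clamp(lambda / avg, 1.0 - bound, 1.0 + bound);
    }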
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Synchronise with other IPA interfaces (e.g. IPU3, RkISP1) that use
queueRequest() to pass the request controls to the IPA.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Queuing a request (i.e. passing its controls to the IPA) and filling
the parameters buffer are two separate operations. Treat them as such
by splitting them into two functions in the rkisp1 IPA interface.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The OV5640 is an OmniVision sensor with a linear gain model, expressed
in 1/16 steps.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The IMX296 is a Sony sensor that expresses its gain in 0.1dB units. It
thus maps to the exponential gain model.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The IMX290 is a Sony sensor that expresses its gain in 0.3dB units. It
thus maps to the exponential gain model.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The CameraSensorHelper specifies two gain models, linear and
exponential. They are modelled after the MIPI CCS specification. Only
the linear model has been implemented; the exponential model was left
for later.
We now need to support sensors that configure their gain in a hardware
register with a value expressed in dB. This has similarities with the
MIPI CCS exponential gain model, but it only has an exponential factor,
while CCS also allows sensors to support a configurable linear factor.
The full CCS exponential model needs two values (for the linear and
exponential factors) to express a gain, while IPAs use a single linear
gain value internally. However, the exponential gain model example in
the CCS specification has a fixed linear factor, which may indicate that
it could be common for sensors that implement the exponential gain model
to only use the exponential factor. For this reason, implement the
exponential gain model with a fixed linear factor, but with a
sensor-specific coefficient for the exponential factor that allows
expressing the gain in dB (or other logarithmical units) instead of
limiting it to powers of 2 as in the MIPI CCS specification.
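In other words, with a fixed linear factor a and a sensor-specific
exponential coefficient m, the model reduces to gain = a * exp(m * code);
a sensor with a step of S dB per code unit uses m = ln(10) * S / 20, so
that gain = 10^(code * S / 20). A sketch (names illustrative):

    #include <cmath>
    #include <cstdint>

    /* Exponential gain model: gain = a * exp(m * code). */
    double gainFromCode(double a, double m, uint32_t code)
    {
            return a * std::exp(m * code);
    }

    uint32_t codeFromGain(double a, double m, double gain)
    {
            return static_cast<uint32_t>(std::round(std::log(gain / a) / m));
    }

    /* Coefficient for a sensor expressing its gain in steps of stepDb dB,
     * e.g. 0.1 dB (IMX296) or 0.3 dB (IMX290), with a = 1. */
    double expCoefficientFromDb(double stepDb)
    {
            return std::log(10.0) * stepDb / 20.0;
    }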
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
To prepare for other gain models than the linear model, store the gain
constants in a union with per-model members. Due to the lack of
designated initializer support in gcc with C++17, initializing a single
complex structure that includes a union will be difficult. Split the
gain model type into a separate variable to work around this issue.
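A sketch of the resulting layout, with the model type held in a
separate member next to the union (names assumed):

    #include <cstdint>

    struct AnalogueGainLinearConstants {
            int16_t m0, c0, m1, c1;
    };

    struct AnalogueGainExpConstants {
            double a, m;
    };

    union AnalogueGainConstants {
            AnalogueGainLinearConstants linear;
            AnalogueGainExpConstants exp;
    };

    enum AnalogueGainType {
            AnalogueGainLinear,
            AnalogueGainExponential,
    };

    class CameraSensorHelperSketch
    {
    protected:
            /* Kept as two members so each helper can initialize the union
             * member separately, without C++17 designated initializers. */
            AnalogueGainType gainType_;
            AnalogueGainConstants gainConstants_;
    };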
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Add a validation check for the sensor controls,
validateSensorControls(), before they are queried in
IPAIPU3::updateSessionConfiguration().
Fail IPAIPU3::configure() if the required sensor controls are not
found.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Drop the exposure and gain private members from IPAIPU3, because the
values are handled directly via IPAFrameContext.
Move the default vblank value from IPAIPU3 to the
IPASessionConfiguration structure, as it is a static default value not
expected to change during a session.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The exposure and gain limits are required for the AGC configuration
handled in IPAIPU3::updateSessionConfiguration(), which already
happens. The max/min private members for exposure/gain in the IPAIPU3
class therefore serve no purpose except setting the initial values of
the exposure_ and gain_ members.
Drop the max/min private members from the IPAIPU3 class and set the
initial gain_ and exposure_ in IPAIPU3::updateSessionConfiguration().
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The RkISP1 ISP calculates a mean value for Y, Cr and Cb for each frame.
There is an RGB mode which could theoretically give us the values for
R, G and B directly, but it seems to be failing right now.
Convert those values into R, G and B and estimate the gains to apply
using a grey-world assumption.
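A sketch of the conversion and grey-world estimate, assuming BT.601
style coefficients and 8-bit means centred on 128:

    #include <algorithm>

    struct RgbMeans {
            double r, g, b;
    };

    RgbMeans meansFromYCbCr(double y, double cb, double cr)
    {
            return {
                    y + 1.402 * (cr - 128.0),
                    y - 0.344 * (cb - 128.0) - 0.714 * (cr - 128.0),
                    y + 1.772 * (cb - 128.0),
            };
    }

    /* Grey world: scale R and B so their means match the green mean. */
    void greyWorldGains(const RgbMeans &means, double &gainR, double &gainB)
    {
            gainR = means.g / std::max(means.r, 1.0);
            gainB = means.g / std::max(means.b, 1.0);
    }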
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Peter Griffin <peter.griffin@linaro.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
As with the IPU3, we can estimate the luminance histogram. The RkISP1
can produce several histograms: R, G and B individually, Y only, and a
combination of RGB. The one we are interested in for AGC is the Y
histogram.
Use the hardware revision to determine the number of bins of the
produced histogram.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Peter Griffin <peter.griffin@linaro.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The Histogram constructor does not modify the data. Pass it a
Span<const uint32_t> instead of a Span<uint32_t>.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In order to have the proper pixel levels, apply a fixed black level
correction based on the imx219 tuning file in RPi. The value is 4096 on
16 bits; as the RkISP1 pipeline operates on 12 bits, scale it
accordingly.
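The scaling is a plain bit-depth shift, as sketched below:

    #include <cstdint>

    /* Black level from the imx219 tuning file, expressed on 16 bits. */
    constexpr uint32_t kBlackLevel16 = 4096;

    /* The RkISP1 pipeline runs on 12 bits: 4096 >> (16 - 12) = 256. */
    constexpr uint32_t kBlackLevel12 = kBlackLevel16 >> (16 - 12);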
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Peter Griffin <peter.griffin@linaro.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Introduce a frameCount variable in the IPAFrameContext which is
incremented each time a request is queued. It is reset in the
configure() call, when the camera is started.
This will allow the frameCount to be used by other algorithms, without
having to keep multiple private frame counters.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The IPARkISP1Interface currently uses event-type based structures in
order to communicate with the pipeline handler (and vice versa).
Replace the event-based structures with dedicated functions associated
with each operation.
The translated naming scheme of operations to dedicated functions:
ActionV4L2Set => setSensorControls
ActionParamFilled => paramsBufferReady
ActionMetadata => metadataReady
EventSignalStatBuffer => processStatsBuffer()
EventQueueRequest => queueRequest()
The name IPARkISP1::metadataReady() would now conflict with the
metadataReady Signal introduced in this patch as part of the interface
change. Hence, rename IPARkISP1::metadataReady() to
IPARkISP1::prepareReady() to prevent the conflict.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The private members for the exposure and gain limits can be dropped
from IPARkISP1 since they are not used class-wide and can easily be
replaced by local counterparts.
If they are later required to be widely available, they should sit in
IPASessionConfiguration instead.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The Sony IMX296 exists in two versions, colour (Bayer) and monochrome.
The tuning file corresponds to the monochrome version.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
|
|
The Sony IMX296 sensor has an exponential gain model, and adds a fixed
14.26µs offset to the exposure time expressed in line units.
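A sketch of the resulting conversion (helper names are illustrative,
not the CamHelper API):

    #include <chrono>
    #include <cmath>
    #include <cstdint>

    using Micro = std::chrono::duration<double, std::micro>;

    /* Fixed offset added to the line-based exposure time. */
    constexpr Micro kImx296Offset(14.26);

    Micro exposure(uint32_t lines, Micro lineLength)
    {
            return lines * lineLength + kImx296Offset;
    }

    uint32_t exposureLines(Micro exposure, Micro lineLength)
    {
            const double lines = (exposure - kImx296Offset) / lineLength;
            return lines > 0.0 ? static_cast<uint32_t>(std::round(lines)) : 0;
    }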
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Some sensors (namely the Sony IMX296, whose support will be added
shortly) require different conversion formulas between exposure time and
exposure lines. Make the Exposure() and ExposureLines() functions
virtual to allow this.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AwbAuto mode is defined in all the JSON tuning files as "auto",
not "normal". The effect of this was that you couldn't switch back to
"auto" mode once you had switched away.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The configure() function has a local configuration variable referencing
context.configuration for the purpose of shortening lines. Use it
instead of context.configuration in the remaining locations, and
constify it while at it as the configuration isn't meant to be modified.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The frame count is used to skip the gain and exposure filtering when
starting. It thus needs to be reset when configuring the algorithm, to
avoid slower convergence when stopping and restarting.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The datasheet for the OV2740 gives 0x80 as 1x gain, so real gain
is GainCode / 128.
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Pick up the focus value from the AF algorithm and send lens controls
along in the frame context.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that we have added lens controls, rename the existing member of the
class to clarify that it relates to the sensor's controls.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add further members to the IPU3 IPA interface that will hold the lens
controls passed in by configInfo.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Since the VCM of the Surface Go 2 (dw9719) can now be successfully
driven, this AF module can be used to control the VCM and determine the
focus value based on the IPU3 AF state.
Based on the values from the IPU3 AF buffer, the variance of each focus
step is determined and a greedy approach is used to find the maximum
variance of the AF state and an appropriate focus value.
The grid configuration is implemented as part of the context. Also, the
grid parameter AF_MIN_BLOCK_WIDTH is set to 4 (default is 3), since
with the default value x_start (x_start > 640) would end up at an
incorrect location in the image (the rightmost part of the sensor).
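A simplified sketch of the greedy search; the variance callback is a
hypothetical stand-in for moving the VCM and reading the AF statistics:

    #include <cstdint>
    #include <functional>

    /* Walk the focus range and keep the step with the highest variance,
     * stopping once the variance starts to fall (past the peak). */
    uint32_t greedyFocusSearch(uint32_t maxStep,
                               const std::function<double(uint32_t)> &varianceAt)
    {
            double bestVariance = 0.0;
            uint32_t bestStep = 0;

            for (uint32_t step = 0; step <= maxStep; step++) {
                    const double variance = varianceAt(step);
                    if (variance < bestVariance)
                            break;

                    bestVariance = variance;
                    bestStep = step;
            }

            return bestStep;
    }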
Signed-off-by: Kate Hsuan <hpa@redhat.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The gain values are coded as u3.13 fixed point values, i.e. they cannot
exceed 8. Clamp the values in order to avoid any out-of-range value
which could make the IPU3 behave in an unexpected manner.
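A u3.13 value has 3 integer and 13 fractional bits, so it covers
[0, 8); a sketch of the clamp and encoding:

    #include <algorithm>
    #include <cmath>
    #include <cstdint>

    /* Clamp a gain to the u3.13 range and encode it (gain * 2^13). */
    uint16_t gainToU3_13(double gain)
    {
            gain = std::clamp(gain, 0.0, 65535.0 / 8192.0);
            return static_cast<uint16_t>(std::lround(gain * 8192.0));
    }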
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Instead of having a local cached value for the line duration, store it
in the IPASessionConfiguration::sensor structure.
While at it, configure the default analogue gain and shutter speed to
controlled, fixed values. The latter is set to 10ms, as it will in most
cases be close to the value needed, making the AGC converge faster.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|