|
Drop the exposure and gain private members from IPAIPU3, as the values
are now handled directly through IPAFrameContext.
Move the default vblank value from IPAIPU3 to the IPASessionConfiguration
structure, as it is a static default that is not expected to change for
a session.
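A minimal sketch of the intended split, with struct and field names
assumed for illustration rather than taken verbatim from the tree:

  #include <cstdint>

  /* Static, per-session values live in the session configuration;
   * values recomputed for every frame live in the frame context. */
  struct IPASessionConfiguration {
          struct {
                  uint32_t defVBlank;     /* default vblank, static */
          } sensor;
  };

  struct IPAFrameContext {
          struct {
                  uint32_t exposure;      /* exposure in lines, per frame */
                  double gain;            /* analogue gain, per frame */
          } sensor;
  };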
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Since the VCM of the Surface Go 2 (dw9719) can now be successfully
driven, this AF module can be used to control the VCM and determine the
focus value based on the IPU3 AF statistics.
Based on the values from the IPU3 AF buffer, the variance of each focus
step is determined, and a greedy approach is used to find the focus
value that maximises the variance of the AF state.
The grid configuration is implemented as part of the context. The grid
parameter AF_MIN_BLOCK_WIDTH is also set to 4 (the default is 3),
because with the default value x_start (x_start > 640) would end up at
an incorrect location in the image (the rightmost part of the sensor).
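A rough sketch of the approach, with helper and constant names that are
purely illustrative (the real module reads the AF filter responses from
the IPU3 statistics buffer):

  #include <cstdint>
  #include <vector>

  /* Estimate the contrast at one focus step as the variance of the
   * per-cell AF filter responses. */
  static double afEstimateVariance(const std::vector<uint32_t> &values)
  {
          double mean = 0.0;
          for (uint32_t v : values)
                  mean += v;
          mean /= values.size();

          double var = 0.0;
          for (uint32_t v : values)
                  var += (v - mean) * (v - mean);

          return var / values.size();
  }

  /* Greedy search: keep stepping the VCM while the variance increases,
   * and fall back to the best position once it starts to drop. */
  static constexpr uint32_t kFocusStep = 16;      /* illustrative step */

  static void afGreedyStep(double variance, double &maxVariance,
                           uint32_t &focus, uint32_t &bestFocus,
                           bool &stable)
  {
          if (variance > maxVariance) {
                  maxVariance = variance;
                  bestFocus = focus;
                  focus += kFocusStep;
          } else {
                  focus = bestFocus;
                  stable = true;
          }
  }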
Signed-off-by: Kate Hsuan <hpa@redhat.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Instead of keeping a locally cached value for the line duration, store
it in the IPASessionConfiguration::sensor structure.
While at it, set the default analogue gain and shutter speed to fixed,
controlled values. The shutter speed is set to 10ms, as this will in
most cases be close to the value needed, helping the AGC converge
faster.
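For illustration, roughly what the configure-time defaults look like;
the names, the duration type and the neutral gain value are assumptions:

  #include <chrono>

  using namespace std::chrono_literals;

  struct SensorConfiguration {
          std::chrono::duration<double, std::micro> lineDuration;
          std::chrono::duration<double, std::micro> defaultShutterSpeed;
          double defaultAnalogueGain;
  };

  void setSensorDefaults(SensorConfiguration &sensor,
                         std::chrono::duration<double, std::micro> lineDuration)
  {
          sensor.lineDuration = lineDuration;
          /* 10ms is close to what most scenes need, so the AGC starts
           * near its target and converges faster. */
          sensor.defaultShutterSpeed = 10ms;
          sensor.defaultAnalogueGain = 1.0;
  }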
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Remove the verbose #ifndef/#define/#endif pattern for maintaining
header idempotency, and replace it with a simple #pragma once.
This simplifies the headers, and prevents redundant changes when
header files get moved.
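For illustration, the pattern being replaced versus its replacement (the
guard name here is hypothetical):

  /* Before: a guard macro that must stay unique and track the file name. */
  #ifndef __LIBCAMERA_EXAMPLE_H__
  #define __LIBCAMERA_EXAMPLE_H__
  /* ... declarations ... */
  #endif /* __LIBCAMERA_EXAMPLE_H__ */

  /* After: nothing to rename when the header is moved. */
  #pragma once
  /* ... declarations ... */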
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The AWB algorithm estimates the colour temperature, but the value is
currently not used at all. It can be useful at least for debugging, and
later for lux estimation, to know the temperature estimated for a given
frame.
Add a new member to IPAFrameContext::awb for this purpose, and update
the value in AWB.
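Roughly, with the member name being an assumption:

  struct IPAFrameContext {
          struct {
                  /* Colour temperature (in Kelvin) estimated by AWB for
                   * this frame, kept for debugging and future lux
                   * estimation. */
                  double temperatureK;
                  /* ... existing AWB gains ... */
          } awb;
  };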
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The pipeline handler populates the new sensorControls ControlList to
provide the effective exposure and gain values for the current frame.
This is done when a statistics buffer is received.
Make those values the frameContext.sensor values for the frame when the
EventStatReady event is received.
AGC also needs to use frameContext.sensor as its input values and
frameContext.agc as its output values. Modify computeExposure() by
passing it the frameContext instead of individual exposure and gain
values.
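A sketch of the flow when the statistics buffer arrives; the control IDs
and the gain conversion through CameraSensorHelper (from libipa) are
assumptions:

  #include <linux/v4l2-controls.h>

  #include <libcamera/controls.h>

  /* Record the effective sensor settings for this frame. AGC then
   * reads frameContext.sensor and writes frameContext.agc, with
   * computeExposure() taking the frame context instead of raw values. */
  void fillSensorFrameContext(IPAFrameContext &frameContext,
                              const libcamera::ControlList &sensorControls,
                              const libcamera::ipa::CameraSensorHelper &helper)
  {
          frameContext.sensor.exposure =
                  sensorControls.get(V4L2_CID_EXPOSURE).get<int32_t>();
          frameContext.sensor.gain =
                  helper.gain(sensorControls.get(V4L2_CID_ANALOGUE_GAIN)
                                      .get<int32_t>());
  }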
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The ipa_context.h entry incorrectly referenced its file name.
Fix it.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The tone mapping algorithm calculates the gamma curve for every frame,
regardless of whether the gamma value has changed. The issue is
exacerbated by the fact that we currently hardcode the gamma to a single
value.
Optimise the implementation to only recalculate the look-up table when
the gamma setting changes, and store the gamma setting of the LUT curve
as part of the IPA context.
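A sketch of the optimisation, with the constant and member names being
illustrative:

  #include <array>
  #include <cmath>

  static constexpr double kDefaultGamma = 1.1;
  static constexpr unsigned int kLutSize = 256;   /* size assumed */

  /* Rebuild the gamma LUT only when the gamma value differs from the
   * one the current LUT was built for, which is remembered in the
   * frame context. */
  void updateGammaLut(IPAContext &context, std::array<double, kLutSize> &lut)
  {
          double gamma = kDefaultGamma;   /* hardcoded for now */

          if (context.frameContext.toneMapping.gamma == gamma)
                  return;

          for (unsigned int i = 0; i < kLutSize; i++)
                  lut[i] = std::pow(i / static_cast<double>(kLutSize - 1),
                                    1.0 / gamma);

          context.frameContext.toneMapping.gamma = gamma;
  }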
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AGC frame context needs to be initialised correctly for the first
iteration. Until now, the IPA has used the minimum exposure and gain
values and cached them in local variables.
In order to give the sensor limits to AGC, create a new structure in
IPASessionConfiguration. Store the exposure as a time (rather than in
lines) and the analogue gain after CameraSensorHelper conversion.
Set the gain and exposure appropriately to the current values known to
the IPA, and remove the setting of exposure and gain in IPAIPU3, as
those are now fully controlled by IPU3Agc.
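Roughly what the configure-time conversion looks like, with the
structure and member names assumed:

  /* Convert the sensor limits once at configure() time: exposure
   * limits from lines to a duration using the line duration, and gain
   * limits from gain codes to real gains through CameraSensorHelper. */
  context_.configuration.agc.minShutterSpeed =
          minExposureLines * context_.configuration.sensor.lineDuration;
  context_.configuration.agc.maxShutterSpeed =
          maxExposureLines * context_.configuration.sensor.lineDuration;
  context_.configuration.agc.minAnalogueGain = camHelper_->gain(minGainCode);
  context_.configuration.agc.maxAnalogueGain = camHelper_->gain(maxGainCode);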
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The statistics buffer 'ipu3_uapi_awb_raw_buffer' stores the ImgU
calculation results in a buffer whose lines are aligned horizontally to
a multiple of 4 cells. The AWB loop must take this into account and
apply the proper offset between lines to avoid any staircase effect.
It is no longer required to pass the grid configuration context to the
private functions called from process(), which simplifies the code flow.
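A sketch of the corrected indexing; the field names are partly assumed,
the key point being the stride aligned to a multiple of 4 cells:

  /* Lines in the AWB raw buffer are padded to a multiple of 4 cells,
   * so index with an aligned stride rather than the grid width,
   * otherwise each line picks up a growing offset (the staircase
   * effect). */
  constexpr unsigned int kCellAlign = 4;
  unsigned int stride = (grid.width + kCellAlign - 1) & ~(kCellAlign - 1);

  for (unsigned int cellY = 0; cellY < grid.height; cellY++) {
          for (unsigned int cellX = 0; cellX < grid.width; cellX++) {
                  const auto &cell =
                          stats->awb_raw_buffer.meta_data[cellY * stride + cellX];
                  /* ... accumulate the per-zone colour sums ... */
          }
  }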
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In preparation for using the AGC through the new algorithm interfaces,
convert the existing code to use the new function types.
Now that the process call is rewritten, re-enable the compiler flag that
warns when a function declaration hides virtual functions from a base
class (-Woverloaded-virtual).
We never use converged_, so remove its declaration. The controls may not
need to be updated on every call, but that decision should be made on
the context side, for instance through a lock status in the Agc
structure, rather than by a specific call.
As the params_ local variable is no longer useful, remove it here too.
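As an illustration of what -Woverloaded-virtual catches (the interface
shown is a simplified assumption, not the exact one):

  struct IPAContext;
  struct ipu3_uapi_stats_3a;

  struct Algorithm {
          virtual ~Algorithm() = default;
          virtual void process(IPAContext &context,
                               const ipu3_uapi_stats_3a *stats) = 0;
  };

  struct Agc : public Algorithm {
          /* A declaration with a different parameter list would hide
           * Algorithm::process() instead of overriding it, which is
           * exactly what -Woverloaded-virtual reports. The converted
           * code overrides the base signature instead: */
          void process(IPAContext &context,
                       const ipu3_uapi_stats_3a *stats) override;
  };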
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When the stats are received, pass them along with the context to the
existing AWB algorithm. IPAFrameContext now has a new structure to store
the gains calculated by the AWB algorithm.
When an EventFillParams event is received, call prepare() and set the
new gains accordingly in the params structure.
There is no longer a need for the IPU3Awb::initialise() function, as the
params are always set in prepare().
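A sketch of the resulting prepare(); the params fields and the
fixed-point conversion factor are assumptions:

  void Awb::prepare(IPAContext &context, ipu3_uapi_params *params)
  {
          const auto &gains = context.frameContext.awb.gains;

          /* Program the white balance gains computed in process().
           * kGainScale stands in for the conversion to the hardware's
           * fixed-point gain format. */
          params->acc_param.bnr.wb_gains.gr = kGainScale * gains.green;
          params->acc_param.bnr.wb_gains.r  = kGainScale * gains.red;
          params->acc_param.bnr.wb_gains.b  = kGainScale * gains.blue;
          params->acc_param.bnr.wb_gains.gb = kGainScale * gains.green;

          params->use.acc_bnr = 1;
  }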
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Introduce a new algorithm to manage tone mapping on the IPU3.
The gamma contrast curve is chosen as the initial algorithm, which moves
that implementation out of AWB for simplicity. As it is initialised with
a default gamma value of 1.1, there is no longer a need to use the
default table at initialisation.
This demonstrates how to use the process() call when EventStatReady
comes in: the function calculates the LUT in the context of a frame,
and when prepare() is called, the parameters are filled with the updated
values.
AGC is modified to take the new process() interface into account.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Implement a new modular framework for algorithms, with a common context
structure that is passed to each algorithm through a common API.
This patch:
- removes all the local references from IPAIPU3 and uses IPAContext
- builds the list of algorithm pointers and loops over it to call
  configure() on each algorithm
- loops over the algorithm list in fillParams() to call prepare() on
  each algorithm
- loops over the algorithm list in prepareStats() to call process() on
  each algorithm
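A simplified sketch of these dispatch loops, with names as described
above:

  /* configure(): give every algorithm a chance to set up the session. */
  for (auto const &algo : algorithms_) {
          int ret = algo->configure(context_, configInfo);
          if (ret)
                  return ret;
  }

  /* fillParams(): each algorithm fills its part of the ImgU parameters. */
  for (auto const &algo : algorithms_)
          algo->prepare(context_, params);

  /* prepareStats(): each algorithm consumes the new statistics. */
  for (auto const &algo : algorithms_)
          algo->process(context_, stats);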
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
An increasing amount of data and information needs to be shared between
the components that together implement the image processing algorithms.
Create a context structure which will allow us to work towards calling
algorithms in a modular way, and to share information between the
modules.
The IPA context consists of a global context set at configure time
(IPASessionConfiguration) and a per-frame context (IPAFrameContext) used
while streaming.
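In outline (member names illustrative):

  struct IPASessionConfiguration {
          /* Static values, set once in configure() for the session. */
  };

  struct IPAFrameContext {
          /* Values updated from frame to frame while streaming. */
  };

  struct IPAContext {
          IPASessionConfiguration configuration;
          IPAFrameContext frameContext;
  };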
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|