Age | Commit message | Author |
|
Now that the color temperature is updated per-frame, use the value and
set the corresponding controls::ColourTemperature.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The AWB estimates the color temperature, but the value is not used at
all. It can be useful at least for debugging purposes, but also for lux
estimation later, to know the temperature estimated for a given frame.
Add a new member to IPAFrameContext::awb for this purpose, and update
the value in AWB.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The minimum and maximum exposure are stored in lines. Replace them with
values expressed in time to simplify the calculations.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
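A minimal sketch of the conversion this implies, assuming hypothetical
names (lineDuration, minLines, maxLines) rather than the actual IPA
variables:

    #include <chrono>
    #include <cstdint>

    using Duration = std::chrono::duration<double, std::micro>;

    /* Convert exposure limits expressed in lines into durations, so the
     * AGC calculations can work in time directly. */
    void exposureLimits(uint32_t minLines, uint32_t maxLines,
                        Duration lineDuration,
                        Duration &minExposure, Duration &maxExposure)
    {
        minExposure = minLines * lineDuration;
        maxExposure = maxLines * lineDuration;
    }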
|
|
Previously, the exposure value was calculated based on the estimated
shutter time and the gain applied. Now that we have the real values for
the current frame, use those before estimating the next one, and rename
the variable accordingly.
As the exposure value is updated at the beginning of the computation,
there is no need to initialise effectiveExposureValue in the configure
call anymore, and it can become a local variable instead of a class
member.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
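As a rough illustration of the relationship used here (the names are
illustrative, not the actual class members), the effective exposure
value of a frame is the product of the exposure time and the analogue
gain that were actually applied:

    #include <chrono>

    using Duration = std::chrono::duration<double, std::micro>;

    /* Effective exposure value of the frame, from the applied settings. */
    Duration effectiveExposureValue(Duration exposureTime, double analogueGain)
    {
        return exposureTime * analogueGain;
    }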
|
|
When an image is partially saturated, its brightness does not increase
linearly as the shutter time or gain increases. This is a big issue with
backlit scenes, as the algorithm currently fades to darkness.
Introduce a function that estimates the brightness of the frame based on
the current exposure and gain, and iterate on it several times to
re-estimate and approach the non-linear response.
Inspired-by: 7de5506c30b3 ("libcamera: src: ipa: raspberrypi: agc: Improve gain update calculation for partly saturated images")
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
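A simplified sketch of the iterative idea, with a toy estimateLuminance()
model standing in for the real histogram-based estimation; the names and
constants are illustrative only:

    #include <algorithm>
    #include <cmath>

    /* Toy model: reported brightness saturates instead of scaling linearly. */
    static double estimateLuminance(double gain)
    {
        return std::min(1.0, 0.2 * gain);
    }

    /*
     * Iteratively refine the gain applied to the exposure value: saturated
     * regions stop responding linearly, so a single division by the measured
     * brightness under-corrects and the estimate is repeated a few times.
     */
    double computeGain(double targetBrightness)
    {
        double gain = 1.0;

        for (unsigned int i = 0; i < 8; i++) {
            double brightness = estimateLuminance(gain);
            double extraGain = targetBrightness / (brightness + 0.001);
            gain *= extraGain;

            /* Stop once the correction becomes negligible. */
            if (std::fabs(extraGain - 1.0) < 0.01)
                break;
        }

        return gain;
    }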
|
|
When we compute the new gain, we use iqMean_ and estimate an exposure
value gain to apply. Return early when the gain variation is less than
1%.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Now that we have the real exposure applied to each frame, remove the
early return based on a frame counter and compute the gain for every
frame.
Introduce a number of startup frames during which the filter speed is
1.0, meaning the calculated exposure value is applied instantly rather
than a slower, filtered one. This gives faster convergence; those frames
may be dropped in a future development to hide the convergence process
from the viewer.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
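A minimal sketch of the startup behaviour; kNumStartupFrames and the
steady-state speed are illustrative values, not the actual constants:

    /* Number of frames during which the exposure is applied unfiltered. */
    static constexpr unsigned int kNumStartupFrames = 10;

    /*
     * Filter the exposure value: during startup the filter speed is 1.0 so
     * the computed value is applied immediately; afterwards changes are
     * smoothed to avoid oscillations.
     */
    double filterExposure(double currentExposure, double &filteredExposure,
                          unsigned int frameCount)
    {
        double speed = frameCount < kNumStartupFrames ? 1.0 : 0.2;

        filteredExposure = speed * currentExposure +
                           (1.0 - speed) * filteredExposure;

        return filteredExposure;
    }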
|
|
When the histogram is calculated, we check whether a cell is saturated
before accumulating its green value. This is wrong, and it can lead to
an empty histogram in the case of a fully saturated frame.
Use a constant to limit the number of saturated pixels allowed within a
cell before considering the cell saturated. If at the end of the loop we
still have an empty histogram, then make it a fully saturated one.
Bug: https://bugs.libcamera.org/show_bug.cgi?id=84
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
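A simplified sketch of the intent; the cell layout, field names and
threshold are illustrative, not the actual ImgU statistics format:

    #include <cstdint>
    #include <vector>

    struct Cell {
        uint8_t green;    /* average green value of the cell */
        uint8_t satRatio; /* ratio of saturated pixels, 0xff = 100% */
    };

    /* Cells with a higher saturation ratio than this are not accumulated. */
    static constexpr uint8_t kCellSaturationThreshold = 0x80;

    std::vector<uint32_t> computeHistogram(const std::vector<Cell> &cells)
    {
        std::vector<uint32_t> hist(256, 0);
        uint32_t accumulated = 0;

        for (const Cell &cell : cells) {
            if (cell.satRatio <= kCellSaturationThreshold) {
                hist[cell.green]++;
                accumulated++;
            }
        }

        /* A fully saturated frame would otherwise yield an empty histogram. */
        if (accumulated == 0)
            hist[255] = static_cast<uint32_t>(cells.size());

        return hist;
    }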
|
|
The pipeline handler populates the new sensorControls ControlList, to
have the effective exposure and gain values for the current frame. This
is done when a statistics buffer is received.
Store those values in frameContext.sensor for the frame when the
EventStatReady event is received.
AGC also needs to use frameContext.sensor as its input values and
frameContext.agc as its output values. Modify computeExposure() by
passing it the frameContext instead of individual exposure and gain
values.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
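A rough sketch of the resulting data flow, using simplified stand-in
structures rather than the real IPAFrameContext definition:

    #include <chrono>

    using Duration = std::chrono::duration<double, std::micro>;

    /* Simplified stand-in for the per-frame context. */
    struct FrameContext {
        struct {
            Duration exposure; /* effective exposure of the frame */
            double gain;       /* effective analogue gain of the frame */
        } sensor;              /* filled from sensorControls */

        struct {
            Duration exposure; /* exposure requested for the next frame */
            double gain;       /* gain requested for the next frame */
        } agc;                 /* filled by the AGC algorithm */
    };

    /* AGC consumes the sensor values and produces the agc values. */
    void computeExposure(FrameContext &frameContext)
    {
        /* ... estimate the new exposure from frameContext.sensor ... */
        frameContext.agc.exposure = frameContext.sensor.exposure;
        frameContext.agc.gain = frameContext.sensor.gain;
    }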
|
|
IPU3Event and IPU3Action use a single ControlList for both libcamera
and V4L2 controls, and its content could be either one depending on the
context. Extend IPU3Event and IPU3Action with a separate list for sensor
V4L2 controls, and preserve the original one for libcamera controls
only, to make the content of an event more specific.
Signed-off-by: Han-Lin Chen <hanlinchen@chromium.org>
[Jean-Michel: remove lensControls from the original patch]
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The IPU3 IPA has three events which are received from the pipeline
handler.
The events arrive in the sequence EventProcessControls, EventFillParams
and finally EventStatReady, while the code lists them in a different
order.
Update the flow of IPAIPU3::processEvent() to match the expected
sequence of events, to help the reader interpret the flow of events
through the IPA.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
In case the maximum exposure received from the sensor is very high, we
can end up with a very long shutter time and a small analogue gain,
which may result in a very low framerate. We don't really support that
for the moment, so clamp the shutter time to an arbitrary value of 60ms.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The IPAFrameContext uses unnamed structures to group items. Doxygen
doesn't seem to support this properly: the documentation isn't generated
correctly and warnings are output during compilation. Suppress the
warnings with a workaround that still results in incorrect generated
documentation, until Doxygen gets fixed.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
[JMH: Fix doxygen variable usage]
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
While the stop() function does not currently perform any action, it forms
part of the IPA interface and is a public function in the class.
Promote it to a full (but basic) function implementation and begin the
documentation accordingly so that there is an appropriate stub to
perform stop operations if they come up.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Doxygen warns us because the structures are referenced as \struct while
they should be \var. Fix it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The IPU3 IPA core is growing with additional documentation. The
ipa_context documentation is stored here, but it pushes the IPU3
documentation and implementation further from the head of the file.
Furthermore, the ipa_context documentation is outside of the ipa::ipu3
namespace and isn't identified correctly by Doxygen.
Move the ipa_context documentation to its own compilation object; even
though there isn't any code, this maintains consistency with our
documentation model.
Correctly re-introduce the documentation into the libcamera::ipa::ipu3
namespace during the move.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The struct RGB and struct AwbStatus are used only by the internal
implementation of the AWB algorithm module.
Move them into the private class declaration.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The AWB AwbStatus structure is contained within the Awb class.
Fix the Doxygen reference so that it can be found.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The ipa_context.h entry incorrectly referenced its file name.
Fix it.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The IPU3 IPA implements the basic 3A algorithms using the ImgU ISP.
Provide an overview document describing its operation, along with a
block diagram to help visualise how the components are put together and
to assist new developers exploring the code.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The tone mapping algorithm is currently undocumented.
Provide an introduction and overview of the implementation in the class
documentation, and describe how the algorithm operates in the process
and prepare methods.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Let the algorithm perform its initial configuration. Implement
configure() to set a default gamma value and let process() perform the
updates needed.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The tone mapping algorithm calculates the gamma curve for every frame,
regardless of whether the gamma value has changed. This issue is
exacerbated as we currently hardcode the gamma to a single value.
Optimise the implementation to only recalculate the lookup table when
the gamma setting changes, and store the gamma setting of the LUT curve
as part of the IPA context.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
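A minimal sketch of the caching idea, assuming an illustrative context
layout and a 256-entry, 16-bit LUT (the real ImgU LUT size and format
differ):

    #include <array>
    #include <cmath>
    #include <cstdint>

    struct ToneMappingContext {
        double gamma;                  /* gamma the cached LUT was built for */
        std::array<uint16_t, 256> lut; /* cached gamma curve */
    };

    /* Recompute the LUT only when the requested gamma actually changes. */
    void updateGammaLut(ToneMappingContext &ctx, double gamma)
    {
        if (ctx.gamma == gamma)
            return;

        for (unsigned int i = 0; i < ctx.lut.size(); i++) {
            double in = static_cast<double>(i) / (ctx.lut.size() - 1);
            ctx.lut[i] = static_cast<uint16_t>(std::pow(in, 1.0 / gamma) * 65535.0);
        }

        ctx.gamma = gamma;
    }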
|
|
The AGC class was not documented while it was being developed. Extend
the documentation to reference the origins of the implementation, and
improve the descriptions of how the algorithm operates internally.
While at it, rename the poorly named functions.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that the diagram has been moved into the AWB class documentation,
reword the accumulator documentation to make it clear that it is not
meant to be used only by AWB.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AWB algorithm is based on the grey world algorithm and uses the
statistics generated by the ImgU for that purpose. Explain how it uses
those statistics, and reference the original algorithm at the same time.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
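In outline, the grey world assumption turns the per-channel means of the
ImgU zones into colour gains; a simplified sketch with illustrative
names:

    #include <algorithm>

    struct ChannelMeans {
        double red;
        double green;
        double blue;
    };

    /*
     * Grey world: assume the scene averages to grey, so scale red and blue
     * to match the green mean. Green itself keeps a gain of 1.0.
     */
    void computeColourGains(const ChannelMeans &means,
                            double &gainR, double &gainB)
    {
        gainR = means.green / std::max(means.red, 1.0);
        gainB = means.green / std::max(means.blue, 1.0);
    }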
|
|
The stats pointer is marked as [[maybe_unused]]. This is a leftover from
a previous commit, kept to preserve compatibility while transitioning to
the new iterative algorithms.
Remove the attribute to make it explicit that the stats are really used
to feed the algorithms.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Clarify the roles and interactions between the pipeline handler events
and the algorithm calls by documenting all the remaining functions of
the class.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Further extend the documentation for the IPAIPU3::configure operation.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The IPU3 IPA is maturing into a modular and extensible system capable of
handling specific algorithms for the processing blocks of the ImgU.
Provide top-level class documentation to give an overview of the IPA,
detailing which events are used and which algorithms are currently
supported, as well as the limitations currently imposed.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Instead of using constants for the analogue gains limits, use the
minimum and maximum from the configured sensor.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
We currently control the exposure value through the shutter speed and
the analogue gain. We can't use the digital gain to exceed the maximum
calculated exposure value because we don't control it.
Remove the unused code associated with digital gain.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Simplify the code by removing one level of indentation, returning early
when the change between two calls is small.
Reword the LOG() message printed when we are correctly exposed, and move
the lastFrame_ update so that it happens even if the change is small.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We need to calculate the gain based on the previously calculated
exposure value. Now that we initialise the exposure and gain values in
configure(), we know the initial exposure value and can set it before
any loop runs.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We have mixed the terms gain, analogue gain and exposure value gain.
Make it clear when we are using the analogue gain from the sensor, and
when we are using the calculated gain to be applied to the exposure
value to reach the target.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Until now, the algorithm made complex assumptions when dividing the
exposure value between shutter time and analogue gain. Instead, use a
simpler approach: clamp the shutter speed first, then the analogue gain,
based on the configured limits.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
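A simplified sketch of the clamping order described above, with
illustrative parameter names:

    #include <algorithm>
    #include <chrono>

    using Duration = std::chrono::duration<double, std::micro>;

    /*
     * Split a desired exposure value into shutter time and analogue gain:
     * extend the shutter time first, up to its limit, then use the analogue
     * gain for the remainder, clamped to the configured limits.
     */
    void splitExposure(Duration exposureValue,
                       Duration minShutter, Duration maxShutter,
                       double minGain, double maxGain,
                       Duration &shutterTime, double &analogueGain)
    {
        shutterTime = std::clamp(exposureValue, minShutter, maxShutter);
        analogueGain = std::clamp(exposureValue / shutterTime, minGain, maxGain);
    }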
|
|
We are filtering the exposure value to limit the gain to apply, but we
are not using the result.
Fix it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The gains are currently stored as uint32_t while the analogue gain is
passed as a double. We also have a default maximum analogue gain of 15,
which is quite high for a number of sensors.
Use a maximum value of 8, which should really be configured by the IPA
rather than fixed as it is now. While at it, make it a double.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We are using arbitrary constants, expressed in numbers of lines, for the
exposure limits.
Instead of using static constants, use the limits of the sensor passed
in IPASessionConfiguration and cache them.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The exposure value is filtered in filterExposure() using the
currentExposure_ and setting a prevExposure_ variable. This is misnamed
as it is not the previous exposure, but a filtered value.
Rename it accordingly.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When zones are used for the grey world algorithm, they are only
considered if their average green value is at least 32/255, to exclude
zones that are too dark and don't provide relevant colour information
(on the opposite side of the spectrum, saturated regions are excluded by
the ImgU statistics engine).
The algorithm requires a minimal number of zones that meet this
criterion in order to run. Now that we correct the black level, the
32/255 minimal value is a bit high and prevents the algorithm from
running in low-light conditions. Lower the value to 16/255 to fix it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
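A simplified sketch of the zone selection, with illustrative structures
in place of the real ImgU statistics parsing:

    #include <cstdint>
    #include <vector>

    struct Zone {
        double redMean;
        double greenMean;
        double blueMean;
    };

    /* Zones darker than this green mean carry too little colour information. */
    static constexpr uint32_t kMinGreenLevel = 16; /* out of 255 */

    std::vector<Zone> generateZones(const std::vector<Zone> &candidates)
    {
        std::vector<Zone> zones;

        for (const Zone &zone : candidates) {
            if (zone.greenMean >= kMinGreenLevel)
                zones.push_back(zone);
        }

        return zones;
    }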
|
|
The AWB grey world algorithm tries to find a grey value, and it can't do
so on over-exposed images. To exclude those, the saturation ratio is
used for each cell, and a cell is included only if this ratio is 0.
Now that we have changed the threshold, more cells may be considered
partially saturated and excluded, preventing the algorithm from running
efficiently.
Change that behaviour, and consider 90% as a good enough ratio.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AGC frame context needs to be initialised correctly for the first
iteration. Until now, the IPA used the minimum exposure and gain values
and cached them in local variables.
In order to give the sensor limits to AGC, create a new structure in
IPASessionConfiguration. Store the exposure as a time (and not a number
of lines) and the analogue gain after CameraSensorHelper conversion.
Set the gain and exposure appropriately to the current values known to
the IPA, and remove the setting of exposure and gain in IPAIPU3 as those
are now fully controlled by IPU3Agc.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We can have a saturation ratio per cell, giving the percentage of pixels
over a threshold within a cell, where 100% is encoded as 0xff.
The parameter structure 'ipu3_uapi_awb_config_s' contains four fields to
set the threshold, one per channel. The blue field is also used to tell
the ImgU whether or not to calculate the saturation ratio.
Consider a green value saturated when it is more than 230 (90% of the
maximum value 255, coded as 8191). As green is the only channel used for
AGC, there is no need to apply the threshold to the other channels.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
"using" directives are harmful in headers, as they propagate the
namespace short-circuit to all files that include the header, directly
or indirectly. Drop the directive from agc.h, and use utils::Duration
explicitly. While at it, shorten the namespace qualifier from
libcamera::utils:: to utils:: in agc.cpp for Duration.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The Awb::generateZones() member function fills the zones vector passed
as an argument, which is actually a member variable. Use it directly in
the function.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
When a new CameraConfiguration is applied to the Camera, the IPA is
configured as well, using the newly applied sensor configuration and its
updated V4L2 controls.
Also update the Camera controls at IPA::configure() time by re-computing
the controls::ExposureTime and controls::FrameDurationLimits limits, and
update the controls on the pipeline handler side after the IPA has been
configured.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The intel-ipu3.h public interface from the kernel does not define how to
parse the statistics for a cell. This had to be identified through a
process of reverse engineering, and by later identifying the structures
from [0], leading to our custom definition of struct Ipu3AwbCell.
[0]
https://chromium.googlesource.com/chromiumos/platform/arc-camera/+/refs/heads/master/hal/intel/include/ia_imaging/awb_public.h
To improve the kernel interface, a proposal has been made to the
linux-kernel [1] to incorporate the memory layout for each cell into the
intel-ipu3 header directly.
[1]
https://lore.kernel.org/linux-media/20211005202019.253353-1-jeanmichel.hautbois@ideasonboard.com/
Update our local copy of the intel-ipu3.h to match the proposal and
change the AGC and AWB algorithms to reference that structure directly,
allowing us to remove the deprecated custom Ipu3AwbCell definition.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that we know how the AWB statistics are formatted, use a simplified
loop in processBrightness() to parse the green values and get the
histogram.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The pixels output by the camera normally include a black level, because
sensors do not always report a signal level of '0' for black. Pixels at
or below this level should be considered black, and to achieve that we
need to subtract an offset from all the pixels. This can be taken into
account by reading the lowest value of a special region of the sensor
which is not exposed to light. This provides a subtraction factor that
allows the expected black levels in the resulting images to be adjusted.
For a camera outputting 10-bit pixel values (in the range 0 to 1023), a
typical black level might be 64. It is a fixed value, obtained by
capturing a raw frame with minimum exposure and the gain fixed to 1.0
while covering the sensor (the darker the better). We consider it good
enough as a very first approximation, until we measure it during a
tuning process and include it in a configuration file.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
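A minimal sketch of the subtraction on a 10-bit pixel, using the value
of 64 mentioned above (the constant name is illustrative):

    #include <cstdint>

    /* Fixed black level for a 10-bit sensor, measured with the sensor covered. */
    static constexpr uint16_t kBlackLevel = 64;

    /* Subtract the black level offset, clamping at zero. */
    uint16_t correctBlackLevel(uint16_t pixel)
    {
        return pixel > kBlackLevel ? pixel - kBlackLevel : 0;
    }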
|