The gains are currently set as a uint32_t while the analogue gain is
passed as a double. We also have a default maximum analogue gain of 15,
which is quite high for a number of sensors.
Use a maximum value of 8, which should really be configured by the IPA
and not fixed as it is now. While at it, make it a double.
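As a minimal sketch of the change described above (the constant name is
an assumption, not necessarily the one used in the code):

  /* Hypothetical constant: the maximum analogue gain, now a double. */
  static constexpr double kMaxGain = 8.0;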
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We are using arbitrary constants for the exposure limits in a number of
places.
Instead of using static constants for those, use the limits of the
sensor passed in IPASessionConfiguration and cache them.
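As a hedged illustration (member names are assumptions), the clamping
then relies on the cached limits rather than static constants:

  /* Sketch: clamp the new values to the cached sensor limits. */
  exposure = std::clamp(exposure, minExposure_, maxExposure_);
  gain = std::clamp(gain, minGain_, maxGain_);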
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The exposure value is filtered in filterExposure() using
currentExposure_ and stored in a prevExposure_ variable. The latter is
misnamed, as it holds not the previous exposure but a filtered value.
Rename it accordingly.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When zones are used for the grey world algorithm, they are only
considered if their average green value is at least 32/255 to exclude
zones that are too dark and don't provide relevant colour information
(on the opposite side of the spectrum, saturated regions are excluded by
the ImgU statistics engine).
The algorithm requires a minimal number of zones that meet this
criterion in order to run. Now that we correct the black level, the
32/255 minimal value is a bit high and prevents the algorithm from
running in low-light conditions. Lower the value to 16/255 to fix it.
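Roughly, the zone filter then looks like the following sketch (constant
and field names are illustrative, not necessarily the real ones):

  static constexpr uint32_t kMinGreenLevelInZone = 16;

  /* Use the zone only if its average green level is high enough. */
  if (zone.counted && zone.sum.green / zone.counted >= kMinGreenLevelInZone)
          zones_.push_back(zone);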
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The AWB grey world algorithm tries to find a grey value, and it cannot
do so on over-exposed images. To exclude those, the saturation ratio is
used for each cell, and the cell is included only if this ratio is 0.
Now that we have changed the threshold, more cells may be considered as
partially saturated and excluded, preventing the algorithm from running
efficiently.
Change that behaviour, and consider 90% as a good enough ratio.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AGC frame context needs to be initialised correctly for the first
iteration. Until now, the IPA used the minimum exposure and gain values
and cached them in local variables.
In order to give the sensor limits to AGC, create a new structure in
IPASessionConfiguration. Store the exposure in time (and not line
duration) and the analogue gain after CameraSensorHelper conversion.
Set the gain and exposure appropriately to the current values known to
the IPA and remove the setting of exposure and gain in IPAIPU3 as those
are now fully controlled by IPU3Agc.
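A sketch of the new structure, with field names that are assumptions
used only for illustration:

  struct IPASessionConfiguration {
          struct {
                  utils::Duration minShutterSpeed;
                  utils::Duration maxShutterSpeed;
                  double minAnalogueGain;
                  double maxAnalogueGain;
          } agc;
          /* ... existing grid configuration ... */
  };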
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We can have a saturation ratio per cell, giving the percentage of pixels
over a threshold within a cell, where 100% is encoded as 0xff.
The parameter structure 'ipu3_uapi_awb_config_s' contains four fields to
set the threshold, one per channel.
The blue field is also used to enable or disable the saturation ratio
calculation in the ImgU.
Consider a green value saturated when it is above 230 (90% of the
maximum value of 255, coded as 8191 in the parameter field). As green is
the only channel used for AGC, there is no need to apply the threshold
to the other channels.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
"using" directives are harmful in headers, as they propagate the
namespace short-circuit to all files that include the header, directly
or indirectly. Drop the directive from agc.h, and use utils::Duration
explicitly. While at it, shorten the namespace qualifier from
libcamera::utils:: to utils:: in agc.cpp for Duration.
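For illustration, the header now spells the type out rather than relying
on a using directive (the member name here is hypothetical):

  /* agc.h: no "using namespace" at file scope, name the type fully. */
  utils::Duration filteredExposure_;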
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The Awb::generateZones() member function fills the zones vector passed
as an argument, which is actually a member variable. Use it directly in
the function.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
When a new CameraConfiguration is applied to the Camera, the IPA is
configured as well, using the newly applied sensor configuration and its
updated V4L2 controls.
Also update the Camera controls at IPA::configure() time by re-computing
the controls::ExposureTime and controls::FrameDurationLimits limits, and
update the controls on the pipeline handler side after the IPA has been
configured.
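As an illustration of the kind of computation involved (names and exact
API use are assumptions), the ExposureTime limits in microseconds derive
from the sensor line duration and the V4L2 exposure range:

  /* Sketch: line duration in microseconds from the sensor info. */
  double lineDuration = sensorInfo.lineLength / (sensorInfo.pixelRate / 1e6);
  int32_t minExposure = exposureInfo.min().get<int32_t>() * lineDuration;
  int32_t maxExposure = exposureInfo.max().get<int32_t>() * lineDuration;
  controls[&controls::ExposureTime] = ControlInfo(minExposure, maxExposure);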
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The intel-ipu3.h public interface from the kernel does not define how to
parse the statistics for a cell. This had to be identified through
reverse engineering, and later by identifying the structures from [0],
leading to our custom definition of struct Ipu3AwbCell.
[0]
https://chromium.googlesource.com/chromiumos/platform/arc-camera/+/refs/heads/master/hal/intel/include/ia_imaging/awb_public.h
To improve the kernel interface, a proposal has been made to the
linux-kernel [1] to incorporate the memory layout for each cell into the
intel-ipu3 header directly.
[1]
https://lore.kernel.org/linux-media/20211005202019.253353-1-jeanmichel.hautbois@ideasonboard.com/
Update our local copy of the intel-ipu3.h to match the proposal and
change the AGC and AWB algorithms to reference that structure directly,
allowing us to remove the deprecated custom Ipu3AwbCell definition.
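The per-cell layout from the proposal looks roughly like this (quoted as
far as known; treat it as a sketch rather than the authoritative
definition):

  struct ipu3_uapi_awb_set_item {
          __u8 Gr_avg;
          __u8 R_avg;
          __u8 B_avg;
          __u8 Gb_avg;
          __u8 sat_ratio;
          __u8 padding0;
          __u8 padding1;
          __u8 padding2;
  } __attribute__((packed));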
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that we know how the AWB statistics are formatted, use a simplified
loop in processBrightness() to parse the green values and get the
histogram.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The pixels output by the camera normally include a black level, because
sensors do not always report a signal level of '0' for black. Pixels at
or below this level should be considered black, and to achieve that we
need to subtract an offset from all the pixels. This can be taken into
account by reading the lowest value of a special region of the sensor
which is not exposed to light. This provides a subtraction factor to be
able to adjust the expected black levels in the resulting images.
For a camera outputting 10-bit pixel values (in the range 0 to 1023) a
typical black level might be 64. It is a fixed value, obtained by
capturing a raw frame with minimum exposure and gain fixed to 1.0 while
covering the sensor (the darker the better). We consider it good enough
as a very first approximation, until we measure it during a tuning
process and include it in a configuration file.
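As a worked example (a sketch only, not the exact ImgU programming), for
a 10-bit pixel and a black level of 64:

  /* Pixels at or below the black level become 0; others are shifted down. */
  uint16_t blackLevel = 64;
  uint16_t corrected = raw > blackLevel ? raw - blackLevel : 0;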
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The statistics buffer 'ipu3_uapi_awb_raw_buffer' stores the ImgU
calculation results in a buffer aligned horizontally to a multiple of 4
cells. The AWB loop should take this into account and add the proper
offset between lines to avoid any staircase effect.
It is no longer required to pass the grid configuration context to the
private functions called from process() which simplifies the code flow.
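A sketch of the indexing described above, assuming the utils::alignUp
helper and illustrative variable names:

  /* The ImgU pads each line of cells to a multiple of 4. */
  unsigned int stride = utils::alignUp(grid.width, 4);
  unsigned int cellIndex = cellY * stride + cellX;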
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The gains have a precision u3.13, range [0, 8[ which means that a gain
multiplier value of 1.0 is represented as a multiplication by 8192 in
the ImgU. Correct the gains as this was misunderstood in the first
place.
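For illustration, the u3.13 encoding works out as follows:

  /* 13 fractional bits: 1.0 is encoded as 1 << 13 = 8192, so a gain of
   * 1.5 becomes 12288 and the maximum representable value is just
   * under 8. */
  uint16_t encoded = static_cast<uint16_t>(gain * 8192);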
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The algorithm uses the statistics of a cell only if there are not too
many saturated pixels in it. The grey world algorithm works fine when
there is a limited number of outliers.
Consider a zone valid when at least 80% of the cells in it are
unsaturated. This value could very well be made configurable, making the
algorithm more or less tolerant.
While at it, compute the value in configure() as it will not change
during execution, and cache the cellsPerZone values estimated with
std::round, now that <cmath> is included.
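A sketch with illustrative names (the exact members may differ):

  /* configure(): number of cells per zone, rounded, and the 80% bar. */
  cellsPerZoneX_ = std::round(grid.width / static_cast<double>(kAwbStatsSizeX));
  cellsPerZoneY_ = std::round(grid.height / static_cast<double>(kAwbStatsSizeY));
  cellsPerZoneThreshold_ = cellsPerZoneX_ * cellsPerZoneY_ * 0.8;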
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The loops over the width and height of the image when calculating the
BDS grid parameters are nested, but they're actually independent. Split
them to reduce the complexity.
While at it, split out the constants into documented constexpr values
for the grid sizes.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Until now, the limits used to calculate the grid based on the Bayer Down
Scaler configuration were taken from the kernel documentation [0].
While testing and understanding the format of the ImgU statistics, it
appeared that the ones defined in CrOS [1] are the correct ones. Use
those.
[0] https://www.kernel.org/doc/html/latest/userspace-api/media/v4l/pixfmt-meta-intel-ipu3.html?highlight=v4l2_meta_fmt_ipu3_params#intel-ipu3-imgu-uapi-data-types
[1] https://chromium.googlesource.com/chromiumos/platform/arc-camera/+/refs/heads/master/hal/intel/include/ia_imaging/awb_public.h
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The variables mix the terms cell, region and zone. It can confuse the
reader, and make the algorithm more difficult to follow. Rename the
local variables to be consistent with their definitions:
- Cells are defined in Pixels
- Zones are defined in Cells
There is no "region" as such, so replace it with the correct term.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The pixel component sums for the Accumulator are inconsistent with other
similar structures such as the IPAFrameContext::awb::gains. Group the
red, green, and blue sums together in a struct and store them as
uint64_t to reduce potential architectural differences.
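The grouping described above looks roughly like this (a sketch; field
names as far as known):

  struct Accumulator {
          unsigned int counted;
          struct {
                  uint64_t red;
                  uint64_t green;
                  uint64_t blue;
          } sum;
  };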
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The IspStatsRegion structure was introduced as an attempt to prepare for
a generic AWB algorithm structure. The structure name by itself is not
explicit and it is too optimistic to try and make a generic one for now.
Its role is to accumulate the pixels in a given zone. Rename it to
Accumulator, and remove the uncounted field at the same time. It is
always possible to know how many pixels are not relevant for the
algorithm by calculating total - counted. The uncounted field was only
declared and never used. Amend the documentation accordingly.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The structure Ipu3AwbCell describes the AWB stats layout on the kernel
side. We will need it to be used by the AGC algorithm to be introduced
later, so let's make it visible in ipa::ipu3::algorithms and not only
to the AWB class.
The IspStatsRegion will be needed by AGC too, so let's move it to the
same namespace as well.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The IPA::configure() function has an IPAConfigInfo parameter which
contains a map of numerical indexes to ControlInfoMap instances.
This is a leftover of the old IPA protocol, where it was not possible to
specify a rich interface as it is possible today, and each entity's
ControlInfoMap was indexed by a numerical id and stored in a map.
Now that the IPA interface allows specifying parameters by name, drop the
map and send the sensor's control info map only.
If more ControlInfoMap instances need to be shared with the IPA in the
future, a new parameter can be added to IPAConfigInfo.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The incoming params buffer may contain uninitialised data, or the
parameters of previously queued frames. Clearing the entire buffer
may be an expensive operation, and the kernel will only read from
structures which have their associated use-flag set.
It is the responsibility of the algorithms to set the use flags
accordingly for any data structure they update during prepare().
Clear the use flags of the parameter buffer before passing the buffer
to the algorithms during their prepare() operations.
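A sketch of the flow (field names as far as known from the intel-ipu3
UAPI; treat them as assumptions):

  /* In the IPA, before running the algorithms: */
  params->use = {};

  /* In an algorithm's prepare(), for the structures it fills: */
  params->use.acc_awb = 1;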
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
MappedFrameBuffer::maps() returns planes_. Rename the function to
planes() accordingly.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The IPASessionConfiguration now has the grid configuration stored. Use
it at prepare() and process() calls in AWB and pass it as a reference
to the private functions when needed.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The IPASessionConfiguration now has the grid configuration stored. Use
it at process() call in AGC and pass it as a reference to the private
functions when needed.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that the interface is properly used by the AGC class, move it into
ipa::ipu3::algorithms and let the loops do the calls.
As the exposure_ and gain_ values need to be exchanged through the
FrameContext, use the calculated values in the setControls() function to
ease reading.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that the interface is properly used by the AWB class, move it into
ipa::ipu3::algorithms and let the loops do the calls.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In preparation for using the AGC through the new algorithm interfaces,
convert the existing code to use the new function types.
Now that the process call is rewritten, re-enable the compiler flag to
warn when a function declaration hides virtual functions from a base class
(-Woverloaded-virtual).
We never use converged_, so remove its declaration. The controls may not
need to be updated at each call, but that should be decided on the
context side, for instance by using a lock status in the Agc structure,
and not by a specific call.
As the params_ local variable is not useful anymore, remove it here
too.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When the stats are received, pass them with the context to the existing
AWB algorithm. IPAFrameContext now has a new structure to store the
gains calculated by the AWB algorithm.
When an EventFillParams event is received, call prepare() and set the new
gains accordingly in the params structure.
There is no longer a need for the IPU3Awb::initialise() function, as the
params are always set in prepare().
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Introduce a new algorithm to manage the tone mapping handling of the
IPU3.
The initial algorithm is chosen to configure the gamma contrast curve,
which moves the implementation out of AWB for simplicity. As it is
initialised with a default gamma value of 1.1, there is no need to use
the default table at initialisation anymore.
This demonstrates how to use the process() call when the EventStatReady
event comes in. The function calculates the LUT in the context of a frame, and
when prepare() is called, the parameters are filled with the updated
values.
AGC is modified to take the new process interface into account.
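A sketch of the LUT computation done in process(), with constants, sizes
and names as assumptions:

  static constexpr double kDefaultGamma = 1.1;

  for (unsigned int i = 0; i < std::size(lut.lut); i++) {
          double x = static_cast<double>(i) / (std::size(lut.lut) - 1);
          /* Gamma encoding curve, scaled to the 13-bit LUT range. */
          lut.lut[i] = std::pow(x, 1.0 / kDefaultGamma) * 8191;
  }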
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Implement a new modular framework for algorithms with a common context
structure that is passed to each algorithm through a common API.
This patch:
- removes all the local references from IPAIPU3 and uses IPAContext
- implements the list of algorithm pointers and loops over it in the
configure() call
- loops over the algorithm list in fillParams(), calling prepare() on
each algorithm
- loops over the algorithm list in prepareStats(), calling process() on
each algorithm, as sketched below
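The loops, sketched with context and container names as assumptions:

  for (auto const &algo : algorithms_)
          algo->prepare(context_, params);

  for (auto const &algo : algorithms_)
          algo->process(context_, stats);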
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Introduce three functions in the Algorithm class to manage algorithms
(a sketch of the resulting interface follows the list):
- configure(), called only when the IPA is configured
- prepare(), called on the EventFillParams event for each frame, when
the request is queued
- process(), called on the EventStatReady event at each frame
completion, when the statistics have been generated.
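Sketched here with signatures that are assumptions based on the
description above:

  class Algorithm
  {
  public:
          virtual ~Algorithm() = default;

          virtual int configure(IPAContext &context, const IPAConfigInfo &configInfo);
          virtual void prepare(IPAContext &context, ipu3_uapi_params *params);
          virtual void process(IPAContext &context, const ipu3_uapi_stats_3a *stats);
  };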
The existing AGC implementation already has a function named process(),
though it has different arguments. Adding the new virtual process()
interface causes a compiler warning due to the AGC implementation
overloading a virtual function, even though the overload can be resolved
correctly.
Temporarily disable the warning in this commit to maintain bisection
until the AGC is converted to the new interface.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
An increasing amount of data and information needs to be shared between
the components that build up to implement image processing algorithms.
Create a context structure which will allow us to work towards calling
algorithms in a modular way, and sharing information between the modules.
The IPA context is made of a global context set at configure time
(IPASessionConfiguration) and a per-frame context (IPAFrameContext) used
while streaming.
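A sketch of the context layout described above:

  struct IPAContext {
          IPASessionConfiguration configuration;  /* set at configure() time */
          IPAFrameContext frameContext;           /* updated while streaming */
  };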
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The abstract Algorithm class was originally placed in libipa as an
attempt to define a generic algorithm container. This was a little
optimistic and pushed a bit too far, too early.
Move the Algorithm class into the IPU3 which is the only user of the
class, as we adapt it to support modular algorithm components for the
IPU3.
Not documenting the namespace may cause issues with Doxygen in libipa.
The file libipa.cpp is thus created as an empty file for now, but we
can leverage it in the future to add more global libipa documentation,
and possibly code too.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Tidy up the include directives in the IPU3 IPA module a bit.
In detail:
- ipu3.cpp is missing inclusions for:
std::abs from <cmath>
std::map from <map>
std::min/max from <algorithm>
std::numeric_limits from <limits>
std::unique_ptr from <memory>
std::vector from <vector>
and does not require <sys/mman.h>
- ipu3_agc has two unused inclusions in the header file and one in the
cpp file, and is missing <chrono> for std::literals::chrono_literals
- ipu3_awb is missing <algorithm> for std::sort and does not use
<numeric> or <unordered_map>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
All the IPU3 Camera controls are currently initialized by the pipeline
handler, using the camera sensor configuration and platform specific
requirements.
However, some controls are better initialized by the IPA, which might,
for example, cap the exposure times and frame durations to the
constraints of its algorithm implementations.
Also, moving forward, the IPA should register controls to report its
capabilities, for example the ability to enable/disable 3A algorithms on
request.
Move the existing controls initialization to the IPA, by providing
the sensor configuration and its controls to the IPU3IPA::init()
function, which initializes controls and returns them to the pipeline
through an output parameter.
The existing controls initialization has been copied verbatim from the
pipeline handler to the IPA, apart from a few line break adjustments,
and the resulting Camera controls values are not changed.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Remove the need for callers to reference PROT_READ/PROT_WRITE directly
from <sys/mman.h> by instead exposing the Read/Write mapping options as
flags from the MappedFrameBuffer class itself.
While here, include the <stdint.h> header, which is required for the
uint8_t used as part of the Plane.
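Usage then looks roughly like this (the exact enumerator spelling is an
assumption):

  /* No more PROT_READ / PROT_WRITE at the call site. */
  MappedFrameBuffer mapped(buffer, MappedFrameBuffer::MapFlag::ReadWrite);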
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The MappedFrameBuffer is a convenience feature which sits on top of the
FrameBuffer and facilitates mapping it to CPU accessible memory with
mmap.
This implementation is internal and currently sits in the same internal
files as the internal FrameBuffer, thus exposing those internals to
users of the MappedFrameBuffer implementation.
Move the MappedFrameBuffer and MappedBuffer implementations to their own
implementation files, and fix the sources throughout to use them
accordingly.
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Usage of 'method' to refer to member functions comes from Java. The C++
standard uses the term 'function' only. Replace 'method' with 'function'
or 'member function' through the whole code base and documentation.
While at it, fix two typos (s/backeng/backend/).
The BoundMethod and Object::invokeMethod() are left as-is here, and will
be addressed separately.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
libcamera names header files based on the classes they define. The
buffer.h file is an exception. Rename it to framebuffer.h.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
In order for the CameraSensorHelper to be instantiated, we need to find
its factory using the camera sensor model name stored in
IPASettings::sensorModel. As we don't need to do this at each configure()
call (the sensor does not change in between), do it once in the init()
call of IPAIPU3.
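A sketch of the init-time lookup (the factory API spelling is an
assumption):

  camHelper_ = CameraSensorHelperFactory::create(settings.sensorModel);
  if (!camHelper_) {
          LOG(IPAIPU3, Error)
                  << "Failed to create camera sensor helper for "
                  << settings.sensorModel;
          return -ENODEV;
  }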
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Headers which must not be exposed as part of the public libcamera API
should include base/private.h.
Any interface which includes the private.h header will only be able to
build if the libcamera_private dependency is used (or the
libcamera_base_private dependency directly).
Build targets which are intended to use the private APIs will use
libcamera_private to handle the automatic definition of the inclusion
guard.
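In practice a private header simply starts with:

  #include <libcamera/base/private.h>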
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Move the functionality for the following components to the new
base support library:
- BoundMethod
- EventDispatcher
- EventDispatcherPoll
- Log
- Message
- Object
- Signal
- Semaphore
- Thread
- Timer
While it would be preferable to see these split to move one component
per commit, these components are all interdependent upon each other,
which leaves us with one big change performing the move for all of them.
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Move the utils functionality to the libcamera/base library.
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
std::chrono::duration is wrapped quite conveniently by the
libcamera::utils::Duration class. Port IPAIPU3 to use it for
duration-type entities (such as the exposure time), so that it becomes
consistent with the rest of the codebase.
The commit doesn't introduce any functional changes.
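As a usage illustration, not part of the change itself (the get<>()
accessor name is an assumption), durations interoperate with std::chrono
literals:

  using namespace std::literals::chrono_literals;

  utils::Duration exposure = 10ms;
  double exposureUs = exposure.get<std::micro>();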
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The IPU3 IPA interface does not define a return value from configure().
This prevents errors from being reported back to the pipeline handler
when they occur in the IPA.
Update the IPU3 IPA interface and add return values to the checks in
IPAIPU3::configure() accordingly.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The ipu3_agc.h forward-declares the IPACameraSensorInfo structure, but
incorrectly declares it as a class. This causes a compilation error with
clang:
include/libcamera/ipa/core_ipa_interface.h:24:1: error: 'IPACameraSensorInfo' defined as a struct here but previously declared as a class; this is valid, but may result in linker errors under the Microsoft C++ ABI [-Werror,-Wmismatched-tags]
struct IPACameraSensorInfo
^
../../src/ipa/ipu3/ipu3_agc.h:21:1: note: did you mean struct here?
class IPACameraSensorInfo;
^~~~~
struct
Fix it.
Fixes: 384a53d3cdf7 ("ipa: ipu3: Calculate line duration from IPACameraSensorInfo")
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Tested-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The frame duration is hard-coded for CTS as per [1]. Ideally, to
accurately calculate the frame duration, the IPA needs the VBLANK value
from every frame's exposure. However, this particular bit is yet to be
implemented in IPAIPU3.
Meanwhile, we can at least head in the right direction by not hard-coding
the value, and instead using the default VBLANK value as reported by the
sensor. Update the existing \todo to use the derived VBLANK value as and
when it's available from each frame's exposure.
[1] 6c5f3fe6ced7 ("ipa: ipu3: Set output frame duration metadata")
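A sketch of the computation with the default VBLANK (names and units are
assumptions):

  /* Frame duration from the output height plus default vertical blanking. */
  utils::Duration frameDuration = lineDuration_ *
                                  (defVBlank_ + sensorInfo_.outputSize.height);
  ctrls.set(controls::FrameDuration,
            static_cast<int64_t>(frameDuration.get<std::micro>()));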
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Tested-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|