In preparation for using the AGC through the new algorithm interfaces,
convert the existing code to use the new function types.
Now that the process call is rewritten, re-enable the compiler flag to
warn when a function declaration hides virtual functions from a base class
(-Woverloaded-virtual).
We never use converged_, so remove its declaration. The controls may not
need to be updated on every call, but that decision should be made on the
context side, for instance by using a lock status in the Agc structure,
rather than by a specific call.
As the params_ variable is no longer useful, remove it here too.
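For illustration, a minimal sketch of the hide-versus-override distinction
that the warning catches (the type names below are assumptions, not the
exact IPU3 interfaces): once the AGC declares process() with the same
signature as the base class virtual, it overrides rather than hides it,
and -Woverloaded-virtual has nothing to report.

  struct IPAContext;
  struct Stats;

  class Algorithm
  {
  public:
          virtual ~Algorithm() = default;
          virtual void process(IPAContext &context, const Stats *stats) = 0;
  };

  class Agc : public Algorithm
  {
  public:
          /* Same signature as the base virtual: an override, not a hidden overload. */
          void process(IPAContext &context, const Stats *stats) override;
  };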
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When the stats are received, pass them with the context to the existing
AWB algorithm. IPAFrameContext now has a new structure to store the
gains calculated by the AWB algorithm.
When an EventFillParams event is received, call prepare() and set the new
gains accordingly in the params structure.
There is no longer a need for the IPU3Awb::initialise() function, as the
params are always set in prepare().
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Introduce a new algorithm to handle tone mapping for the IPU3.
The initial algorithm configures the gamma contrast curve, which moves
the implementation out of AWB for simplicity. As it is initialised with a
default gamma value of 1.1, there is no longer a need to use the default
table at initialisation.
This demonstrates how to use the process() call when the EventStatReady
event comes in. The function calculates the LUT in the context of a frame,
and when prepare() is called, the parameters are filled with the updated
values.
AGC is modified to take the new process interface into account.
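As a rough illustration of the kind of computation involved, a standalone
sketch of a gamma contrast LUT; the table size and output range are
arbitrary here, not the IPU3 parameter layout.

  #include <cmath>
  #include <cstddef>
  #include <cstdint>
  #include <vector>

  /* Build a gamma curve LUT, e.g. with the default gamma of 1.1 (size >= 2). */
  std::vector<uint16_t> computeGammaLut(double gamma, std::size_t size,
                                        uint16_t maxValue)
  {
          std::vector<uint16_t> lut(size);
          for (std::size_t i = 0; i < size; i++) {
                  double in = static_cast<double>(i) / (size - 1);
                  lut[i] = static_cast<uint16_t>(
                          std::lround(std::pow(in, 1.0 / gamma) * maxValue));
          }
          return lut;
  }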
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Implement a new modular framework for algorithms with a common context
structure that is passed to each algorithm through a common API.
This patch:
- removes all the local references from IPAIPU3 and uses IPAContext
- implements the list of algorithm pointers and, at configure time, loops
  over it to call configure() on each algorithm
- loops over the algorithm list in fillParams(), calling prepare() on each
  algorithm
- loops over the algorithm list in prepareStats(), calling process() on
  each algorithm (see the sketch below)
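Container, parameter and type names in this sketch are assumptions:

  /* algorithms_ is assumed to be a std::vector<std::unique_ptr<Algorithm>>. */

  /* configure(): run once when the IPA is configured. */
  for (const auto &algo : algorithms_)
          algo->configure(context_, configInfo);

  /* fillParams(): prepare each algorithm before the parameters are sent out. */
  for (const auto &algo : algorithms_)
          algo->prepare(context_, params);

  /* prepareStats(): let each algorithm process the new statistics. */
  for (const auto &algo : algorithms_)
          algo->process(context_, stats);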
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Introduce three functions in the Algorithm class to manage algorithms, as
sketched below:
- configure(), called only when the IPA is configured
- prepare(), called on the EventFillParams event for each frame, when the
  request is queued
- process(), called on the EventStatReady event at each frame completion,
  once the statistics have been generated.
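Parameter types in this sketch are assumptions for illustration:

  struct IPAContext;
  struct IPAConfigInfo;
  struct ipu3_uapi_params;
  struct ipu3_uapi_stats_3a;

  class Algorithm
  {
  public:
          virtual ~Algorithm() = default;

          /* Called once when the IPA is configured. */
          virtual int configure(IPAContext &context, const IPAConfigInfo &configInfo)
          {
                  return 0;
          }

          /* Called on EventFillParams, when a request is queued. */
          virtual void prepare(IPAContext &context, ipu3_uapi_params *params)
          {
          }

          /* Called on EventStatReady, once statistics are available. */
          virtual void process(IPAContext &context, const ipu3_uapi_stats_3a *stats)
          {
          }
  };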
The existing AGC implementation already has a function named process(),
though it has different arguments. Adding the new virtual process()
interface causes a compiler warning due to the AGC implementation
overloading a virtual function, even though the overload can be resolved
correctly.
Temporarily disable the warning in this commit to maintain bisection
until the AGC is converted to the new interface.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
An increasing amount of data and information needs to be shared between
the components that combine to implement image processing algorithms.
Create a context structure which will allow us to work towards calling
algorithms in a modular way, and sharing information between the modules.
The IPA context is made of a global context set at configure time
(IPASessionConfiguration) and a per-frame context (IPAFrameContext) used
while streaming.
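A minimal sketch of how the two parts could be grouped; the field names
here are illustrative assumptions, not the actual IPU3 layout.

  #include <cstdint>

  /* Values fixed for the duration of a capture session, set at configure time. */
  struct IPASessionConfiguration {
          struct {
                  double minAnalogueGain;
                  double maxAnalogueGain;
          } agc;
  };

  /* Values recomputed for every frame while streaming. */
  struct IPAFrameContext {
          struct {
                  uint32_t exposure;
                  double gain;
          } agc;
  };

  struct IPAContext {
          IPASessionConfiguration configuration;
          IPAFrameContext frameContext;
  };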
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The abstract Algorithm class was originally placed in libipa as an
attempt to define a generic algorithm container. This was a little
optimistic and pushed a bit too far, too early.
Move the Algorithm class into the IPU3 IPA, which is the only user of the
class, as we adapt it to support modular algorithm components for the
IPU3.
Not documenting the namespace may cause issues with Doxygen in libipa.
The file libipa.cpp is thus created as an empty file for now, but we
can leverage it in the future to add more global libipa documentation,
and possibly code too.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The bcm2835-isp.h header is included with quotes rather than angle
brackets.
Quoted includes are reserved for internal headers, while the
linux/bcm2835-isp.h header is exported by the Linux kernel.
Fix the inclusion type.
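Concretely, the fix is just the include form:

  /* Before: looked up as a project-local header. */
  #include "linux/bcm2835-isp.h"

  /* After: resolved as a system header exported by the Linux kernel. */
  #include <linux/bcm2835-isp.h>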
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Plumb through the VIMC mojo interface to enable buffer passing.
VIMC does not have parameter or statistics buffers, but we can
mimic the typical case of passing IPA buffers from the pipeline
handler to the IPA using mock buffers. The mock IPA buffers are
dmabuf-backed FrameBuffers (in other words, mmap()able through
MappedFrameBuffer inside the IPA).
This commit shows:
- Passing the parameter buffer from the pipeline handler to
the IPA through functions defined in mojom interface.
- Passing request controls ControlList to the IPA.
Any tests using VIMC will now exercise the IPA paths. Any tests running
in isolated mode will help us test the IPA IPC code paths, especially
around (de)serialization of data passed from pipeline handlers to the
IPA. Future IPA interface tests can simply extend the vimc mojom
interface to achieve/test a specific use case as required.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The VIMC pipeline handler has dmabuf-backed mock FrameBuffers which
specifically target mimicking IPA buffers (parameters and statistics).
Map these mock buffers into the VIMC IPA to enable exercising the IPA
IPC code paths. This gives our test suite leverage to test the IPA IPC
code paths, which are common to various platforms.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
As part of an effort to make the vimc IPA usable for testing, extend it
with a configure function. The configuration is currently ignored by the
IPA.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
"NOIR" modules are ones that have had the IR filters removed but are
otherwise identical. The same tuning can be used as for the regular
version except that the colour calibration supplied to the AWB
algorithm no longer works. Instead we need to switch the algorithm to
its basic "grey world" method.
Users with "NOIR" modules can switch to the matching "xxx_noir.json"
tuning file by using the LIBCAMERA_RPI_TUNING_FILE environment
variable.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Tidy up the include directives in the IPU3 IPA module a bit.
In detail:
- ipu3.cpp is missing includes for:
std::abs from <cmath>
std::map from <map>
std::min/max from <algorithm>
std::numeric_limits from <limits>
std::unique_ptr from <memory>
std::vector from <vector>
and does not require <sys/mman.h>
- ipu3_agc has two unused includes in the header file and one in the cpp
  file, and is missing <chrono> for std::literals::chrono_literals
- ipu3_awb is missing <algorithm> for std::sort and does not use
<numeric> or <unordered_map>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
All the IPU3 Camera controls are currently initialized by the pipeline
handler, using the camera sensor configuration and platform-specific
requirements.
However, some controls are better initialized by the IPA, which might,
for example, cap the exposure times and frame durations to the
constraints of its algorithm implementations.
Also, moving forward, the IPA should register controls to report its
capabilities, for example the ability to enable/disable 3A algorithms on
request.
Move the existing controls initialization to the IPA, by providing
the sensor configuration and its controls to the IPU3IPA::init()
function, which initializes controls and returns them to the pipeline
through an output parameter.
The existing controls initialization has been copied verbatim from the
pipeline handler to the IPA, apart from a few line break adjustments, and
the resulting Camera control values are not changed.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Remove the need for callers to reference PROT_READ/PROT_WRITE directly
from <sys/mman.h> by instead exposing the Read/Write mapping options as
flags from the MappedFrameBuffer class itself.
While here, introduce the <stdint.h> header, which is required for the
uint8_t used as part of the Plane.
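Call sites could then look roughly like this sketch; the header path and
the exact flag spelling are assumptions, not the definitive API.

  #include <libcamera/internal/mapped_framebuffer.h>

  void fillBuffer(libcamera::FrameBuffer *buffer)
  {
          using namespace libcamera;

          /* Map the buffer for CPU read/write access, without any <sys/mman.h> use. */
          MappedFrameBuffer mapped(buffer, MappedFrameBuffer::MapFlag::ReadWrite);
          if (!mapped.isValid())
                  return;

          /* mapped.planes() exposes the CPU-accessible plane data. */
  }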
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The MappedFrameBuffer is a convenience feature which sits on top of the
FrameBuffer and facilitates mapping it to CPU accessible memory with
mmap.
This implementation is internal and currently sits in the same files as
the internal FrameBuffer, thus exposing those internals to users of the
MappedFrameBuffer implementation.
Move the MappedFrameBuffer and MappedBuffer implementations to their own
files, and fix the sources throughout to use them accordingly.
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Usage of 'method' to refer to member functions comes from Java. The C++
standard uses the term 'function' only. Replace 'method' with 'function'
or 'member function' through the whole code base and documentation.
While at it, fix two typos (s/backeng/backend/).
The BoundMethod and Object::invokeMethod() are left as-is here, and will
be addressed separately.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When the sensor is switched to a mode with a different sensitivity,
the target exposure values need to be adjusted proportionately to
maintain the same image brightness.
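The adjustment itself is a simple ratio; as a sketch, with illustrative
variable names:

  /*
   * Keep image brightness constant across the mode switch: a mode that is
   * twice as sensitive needs only half the exposure. For example, 10 ms at
   * sensitivity 1.0 becomes 5 ms at sensitivity 2.0.
   */
  double ratio = lastModeSensitivity / newModeSensitivity;
  targetTotalExposure *= ratio;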
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
We use the CamHelper class to initialise the mode sensitivity to the
usual value of 1. The CamHelper's GetModeSensitivity method can be
overridden to implement a different behaviour for sensors that require it.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add type safety by turning the MapFlag and OpenModeFlag enums into enum
classes.
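A sketch of the change for MapFlag; the enumerator values are assumptions
and the flag-combining helpers are left out.

  enum class MapFlag {
          NoOption = 0,
          Read = 1 << 0,
          Write = 1 << 1,
          ReadWrite = Read | Write,
  };

  /*
   * Callers must now use the scoped name, e.g. MapFlag::Read, and the
   * compiler rejects accidental conversions to and from plain integers.
   */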
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
This commit adds a tuning file for the 12MP imx378 sensor. The sensor
actually shares the same driver (and CamHelper) as the imx477 so only
a new tuning file is required. The default choice of imx477.json can
be overridden by pointing LIBCAMERA_RPI_TUNING_FILE at a version of
the new imx378.json file.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Return controls::FrameDuration through the per-frame Request metadata.
The frame duration is obtained from either the value in DelayedControls
or, preferably, the value parsed from the embedded data buffer.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a CameraSensorHelperOv8865 class. The gain coefficients are gleaned
from the datasheet; the lowest 7 bits are reported there as fractional
bits, so real gain is val/128.
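A quick sanity check of that relationship (plain arithmetic, not the
helper's actual code):

  /* OV8865: the low 7 bits of the gain code are fractional. */
  double gain = static_cast<double>(gainCode) / 128.0;

  /* Examples: code 0x080 (128) -> 1.0x, 0x0C0 (192) -> 1.5x, 0x100 (256) -> 2.0x. */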
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Extend the CameraSensorHelper factory with support for the IMX258
sensor found in the Nautilus Chromebook.
The values are read by manually tweaking the IMX258 kernel driver.
The IMX258 kernel driver hints that the sensor may be compatible
with the MIPI CCS specification, as the register set matches.
The values for the analogue gain constants are obtained by reading the
corresponding register indexes, as described in the MIPI CCS v1.1
specification.
The values have further been confirmed by Dave Stevenson as being
those specified in the datasheet.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Some values for array sizes differ between v10 and v12, so set them
in init() and adjust the auto exposure algorithm to use the AE value
set there.
Signed-off-by: Heiko Stuebner <heiko.stuebner@theobroma-systems.com>
Reviewed-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The datasheet states that the low 7 bits are fraction bits.
real_gain = GainCode/128
For example, 0x080 is 1x gain, 0x100 is 2x gain.
It means that we should have m0=1 and c1=128.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Extend the CameraSensorHelper factory with support for an
OV13858 sensor as found in the Soraka Chromebook.
The datasheet states that the low 7 bits are fraction bits, so the gain
code is calculated as gainCode = 128 * gain.
According to the formula, it means m0=1 and c1=128.
m1 then has to be 0, and c0=0.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The Metadata class defines a shared_ptr named MetadataPtr. It is not
used anywhere in the source code, so remove it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
|
|
s/DefaultAnalogueGain/defaultAnalogueGain/
s/DefaultExposureTime/defaultExposureTime/
Change these for consistency with the other static const variables.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
With the recent change to allow long exposures on the imx477, the existing 100s
limit was not adequate.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Update the imx477 CamHelper to use long exposure modes if needed.
This is done by overriding the CamHelper::GetVBlanking function to return a
frame length (and vblank value) computed using a scaling factor when the
value would be larger than what the sensor register could otherwise hold.
CamHelperImx477::Prepare is also overridden to ensure that the "device.status"
metadata returns the right value if the long exposure scaling factor is used.
The scaling factor is unfortunately not returned in the metadata.
With the current imx477 driver, we can achieve a maximum exposure time of approx
127 seconds since the HBLANK control is read-only.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Store the frame length into the DeviceStatus struct. The value is extracted
from embedded data when available, or calculated from the VBLANK value passed
from DelayedControls otherwise.
Update imx477 and imx219 CamHelper classes to extract the frame length from the
embedded data buffer.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add an operator<< overload to log all fields in DeviceStatus, and remove the
manual logging statements in the IPA and CamHelper.
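Such an overload could look roughly like the following sketch; the
DeviceStatus field names are assumptions for illustration.

  #include <cstdint>
  #include <ostream>

  /* Field names here are assumptions, not the actual DeviceStatus layout. */
  struct DeviceStatus {
          double shutterSpeed;
          double analogueGain;
          uint32_t frameLength;
  };

  std::ostream &operator<<(std::ostream &out, const DeviceStatus &d)
  {
          return out << "Exposure: " << d.shutterSpeed
                     << " Frame length: " << d.frameLength
                     << " Gain: " << d.analogueGain;
  }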
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The constructor sets all fields to 0. This replaces the memset(0) and default
value initialisation usage in the agc and lux controllers respectively.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
This header file is no longer C compatible, so remove the extern "C"
declaration.
Replace C++ style comments with C style based on libcamera guidelines.
There are no functional changes in this commit.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
libcamera names header files based on the classes they define. The
buffer.h file is an exception. Rename it to framebuffer.h.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
A few lines needed to be wrapped to stay under 80 columns.
Remove some unneeded documentation and fix minor typos.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The [[maybe_unused]] in the IMX477 camera helper isn't needed. This had
been pointed out by Naush during review, but I failed to update the code
before pushing.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
|
|
Instead of having each CamHelper subclass the MdParserSmia, change the
implementation of MdParserSmia to be more generic. The MdParserSmia is now
given a list of registers to search for, and helper functions are used to
compute exposure lines and gain codes from these registers.
Update the imx219 and imx477 CamHelpers to use this new mechanism.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The derived CamHelper class now allocates a metadata parser object through a
unique_ptr that is passed to the base class constructor. This automates the
lifetime management of the parser object.
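The ownership transfer could look roughly like the following sketch; the
types are reduced to a bare minimum and are not the actual Raspberry Pi
class layout.

  #include <memory>
  #include <utility>

  struct MdParser {
          virtual ~MdParser() = default;
  };

  struct MdParserImx477 : MdParser {
  };

  class CamHelper
  {
  public:
          explicit CamHelper(std::unique_ptr<MdParser> parser)
                  : parser_(std::move(parser)) {}
          virtual ~CamHelper() = default;

  private:
          /* Freed automatically when the helper is destroyed. */
          std::unique_ptr<MdParser> parser_;
  };

  class CamHelperImx477 : public CamHelper
  {
  public:
          CamHelperImx477() : CamHelper(std::make_unique<MdParserImx477>()) {}
  };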
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The necessary tuning file and CamHelper are added for the ov9281 sensor.
The ov9281 is a 1280x800 monochrome global shutter sensor. To enable
it, please add
dtoverlay=ov9281
to the /boot/config.txt file and reboot the Pi.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Warnings about the lack of AWB status results are demoted to just
"Debug". With monochrome sensors becoming more common, these would
otherwise overwhelm the console output, and in practice nothing is
really lost, as it is normally very evident when AWB is failing to run.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In order for the CameraSensorHelper to be instantiated, we need to find
its factory using the camera sensor model name stored in
IPASettings::sensorModel. As we don't need to do this at each configure
call (the sensor does not change in between), implement the lookup in the
init() call of IPAIPU3.
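A sketch of what the init-time lookup could look like; the factory's
create() function, the member name and the surrounding types are
assumptions.

  int IPAIPU3::init(const IPASettings &settings)
  {
          /* Resolve the sensor-specific helper once instead of at every configure(). */
          camHelper_ = CameraSensorHelperFactory::create(settings.sensorModel);
          if (!camHelper_)
                  return -ENODEV;

          return 0;
  }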
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
For various sensor operations, sensor-specific computations may be
needed, for instance for the analogue gain or the vertical blanking.
This commit introduces a new camera sensor helper in libipa which aims
to solve this specific issue.
It is based on the MIPI Alliance Specification for Camera Command Set
(CCS) and implements, for now, only the analogue "Global gain" mode.
Setting analogue gain for a specific sensor is not a straightforward
operation, as one needs to know how the gain is calculated for it.
Three helpers are created in this patch: imx219, ov5670 and ov5693.
Adding a new sensor is pretty straightforward, as one only needs to
implement the sub-class for it and register that class with the
CameraSensorHelperFactory.
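For reference, the CCS "global gain" model expresses the analogue gain as
a rational function of the gain code, gain(x) = (m0 * x + c0) / (m1 * x + c1).
A small standalone sketch, with example constants taken from the
well-known IMX219 and OV gain laws rather than from the helper's actual
tables:

  /* MIPI CCS linear ("global gain") model. */
  double ccsAnalogueGain(double m0, double c0, double m1, double c1, double code)
  {
          return (m0 * code + c0) / (m1 * code + c1);
  }

  /*
   * IMX219-style sensors: m0 = 0, c0 = 256, m1 = -1, c1 = 256,
   * i.e. gain = 256 / (256 - code), so code 0 -> 1.0x, code 128 -> 2.0x.
   * OV-style sensors with a Q7 gain code: m0 = 1, c0 = 0, m1 = 0, c1 = 128,
   * i.e. gain = code / 128, so code 128 -> 1.0x, code 256 -> 2.0x.
   */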
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Headers which must not be exposed as part of the public libcamera API
should include base/private.h.
Any interface which includes the private.h header will only be able to
build if the libcamera_private dependency is used (or the
libcamera_base_private dependency directly).
Build targets which are intended to use the private APIs will use
libcamera_private to handle the automatic definition of the inclusion
guard.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Move the Span helper to the base library, and adjust the Doxygen
exclusion as well.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The File abstraction is a base helper and not part of the libcamera
API. Move it to the base library to allow usage by users of the base
library.
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Move the functionality for the following components to the new
base support library:
- BoundMethod
- EventDispatcher
- EventDispatcherPoll
- Log
- Message
- Object
- Signal
- Semaphore
- Thread
- Timer
While it would be preferable to see this split into one component move
per commit, these components are all interdependent, which leaves us with
one big change performing the move for all of them.
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Move the utils functionality to the libcamera/base library.
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
std::chrono::duration is wrapped quite conveniently by the
libcamera::utils::Duration class. Port IPAIPU3 to use it for
duration-type entities (such as the exposure time), so that it becomes
consistent with the rest of the codebase.
The commit doesn't introduce any functional changes.
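A sketch of what the ported code could look like, assuming utils::Duration
converts from std::chrono durations; the header path and the line duration
value are assumptions.

  #include <cstdint>

  #include <libcamera/internal/utils.h>

  using namespace std::literals::chrono_literals;
  using libcamera::utils::Duration;

  /* Exposure handled as a typed duration instead of a raw microsecond count. */
  Duration exposure = 10ms;
  Duration lineDuration = 29.6us;
  uint32_t exposureLines = static_cast<uint32_t>(exposure / lineDuration);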
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|