|
Now that the interface is properly used by the AGC class, move it into
ipa::ipu3::algorithms and let the loops do the calls.
As the exposure_ and gain_ values need to be exchanged by passing
them through the FrameContext, use the calculated values in the
setControls() function to ease the reading.
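As an illustration of the resulting flow only (the member names
context_.frameContext.agc.exposure/gain and sensorCtrls_ are assumptions
of this sketch, not necessarily the exact identifiers in the patch):

    /* AGC publishes its results through the per-frame context... */
    void Agc::process(IPAContext &context, const ipu3_uapi_stats_3a *stats)
    {
        /* ... metering on the statistics ... */
        context.frameContext.agc.exposure = exposure_;
        context.frameContext.agc.gain = gain_;
    }

    /* ...and setControls() reads them back when programming the sensor. */
    void IPAIPU3::setControls(unsigned int frame)
    {
        ControlList ctrls(sensorCtrls_);
        ctrls.set(V4L2_CID_EXPOSURE,
                  static_cast<int32_t>(context_.frameContext.agc.exposure));
        ctrls.set(V4L2_CID_ANALOGUE_GAIN,
                  static_cast<int32_t>(context_.frameContext.agc.gain));
        /* The control list is then sent to the pipeline handler. */
    }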
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that the interface is properly used by the AWB class, move it into
ipa::ipu3::algorithms and let the loops do the calls.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In preparation for using the AGC through the new algorithm interfaces,
convert the existing code to use the new function types.
Now that the process call is rewritten, re-enable the compiler flag to
warn when a function declaration hides virtual functions from a base class
(-Woverloaded-virtual).
We never use converged_, so remove its declaration. The controls may
not need to be updated on every call, but that decision should be made
on the context side, for instance by using a lock status in the Agc
structure, and not by a specific call.
As the params_ local variable is not useful anymore, remove it here
too.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When the stats are received, pass them with the context to the existing
AWB algorithm. IPAFrameContext now has a new structure to store the
gains calculated by the AWB algorithm.
When an EventFillParams event is received, call prepare() and set the new
gains accordingly in the params structure.
There is no longer a need for the IPU3Awb::initialise() function, as
the params are always set in prepare().
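Purely as a sketch of the data flow described above (the
frameContext.awb.gains members are illustrative assumptions):

    void Awb::process(IPAContext &context, const ipu3_uapi_stats_3a *stats)
    {
        /* Estimate the white balance gains from the fresh statistics... */
        calculateWBGains(stats);

        /* ...and publish them through the per-frame context. */
        context.frameContext.awb.gains.red = asyncResults_.redGain;
        context.frameContext.awb.gains.green = asyncResults_.greenGain;
        context.frameContext.awb.gains.blue = asyncResults_.blueGain;
    }

    void Awb::prepare(IPAContext &context, ipu3_uapi_params *params)
    {
        /*
         * On EventFillParams, translate the gains stored in the frame
         * context into the ImgU parameter buffer.
         */
    }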
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Introduce a new algorithm to manage the tone mapping handling of the
IPU3.
The initial algorithm is chosen to configure the gamma contrast
curve, which moves the implementation out of AWB for simplicity. As it
is initialised with a default gamma value of 1.1, there is no need to
use the default table at initialisation anymore.
This demonstrates how to use the process() call when the
EventStatReady event comes in. The function calculates the LUT in the
context of a frame, and when prepare() is called, the parameters are
filled with the updated values.
AGC is modified to take the new process interface into account.
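A condensed sketch of this split follows; the gamma LUT structure and
parameter field names are assumptions of the example rather than
verified ipu3_uapi definitions:

    void ToneMapping::process(IPAContext &context,
                              const ipu3_uapi_stats_3a *stats)
    {
        /* Recompute the gamma LUT for this frame, using the default 1.1. */
        auto &lut = context.frameContext.toneMapping.gammaCorrection;
        const unsigned int size = std::size(lut.lut);

        for (unsigned int i = 0; i < size; i++) {
            double x = static_cast<double>(i) / (size - 1);
            lut.lut[i] = UINT16_MAX * std::pow(x, 1.0 / gamma_);
        }
    }

    void ToneMapping::prepare(IPAContext &context, ipu3_uapi_params *params)
    {
        /* Copy the per-frame LUT into the ImgU parameters. */
        params->acc_param.gamma.gc_lut =
            context.frameContext.toneMapping.gammaCorrection;
        params->use.acc_gamma = 1;
    }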
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Implement a new modular framework for algorithms, with a common
context structure that is passed to each algorithm through a common
API.
This patch:
- removes all the local references from IPAIPU3 and uses IPAContext
instead
- implements the list of algorithm pointers and the loop over them at
configure time
- loops in fillParams over the prepare() call of each algorithm in the
list
- loops in prepareStats over the process() call of each algorithm in
the list
The resulting structure is sketched below.
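In rough outline (a sketch only, with the function names taken from the
description above and the bodies abridged):

    int IPAIPU3::configure(const IPAConfigInfo &configInfo)
    {
        for (auto const &algo : algorithms_) {
            int ret = algo->configure(context_, configInfo);
            if (ret)
                return ret;
        }

        return 0;
    }

    void IPAIPU3::fillParams(unsigned int frame, ipu3_uapi_params *params)
    {
        for (auto const &algo : algorithms_)
            algo->prepare(context_, params);
    }

    void IPAIPU3::prepareStats(const ipu3_uapi_stats_3a *stats)
    {
        for (auto const &algo : algorithms_)
            algo->process(context_, stats);
    }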
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Introduce three functions in the Algorithm class to manage algorithms:
- configure(), called only when the IPA is configured
- prepare(), called on the EventFillParams event for each frame, when
the request is queued
- process(), called on the EventStatReady event at each frame
completion, when the statistics have been generated.
The existing AGC implementation already has a function named process(),
though it has different arguments. Adding the new virtual process()
interface causes a compiler warning due to the AGC implementation
overloading a virtual function, even though the overload can be resolved
correctly.
Temporarily disable the warning in this commit to maintain bisection
until the AGC is converted to the new interface.
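A rough sketch of the resulting interface (the default no-op
implementations are an assumption of this example):

    class Algorithm
    {
    public:
        virtual ~Algorithm() = default;

        /* Called once, when the IPA is configured. */
        virtual int configure(IPAContext &context, const IPAConfigInfo &configInfo)
        {
            return 0;
        }

        /* Called per frame on EventFillParams, when the request is queued. */
        virtual void prepare(IPAContext &context, ipu3_uapi_params *params)
        {
        }

        /* Called per frame on EventStatReady, once statistics are available. */
        virtual void process(IPAContext &context, const ipu3_uapi_stats_3a *stats)
        {
        }
    };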
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
An increasing amount of data and information needs to be shared between
the components that build up to implement image processing algorithms.
Create a context structure which will allow us to work towards calling
algorithms in a modular way, and sharing information between the modules.
The IPA context comprises a global context set at configure time
(IPASessionConfiguration) and a per-frame context (IPAFrameContext)
used while streaming.
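The rough shape of the context, with illustrative members only (the
actual fields are added incrementally by later patches):

    /* Session-wide data, set once at configure() time. */
    struct IPASessionConfiguration {
        struct {
            uint32_t minExposure;
            uint32_t maxExposure;
        } agc;
    };

    /* Per-frame data, updated while streaming. */
    struct IPAFrameContext {
        struct {
            uint32_t exposure;
            double gain;
        } agc;

        struct {
            struct {
                double red;
                double green;
                double blue;
            } gains;
        } awb;
    };

    struct IPAContext {
        IPASessionConfiguration configuration;
        IPAFrameContext frameContext;
    };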
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The abstract Algorithm class was originally placed in libipa as an
attempt to define a generic algorithm container. This was a little
optimistic and pushed a bit too far, too early.
Move the Algorithm class into the IPU3 IPA, which is the only user of
the class, as we adapt it to support modular algorithm components for
the IPU3.
Not documenting the namespace may cause issues with Doxygen in libipa.
The file libipa.cpp is thus created as an empty file for now, but we
can leverage it in the future to add more global libipa documentation,
and possibly code too.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
IPCMessage::payload() converts the IPCMessage into an IPCUnixSocket
payload. However, if the IPCMessage is constructed with one of the
following constructors:
IPCMessage::IPCMessage()
IPCMessage::IPCMessage(uint32_t cmd)
IPCMessage::IPCMessage(const Header &header)
the data_ vector of the IPCMessage is empty and uninitialised. In that
case, IPCMessage::payload() will try to memcpy() an empty data_
vector, which can lead to invoking memcpy() with a nullptr parameter,
flagged by the address sanitizer. Add a non-empty data_ vector check
to avoid it.
The issue was noticed by manually running a test exercising the vimc
IPA code paths in isolated mode. It only shows up when the test is
compiled with the -Db_sanitize=address,undefined meson built-in
option:
ipc_pipe.cpp:110:8: runtime error: null pointer passed as argument 2, which is declared to never be null
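A simplified sketch of the fix (the function is abridged and the file
descriptor handling is elided):

    IPCUnixSocket::Payload IPCMessage::payload() const
    {
        IPCUnixSocket::Payload payload;

        payload.data.resize(sizeof(Header) + data_.size());
        memcpy(payload.data.data(), &header_, sizeof(Header));

        /*
         * memcpy() with a null source pointer is undefined behaviour even
         * for a zero length, so skip the copy when data_ is empty.
         */
        if (data_.size())
            memcpy(payload.data.data() + sizeof(Header),
                   data_.data(), data_.size());

        /* ... file descriptors are appended as before ... */

        return payload;
    }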
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
In IPCUnixSocket, a payload can be sent or received with an empty fd
vector, which leads to passing a nullptr to memcpy() in both
sendData() and recvData(). Add a null check for the fd vector's data
pointer to avoid invoking memcpy() with nullptr.
The issue was noticed by manually running a test exercising the vimc
IPA code paths in isolated mode. It only shows up when the test is
compiled with the -Db_sanitize=address,undefined meson built-in
option:
ipc_unixsocket.cpp:268:8: runtime error: null pointer passed as argument 2, which is declared to never be null
ipc_unixsocket.cpp:312:8: runtime error: null pointer passed as argument 1, which is declared to never be null
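The guard follows the same pattern in both directions; a condensed
sketch of the sendData() side (the cmsg setup is elided and the local
names are approximations):

    int IPCUnixSocket::sendData(const void *buffer, size_t length,
                                const int32_t *fds, unsigned int num)
    {
        /* ... iovec / msghdr / cmsghdr setup ... */

        /*
         * Only copy the fd array when it is non-empty, as memcpy() must
         * not be invoked with a null source pointer.
         */
        if (fds && num)
            memcpy(CMSG_DATA(cmsg), fds, num * sizeof(int32_t));

        /* ... sendmsg() and error handling ... */
        return 0;
    }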
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Regarding (de)serialization in isolated IPA calls, we have four layers:
- struct
- byte vector + fd vector
- IPCMessage
- IPC payload
The proxy handles the upper three layers (with help from the
IPADataSerializer), and passes an IPCMessage to the IPC mechanism
(implemented as an IPCPipe), which sends an IPC payload to its worker
counterpart.
When a FileDescriptor is involved, previously it was only a
FileDescriptor in the first layer; in the lower three it was an int. To
reduce the risk of potential fd leaks in the future, keep the
FileDescriptor as-is throughout the upper three layers. Only the IPC
mechanism will deal with ints, if it so wishes, when it does the actual
IPC. IPCPipeUnixSocket does deal with ints for sending fds, so the
conversion between IPCMessage and IPCUnixSocket::Payload converts
between FileDescriptor and int.
Additionally, change the data portion of the serialized form of
FileDescriptor to a 32-bit unsigned integer, for alignment purposes
and in preparation for conversion to an index into the fd array.
Also update the deserializer of FrameBuffer::Plane accordingly.
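The boundary conversion then looks roughly like this; the helper below
is a hypothetical illustration, not the actual IPCPipeUnixSocket code:

    /* Hypothetical helper: IPCMessage data + FileDescriptors -> Payload. */
    IPCUnixSocket::Payload toPayload(const std::vector<uint8_t> &data,
                                     const std::vector<FileDescriptor> &fds)
    {
        IPCUnixSocket::Payload payload;
        payload.data = data;

        /*
         * Only the IPC mechanism deals with raw ints: unwrap each
         * FileDescriptor into its numerical fd for transmission.
         */
        for (const FileDescriptor &fd : fds)
            payload.fds.push_back(fd.fd());

        return payload;
    }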
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The bcm2835-isp.h is included with quotes rather than
angle brackets.
Quoted includes are reserved for internal headers, while the
linux/bcm2835-isp.h header is exported by the Linux kernel.
Fix the inclusion type.
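In other words (sketch of the one-line change):

    /* Before: #include "linux/bcm2835-isp.h" */
    #include <linux/bcm2835-isp.h>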
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
There is no point in recording the sensor's timestamp when the
V4L2VideoDevice has marked the frame buffers with
FrameMetadata::FrameCancelled (which happens when the streams are
stopped). The metadata is mostly invalid for cancelled buffers, hence
setting a timestamp on invalid metadata does not make sense (however,
some metadata can still be considered valid, e.g. to return the cause
of failure through metadata on cancelled buffers).
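A minimal sketch of the resulting check in a buffer completion handler;
the handler name and surrounding code are illustrative:

    void ExamplePipelineData::bufferReady(FrameBuffer *buffer)
    {
        Request *request = buffer->request();

        /*
         * Cancelled buffers carry mostly invalid metadata, so don't
         * record the sensor timestamp for them.
         */
        if (buffer->metadata().status != FrameMetadata::FrameCancelled)
            request->metadata().set(controls::SensorTimestamp,
                                    buffer->metadata().timestamp);

        /* ... complete the buffer and request as before ... */
    }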
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Replace manual static casts from the PipelineHandler pointer to a
derived class pointer with helper functions in the camera data classes.
This simplifies code accessing the pipeline from the camera data.
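The pattern boils down to a small accessor per pipeline handler; a
hedged sketch with example class names:

    class IPU3CameraData : public CameraData
    {
    public:
        /* Replaces scattered static_cast<>s at the call sites. */
        PipelineHandlerIPU3 *pipe()
        {
            return static_cast<PipelineHandlerIPU3 *>(pipe_);
        }
    };

    /* Usage (someIPU3Function() is a placeholder): */
    /* data->pipe()->someIPU3Function(); */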
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The PipelineHandler controls() and properties() functions are only used
by the Camera class. Now that the controls and properties are stored in
the Camera::Private class, we can drop those functions and access the
private data directly in Camera::controls() and Camera::properties().
Suggested-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The CameraData class isn't used anymore. Drop it.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
As part of the effort to remove the CameraData class, migrate the
pipeline handler-specific camera data from CameraData to the
Camera::Private class.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
As part of the effort to remove the CameraData class, migrate the
pipeline handler-specific camera data from CameraData to the
Camera::Private class.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
As part of the effort to remove the CameraData class, migrate the
pipeline handler-specific camera data from CameraData to the
Camera::Private class.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
As part of the effort to remove the CameraData class, migrate the
pipeline handler-specific camera data from CameraData to the
Camera::Private class.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
As part of the effort to remove the CameraData class, migrate the
pipeline handler-specific camera data from CameraData to the
Camera::Private class.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
As part of the effort to remove the CameraData class, migrate the
pipeline handler-specific camera data from CameraData to the
Camera::Private class.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Improve the Camera::Private documentation by fixing minor spelling or
style issues.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
With pipeline handlers now being able to subclass Camera::Private, start
the migration from CameraData to Camera::Private by moving the members
of the base CameraData class. The controlInfo_, properties_ and pipe_
members are duplicated for now, to allow migrating pipeline handlers one
by one.
As the Camera::Private class is now properly documented, don't exclude
it from documentation generation.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
In order to allow subclassing Camera::Private in pipeline handlers, pass
the pointer to the private data to the Camera constructor, and to the
Camera::createCamera() function.
The Camera::Private id_ and streams_ members now need to be initialized
by the Camera constructor instead of the Camera::Private constructor, to
allow storage of the streams in a pipeline handler-specific subclass of
Camera::Private.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The Extensible constructor takes a pointer to a Private instance, whose
lifetime it then manages. Make this explicit in the API by passing the
pointer as a std::unique_ptr<Private>.
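A caller-side sketch, using a made-up Extensible subclass for
illustration:

    /* Ownership of the Private instance is now passed explicitly. */
    MyObject::MyObject()
        : Extensible(std::make_unique<Private>())
    {
    }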
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
When the stream is stopped, the V4L2VideoDevice sends back all
the queued buffers with FrameMetadata::FrameCancelled status.
It is the responsibility of the pipeline handler to handle these
cancelled buffers, and VIMC is currently missing this handling path.
As FrameMetadata::FrameCancelled is set when the stream is stopped,
we can be sure that no more queuing or re-use of the requests will
happen. Hence, cancel all the requests' buffers and force their
completion with completeBuffer().
The issue is caught by gstreamer_single_stream_test.cpp running with
vimc. With the meson built-in option
'-Db_sanitize=address,undefined', the following was observed:
==118003==ERROR: AddressSanitizer: heap-use-after-free on address 0x60e000037108 at pc 0x7f225160c9ac bp 0x7f224a47b620 sp 0x7f224a47b618
READ of size 4 at 0x60e000037108 thread T1
#0 0x7f225160c9ab in libcamera::Request::sequence() const ../include/libcamera/request.h:55
#1 0x7f22518297aa in libcamera::VimcCameraData::bufferReady(libcamera::FrameBuffer*) ../src/libcamera/pipeline/vimc/vimc.cpp:577
#2 0x7f225183b1ef in libcamera::BoundMethodMember<libcamera::VimcCameraData, void, libcamera::FrameBuffer*>::activate(libcamera::FrameBuffer*, bool) ../include/libcamera/base/bound_method.h:194
#3 0x7f22515cc91f in libcamera::Signal<libcamera::FrameBuffer*>::emit(libcamera::FrameBuffer*) ../include/libcamera/base/signal.h:126
#4 0x7f22515c3305 in libcamera::V4L2VideoDevice::streamOff() ../src/libcamera/v4l2_videodevice.cpp:1605
#5 0x7f225181f345 in libcamera::PipelineHandlerVimc::stop(libcamera::Camera*) ../src/libcamera/pipeline/vimc/vimc.cpp:365
VimcCameraData::bufferReady() is still invoked after the stream is
stopped, primarily due to vimc's lack of handling of
FrameMetadata::FrameCancelled in its pipeline handler.
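The added handling is essentially an early-out for cancelled buffers,
roughly as follows (member and function names approximate the vimc
pipeline handler and are not verbatim):

    void VimcCameraData::bufferReady(FrameBuffer *buffer)
    {
        Request *request = buffer->request();

        /*
         * A cancelled buffer means the stream is being stopped: cancel
         * and complete all of the request's buffers, then complete the
         * request, without touching the per-frame state.
         */
        if (buffer->metadata().status == FrameMetadata::FrameCancelled) {
            for (auto it : request->buffers()) {
                FrameBuffer *b = it.second;
                b->cancel();
                pipe_->completeBuffer(request, b);
            }

            pipe_->completeRequest(request);
            return;
        }

        /* Normal path: fill metadata and complete as usual. */
    }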
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Plumb the VIMC mojo interface through to enable buffer passing.
VIMC does not have parameter or statistics buffers, but we can
mimic the typical case of passing IPA buffers from the pipeline
handler to the IPA using mock buffers. The mock IPA buffers are
FrameBuffers which are dmabuf-backed (in other words, mmap()able
through MappedFrameBuffer inside the IPA).
This commit shows:
- Passing the parameter buffer from the pipeline handler to
the IPA through functions defined in the mojom interface.
- Passing the request controls ControlList to the IPA.
Any tests using VIMC will now exercise the IPA paths. Tests running
in isolated mode will help us test the IPA IPC code paths, especially
around (de)serialization of the data passed from pipeline handlers to
the IPA. Future IPA interface tests can simply extend the vimc mojom
interface to test a specific use case as required.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The VIMC pipeline handler has dmabuf-backed mock FrameBuffers
specifically targeted at mimicking IPA buffers (parameters and
statistics). Map these mock buffers into the VIMC IPA to enable
exercising the IPA IPC code paths, which are common to various
platforms, from our test suite.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
VIMC is a virtual test driver that doesn't have the statistics or
parameters buffers that are typically passed from a pipeline
handler to its platform IPA. To increase the test coverage going
forward, we can at least mimic the typical interaction between a
pipeline handler and an IPA.
Hence, create simple (single-plane) dmabuf-backed FrameBuffers,
which can act as mock IPA buffers and can be memory mapped (mmap)
in the VIMC IPA. To create these buffers, temporarily hijack the
output video node and configure it with a V4L2DeviceFormat. Buffers
can then be exported from the output video node using
V4L2VideoDevice::exportBuffers(). These buffers will serve as mock
IPA buffers in subsequent commits.
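A condensed sketch of the buffer-creation step (the format, size, count
and helper name are illustrative, and error handling is abridged):

    int PipelineHandlerVimc::allocateMockIPABuffers(VimcCameraData *data)
    {
        constexpr unsigned int kBufCount = 2;

        /* Temporarily configure the output video node... */
        V4L2DeviceFormat format;
        format.fourcc = data->video_->toV4L2PixelFormat(formats::BGR888);
        format.size = Size(160, 120);

        int ret = data->video_->setFormat(&format);
        if (ret < 0)
            return ret;

        /* ...and export dmabuf-backed buffers to act as mock IPA buffers. */
        return data->video_->exportBuffers(kBufCount, &data->mockIPABufs_);
    }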
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
As part of an effort to make the vimc IPA usable for testing, extend it
with a configure function. The configuration is currently ignored by the
IPA.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
"NOIR" modules are ones that have had the IR filters removed but are
otherwise identical. The same tuning can be used as for the regular
version except that the colour calibration supplied to the AWB
algorithm no longer works. Instead we need to switch the algorithm to
its basic "grey world" method.
Users with "NOIR" modules can switch to the matching "xxx_noir.json"
tuning file by using the LIBCAMERA_RPI_TUNING_FILE environment
variable.
Signed-off-by: David Plowman <david.plowman@raspberypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Tidy up the include directives in the IPU3 IPA module a bit.
In detail:
- ipu3.cpp is missing inclusions for:
std::abs from <cmath>
std::map from <map>
std::min/max from <algorithm>
std::numeric_limits from <limits>
std::unique_ptr from <memory>
std::vector from <vector>
and does not require <sys/mman.h>
- ipu3_agc has two unused inclusions in the header file and one in the
cpp file, and is missing <chrono> for std::literals::chrono_literals
- ipu3_awb is missing <algorithm> for std::sort and does not use
<numeric> or <unordered_map>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
All the IPU3 Camera controls are currently initialized by the pipeline
handler which initializes them using the camera sensor configuration and
platform specific requirements.
However, some controls are better initialized by the IPA, which
might, for example, cap the exposure times and frame durations to the
constraints of its algorithm implementations.
Also, moving forward, the IPA should register controls to report its
capabilities, for example the ability to enable/disable 3A algorithms
on request.
Move the existing controls initialization to the IPA, by providing
the sensor configuration and its controls to the IPU3IPA::init()
function, which initializes controls and returns them to the pipeline
through an output parameter.
The existing controls initialization has been copied verbatim from
the pipeline handler to the IPA, apart from a few line break
adjustments, and the resulting Camera control values are not changed.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Introduce a new field in the controls serialization protocol to
allow discerning which ControlIdMap a ControlInfoMap refers to.
The newly introduced IdMapType enumeration describes the possible
info maps:
- Either the globally available controls::controls and
properties::properties maps, which are valid across IPC boundaries
- A ControlIdMap created locally by the V4L2 device, which is not valid
across the IPC boundaries
At de-serialization time the idMapType field is inspected and
- If the idmap is a globally defined one, there's no need to create
new ControlId instances when populating the de-serialized
ControlInfoMap. Use the globally available map to retrieve the
ControlId reference and use it.
- If the idmap is only available locally, create new ControlId
instances as was done before this patch.
As a direct consequence, this change allows us to perform lookup by
ControlId reference on de-serialized ControlIdMap that refers to the
libcamera defined controls::controls and properties::properties.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
ControlInfoMap does not have an associated ControlIdMap, but rather
creates one with the generateIdMap() function at creation time.
As a consequence, when a ControlInfoMap needs to be de-serialized, all
the ControlId instances it contains are created by the deserializer,
which cannot discern whether the controls the ControlIdMap refers to
are the global libcamera controls (and properties) or instances local
to the V4L2 device that first initialized the controls.
The ControlId instances stored in a de-serialized map are therefore
always newly created entities, preventing lookup by ControlId
reference on a de-serialized ControlInfoMap.
In order to make it possible to use the globally available ControlId
instances whenever possible, create the ControlInfoMap with a
reference to an externally allocated ControlIdMap instead of
generating one internally.
As a consequence, the class constructors take an additional argument,
which might not be pleasant to type in, but enforces the concept that
a ControlInfoMap should be created with controls that are part of the
same id map.
As the ControlIdMap the ControlInfoMap refers to needs to be allocated
externally:
- Use the globally available controls::controls (or
properties::properties) id map when referring to libcamera controls
- The V4L2 device that creates ControlInfoMap by parsing the device's
controls has to allocate a ControlIdMap
- The ControlSerializer that de-serializes a ControlInfoMap has to
create and store the ControlIdMap the de-serialized info map refers to
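In practice the construction sites change along these lines (a sketch
made of fragments, not the exact constructor overloads):

    /* libcamera controls: reference the globally available id map. */
    ControlInfoMap ipaControls(std::move(ctrlMap), controls::controls);

    /* V4L2 device: the device allocates and owns its own id map. */
    controlIdMap_[cid] = v4l2ControlId.get();
    controlInfo_ = ControlInfoMap(std::move(ctrls), controlIdMap_);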
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In the mmap() error handling path, errno is stored but never printed
in the error log. Print it.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Remove the need for callers to reference PROT_READ/PROT_WRITE directly
from <sys/mman.h> by instead exposing the Read/Write mapping options as
flags from the MappedFrameBuffer class itself.
While here, include the <stdint.h> header, which is required for the
uint8_t used in the Plane.
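Call sites then read along these lines (a fragment; the flag and class
names follow the description above):

    #include "libcamera/internal/mapped_framebuffer.h"

    /* Map a buffer for reading and writing without using <sys/mman.h>. */
    MappedFrameBuffer mapped(buffer, MappedFrameBuffer::MapFlag::Read |
                                     MappedFrameBuffer::MapFlag::Write);
    if (!mapped.isValid())
        return;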
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The MappedFrameBuffer is a convenience feature which sits on top of the
FrameBuffer and facilitates mapping it to CPU accessible memory with
mmap.
This implementation is internal and currently sits in the same
internal files as the internal FrameBuffer, thus exposing those
internals to users of the MappedFrameBuffer implementation.
Move the MappedFrameBuffer and MappedBuffer implementations to their
own files, and fix the sources throughout to use them accordingly.
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Remove leftover inclusions of the sys/mman header file.
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Usage of 'method' to refer to member functions comes from Java. The C++
standard uses the term 'function' only. Replace 'method' with 'function'
or 'member function' through the whole code base and documentation.
While at it, fix two typos (s/backeng/backend/).
The BoundMethod and Object::invokeMethod() are left as-is here, and will
be addressed separately.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
To capture raw frames, the ImgU isn't needed. However, to implement
auto-exposure, we do need to configure the IPA, since it sets up the
sensor controls (exposure, vblank and so on) for the capture.
One cannot simply configure the IPA without the ImgU, as the
parameters and statistics buffers passed to the IPA are actually
managed by the ImgU.
Until we prepare and set up the ImgU to run an internal queue for
raw-only camera configurations, disallow this configuration and
report it as invalid.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
cameraBuffer in the private constructor is unused. Mark it as such.
Fixes: 33dd4fab9d39 ("libcamera: base: class: Don't pass Extensible pointer to Private constructor")
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Propagate the requested test pattern mode to the libcamera::Camera
through libcamera::Request, and also set the Android metadata to the
test pattern mode contained in the completed Request.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When the sensor is switched to a mode with a different sensitivity,
the target exposure values need to be adjusted proportionately to
maintain the same image brightness.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
We use the CamHelper class to initialise the mode sensitivity to the
usual value of 1. The CamHelper's GetModeSensitivity method can be
redefined to implement a different behaviour for sensors that require
it.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use the KMSSink class to display the viewfinder stream, if any, through
DRM/KMS. The output connector is selected through the new -D/--display
argument.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Not all display controllers support enabling the display without any
active plane. Delay display enabling to the first frame.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a KMSSink class to display framebuffers through the DRM/KMS API.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|