|
Add camera sensor properties for the Hynix hi846 sensor. The part is
also called YACG4D0C9SHC and a datasheet can be found at
https://product.skhynix.com/products/cis/cis.go
This is the selfie camera in the Librem 5 phone.
Signed-off-by: Martin Kepplinger <martin.kepplinger@puri.sm>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Usually, .cpp files contain a 'using namespace libcamera;' directive,
which makes it unnecessary to spell out the libcamera namespace
explicitly in certain places.
While at it, a small typo in a comment was noticed and fixed as
part of this patch.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Replace the open-coded PixelFormat lookup with the
V4L2PixelFormat::toPixelFormat() helper function. This simplifies the
implementation.
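For reference, the resulting lookup is roughly (variable names
illustrative):
  /* Convert a V4L2 4CC to the corresponding libcamera PixelFormat. */
  V4L2PixelFormat v4l2Format(V4L2_PIX_FMT_NV12);
  PixelFormat pixelFormat = v4l2Format.toPixelFormat();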
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Many signals used in internal and public APIs carry the emitter pointer
as a signal argument. This was done to allow slots connected to multiple
signal instances to differentiate between emitters. While starting from
a good intention of facilitating the implementation of slots, it turned
out to be a bad API design as the signal isn't meant to know what it
will be connected to, and thus shouldn't carry parameters that are
solely meant to support a use case specific to the connected slot.
These pointers turn out to be unused in all slots but one. In the only
case where the emitter is needed, it can be obtained by wrapping the slot in a
lambda function when connecting the signal. Do so, and drop the emitter
pointer from all signals.
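For the remaining user, the connection can be written roughly as
follows (names are illustrative, not the actual call site):
  /* Capture the emitter in a lambda instead of receiving it as a
   * signal argument. */
  video->bufferReady.connect(this, [this, video](FrameBuffer *buffer) {
          this->videoBufferReady(video, buffer);
  });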
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
In many cases, the emitter object passed as a pointer from signals to
slots is also available as a class member. Use the class member when
this occurs, to prepare for removal of the emitter object pointer from
signals.
In test/event.cpp, this additionally requires moving the EventNotifier
to a class member.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
It can be useful to connect a signal to a functor, and in particular a
lambda function, while still operating in the context of a receiver
object (to support both object-based disconnection and queued
connections to Object instances).
Add a BoundMethodFunctor class to bind a functor, and a corresponding
Signal::connect() function. There is no corresponding disconnect()
function, as a lambda passed to connect() can't be later passed to
disconnect(). Disconnection typically uses disconnect(T *object), which
will cover the vast majority of use cases.
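Usage is roughly (receiver and slot names hypothetical):
  /* Connect a lambda in the context of an Object-derived receiver, so
   * that object-based disconnection and queued connections still work. */
  signal.connect(receiver, [receiver](int value) {
          receiver->handleValue(value);
  });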
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
When disconnecting a signal from a receiver, it is usually not necessary
to specify the receiver's slot function explicitly, as the signal is
often connected to a single slot for a given receiver. We can thus use a
simpler version of Signal::disconnect() that takes a pointer to the
receiver object only. This reduces code size, as the disconnect()
function is a template function.
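Illustrative before/after (names hypothetical):
  /* Before: the slot must be named explicitly. */
  signal.disconnect(this, &MyClass::slot);

  /* After: disconnect all slots bound to this receiver. */
  signal.disconnect(this);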
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
libcamera isn't supposed to log messages after the logger is destroyed,
as the global logger instance is destroyed after the main() function
returns, and the camera manager is supposed to have been stopped and
destroyed before that.
This rule is difficult to enforce in the V4L2 compat implementation, as
there is no location where we can destroy the camera manager manually
before the logger is destroyed. This results in a use-after-free
condition when the camera manager gets stopped during destruction.
Fix it by not trying to print log messages when the global logger
instance has been destroyed.
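Conceptually the guard boils down to the following sketch (all names
are hypothetical, the actual implementation differs):
  /* All logging goes through a helper that checks a flag set when the
   * global logger is destroyed. */
  static std::atomic<bool> loggerAlive{ true };

  Logger::~Logger()
  {
          loggerAlive = false;
  }

  void logMessage(LogMessage &&msg)
  {
          if (!loggerAlive)
                  return;

          Logger::instance()->write(msg);
  }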
This is a bit of a hack, but hopefully not too bad. There could be race
conditions when using a CameraManager instance that is destroyed as part
of the destruction of global variables (as the V4L2 compat layer does:
it wraps CameraManager in a singleton V4L2CompatManager class and
destroys it when V4L2CompatManager is destroyed), since the
CameraManager thread will still be running when the logger gets
destroyed. This doesn't cause any regression, however, as we already
destroy the logger without any safeguard measure today.
There are other options that could be considered. Forcing destruction
of the logger after the camera manager in the V4L2 compat layer is one
of them, but that turned out to be difficult. For instance, care would
need to be taken *not* to log any message in the mmap() wrapper if the
fd doesn't match a wrapped camera, as mmap() is called very early in
the initialization process, before libcamera and the logger get
initialized. The resulting implementation would likely be fairly
complex.
Another option could be to wrap the logger with a shared pointer, and
keep a reference to it in CameraManager. That's more intrusive, and it's
not clear if it would be worth it.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The SimplePipelineHandler activeCamera_ member pointer is set but never
used. Drop it.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
To use multiple cameras at the same time, a per-camera buffer ready
handler is needed. Move the bufferReady() function connected to the
V4L2VideoDevice bufferReady signal from the SimplePipelineHandler class
to the SimpleCameraData class.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
To use multiple cameras at the same time, each camera instance will need
its own converter. Store the converter in SimpleCameraData, and open it
in init().
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
The cameras created by the same pipeline handler instance may share
hardware resources, prohibiting usage of multiple cameras concurrently.
Implement a heuristic to reserve resources and handle mutual exclusion
in a generic way.
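The heuristic amounts to something like the following sketch (structure
and member names are hypothetical):
  /* A camera may only start if every entity in its pipeline is free or
   * already owned by it. */
  bool SimplePipelineHandler::acquireResources(SimpleCameraData *data)
  {
          /* First verify that no entity is owned by another camera... */
          for (MediaEntity *entity : data->pipelineEntities()) {
                  auto it = owners_.find(entity);
                  if (it != owners_.end() && it->second != data)
                          return false;
          }

          /* ...then reserve all of them for this camera. */
          for (MediaEntity *entity : data->pipelineEntities())
                  owners_[entity] = data;

          return true;
  }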
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
Move the opening of video devices to match() time, the same way subdevs
are opened, to make the handling of V4L2 video devices and subdevices
more consistent.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
Merge the SimplePipelineHandler videos_ and subdevs_ maps, which
respectively store V4L2 video devices and subdevices associated with
entities, into a single entities_ map that contains an EntityData
structure. This gathers all data about entities in a single place,
allowing for easy extension of entity data in the future.
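The per-entity storage then looks roughly like this (simplified
sketch):
  struct EntityData {
          std::unique_ptr<V4L2VideoDevice> video;
          std::unique_ptr<V4L2Subdevice> subdev;
  };

  std::map<const MediaEntity *, EntityData> entities_;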
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
Store the entity corresponding to the video node at the end of the
pipeline in the SimpleCameraData::entities_ list. This requires special
handling of the video node in the loops that iterate over all entities,
but will be useful to implement mutually exclusive access to entities
for concurrent camera usage.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
The video device is currently opened in the SimpleCameraData
constructor. This requires opening the video devices on-demand the first
time they're accessed, which gets in the way of refactoring of
per-entity data storage in the pipeline handler.
Move opening of the video device to the SimpleCameraData::init()
function. The on-demand open mechanism isn't touched yet, it will be
refactored later.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
Record the sink and source pads through which an entity is traversed in
the list of entities stored in the camera data. This prepares for
implementing mutually exclusive access to entities between cameras.
The debug message that displays detected pipelines now prints pads to
describe the pipeline more precisely:
[0:00:35.901275750] [260] DEBUG SimplePipeline simple.cpp:404 Found pipeline: [imx290 2-001a|0] -> [0|csis-32e40000.csi|1] -> [0|mxc_isi.0|1] -> [0|mxc_isi.0.capture]
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
Add a new field to the MediaEntity class to identify the type of
interface it exposes to userspace. The MediaEntity constructor is
changed to take a media_v2_interface pointer instead of just the device
node major and minor to have access to the interface type.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: Martin Kepplinger <martin.kepplinger@puri.sm>
|
|
"weighted", derived from the verb "to weight", comes from Middle English
weight, weiȝte, weght, wight, from Old English wiht, ġewiht, from
Proto-Germanic *wihtiz, from Proto-Indo-European *weǵʰ-. In none of
those does the t come before the h.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
MappedFrameBuffer::maps() returns planes_. Rename the function to
planes() to match.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The bufferLength_ member variable is checked to have a positive value
before being used, to catch usage before the variable is set. The
variable is initialized to zero at construction time, which renders the
checks useless.
Fix this by initializing the variable to -1 at construction time.
Fixes: c5e2ed7806be ("android: camera_buffer: Map buffer in the first plane() call")
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
If the camera device does not support the MANUAL_SENSOR capability,
there is no point in generating a request template for the manual
capture use case.
This change fixes the CTS tests
android.hardware.camera2.cts.CameraDeviceTest#testCameraDeviceManualTemplate
android.hardware.camera2.cts.NativeCameraDeviceTest#testCameraDeviceCreateCaptureRequest
for devices that do not support the MANUAL_SENSOR capability.
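The guard is conceptually (sketch, helper name hypothetical):
  /* Don't build a MANUAL request template without the capability. */
  if (!hasCapability(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_MANUAL_SENSOR))
          return nullptr;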
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Do not compare the ratios at high precision, as it might lead to an
absurd selection of sensor size for a relatively low requested
resolution.
For example:
The imx258 driver supports the following sensor resolutions:
- 4208x3118 = 1.349583066
- 2104x1560 = 1.348717949
- 1048x780 = 1.343589744
The aspect ratios only differ from each other by a small fraction. It
does not make sense to select 4208x3118 for a requested size of, say,
640x480 or 1280x720, which is what currently happens.
($) cam -c1 -swidth=640,height=480,role=raw
- CIO2 configuration: 4208x3118-SGRBG10_IPU3 [*]
In order to address this, only compare the ratios with single
precision, to make a better decision on the sensor resolution
selection.
($) cam -c1 -srole=raw,width=640,height=480
- CIO2 configuration: 1048x780-SGRBG10_IPU3 [*]
[*] Please revert 0536a9aa7189 ("ipu3: Disallow raw only camera configuration")
if the configuration is reported as invalid.
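Conceptually, the comparison becomes (sketch, variable names
illustrative):
  /* Single precision is enough to rank candidate sensor sizes; higher
   * precision lets negligible aspect ratio differences drive the choice
   * towards needlessly large resolutions. */
  float ratio = static_cast<float>(size.width) / size.height;
  float ratioDiff = std::fabs(ratio - desiredRatio);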
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Tested-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The current implementation of getSensorFormat() prioritizes sensor
sizes that match the FOV ratio of the output size.
Modify the frame size selection procedure to prioritize resolutions
with the same FOV as the sensor's native one, to avoid cropping in the
sensor pixel array.
For example:
- on a Soraka device equipped with an ov13858 back sensor, with a
  native resolution of 4224x3136 (4:3), when requested to select the
  sensor output size to produce 1080p (16:9), a frame size of 2112x1188
  (16:9) is selected, causing the ImgU configuration procedure to fail.
  If a resolution with the same FOV as the sensor's native size, such
  as 2112x1568 (4:3), is selected, the pipeline works correctly.
Suggested-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Tested-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This prepares a base to introduce custom selection of the sensor format
based on platform (Soraka or Nautilus) constraints. The changes to the
selection policy will be introduced in a subsequent patch.
The function is copied from CameraSensor::getFormat() into the IPU3
pipeline handler code, to be later changed on top.
The copy is not verbatim and has a minor change as follows:
CameraSensor::getFormat() has access to a V4L2Subdevice::Formats map
internally and uses it directly to iterate over the supported camera
sensor frame sizes. The copy is adapted to use CameraSensor::sizes(mbusCode)
instead, to enumerate the supported frame sizes as per the
mbusCodesToPixelFormat map.
No functional changes in this patch.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Tested-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
In CameraSensor, the mbusCodes() and sizes() accessor functions
retrieve all the supported media bus codes and all the supported sizes
respectively. However, this is quite limiting, since the caller isn't
in a position to know which range of sizes is supported for a
particular mbusCode.
The caller is most likely interested in knowing the sizes supported for
a particular media bus code. This patch transforms the existing
CameraSensor::sizes() into CameraSensor::sizes(mbusCode) to achieve
that goal.
The patch also transforms the existing CIO2Device::sizes() in the IPU3
pipeline handler into CIO2Device::sizes(PixelFormat) on a similar
principle. The function is then plumbed to CameraSensor::sizes(mbusCode)
to enumerate the per-format sizes as required in
PipelineHandlerIPU3::generateConfiguration().
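Illustrative usage of the new accessor:
  /* Enumerate the supported sizes per media bus code instead of
   * globally. */
  for (unsigned int code : sensor->mbusCodes()) {
          for (const Size &size : sensor->sizes(code)) {
                  /* 'size' is valid for this specific media bus code. */
          }
  }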
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Tested-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
An offset variable is introduced to FrameBuffer::Plane. In order to
detect usage of a plane whose offset has not been set, add an assertion
to FrameBuffer::planes(). It should be removed in the future.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
CameraDevice::CreateFrameBuffer() fills each FrameBuffer::Plane::length
with the length of the whole buffer. It should rather be the length of
the plane. This also changes CreateFrameBuffer() to fill the offset of
each FrameBuffer::Plane.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
V4L2VideoDevice::createBuffer() creates the same number of
FrameBuffer::Planes as V4L2 format planes. Therefore, if the V4L2
format is a single-planar format, the number of created
FrameBuffer::Planes is 1. It should rather create the same number of
FrameBuffer::Planes as the color format has planes.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The plane length is the length of the plane itself. The buffer length
to be allocated for a plane is the sum of the offset and the length of
the FrameBuffer::Plane.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
FrameBuffer::Plane now carries offset information. Use the offset when
mapping the FrameBuffer in MainWindow.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
IPABuffer is represented by a FrameBuffer, and FrameBuffer::Plane now
has an offset. Use the offset variable to map the IPABuffer. The mapped
IPABuffer is represented and managed as a MappedFrameBuffer.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Fix the mapping of FrameBuffer in FrameSink by taking the plane offset
into account.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
MappedBuffer::maps() returns std::vector<MappedBuffer::Plane>. Plane
has an address, but the address points to the beginning of the buffer
containing the plane.
Make the Plane point to the beginning of the plane, so that
MappedBuffer::maps()[i].data() returns the address of the i-th plane.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
This adds an offset field to FrameBuffer::Plane. It enables representing frame
buffers that store planes in the same dmabuf at different offsets, as
for instance required by the V4L2 NV12 pixel format.
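For example, an NV12 frame stored in a single dmabuf can now be
described as follows (values illustrative, fd setup omitted):
  FrameBuffer::Plane planes[2];
  planes[0].offset = 0;                    /* Y plane */
  planes[0].length = stride * height;
  planes[1].offset = stride * height;      /* interleaved UV plane */
  planes[1].length = stride * height / 2;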
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The documentation suggests using CameraConfiguration::operator[] to
access the StreamConfiguration instances it contains, but as
CameraConfiguration instances are generated by the Camera class and
returned wrapped in a unique_ptr<>, the usage of operator[] requires an
awkward syntax such as (*config)[i].
Suggest using the CameraConfiguration::at() function instead to access
the StreamConfiguration entries.
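Illustrative usage:
  std::unique_ptr<CameraConfiguration> config =
          camera->generateConfiguration({ StreamRole::Viewfinder });

  /* config->at(0) reads better than (*config)[0]. */
  StreamConfiguration &cfg = config->at(0);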
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
This adds getter functions for the stride, offset and size to the
CameraBuffer interface.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The CameraBuffer implementation maps a given buffer_handle_t in the
constructor. Mapping is redundant when only plane info such as the
stride and offset is needed. The mapping should instead be executed
later, in the first plane() call.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
buffer_handle_t doesn't provide sufficient info to map a buffer
properly. cros::CameraBufferManager enables handling the buffer on
ChromeOS, but no equivalent is provided for other platforms.
Therefore, assume that the planes are stored consecutively in the same
buffer, and modify the mapping in generic_camera_buffer accordingly.
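The assumption translates roughly to (sketch, helper names
hypothetical):
  /* Planes are laid out back-to-back in one buffer, so each offset is
   * the sum of the preceding plane sizes. */
  size_t offset = 0;
  for (unsigned int i = 0; i < numPlanes; ++i) {
          planeInfo[i].offset = offset;
          offset += planeSize(i);
  }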
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Pipeline handlers set a default value for StreamConfiguration::size.
The original fixation code was attempting to use it, but as it was
truncating the caps to their first structure, it would never actually
find the best match.
In this patch, instead of truncating, we weight the various matches
using the product of the width and height deltas. We also handle fixed
sizes and ranges separately, and prefer fixed sizes over ranges, as
ranges are not reliable.
This patch also removes the related TODO, as it seems that the
libcamera core won't go further than providing this default value and
won't be sorting the format and size lists.
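The weighting is conceptually (sketch, variable names illustrative):
  /* A smaller product of width and height deltas means a closer match
   * to the default StreamConfiguration::size. */
  int wDelta = std::abs(static_cast<int>(width) - static_cast<int>(size.width));
  int hDelta = std::abs(static_cast<int>(height) - static_cast<int>(size.height));
  int delta = wDelta * hDelta;

  if (delta < bestDelta) {
          bestDelta = delta;
          bestIndex = index;
  }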
Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
It's not allowed in GStreamer to push events while holding the object
lock. Reduce the scope in which we hold the object lock. In fact, we
don't need to protect against gst_task_resume() concurrency when we
stop the task, as resume only does something if the task is paused.
This fixes a deadlock when running multiple instances of libcamerasrc
and closing one of the streaming windows.
Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
It's not allowed to have multiple instances of CameraManager. This
requirement is not easy to meet in GStreamer, where the device monitor
and the camerasrc, or two camerasrc instances, don't usually have any
interaction with each other. Fix this by implementing a minimalist
singleton around the CameraManager constructor and the start()/stop()
operations.
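A minimal sketch of such a singleton (function name and locking details
are assumptions, not the exact helper):
  std::shared_ptr<CameraManager> acquireCameraManager()
  {
          static std::weak_ptr<CameraManager> weak;
          static std::mutex mutex;

          std::lock_guard<std::mutex> locker(mutex);

          std::shared_ptr<CameraManager> cm = weak.lock();
          if (cm)
                  return cm;

          /* Start on first use, stop when the last user drops its
           * reference. Error handling omitted for brevity. */
          cm = std::shared_ptr<CameraManager>(new CameraManager(),
                                              [](CameraManager *ptr) {
                                                      ptr->stop();
                                                      delete ptr;
                                              });
          cm->start();

          weak = cm;
          return cm;
  }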
Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
A deadlock occurs when a buffer holds the last reference on the
allocator. In gst_libcamera_allocator_release() we must drop the object
lock before dropping the last ref of that object, since the destructor
will lock it again, causing a deadlock.
This was noticed while switching cameras or resolutions in Cheese.
Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The IPASessionConfiguration now has the grid configuration stored. Use
it in the prepare() and process() calls in AWB and pass it as a
reference to the private functions when needed.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The IPASessionConfiguration now has the grid configuration stored. Use
it in the process() call in AGC and pass it as a reference to the
private functions when needed.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The MappedBuffer structure is a custom container that binds a data
pointer with a length. This is exactly what Span is. Use it instead.
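Illustrative: a mapped plane is then just a span of bytes.
  Span<uint8_t> plane = mapped.maps()[0];
  uint8_t *data = plane.data();
  size_t length = plane.size();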
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that the interface is properly used by the AGC class, move it into
ipa::ipu3::algorithms and let the loops do the calls.
As we need to exchange exposure_ and gain_ by passing them through the
FrameContext, use the calculated values in the setControls() function
to ease the reading.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that the interface is properly used by the AWB class, move it into
ipa::ipu3::algorithms and let the loops do the calls.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In preparation for using the AGC through the new algorithm interfaces,
convert the existing code to use the new function types.
Now that the process call is rewritten, re-enable the compiler flag to
warn when a function declaration hides virtual functions from a base class
(-Woverloaded-virtual).
We never use converged_, so remove its declaration. The controls may
not need to be updated at each call, but that should be decided on the
context side rather than by a specific call, for instance by using a
lock status in the Agc structure.
As the params_ local variable is not useful anymore, remove it too.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When the stats are received, pass them with the context to the existing
AWB algorithm. IPAFrameContext now has a new structure to store the
gains calculated by the AWB algorithm.
When an EventFillParams event is received, call prepare() and set the new
gains accordingly in the params structure.
There is no longer a need for the IPU3Awb::initialise() function, as
the params are always set in prepare().
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|