|
The original use of RPi::Stream::reset() was to clear the external flag state
and free/clear out the framebuffers for the stream. However, the latter is now
done through PipelineHandlerRPi::configure(). Rework
PipelineHandlerRPi::configure() to call RPi::Stream::setExternal() instead of
RPi::Stream::reset() to achieve the same thing.
Repurpose RPi::Stream::reset() to instead reset the state of the buffer handling
logic, where all internally allocated buffers are put back into the queue of
available buffers. As such, rename the function to RPi::Stream::resetBuffers().
resetBuffers() is now called from PipelineHandlerRPi::start(), allowing the
pipeline handler to correctly deal with start()/stop()/start() sequences and
reuse the buffers allocated on the first start().
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Currently, all framebuffer allocations get freed and cleared on a stop in
PipelineHandlerRPi::stopDevice(). If PipelineHandlerRPi::start() is then called
without an intermediate PipelineHandlerRPi::configure(), it will re-allocate and
prepare all the buffers again, which is unnecessary.
Fix this by not freeing the buffers in PipelineHandlerRPi::stopDevice(), but
instead doing it in PipelineHandlerRPi::configure(), as the buffers might have
to be resized.
Add a flag to indicate that buffer allocations need to be done on the next
call to PipelineHandlerRPi::start().
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The function used to clear the camera buffers does not belong in the
PipelineHandlerRPi class, as it only accesses members of RPiCameraData,
so move it.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The V4L2DeviceFormat structure for the ISP Output 1 node was a copy of the one
used for the ISP Output 0 node, but with the size changed. However, the plane
size and stride values were not updated, so the buffer might end up over-sized
for the requested resolution.
Fix this by copying only the relevant fields from the ISP Output 0
V4L2DeviceFormat structure, and letting the device driver size the planes as needed.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
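A self-contained sketch of the idea, using a simplified stand-in for
V4L2DeviceFormat (the real structure also carries plane and colour-space
information), rather than the actual pipeline handler code:

    #include <array>
    #include <cstdint>

    /* Simplified stand-in for libcamera's V4L2DeviceFormat. */
    struct Format {
        uint32_t fourcc = 0;
        uint32_t width = 0;
        uint32_t height = 0;
        std::array<uint32_t, 3> planeSize = {};
        std::array<uint32_t, 3> stride = {};
    };

    /* Derive the Output 1 format from Output 0: copy only what must match,
     * and let the driver compute plane sizes and strides for the new size. */
    Format deriveOutput1Format(const Format &output0, uint32_t width, uint32_t height)
    {
        Format output1;
        output1.fourcc = output0.fourcc;
        output1.width = width;
        output1.height = height;
        /* planeSize and stride are deliberately left zeroed. */
        return output1;
    }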
|
|
The start(unsigned int msec) overload is error-prone, as the argument
unit can easily be mistaken in callers. Drop it and update all callers
to use the start(std::chrono::milliseconds) overload instead.
The callers now need to use std::chrono_literals. The using statement
could be added to timer.h for convenience, but "using" is discouraged in
header files to avoid namespace pollution. Update the callers instead,
and while at it, sort the "using" statements alphabetically in tests.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
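For illustration, a minimal sketch of the caller-side change, assuming
libcamera's Timer class under the base/timer.h header and an arbitrary
2500 ms interval:

    #include <chrono>
    #include <libcamera/base/timer.h>

    using namespace std::chrono_literals;

    void armTimer(libcamera::Timer &timer)
    {
        /* Before: timer.start(2500); -- milliseconds or microseconds? */
        /* After: the unit is explicit at the call site. */
        timer.start(2500ms);
    }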
|
|
The RkISP1Path::generateConfiguration() function limits the maximum
resolution to the sensor resolution, to avoid upscaling. It however
doesn't take the sensor aspect ratio into account, which leads to a
maximum (and default) resolution of 1920x1920 when using the self path
with a sensor that has a higher resolution.
Fix it by constraining the minimum and maximum resolutions to match the
sensor's aspect ratio.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
One if...endif block is incorrectly indented. Fix it.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The IPARkISP1Interface currently uses event-type based structures in
order to communicate with the pipeline-handler (and vice-versa).
Replace the event based structures with dedicated functions associated
to each operation.
The operations translate to dedicated functions as follows:
ActionV4L2Set => setSensorControls
ActionParamFilled => paramsBufferReady
ActionMetadata => metadataReady
EventSignalStatBuffer => processStatsBuffer()
EventQueueRequest => queueRequest()
The name IPARkISP1::metadataReady() would now conflict with the
metadataReady Signal introduced in this patch as part of the
interface change. Hence, rename IPARkISP1::metadataReady() to
IPARkISP1::prepareReady() to prevent the conflict.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The private members for the exposure and gain limits can be dropped
from IPARkISP1, since they are not used class-wide and can easily be
replaced by local counterparts.
In case they are required to be widely available, these should then
sit in IPASessionConfiguration.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
s/bytesused/bytes used/
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a comment to clarify why we set the v4l2_buffer timestamp when
queuing the buffer (VIDIOC_QBUF). Timestamps must be supplied on the
output streams of memory-to-memory devices; the driver then copies
them to the corresponding capture stream buffers when
V4L2_BUF_FLAG_TIMESTAMP_COPY is set.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
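A hedged sketch of the pattern in plain V4L2 terms, not the V4L2VideoDevice
code itself:

    #include <linux/videodev2.h>
    #include <sys/ioctl.h>
    #include <time.h>

    void queueOutputBuffer(int fd, unsigned int index)
    {
        struct v4l2_buffer buf = {};
        buf.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = index;

        /*
         * Memory-to-memory drivers using V4L2_BUF_FLAG_TIMESTAMP_COPY copy
         * the OUTPUT buffer timestamp to the matching CAPTURE buffer, so a
         * valid timestamp must be provided here.
         */
        struct timespec ts;
        clock_gettime(CLOCK_MONOTONIC, &ts);
        buf.timestamp.tv_sec = ts.tv_sec;
        buf.timestamp.tv_usec = ts.tv_nsec / 1000;

        ioctl(fd, VIDIOC_QBUF, &buf);
    }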
|
|
The struct v4l2_buffer documentation [1] clearly states
that the sequence field is set by the driver. libcamera therefore does
not need to set this field when queuing buffers.
[1]: https://www.kernel.org/doc/html/latest/userspace-api/media/v4l/buffer.html#struct-v4l2-buffer
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add an entry to the sensor properties for the ov5640. Only the first
test pattern is included as the others that are exposed by the kernel
don't correspond to any that are defined in the libcamera control.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The Sony IMX296 exists in two versions, colour (Bayer) and monochrome.
The tuning file corresponds to the monochrome version.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
|
|
The Sony IMX296 sensor has an exponential gain model, and adds a fixed
14.26µs offset to the exposure time expressed in line units.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Some sensors (namely the Sony IMX296, whose support will be added
shortly) require different conversion formulas between exposure time and
exposure lines. Make the Exposure() and ExposureLines() functions
virtual to allow this.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
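A self-contained sketch of the idea with simplified names and types; the
real camera helper API in the Raspberry Pi IPA differs in detail, and the
line length here is an arbitrary example value:

    #include <algorithm>
    #include <cstdint>

    /* Simplified stand-in for the camera helper base class. */
    class CamHelper
    {
    public:
        virtual ~CamHelper() = default;

        /* Exposure time in microseconds <-> exposure expressed in lines. */
        virtual double exposure(uint32_t lines) const
        {
            return lines * lineLengthUs_;
        }

        virtual uint32_t exposureLines(double us) const
        {
            return static_cast<uint32_t>(us / lineLengthUs_);
        }

    protected:
        double lineLengthUs_ = 29.6; /* arbitrary example line length */
    };

    /* Sensor-specific helper adding a fixed offset to the exposure time. */
    class CamHelperImx296 : public CamHelper
    {
    public:
        double exposure(uint32_t lines) const override
        {
            return lines * lineLengthUs_ + offsetUs_;
        }

        uint32_t exposureLines(double us) const override
        {
            return static_cast<uint32_t>(std::max(us - offsetUs_, 0.0) / lineLengthUs_);
        }

    private:
        static constexpr double offsetUs_ = 14.26;
    };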
|
|
The Sony IMX296 is a global shutter sensor with a 1456x1088 pixel array
size, with a recommended resolution after CFA interpolation of
1440x1080.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AwbAuto mode is defined in all the JSON tuning files as "auto",
not "normal". The effect of this was that you couldn't switch back to
"auto" mode once you had switched away.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The includes in the v4l2_camera_proxy do not match the code style
and trigger clang-format changes when running checkstyle.
Update and fix the include order accordingly.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The configure() function has a local configuration variable referencing
context.configuration for the purpose of shortening lines. Use it
instead of context.configuration in the remaining locations, and
constify it while at it as the configuration isn't meant to be modified.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The frame count is used to skip the gain and exposure filtering when
starting. It thus needs to be reset when configuring the algorithm, to
avoid slower convergence when stopping and restarting.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The datasheet for the OV2740 gives 0x80 as 1x gain, so real gain
is GainCode / 128.
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
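A quick sketch of the conversion implied by the datasheet figure
(0x80 == 1x), with hypothetical helper names:

    #include <cstdint>

    constexpr double gainFromCode(uint32_t code)
    {
        return code / 128.0;            /* 0x80 -> 1.0x, 0x100 -> 2.0x */
    }

    constexpr uint32_t codeFromGain(double gain)
    {
        return static_cast<uint32_t>(gain * 128.0);
    }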
|
|
Add camera sensor property entries for the OmniVision 2740 camera
sensor.
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Pick up the focus value from the AF algorithm and send lens controls
along in the frame context.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When configuring the pipeline we want to share the controls of any
VCM device alongside the sensor's controls too; pass them through the
lensControls member of configInfo.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that we have added lens controls, rename the existing
member of the class to clarify that it relates to the sensor's
controls.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add further members to the IPU3 IPA interface that will hold the lens
controls passed in through configInfo.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a function to the CameraLens class to fetch the V4L2 controls
for its V4L2 subdev.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a function to check for and initialise any VCMs linked to the
CameraSensor's entity by ancillary links. This should initialise
the lens_ member with the linked entity. Call the new function
during CameraSensor::init().
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The populateLinks() function can't currently handle ancillary links
which causes an error to be thrown in process() when the MediaObject
cannot be cast to a MediaPad.
Add explicit handling for the different link types, creating either
pad-to-pad links or storing the pointer to the ancillary device's
MediaEntity in the ancillaryEntities_ member of the primary entity.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
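A hedged sketch of the dispatch, assuming the kernel's MEDIA_LNK_FL_* link
type flags (ancillary links need a recent linux/media.h); the real
populateLinks() logic is more involved, and the handler name here is
hypothetical:

    #include <linux/media.h>

    /* Per-link dispatch; entity and pad bookkeeping elided. */
    void handleLink(const struct media_v2_link &link)
    {
        switch (link.flags & MEDIA_LNK_FL_LINK_TYPE) {
        case MEDIA_LNK_FL_DATA_LINK:
            /* Cast source/sink objects to MediaPad and create a pad-to-pad link. */
            break;
        case MEDIA_LNK_FL_ANCILLARY_LINK:
            /* Store the ancillary MediaEntity pointer on the primary entity. */
            break;
        case MEDIA_LNK_FL_INTERFACE_LINK:
        default:
            /* Interface links are not modelled as pad-to-pad links. */
            break;
        }
    }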
|
|
With kernel support for ancillary links, we can describe the
relationship between two devices represented individually as instances
of MediaEntity. As the only property of that relationship is its
existence, describe those relationships in libcamera simply as a
vector of MediaEntity pointers to the ancillary devices.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that the VCM of the Surface Go 2 (dw9719) can be successfully driven, this
AF module can be used to control the VCM and determine the focus value
based on the IPU3 AF state.
Based on the values from the IPU3 AF buffer, the variance of each focus
step is determined, and a greedy approach is used to find the maximum
variance of the AF state and an appropriate focus value.
The grid configuration is held in the IPA context. Also, the grid
parameter AF_MIN_BLOCK_WIDTH is set to 4 (the default is 3), since with the
default value x_start (x_start > 640) would end up at an incorrect
location in the image (the rightmost part of the sensor).
Signed-off-by: Kate Hsuan <hpa@redhat.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
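A minimal sketch of the greedy search idea (a hypothetical helper, not the
IPU3 Af algorithm code): step the lens while the measured variance keeps
increasing, and stop at the first drop.

    #include <cstdint>
    #include <functional>

    uint32_t greedyFocusSearch(uint32_t pos, uint32_t step, uint32_t maxPos,
                               const std::function<double(uint32_t)> &variance)
    {
        double best = variance(pos);

        while (pos + step <= maxPos) {
            double next = variance(pos + step);
            if (next <= best)
                break;      /* past the peak: keep the best position found */
            best = next;
            pos += step;
        }

        return pos;
    }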
|
|
The gain values are coded as u3.13 fixed-point values, i.e. they cannot
exceed 8. Clamp the values in order to avoid any out-of-range value
which could make the IPU3 behave unexpectedly.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
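For reference, a small sketch of the encoding and clamp (u3.13 means 3
integer bits and 13 fractional bits, so values must stay below 8); the
helper name is hypothetical:

    #include <algorithm>
    #include <cstdint>

    uint16_t gainToU3_13(double gain)
    {
        /* Largest representable value is 65535 / 8192, just under 8.0. */
        gain = std::clamp(gain, 0.0, 65535.0 / 8192.0);
        return static_cast<uint16_t>(gain * 8192.0);
    }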
|
|
Instead of having a local cached value for line duration, store it in
the IPASessionConfiguration::sensor structure.
While at it, set the default analogue gain and shutter speed to
controlled fixed values.
The latter is set to 10ms, as it will in most cases be close to the
value needed, making the AGC converge faster.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The lines that store the effective sensor values during the EventStatReady
event are too long. Fix it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
When the current exposure value is calculated, it is cached and used by
filterExposure(). Use private filteredExposure_ and pass currentExposure
as a parameter.
In order to limit the use of filteredExposure_, return the value from
filterExposure().
While at it, remove a stale comment.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
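A hedged sketch of the resulting shape, using a simple one-pole smoother
with a hypothetical speed constant rather than the actual AGC filtering
logic:

    #include <chrono>

    using Duration = std::chrono::duration<double>;

    /* Return the new filtered exposure instead of writing to a member. */
    Duration filterExposure(Duration filteredExposure, Duration currentExposure)
    {
        constexpr double speed = 0.2; /* hypothetical smoothing factor */
        return filteredExposure + speed * (currentExposure - filteredExposure);
    }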
|
|
The CameraLens class implements a function named "setFocusPostion".
There is a typo here, fix it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We must calculate the initial scaler crop when the camera is
configured, otherwise the metadata will report this rectangle as being
all zeroes.
Because the calculation is identical to that performed later in handling
the scaler crop control, we factor it into a small helper function,
RPiCameraData::scaleIspCrop.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This commit provides a documentation sketch of Camera3RequestDescriptor,
which aids in tracking each capture request placed by the Android
framework to the libcamera HAL.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Specifically document:
- CameraDevice::sendCaptureResults()
- CameraDevice::completeDescriptor()
- CameraDevice::streamProcessingComplete()
- CameraStream::PostProcessorWorker class
- Camera3RequestDescriptor::StreamBuffer structure
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use structured bindings in range-based for loops for better readability.
Signed-off-by: Nejc Galof <galof.nejc@gmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
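An example of the pattern (generic, not the exact code touched here):

    #include <iostream>
    #include <map>
    #include <string>

    void printControls(const std::map<std::string, int> &controls)
    {
        /* Structured bindings name the key/value pair directly. */
        for (const auto &[name, value] : controls)
            std::cout << name << " = " << value << std::endl;
    }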
|
|
The V4L2 Compatibility layer is returning timestamps for buffers which
are incorrectly calculated from the frame metadata.
The sec component of the timestamp is correct, but the usec component is
wrong, leaving frame captures reporting non-monotonically increasing
timestamps and incorrect frame rate calculations.
Fix the usec calculation reported by the V4L2 adaptation layer.
Bug: https://bugs.libcamera.org/show_bug.cgi?id=118
Fixes: 0ce8f2390b52 ("v4l2: v4l2_compat: Add V4L2 compatibility layer")
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
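A sketch of the intended conversion, splitting a nanosecond timestamp into
whole seconds plus the remaining microseconds (the helper name is
hypothetical):

    #include <cstdint>
    #include <sys/time.h>

    struct timeval timestampToTimeval(uint64_t timestampNs)
    {
        struct timeval tv;
        tv.tv_sec = timestampNs / 1000000000ULL;
        /* Only the sub-second remainder goes into tv_usec. */
        tv.tv_usec = (timestampNs % 1000000000ULL) / 1000ULL;
        return tv;
    }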
|
|
With the controller algorithms running at 60fps, there are some dropped frames
when running at very high framerates. Reducing this to 30fps eliminates all
these drops without any noticeable change to the image quality.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Simplify the image and embedded buffer matching logic by removing the assumption
that we require a buffer match between the two streams. Instead, if an image
buffer does not match with an embedded data buffer, simply use the ControlList
provided by DelayedControls for the sensor parameters.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
If Stream::returnBuffer() gets passed an internally allocated buffer, it now
simply re-queues it back to the device. With this change, the pipeline handler
code can be simplified slightly as it does not need multiple code paths for
internally allocated and non-internally allocated buffers.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We now handle disabling ("pausing") AWB in the same way as
AEC/AGC. Instead of letting the pause flag be set so that the code
never runs at all, we fix the manual settings to the current
values (but the algorithm continues to be called).
The algorithm does not restart any calculations in this state, but
continues to add AWB metadata to every frame. Therefore certain other
algorithms that want to know it (CCM and ALSC, for example) can still
find it.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When searching for a suitable pipeline, we mistakenly only break from
the inner loop. This results in the last suitable output being selected.
Pick the first one instead.
Fixes: 1de0f90dd432 ("cam: kms_sink: Print display pipeline configuration")
Signed-off-by: Eric Curtin <ecurtin@redhat.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
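One idiomatic way to avoid the inner-break pitfall is to hoist the nested
loops into a helper and return from it, as in this generic sketch (the
actual fix may use a different mechanism):

    #include <optional>
    #include <vector>

    struct Candidate {
        bool suitable;
    };

    std::optional<Candidate>
    findFirst(const std::vector<std::vector<Candidate>> &groups)
    {
        for (const auto &group : groups) {
            for (const auto &candidate : group) {
                if (candidate.suitable)
                    return candidate;   /* exits both loops at once */
            }
        }

        return std::nullopt;
    }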
|
|
Objects are not expected to be connected to the same signal more than
once. Doing so likely indicates a bug in the code, and can be
highlighted in debug builds with an assert that performs a lookup on the
signals_ list.
While it is possible to allow the implementation to let objects connect
to a specific signal multiple times, there are no expected use cases for
this in libcamera and this behaviour is restricted to favour defensive
programming by raising an error when this occurs.
Remove the support in the test framework which uses multiple Signal
connections on the same object, and update the test to use a second
Signal.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
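A minimal sketch of the defensive check with simplified types; the real
Object/SignalBase bookkeeping in libcamera differs:

    #include <algorithm>
    #include <cassert>
    #include <vector>

    class SignalBase;

    class Object
    {
    public:
        void connect(SignalBase *signal)
        {
            /* Debug builds reject a second connection to the same signal. */
            assert(std::find(signals_.begin(), signals_.end(), signal) ==
                   signals_.end());
            signals_.push_back(signal);
        }

    private:
        std::vector<SignalBase *> signals_;
    };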
|
|
Provide a call allowing requests to be registered and associated with
the pipeline handler after being constructed by the camera.
This provides an opportunity for the PipelineHandler to connect any
signals it may be interested in receiving for the request such as
getting notifications when the request is ready for processing when
using a fence.
While here, update the existing usage of the d pointer in
Camera::createRequest() to match the style of other functions.
Bug: https://github.com/raspberrypi/libcamera-apps/issues/217
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Support easier usage of the v4l2 compatibility layer with a script that
handles the LD_PRELOAD for applications.
The wrapper can be prefixed to launch any application with the preload
set:
$ libcamera-v4l2 v4l2-ctl --list-devices
\_SB_.PCI0.GP13.XHC0.RHUB.PRT4- (libcamera:0):
/dev/video0
platform/vimc.0 Sensor B (libcamera:1):
/dev/video2
/dev/video3
/dev/video4
Specifying '-d' once before the command to run will enable V4L2Compat
layer debug output from libcamera.
Specifying '-d' twice will enable all debug levels from all libcamera
components and provide a very detailed log for analysis.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|