Age | Commit message | Author
|
One if...endif block is incorrectly indented. Fix it.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The IPARkISP1Interface currently uses event-type based structures to
communicate with the pipeline handler (and vice versa). Replace the
event-based structures with dedicated functions associated with each
operation.
The operations translate to dedicated functions as follows:
ActionV4L2Set => setSensorControls()
ActionParamFilled => paramsBufferReady()
ActionMetadata => metadataReady()
EventSignalStatBuffer => processStatsBuffer()
EventQueueRequest => queueRequest()
The name IPARkISP1::metadataReady() would now conflict with the
metadataReady Signal being introduced in this patch as part of the
interface change. Hence, rename IPARkISP1::metadataReady() to
IPARkISP1::prepareReady() to prevent the conflict.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Paul Elder <paul.elder@ideasonboard.com>
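For illustration, a minimal C++ sketch of the shape of this change, using the
function names from the mapping above; the event type, structures and
signatures are assumptions of the example and not the actual libcamera
declarations:

    #include <cstdint>

    /* Before: a single entry point dispatching on an event type (sketch only). */
    enum class EventType { SignalStatBuffer, QueueRequest };
    struct Event {
        EventType type;
        uint32_t frame;
        uint32_t bufferId;
    };

    struct IPAEventStyle {
        void processEvent(const Event &event)
        {
            switch (event.type) {
            case EventType::SignalStatBuffer:
                /* handle statistics buffer */
                break;
            case EventType::QueueRequest:
                /* handle queued request */
                break;
            }
        }
    };

    /* After: one dedicated function per operation, as listed above. */
    struct IPAFunctionStyle {
        void processStatsBuffer(uint32_t frame, uint32_t bufferId);
        void queueRequest(uint32_t frame);
        /*
         * Results flow back to the pipeline handler through dedicated
         * signals such as setSensorControls, paramsBufferReady and
         * metadataReady.
         */
    };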
|
|
The private members storing the exposure and gain limits can be dropped
from IPARkISP1, since they are not used class-wide and can easily be
replaced by local counterparts.
Should they ever need to be widely available, they should live in
IPASessionConfiguration instead.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
s/bytesused/bytes used/
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a comment to clarify why we set the v4l2_buffer timestamp while
queuing the buffer (VIDIOC_QBUF). Timestamps must be supplied for the
output streams of memory-to-memory devices; the driver then copies them
to the capture stream buffers, as indicated by
V4L2_BUF_FLAG_TIMESTAMP_COPY (set by the driver).
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
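As an illustration of the requirement described above, a minimal sketch of
queuing an output buffer on a memory-to-memory device with a timestamp filled
in; plane handling, error checking, and the choice of buffer memory type are
simplifications of this example:

    #include <linux/videodev2.h>
    #include <string.h>
    #include <sys/ioctl.h>
    #include <time.h>

    int queueOutputBuffer(int fd, unsigned int index)
    {
        struct v4l2_buffer buf;

        memset(&buf, 0, sizeof(buf));
        buf.type = V4L2_BUF_TYPE_VIDEO_OUTPUT;
        buf.memory = V4L2_MEMORY_MMAP;
        buf.index = index;

        /*
         * Supply a timestamp on the output buffer: with
         * V4L2_BUF_FLAG_TIMESTAMP_COPY the driver copies it to the
         * corresponding capture buffer.
         */
        struct timespec now;
        clock_gettime(CLOCK_MONOTONIC, &now);
        buf.timestamp.tv_sec = now.tv_sec;
        buf.timestamp.tv_usec = now.tv_nsec / 1000;

        /* The sequence field is left at 0, it is set by the driver. */
        return ioctl(fd, VIDIOC_QBUF, &buf);
    }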
|
|
The struct v4l2_buffer documentation [1] clearly states that the
sequence field is set by the driver. libcamera does not need to set this
field when queuing buffers.
[1]: https://www.kernel.org/doc/html/latest/userspace-api/media/v4l/buffer.html#struct-v4l2-buffer
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add an entry to the sensor properties for the ov5640. Only the first
test pattern is included as the others that are exposed by the kernel
don't correspond to any that are defined in the libcamera control.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The Sony IMX296 exists in two versions, colour (Bayer) and monochrome.
The tuning file corresponds to the monochrome version.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
|
|
The Sony IMX296 sensor has an exponential gain model, and adds a fixed
14.26µs offset to the exposure time expressed in line units.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
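A hedged sketch of the gain conversion implied by an exponential gain model;
the 0.1 dB step size for the gain code is an assumption of this example, only
the 14.26µs exposure offset comes from the commit message (see the exposure
sketch after the next entry):

    #include <cmath>
    #include <cstdint>

    /* Exponential model: gain code in 0.1 dB steps (assumed for illustration). */
    double gainFromCode(uint32_t gainCode)
    {
        return std::pow(10.0, gainCode / 200.0);
    }

    uint32_t codeFromGain(double gain)
    {
        return static_cast<uint32_t>(200.0 * std::log10(gain));
    }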
|
|
Some sensors (namely the Sony IMX296, whose support will be added
shortly) require different conversion formulas between exposure time and
exposure lines. Make the Exposure() and ExposureLines() functions
virtual to allow this.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
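A minimal sketch of what such virtual conversion functions might look like,
with a sensor-specific override adding the fixed offset mentioned above; the
class and function names and units are illustrative, not the actual libcamera
declarations:

    #include <cstdint>

    class CamHelperSketch
    {
    public:
        virtual ~CamHelperSketch() = default;

        /* Default line <-> time conversions, in microseconds. */
        virtual double exposure(uint32_t lines, double lineLengthUs) const
        {
            return lines * lineLengthUs;
        }

        virtual uint32_t exposureLines(double timeUs, double lineLengthUs) const
        {
            return static_cast<uint32_t>(timeUs / lineLengthUs);
        }
    };

    class CamHelperImx296Sketch : public CamHelperSketch
    {
    public:
        /* The IMX296 adds a fixed 14.26 us offset to the exposure time. */
        double exposure(uint32_t lines, double lineLengthUs) const override
        {
            return lines * lineLengthUs + offsetUs;
        }

        uint32_t exposureLines(double timeUs, double lineLengthUs) const override
        {
            return static_cast<uint32_t>((timeUs - offsetUs) / lineLengthUs);
        }

    private:
        static constexpr double offsetUs = 14.26;
    };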
|
|
The Sony IMX296 is a global shutter sensor with a 1456x1088 pixel array
size, with a recommended resolution after CFA interpolation of
1440x1080.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AwbAuto mode is defined in all the JSON tuning files as "auto",
not "normal". The effect of this was that you couldn't switch back to
"auto" mode once you had switched away.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The includes in the v4l2_camera_proxy do not match the code style
and trigger clang-format changes when running checkstyle.
Update and fix the include order accordingly.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The configure() function has a local configuration variable referencing
context.configuration for the purpose of shortening lines. Use it
instead of context.configuration in the remaining locations, and
constify it while at it as the configuration isn't meant to be modified.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The frame count is used to skip the gain and exposure filtering when
starting. It thus needs to be reset when configuring the algorithm, to
avoid slower convergence when stopping and restarting.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The datasheet for the OV2740 gives 0x80 as 1x gain, so real gain
is GainCode / 128.
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
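A small sketch of the resulting conversion, based on the 0x80 = 1x
relationship stated above; the helper names are illustrative:

    #include <cstdint>

    /* OV2740: 0x80 (128) is 1x gain, so the real gain is gainCode / 128. */
    double ov2740Gain(uint32_t gainCode)
    {
        return gainCode / 128.0;
    }

    uint32_t ov2740GainCode(double gain)
    {
        return static_cast<uint32_t>(gain * 128.0);
    }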
|
|
Add camera sensor property entries for the OmniVision 2740 camera
sensor.
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Pick up the focus value from the AF algorithm and send lens controls
along in the frame context.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When configuring the pipeline we want to share the controls of any VCM
device along with the sensor's controls, so pass them to the
lensControls member of configInfo.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that we have added lens controls, rename the existing member of the
class to clarify that it relates to the sensor's controls.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add further members to the IPU3 IPA interface that will hold the lens
controls passed in by configInfo.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a function to the CameraLens class to fetch the V4L2 controls
for its V4L2 subdev.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a function to check for and initialise any VCMs linked to the
CameraSensor's entity by ancillary links. This should initialise
the lens_ member with the linked entity. Call the new function
during CameraSensor::init().
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The populateLinks() function can't currently handle ancillary links,
which causes an error to be thrown in process() when the MediaObject
cannot be cast to a MediaPad.
Add explicit handling for the different link types, either creating
pad-to-pad links or storing a pointer to the ancillary device's
MediaEntity in the ancillaryEntities_ member of the primary entity.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
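A hedged sketch of the link-type dispatch described above, using the media
controller link flags from linux/media.h (MEDIA_LNK_FL_ANCILLARY_LINK requires
recent kernel headers); MediaEntitySketch stands in for libcamera's
MediaEntity and only the type handling is shown:

    #include <linux/media.h>
    #include <vector>

    struct MediaEntitySketch {
        std::vector<MediaEntitySketch *> ancillaryEntities_;
    };

    void handleLink(unsigned int flags, MediaEntitySketch *primary,
                    MediaEntitySketch *ancillary)
    {
        switch (flags & MEDIA_LNK_FL_LINK_TYPE) {
        case MEDIA_LNK_FL_DATA_LINK:
            /* Create a pad-to-pad link between the two entities. */
            break;
        case MEDIA_LNK_FL_ANCILLARY_LINK:
            /* Record the ancillary device against the primary entity. */
            primary->ancillaryEntities_.push_back(ancillary);
            break;
        default:
            /* Other link types are not modelled in this sketch. */
            break;
        }
    }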
|
|
With kernel support for ancillary links, we can describe the
relationship between two devices represented individually as instances
of MediaEntity. As the only property of that relationship is its
existence, describe those relationships in libcamera simply as a
vector of MediaEntity pointers to the ancillary devices.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Daniel Scally <djrscally@gmail.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Since the VCM of the Surface Go 2 (dw9719) can now be driven
successfully, this AF module can be used to control the VCM and
determine the focus value based on the IPU3 AF state.
Based on the values from the IPU3 AF buffer, the variance of each focus
step is computed and a greedy approach is used to find the focus value
that maximises the variance of the AF state.
The grid configuration is implemented as part of the context. Also, the
grid parameter AF_MIN_BLOCK_WIDTH is set to 4 (the default is 3)
because with the default value x_start (x_start > 640) would end up at
an incorrect location in the image (the rightmost part of the sensor).
Signed-off-by: Kate Hsuan <hpa@redhat.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
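A generic, hedged sketch of the greedy variance-maximisation idea described
above; the statistics layout, step size and data types are assumptions of
this example and do not reflect the actual IPU3 definitions:

    #include <cstdint>
    #include <utility>
    #include <vector>

    /* Variance of the per-block contrast values reported for one focus step. */
    double afVariance(const std::vector<uint32_t> &blocks)
    {
        double mean = 0.0;
        for (uint32_t value : blocks)
            mean += value;
        mean /= blocks.size();

        double variance = 0.0;
        for (uint32_t value : blocks)
            variance += (value - mean) * (value - mean);

        return variance / blocks.size();
    }

    /* Greedy search: keep the lens position that maximises the variance. */
    std::pair<uint32_t, double>
    greedyFocusSearch(const std::vector<std::vector<uint32_t>> &statsPerStep,
                      uint32_t stepSize)
    {
        uint32_t bestPosition = 0;
        double bestVariance = 0.0;

        for (size_t step = 0; step < statsPerStep.size(); step++) {
            double variance = afVariance(statsPerStep[step]);
            if (variance > bestVariance) {
                bestVariance = variance;
                bestPosition = step * stepSize;
            }
        }

        return { bestPosition, bestVariance };
    }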
|
|
The gain values are coded as u3.13 fixed point values, i.e. they cannot
exceed 8. Clamp the values to avoid any out-of-range value that could
make the IPU3 behave in unexpected ways.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
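A small sketch of the clamping, using the u3.13 range stated above (13
fractional bits, values strictly below 8); the lower bound and the rounding
behaviour are assumptions of this example:

    #include <algorithm>
    #include <cstdint>

    uint16_t gainToU3_13(double gain)
    {
        /* Largest representable u3.13 value: 8 - 2^-13. */
        constexpr double kMaxGain = 8.0 - 1.0 / 8192.0;

        gain = std::clamp(gain, 0.0, kMaxGain);
        return static_cast<uint16_t>(gain * 8192.0);
    }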
|
|
Instead of having a local cached value for line duration, store it in
the IPASessionConfiguration::sensor structure.
While at it, configure the default analogue gain and shutter speed to
controlled fixed values.
The latter is set to 10ms as it will in most cases be close to the
value needed, making the AGC converge faster.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
When the effective sensor values are stored during the EventStatReady
event, the lines are too long. Fix it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
When the current exposure value is calculated, it is cached and later
used by filterExposure(). Use the private filteredExposure_ member and
pass currentExposure as a parameter.
To limit the use of filteredExposure_, return the filtered value from
filterExposure().
While at it, remove a stale comment.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
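A hedged sketch of the resulting interface: filterExposure() takes the current
exposure as a parameter and returns the filtered value, keeping
filteredExposure_ as the only cached state. The smoothing factor and the use
of a simple exponential filter are assumptions of this example:

    #include <chrono>

    using Duration = std::chrono::duration<double>;

    class AgcSketch
    {
    public:
        Duration filterExposure(Duration currentExposure)
        {
            constexpr double speed = 0.2;

            /* Seed the filter on the first call, then smooth. */
            if (filteredExposure_ == Duration(0.0))
                filteredExposure_ = currentExposure;
            else
                filteredExposure_ = speed * currentExposure +
                                    (1.0 - speed) * filteredExposure_;

            return filteredExposure_;
        }

    private:
        Duration filteredExposure_ = Duration(0.0);
    };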
|
|
The CameraLens class implements a function named "setFocusPostion".
There is a typo here, fix it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We must calculate the initial scaler crop when the camera is
configured, otherwise the metadata will report this rectangle as being
all zeroes.
Because the calculation is identical to that performed later in handling
the scaler crop control, we factor it into a small helper function,
RPiCameraData::scaleIspCrop.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This commit provides an overview of Camera3RequestDescriptor, which aids
in tracking each capture request placed by the Android framework with
the libcamera HAL.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Specifically document:
- CameraDevice::sendCaptureResults()
- CameraDevice::completeDescriptor()
- CameraDevice::streamProcessingComplete()
- CameraStream::PostProcessorWorker class
- Camera3RequestDescriptor::StreamBuffer structure
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use structured bindings in range-based for loops for better readability.
Signed-off-by: Nejc Galof <galof.nejc@gmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The V4L2 Compatibility layer is returning timestamps for buffers which
are incorrectly calculated from the frame metadata.
The sec component of the timestamp is correct, but the usec component is
wrong, leaving frame captures reporting non-monotonically increasing
timestamps and incorrect frame rate calculations.
Fix the usec calculation reported by the V4L2 adaptation layer.
Bug: https://bugs.libcamera.org/show_bug.cgi?id=118
Fixes: 0ce8f2390b52 ("v4l2: v4l2_compat: Add V4L2 compatibility layer")
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
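A sketch of the intended conversion, assuming the frame metadata timestamp is
a monotonic clock value in nanoseconds that must be split into the timeval
used by V4L2:

    #include <cstdint>
    #include <sys/time.h>

    struct timeval timestampToTimeval(uint64_t timestampNs)
    {
        struct timeval tv;

        tv.tv_sec = timestampNs / 1000000000ULL;
        tv.tv_usec = (timestampNs % 1000000000ULL) / 1000ULL;

        return tv;
    }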
|
|
With the controller algorithms running at 60fps, some frames are dropped
when running at very high framerates. Reducing this to 30fps eliminates
all of these drops without any noticeable change to image quality.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Simplify the image and embedded buffer matching logic by removing the assumption
that we require a buffer match between the two streams. Instead, if an image
buffer does not match with an embedded data buffer, simply use the ControlList
provided by DelayedControls for the sensor parameters.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
If Stream::returnBuffer() gets passed an internally allocated buffer, it now
simply re-queues it back to the device. With this change, the pipeline handler
code can be simplified slightly as it does not need multiple code paths for
internally allocated and non-internally allocated buffers.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We now handle disabling ("pausing") AWB in the same way as
AEC/AGC. Instead of letting the pause flag be set so that the code
never runs at all, we instead fix the manual settings to the current
values (but continue to be called).
The algorithm does not restart any calculations in this state, but
continues to add AWB metadata to every frame. Therefore certain other
algorithms that want to know it (CCM and ALSC, for example) can still
find it.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When searching for a suitable pipeline, we mistakenly only break from
the inner loop. This results in the last suitable output being selected.
Pick the first one instead.
Fixes: 1de0f90dd432 ("cam: kms_sink: Print display pipeline configuration")
Signed-off-by: Eric Curtin <ecurtin@redhat.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Objects are not expected to be connected to the same signal more than
once. Doing so likely indicates a bug in the code, and can be
highlighted in debug builds with an assert that performs a lookup on the
signals_ list.
While the implementation could allow objects to connect to a specific
signal multiple times, there are no expected use cases for this in
libcamera, so the behaviour is restricted in favour of defensive
programming, raising an error when it occurs.
Remove the support in the test framework which uses multiple Signal
connections on the same object, and update the test to use a second
Signal.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
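A hedged sketch of the kind of debug-build check described above, performing a
lookup before recording a new connection; the types stand in for the actual
libcamera Signal and Object classes:

    #include <algorithm>
    #include <cassert>
    #include <vector>

    class SignalBaseSketch {};

    class ObjectSketch
    {
    public:
        void connect(SignalBaseSketch *signal)
        {
            /* Connecting the same signal twice likely indicates a bug. */
            assert(std::find(signals_.begin(), signals_.end(), signal) ==
                   signals_.end());
            signals_.push_back(signal);
        }

    private:
        std::vector<SignalBaseSketch *> signals_;
    };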
|
|
Provide a call allowing requests to be registered and associated with
the pipeline handler after being constructed by the camera.
This provides an opportunity for the PipelineHandler to connect any
signals it may be interested in receiving for the request such as
getting notifications when the request is ready for processing when
using a fence.
While here, update the existing usage of the d pointer in
Camera::createRequest() to match the style of other functions.
Bug: https://github.com/raspberrypi/libcamera-apps/issues/217
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Support easier usage of the v4l2 compatibility layer with a script that
handles the LD_PRELOAD for applications.
The wrapper can be prefixed to launch any application with the preload
set:
$ libcamera-v4l2 v4l2-ctl --list-devices
\_SB_.PCI0.GP13.XHC0.RHUB.PRT4- (libcamera:0):
/dev/video0
platform/vimc.0 Sensor B (libcamera:1):
/dev/video2
/dev/video3
/dev/video4
Specifying '-d' once before the command to run will enable V4L2Compat
layer debug output from libcamera.
Specifying '-d' twice will enable all debug levels from all libcamera
components and provide a very detailed log for analysis.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Currently, a subproject is used to fetch the gtest dependency
unconditionally for any Linux distribution besides ChromeOS. But this
leads to a regression in distros whose builders are not allowed to
download files during the build.
This change was introduced by commit 0d50a04cc918 ("lc-compliance: Build
with gtest in subprojects") and the rationale is that some distros, such
as Debian, ship libgtest-dev as a static library, which could be built
with a different toolchain than the one used to build libcamera itself.
But this seems to be a corner case: usually users will either build
libcamera and all its dependencies with the same toolchain, or build it
with both the libgtest library and the toolchain provided by the distro.
If someone doesn't want meson to pick up an incompatible static library
provided by the distro, they should instead make sure that their build
root does not have the package providing it installed.
Let's simplify the logic to find the dependency and just use the
built-in support of the dependency() function to fall back to a
subproject if it is not found. This covers the common case of using the
gtest provided by the system, or pulling it from source if it is not
found or not preferred.
To force meson to fallback to the wrap for gtest you can use this command:
meson configure -Dforce_fallback_for=gtest
and to force fallback for all the dependencies, you can use the following:
meson build --wrap-mode=forcefallback
Fixes: commit 0d50a04cc918 ("lc-compliance: Build with gtest in subprojects")
Reported-by: Eric Curtin <ecurtin@redhat.com>
Signed-off-by: Javier Martinez Canillas <javierm@redhat.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Increase the maximum allowable gain from 6.0 to 8.0 in the normal and short
exposure profiles for all camera sensors. Increase this limit to 12.0 for the
long exposure profiles for sensors where this has been defined.
The 6.0x value was somewhat arbitrarily chosen, and does limit the total
exposure in dark conditions.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Limit the gain code to the maximum value reported by the sensor controls
when sending to DelayedControls. The AGC algorithm will handle a lower
gain used by the sensor, provided it knows the actual gain used. This
change ensures that DelayedControls will never report an unclipped gain
used.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Remove the code that marks the Embedded Data stream as external with the Unicam
Image (RAW) stream. This was needed for legacy reasons when matching image and
embedded buffers, but is not needed any more.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
V4L2 is happy to map buffers read-only for capture devices (but rejects
write-only mappings). We can support this as the dmabuf mmap()
implementation supports it. This change fixes usage of the V4L2
compatibility layer with OpenCV.
While at it, attempt to validate the other flags. videobuf2 requires
MAP_SHARED and doesn't check other flags, so mimic the same behaviour.
While unlikely, other flags could get rejected by other kernel layers for
V4L2 buffers but not for dmabuf. This can be handled later if the need
arises.
Reported-by: Nejc Galof <galof.nejc@gmail.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Nejc Galof <galof.nejc@gmail.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
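A minimal sketch of the flag handling described above: read-only mappings are
accepted, write-only mappings are rejected, and MAP_SHARED is required,
mimicking videobuf2; the helper name is illustrative:

    #include <sys/mman.h>

    bool validateMmapRequest(int prot, int flags)
    {
        /* Write-only mappings are rejected, read-only ones are allowed. */
        if ((prot & PROT_WRITE) && !(prot & PROT_READ))
            return false;

        /* videobuf2 requires MAP_SHARED and ignores the other flags. */
        if (!(flags & MAP_SHARED))
            return false;

        return true;
    }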
|
|
The ISP input stream currently only allocates a single slot in the
V4L2VideoDevice cache as it follows the number of buffers allocated for use.
However, this is wrong, as the ISP input stream imports buffers from the
Unicam image stream. As a consequence, only one cache slot was used during
runtime for the ISP input stream, and if multiple buffers were to be queued
simultaneously, the queue operation would return a failure.
Fix this by passing the same number of RAW buffers available from the Unicam
image stream. Additionally, double this count in the cases where buffers could
be allocated externally from the application.
Bug: https://github.com/raspberrypi/libcamera-apps/issues/236
Bug: https://github.com/raspberrypi/libcamera-apps/issues/238
Bug: https://github.com/raspberrypi/libcamera-apps/issues/240
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|