|
Now that we allow IPA modules to live in nested directories for the
Raspberry Pi platform, one more directory level has to be parsed to be
able to run libcamera from the source directory.
Without this patch, the $(builddir)/src/ipa/rpi/vc4/ipa_rpi_vc4.so
IPA module cannot be loaded.
The issue is only present when running from the source directory, as
when libcamera is installed all IPA modules are deployed to a single
$(prefix)/$(libcamera_libdir)/libcamera/ location.
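For illustration, a minimal depth-limited module scan sketch using
std::filesystem (not the actual IPAManager enumeration code):
#include <filesystem>
#include <string>
#include <vector>

std::vector<std::string> findIPAModules(const std::string &dir,
					unsigned int maxDepth)
{
	std::vector<std::string> modules;

	if (maxDepth == 0 || !std::filesystem::is_directory(dir))
		return modules;

	for (const auto &entry : std::filesystem::directory_iterator(dir)) {
		if (entry.is_directory()) {
			/* Recurse one level deeper, e.g. into rpi/vc4/. */
			auto nested = findIPAModules(entry.path(), maxDepth - 1);
			modules.insert(modules.end(), nested.begin(), nested.end());
		} else if (entry.path().extension() == ".so") {
			modules.push_back(entry.path());
		}
	}

	return modules;
}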
Fixes: 46aefed208fe ("pipeline: meson: Allow nested pipeline handler directory structures")
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Files opened internally in libcamera without the O_CLOEXEC flag will
remain open upon a call to one of the exec(3) functions. As exec()
doesn't destroy local or global objects, this can lead to various side
effects. Avoid this by opening file descriptors with O_CLOEXEC for all
internal files.
The O_CLOEXEC flag should also be set when obtaining file handles from
the V4L2 VIDIOC_EXPBUF operation, but was missed during the updates to
both d942bdc913c5 ("libcamera: v4l2_device: openat(2) with O_CLOEXEC to
cleanup after exec(3)") and 436b38fd89fe ("libcamera: Open files with
O_CLOEXEC").
Set the O_CLOEXEC flag on calls to ioctl(VIDIOC_EXPBUF).
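For reference, a sketch of exporting a buffer with the flag set (error
handling trimmed, not the verbatim libcamera change):
#include <errno.h>
#include <fcntl.h>
#include <string.h>
#include <sys/ioctl.h>
#include <linux/videodev2.h>

int exportBuffer(int fd, unsigned int index)
{
	struct v4l2_exportbuffer expbuf;

	memset(&expbuf, 0, sizeof(expbuf));
	expbuf.type = V4L2_BUF_TYPE_VIDEO_CAPTURE;
	expbuf.index = index;
	expbuf.flags = O_CLOEXEC;	/* close the exported dmabuf fd on exec(3) */

	if (ioctl(fd, VIDIOC_EXPBUF, &expbuf) < 0)
		return -errno;

	return expbuf.fd;
}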
Fixes: 436b38fd89fe ("libcamera: Open files with O_CLOEXEC")
Signed-off-by: Elias Naur <mail@eliasnaur.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Move the v4l2-compat.so shared library installation to the libcamera
directory under libexec. This is the same location where the proxy
workers live, and it will make packaging the V4L2 compatibility layer
for distributions easier.
Create a new libcamera_libexecdir variable within meson to simplify
representation of this path and update the proxy worker meson.build
infrastructure to make use of it as well.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Javier Martinez Canillas <javierm@redhat.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a flags_ field to indicate stream state information in RPi::Stream.
This replaces the existing external_ and importOnly_ boolean flags.
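For illustration, a sketch of a flags-based replacement for the two
booleans; the enumerator names are assumptions, not the actual
RPi::Stream definitions:
#include <cstdint>

enum class StreamFlag : std::uint32_t {
	None = 0,
	External = 1U << 0,	/* buffers are visible to the application */
	ImportOnly = 1U << 1,	/* buffers are imported, not allocated */
};

constexpr StreamFlag operator|(StreamFlag a, StreamFlag b)
{
	return static_cast<StreamFlag>(static_cast<std::uint32_t>(a) |
				       static_cast<std::uint32_t>(b));
}

constexpr bool operator&(StreamFlag a, StreamFlag b)
{
	return static_cast<std::uint32_t>(a) & static_cast<std::uint32_t>(b);
}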
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Create a new PipelineHandlerBase class that handles general purpose
housekeeping duties for the Raspberry Pi pipeline handler. Code for the
implementation of the new class is essentially pulled from the existing
pipeline/rpi/vc4/raspberrypi.cpp file with a small amount of
refactoring.
Create a derived PipelineHandlerVc4 class from PipelineHandlerBase that
handles the VC4 pipeline specific tasks of the pipeline handler. Again,
code for this class implementation is taken from the existing
pipeline/rpi/vc4/raspberrypi.cpp with a small amount of
refactoring.
The goal of this change is to allow third parties to implement their own
pipeline handlers running on the Raspberry Pi without duplicating all of
the pipeline handler housekeeping tasks.
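A structural sketch of the split described above; the method names are
illustrative, not the actual libcamera declarations:
class PipelineHandlerBase
{
public:
	virtual ~PipelineHandlerBase() = default;

	/* General-purpose housekeeping shared by all Raspberry Pi platforms. */
	void handleRequestCommon()
	{
		/* buffer and request bookkeeping lives here */
	}

protected:
	/* Platform-specific hook implemented by derived classes. */
	virtual int platformConfigure() = 0;
};

class PipelineHandlerVc4 : public PipelineHandlerBase
{
protected:
	int platformConfigure() override
	{
		/* VC4/BCM2835-specific ISP and device setup goes here. */
		return 0;
	}
};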
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Return a const std::string reference from RPi::Stream::name() to avoid
copying a string when not needed.
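The accessor shape described above, as a minimal sketch:
#include <string>

class Stream
{
public:
	const std::string &name() const { return name_; }

private:
	std::string name_;
};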
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The region weights for the AGC zones are handled by the AGC
algorithm. Apply them directly in the IPA (vc4.cpp) to the statistics
that we pass to the AGC.
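An illustrative sketch of weighting per-zone statistics before handing
them to AGC; the types are assumptions, not the actual IPA structures:
#include <cstdint>
#include <vector>

struct ZoneStats {
	std::uint64_t sum;	/* accumulated pixel sum for the zone */
	std::uint32_t counted;	/* number of pixels accumulated */
};

void applyAgcWeights(std::vector<ZoneStats> &zones,
		     const std::vector<double> &weights)
{
	for (unsigned int i = 0; i < zones.size() && i < weights.size(); i++) {
		zones[i].sum = static_cast<std::uint64_t>(zones[i].sum * weights[i]);
		zones[i].counted = static_cast<std::uint32_t>(zones[i].counted * weights[i]);
	}
}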
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Create a new IpaBase class that handles general purpose housekeeping
duties for the Raspberry Pi IPA. The implementation of the new class
is essentially pulled from the existing ipa/rpi/vc4/raspberrypi.cpp file
with a minimal amount of refactoring.
Create a derived IpaVc4 class from IpaBase that handles the VC4 pipeline
specific tasks of the IPA. Again, code for this class implementation is
taken from the existing ipa/rpi/vc4/raspberrypi.cpp with a
minimal amount of refactoring.
The goal of this change is to allow third parties to implement their own
IPA running on the Raspberry Pi without duplicating all of the IPA
housekeeping tasks.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Restructure the IPA mojom interface to be more consistent in the use
of the API. Function parameters are now grouped into *Params structures
and results are now returned in *Results structures.
The following pipeline -> IPA interfaces have been removed:
signalQueueRequest(libcamera.ControlList controls);
signalIspPrepare(ISPConfig data);
signalStatReady(uint32 bufferId, uint32 ipaContext);
and replaced with:
prepareIsp(PrepareParams params);
processStats(ProcessParams params);
signalQueueRequest() is now encompassed within prepareIsp().
The following IPA -> pipeline interfaces have been removed:
runIsp(uint32 bufferId);
embeddedComplete(uint32 bufferId);
statsMetadataComplete(uint32 bufferId, libcamera.ControlList controls);
and replaced with the following async calls:
prepareIspComplete(BufferIds buffers);
processStatsComplete(BufferIds buffers);
metadataReady(libcamera.ControlList metadata);
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
At present, an RPiStream buffer id of -1 indicates an invalid value.
As a simplification, use id == 0 to indicate an invalid value. This
allows for better code readability.
As a consequence of this, use unsigned int for the buffer id values.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Split the Raspberry Pi pipeline handler and IPA source code into common
and VC4/BCM2835 specific file structures.
For the pipeline handler, the common code files now live in
src/libcamera/pipeline/rpi/common/
and the VC4-specific files in src/libcamera/pipeline/rpi/vc4/.
For the IPA, the common code files now live in
src/ipa/rpi/{cam_helper,controller}/
and the VC4-specific files in src/ipa/rpi/vc4/. With this change, the
camera tuning files are now installed under share/libcamera/ipa/rpi/vc4/.
To build the pipeline and IPA, the meson configuration options have now
changed from "raspberrypi" to "rpi/vc4":
meson setup build -Dipas=rpi/vc4 -Dpipelines=rpi/vc4
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The current pipeline handler build files require a flat directory
structure for each pipeline handler. Modify the build files to remove
this restriction and allow a directory structure such as:
src/libcamera/pipeline/
|- raspberrypi
|    |- common
|    |- vc4
|- rkisp1
|- ipu3
where each subdir (e.g. raspberrypi/common, raspberrypi/vc4) has its own
meson.build file. Such a directory structure will be introduced for the
Raspberry Pi pipeline handler in a future commit.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The current IPA build files require a flat directory structure for the
IPAs. Modify the build files to remove this restriction and allow a
directory structure such as:
src/ipa
|- raspberrypi
|    |- common
|    |- cam_helper
|    |- controller
|    |- vc4
|- rkisp1
|- ipu3
where each subdir (e.g. raspberrypi/common, raspberrypi/cam_helper) has
its own meson.build file. Such a directory structure will be introduced
for the Raspberry Pi IPA in a future commit.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Remove the restriction on using the '/' character in the IPA name
string. This allows more flexibility in IPA directory structures where
different IPA modules might live in subdirectories under the usual
src/ipa/<platform>/ top level directory.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The switch statement checks `roles.size()` with cases for 1 and 2, so
in the `default` branch `roles.size() > 2`, i.e. it is always different
from 1, making the check unnecessary.
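A simplified, self-contained sketch of the control flow in question:
#include <cstddef>

void configureForRoles(std::size_t numRoles)
{
	switch (numRoles) {
	case 1:
		/* single-stream configuration */
		break;
	case 2:
		/* dual-stream configuration */
		break;
	default:
		/* numRoles > 2 here, so comparing it against 1 is redundant. */
		break;
	}
}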
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
The STM32 contains a camera pipeline known as the DCMIPP (Digital
Camera-Memory Interface Pixel Processor) which receives data from a
parallel interface and dumps the post-processed data to memory. The
pipeline is capable of some processing in the form of downscaling
captured data through cropping or skipping the sensor's output.
The simple pipeline handler is quite capable of handling the DCMIPP
given its operation is handled entirely through configuring the pads
of a media graph, so add support for the driver to the pipeline's
supportedDevices array.
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
A new warning introduced in gcc-13 produces a false positive on the
cam file sink object:
src/cam/file_sink.cpp:92:45: error: possibly dangling reference to a temporary [-Werror=dangling-reference]
92 | const FrameMetadata::Plane &meta = buffer->metadata().planes()[i];
| ^~~~
src/cam/file_sink.cpp:92:81: note: the temporary was destroyed at the end of the full expression '(& buffer->libcamera::FrameBuffer::metadata())->libcamera::FrameMetadata::planes().libcamera::Span<const libcamera::FrameMetadata::Plane>::operator[](i)'
92 | const FrameMetadata::Plane &meta = buffer->metadata().planes()[i];
| ^
cc1plus: all warnings being treated as errors
Work around this issue by refactoring the code to take a local const
copy of the bytesused value, rather than a local const reference to the
plane.
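A sketch of the refactoring, assuming libcamera's FrameMetadata::Plane
type with a bytesused member; not the verbatim file_sink.cpp change:
#include <libcamera/framebuffer.h>

void writePlanes(libcamera::FrameBuffer *buffer)
{
	const libcamera::FrameMetadata &metadata = buffer->metadata();

	for (unsigned int i = 0; i < metadata.planes().size(); i++) {
		/*
		 * Binding a const reference to planes()[i] triggers the
		 * gcc-13 false positive; copying just the value does not.
		 */
		const unsigned int bytesused = metadata.planes()[i].bytesused;
		/* ... write bytesused bytes of plane i ... */
		(void)bytesused;
	}
}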
Bug: https://bugs.libcamera.org/show_bug.cgi?id=185
Bug: https://gcc.gnu.org/bugzilla/show_bug.cgi?id=107532
Co-developed-by: Khem Raj <raj.khem@gmail.com>
Signed-off-by: Eric Curtin <ecurtin@redhat.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
[Kieran: Commit and comment reworded prior to merge]
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add configuration files for the imx258 and ov8858 sensors found on
Pine64 PinephonePro devices to the RkISP1 data directory.
The tuning files contain LSC tables extracted from the Rockchip Android
BSP that the PinephonePro ships with.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Tested-by: Robert Mader <robert.mader@collabora.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add configuration files for the ov2685 and ov5695 sensors found on the
Google DRU "Scarlett" tablet to the RkISP1 data directory.
The tuning files contain LSC tables extracted from the Google Chrome
Android HAL configuration files, available at
src/overlays/overlay-scarlet/media-libs/cros-camera-hal-configs-scarlet/files/IQ/
at revision "stabilize-14790.B".
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Just like we do for other pipeline handlers already.
This ensures we correctly pass on transforms that are not handled by the
sensor - e.g. rotations - back to the app via the config, which is
required on devices like the Pinephone.
Signed-off-by: Robert Mader <robert.mader@collabora.com>
Tested-by: Arnav Singh <me@arnavion.dev>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that the media bus code selection procedure does not depend on
ISICameraConfiguration::formatsMap_, remove the association between the
PixelFormat supported by the ISI and the media bus code produced by the
sensor.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Daniel Scally <dan.scally@ideasonboard.com>
Tested-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In generateConfiguration() a YUV/RGB pixel format is preferred for the
StillCapture/VideoRecording/Viewfinder roles, but there is currently no
guarantee that the sensor provides a non-Bayer bus format from which
YUV/RGB can be generated.
As a result, the default configuration generated for those roles does
not work if the sensor is RAW-only.
To improve the situation, split the configuration generation in two:
one path for YUV modes and one for raw Bayer modes.
StreamRoles assigned to a YUV mode first try to generate a YUV
configuration, then fall back to RAW if that is all the sensor can
provide.
As an additional requirement, for YUV streams, the generated mode has to
be validated with the sensor to confirm the desired sizes can be
generated, as the ISI cannot up-scale. To test a format, use the newly
introduced CameraSensor::tryFormat().
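A self-contained conceptual sketch of the YUV-first, RAW-fallback flow,
using hypothetical stand-in types rather than the actual ISI pipeline
handler code:
#include <optional>
#include <string>

/* Hypothetical stand-ins for illustration only. */
struct Config {
	std::string pixelFormat;
};

std::optional<Config> tryYuvConfig(bool sensorHasNonBayerFormat)
{
	if (!sensorHasNonBayerFormat)
		return std::nullopt;

	return Config{ "YUYV" };
}

Config rawConfig()
{
	return Config{ "SRGGB10" };
}

Config generateDefaultConfig(bool sensorHasNonBayerFormat)
{
	/* Prefer YUV for StillCapture/VideoRecording/Viewfinder roles... */
	if (std::optional<Config> cfg = tryYuvConfig(sensorHasNonBayerFormat))
		return *cfg;

	/* ...and fall back to RAW when the sensor is Bayer-only. */
	return rawConfig();
}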
Reported-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Daniel Scally <dan.scally@ideasonboard.com>
Tested-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
Add a function to the CameraSensor class that allows testing a format
without applying it to the subdevice and without modifying any control
value associated with the camera sensor.
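A hedged usage sketch of the new function; the V4L2SubdeviceFormat field
names and header paths follow the libcamera API of that period and are
best-effort assumptions:
#include <libcamera/geometry.h>

#include "libcamera/internal/camera_sensor.h"
#include "libcamera/internal/v4l2_subdevice.h"

bool sensorCanProduce(libcamera::CameraSensor *sensor, unsigned int mbusCode,
		      const libcamera::Size &size)
{
	libcamera::V4L2SubdeviceFormat format{};
	format.mbus_code = mbusCode;
	format.size = size;

	if (sensor->tryFormat(&format) < 0)
		return false;

	/* The ISI cannot up-scale, so the sensor must cover the requested size. */
	return format.size.width >= size.width &&
	       format.size.height >= size.height;
}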
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The ISICameraConfiguration::validate() function selects which media
bus format to configure the sensor with based on the pixel format
of the first configured stream, using the media bus code associated with it
in the formatsMap_ map.
In order to remove the PixelFormat-to-mbus-code association in
formatsMap_ provide a wrapper function for the newly introduced
getRawMediaBusFormat() and getYuvMediaBusFormat() that automatically
selects what media bus format to use based on the first stream pixel
format.
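A hedged sketch of the selection wrapper; only getRawMediaBusFormat()
and getYuvMediaBusFormat() come from this series, the signature and the
rest are illustrative:
#include <libcamera/pixel_format.h>

#include "libcamera/internal/formats.h"

unsigned int getRawMediaBusFormat(const libcamera::PixelFormat &pixelFormat);
unsigned int getYuvMediaBusFormat(const libcamera::PixelFormat &pixelFormat);

unsigned int getMediaBusFormat(const libcamera::PixelFormat &pixelFormat)
{
	const libcamera::PixelFormatInfo &info =
		libcamera::PixelFormatInfo::info(pixelFormat);

	if (info.colourEncoding == libcamera::PixelFormatInfo::ColourEncodingRAW)
		return getRawMediaBusFormat(pixelFormat);

	return getYuvMediaBusFormat(pixelFormat);
}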
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Daniel Scally <dan.scally@ideasonboard.com>
Tested-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
As per the RAW format selection, the media bus format selection
procedure relies on the direct association of PixelFormat and media
bus code in the formatsMap_ map.
As the ISI can generate YUV and RGB formats from any non-Bayer media
bus format, break out the YUV/RGB media bus format selection to a
separate function.
The newly introduced getYuvMediaBusFormat() tests a list of
known-supported media bus formats against the list of media bus
formats supported by the sensor and tries to prefer media bus
codes with the same encoding as the requested PixelFormat.
Use the newly introduced function in
ISICameraConfiguration::validateYuv() to make sure the sensor can
produce a YUV/RGB media bus format.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Daniel Scally <dan.scally@ideasonboard.com>
Tested-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The current implementation of the ISI pipeline handler handles
translation of PixelFormat to media bus formats from the sensor
through a centralized map.
As the criterion for selecting the correct media bus code depends on
whether the output PixelFormat is a RAW Bayer format or not, start by
splitting out the RAW media bus code selection procedure by adding a
function for this purpose to the ISICameraData class.
Add the function to the ISICameraData and not to the
ISICameraConfiguration because:
- The sensor is a property of CameraData
- The same function will be re-used by the ISIPipelineHandler
during CameraConfiguration generation.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Daniel Scally <dan.scally@ideasonboard.com>
Tested-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
There is no need to wrap DRM::AtomicRequest in std::unique_ptr<>
in KMSSink::start(). Remove it so that the syntax becomes similar to
what we have in KMSSink::stop().
No functional changes intended.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
It's possible to construct a Camera with an unsafe controlInfo_.
This is the case in the Simple pipeline, where the camera controls are
not populated.
With Simple, if we attempt to set a Control, we end up with a segfault
because the default constructor for ControlInfoMap doesn't initialize
idmap_, which is initialized at class declaration time as
const ControlIdMap *idmap_ = nullptr;
Add some safeguards in ControlInfoMap to handle this case.
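A self-contained illustration of the nullptr guard pattern involved,
using hypothetical stand-in types rather than the actual ControlInfoMap
implementation:
#include <unordered_map>

/* Hypothetical stand-ins illustrating the guard, not libcamera's classes. */
using ControlIdMap = std::unordered_map<unsigned int, const char *>;

class InfoMap
{
public:
	bool contains(unsigned int id) const
	{
		/*
		 * A default-constructed map never set idmap_, so guard
		 * against nullptr before dereferencing it.
		 */
		if (!idmap_)
			return false;

		return idmap_->find(id) != idmap_->end();
	}

private:
	const ControlIdMap *idmap_ = nullptr;
};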
Link: https://lists.libcamera.org/pipermail/libcamera-devel/2023-April/037439.html
Suggested-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Mattijs Korpershoek <mkorpershoek@baylibre.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
In `udevNotify()`, constructing an std::string from the device's
associated action is unnecessary as it is only compared against static
strings, and for that purpose an std::string_view works just as well,
while being cheaper to construct.
In the same vein, an std::string_view can be used to store the device's
devnode initially, and the string construction can be deferred until it
is needed.
Furthermore, previously `udev_device_get_devnode()` was called twice.
The extra call is now removed.
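A sketch of the std::string_view usage described above; the addDevice()
and removeDevice() handlers are hypothetical:
#include <string>
#include <string_view>

#include <libudev.h>

/* Hypothetical handlers, for illustration only. */
void addDevice(const std::string &deviceNode);
void removeDevice(const std::string &deviceNode);

void udevNotify(struct udev_device *dev)
{
	const char *actionStr = udev_device_get_action(dev);
	const char *devnodeStr = udev_device_get_devnode(dev);
	if (!actionStr || !devnodeStr)
		return;

	/* Compare without allocating a std::string... */
	const std::string_view action(actionStr);
	const std::string_view deviceNode(devnodeStr);

	/* ...and construct one only when it is actually needed. */
	if (action == "add")
		addDevice(std::string(deviceNode));
	else if (action == "remove")
		removeDevice(std::string(deviceNode));
}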
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add format definitions and mappings for 14-bit Bayer RAW formats.
Add definitions for non-packed and CSI-2 packed variants.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
|
|
Previously, after `addV4L2Device()` had seen all dependencies, it would
remove the `MediaDeviceDeps` object from the `pending_` list, which
would result in it being destroyed. However, there would still be
(dangling) pointers to this object in `devMap_` that were added in
`addUdevDevice()` (line 103). So remove the entry with the given devnum
when it is removed from the corresponding `MediaDeviceDeps` object.
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The "shadows" constraint mode actually exists in a number of tuning
files, but had been omitted from the list of supported modes.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Clarify IPA's acronym by specifying what "IPA" stands for as part of the
ipa namespace's "brief" doxygen-generated description. This allows
visitors to the docs to immediately have an idea of the purpose of the
IPA namespace at a glance. Because of the prevalence and importance of
the IPA namespace and functionality, the fact that it stands for "Image
Processing Algorithm" should be accessible to even casual perusers of
the docs.
Signed-off-by: Gabby George <gabbymg94@gmail.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Files opened internally in libcamera without the O_CLOEXEC flag will
remain open upon a call to one of the exec(3) functions. As exec()
doesn't destroy local or global objects, this can lead to various side
effects. Avoid this by opening file descriptors with O_CLOEXEC for all
internal files.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use the generalised focus statistics structure to compute the centre
window focus FoM value. This avoids needing to hard-code a specific
grid size.
Remove the focus reporting algorithm as the functionality is duplicated
by this bit of IPA code. Remove focus_status.h as it is no longer needed.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Remove any hard-coded assumptions about the target hardware platform
from the autofocus algorithm. Instead, use the "target" string provided
by the camera tuning config and generalised statistics structures to
determine parameters such as grid and region sizes.
Additionally, PDAF statistics are represented by a generalised region
statistics structure to be device agnostic.
These changes also require the autofocus algorithm to initialise
region weights on the first frame's prepare()/process() call rather
than during initialisation.
Signed-off-by: Nick Hollinghurst <nick.hollinghurst@raspberrypi.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Remove any hard-coded assumptions about the target hardware platform
from the AGC algorithm. Instead, use the "target" string provided by
the camera tuning config and generalised statistics structures to
determine parameters such as grid and region sizes.
This change replaces all hard-coded arrays with equivalent std::vector
types.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Update the overloaded RegionStats::get() and RegionStats::getFloating()
member functions to return a Region struct for consistency.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Generalise the contrast algorithm code by removing any hard-coded
assumptions about the target hardware platform. Instead, the algorithm
code creates a generic Pwl that gets returned to the IPA, where it gets
converted to the bcm2835 hardware specific lookup table.
As a drive-by, remove an unused mutex.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The algorithm uses the data type std::vector<std::array<double, 4>> to
represent the large sparse matrices that are XY (X, Y being the ALSC
grid size) high but with only 4 non-zero elements on each row.
Replace this slightly long type name by SparseArray<double>.
No functional changes.
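In code terms, the alias described above (sketch):
#include <array>
#include <vector>

template<typename T>
using SparseArray = std::vector<std::array<T, 4>>;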
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The Array2D class is a very thin wrapper round std::vector that can be
used almost identically in the code, but it carries its 2D size with
it so that we aren't passing it around all the time.
All the std::vectors that were X * Y in size (X and Y being the ALSC
grid size) have been replaced. The sparse matrices that are XY * 4 in
size have not been replaced, as they are somewhat different, are used
differently, require more code changes, and would actually make things
more confusing if everything looked like an Array2D without being the
same.
There should be no change in algorithm behaviour at all.
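A minimal illustration of the idea, not the actual RPi controller class:
#include <vector>

template<typename T>
class Array2D
{
public:
	void resize(unsigned int width, unsigned int height, const T &value = T())
	{
		width_ = width;
		height_ = height;
		data_.assign(width * height, value);
	}

	T &operator()(unsigned int x, unsigned int y) { return data_[y * width_ + x]; }
	const T &operator()(unsigned int x, unsigned int y) const { return data_[y * width_ + x]; }

	unsigned int width() const { return width_; }
	unsigned int height() const { return height_; }

private:
	unsigned int width_ = 0;
	unsigned int height_ = 0;
	std::vector<T> data_;
};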
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Remove any hard-coded assumptions about the target hardware platform
from the ALSC algorithm. Instead, use the "target" string provided by
the camera tuning config and generalised statistics structures to
determine parameters such as grid and region sizes.
The ALSC calculations use run-time allocated arrays/vectors on every
frame. Allocating these might add a non-trivial run-time penalty.
Replace these dynamic allocations with a set of reusable pre-allocated
vectors during the init phase.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a new Controller::HardwareConfig structure that captures the
hardware statistics grid/histogram sizes and pipeline widths. This
ensures there is a single centralised place for these parameters.
Add a getHardwareConfig() helper function to retrieve these values for a
given hardware target.
Update the statistics populating routine in the IPA to use the values
from this structure instead of the hardcoded numbers.
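A hedged sketch of such a centralised description keyed by the tuning
file's "target" string; field names and values are placeholders, not
the actual Controller::HardwareConfig contents:
#include <map>
#include <string>

struct HardwareConfig {
	unsigned int agcRegionsX, agcRegionsY;
	unsigned int awbRegionsX, awbRegionsY;
	unsigned int focusRegionsX, focusRegionsY;
	unsigned int numHistogramBins;
	unsigned int pipelineWidth;
};

const HardwareConfig *getHardwareConfig(const std::string &target)
{
	/* Placeholder values for illustration only. */
	static const std::map<std::string, HardwareConfig> configs = {
		{ "bcm2835", { 16, 12, 16, 12, 4, 3, 128, 13 } },
	};

	auto it = configs.find(target);
	return it != configs.end() ? &it->second : nullptr;
}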
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The target string may be used by algorithms to determine the running
hardware target.
Store the target string provided by the camera tuning files in the
controller state. Add a getTarget() member function to retrieve this
string.
Validate the correct hardware target ("bcm2835") during the IPA
initialisation phase.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Fix a bug in the default frame durations calculation where the min/max
values are swapped round. This is a rarely travelled code path, so has
not actually caused a reported failure.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Make a copy of the CameraMode structure on a switch mode call. This
replaces the existing lastSensitivity_ field.
Limit the AGC gain calculations to the minimum value given by the
CameraMode structure. The maximum value remains unclipped as any gain
over the sensor maximum will be made up by digital gain.
Rename clipShutter to limitShutter for consistency, and have the latter
limit the shutter speed to both upper and lower bounds.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use the new analogue gain and shutter speed limit fields in the IPA
code when reporting back the control value limits and calculating the
analogue gain code to use. This also replaces the now unused (and
removed) maxSensorGainCode_ field.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add fields in the CameraMode structure to capture the mode specific
limits for analogue gain and shutter speed. For convenience, also add
fields for minimum and maximum frame durations.
Populate these new fields when setting up the CameraMode structure.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When an executable using libcamera calls exec(3) while a camera is in
use, file descriptors corresponding to the V4L2 video devices are kept
open, as they have been created without O_CLOEXEC. This results in the
video devices staying busy, preventing the new executable from using
them:
[91] ERROR V4L2 v4l2_videodevice.cpp:1047 /dev/video0[149:cap]: Unable to set format: Resource busy
Fix this by opening video devices with O_CLOEXEC, which is generally a
good idea in libraries.
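A sketch of the pattern (not the verbatim libcamera change):
#include <fcntl.h>

int openVideoDevice(const char *deviceNode)
{
	/* O_CLOEXEC keeps the fd from leaking across exec(3). */
	return open(deviceNode, O_RDWR | O_NONBLOCK | O_CLOEXEC);
}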
Signed-off-by: Elias Naur <mail@eliasnaur.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Invalid, or not correctly reset requests can cause undefined behaviour
in the pipeline handlers due to unexpected request state.
If the status has not been reset to Request::RequestPending, it is
either not new, or has not been correctly processed through
Request::reuse().
This can be caught early by validating the status of the request when it
is queued to a camera.
Reject invalid requests before processing them in the pipeline handlers.
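A hedged sketch of the early check; the surrounding function is
simplified, not the actual Camera::queueRequest() implementation:
#include <cerrno>

#include <libcamera/request.h>

int queueRequestChecked(libcamera::Request *request)
{
	if (request->status() != libcamera::Request::RequestPending) {
		/* Not newly created nor reset through Request::reuse(). */
		return -EINVAL;
	}

	/* ... hand the request to the pipeline handler ... */
	return 0;
}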
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|