Age | Commit message | Author |
|
Initialize the pixel array properties in the UVC pipeline handler, as
they're now initialized in the CameraSensor class, which the UVC
pipeline handler does not use.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The 'rotation' property is not critical. Only register it if the
sensor driver reports it.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
If the sensor driver does not report the camera location, default it
to 'External' instead of 'Front'.
As the camera location is used to construct the camera unique name
presented to the user, it makes more sense to report multiple 'External'
cameras instead of multiple 'Front' ones.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
As support for the V4L2_SEL_TGT_CROP selection target, used to read the
sensor analogue crop rectangle, is scheduled to become mandatory but is
still optional, use the sensor's active area size as a fallback value to
allow the creation of the CameraSensorInfo in case the driver does not
support it.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
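As an illustration, a minimal sketch of the fallback; the member names
(subdev_, pad_, activeArea_) and the surrounding code are assumptions,
not the actual CameraSensor implementation:

  Rectangle analogCrop;
  int ret = subdev_->getSelection(pad_, V4L2_SEL_TGT_CROP, &analogCrop);
  if (ret) {
      /*
       * The driver doesn't support V4L2_SEL_TGT_CROP yet: fall back
       * to the sensor's full active pixel area.
       */
      analogCrop = activeArea_;
  }
  info->analogCrop = analogCrop;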
|
|
Support for the V4L2 selection API is currently optional in the
CameraSensor class. Properties registered by using values read through
that API are defaulted in several different places (the Android camera
HAL or the CameraSensor class).
In the future, support for the selection API will be made mandatory, but
to give the sensor drivers on all test platforms time to be updated, use
the sensor resolution as a fallback value for the sensor pixel array
properties and cache them as class member variables.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The CameraSensor class requires the sensor driver to report
information through V4L2 controls and through the V4L2 selection API,
and uses that information to register Camera properties and to
construct CameraSensorInfo class instances to provide them to the IPA.
Currently, validation of the kernel support happens each time a
feature is requested, with slightly similar debug/error messages
output to the user in case a feature is not supported.
Rationalize this by validating the sensor driver requirements in a
single function.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The scope of File::exists() was changed to only validate that a file
exists and is therefore not usable to check if a directory exists. This
breaks the persistent name generation for DT-based systems, as it uses
File::exists() to check for directories; fix this by using stat()
directly.
On Scarlet the persistent names of the cameras are impacted by this and
were incorrectly reported as:
$ cam -l
Available cameras:
1: Internal front camera (platform/ff160000.i2c/i2c-7/7-003c ov2685)
2: Internal front camera (platform/ff160000.i2c/i2c-7/7-0036 ov5695)
While the expected ones are restored with this fix:
$ cam -l
Available cameras:
1: Internal front camera (/base/i2c@ff160000/camera@3c)
2: Internal front camera (/base/i2c@ff160000/camera@36)
Fixes: 8f4e16f014c820a0 ("test: file: Check that directories are not treated as files")
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
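For reference, a directory check with stat() looks like this:

  #include <sys/stat.h>

  /* Returns true if path exists and refers to a directory. */
  static bool isDirectory(const char *path)
  {
      struct stat st;

      return stat(path, &st) == 0 && S_ISDIR(st.st_mode);
  }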
|
|
On the bail out path, always ensure that the ImgU and CIO2 are stopped
before freeing the buffers. The V4L2VideoDevice class guarantees that
calling stop() without having called start() is harmless, hence use
this guarantee to simplify the error paths.
Signed-off-by: Umang Jain <email@uajain.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Track the state of streamon/streamoff calls to simplify error paths.
Ensuring that streamOff() can be called on non-streaming streams
facilitates simpler error code paths, where a set of devices can all
call streamOff regardless of their initialisation state.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Umang Jain <email@uajain.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
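A minimal sketch of the pattern with a placeholder class (not the
actual V4L2VideoDevice code):

  class StreamingDevice /* placeholder */
  {
  public:
      int streamOn()
      {
          if (streaming_)
              return 0;
          /* ... VIDIOC_STREAMON ... */
          streaming_ = true;
          return 0;
      }

      int streamOff()
      {
          /* Harmless if streamOn() was never called. */
          if (!streaming_)
              return 0;
          /* ... VIDIOC_STREAMOFF ... */
          streaming_ = false;
          return 0;
      }

  private:
      bool streaming_ = false;
  };

Error paths can then stop every device unconditionally, regardless of
how far initialisation got.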
|
|
s/chance/change/
Signed-off-by: Sebastian Fricke <sebastian.fricke.linux@gmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The Android camera2 API defines a RAW capture capability ([1]) for
devices that support "outputting RAW buffers and metadata for
interpreting them". This capability requires the camera device to
support RAW_SENSOR ([2]) as an output format. Despite what its name may
sound like, the RAW_SENSOR format is defined as a 16-bit RAW format,
not an opaque implementation-dependent format (which is instead called
RAW_PRIVATE). Devices may additionally support the RAW10 and RAW12
formats, but that isn't enough to claim RAW capture capability.
To comply with the API requirements, only report the
ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW capability when 16-bit RAW is
supported.
[1] https://developer.android.com/reference/android/hardware/camera2/CameraMetadata#REQUEST_AVAILABLE_CAPABILITIES_RAW
[2] https://developer.android.com/reference/android/graphics/ImageFormat#RAW_SENSOR
Suggested-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Tested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
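A hedged sketch of the check, assuming libcamera's PixelFormatInfo
helper and a uint8_t capabilities vector being built by the HAL
(variable names are illustrative):

  bool hasRaw16 = false;
  for (const PixelFormat &format : supportedFormats) {
      const PixelFormatInfo &info = PixelFormatInfo::info(format);
      if (info.colourEncoding == PixelFormatInfo::ColourEncodingRAW &&
          info.bitsPerPixel == 16)
          hasRaw16 = true;
  }

  if (hasRaw16)
      capabilities.push_back(ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW);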
|
|
Simplify the src level meson file by moving the declaration of the v4l2
subdir to match the other invocations, making use of 'subdir_done()' to
break out if the adaptation layer is not enabled.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Move the android subdir below the configuration options to keep all
subdirs together.
Add a comment explaining why android must come first, and some padding
to group the libcamera and ipa components, applications, and remaining
adaptation layers.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Ensure that when we check for existence with File() it will only return
true if the path leads to a file, and not a directory.
Device nodes, which can still be opened by this class, are still
supported.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When accessing the value of a property by reading the properties
ControlList content with ControlList::get<>(), it is not necessary to
specify the template type, as it is already conveyed by the Control
instance provided as the first argument.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
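For example, with props assumed to be the camera's properties
ControlList (the property shown is illustrative):

  /*
   * Before, the template type was spelled out explicitly:
   *     int32_t location = props.get<int32_t>(properties::Location);
   * The type is now deduced from the Control<> argument:
   */
  int32_t location = props.get(properties::Location);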
|
|
Conditionally report the ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT
property by inspecting the draft property reported by the libcamera
Camera.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Inspect the list of media bus codes supported by the camera sensor
in order to deduce the color filter arrangement and register the
ColorFilterArrangement draft property.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The existing implementation of the BayerFormat class supports
converting a V4L2PixelFormat to a BayerFormat and vice-versa.
Expand the class by adding support for converting a media bus code
to a BayerFormat instance, by providing a conversion table and a
dedicated static method.
Do not provide support for converting a BayerFormat to a media bus code
as there's no 1-to-1 mapping between the two.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
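A sketch of the shape of the new conversion; the table entries, the
packing values and the method body below are illustrative rather than
the actual implementation:

  static const std::map<unsigned int, BayerFormat> mbusCodeToBayer = {
      { MEDIA_BUS_FMT_SBGGR10_1X10, { BayerFormat::BGGR, 10, BayerFormat::None } },
      { MEDIA_BUS_FMT_SGRBG12_1X12, { BayerFormat::GRBG, 12, BayerFormat::None } },
      /* ... one entry per supported media bus code ... */
  };

  const BayerFormat &BayerFormat::fromMbusCode(unsigned int mbusCode)
  {
      static const BayerFormat invalid;

      const auto it = mbusCodeToBayer.find(mbusCode);
      return it != mbusCodeToBayer.end() ? it->second : invalid;
  }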
|
|
Define the 'ColorFilterArrangement' draft property. The property is
currently identical to ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
There are no users left of this field, drop it.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
There is no need to pass the Camera pointer to queueRequest(),
completeBuffer() and completeRequest(), as the Request that is also
passed contains the same information. Remove the Camera argument to
avoid situations where the information in the Request and the argument
differ.
There is no functional change and no public API change as the interface
is only used between the Camera and PipelineHandler.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Remove the unused configs_ member variable in SimpleCameraData.
Signed-off-by: Phi-Bang Nguyen <pnguyen@baylibre.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
The Doxygen documentation for enumerators of the ConnectionType
enumeration prefixes the enumerator name with the enumeration name. For
unscoped enumerations, this is incorrect. Drop the scope. This fixes
warnings produced by Doxygen when multiple enumerators with identical
names are defined in different scopes.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
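For example, the Doxygen block for an unscoped enumerator drops the
scope (documentation text elided):

  /* Before (incorrect for an unscoped enumeration): */
  /**
   * \var ConnectionType::ConnectionTypeQueued
   * ...
   */

  /* After: */
  /**
   * \var ConnectionTypeQueued
   * ...
   */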
|
|
The digital gain reported by the AGC algorithm is now included in the
metadata of completed requests.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
This control reports the global digital gain applied by the pipeline
as a whole, after capturing a raw image from the sensor.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The initLinks() and createCamera() methods are declared but never
defined. Remove them.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Report the number of supported output streams through the
ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS static metadata.
The camera HAL currently supports:
- 1 optional RAW stream
- 2 YUV streams
- 1 JPEG stream
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
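A sketch of the resulting static metadata entry, assuming a
CameraMetadata-style addEntry() helper; the three counts are (RAW,
processed/YUV, stalling/JPEG):

  const int32_t maxNumOutputStreams[] = { 1, 2, 1 };
  staticMetadata_->addEntry(ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS,
                            maxNumOutputStreams, 3);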
|
|
Report the pipeline depth in the capture results if the pipeline
reports it.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
There is no need to validate all the ISP and sensor V4L2 controls on
every use. Simply validate them once during the IPA configuration, and
fail if a required control is not available.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
These specific controls are part of the sensor device, so rename them
appropriately.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Replace manual resource destruction with std::unique_ptr<> where
applicable. This removes the need for several destructors.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Replace manual construction of V4L2VideoDevice and V4L2Subdevice with
the fromEntityName() helper where possible. The returned pointer is
managed as a std::unique_ptr<>, which simplifies the VimcCameraData
destructor.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The CameraSensor instance stored in RPiCameraData::sensor_ is allocated
dynamically and never deleted. Fix the memory leak by storing it in a
std::unique_ptr<>.
Fixes: 740fd1b62f67 ("libcamera: pipeline: Raspberry Pi pipeline handler")
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
C++17 has a std::size() function that returns the size of a C-style
array. Use it instead of the custom ARRAY_SIZE macro.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <email@uajain.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
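For example:

  #include <cstddef>
  #include <iterator>

  static constexpr int weights[] = { 1, 2, 4, 2, 1 };

  /* Previously ARRAY_SIZE(weights), now: */
  static constexpr std::size_t numWeights = std::size(weights);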
|
|
The auto keyword facilitates writing code. It avoids typing out very
long types, which can make the code more readable, but it can also have
a negative impact on readability as it requires the reader (including
reviewers) to look up the type of the variable.
Replace one occurrence of auto with the explicit type where doing so
doesn't require a long type name.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The ChromeOS camera service, which is the current main user of the
Android Camera HAL, fails to start if the pixel array properties are
not registered.
As the sensor drivers for the Soraka test device have not yet been
updated to report their pixel array properties through the V4L2
selection API, temporarily fix the gap by re-establishing the default
property values removed by commit 1889cdc2e91c ("android: camera_device:
Initialize pixel array properties").
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Previously the CamHelper was returning the number of frames to drop
(on account of AGC/AWB converging). This wasn't really appropriate;
it's better for the algorithms to do it, as they know how many frames
they might need.
The CamHelper::HideFramesStartup method should now just be returning
the number of frames to hide because they're bad/invalid in some way,
not worrying about the AGC/AWB. For many sensors, the correct value
for this is zero. But the ov5647 needs updating as it must return 2.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When the AWB is started from "cold" with fixed colour gains, we try to
estimate the colour temperature this corresponds to (if a calibrated
CT curve was supplied). When fixed colour gains are set after the AWB
has been running, we leave the CT estimate alone, as the one we have
is probably sensible.
This estimated colour temperature is passed out in the metadata for
other algorithms - notably ALSC - to use.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a method to the piecewise linear function (Pwl) class to compute
the inverse of a given Pwl. If the input function is non-monotonic we
can only produce a best effort "pseudo" inverse, and we signal this to
the caller.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
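A simplified, self-contained sketch of the idea (not the Raspberry Pi
Pwl implementation): swap the (x, y) coordinates of each point, skip
points that would make the result non-monotonic, and report whether the
inverse is exact:

  #include <utility>
  #include <vector>

  using Point = std::pair<double, double>; /* (x, y) */

  /*
   * Invert a piecewise linear function given as points with increasing x.
   * *trueInverse is set to false when the input is not monotonically
   * increasing in y, in which case the result is only a best effort
   * "pseudo" inverse.
   */
  std::vector<Point> inversePwl(const std::vector<Point> &pwl, bool *trueInverse)
  {
      std::vector<Point> inverse;
      bool monotonic = true;

      for (const Point &p : pwl) {
          if (!inverse.empty() && p.second <= inverse.back().first)
              monotonic = false;
          else
              inverse.emplace_back(p.second, p.first);
      }

      if (trueInverse)
          *trueInverse = monotonic;

      return inverse;
  }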
|
|
We add a GetConvergenceFrames method to the AwbAlgorithm class which
can be called when the AWB is started from scratch. It suggests how
many frames should be dropped before displaying any (while the AWB
converges).
The Raspberry Pi specific implementation makes this customisable from
the tuning file.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We add a GetConvergenceFrames method to the AgcAlgorithm class which
can be called when the AGC is started from scratch. It suggests how
many frames should be dropped before displaying any (while the AGC
converges).
The Raspberry Pi specific implementation makes this customisable from
the tuning file.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The number of frames to drop (not display) is now passed back from the
start method, not configure. This means applications have a chance to
set fixed exposure/gain before starting the camera, and this can affect
the frame drop count that is returned.
Note that we need to be able to tell the very first time we start the
camera apart from subsequent restarts, hence the addition of the
"firstStart_" flag.
Both the IPA implementation file and the pipeline handler need
matching modifications.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The ScalerCrop control is handled by the pipeline handler, not the
IPA, so must be handled explicitly in the Camera::start method. The
ScalerCrop code used when processing requests has been factored out to
make it easy to reuse.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
This reorders Camera3Configs before executing
CameraConfiguration::validate() to make it easier for the Camera
to satisfy the Android framework request.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <email@uajain.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Add blank line and fix compilation on gcc 7.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Use the newly introduced Camera3StreamConfig to associate the Android
requested streams with their StreamConfiguration in a vector of
configurations.
This change prepares for sorting the vector of configurations before
using it to configure the Camera and populate the streams_ vector.
No functional changes intended.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <email@uajain.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Camera3StreamConfig is a new class that stores camera3_stream instances
and their types together with the associated StreamConfiguration.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <email@uajain.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The fromEntityName() function returns a pointer to a newly allocated
V4L2Device instance, which must be deleted by the caller. This opens the
door to memory leaks. Return a unique pointer instead, which conveys the
API semantics better than a sentence in the documentation.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
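Callers then hold the device in a std::unique_ptr<> and no longer need
a manual delete, e.g. (entity name illustrative, libcamera headers
assumed):

  std::unique_ptr<V4L2VideoDevice> video =
      V4L2VideoDevice::fromEntityName(media, "video-entity");
  if (!video)
      return -ENODEV;

  /* The device is deleted automatically when 'video' goes out of scope. */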
|
|
The fromEntityName() function returns a pointer to a newly allocated
V4L2Subdevice instance, which must be deleted by the caller. This opens
the door to memory leaks. Return a unique pointer instead, which conveys
the API semantics better than a sentence in the documentation.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Forward any controls passed into the pipeline handler to the IPA.
The IPA then sets up the Raspberry Pi controller with these settings
appropriately, and passes back any V4L2 sensor controls that need
to be applied.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
This change allows controls passed into PipelineHandler::start() to be
forwarded to IPAInterface::start(). We also add a return channel in case
the pipeline handler must act on any of these controls, e.g. setting the
analogue gain or shutter speed in the sensor device.
The IPA interface wrapper isn't addressed as it will soon be replaced by
a new mechanism to handle IPC.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|