|
The camera3_stream_buffer_t instances part of a capture request contain
information on the stream for which capture has been requested (size,
format and fences) and a handle to the stream's memory buffers.
This information is copied into the descriptor one piece at a time while
processing the camera3 streams, so it can be re-used at request completion
time.
Simplify the code by copying the stream information in the descriptor
at construction time.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The Camera3RequestDescriptor class can access the number of buffers
and the frame number from the camera3_capture_request_t instead of
having the caller pass them to the constructor.
This change makes it possible to access other fields of the capture
request, such as the capture settings.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
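
As an illustration of the two descriptor changes above, a minimal sketch
with simplified stand-in types (not the real Android camera3 definitions,
and not the HAL code itself):

  #include <stdint.h>
  #include <vector>

  /* Stand-ins for the Android camera3 types, reduced to what the sketch needs. */
  struct camera3_stream_buffer_t {
          void *stream;
          void *buffer;
          int acquire_fence;
          int release_fence;
  };

  struct camera3_capture_request_t {
          uint32_t frame_number;
          uint32_t num_output_buffers;
          const camera3_stream_buffer_t *output_buffers;
  };

  class Camera3RequestDescriptor
  {
  public:
          explicit Camera3RequestDescriptor(const camera3_capture_request_t *request)
                  : frameNumber_(request->frame_number),
                    /* Copy all stream buffer information up front; it is
                     * re-used as-is when the capture result is assembled at
                     * request completion time. */
                    buffers_(request->output_buffers,
                             request->output_buffers + request->num_output_buffers)
          {
          }

          uint32_t frameNumber_;
          std::vector<camera3_stream_buffer_t> buffers_;
  };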
|
|
Add a copy constructor and assignment operator to CameraMetadata, as well
as a constructor from camera_metadata_t. Also add a getEntry() function to
allow retrieving metadata entries from CameraMetadata. This allows us to
use CameraMetadata for reading from camera_metadata_t.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
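
An interface sketch of the additions; the exact signatures in the tree may
differ, and the comments are illustrative:

  #include <stdint.h>

  #include <system/camera_metadata.h>

  class CameraMetadata
  {
  public:
          /* New: import an existing Android metadata buffer. */
          CameraMetadata(const camera_metadata_t *metadata);

          /* New: deep-copy construction and assignment. */
          CameraMetadata(const CameraMetadata &other);
          CameraMetadata &operator=(const CameraMetadata &other);

          /* New: read back an entry by tag, for use when CameraMetadata
           * wraps a buffer that is being read rather than built. */
          bool getEntry(uint32_t tag, camera_metadata_ro_entry_t *entry) const;

          /* Existing constructors and the addEntry()/updateEntry() helpers
           * are left out of this sketch. */
  };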
|
|
Set the maximum shutter speed for the normal exposure profile to 66ms,
allowing viewfinder framerates to go down to approx. 15fps.
Set the maximum shutter speed for the sport exposure profile to 33ms,
limiting the minimum framerate to approx. 30fps.
Add a long exposure profile to allow shutter speeds of up to 120ms.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add support for setting V4L2_CID_VBLANK appropriately when setting
V4L2_CID_EXPOSURE. This will allow adaptive framerates during
viewfinder use cases (e.g. when the exposure time goes above 33ms, we
can reduce the framerate to lower than 30fps).
The minimum and maximum frame durations are provided via libcamera
controls, and will prioritise exposure time limits over any AGC request.
V4L2_CID_VBLANK is controlled through the staggered writer, just like
the exposure and gain controls.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
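
The underlying arithmetic, as a self-contained sketch rather than the
pipeline handler code: the frame duration is
(height + vblank) * line length / pixel rate, so the blanking needed for a
target duration is:

  #include <stdint.h>

  int64_t vblankForDuration(double frameDurationUs, uint32_t height,
                            uint32_t lineLengthPixels, uint64_t pixelRateHz)
  {
          /* Total frame length, in sensor lines, for the requested duration. */
          double frameLines = frameDurationUs * 1e-6 * pixelRateHz /
                              lineLengthPixels;
          int64_t vblank = static_cast<int64_t>(frameLines) - height;

          /* The real code additionally clamps the result to the range
           * advertised by the sensor's V4L2_CID_VBLANK control. */
          return vblank > 0 ? vblank : 0;
  }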
|
|
Add an int64_t array control (controls::FrameDurations) to specify the
minimum and maximum (in that order) frame duration to be used by the
camera sensor.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
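
A usage sketch, assuming the control name as introduced here and the
generic ControlList::set() call with an explicit Span for array controls;
the helper name is illustrative:

  #include <stdint.h>

  #include <libcamera/control_ids.h>
  #include <libcamera/controls.h>
  #include <libcamera/request.h>
  #include <libcamera/span.h>

  void requestFrameDurations(libcamera::Request *request)
  {
          /* Allow the frame duration to vary between ~30 fps and ~15 fps
           * (values in microseconds, minimum first). */
          const int64_t durations[] = { 33333, 66666 };

          request->controls().set(libcamera::controls::FrameDurations,
                                  libcamera::Span<const int64_t>(durations));
  }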
|
|
Now that the pixel array properties have been defaulted in the
CameraSensor class (or in the pipeline handler, for the UVC use case),
they will always be reported by the libcamera::Camera and there's no
need to default them in the Camera HAL.
Remove defaults and assume properties are always there.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The VIMC driver does not yet support all the features required
from sensor drivers. As it is the main testing platform and the
driver changes might take a long time to land in the development
and testing platforms, temporarily close the gap by skipping driver
validation and initializing properties with static information such
as the sensor resolution.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Add a const version of the MediaObject::dev() method to be able to
retrieve a pointer to a const MediaDevice from a constant instance of
a MediaObject sub-class.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
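
The shape of the change, sketched rather than quoted from the header:

  class MediaDevice;

  class MediaObject
  {
  public:
          MediaDevice *dev() { return dev_; }
          /* New const overload, usable from const instances of the
           * MediaObject sub-classes. */
          const MediaDevice *dev() const { return dev_; }

  protected:
          MediaDevice *dev_ = nullptr;
  };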
|
|
Initialize the pixel array properties in the UVC pipeline handler:
they are now initialized in the CameraSensor class, but the UVC
pipeline handler does not use that class and has to provide them itself.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The 'rotation' property is not critical. Only register it if the
sensor driver reports it.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
If the sensor driver does not report the camera location, default it
to 'External' instead of 'Front'.
As the camera location is used to construct the camera unique name
presented to the user, it makes more sense to report multiple 'External'
cameras instead of multiple 'Front' ones.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
As support for the V4L2_SEL_TGT_CROP selection target, used to read the
sensor analogue crop rectangle, is scheduled to become mandatory but is
still optional, use the sensor's active area size as a fallback value to
allow the creation of the CameraSensorInfo in case the driver does
not support it.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Support for the V4L2 selection API is currently optional in the
CameraSensor class. Properties registered by using values read through
that API are defaulted in several different places (the Android camera
HAL or the CameraSensor class).
In the future, support for the selection API will be made mandatory, but
to give the sensor drivers of all test platforms time to be updated, use
the sensor resolution as a fallback value for the sensor pixel array
properties and cache them as class member variables.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The CameraSensor class requires the sensor driver to report
information through V4L2 controls and through the V4L2 selection API,
and uses that information to register Camera properties and to
construct CameraSensorInfo class instances to provide them to the IPA.
Currently, validation of the kernel support happens each time a
feature is requested, with similar debug/error messages output to the
user in case a feature is not supported.
Rationalize this by validating the sensor driver requirements in a
single function.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The scope of File::exists() was changed to only validate that a file
exists, and it is therefore no longer usable to check whether a directory
exists. This breaks the persistent name generation for DT based systems,
as it uses File::exists() to check for directories. Fix this by using
stat() directly.
On Scarlet the persistent names of the cameras are impacted by this and
were incorrectly reported as:
$ cam -l
Available cameras:
1: Internal front camera (platform/ff160000.i2c/i2c-7/7-003c ov2685)
2: Internal front camera (platform/ff160000.i2c/i2c-7/7-0036 ov5695)
While the expected ones are restored with this fix:
$ cam -l
Available cameras:
1: Internal front camera (/base/i2c@ff160000/camera@3c)
2: Internal front camera (/base/i2c@ff160000/camera@36)
Fixes: 8f4e16f014c820a0 ("test: file: Check that directories are not treated as files")
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
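
A self-contained sketch of the kind of check this implies; the helper
name is illustrative:

  #include <sys/stat.h>

  static bool isDirectory(const char *path)
  {
          struct stat st;

          if (stat(path, &st) != 0)
                  return false;

          return S_ISDIR(st.st_mode);
  }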
|
|
On the bail out path, always ensure that ImgU and CIO2 are stopped
before freeing the buffers. The V4L2VideoDevice class guarantees that
calling stop() without having called start() is harmless, hence use
this guarantee to simplify the error paths.
Signed-off-by: Umang Jain <email@uajain.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Track the state of streamon/streamoff calls to simplify error paths.
Ensuring that streamOff() can be called on non-streaming streams
facilitates simpler error code paths, where a set of devices can all
call streamOff regardless of their initialisation state.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Umang Jain <email@uajain.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
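
A simplified sketch of the idea, not the V4L2VideoDevice implementation:
remember whether streaming was started so that stopping an idle device is
a harmless no-op:

  class Streamer
  {
  public:
          int streamOn()
          {
                  int ret = doStreamOn();         /* VIDIOC_STREAMON in the real code */
                  if (ret == 0)
                          streaming_ = true;
                  return ret;
          }

          int streamOff()
          {
                  /* Calling streamOff() on a device that never started
                   * streaming is a harmless no-op. */
                  if (!streaming_)
                          return 0;

                  int ret = doStreamOff();        /* VIDIOC_STREAMOFF in the real code */
                  if (ret == 0)
                          streaming_ = false;
                  return ret;
          }

  private:
          int doStreamOn() { return 0; }          /* placeholder */
          int doStreamOff() { return 0; }         /* placeholder */
          bool streaming_ = false;
  };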
|
|
s/chance/change/
Signed-off-by: Sebastian Fricke <sebastian.fricke.linux@gmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The Android camera2 API defines a RAW capture capability ([1]) for
devices that support "outputting RAW buffers and metadata for
interpreting them". This capability requires the camera device to
support RAW_SENSOR ([2]) as an output format. Despite what its name may
suggest, the RAW_SENSOR format is defined as a 16-bit RAW format,
not an opaque implementation-dependent format (which is instead called
RAW_PRIVATE). Devices may additionally support the RAW10 and RAW12
formats, but that isn't enough to claim RAW capture capability.
To comply with the API requirements, only report the
ANDROID_REQUEST_AVAILABLE_CAPABILITIES_RAW capability when 16-bit RAW is
supported.
[1] https://developer.android.com/reference/android/hardware/camera2/CameraMetadata#REQUEST_AVAILABLE_CAPABILITIES_RAW
[2] https://developer.android.com/reference/android/graphics/ImageFormat#RAW_SENSOR
Suggested-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Tested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Simplify the src level meson file by moving the declaration of the v4l2
subdir to match the other invocations, making use of 'subdir_done()' to
break out if the adaptation layer is not enabled.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Move the android subdir below the configuration options to keep all
subdirs together.
Add a comment explaining why android must come first, and some padding
to group the libcamera and ipa components, applications, and remaining
adaptation layers.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Ensure that when we check for existence with File() it only returns true
if the path leads to a regular file, and not a directory.
Device nodes, which can still be opened by this class, are still
supported.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When accessing the value of a property by reading the properties
ControlList content with ControlList::get<>(), it is not necessary to
specify the template type, as it is already conveyed by the Control
instance provided as the first argument.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
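
A usage sketch, with an illustrative helper name and the Location property
standing in for any property:

  #include <stdint.h>

  #include <libcamera/controls.h>
  #include <libcamera/property_ids.h>

  void readLocation(const libcamera::ControlList &properties)
  {
          /* Redundant explicit template argument: */
          auto before = properties.get<int32_t>(libcamera::properties::Location);

          /* The type is deduced from the Control<int32_t> instance: */
          auto after = properties.get(libcamera::properties::Location);

          (void)before;
          (void)after;
  }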
|
|
Conditionally report the ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT
property by inspecting the draft property reported by the libcamera Camera.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Inspect the list of media bus codes supported by the camera sensor
in order to deduce the color filter arrangement and register the
ColorFilterArrangement draft property.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The existing implementation of the BayerFormat class supports
converting a V4L2PixelFormat to a BayerFormat and vice-versa.
Expand the class by adding support for converting a media bus code
to a BayerFormat instance, by providing a conversion table and a
dedicated static method.
Do not provide support for converting a BayerFormat to a media bus code
as there's no 1-to-1 mapping between the two.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
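
A reduced sketch of the mechanism, with a stand-in BayerFormat definition
and only two illustrative table entries:

  #include <stdint.h>

  #include <map>

  #include <linux/media-bus-format.h>

  struct BayerFormat {
          enum Order { BGGR, GBRG, GRBG, RGGB } order;
          uint8_t bitDepth;

          static BayerFormat fromMbusCode(unsigned int mbusCode);
          bool isValid() const { return bitDepth != 0; }
  };

  namespace {

  /* The real table covers all the raw Bayer media bus codes. */
  const std::map<unsigned int, BayerFormat> mbusCodeToBayer = {
          { MEDIA_BUS_FMT_SBGGR8_1X8,   { BayerFormat::BGGR, 8 } },
          { MEDIA_BUS_FMT_SRGGB10_1X10, { BayerFormat::RGGB, 10 } },
  };

  } /* namespace */

  BayerFormat BayerFormat::fromMbusCode(unsigned int mbusCode)
  {
          const auto it = mbusCodeToBayer.find(mbusCode);
          if (it == mbusCodeToBayer.end())
                  return { BGGR, 0 };     /* invalid: unknown code */

          return it->second;
  }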
|
|
Define the 'ColorFilterArrangement' draft property. The property is
currently identical to ANDROID_SENSOR_INFO_COLOR_FILTER_ARRANGEMENT.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
There are no users left of this field, drop it.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
There is no need to pass the Camera pointer to queueRequest(),
completeBuffer() and completeRequest(), as the Request that is also passed
contains the same information. Remove the Camera argument to avoid
situations where the information in the Request and the argument differ.
There is no functional change and no public API change as the interface
is only used between the Camera and PipelineHandler.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Remove the unused configs_ member variable in SimpleCameraData.
Signed-off-by: Phi-Bang Nguyen <pnguyen@baylibre.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
The Doxygen documentation for enumerators of the ConnectionType
enumeration prefixes the enumerator name with the enumeration name. For
unscoped enumerations, this is incorrect. Drop the scope. This fixes
warnings produced by Doxygen when multiple enumerators with identical
names are defined in different scopes.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
The digital gain computed by the AGC algorithm is now reported in the
metadata that is included in completed requests.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
This control reports the global digital gain applied by the pipeline
as a whole, after capturing a raw image from the sensor.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The initLinks() and createCamera() methods are declared but never
defined. Remove them.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Report the number of supported output streams through the
ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS static metadata.
The camera HAL currently supports:
- 1 optional RAW stream
- 2 YUV streams
- 1 JPEG stream
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
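
The resulting static metadata value is the Android-defined triple
(RAW, processed non-stalling, processed stalling); a sketch of the data:

  #include <stdint.h>

  /* ANDROID_REQUEST_MAX_NUM_OUTPUT_STREAMS: (RAW, processed, stalling). */
  static const int32_t maxNumOutputStreams[] = {
          1,      /* RAW */
          2,      /* processed (YUV), non-stalling */
          1,      /* JPEG, stalling */
  };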
|
|
Report the pipeline depth in the capture results if the pipeline
reports it.
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
There is no need to validate all the ISP and sensor V4L2 controls on
every use. Simply validate them once during the IPA configuration, and
fail if a required control is not available.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
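
A sketch of such a one-time check; the set of required controls shown here
is an illustrative subset, not the IPA's actual list:

  #include <stdint.h>

  #include <linux/v4l2-controls.h>

  #include <libcamera/controls.h>

  bool validateSensorControls(const libcamera::ControlInfoMap &sensorCtrls)
  {
          static const uint32_t required[] = {
                  V4L2_CID_ANALOGUE_GAIN,
                  V4L2_CID_EXPOSURE,
                  V4L2_CID_VBLANK,
          };

          /* Fail configuration once, instead of re-checking on every frame. */
          for (uint32_t id : required) {
                  if (sensorCtrls.find(id) == sensorCtrls.end())
                          return false;
          }

          return true;
  }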
|
|
These specific controls are part of the sensor device, so rename them
appropriately.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Replace manual resource destruction with std::unique_ptr<> where
applicable. This removes the need for several destructors.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Replace manual construction of V4L2VideoDevice and V4L2Subdevice with
the fromEntityName() helper where possible. The returned pointer is
managed as a std::unique_ptr<>, which simplifies the VimcCameraData
destructor.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The CameraSensor instance stored in RPiCameraData::sensor_ is allocated
dynamically and never deleted. Fix the memory leak by storing it in a
std::unique_ptr<>.
Fixes: 740fd1b62f67 ("libcamera: pipeline: Raspberry Pi pipeline handler")
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
C++17 has a std::size() function that returns the size of a C-style
array. Use it instead of the custom ARRAY_SIZE macro.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <email@uajain.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
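
A minimal example of the replacement:

  #include <cstddef>
  #include <iterator>

  static const int framerates[] = { 15, 30, 60 };

  /* C++17: std::size() replaces the ARRAY_SIZE() macro. */
  static constexpr std::size_t numFramerates = std::size(framerates);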
|
|
The auto keyword facilitates writing code. It avoids typing out very
long types, which can make the code more readable, but it can also have
a negative impact on readability as it requires the reader (including
reviewers) to look up the type of the variable.
Replace one occurrence of auto with the explicit type where doing so
doesn't require a long type name.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The ChromeOS camera service, which is the current main user of the
Android Camera HAL, fails to start if the pixel array properties are
not registered.
As the sensor driver for the Soraka test device has not yet been
updated to report its pixel array properties through the V4L2
selection API, temporarily close the gap by re-establishing the default
property values removed by commit 1889cdc2e91c ("android: camera_device:
Initialize pixel array properties").
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Previously the CamHelper was returning the number of frames to drop
(on account of AGC/AWB converging). This wasn't really appropriate;
it's better for the algorithms to do it, as they know how many frames
they might need.
The CamHelper::HideFramesStartup method should now just be returning
the number of frames to hide because they're bad/invalid in some way,
not worrying about the AGC/AWB. For many sensors, the correct value
for this is zero. But the ov5647 needs updating as it must return 2.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
colour gains
When the AWB is started from "cold" with fixed colour gains, we try to
estimate the colour temperature this corresponds to (if a calibrated
CT curve was supplied). When fixed colour gains are set after the AWB
has been running, we leave the CT estimate alone, as the one we have
is probably sensible.
This estimated colour temperature is passed out in the metadata for
other algorithms - notably ALSC - to use.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a method to the piecewise linear function (Pwl) class to compute
the inverse of a given Pwl. If the input function is non-monotonic we
can only produce a best effort "pseudo" inverse, and we signal this to
the caller.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
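
A self-contained sketch of the idea, not the Raspberry Pi Pwl class
itself: store the function as (x, y) breakpoints sorted by x, swap the
coordinates to invert, and flag the result as only a best-effort pseudo-
inverse when the input is not monotonic:

  #include <utility>
  #include <vector>

  using Point = std::pair<double, double>;  /* (x, y) */
  using Pwl = std::vector<Point>;           /* breakpoints, sorted by x */

  Pwl inverse(const Pwl &pwl, bool *trueInverse)
  {
          Pwl inv;
          bool monotonic = true;

          for (const Point &p : pwl) {
                  /* Swapping (x, y) only yields a function if y increases;
                   * drop offending points to get a "pseudo" inverse. */
                  if (!inv.empty() && p.second <= inv.back().first) {
                          monotonic = false;
                          continue;
                  }

                  inv.emplace_back(p.second, p.first);
          }

          if (trueInverse)
                  *trueInverse = monotonic;

          return inv;
  }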
|
|
We add a GetConvergenceFrames method to the AwbAlgorithm class which
can be called when the AWB is started from scratch. It suggests how
many frames should be dropped before displaying any (while the AWB
converges).
The Raspberry Pi specific implementation makes this customisable from
the tuning file.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We add a GetConvergenceFrames method to the AgcAlgorithm class which
can be called when the AGC is started from scratch. It suggests how
many frames should be dropped before displaying any (while the AGC
converges).
The Raspberry Pi specific implementation makes this customisable from
the tuning file.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
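
Both commits above add the same hook to their respective algorithm
interfaces; a sketch of its shape, with assumed names and a trivial
implementation rather than the actual IPA headers:

  class AwbAlgorithm
  {
  public:
          virtual ~AwbAlgorithm() = default;

          /* Number of frames to drop after a cold start, while the
           * algorithm converges. */
          virtual unsigned int GetConvergenceFrames() const = 0;
  };

  class ExampleAwb : public AwbAlgorithm
  {
  public:
          /* The count would typically come from the tuning file. */
          explicit ExampleAwb(unsigned int convergenceFrames)
                  : convergenceFrames_(convergenceFrames)
          {
          }

          unsigned int GetConvergenceFrames() const override
          {
                  return convergenceFrames_;
          }

  private:
          unsigned int convergenceFrames_;
  };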
|