|
Some sensors support producing and transmitting embedded data over a
stream separate from the image stream. Add support for this feature in
the CameraSensor interface, and implement it for the CameraSensorRaw
class. The CameraSensorLegacy uses the default stub implementation, as
the corresponding kernel drivers don't support embedded data.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a new CameraSensorRaw implementation of the CameraSensor interface
tailored to devices that implement the new V4L2 raw camera sensors API.
This new class duplicates code from the CameraSensorLegacy class. The
two classes will be refactored to share code.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
With support for metadata in the streams API, the v4l2_meta_format
structure has been extended with width, height and bytesperline fields.
Support them in the V4L2VideoDevice getFormat() and setFormat()
functions if the video device is a metadata capture device and if the
pixel format is one of the generic line-based metadata formats.
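From user space the line-based formats are negotiated like any other
V4L2 format; a rough sketch follows (assuming kernel UAPI headers that
already expose the new v4l2_meta_format fields named above, which
mainline headers may not yet carry):

  #include <cstdint>
  #include <sys/ioctl.h>
  #include <linux/videodev2.h>

  /* Sketch only: requires headers with the extended v4l2_meta_format. */
  int setLineBasedMetaFormat(int fd, uint32_t fourcc,
                             uint32_t width, uint32_t height)
  {
      struct v4l2_format fmt = {};

      fmt.type = V4L2_BUF_TYPE_META_CAPTURE;
      fmt.fmt.meta.dataformat = fourcc;
      fmt.fmt.meta.width = width;    /* new field */
      fmt.fmt.meta.height = height;  /* new field */
      /* bytesperline and buffersize are filled in by the driver. */

      return ioctl(fd, VIDIOC_S_FMT, &fmt);
  }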
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Support the newly introduced V4L2 media bus formats for metadata. This
includes generic metadata formats, and two sensor-specific embedded data
formats.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a CamHelper::setHwConfig() helper used by the IPA to set the
hardware configuration in use by the pipeline. This will be needed by
the IMX500 camera helper in a future commit to determine if the
metadata buffer is strided.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This property (dataBufferStrided) indicates if the CSI-2 hardware writes
to the embedded/metadata buffer directly, or if it treats the buffer
like an image buffer and strides the metadata lines.
Unicam writes this buffer strided, while the PiSP Frontend writes to it
directly. This information will be relevant to data parsers in the
helpers where the data is structured in lines.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
These functions erase a key/value pair from the metadata object.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use an r-value reference in set() and setLocked(), allowing more
efficient metadata handling with std::forward and std::move if needed.
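For illustration, a minimal map-backed sketch (not the actual Raspberry
Pi Metadata class) of what the forwarding overload enables:

  #include <any>
  #include <map>
  #include <mutex>
  #include <string>
  #include <utility>

  /* Hypothetical map-backed metadata store, for illustration only. */
  class Metadata
  {
  public:
      template<typename T>
      void set(const std::string &tag, T &&value)
      {
          std::scoped_lock lock(mutex_);
          setLocked(tag, std::forward<T>(value));
      }

      template<typename T>
      void setLocked(const std::string &tag, T &&value)
      {
          /* std::forward preserves the value category: l-values are
           * copied, r-values are moved into the map. */
          data_[tag] = std::forward<T>(value);
      }

  private:
      std::mutex mutex_;
      std::map<std::string, std::any> data_;
  };

  /* Usage: meta.set("agc.status", std::move(largeValue)) moves instead
   * of copying. */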
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Basic tuning done by David Plowman using a Waveshare SKU 28524
"IMX415-98 IR-CUT Camera" module.
Signed-off-by: Dave Stevenson <dave.stevenson@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
As another Starvis sensor, it is nearly identical to the imx290/327.
Signed-off-by: Dave Stevenson <dave.stevenson@raspberrypi.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Believed correct based on imx290.
Signed-off-by: Dave Stevenson <dave.stevenson@raspberrypi.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
As DmaSyncer does sync start/end in the c'tor/d'tor, copying a DmaSyncer
instance would trigger sync end earlier than expected. This patch makes
it non-copyable to avoid the issue.
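A minimal sketch of the resulting class shape (illustrative only, not
the actual DmaBufAllocator header):

  class DmaSyncer
  {
  public:
      explicit DmaSyncer(int dmabufFd) : fd_(dmabufFd) { /* sync start */ }
      ~DmaSyncer() { /* sync end */ }

      /* A copy would end the sync in its destructor while the original
       * is still in use, so copying is disallowed. */
      DmaSyncer(const DmaSyncer &) = delete;
      DmaSyncer &operator=(const DmaSyncer &) = delete;

  private:
      int fd_;
  };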
Fixes: 39482d59fe71 ("DmaBufAllocator: Add Dma Buffer synchronization function & helper class")
Signed-off-by: Harvey Yang <chenghaoyang@chromium.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
After the initial generation, when a frame is requested,
the test pattern generator rotates the image left by 1 column.
The current approach has two shortcomings:
(1) it allocates a temporary buffer to hold one column;
(2) it swaps two columns at a time.
The test patterns are simple ARGB images, in row-major order,
so doing (2) works against memory prefetching. This can be
addressed by doing the rotation one row at a time as that way
the image is addressed in a purely linear fashion. Doing so also
eliminates the need for a dynamically allocated temporary buffer,
as the required buffer now only needs to hold one sample,
which is 4 bytes in this case.
In an optimized build, this results in about a 2x increase in the
number of frames per second as reported by `cam`. In an unoptimized,
ASAN and UBSAN instrumented build, the difference is even bigger,
which is useful for running lc-compliance in CI in a reasonable time.
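For illustration, the row-wise rotation can be written as follows
(sketch, not the actual virtual pipeline code):

  #include <algorithm>
  #include <cstddef>
  #include <cstdint>
  #include <vector>

  /* Rotate an ARGB (4 bytes/sample) image left by one column, one row
   * at a time, so memory is traversed linearly. */
  void rotateLeftByOneColumn(std::vector<uint8_t> &image,
                             unsigned int width, unsigned int height)
  {
      constexpr unsigned int bpp = 4; /* ARGB */
      const size_t stride = static_cast<size_t>(width) * bpp;

      for (unsigned int y = 0; y < height; y++) {
          uint8_t *row = image.data() + y * stride;
          /* The first sample of the row ends up at the end; everything
           * else shifts left by one sample. */
          std::rotate(row, row + bpp, row + stride);
      }
  }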
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
`PixelFormatInfo::planes.size()` always returns 3 since `planes` is
an array, but that is not the number of planes of the pixel format.
Use the `numPlanes()` getter instead.
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
There is no reason to make copies; these functions return
const lvalue references, so access the data through those.
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a lux algorithm module to rkisp1 IPA for estimating the lux level of
an image. This is reported in metadata, as well as saved in the frame
context so that other algorithms (mainly AGC) can use its value. It does
not set any controls.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Isaac Scott <isaac.scott@ideasonboard.com>
|
|
Add a Lux helper to libipa that does the estimation of the lux level
given gain, exposure, and luminance histogram. The helper also
handles reading the reference values from the tuning file. These are
expected to be common operations of lux algorithm modules in IPAs; the
helper is modeled on, and partly copied from, the Raspberry Pi
implementation.
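Very roughly, the estimation step scales a reference lux value by the
ratios of exposure, gain and measured luminance (hedged sketch; names
and the tuning-field layout are illustrative, not the libipa API):

  #include <cstddef>
  #include <cstdint>
  #include <vector>

  struct LuxReference {
      double exposureTime;  /* reference exposure time from the tuning file */
      double gain;          /* reference analogue gain */
      double y;             /* reference mean luminance */
      double lux;           /* lux level of the reference scene */
  };

  /* Sketch of the estimation step only; the real helper also reads the
   * reference values from the tuning file. */
  double estimateLux(const LuxReference &ref, double exposureTime,
                     double gain, const std::vector<uint32_t> &yHistogram)
  {
      /* Mean luminance of the current frame from the histogram. */
      uint64_t sum = 0;
      uint64_t count = 0;
      for (size_t bin = 0; bin < yHistogram.size(); bin++) {
          sum += static_cast<uint64_t>(yHistogram[bin]) * bin;
          count += yHistogram[bin];
      }
      double currentY = count ? static_cast<double>(sum) / count : 0.0;

      /* Scale the reference lux by how much brighter or darker the
       * current frame is, normalised for exposure time and gain. */
      return ref.lux * (ref.exposureTime / exposureTime) *
             (ref.gain / gain) * (currentY / ref.y);
  }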
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Isaac Scott <isaac.scott@ideasonboard.com>
|
|
ColourTemperature is now exported as a writable control so that
applications can set it directly. The AWB algorithm class now requires
a method to be provided to perform this operation. The method should
clamp the passed value to the calibrated range known to the algorithm.
The default range is set very wide to cover all conceivable future AWB
calibrations. It will always be clamped before use.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Tested-by: Stefan Klug <stefan.klug@ideasonboard.com>
|
|
There are many use-cases (tuning-validation, working in static
environments) where a manual ColourTemperature control is helpful.
Implement that by interpolating and applying the white balance gains
from the tuning file according to the requested colour temperature. If
colour gains are provided on the same request, they take precedence.
Store the colour temperature used for a given frame in the frame context
and report that in metadata.
Note that in the automatic case, the colour gains are still based on the
gray world model, and the CT curve from the tuning file gets ignored.
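Conceptually, the manual path reduces to interpolating the calibrated
gains around the requested temperature (self-contained sketch; the
rkisp1 IPA uses its own interpolator and frame-context types):

  #include <iterator>
  #include <map>

  struct AwbGains {
      double red;
      double blue;
  };

  /* Calibrated gains keyed by colour temperature [K], as loaded from
   * the tuning file. Assumes at least one entry. */
  using GainMap = std::map<unsigned int, AwbGains>;

  /* Linearly interpolate between the two calibrations that bracket the
   * requested temperature, clamping at the ends of the calibrated range. */
  AwbGains gainsForTemperature(const GainMap &gains, unsigned int ct)
  {
      auto upper = gains.lower_bound(ct);
      if (upper == gains.begin())
          return upper->second;
      if (upper == gains.end())
          return std::prev(upper)->second;

      auto lower = std::prev(upper);
      double t = static_cast<double>(ct - lower->first) /
                 (upper->first - lower->first);

      return { lower->second.red + t * (upper->second.red - lower->second.red),
               lower->second.blue + t * (upper->second.blue - lower->second.blue) };
  }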
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
For the implementation of a manual colour temperature setting, it is
necessary to read predefined colour gains per colour temperature from
the tuning file. Implement this in a backwards compatible way. If no
gains are contained in the tuning file, loading just continues as
before.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
For manual control it is helpful to be able to specify a fixed colour
temperature. It also provides an easy way to apply the temperature
specific CCMs and colour gains that are contained in the tuning files.
Document this and update the control dependencies.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that there is support for retrieving the allowed directions of a
control, print this information when listing controls.
Sample output:
$ cam --list-controls -c 2
Using camera Virtual0 as cam0
Control: [inout] draft::FaceDetectMode:
- FaceDetectModeOff (0)
Control: [inout] libcamera::FrameDurationLimits: [16666..33333]
Size: 2
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add support to ControlId for querying direction information. This allows
applications to query whether a ControlId is meant to be set in
controls, to be returned in metadata, or both. This also has the side
effect of properly encoding this information, as previously it was only
mentioned loosely and inconsistently in the control id definitions.
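Conceptually each control now carries a pair of direction flags that
applications can test; a self-contained sketch of how such flags are
typically consumed (the flag names here are illustrative, not the
ControlId API):

  #include <iostream>
  #include <string>

  /* Hypothetical flag values for illustration; the real enumerators
   * live in the ControlId API. */
  enum DirectionFlag { DirectionIn = 1 << 0, DirectionOut = 1 << 1 };

  void printDirection(const std::string &name, unsigned int flags)
  {
      bool in = flags & DirectionIn;    /* may be set in a Request */
      bool out = flags & DirectionOut;  /* may be returned in metadata */

      std::cout << name << ": "
                << (in && out ? "inout" : in ? "in" : "out")
                << std::endl;
  }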
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
In preparation for adding support for querying direction information
from controls, parse the direction information from control ID
definitions. This can later be plugged in directly to the IPA code
generators simply by using ctrl.direction.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Stefan Klug <stefan.klug@ideasonboard.com>
|
|
In preparation for adding support for querying direction information
from controls, populate the corresponding field in the control ID
definitions.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Stefan Klug <stefan.klug@ideasonboard.com>
|
|
Add a tuning data file for the IMX415 camera sensor. The black level
offset data is drawn from the camera sensor's datasheet. The lens
shading tables are generated through the libtuning LSC modules.
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
Add a lens shading correction algorithm to the mali-c55 IPA. This
algorithm parses tables from YAML in an easy-to-follow format before
munging them into Arm's interleaved mesh to be copied to the ISP.
A colour temperature estimate from the AGC statistics is used to
select the appropriate table to apply; this can be some interpolation
of two tables, in which case the colour temperature estimate is also
used to derive the coefficient that does the blending.
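That blending step amounts to a per-cell linear mix of the two
bracketing tables (rough sketch; the YAML parsing and Arm mesh
interleaving are not shown):

  #include <cstddef>
  #include <vector>

  /* Blend two lens shading tables cell by cell. `t` is the coefficient
   * derived from the colour temperature estimate: 0.0 selects tableA,
   * 1.0 selects tableB. Both tables must have the same size. */
  std::vector<double> blendTables(const std::vector<double> &tableA,
                                  const std::vector<double> &tableB,
                                  double t)
  {
      std::vector<double> out(tableA.size());

      for (size_t i = 0; i < out.size(); i++)
          out[i] = tableA[i] * (1.0 - t) + tableB[i] * t;

      return out;
  }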
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
Add a simple grey-world auto white balance algorithm to the mali-c55
IPA.
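The grey-world idea in a nutshell (illustrative sketch, not the
mali-c55 statistics handling): assume the scene averages to grey and
derive gains that equalise the channel means against green:

  struct AwbGains {
      double red;
      double blue;
  };

  /* Compute grey-world gains from the mean of each colour channel.
   * Green is the reference channel, so its gain stays 1.0. */
  AwbGains greyWorldGains(double meanR, double meanG, double meanB)
  {
      AwbGains gains;
      gains.red = meanR > 0.0 ? meanG / meanR : 1.0;
      gains.blue = meanB > 0.0 ? meanG / meanB : 1.0;
      return gains;
  }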
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
Add a Black Level Correction algorithm.
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Daniel Scally <dan.scally@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
Add a new algorithm and associated infrastructure for Agc. The
tuning files for uncalibrated sensors are extended to enable the
algorithm.
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Plumb the Pipeline-IPA loop in.
Load the IPA module at camera creation time and create the loop between
the pipeline and the IPA.
When a new Request is queued, the IPA is asked to prepare the parameters
buffer. Once the buffer is ready, the IPA notifies the pipeline, which
queues the parameters to the ISP along with buffers for statistics and
frames. Once statistics are ready, they are passed to the IPA, which
updates its settings for the next frame.
Drive-by: fix an error message in the Pipeline Handler's ::freeBuffers()
function, which reported a problem with the wrong video device in an
error path.
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a barebones IPA module for the Mali-C55 ISP. This initial
implementation covers little more than buffer plumbing.
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Acquire the mali-c55 3a stats and parameters video devices during
::match() and plumb them in.
For this commit we simply allocate and release buffers for the
statistics and parameters. Statistics buffers are queued and dequeued
from the stats video device, but their contents are left untouched for
now.
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
The rkisp1 IPA has some utility functions to convert between fixed
and floating point numbers. Move those to libipa so they're available
for use in other IPA modules too.
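The conversions boil down to a scale by 2^n with rounding (generic
sketch; the libipa helpers are templated on the integer and fractional
bit widths):

  #include <cmath>
  #include <cstdint>

  /* Convert a floating point number to a signed fixed point value with
   * `fracBits` fractional bits, and back. Range checks omitted. */
  int32_t floatToFixed(double value, unsigned int fracBits)
  {
      return static_cast<int32_t>(std::round(value * (1 << fracBits)));
  }

  double fixedToFloat(int32_t value, unsigned int fracBits)
  {
      return static_cast<double>(value) / (1 << fracBits);
  }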
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that we have camera sensor control application delay values in
the CameraSensorProperties class, remove the duplicated definitions
in the RPi IPA's CameraSensorHelpers and update the pipeline handler
to use the values from CameraSensorProperties.
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add entries for the OmniVision OV7251 and OV9281 sensors. The control
delay values are drawn from the Raspberry Pi CameraSensorHelper class
for these sensors - the pixel sizes from the datasheets. Both sensors
are monochrome and do not have test pattern settings that correspond
to defined control values.
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In RkISP1Path::validate() the sensor sizes are correctly filtered to the
ones supported by the ISP. But later, in PipelineHandlerRkISP1::configure(),
the configured stream size is passed to sensor->getFormat() and the
CameraSensor class chooses the best sensor format for the requested
stream size. This can result in a sensor format that is too big for the
ISP. Fix that by supplying the maximum resolution supported by the ISP
to getFormat().
Fixes: 761545407c76 ("pipeline: rkisp1: Filter out sensor sizes not supported by the pipeline")
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The getFormat function takes the aspect ratio and the area of the
requested size into account when choosing the best sensor size. In case
the sensor is connected to an rkisp1 the maximum supported frame size of
the ISP is another constraining factor for the selection of the best
format. Add a maxSize parameter to support such a constraint.
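The effect of the constraint can be pictured with a simplified,
self-contained sketch (local Size type, aspect-ratio scoring omitted;
not the CameraSensor implementation):

  #include <vector>

  struct Size {
      unsigned int width;
      unsigned int height;
  };

  /* Pick the smallest candidate that covers `request` without exceeding
   * `maxSize` ({0, 0} disables the constraint). */
  Size pickSensorSize(const std::vector<Size> &candidates,
                      const Size &request, const Size &maxSize)
  {
      Size best{ 0, 0 };
      bool constrained = maxSize.width && maxSize.height;

      for (const Size &size : candidates) {
          if (constrained &&
              (size.width > maxSize.width || size.height > maxSize.height))
              continue;
          if (size.width < request.width || size.height < request.height)
              continue;
          if (!best.width ||
              size.width * size.height < best.width * best.height)
              best = size;
      }

      return best;
  }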
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
When the dewarper is used, config->validate() needs to take the
restrictions of the dewarper into account. Add the corresponding checks.
As the useDewarper_ variable is now accessed earlier in
PipelineHandlerRkISP1::configure(), ensure it gets set early enough.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
For the validate() implementation, the RkISP1CameraConfiguration needs
access to dewarper related members of the PipelineHandlerRkISP1 object.
Allow this access by adding const versions of the pipe accessors and
making RkISP1CameraConfiguration a friend of PipelineHandlerRkISP1.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Allow access to the pipeline handler on a const instance of
Camera::Private.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
In configure(), and in the future in generateConfiguration(), the
calculated stream sizes and crop rectangles depend on whether the
dewarper is available. It is therefore not possible to disable the dewarper
later if configuration fails. Error out in that case.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
Refactor validation code to prepare for extensions in the upcoming
patches. Code duplication is reduced by moving parts of the validation
logic into a lambda function. This patch does not include any functional
changes.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Add to the Converter interface functions used by pipeline handlers
to validate and adjust the converter input and output configurations,
specifying the desired alignment for the adjustment.
Add the adjustInputSize() and adjustOutputSize() functions, which allow
adjusting the converter input/output sizes with the desired alignment.
Add a validateOutput() function meant to be used by the pipeline
handler implementations of validate(). The function adjusts a
StreamConfiguration to a valid configuration produced by the Converter.
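The adjustment itself is essentially clamping to the converter limits
plus rounding to the required alignment; a self-contained sketch with
made-up limits (not the Converter API):

  #include <algorithm>

  struct Size {
      unsigned int width;
      unsigned int height;
  };

  /* Clamp `request` to the converter's output size limits and round
   * each dimension down to the required alignment. The limits and
   * alignments here are illustrative only. */
  Size adjustOutputSize(const Size &request)
  {
      constexpr Size minSize{ 32, 16 };
      constexpr Size maxSize{ 4096, 3072 };
      constexpr unsigned int hAlign = 8;
      constexpr unsigned int vAlign = 2;

      Size out;
      out.width = std::clamp(request.width, minSize.width, maxSize.width);
      out.height = std::clamp(request.height, minSize.height, maxSize.height);
      out.width -= out.width % hAlign;
      out.height -= out.height % vAlign;
      return out;
  }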
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
|
|
The ScalerMaximumCrop property holds the biggest allowed ScalerCrop
value. Add it to the rkisp1.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
ScalerCrop is specified as being in sensor coordinates. The current
dewarper implementation on the imx8mp handles ScalerCrop in dewarper
coordinates. This leads to unexpected results and an unusable ScalerCrop
control in camshark. Fix that by transforming back and forth between
sensor coordinates and dewarper coordinates.
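The transform is a simple rescale between the two coordinate spaces
(self-contained sketch; libcamera's Rectangle and Size helpers provide
the equivalent operations):

  struct Rect {
      int x;
      int y;
      unsigned int width;
      unsigned int height;
  };

  /* Map a ScalerCrop rectangle given in sensor coordinates into dewarper
   * input coordinates by scaling each dimension. The inverse transform
   * (dewarper to sensor) swaps the two size arguments. */
  Rect sensorToDewarper(const Rect &crop,
                        unsigned int sensorW, unsigned int sensorH,
                        unsigned int dewarperW, unsigned int dewarperH)
  {
      Rect out;
      out.x = crop.x * static_cast<int>(dewarperW) / static_cast<int>(sensorW);
      out.y = crop.y * static_cast<int>(dewarperH) / static_cast<int>(sensorH);
      out.width = crop.width * dewarperW / sensorW;
      out.height = crop.height * dewarperH / sensorH;
      return out;
  }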
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
Query the crop bounds on the dewarper instead of the stream in case the
camera was not yet configured when updateControls() gets called. This
provides sane defaults for the controls.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
Add an isConfigured() function to be able to check if a given stream was
configured in the converter. This is useful in pipelines to either query
device or stream specific crop bounds depending on whether the stream is
configured or not.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The inputCropBounds_ member of the V4L2M2MConverter::Stream class is
only initialized after a V4L2M2MConverter::configure() call, when the
streams are initialized.
However, the converter has crop limits that do not depend on the
configured Streams, and which are currently not accessible from the
class interface.
Add a new inputCropBounds() function to the V4L2M2MConverter class
that allows retrieving the converter crop limits before any stream
is configured.
This is particularly useful for pipelines to initialize controls and
properties and to implement validation before the Camera gets
configured.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
|
|
In an upcoming patch it is necessary to get the crop bounds on the
converter itself. The V4L2M2MConverter contains code that is very
similar to the crop bounds retrieval code in V4L2M2MConverter::Stream.
Merge these code blocks into a static function to be used from both
classes. This
patch contains no functional changes.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|