|
Provide the request SensorTimestamp as the buffer completion time. This
is fake, as there is no real sensor timestamp on the VIVID pipeline (it
is a virtual video device), but it provides a suitable data point.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
In case the setFormat() call on the video device fails to match the
configuration, print both the requested and actual configurations to
ease debugging.
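A sketch of the diagnostic, assuming the stream configuration and V4L2 format are both in scope; the exact message is illustrative:
"""
LOG(VIVID, Error)
	<< "Requested " << cfg.toString()
	<< ", got " << format.toString();
"""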
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Initialize the CameraData properties with Location and Model.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When constructing the camera, we parse the available controls on the
video capture device, map the supported controls to libcamera controls,
and initialise their defaults.
The controls are handled during queueRequestDevice for each request and
applied to the device through the capture node.
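A sketch of the construction-time mapping; the control choices and ranges are illustrative:
"""
const ControlInfoMap &v4l2Controls = video_->controls();
ControlInfoMap::Map ctrls;

for (const auto &[v4l2Id, v4l2Info] : v4l2Controls) {
	switch (v4l2Id->id()) {
	case V4L2_CID_BRIGHTNESS:
		ctrls.emplace(&controls::Brightness,
			      ControlInfo{ { -1.0f }, { 1.0f }, { 0.0f } });
		break;
	case V4L2_CID_SATURATION:
		ctrls.emplace(&controls::Saturation,
			      ControlInfo{ { 0.0f }, { 2.0f }, { 1.0f } });
		break;
	}
}

controlInfo_ = ControlInfoMap(std::move(ctrls), controls::controls);
"""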
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The VIVID device retains global state for its controls. Ensure that
when we configure this specific pipeline we set initial parameters on
the device that suit our specific needs.
This introduces how controls can be set directly on a device; however,
under normal circumstances controls should be set through libcamera
controls as part of a request. These settings are VIVID-specific.
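A sketch of setting the initial parameters directly on the capture device during configure(); the values are illustrative:
"""
ControlList controls(data->video_->controls());
controls.set(V4L2_CID_BRIGHTNESS, 128);
controls.set(V4L2_CID_CONTRAST, 128);
controls.set(V4L2_CID_SATURATION, 128);

int ret = data->video_->setControls(&controls);
if (ret) {
	LOG(VIVID, Error) << "Failed to set initial controls: " << ret;
	return ret < 0 ? ret : -EINVAL;
}
"""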
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When a request is given to a pipeline handler, it must parse the request
and identify the actions to take on the hardware.
In the case of the VIVID pipeline handler, we identify the buffer from the only
supported stream, and queue it to the video capture device.
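A sketch of queueRequestDevice() for the single-stream case, using the names established so far:
"""
int PipelineHandlerVivid::queueRequestDevice(Camera *camera, Request *request)
{
	VividCameraData *data = cameraData(camera);
	FrameBuffer *buffer = request->findBuffer(&data->stream_);
	if (!buffer) {
		LOG(VIVID, Error)
			<< "Attempt to queue request with invalid stream";
		return -ENOENT;
	}

	return data->video_->queueBuffer(buffer);
}
"""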
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
We can now add buffer management, and connect up our bufferReady signal
to a callback.
Note that we provide the ability to export buffers from our capture
device (data->video_) using the exportBuffers() functionality from the
V4L2VideoDevice, which allows a FrameBufferAllocator to obtain buffers
from this device.
When buffers are obtained through the exportFrameBuffers API, they are
orphaned and left unassociated with the device, and must be reimported
at start() time anyway. This allows the same interface to be used
whether internal or external buffers are used for the stream.
When a buffer completes, we call the buffer completion handler on the
pipeline handler, and because we have only a single stream, we can also
immediately complete the request.
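A sketch of the plumbing, assuming the names used so far; the signal is connected when the camera data is initialised:
"""
int PipelineHandlerVivid::exportFrameBuffers(Camera *camera, Stream *stream,
					     std::vector<std::unique_ptr<FrameBuffer>> *buffers)
{
	VividCameraData *data = cameraData(camera);
	unsigned int count = stream->configuration().bufferCount;

	return data->video_->exportBuffers(count, buffers);
}

/* In VividCameraData::init(): */
video_->bufferReady.connect(this, &VividCameraData::bufferReady);
"""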
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When the configurations have been generated and validated, they can be
applied to a device.
Vivid supports only a single stream, so it directly obtains the first
StreamConfiguration from the CameraConfiguration.
The VIVID capture device is a V4L2VideoDevice, so we generate a
V4L2DeviceFormat to apply directly to the capture device node.
Note that we convert the libcamera PixelFormat stored in
cfg.pixelFormat to a V4L2PixelFormat using the V4L2PixelFormat helper.
This currently defaults to the single-planar formats, and should be
extended to support the multiplanar configuration from the V4L2Device.
[todo Repair the link between the multiplanar configuration of the
V4L2VideoDevice and the pixel format selection]
Following the call to set the format using the kernel API, if the format
has been adjusted in any way by the kernel driver, then we have failed
to correctly handle the validation stages, and thus the configure
operation is identified as having failed.
Finally, stream-specific data can be stored, reflecting the state of
the stream.
[NOTE: the cfg.setStream() call here associates the stream to the
StreamConfiguration however that should quite likely be done as part of
the validation process. TBD]
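A condensed sketch of configure(), assuming the names above:
"""
StreamConfiguration &cfg = config->at(0);

V4L2DeviceFormat format = {};
format.fourcc = V4L2PixelFormat::fromPixelFormat(cfg.pixelFormat);
format.size = cfg.size;

int ret = data->video_->setFormat(&format);
if (ret)
	return ret;

/* Any adjustment by the driver means validation did not converge. */
if (format.size != cfg.size ||
    format.fourcc != V4L2PixelFormat::fromPixelFormat(cfg.pixelFormat))
	return -EINVAL;

cfg.setStream(&data->stream_);
cfg.stride = format.planes[0].bpl;

return 0;
"""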
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Implement support for generating and validating the streams the
camera can provide.
Vivid is a simple case with only a single stream.
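A sketch of the generation step, assuming a VividCameraConfiguration subclass that implements validate() for the single-stream case:
"""
VividCameraData *data = cameraData(camera);

std::map<PixelFormat, std::vector<SizeRange>> streamFormats;
for (const auto &[v4l2Format, sizes] : data->video_->formats()) {
	PixelFormat pixelFormat = v4l2Format.toPixelFormat();
	if (pixelFormat.isValid())
		streamFormats[pixelFormat] = sizes;
}

CameraConfiguration *config = new VividCameraConfiguration();
StreamFormats vividFormats(streamFormats);
StreamConfiguration cfg(vividFormats);

cfg.pixelFormat = formats::BGR888;
cfg.size = { 1280, 720 };
cfg.bufferCount = 4;

config->addConfiguration(cfg);
config->validate();

return config;
"""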
Test that the configurations can be generated and reported with cam -I:
"""
LIBCAMERA_LOG_LEVELS=Pipeline,VIVID:0 ./src/cam/cam -c 1 -I
[232:02:09.633067174] [2882911] INFO IPAManager ipa_manager.cpp:136 libcamera is not installed. Adding '/home//libcamera/build-vivid/src/ipa' to the IPA search path
[232:02:09.633332451] [2882911] WARN IPAManager ipa_manager.cpp:147 No IPA found in '/usr/local/lib/x86_64-linux-gnu/libcamera'
[232:02:09.633373414] [2882911] INFO Camera camera_manager.cpp:283 libcamera v0.0.11+714-d1ebd889-dirty
Using camera vivid
0: 1280x720-BGR888
* Pixelformat: NV21 (320x180)-(3840x2160)/(+0,+0)
- 320x180
- 640x360
- 640x480
- 1280x720
- 1920x1080
- 3840x2160
* Pixelformat: NV12 (320x180)-(3840x2160)/(+0,+0)
- 320x180
- 640x360
- 640x480
- 1280x720
- 1920x1080
- 3840x2160
* Pixelformat: BGRA8888 (320x180)-(3840x2160)/(+0,+0)
- 320x180
- 640x360
- 640x480
- 1280x720
- 1920x1080
- 3840x2160
* Pixelformat: RGBA8888 (320x180)-(3840x2160)/(+0,+0)
- 320x180
- 640x360
- 640x480
- 1280x720
- 1920x1080
- 3840x2160
"""
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Create a VividCameraData inheriting from the CameraData to handle camera
specific data, and use it to create and register the camera with the
CameraManager.
This can now be tested to see that the camera becomes available to
applications:
"""
LIBCAMERA_LOG_LEVELS=Pipeline,VIVID:0 ./src/cam/cam -l
[231:44:49.325333712] [2880028] INFO IPAManager ipa_manager.cpp:136 libcamera is not installed. Adding '/home/libcamera/build-vivid/src/ipa' to the IPA search path
[231:44:49.325428449] [2880028] WARN IPAManager ipa_manager.cpp:147 No IPA found in '/usr/local/lib/x86_64-linux-gnu/libcamera'
[231:44:49.325446253] [2880028] INFO Camera camera_manager.cpp:283 libcamera v0.0.11+713-d175334d-dirty
Available cameras:
1: vivid
"""
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Verify that we can match on our expected device(s).
Use a temporary debug print to check that the pipeline finds
our device:
"""
LIBCAMERA_LOG_LEVELS=Pipeline,VIVID:0 ./src/cam/cam -l
<snipped>
[230:51:10.670503423] [2872877] DEBUG VIVID vivid.cpp:81 Obtained Vivid Device
"""
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Explicitly disable the unused-parameter warning in this pipeline handler.
Parameters are left unused while they are introduced incrementally, so,
for documentation purposes only, we disable this warning so that each
commit can be compiled independently without breaking the flow of the
development additions.
This is not recommended practice within libcamera; please listen to your
compiler warnings.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Provide all of the skeleton stubs to successfully compile
and register a new Pipeline Handler for the Vivid test device.
Meson must be reconfigured to ensure that this pipeline handler is
included in the selected pipelines configuration, and after building, we
can test that the PipelineHandler is successfully registered by listing
the cameras on the system with LIBCAMERA_LOG_LEVELS enabled:
"""
LIBCAMERA_LOG_LEVELS=Pipeline:0 ./build-vivid/src/cam/cam -l
[230:30:03.624102821] [2867886] DEBUG Pipeline pipeline_handler.cpp:680 Registered pipeline handler "PipelineHandlerVivid"
Available cameras:
"""
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Since the v4l2 compatibility layer does not have external dependencies,
the usual benefits of a feature option do not apply. The main
motivation behind this change is making use of the `auto_features`
meson option, which can change all feature options that default to
"auto" to the desired value.
This can be useful for two reasons: (1) using auto_features=disabled
and then building up the list of features that are desired in the
particular build; (2) using auto_features=enabled to achieve
(usually) more testing and compilation coverage.
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
If `FrameBufferAllocator::allocate()` causes the construction to be
aborted, the allocated `GstLibcameraAllocator` will not be
deallocated properly. Use `g_autoptr()` to address this.
`g_steal_pointer()` can only be used with glib 2.68 or later, because
in earlier versions it evaluates to a pointer-to-void in C++, which
would necessitate a `static_cast`.
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
|
|
`FrameBufferAllocator::allocate()` might return a negative error code,
but currently this is handled the same way as success. So instead
of continuing, abort the construction of the allocator object.
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
|
|
Previously we called Thread::setThreadAffinityInternal in
Thread::startThread. The purpose was to avoid the main workload being
run on incorrect CPUs. However, this creates a race between setting
`Thread::thread_` in `Thread::start()` and accessing it in
`Thread::setThreadAffinityInternal()`.
This patch moves the call to after the construction of std::thread to
avoid the race condition. The downside is that the first tasks, if any,
upon starting a thread might run on incorrect CPUs.
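A condensed sketch of the fix; surrounding details are elided:
"""
void Thread::start()
{
	MutexLocker locker(data_->mutex_);

	if (data_->running_)
		return;

	data_->running_ = true;

	thread_ = std::thread(&Thread::startThread, this);

	/*
	 * thread_ is now fully constructed, so setting the affinity here
	 * can no longer race with the new thread accessing it.
	 */
	setThreadAffinityInternal();
}
"""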
Fixes: 4d9db06d6690 ("libcamera: add method to set thread affinity")
Signed-off-by: Harvey Yang <chenghaoyang@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Some sensors support producing and transmitting embedded data over a
stream separate from the image stream. Add support for this feature in
the CameraSensor interface, and implement it for the CameraSensorRaw
class. The CameraSensorLegacy uses the default stub implementation, as
the corresponding kernel drivers don't support embedded data.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a new CameraSensorRaw implementation of the CameraSensor interface
tailored to devices that implement the new V4L2 raw camera sensors API.
This new class duplicates code from the CameraSensorLegacy class. The
two classes will be refactored to share code.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
With support for metadata in the streams API, the v4l2_meta_format
structure has been extended with width, height and bytesperline fields.
Support them in the V4L2VideoDevice getFormat() and setFormat()
functions if the video device is a metadata capture device and the
pixel format is one of the generic line-based metadata formats.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Support the newly introduced V4L2 media bus formats for metadata. This
includes generic metadata formats, and two sensor-specific embedded data
formats.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a CamHelper::setHwConfig() helper used by the IPA to set the
hardware configuration in use by the pipeline. This will be needed by
the IMX500 camera helper in a future commit to determine if the
metadata buffer is strided.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This property (dataBufferStrided) indicates if the CSI-2 hardware writes
to the embedded/metadata buffer directly, or if it treats the buffer
like an image buffer and strides the metadata lines.
Unicam writes this buffer strided, while the PiSP Frontend writes to it
directly. This information will be relevant to data parsers in the
helpers where the data is structured in lines.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
These functions erase a key/value pair from the metadata object.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use an r-value reference in set() and setLocked(), allowing more
efficient metadata handling with std::forward and std::move if needed.
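A sketch of the forwarding overload, assuming the controller's mutex-guarded map storage:
"""
template<typename T>
void set(const std::string &tag, T &&value)
{
	std::scoped_lock<std::mutex> lock(mutex_);
	data_[tag] = std::forward<T>(value);
}
"""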
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Basic tuning done by David Plowman using a Waveshare SKU 28524
"IMX415-98 IR-CUT Camera" module.
Signed-off-by: Dave Stevenson <dave.stevenson@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
As another Starvis sensor, it is nearly identical to the imx290/327.
Signed-off-by: Dave Stevenson <dave.stevenson@raspberrypi.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Believed correct based on imx290.
Signed-off-by: Dave Stevenson <dave.stevenson@raspberrypi.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
As DmaSyncer does sync start/end in the c'tor/d'tor, copying a DmaSyncer
instance would trigger sync end earlier than expected. This patch makes
it non-copyable to avoid the issue.
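A sketch of the resulting class shape; implementation details are elided and names are assumptions:
"""
class DmaSyncer final
{
public:
	explicit DmaSyncer(SharedFD fd);
	~DmaSyncer(); /* issues the DMA_BUF_IOCTL_SYNC end step */

	/*
	 * Copying would let two instances end the same sync, the first
	 * of them earlier than expected; allow moves only.
	 */
	DmaSyncer(const DmaSyncer &) = delete;
	DmaSyncer &operator=(const DmaSyncer &) = delete;
	DmaSyncer(DmaSyncer &&) = default;
	DmaSyncer &operator=(DmaSyncer &&) = default;
};
"""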
Fixes: 39482d59fe71 ("DmaBufAllocator: Add Dma Buffer synchronization function & helper class")
Signed-off-by: Harvey Yang <chenghaoyang@chromium.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
After the initial generation, when a frame is requested,
the test pattern generator rotates the image left by 1 column.
The current approach has two shortcomings:
(1) it allocates a temporary buffer to hold one column;
(2) it swaps two columns at a time.
The test patterns are simple ARGB images, in row-major order,
so doing (2) works against memory prefetching. This can be
addressed by doing the rotation one row at a time as that way
the image is addressed in a purely linear fashion. Doing so also
eliminates the need for a dynamically allocated temporary buffer,
as the required buffer now only needs to hold one sample,
which is 4 bytes in this case.
In an optimized build, this results in about a 2x increase in the
number of frames per second as reported by `cam`. In an unoptimized,
ASAN- and UBSAN-instrumented build, the difference is even bigger,
which is useful for running lc-compliance in CI in a reasonable time.
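A sketch of the row-wise rotation, assuming a packed ARGB image in row-major order with 4 bytes per sample; the function name is illustrative:
"""
#include <cstdint>
#include <cstring>

void rotateLeftOneColumn(uint8_t *image, unsigned int width,
			 unsigned int height)
{
	constexpr size_t bpp = 4; /* ARGB */
	const size_t stride = width * bpp;

	for (unsigned int y = 0; y < height; y++) {
		uint8_t *row = image + y * stride;
		uint8_t first[bpp];

		/*
		 * Save the left-most sample, shift the row left in one
		 * linear pass, and wrap the saved sample to the right edge.
		 */
		std::memcpy(first, row, bpp);
		std::memmove(row, row + bpp, stride - bpp);
		std::memcpy(row + stride - bpp, first, bpp);
	}
}
"""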
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
`PixelFormatInfo::planes.size()` always returns 3 since `planes` is
an array, but that is not the number of planes of the pixel format.
Use the `numPlanes()` getter instead.
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
There is no reason to make copies; these functions return
const lvalue references, so access the data through those.
Signed-off-by: Barnabás Pőcze <pobrn@protonmail.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a lux algorithm module to rkisp1 IPA for estimating the lux level of
an image. This is reported in metadata, as well as saved in the frame
context so that other algorithms (mainly AGC) can use its value. It does
not set any controls.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Isaac Scott <isaac.scott@ideasonboard.com>
|
|
Add a Lux helper to libipa that estimates the lux level given the
gain, exposure, and luminance histogram. The helper also handles
reading the reference values from the tuning file. These are expected
to be common operations of lux algorithm modules in IPAs, and the
implementation is modelled on (and partly copied from) Raspberry Pi's.
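A sketch of the estimation, modelled on Raspberry Pi's algorithm; member and parameter names are assumptions:
"""
double Lux::estimateLux(utils::Duration exposureTime, double gain,
			const Histogram &yHist) const
{
	double currentY = yHist.interQuantileMean(0.0, 1.0);
	double exposureTimeRatio = referenceExposureTime_ / exposureTime;
	double gainRatio = referenceGain_ / gain;
	/* Normalise the mean luminance to the reference bin count. */
	double yRatio = currentY * (65536.0 / yHist.bins()) / referenceY_;

	return exposureTimeRatio * gainRatio * yRatio * referenceLux_;
}
"""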
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Isaac Scott <isaac.scott@ideasonboard.com>
|
|
ColourTemperature is now exported as a writable control so that
applications can set it directly. The AWB algorithm class now requires
a method to be provided to perform this operation. The method should
clamp the passed value to the calibrated range known to the algorithm.
The default range is set very wide to cover all conceivable future AWB
calibrations. It will always be clamped before use.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Tested-by: Stefan Klug <stefan.klug@ideasonboard.com>
|
|
There are many use-cases (tuning-validation, working in static
environments) where a manual ColourTemperature control is helpful.
Implement that by interpolating and applying the white balance gains
from the tuning file according to the requested colour temperature. If
colour gains are provided on the same request, they take precedence.
Store the colour temperature used for a given frame in the frame context
and report that in metadata.
Note that in the automatic case, the colour gains are still based on the
gray world model, and the CT curve from the tuning file is ignored.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
For the implementation of a manual colour temperature setting, it is
necessary to read predefined colour gains per colour temperature from
the tuning file. Implement this in a backwards compatible way. If no
gains are contained in the tuning file, loading just continues as
before.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
For manual control it is helpful to be able to specify a fixed colour
temperature. It also provides an easy way to apply the temperature
specific CCMs and colour gains that are contained in the tuning files.
Document this and update the control dependencies.
Signed-off-by: Stefan Klug <stefan.klug@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that there is support for retrieving the allowed directions of a
control, print this information when listing controls.
Sample output:
$ cam --list-controls -c 2
Using camera Virtual0 as cam0
Control: [inout] draft::FaceDetectMode:
- FaceDetectModeOff (0)
Control: [inout] libcamera::FrameDurationLimits: [16666..33333]
Size: 2
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add support to ControlId for querying direction information. This allows
applications to query whether a ControlId is meant for being set in
controls, to be returned in metadata, or both. This also has the side
effect of properly encoding this information, as previously it was only
mentioned loosely and inconsistently in the control ID definitions.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
In preparation for adding support for querying direction information
from controls, parse the direction information from control ID
definitions. This can later be plugged directly into the IPA code
generators simply by using ctrl.direction.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Stefan Klug <stefan.klug@ideasonboard.com>
|
|
In preparation for adding support for querying direction information
from controls, populate the corresponding field in the control ID
definitions.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Stefan Klug <stefan.klug@ideasonboard.com>
|
|
Add a tuning data file for the IMX415 camera sensor. The black level
offset data is drawn from the camera sensor's datasheet. The lens
shading tables are generated through the libtuning LSC modules.
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
Add a lens shading correction algorithm to the mali-c55 IPA. This
algorithm parses tables from YAML in an easy-to-follow format before
munging them into Arm's interleaved mesh to be copied to the ISP.
A colour temperature estimate from the AGC statistics is used to
select the appropriate table to apply; this can be some interpolation
of two tables, in which case the colour temperature estimate is also
used to derive the coefficient that does the blending.
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
Add a simple grey-world auto white balance algorithm to the mali-c55
IPA.
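A minimal grey-world sketch: assume the scene averages to grey and derive the red and blue gains from the channel sums in the AWB statistics; the names are illustrative:
"""
double sumR = 0.0, sumG = 0.0, sumB = 0.0;

for (const auto &zone : awbZones) {
	sumR += zone.r;
	sumG += zone.g;
	sumB += zone.b;
}

/* Guard against empty or black scenes before dividing. */
double redGain = sumR > 0.0 ? sumG / sumR : 1.0;
double blueGain = sumB > 0.0 ? sumG / sumB : 1.0;
"""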
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
Add a Black Level Correction algorithm.
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Daniel Scally <dan.scally@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|
|
Add a new algorithm and associated infrastructure for Agc. The
tuning files for uncalibrated sensors are extended to enable the
algorithm.
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Plumb in the Pipeline-IPA loop.
Load the IPA module at camera creation time and create the loop between
the pipeline and the IPA.
When a new Request is queued, the IPA is asked to prepare the parameters
buffer; once ready, it notifies the pipeline, which queues the
parameters to the ISP along with buffers for statistics and frames.
Once statistics are ready, they get passed to the IPA, which updates its
settings for the next frame.
As a drive-by, fix an error message in the pipeline handler's
freeBuffers() function, which reported a problem with the wrong video
device in an error path.
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add a barebones IPA module for the Mali-C55 ISP. In this initial
version, pretty much only buffer plumbing is implemented.
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Acquire the mali-c55 3a stats and parameters video devices during
::match() and plumb them in.
For this commit we simply allocate and release buffers for the
statistics and parameters. Statistics buffers are queued and dequeued
from the stats video device, but their contents are for now untouched.
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Nayden Kanchev <nayden.kanchev@arm.com>
Co-developed-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Signed-off-by: Daniel Scally <dan.scally@ideasonboard.com>
|