Age | Commit message | Author |
|
Currently, when allocating buffers, the streams of the Camera object are
used. Instead, the streams of the CameraConfiguration object should be
used. This is because the Camera object holds all available streams
while the CameraConfiguration holds only the streams associated with the
current configuration.
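As an illustration, a minimal sketch of allocating from the configured
streams only, assuming libcamera's public FrameBufferAllocator API; the
helper function name is hypothetical:

#include <libcamera/camera.h>
#include <libcamera/framebuffer_allocator.h>

using namespace libcamera;

/* Allocate buffers for the streams of the active configuration only,
 * not for every stream exposed by the Camera. */
static int allocateBuffers(CameraConfiguration *config,
                           FrameBufferAllocator *allocator)
{
    for (StreamConfiguration &streamConfig : *config) {
        int ret = allocator->allocate(streamConfig.stream());
        if (ret < 0)
            return ret;
    }

    return 0;
}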
Signed-off-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com>
Reviewed-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
This patch adds support for the Soho Enterprises SE327M12 module. The
sensor is an imx327 which therefore uses the imx290 kernel driver and
CamHelper.
To use this module and camera tuning, place the following in the
/boot/config.txt file:
dtoverlay=imx290,clock-frequency=37125000
and then the existing imx290.json file must be overwritten with
se327m12.json
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
If the ISP::Output0 stream has not been configured, we must enable it
with a default format and resolution for internal use. This is to allow
the pipeline handler data flow to be consistent, and allow the IPA to
receive statistics for the frame.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
It is possible for the application to call pipeline_handler::configure()
multiple times, which would attempt to open the Unicam embedded data
node on every call. This would cause a warning message as the node
would have already been opened. Avoid this by tracking if the node
has previously been opened.
Note that this is a temporary fix since the open call for the Unicam
embedded data node will be moved from pipeline_handler::configure() to
pipeline_handler::match().
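A hedged sketch of such a guard; the member and accessor names are
assumptions, not the actual pipeline handler code:

/* Only open the embedded data node on the first configure() call. */
if (!embeddedNodeOpened_) {
    int ret = unicam_[Unicam::Embedded].dev()->open();
    if (ret)
        return ret;

    embeddedNodeOpened_ = true;
}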
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Document struct DelayedControls::ControlParams and its associated
fields.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
If gst_buffer_pool_acquire_buffer in gst_libcamera_task_run fails, the
unique_ptr to the request object gets reset and hence its destructor is
called. However, the wrap object points to the same object and is still
alive at this moment. When the task_run function finishes, the
destructor of the wrap object is called, which in turn calls the
destructor of the request object again.
Instead of taking care of both the request and the wrap object, we can
move the request into the wrap, which will then effectively take care of
the request object automatically.
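A minimal sketch of the ownership transfer; RequestWrap comes from the
description above, while acquireRequest() is a hypothetical helper:

std::unique_ptr<Request> request = acquireRequest();
if (!request)
    return;

/* Hand ownership to the wrap; its destructor is now the only place
 * where the Request gets destroyed. */
auto wrap = std::make_unique<RequestWrap>(std::move(request));
/* `request` is now null, so resetting it can no longer double-free. */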
Signed-off-by: Marian Cichy <m.cichy@pengutronix.de>
Suggested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
|
|
There was an off-by-one error in DelayedControls::get() when picking
controls from the queue to return to the pipeline handler.
This is only noticeable as small oscillations in brightness when closely
viewing frames while AGC is running. The old StaggeredCtrl did not show
this error as the startup queuing mechanism has changed in
DelayedControls.
Fix this by indexing to the correct position in the queue.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reported-by: David Plowman <david.plowman@raspberrypi.com>
Fixes: 3d4b7b005911 ("libcamera: delayed_controls: Add helper for controls that apply with a delay")
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
In DelayedControls::applyControls(), the controls queue would be
extended via a no-op control push to fill the intermediate slots with
unchanged control values. This is needed so that we read back unchanged
control values correctly on every frame.
However, there was one additional no-op performed on every frame that
was not required, meaning that any controls queued by the pipeline
handler would have their write delayed by one further frame. The
original StaggeredCtrl did not do this, as it only had one index to
manage, whereas DelayedControls uses two.
Remove this last no-op push so that controls provided by the pipeline
handler are written on the very next frame if possible. As a
consequence, we must mark the control update as completed within the
DelayedControls::applyControls() loop, otherwise it might get reused
when cycling through the ring buffer.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reported-by: David Plowman <david.plowman@raspberrypi.com>
Fixes: 3d4b7b005911 ("libcamera: delayed_controls: Add helper for controls that apply with a delay")
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
On DelayedControls::reset(), the values retrieved from the sensor device
were added to the queues with the updated flag set to true. This would
cause the helper to write out the values to the device again on the first
DelayedControls::applyControls() call. This is unnecessary, as the
controls written are identical to what is stored in the device driver.
Fix this by explicitly setting the update flag to false in
DelayedControls::reset() when adding the controls to the queue.
Additionally, use the Info() constructor when adding items to the queue
for consistency.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Fixes: 3d4b7b005911 ("libcamera: delayed_controls: Add helper for controls that apply with a delay")
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
If an exposure time change adjusts the vblanking limits, and we set both
the VBLANK and EXPOSURE controls through the VIDIOC_S_EXT_CTRLS ioctl,
the latter may fail if the value is outside the limits calculated from
the old VBLANK value. This is a limitation in V4L2 and cannot be fixed
by setting VBLANK before EXPOSURE in a single VIDIOC_S_EXT_CTRLS ioctl.
The workaround here is to have the DelayedControls object mark the
VBLANK control as "priority write", which then writes VBLANK separately
from (and ahead of) any other controls. This way, the sensor driver will
update the EXPOSURE control limits before the new value is presented,
which will thus be seen as valid.
To support this, a new struct DelayedControls::ControlParams is used in
the constructor to provide the control delay value as well as the
priority write flag.
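For illustration, a pipeline handler might describe its sensor controls
roughly as follows; the delay values are made up and the constructor
signature is assumed from this description:

std::unordered_map<uint32_t, DelayedControls::ControlParams> params = {
    { V4L2_CID_ANALOGUE_GAIN, { 1, false } },
    { V4L2_CID_EXPOSURE, { 2, false } },
    /* Written separately, ahead of the other controls. */
    { V4L2_CID_VBLANK, { 2, true } },
};

auto delayedCtrls = std::make_unique<DelayedControls>(sensorDevice, params);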
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
[Kieran: Fix up trivial comments, merge conflicts]
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
We borrow a GstBuffer from the pool; if preparing the buffer fails, we
need to push it back to avoid leaking it.
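In generic GStreamer terms (not necessarily the exact code path of this
fix, and with prepare_buffer() as a placeholder), the pattern is:

GstBuffer *buffer = nullptr;
GstFlowReturn ret = gst_buffer_pool_acquire_buffer(pool, &buffer, nullptr);
if (ret != GST_FLOW_OK)
    return ret;

if (!prepare_buffer(buffer)) {
    /* Return the borrowed buffer to the pool instead of leaking it. */
    gst_buffer_pool_release_buffer(pool, buffer);
    return GST_FLOW_ERROR;
}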
Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Both the DeviceProvider and Device classes had the same mistake,
calling G_OBJECT_GET_CLASS() instead of G_OBJECT_CLASS() when
chaining their finalize call up to their base class. This would crash
at destruction, which caused the gst-device-monitor-1.0 tool to crash,
as well as any application using that API.
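The standard GObject chain-up pattern, for reference; the class and
parent_class names are illustrative:

static void gst_libcamera_device_finalize(GObject *object)
{
    /* Release our own resources here, then chain up to the parent
     * class. G_OBJECT_GET_CLASS(object) would return this instance's
     * own class and recurse into this very function. */
    G_OBJECT_CLASS(gst_libcamera_device_parent_class)->finalize(object);
}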
Signed-off-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In kernel 5.11 the rkisp1 uAPI changed to support different hardware
revisions. Currently only revision 10 is supported by the rkisp1 IPA,
and therefore 'init' should fail if the revision is not 10.
This change depends on the kernel driver reporting the hardware
revision, and thus requires the rkisp1 driver from v5.11 or newer.
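A sketch of the intended check; RKISP1_V10 follows the rkisp1 uAPI
header, while the surrounding function is hypothetical:

/* Reject unsupported ISP revisions at IPA init time. */
int checkHwRevision(unsigned int hwRevision)
{
    if (hwRevision != RKISP1_V10)
        return -EINVAL; /* only revision 10 is supported for now */

    return 0;
}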
Signed-off-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The rkisp1 IPA relies on some of the camera's controls and therefore
can't work if those controls are not provided. Return -EINVAL from
'configure' in that case.
Also return an error from the pipeline handler's 'configure' method if
the IPA configuration fails.
Signed-off-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a method 'hwRevision' to return the
info.hw_version reported by the driver.
Signed-off-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
According to the EXIF specification, the GPS processing method should
be of type UNDEFINED, with the first 8 bytes designating the encoding
type. However, CTS expects the first 8 bytes to be part of the data.
Remove the 8-byte encoding designator by changing the encoding to
NoEncoding to appease CTS.
This is part of the fix that allows the following CTS test to pass:
- android.hardware.cts.CameraTest#testJpegExif
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
There was a copy-paste error that caused the latitude to be set twice and
the longitude never. Fix this.
This is part of the fix that allows the following CTS test to pass:
- android.hardware.cts.CameraTest#testJpegExif
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Now that setRational() supports setting multiple rational values, use
that in setGPSDateTimestamp and setGPSDMS which previously set every
rational manually.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
setRational was not working properly for EXIF tags in the GPS IFD due to
libexif not supporting those tags in exif_entry_initialize(). Manually
specify the size of the EXIF entry to fix this. While at it, add support
for setting multiple rationals, as that is a common use case for
rational EXIF tags.
As Rational types are no longer initialized by libexif directly, the
EXIF_TAG_{X,Y}_RESOLUTION exif tags will not have their default values
populated.
This allows the GPS altitude to be set properly, and is part of the fix
to allow the following CTS test to pass:
- android.hardware.cts.CameraTest#testJpegExif
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The capture request template for video recording use cases requires
a fixed FPS range. Generate the request templates for the VIDEO_RECORD
and VIDEO_SNAPSHOT capture intents by using the preview template and
updating the supported FPS range.
This change fixes the CTS test
android.hardware.camera2.cts.CameraDeviceTest#testCameraDeviceRecordingTemplate
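A hedged sketch of the template derivation; the requestTemplateVideo()
name and the metadata update call are assumptions rather than the
actual HAL code:

CameraMetadata *CameraDevice::requestTemplateVideo()
{
    CameraMetadata *recordTemplate = requestTemplatePreview();
    if (!recordTemplate)
        return nullptr;

    /* Pin the AE FPS range to a fixed value for recording. */
    const int32_t fpsRange[] = { 30, 30 };
    recordTemplate->updateEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,
                                fpsRange, 2);

    return recordTemplate;
}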
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The camera's supported FPS range is crucial to distinguish between
capture templates for preview and video recording. So far, if the
pipeline handler did not specify an available FPS range by registering
the controls::FrameDurations property, the control was simply not added
to the generated capture template.
In order to prepare for generating templates for video recording, which
require a fixed FPS range, fail earlier and generate no template at all
if the available FPS range is not provided by the Camera.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The current implementation of constructDefaultRequestSettings()
returns the same capture template for all capture intents.
As the correctness of the generated template is verified by CTS, it is
better to return an error for unsupported capture use cases.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The result metadata reports an arbitrary {30, 30} FPS range for the
AE algorithm.
The actual FPS range should be returned in the Request::metadata, but
as libcamera currently does not support that feature temporarily work
around the issue and return the FPS range requested by the camera
framework.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The request template returned by requestTemplatePreview() uses an
arbitrary {15, 30} Auto-Exposure algorithm FPS range. Use the range
calculated at static metadata creation time instead, which is consistent
with the camera limits.
Once template generation is performed by inspecting the requested
capture intent, the FPS range over which the AE algorithm can operate
shall be tuned accordingly.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Use the FrameDuration control reported by pipeline handlers to register
the Auto-Exposure routine FPS range, the minimum stream frame durations
and the sensor maximum frame duration.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Register the FrameDurations control in the IPU3 pipeline handler,
computing its limits from the vertical blanking limits and the sensor
pixel rate.
The FrameDurations control limits should be updated every time a new
configuration is applied to the sensor.
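The underlying relation, sketched with assumed variable names and with
the surrounding code omitted (this is not the actual IPU3 code):

/* Frame duration for a given vertical blanking value: lineLengthPixels
 * is in pixels, pixelRate in pixels per second, durations in us. */
double lineDuration = lineLengthPixels / (pixelRate / 1e6);
int64_t minFrameDuration = (sensorHeight + minVBlank) * lineDuration;
int64_t maxFrameDuration = (sensorHeight + maxVBlank) * lineDuration;

ctrls[&controls::FrameDurations] =
    ControlInfo(minFrameDuration, maxFrameDuration);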
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The controls' limits initialized by the IPU3 pipeline handler depend
on the sensor configuration. In order to compute the controls from a
known state, apply to the sensor a configuration matching its full
resolution.
Move the \todo note regarding the controls' limits dependency on the
sensor configuration to the beginning of the function and remove the
other redundant notes in the function body.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Now that we support returning an int directly in addition to other output
parameters, improve the configure() function in the raspberrypi IPA
interface.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The name vblankDelay is clearer.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
imx290 and imx327 share the same kernel driver (imx290.c) and are
therefore both recognised here as "imx290". We add the necessary
CamHelper for these sensors, as well as a camera tuning file.
The tuning was done with an Innomaker STARVIS IMX327LQR module. These
have no IR cut filter so there is no proper colour tuning. However,
you should obtain reasonable results for most modules using this
sensor. Specific tunings for further modules can always be added
subsequently.
To use this sensor on the Raspberry Pi platform, please add
dtoverlay=imx290,clock-frequency=74250000
into your /boot/config.txt file.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
For some sensors (e.g. imx477) we need to update the vblanking on the
frame before the exposure. For this reason the GetDelays method must
also return the number of frame delays for the vblanking control.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Since commit 96aecfe36508 ("libcamera: camera_sensor: Use active area
size as resolution") the CameraSensor::resolution() method has returned
the sensor's active pixel area size.
The CameraSensor::resolution() method is widely used in the library
code base to retrieve the maximum frame size the sensor can produce, so
if that maximum frame size is smaller than the pixel area size, the
returned size cannot be used to configure the sensor correctly.
Fix this by returning the maximum frame resolution the sensor can
produce, or the pixel area size in case the sensor embeds an ISP that
can upscale and the supported maximum frame size is thus larger than
the pixel array size.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
When a frame completes, IPU3Frames deletes its internal info storage.
This storage contains the pointer to the Request, but in some cases the
pointer was being accessed after the info structure was removed.
Ensure that the Request pointer is obtained before attempting to
complete the info, so that a valid pointer is used afterwards.
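A sketch of the ordering fix, with names assumed from the IPU3 frame
helper and the completion calls left as a comment:

/* Save the Request pointer before the info entry that owns it is
 * removed; `info` must not be dereferenced afterwards. */
Request *request = info->request;
frameInfos_.remove(info);
/* ... complete the remaining buffers and the request using `request` ... */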
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Fix trivial spelling mistake.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add tracing to the base pipeline handler class to track when requests are queued.
Tracing is already available for other Request operations, but queuing a Request
is not an operation handled by the Request itself.
Add the tracepoint to PipelineHandler::queueRequest() so the lifetime
of a Request can be viewed when tracing.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When a pipeline handler completes a request, the request itself is not
deleted by libcamera, and the application regains control over the
object. It may choose to delete the Request, or re-use it.
Clarify this in the comment by removing the claim that the Request is
deleted, and instead state that it is no longer managed by the pipeline
handler and must not be accessed further after this function returns.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <email@uajain.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Requests should only be completed from the RequestPending state.
Requests which are completed from the RequestCancelled or
RequestComplete state indicate that a double completion has occurred on
the Request, or that it has been used internally after it has been given
back to the application.
Ensure that this can be caught early if it occurs by enforcing the
required state with an assert.
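A sketch of the enforced precondition inside Request::complete(), not
the verbatim change:

void Request::complete()
{
    /* Completing from any other state means a double completion or
     * internal use after hand-back to the application. */
    ASSERT(status_ == RequestPending);

    status_ = cancelled_ ? RequestCancelled : RequestComplete;
    /* ... signal completion to the application ... */
}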
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <email@uajain.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When configuring streams, the camera HAL is supposed to update the
gralloc usage flags to reflect the operations it will need to do on the
stream buffers. Failure to do so leads to incorrect format selection by
gralloc for the HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED format, as
gralloc will not take into consideration the camera's need to access
the buffers.
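A hedged sketch of the usage-flag update in the stream configuration
path; the exact flag choice depends on how the HAL accesses the buffers:

for (unsigned int i = 0; i < stream_list->num_streams; ++i) {
    camera3_stream_t *stream = stream_list->streams[i];

    switch (stream->stream_type) {
    case CAMERA3_STREAM_OUTPUT:
        stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE;
        break;
    case CAMERA3_STREAM_INPUT:
        stream->usage |= GRALLOC_USAGE_HW_CAMERA_READ;
        break;
    }

    /* Add GRALLOC_USAGE_SW_READ/WRITE_OFTEN as well if the HAL maps
     * the buffers for CPU processing (e.g. JPEG encoding). */
}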
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Camera sensors can include an ISP. For instance, the AP1302 external ISP
can be connected to up to two raw camera sensors, and the combination of
the sensors and ISP is considered as a (smart) camera sensor from
libcamera's point of view.
The CameraSensor class has limited support for this already. Extend the
simple pipeline handler to support such sensors, by using the media
entity corresponding to the ISP instead of the raw camera sensor's
entity.
We don't need to handle the case where an entity in the SoC would expose
the MEDIA_ENT_F_PROC_VIDEO_ISP function, as pipelines containing an ISP
would have a dedicated pipeline handler.
The implementation is limited as it won't support other multi-entity
camera sensors (such as CCS). While this would be worth supporting, we
don't have a test platform with a CCS-compatible sensor at this point,
so let's not over-engineer the solution. Extending support to CCS (and
possibly other sensor topologies) will likely involve helpers that can
be used by other pipeline handlers (such as generic graph walk helpers
for instance) and extensions to the CameraSensor class.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
When walking the pipeline, follow the first link of each source pad.
This patch removes a redundant condition for choosing the link:
"(link->flags() & MEDIA_LNK_FL_ENABLED) ||
!(link->flags() & MEDIA_LNK_FL_IMMUTABLE)"
since it always evaluates to true.
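A sketch of the simplified walk step, using libcamera's
MediaPad/MediaLink API; the surrounding loop is omitted:

/* Follow the first link of the source pad, regardless of its flags. */
static MediaEntity *nextEntity(const MediaPad *sourcePad)
{
    if (sourcePad->links().empty())
        return nullptr;

    const MediaLink *link = sourcePad->links()[0];
    return link->sink()->entity();
}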
Signed-off-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Tested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When opening a V4L2VideoDevice multiple times, for instance to run
multiple jobs on a M2M device, it's useful to attribute log messages to
a particular instance. Include the device fd in the log prefix.
This turns the existing output
[1:43:01.958321522] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[cap]: Queueing buffer 0
[1:43:01.958350060] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[cap]: Queueing buffer 1
[1:43:01.958365137] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[cap]: Queueing buffer 2
into
[1:43:01.958321522] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[14:cap]: Queueing buffer 0
[1:43:01.958350060] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[14:cap]: Queueing buffer 1
[1:43:01.958365137] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[14:cap]: Queueing buffer 2
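A sketch of the resulting prefix construction; the member names and the
buffer-type suffix handling are assumptions:

std::string V4L2VideoDevice::logPrefix() const
{
    const char *type = V4L2_TYPE_IS_OUTPUT(bufferType_) ? "out" : "cap";
    return deviceNode() + "[" + std::to_string(fd()) + ":" + type + "]";
}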
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
The pipeline handler would enable and use the Unicam embedded data stream
even if the sensor did not support it. This was done to provide a means
of passing exposure and gain values for the frame to the IPA in a
synchronised way.
The recent changes that make the pipeline handler pass a ControlList
with exposure and gain values mean this is no longer required. Disable
the use of the embedded data stream when a sensor does not support it.
This change also removes the mappedEmbeddedBuffers_ map as it is no
longer used.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
With the recent change to pass a ControlList to the IPA with the
exposure and gain values used for a frame, RPiController::MdParserRPi is
no longer needed. Remove all traces of it.
The derived CamHelper objects now pass a nullptr for the parser to the
base CamHelper class when the sensor does not use metadata.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
When running with sensors that had no embedded data, the pipeline handler
would fill a dummy embedded data buffer with gain/exposure values, and
pass this buffer to the IPA together with the bayer buffer. The IPA would
extract these values for use in the controller algorithms.
Rework this logic entirely by having a new RPiCameraData::BayerFrame
queue replace the existing bayer queue. In addition to storing the
FrameBuffer pointer, this also stores, in a ControlList, all the
controls tracked by DelayedControls for that frame, including the
exposure and gain values. On signalling the
RPi::IPA_EVENT_SIGNAL_ISP_PREPARE IPA event, the pipeline handler now
passes this ControlList from the RPiCameraData::BayerFrame queue.
The IPA now extracts the gain and exposure values from the ControlList
instead of using RPiController::MdParserRPi to parse the embedded data
buffer.
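A sketch of the new queue element described above; the actual
definition lives in the Raspberry Pi pipeline handler:

struct BayerFrame {
    FrameBuffer *buffer;
    /* Controls tracked by DelayedControls for this frame, including
     * the exposure time and analogue gain. */
    ControlList controls;
};

std::queue<BayerFrame> bayerQueue_;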
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Camera sensors can include an ISP, which may be reported as a separate
entity from the pixel array in the media graph.
Support such sensors by accepting MEDIA_ENT_F_PROC_VIDEO_ISP as a valid
entity type. This allows using sensors that can be fully (or at least
meaningfully) configured through the ISP's source pad only. Sensors that
require further configuration, on the ISP sink pad and/or on the pixel
array's source pad, will require further extension to the CameraSensor
class.
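A sketch of the accepted entity functions; the constants come from
linux/media.h, the surrounding code is assumed:

switch (entity->function()) {
case MEDIA_ENT_F_CAM_SENSOR:
case MEDIA_ENT_F_PROC_VIDEO_ISP:
    /* Both are acceptable as the sensor's user-facing entity. */
    break;
default:
    return -EINVAL;
}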
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
When a sensor can upscale the image, the native sensor resolution isn't
equal to the largest size reported by the sensor. Use the active area
size instead, and default it to the largest enumerated size if the crop
rectangle targets are not supported.
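A hedged sketch of the fallback; getSelection() is libcamera's
V4L2Subdevice API, while the selection target and the other names are
assumptions:

Rectangle activeArea;
int ret = subdev->getSelection(0, V4L2_SEL_TGT_CROP_DEFAULT, &activeArea);
if (ret == 0)
    resolution = activeArea.size();
else
    /* Fall back to the largest size enumerated on the source pad. */
    resolution = enumeratedSizes.back();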
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The rkisp1 driver has received support for newer ISP versions, which
changes its userspace API and ABI. Adapt to the API change. This
requires kernel v5.11 or newer, or backporting the corresponding rkisp1
changes to older kernels.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Implement and expose the symbol and functions that the new cros camera
API requires. Since we don't actually need them, leave them empty.
Update meson accordingly.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Introduce the CameraBuffer backend for the Chromium OS operating system
and the associated meson option.
The Chromium OS CameraBuffer implementation uses the
cros::CameraBufferManager class to map single-plane and multi-plane
buffers and to retrieve size information.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|