camera_device.cpp doesn't use the PostProcessor class, so the
post_processor.h header shouldn't be included. Removing it causes a
compilation failure as the CameraBuffer class is no longer defined, so
include camera_buffer.h instead.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Rename CameraMetadata::get() to CameraMetadata::getMetadata()
to avoid confusion with std::unique_ptr::get() when CameraMetadata
is used with a std::unique_ptr.
No functional changes intended in this patch.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
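To illustrate the ambiguity this rename avoids, here is a minimal
self-contained sketch (the CameraMetadata stand-in below is hypothetical,
not the actual libcamera class):

#include <memory>

class CameraMetadata
{
public:
    /* After the rename: returns the wrapped metadata buffer. */
    const void *getMetadata() const { return buffer_; }

private:
    const void *buffer_ = nullptr;
};

int main()
{
    auto metadata = std::make_unique<CameraMetadata>();

    /*
     * metadata.get() returns the CameraMetadata pointer owned by the
     * unique_ptr, while metadata->getMetadata() returns the wrapped
     * buffer. With the old name, metadata->get() and metadata.get()
     * were easy to confuse at a glance.
     */
    CameraMetadata *owner = metadata.get();
    const void *raw = metadata->getMetadata();

    return owner != nullptr && raw == nullptr ? 0 : 1;
}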
|
|
There's no need for the move constructor and the destructor to be
inline. Define them explicitly, with default implementations. This
allows usage of the CameraStream class without a complete definition of
the PostProcessor class.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
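A minimal sketch of the pattern (condensed into one file, with
hypothetical member names): declaring the special member functions in the
header and defaulting them where the complete type is visible lets users
of the class work with only a forward declaration of PostProcessor.

#include <memory>
#include <utility>

class PostProcessor;    /* forward declaration is enough for users */

class CameraStream
{
public:
    CameraStream();
    CameraStream(CameraStream &&other);
    ~CameraStream();

private:
    std::unique_ptr<PostProcessor> postProcessor_;
};

/* In the .cpp file, where post_processor.h provides the complete type. */
class PostProcessor
{
};

CameraStream::CameraStream() = default;
CameraStream::CameraStream(CameraStream &&other) = default;
CameraStream::~CameraStream() = default;

int main()
{
    CameraStream stream;
    CameraStream moved = std::move(stream);
    return 0;
}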
|
|
The camera HAL API requires any acquire fence that hasn't been waited
on to be sent back to the framework as a release fence. The
CameraDevice already copies the acquire fence to the release fence when
signaling request completion, but the CameraStream incorrectly closes
the fence when a wait fails and sets it to -1. Fix it.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
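The intended behaviour, as a stand-alone sketch with stand-in types (not
the actual CameraStream code): on a failed wait the acquire fence must be
kept so it can be handed back as the release fence.

#include <unistd.h>

struct StreamBuffer {
    int acquireFence = -1;  /* owned by the HAL until consumed or returned */
    int releaseFence = -1;
};

int processBuffer(StreamBuffer &buffer, bool waitSucceeded)
{
    if (!waitSucceeded) {
        /*
         * Keep the acquire fence: request completion copies it to the
         * release fence so the framework gets it back. The old code
         * closed it here and set it to -1, losing the fence.
         */
        return -1;
    }

    /* The fence has been waited on and is consumed. */
    if (buffer.acquireFence >= 0) {
        close(buffer.acquireFence);
        buffer.acquireFence = -1;
    }

    return 0;
}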
|
|
The camera3_stream_buffer_t structure is meant to communicate between
the camera service and the HAL. These are short-lived structures that
don't outlive the .process_capture_request() operation (when queuing
requests) or the .process_capture_result() callback.
We currently store copies of the camera3_stream_buffer_t passed to
.process_capture_request() in Camera3RequestDescriptor::StreamBuffer to
keep the structure members that the HAL needs, and reuse them when
calling the .process_capture_result() callback. This is conceptually not
right, as the camera3_stream_buffer_t instances passed to the callback
are not the same objects as the ones received in
.process_capture_request().
Store individual fields of the camera3_stream_buffer_t in StreamBuffer
instead of copying the whole structure. This gives the HAL full control
of how data is stored, and properly decouples request queueing from
result reporting.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
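The shape of the change, sketched with stand-in types (the actual
StreamBuffer has more members and different details):

struct camera3_stream;      /* stand-in for the HAL stream type */
struct buffer_handle;       /* stand-in for buffer_handle_t */

/*
 * Keep only the fields of camera3_stream_buffer_t that the HAL needs,
 * instead of a copy of the short-lived structure itself.
 */
struct StreamBuffer {
    camera3_stream *stream = nullptr;       /* HAL stream the buffer belongs to */
    buffer_handle *camera3Buffer = nullptr; /* framework buffer handle */
    int acquireFence = -1;                  /* fence to wait on before use */
    int status = 0;                         /* buffer status for the result */
};

int main()
{
    StreamBuffer buffer;
    return buffer.acquireFence == -1 ? 0 : 1;
}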
|
|
Call abortRequest() in CameraDevice::requestComplete() instead of
open-coding it.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The camera3_stream_t instances are used to interact with the camera
service, whose API uses non-const pointers. Replace the const reference
returned by CameraStream::camera3Stream() with a non-const pointer. It
turns out that nobody calls this function, but new users will be
introduced in subsequent commits.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Now that we have a proper structure to model a stream buffer, pass it to
CameraStream::process() instead of the camera3_stream_buffer_t. This
will allow accessing other members of StreamBuffer in subsequent
commits.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The Camera3RequestDescriptor structure stores, for each stream, the
camera3_stream_buffer_t and the libcamera FrameBuffer in two separate
vectors. This complicates buffer handling, as the code needs to keep
both vectors in sync. Create a new structure to group all data about
per-stream buffers to simplify this.
As a side effect, we need to create a local vector of
camera3_stream_buffer_t in CameraDevice::sendCaptureResults() as the
camera3_stream_buffer_t instances stored in the new structure in
Camera3RequestDescriptor are not contiguous anymore. This is a small
price to pay for easier handling of buffers, and will be refactored in
subsequent commits anyway.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
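A sketch of the side effect described above, with stand-in types: the
capture result needs a contiguous array, so one is assembled locally from
the per-stream entries.

#include <vector>

struct camera3_stream_buffer {
    int stream;
    int buffer;
};

/* Per-stream data grouped in the descriptor (simplified). */
struct StreamBuffer {
    camera3_stream_buffer buffer;
    /* ... libcamera FrameBuffer and other per-stream state ... */
};

/* Build the contiguous array the capture result expects. */
std::vector<camera3_stream_buffer>
gatherResultBuffers(const std::vector<StreamBuffer> &buffers)
{
    std::vector<camera3_stream_buffer> resultBuffers;
    resultBuffers.reserve(buffers.size());

    for (const StreamBuffer &streamBuffer : buffers)
        resultBuffers.push_back(streamBuffer.buffer);

    return resultBuffers;
}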
|
|
The data (or broader context) required for post-processing of a camera
request is stored in the Camera3RequestDescriptor. Instead of passing
individual arguments to CameraStream::process(), pass the
Camera3RequestDescriptor pointer to it. All the arguments necessary to
run the post-processor can be accessed from the descriptor.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
The camera3_capture_result_t is only needed to convey capture results to
the camera service through the process_capture_result() callback.
There's no need to store it in the Camera3RequestDescriptor. Build it
dynamically in CameraDevice::sendCaptureResults() instead.
This requires storing the result metadata created in
CameraDevice::requestComplete() in the Camera3RequestDescriptor. A side
effect of this change is that the request metadata lifetime will match
the Camera3RequestDescriptor instead of being destroyed at the end of
requestComplete(). This will be needed to support asynchronous
post-processing, where the request completion will be signaled to the
camera service asynchronously from requestComplete().
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The Camera3RequestDescriptor structure is growing into an object with
member functions. Turn it into a class, uninline the destructor to
reduce code size, explicitly disable copy as requests are not copyable,
and delete the default constructor to force all instances to be fully
constructed.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
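The resulting shape of the class, sketched (member list trimmed, copy
disabling spelled out explicitly):

class Camera3RequestDescriptor
{
public:
    Camera3RequestDescriptor(unsigned int frameNumber)
        : frameNumber_(frameNumber)
    {
    }

    /* Defined out of line to reduce generated code size. */
    ~Camera3RequestDescriptor();

    /* Requests are not copyable. */
    Camera3RequestDescriptor(const Camera3RequestDescriptor &) = delete;
    Camera3RequestDescriptor &operator=(const Camera3RequestDescriptor &) = delete;

    /* No default constructor: instances must be fully constructed. */
    Camera3RequestDescriptor() = delete;

private:
    unsigned int frameNumber_;
};

Camera3RequestDescriptor::~Camera3RequestDescriptor() = default;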
|
|
Camera3RequestDescriptor is a utility structure that groups information
about a capture request. It can be, and will be, extended to preserve
the overall context of a capture. Since the context of a capture needs
to be shared with other classes (e.g. CameraStream), having a private
definition of the struct in the CameraDevice class doesn't help.
Hence, de-scope the structure so that it can be shared with other
components (through references or pointers). Splitting the structure
into a separate file will help avoid circular dependencies when using it
throughout the HAL implementation.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
When the camera HAL detects an out-of-order completion of a request, it
sends a CAMERA3_MSG_ERROR_DEVICE error to the camera framework.
Such an error not only forces the service to close the camera as
prescribed by the camera3 specification, but in some implementations
(specifically the ChromeOS one) it causes the camera service to abort
and exit.
This prevents any error messages from being printed by libcamera, as the
library gets terminated before getting to that point, and also hides the
error messages that lead to the out-of-order completion, making it
impossible to tell from the output log what happened.
Move the call to notifyError() to the end of the error path and demote
the error message from Fatal to LogLevels::Error to let the service
implementation decide how to handle CAMERA3_MSG_ERROR_DEVICE errors.
Before this patch, when waiting on a fence fails and the capture
request is not queued to the Camera, we get an out-of-order completion
but no backtrace. With this patch applied the error path is visible:
ERROR HAL camera_worker.cpp:122 Failed waiting for fence: 82: Timer expired
ERROR HAL camera_device.cpp:1110 '\_SB_.PCI0.I2C2.CAM0': Out-of-order completion for request 0x00007e6de4004c70
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
requests"
Commit d165f7da34b8 ("android: camera_device: Configure one stream for
identical stream requests") introduced the ability to generate through
post-processing YUV streams of identical size and format.
However, the change didn't fully take into account the situation where
only mapped streams are contained in the request submitted by the
camera service to the HAL. In this case the Request will be queued with
no buffers and refused by the Camera.
Even if this seems like a corner case, it causes a few CTS tests to
fail, and more problematically it triggers out-of-order completion of
requests, causing the camera service to abort.
ERROR Camera camera.cpp:1031 Request contains no buffers
ERROR HAL camera_device.cpp:1109 '\_SB_.PCI0.I2C2.CAM0': Out-of-order completion for request 0x00007a1f1800ccd0
ERROR cros_camera_service[15706:15711]: [camera_device_adapter.cc(744)] (15711) Notify(): Fatal device error; aborting the camera service
Revert the commit until a proper solution is implemented.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Limit the reported minimum frame duration to 30 FPS.
The reason to do so is to bring the libcamera HAL on par with the Intel
HAL implementation on the IPU3 platform, where 30 FPS is the frame rate
used to perform quality tuning in the closed-source IPA module and has
been validated as the most efficient rate for the power/performance
budget.
This change brings into the HAL a platform-specific constraint, which
might be opportune for most platforms but should rather be configurable
by system integrators. Record that with a \todo entry.
Also record that, even if we report a lower frame rate, we currently do
not limit what the camera actually produces.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
As reported by the CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES documentation
in the Android developer reference:
"For devices advertising any color filter arrangement other than NIR, or
devices not advertising color filter arrangement, this list will always
include (min, max) and (max, max) where min <= 15 and max = the maximum
output frame rate of the maximum YUV_420_888 output size."
Collect the highest FPS of the largest YUV stream and use it, together
with the minimum FPS the camera can produce, to populate the
ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES static metadata.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
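A hedged sketch of how such ranges could be assembled (hypothetical
helper, not the actual CameraCapabilities code):

#include <cstdint>
#include <vector>

/*
 * Build the (min, max) and (max, max) entries described above from the
 * camera's minimum FPS and the maximum FPS of the largest YUV stream,
 * in the flattened pairs-of-int32 layout used by
 * ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES.
 */
std::vector<int32_t> buildAeAvailableFpsRanges(int32_t minFps, int32_t maxYuvFps)
{
    std::vector<int32_t> ranges;

    /* (min, max): full range, suitable for preview. */
    ranges.push_back(minFps);
    ranges.push_back(maxYuvFps);

    /* (max, max): fixed frame rate, suitable for video recording. */
    ranges.push_back(maxYuvFps);
    ranges.push_back(maxYuvFps);

    return ranges;
}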
|
|
The ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS and
ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS static metadata are
populated by looping on the streamConfigurations_ vector.
Unify them in a single loop to avoid repeating it.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add a debug statement to print out the list of collected output streams
and their characteristics.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Register as preview streams only streams capable of producing at least
30 FPS.
This requirement comes from inspecting the existing HAL implementation
on the Intel IPU3 platform and from the CTS RecordingTests results.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
While building the list of supported stream configurations, also
collect the absolute maximum frame duration, to be used to populate the
sensor maximum frame duration.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
We currently hardcode 2560x1920@30FPS as the only stalling frame
duration. This is of course not correct, and all the information
required to properly populate the
ANDROID_SCALER_AVAILABLE_STALL_DURATIONS static metadata is available
from initializeStaticMetadata().
Use the collected stalling durations and sizes to properly populate the
static property.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Use the per-configuration stream durations as collected during
initializeStreamConfigurations() to populate the
ANDROID_SCALER_AVAILABLE_STREAM_CONFIGURATIONS_OUTPUT static metadata.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
As we now collect the per-stream frame durations at
initializeStreamConfigurations() time, the Camera is guaranteed to
support the controls::FrameDurationLimits control.
Remove the check for its presence when populating the
ANDROID_CONTROL_AE_AVAILABLE_TARGET_FPS_RANGES static metadata.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Now that building the list of supported stream configurations requires
applying a configuration to the Camera, re-initialize the camera
controls by applying a configuration generated for the Viewfinder stream
role before building the list of static metadata.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Collect the per-stream frame durations while building the list of
supported stream formats and resolutions.
In order to get an updated list of controls it is necessary to apply the
configuration we're testing to the Camera, where previously it was only
validated.
The per-configuration durations will be used to populate the Android
ANDROID_SCALER_AVAILABLE_MIN_FRAME_DURATIONS static metadata.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The libcamera Android Camera HAL generates camera configurations for
the StillCapture, Raw and ViewFinder stream roles, but only checks
whether configuration generation failed for the StillCapture stream
role. This could lead to a NULL pointer dereference if a pipeline
handler fails to generate a default configuration for one of the other
two stream roles.
Signed-off-by: Javier Martinez Canillas <javierm@redhat.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The problem happens because a buffer associated with a CameraStream
(depending on the CameraStream::Type) is added to the Request in
CameraDevice::processCaptureRequest().
However, when the camera stops, all the current buffers are marked with
FrameMetadata::FrameCancelled and proceed to completion. But the buffer
associated with the CameraStream (that was previously added to the
request) has by then been cleared out as part of streams_.clear(), even
before the camera stop() has been invoked. Any access to those request
buffers after they have been cleared will result in a crash.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Creating a CameraBuffer instance doesn't map memory. Fix the error
message accordingly.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
CameraStream always sets the format of the post-processor output buffer
to MJPEG, regardless of the actual destination format. Fix this.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
There is a possibility that an out-of-order completion of a capture
request happens by calling process_capture_result() directly on error
paths. The framework expects errors to be notified as soon as possible,
but the request completion order should remain intact.
An existing instance of this is abortRequest(), which sends the capture
results in the flushing state without considering the order of
completion.
Since we have a queue of Camera3RequestDescriptor tracking each capture
request placed by the framework to the libcamera HAL, we should only
send back capture results from a single location, by inspecting the
queue. With this patch, this now happens in
CameraDevice::sendCaptureResults().
Each descriptor is now equipped with its own status to denote whether
the capture request is complete and ready to be sent back to the
framework or needs to be waited upon. This ensures that the order of
completion is respected for the requests.
Since we are fixing out-of-order request completion in abortRequest(),
change the function to read from the Camera3RequestDescriptor directly
instead of camera3_capture_request_t. The descriptor has all the
information necessary to set the request buffers' state to error.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
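The core of the idea, as a simplified sketch with stand-in types: results
are only emitted from the front of the queue, and only once the front
descriptor has left the pending state.

#include <deque>
#include <memory>
#include <utility>

struct Descriptor {
    enum class Status { Pending, Success, Error };
    Status status = Status::Pending;
};

class Device
{
public:
    void sendCaptureResults()
    {
        while (!descriptors_.empty() &&
               descriptors_.front()->status != Descriptor::Status::Pending) {
            std::unique_ptr<Descriptor> d = std::move(descriptors_.front());
            descriptors_.pop_front();

            /* Build and send the camera3 capture result for d here. */
        }
    }

private:
    std::deque<std::unique_ptr<Descriptor>> descriptors_;
};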
|
|
abortRequest() and notifyError() do not modify any members of
CameraDevice, hence these functions can be const.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The descriptors_ map holds the Camera3RequestDescriptor instances that
describe the capture requests placed by the framework to the libcamera
HAL. CameraDevice::requestComplete() looks for the descriptor whose
camera request has completed and removes it from the map.
Since the requests are placed in FIFO order and the framework expects
the completion order to be FIFO as well, a queue is a better fit than a
std::map.
This patch keeps the lifetime of Camera3RequestDescriptor the same as
before, i.e. bound to requestComplete(). Previously, a descriptor was
extracted from the map and its lifetime was bound to requestComplete().
The lifetime is kept the same by manually calling .pop_front() on the
queue. In a subsequent commit, this is likely to change with a
centralized location for dropping descriptors from the queue on request
completion.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Use the Camera3RequestDescriptor as the cookie for the capture Request.
The cookie is used to look up the descriptors_ map in
CameraDevice::requestComplete(). The map will be transformed into a
queue in a subsequent commit.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Acquire fences for streams of type Mapped generated by
post-processing are not correctly handled and are currently
ignored by the camera HAL.
Fix this by adding CameraStream::waitFence(), executed before
starting the post-processing in CameraStream::process().
The change applies to all streams generated by post-processing (Mapped
and Internal) but currently acquire fences of Internal streams are
handled by the camera worker. Postpone that to post-processing time by
passing -1 to the Worker for Internal streams.
Also correct the release_fence handling for failed captures, as the
framework requires the release fences to be set to the acquire fence
value if the acquire fence has not been waited on.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Tested-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
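One possible shape of such a fence wait, as a sketch under the assumption
that the fence is a sync file descriptor that becomes readable when
signaled (this is not the actual CameraStream::waitFence()):

#include <cerrno>
#include <poll.h>

int waitFence(int fence, int timeoutMs)
{
    if (fence < 0)
        return 0;   /* no fence to wait on */

    struct pollfd fds = {};
    fds.fd = fence;
    fds.events = POLLIN;

    int ret = poll(&fds, 1, timeoutMs);
    if (ret < 0)
        return -errno;
    if (ret == 0)
        return -ETIME;  /* timed out, the fence was not consumed */

    return 0;
}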
|
|
The Camera3RequestDescriptor containing the capture request is added to
the descriptors_ map after a call to CameraWorker::queueRequest(). This
is a race condition since CameraWorker::queueRequest() queues requests
to libcamera::Camera asynchronously. The requests may thus complete
before they get added to descriptors_, in which case requestComplete()
will fail to lookup the request in the map.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
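A minimal sketch of the corrected ordering, with stand-in types (the real
code uses Camera3RequestDescriptor and CameraWorker):

#include <cstdint>
#include <map>
#include <memory>
#include <mutex>
#include <utility>

struct Descriptor {
    uint64_t cookie;
};

struct Worker {
    /* Queues the request to the camera asynchronously. */
    void queueRequest(uint64_t /*cookie*/) {}
};

class Device
{
public:
    void processCaptureRequest(std::unique_ptr<Descriptor> descriptor)
    {
        uint64_t cookie = descriptor->cookie;

        /* Store the descriptor first... */
        {
            std::lock_guard<std::mutex> lock(mutex_);
            descriptors_[cookie] = std::move(descriptor);
        }

        /*
         * ...and only then queue the request: if it completes
         * immediately on another thread, requestComplete() will find
         * the descriptor in the map.
         */
        worker_.queueRequest(cookie);
    }

private:
    std::mutex mutex_;
    std::map<uint64_t, std::unique_ptr<Descriptor>> descriptors_;
    Worker worker_;
};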
|
|
Thumbnail generation currently doesn't occur because
ANDROID_JPEG_THUMBNAIL_SIZE is not set in the request metadata passed
to PostProcessorJpeg::process(). Commit 1264628d3c92 ("android:
jpeg: Configure thumbnailer based on request metadata") introduced
the mechanism to retrieve the thumbnail size from the request metadata,
however it didn't add the counterpart, i.e. inserting the size in the
request metadata template in the first place.
This patch fixes the issue by setting ANDROID_JPEG_THUMBNAIL_SIZE in
the request metadata template populated by
CameraCapabilities::requestTemplatePreview(). The value of
ANDROID_JPEG_THUMBNAIL_SIZE is set to the first non-zero size
reported by the ANDROID_JPEG_AVAILABLE_THUMBNAIL_SIZES static metadata.
Fixes: 1264628d3c92 ("android: jpeg: Configure thumbnailer based on request metadata")
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
CameraMetadata's constructor takes the number of entries and the number
of bytes to be allocated for those entries. However, CameraMetadata is
already capable of resizing its container on the fly in case more
entries are added to it. Hence, the numbers passed in during
construction act as hint values for initialization.
Clarify this in CameraCapabilities::requestTemplatePreview() and remove
the \todo, as the arguments and the \todo give the impression that we
need to be quite accurate with the numbers of entries / bytes, which is
not the case.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Returning a non-managed pointer can cause leaks. Use a unique_ptr<>
instead to avoid possible future issues.
The std::move() for the planes argument to the FrameBuffer constructor
is dropped as it's misleading. FrameBuffer has no constructor that takes
an rvalue reference to planes, so the vector was copied despite the
move. This only clarifies the intent, no functional change is
introduced.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
An Android HAL client may request multiple identical streams. It is
redundant that a native camera device produces a separate stream for
each of the identical requests. Configure the camera with a single
stream in that case. The other identical HAL streams will be produced by
the YUV post-processor.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
CameraStream creates PostProcessorYuv if the destination format
is NV12.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
CameraStream creates the PostProcessor and FrameBufferAllocator in its
constructor, and assumes that the post-processor in use is the JPEG
post-processor. Since we need to support various post-processors, move
the creation to configure() so that an error code can be returned if no
suitable post-processor is found.
This also moves the FrameBufferAllocator and Mutex creation for
consistency.
Signed-off-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Rectify the variable naming style for YPlaneSize and UVPlaneSize.
libcamera uses camelCase, where the first letter should be lower case.
Fixes: e355ca0087cd9 ("android: jpeg: Split and pass the thumbnail planes to encoder")
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The YUV post-processor doesn't need any reference to the CameraDevice
instance. Remove it.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
After multi-planar support was introduced for JPEG encoding as well,
EncoderLibJpeg::encode() expects a vector of planes as the source
framebuffer to be encoded. Currently, we are passing a contiguous buffer
which is treated as a single plane (instead of two, as the thumbnail is
NV12). Hence, split the thumbnail data into its respective planes
according to the NV12 layout.
This fixes a crash when encoding thumbnails.
Fixes: 894ca69f6043 ("android: jpeg: Support multi-planar buffers")
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
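A self-contained sketch of the split for a contiguous, unpadded NV12
thumbnail (buffer layout only, not the actual encoder interface):

#include <cstddef>
#include <cstdint>
#include <utility>
#include <vector>

std::pair<std::vector<uint8_t>, std::vector<uint8_t>>
splitNV12(const std::vector<uint8_t> &nv12, unsigned int width, unsigned int height)
{
    /* Y plane: width * height bytes, followed by interleaved UV at half that size. */
    const size_t ySize = static_cast<size_t>(width) * height;
    const size_t uvSize = ySize / 2;

    std::vector<uint8_t> y(nv12.begin(), nv12.begin() + ySize);
    std::vector<uint8_t> uv(nv12.begin() + ySize, nv12.begin() + ySize + uvSize);

    return { std::move(y), std::move(uv) };
}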
|
|
Failures can still occur in the CameraBufferManager during Unlock()
and/or Deregister() of camera3Buffer handles. We should be logging those
errors as well.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The JPEG post-processor uses MappedFrameBuffer to access pixel data, but
only uses data from the first plane. Pass the vector of planes to the
encode() function to correctly handle multi-planar formats (currently
limited to NV12).
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Acked-by: Umang Jain <umang.jain@ideasonboard.com>
Tested-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When calculating the luma line address, the image width is used instead
of the stride. Without padding at the end of the line the values should
be identical, but this is conceptually incorrect in any case. Fix it.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
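As an illustration (hypothetical addressing helper):

#include <cstddef>
#include <cstdint>

/*
 * The start of luma line 'line' must be computed with the line stride,
 * not the image width; the two only coincide when lines carry no
 * padding.
 */
const uint8_t *lumaLineAddress(const uint8_t *luma, unsigned int stride,
                               unsigned int line)
{
    return luma + static_cast<size_t>(line) * stride;  /* not line * width */
}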
|
|
Now that libcamera correctly supports frame buffers with different
dmabuf for each plane, remove the assumption that a single dmabuf is
used.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add two helper functions to the PixelFormatInfo class to compute the
byte size of a given plane, taking the frame size, the stride, the
alignment constraints and the vertical subsampling into account.
Use the new functions through the code base to replace manual
implementations.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
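A simplified stand-alone version of the computation (not the exact
PixelFormatInfo API, and ignoring extra alignment constraints):

#include <cstddef>

/*
 * The byte size of a plane is its stride multiplied by the number of
 * lines it holds, where the number of lines is the frame height divided
 * by the plane's vertical subsampling factor (rounded up).
 */
size_t planeSize(unsigned int height, unsigned int stride,
                 unsigned int verticalSubSampling)
{
    unsigned int lines = (height + verticalSubSampling - 1) / verticalSubSampling;

    return static_cast<size_t>(stride) * lines;
}

/*
 * Example: NV12 at 1920x1080 with a 1920-byte stride has a Y plane of
 * 1920 * 1080 bytes (subsampling 1) and a UV plane of 1920 * 540 bytes
 * (vertical subsampling 2).
 */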
|