|
When updating the controls, the calculation for minCrop incorrectly
indents the parameters to scaledBy().
Fix it.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Provide a constructor for StreamBuffer and use it when populating the
Camera3RequestDescriptor::buffers_ vector. Also provide the default
move constructor (required as StreamBuffer is stored in a vector in
Camera3RequestDescriptor) and destructor for the StreamBuffer struct,
declare a default move assignment operator, and explicitly disable the
copy constructor and copy assignment operator with
LIBCAMERA_DISABLE_COPY().
While at it, initialize the pointer members of the StreamBuffer struct
to nullptr, with StreamBuffer::status set to Status::Success by default.
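For illustration only, a minimal sketch of the resulting move-only
pattern (the member types and names here are placeholders, and
DISABLE_COPY() is a local stand-in for libcamera's
LIBCAMERA_DISABLE_COPY()):

  #include <vector>

  /* Local stand-in mirroring what LIBCAMERA_DISABLE_COPY() does. */
  #define DISABLE_COPY(klass) \
      klass(const klass &) = delete; \
      klass &operator=(const klass &) = delete;

  struct StreamBuffer {
      enum class Status { Success, Error };

      StreamBuffer(int *src, int *dst)
          : srcBuffer(src), dstBuffer(dst) {}
      ~StreamBuffer() = default;

      /* Movable, as required for storage in a std::vector... */
      StreamBuffer(StreamBuffer &&) = default;
      StreamBuffer &operator=(StreamBuffer &&) = default;

      /* ...but not copyable. */
      DISABLE_COPY(StreamBuffer)

      /* Pointer members default to nullptr, status to Success. */
      int *srcBuffer = nullptr;
      int *dstBuffer = nullptr;
      Status status = Status::Success;
  };

  void example()
  {
      std::vector<StreamBuffer> buffers_;
      int src = 0, dst = 0;
      buffers_.emplace_back(&src, &dst); /* constructed in place */
  }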
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
cros::CameraBufferManager can be nullptr if there is an error in
its creation. Add a null check to guard against this.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The IPU3 IPA has three events which are handled from the pipeline
handler.
The events are received in the sequence EventProcessControls,
EventFillParams, and finally EventStatReady, while the code lists them
in a different order.
Update the flow of IPAIPU3::processEvent() to match the expected
sequence of events, to help support the reader in interpreting the flow
of events through the IPA.
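A simplified sketch of the intended ordering (the enumerators follow
the event names above; the real signature and payloads are omitted):

  enum IPU3Event {
      EventProcessControls,
      EventFillParams,
      EventStatReady,
  };

  void processEvent(IPU3Event op)
  {
      switch (op) {
      case EventProcessControls:
          /* 1. New request: process the incoming controls. */
          break;
      case EventFillParams:
          /* 2. Fill the ImgU parameter buffer for the frame. */
          break;
      case EventStatReady:
          /* 3. Statistics are ready: run the algorithms. */
          break;
      }
  }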
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The isContiguous debug message is inverted.
Correct the logic.
Reported-by: Roman Stratiienko <roman.o.stratiienko@globallogic.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The aspect ratio calculation divides two integer values and then casts
the result to a double.
This might reduce precision when scoring aspect ratio differences.
Fix this by casting one integer operand to a double before the division.
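For example, with a made-up scoring helper (not the actual Raspberry Pi
code):

  #include <cmath>

  static double aspectRatioScore(unsigned int width, unsigned int height,
                                 double targetRatio)
  {
      /* Wrong: width / height truncates before the cast. */
      /* double ratio = static_cast<double>(width / height); */

      /* Right: promote one operand first, then divide. */
      double ratio = static_cast<double>(width) / height;

      return std::abs(ratio - targetRatio);
  }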
Reported-by: Coverity CID=361652
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The necessary tuning file and CamHelper are added for the imx519 sensor.
The imx519 is a 16MP rolling shutter sensor. To enable it, add
dtoverlay=imx519
to the /boot/config.txt file and reboot the Pi.
Signed-off-by: Lee Jackson <info@arducam.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The unicam driver no longer registers an embedded data node if the sensor does
not provide this stream. Account for this in the pipeline handler match routine
by not assuming it is always present.
Add a warning if Unicam and the CamHelper do not agree on the presence of sensor
embedded data, and disable its usage in these cases.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
With the recent change to program the sensor device directly, the notion
of packed vs unpacked modes is no longer relevant, since that is a
Unicam format construct. Remove any scoring based on packed/unpacked modes.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Switch the pipeline handler to use the new Unicam media controller based driver.
With this change, we directly talk to the sensor device driver to set controls
and set/get formats in the pipeline handler.
This change requires the accompanying Raspberry Pi linux kernel change at
https://github.com/raspberrypi/linux/pull/4645. If this kernel change is not
present, the pipeline handler will fail to run with an error message informing
the user to update the kernel build.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Sensor flips might change the Bayer order of the requested format. The existing
code would set a sensor format along with the appropriate Unicam and ISP input
formats, but reset the latter two on start() once the flips had been requested.
We can now set the sensor flips just before we set the sensor mode in
configure(), thereby not needing the second pair of format sets in start().
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add BayerFormat::toPixelFormat() and BayerFormat::fromPixelFormat() helper
functions to convert between BayerFormat and PixelFormat types.
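Hypothetical usage sketch, assuming fromPixelFormat() is a static member
and toPixelFormat() a const member; the header path and the formats::
identifier are illustrative:

  #include <libcamera/formats.h>

  #include "libcamera/internal/bayer_format.h"

  using namespace libcamera;

  void example()
  {
      BayerFormat raw = BayerFormat::fromPixelFormat(formats::SRGGB10_CSI2P);
      PixelFormat pix = raw.toPixelFormat();
      (void)pix;
  }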
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
[Kieran: Minor checkstyle fix]
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Add BayerFormat conversions for the formats::R10 (10-bit unpacked) format.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Rename the bayerToV4l2 conversion table to bayerToFormat. Update the table to
hold both the PixelFormat and V4L2PixelFormat conversions for a given
BayerFormat. This will allow converting between BayerFormat and PixelFormat
types in a subsequent change.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The unscoped enum BayerFormat::Packing leads to usage of the ambiguous
BayerFormat::None enumerator. Turn the enumeration into a scoped enum to
force usage of BayerFormat::Packing::None, and drop the now redundant
"Packed" suffix for the CSI2 and IPU3 packing.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This new format corresponds to the V4L2 V4L2_PIX_FMT_Y10P format, and is a
CSI2-packed version of the DRM_FORMAT_R10 format.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
These new formats correspond to the V4L2 V4L2_PIX_FMT_Y10 and
V4L2_PIX_FMT_Y12 formats, and are the little-endian versions of the
DRM_FORMAT_R10 and DRM_FORMAT_R12 formats.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The PixelFormat::toString() doxygen documentation has two \return
statements.
Remove the redundant one.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
V4L2Capability has helpers to interrogate the capabilities of a
device.
V4L2VideoDevice::enumPixelformats accesses the raw capabilities to check
if the device is supported by a MediaController device.
Provide a helper, and update the usage.
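A hypothetical shape for such a helper, assuming the MediaController
capability maps to V4L2_CAP_IO_MC (not the actual implementation):

  #include <linux/videodev2.h>

  static bool usesMediaController(const struct v4l2_capability &caps)
  {
      __u32 deviceCaps = (caps.capabilities & V4L2_CAP_DEVICE_CAPS)
                       ? caps.device_caps : caps.capabilities;
      return deviceCaps & V4L2_CAP_IO_MC;
  }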
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
In case the maximum exposure received from the sensor is very high, we
can have a very high shutter speed with a small analogue gain, which
may result in a very slow framerate. We do not really support this for
the moment, so clamp the shutter speed to an arbitrary value of 60ms.
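Conceptually, with placeholder names and plain std::chrono types:

  #include <algorithm>
  #include <chrono>

  using namespace std::chrono_literals;

  static std::chrono::microseconds
  clampShutter(std::chrono::microseconds maxSensorExposure)
  {
      /* Arbitrary 60ms ceiling to avoid a very slow framerate. */
      constexpr std::chrono::microseconds kMaxShutterSpeed = 60ms;
      return std::min(maxSensorExposure, kMaxShutterSpeed);
  }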
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Tested-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The IPAFrameContext uses unnamed structures to group items. Doxygen
doesn't seem to support this properly: documentation isn't generated
correctly and warnings are output during compilation. Suppress the
warning with a workaround that still results in incorrect generated
documentation until Doxygen gets fixed.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
[JMH: Fix doxygen variable usage]
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
While the stop() function does not currently perform any action, it forms
part of the IPA interface and is a public function in the class.
Promote it to a full (but basic) function implementation and begin the
documentation accordingly so that there is an appropriate stub to
perform stop operations if they come up.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Doxygen warns us because the structures are referenced as \struct while
they should be \var. Fix it.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The IPU3 IPA core is growing with additional documentation. The
ipa_context documentation is stored here, but it pushes the IPU3
documentation and implementation further from the head of the file.
Furthermore, the ipa_context documentation is outside of the ipa::ipu3
namespace and isn't identified correctly by Doxygen.
Move the ipa_context documentation to its own compilation object: even
though there isn't any code, this maintains consistency with our
documentation model.
Correctly re-introduce the documentation into the libcamera::ipa::ipu3
namespace during the move.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The struct RGB and struct AwbStatus are used only by the internal
implementation of the AWB algorithm module.
Move them into the private class declaration.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The AWB AwbStatus structure is contained within the Awb class.
Fix the Doxygen reference so that it can be found.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The ipa_context.h entry incorrectly referenced its file name.
Fix it.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The IPU3 IPA implements the basic 3A using the ImgU ISP.
Provide an overview document describing its operation, along with a
block diagram to help visualise how the components fit together, to
assist any new developers exploring the code.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
The tone mapping algorithm is currently undocumented.
Provide an introduction and overview of the implementation at the class
definition, and document how the algorithm operates in the process and
prepare methods.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Let the algorithm perform its initial configuration. Implement
configure() to set a default gamma value and let process() do the
updates needed.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The tone mapping algorithm calculates the gamma curve for every frame,
regardless of whether the gamma value has changed or not. This issue is
exacerbated as we currently hardcode the gamma to a single value.
Optimise the implementation to only recalculate the lookup table when
the gamma setting changes, and store the gamma setting of the LUT
curve as part of the IPA context.
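The caching idea, sketched with placeholder names and an illustrative
gamma curve:

  #include <cmath>
  #include <cstddef>
  #include <vector>

  struct ToneMappingContext {
      double gamma = 0.0;              /* gamma used for the current LUT */
      std::vector<double> gammaCorrection;
  };

  static void updateLut(ToneMappingContext &ctx, double gamma,
                        std::size_t lutSize)
  {
      if (ctx.gamma == gamma)
          return;                      /* LUT already matches, skip */

      ctx.gammaCorrection.resize(lutSize);
      for (std::size_t i = 0; i < lutSize; i++)
          ctx.gammaCorrection[i] =
              std::pow(static_cast<double>(i) / (lutSize - 1), 1.0 / gamma);

      ctx.gamma = gamma;
  }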
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AGC class was not documented while it was being developed. Extend
the documentation to reference the origins of the implementation, and
improve the descriptions of how the algorithm operates internally.
While at it, rename the poorly named functions.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Now that the diagram has been moved into the AWB class documentation,
reword the accumulator documentation to make it clear that it is not
meant to be used only in AWB.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The AWB algorithm is based on the Grey world algorithm and uses the
statistics generated by the ImgU for that purpose. Explain how those
statistics are used, and reference the original algorithm at the same
time.
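The grey world principle in isolation, ignoring the ImgU statistics
layout and zone handling:

  struct Gains {
      double red;
      double blue;
  };

  /* Assume the scene averages to grey: scale R and B towards G. */
  static Gains greyWorldGains(double sumR, double sumG, double sumB)
  {
      return { sumG / sumR, sumG / sumB };
  }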
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The stats pointer is marked as [[maybe_unused]]. This is a leftover from
a previous commit, where it kept compatibility while transitioning to
the new iterative algorithms.
Remove this attribute to make it explicit that stats are really used to
feed the algorithms.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Clarify the roles and interactions between the pipeline handler events
and the algorithm calls by documenting all the remaining functions of
the class.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Further extend the documentation for the IPAIPU3::configure operation.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
The IPU3 IPA is maturing to a modular and extensible system capable of
handling specific algorithms for the processing blocks on the ImgU.
Provide top-level class documentation giving an overview of the IPA,
detailing which events are used and which algorithms are currently
supported, as well as the limitations currently imposed.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
|
|
Introduce a dedicated worker class derived from libcamera::Thread.
The worker maintains a queue of post-processing requests and waits for
a request to become available. Requests are processed in FIFO order and
dequeued once processed.
The entire post-processing handling iteration is locked under
streamsProcessMutex_, which lets us queue all the post-processing
requests at once before any post-processing completion slot
(streamProcessingComplete()) is allowed to run for requests completing
in parallel. This helps us manage both synchronous and asynchronous
errors encountered during the post-processing operation. Since a
post-processing operation can complete even after
CameraDevice::requestComplete() has returned, we need to check and
complete the descriptor from streamProcessingComplete() running in the
PostProcessorWorker's thread.
This patch also implements flush() for the PostProcessorWorker class,
which is responsible for purging post-processing requests queued up
while a camera is stopping/flushing. It is hooked to
CameraStream::flush(), which isn't used currently but will be used when
we handle flush/stop scenarios in greater detail subsequently (in a
different patchset).
The libcamera request completion handler CameraDevice::requestComplete()
assumes that the request that has just completed is at the front of the
queue. Now that the post-processor runs asynchronously, this isn't true
anymore: a request being post-processed will stay in the queue while a
new libcamera request completes. Remove that assumption, and use the
request cookie to obtain the Camera3RequestDescriptor.
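The overall shape resembles the following sketch, using std::thread and
a placeholder std::function payload instead of libcamera::Thread and
the real post-processing request type:

  #include <condition_variable>
  #include <deque>
  #include <functional>
  #include <mutex>
  #include <thread>

  class Worker
  {
  public:
      Worker() : running_(true), thread_([this] { run(); }) {}

      ~Worker()
      {
          {
              std::lock_guard<std::mutex> lock(mutex_);
              running_ = false;
          }
          cv_.notify_one();
          thread_.join();
      }

      /* Queue a post-processing request; FIFO order is preserved. */
      void queue(std::function<void()> request)
      {
          std::lock_guard<std::mutex> lock(mutex_);
          requests_.push_back(std::move(request));
          cv_.notify_one();
      }

      /* Purge requests queued up while the camera is stopping/flushing. */
      void flush()
      {
          std::lock_guard<std::mutex> lock(mutex_);
          requests_.clear();
      }

  private:
      void run()
      {
          while (true) {
              std::function<void()> request;
              {
                  std::unique_lock<std::mutex> lock(mutex_);
                  cv_.wait(lock, [this] {
                      return !running_ || !requests_.empty();
                  });
                  if (!running_ && requests_.empty())
                      return;
                  request = std::move(requests_.front());
                  requests_.pop_front();
              }
              request(); /* run outside the lock */
          }
      }

      std::mutex mutex_;
      std::condition_variable cv_;
      std::deque<std::function<void()>> requests_;
      bool running_;
      std::thread thread_;
  };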
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
PostProcessor::process() is invoked by the CameraStream class when any
post-processing is required for the camera stream. Failure or success
is checked via the value returned by CameraStream::process().
Now that the post-processor notifies about the completion of the
post-processing operation, we can drop the return value of
PostProcessor::process(). The status of post-processing is passed to
CameraDevice::streamProcessingComplete() by the slot connected to the
PostProcessor::processComplete signal.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Notify, via a signal, that the post-processing for a request has been
completed. The signal is emitted with a context pointer along with the
status of the buffer. CameraDevice::streamProcessingComplete() will
finally set the status on the request descriptor and complete the
descriptor once all the streams requiring post-processing have
completed.
If the obtained buffer status indicates an error, notify the framework
of the status and set the overall error status on the descriptor via
setBufferStatus().
We need to track the number of streams requiring post-processing per
Camera3RequestDescriptor (i.e. per capture request). Introduce a
std::map to track the post-processing of streams. Entries are dropped
from the map when post-processing of a particular stream completes (or
on error paths). A std::map is selected for tracking post-processing
requests since we will move post-processing to be asynchronous in
subsequent commits; a vector or queue would not be suitable, as the
sequential order of post-processing completion across requests won't
be guaranteed then.
A streamsProcessMutex_ has been introduced here as well, which will
guard access to the descriptor's pendingStreamsToProcess_ when
post-processing is moved to be asynchronous in subsequent commits.
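The bookkeeping, reduced to placeholder types:

  #include <map>
  #include <mutex>

  struct CameraStream {};
  struct StreamBuffer {};

  struct Descriptor {
      std::map<CameraStream *, StreamBuffer *> pendingStreamsToProcess_;
  };

  std::mutex streamsProcessMutex_;

  void streamProcessingComplete(Descriptor &desc, CameraStream *stream)
  {
      bool lastStream;
      {
          std::lock_guard<std::mutex> lock(streamsProcessMutex_);
          desc.pendingStreamsToProcess_.erase(stream);
          lastStream = desc.pendingStreamsToProcess_.empty();
      }

      if (lastStream) {
          /* All post-processing done: the descriptor can complete. */
      }
  }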
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Save and provide the context for the post-processor of a camera stream
via Camera3RequestDescriptor::StreamBuffer. We extend the structure to
include source and destination buffers for the post-processor, along
with the CameraStream::Type::Internal buffer pointer (if any). In
addition, a back pointer to the Camera3RequestDescriptor is convenient
to get access to the overall descriptor (status, metadata settings,
etc.).
Also, migrate the CameraStream::process() and PostProcessor::process()
signatures to use Camera3RequestDescriptor::StreamBuffer only. This
will be helpful when we move to async post-processing in subsequent
commits.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Currently, we use Camera3RequestDescriptor::Status to determine:
- when the descriptor has been completely processed by the HAL
- whether any errors were encountered during its processing
Both of these are essential to know whether the descriptor is eligible
to call process_capture_results() through sendCaptureResults().
When a status (Success/Error) is set on the descriptor, it is ready to
be sent back via sendCaptureResults(). However, this might lead to
undesired results, especially when sendCaptureResults() runs in a
different thread (e.g. a stream's post-processor async completion
slot).
This patch decouples the descriptor status (Success/Error) from the
descriptor's completion status (pending or complete). The advantage is
that we can set the completion status when the descriptor has been
fully processed by the HAL, and we can set the error status on the
descriptor wherever an error is encountered, throughout the lifetime
of the descriptor in the HAL layer.
While at it, introduce a completeDescriptor() wrapper around
sendCaptureResults(). completeDescriptor(), as the name suggests, marks
the descriptor as complete so that it is ready to be sent back.
The locking mechanism is moved from sendCaptureResults() to this
wrapper, since the intention is to use completeDescriptor() in place of
the existing sendCaptureResults() calls.
Also make sure the abortRequest() call happens in the same order in all
places, i.e. after the request is added to the descriptors_ queue. Fix
one of the abortRequest() calls accordingly.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Instead of simply returning if encoder_ is nullptr, fail hard via an
assertion. encoder_ being null can only be the result of a fatal bug in
the code, so be loud about the failure.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Replace the check of postProcessor for nullptr with an assertion that
the camera stream's type is not Type::Direct, since it makes no sense
to call CameraStream::process() on a Type::Direct camera stream.
Signed-off-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Hirokazu Honda <hiroh@chromium.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Each Request is currently creating its own CameraControlValidator
using the Camera instance at construction.
Now that the Camera exposes its own CameraControlValidator on its
private interface, use that one on all Requests.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Create a Camera-specific CameraControlValidator for the Camera instance.
This will allow requests to use a single validator instance without
having to construct their own.
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The ternary operation used to get the total bytesused of a V4L2
single-planar format stored in a multiplanar buffer can easily be
misread as a bug, since it appears to read the value of the first of N
planes as the total.
Directly explain why the condition looks inverted, as it is correct
that the total bytes used is stored in only the first plane of the
multiplanar buffer.
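Equivalent logic with placeholder variables (only the V4L2 structures
are real):

  #include <linux/videodev2.h>

  static unsigned int totalBytesUsed(const struct v4l2_buffer &buf,
                                     bool multiPlanar)
  {
      /*
       * Not inverted: with the multi-planar API, a single-planar format
       * carries its total bytesused in plane 0, so reading only
       * planes[0] is correct.
       */
      return multiPlanar ? buf.m.planes[0].bytesused : buf.bytesused;
  }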
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Instead of using constants for the analogue gains limits, use the
minimum and maximum from the configured sensor.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
We currently control the exposure value through the shutter speed and
the analogue gain. We can't use digital gain to obtain more than the
calculated maximum exposure value, because we are not controlling it.
Remove the unused code associated with this digital gain.
Signed-off-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|