2021-03-09android: camera_device: Compute frame durationsJacopo Mondi
Use the FrameDuration control reported by pipeline handlers to register the Auto-Exposure routine FPS range, the minimum stream frame durations and the sensor maximum frame duration. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
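As a rough illustration of the relationship used above (not code from the HAL itself): a frame duration in microseconds converts directly to a frame rate, so the FPS range follows from the duration limits.

    #include <cstdint>

    int main()
    {
        /* Hypothetical frame duration limits, in microseconds. */
        int64_t minFrameDuration = 33333;   /* shortest frame -> highest FPS */
        int64_t maxFrameDuration = 100000;  /* longest frame -> lowest FPS */

        /* FPS = 1 s / frame duration. */
        int32_t maxFps = static_cast<int32_t>(1000000 / minFrameDuration); /* 30 */
        int32_t minFps = static_cast<int32_t>(1000000 / maxFrameDuration); /* 10 */

        return minFps <= maxFps ? 0 : 1;
    }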
2021-03-09libcamera: ipu3: Register FrameDurations controlJacopo Mondi
Register the FrameDurations control in the IPU3 pipeline handler, computed using the vertical blanking limits and the sensor pixel rate as parameters. The FrameDurations control limits should be updated every time a new configuration is applied to the sensor. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
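A minimal sketch of the computation described above, assuming the usual sensor timing model (hypothetical helper, not the actual IPU3 pipeline handler code): the frame duration is the total frame size, including blanking, divided by the pixel rate, so the minimum and maximum durations follow from the minimum and maximum vertical blanking.

    #include <cstdint>

    /* Frame duration in microseconds from the line length (pixels per line,
     * including horizontal blanking), frame height, vertical blanking and
     * pixel rate. */
    uint64_t frameDurationUs(uint32_t lineLength, uint32_t height,
                             uint32_t vblank, uint64_t pixelRate)
    {
        uint64_t pixelsPerFrame = static_cast<uint64_t>(lineLength) * (height + vblank);
        return pixelsPerFrame * 1000000 / pixelRate;
    }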
2021-03-09libcamera: ipu3: Initialize controls using sensor resolutionJacopo Mondi
The controls' limits initialized by the IPU3 pipeline handler depend on the sensor configuration. In order to compute the controls from a known state, apply to the sensor a configuration equal to its own resolution. Move the \todo note regarding the controls' limits dependency on the sensor configuration to the beginning of the function and remove the other redundant ones in the function body. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-09ipa: raspberrypi: Use direct return value for configure()Paul Elder
Now that we support returning int directly in addition to other output parameters, improve the configure() function in the raspberrypi IPA interface. Signed-off-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-09utils: ipc: Make first output parameter direct return if int32Paul Elder
To make it more convenient for synchronous IPA calls to return a status, convert the first output into a direct return if it is an int32. Signed-off-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Acked-by: Naushir Patuck <naush@raspberrypi.com>
2021-03-09utils: ipc: Support custom parameters to init()Paul Elder
Add support to the mojom-based code generator for custom parameters to init(). Remove the parameter type and count validation as well. Signed-off-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-09ipa: raspberrypi: Rename vblank field in SensorConfig to vblankDelayDavid Plowman
The name vblankDelay is clearer. Signed-off-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-09ipa: raspberrypi: Add support for imx290/imx327 sensorsDavid Plowman
imx290 and imx327 share the same kernel driver (imx290.c) and are therefore both recognised here as "imx290". We add the necessary CamHelper for these sensors, as well as a camera tuning file. The tuning was done with an Innomaker STARVIS IMX327LQR module. These have no IR cut filter so there is no proper colour tuning. However, you should obtain reasonable results for most modules using this sensor. Specific tunings for further modules can always be added subsequently. To use this sensor on the Raspberry Pi platform, please add dtoverlay=imx290,clock-frequency=74250000 into your /boot/config.txt file. Signed-off-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-09ipa: raspberrypi: Make CamHelpers return the frame delay for vblankingDavid Plowman
For some sensors (e.g. imx477) we need to update the vblanking on the frame before the exposure. For this reason the GetDelays method must also return the number of frame delays for the vblanking control. Signed-off-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
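A minimal sketch of what such an extended interface could look like; the names and delay values are hypothetical, not the actual Raspberry Pi CamHelper API.

    /* Hypothetical CamHelper-style interface returning per-control frame delays. */
    class SensorDelays
    {
    public:
        virtual ~SensorDelays() = default;

        virtual void getDelays(int &exposureDelay, int &gainDelay,
                               int &vblankDelay) const
        {
            exposureDelay = 2;
            gainDelay = 1;
            /* A sensor such as the imx477 may need vblank applied one frame
             * earlier than the exposure it affects. */
            vblankDelay = 3;
        }
    };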
2021-03-08libcamera: camera_sensor: Cap resolution to max frame sizeJacopo Mondi
Since commit 96aecfe36508 ("libcamera: camera_sensor: Use active area size as resolution") the CameraSensor::resolution() method has returned the sensor's active pixel area size. As the CameraSensor::resolution() method is widely used in the library code base to retrieve the maximum frame size the sensor can produce, the returned size cannot be used to configure the sensor correctly when the maximum frame size is smaller than the pixel area size. Fix this by returning the maximum frame resolution the sensor can produce, or the pixel area size in case the sensor embeds an ISP that can upscale and the supported maximum frame size is thus larger than the pixel array size. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
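A minimal sketch of the capping logic described above, written as a standalone hypothetical helper rather than the actual CameraSensor implementation:

    #include <algorithm>

    struct Size {
        unsigned int width;
        unsigned int height;
    };

    /* Report the maximum frame size the sensor can produce, capped to the
     * pixel array active area so that an ISP-upscaled size is never
     * returned as the sensor resolution. */
    Size sensorResolution(const Size &maxFrameSize, const Size &activeArea)
    {
        return { std::min(maxFrameSize.width, activeArea.width),
                 std::min(maxFrameSize.height, activeArea.height) };
    }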
2021-03-08utils: Add kernel headers update scriptLaurent Pinchart
Add a script to update the local copy of kernel headers (in include/linux/) from a Linux kernel git tree. The script handles header installation, manual processing of the IPU3 header that is still in staging, and update of the README file. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08clang-format: Enable sorted includesKieran Bingham
We aim to ensure that our includes are alphabetically sorted. Whilst checkstyle.py also handles this, enable the clang-format explicitly to define the code-style requirement, and allow clang-format to also correct un-sorted header includes. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08clang-format: Update to reflect coding styleLaurent Pinchart
The libcamera coding style allows functions to be defined on a single line only when they're inline. Update the clang-format configuration accordingly. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08clang-format: Update to clang-format-7Laurent Pinchart
Add all options available in the new version that were previously commented out (or just not listed). The commented-out values are replaced by the clang-format-7 defaults where they differ, to avoid changing the current behaviour. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08libcamera: pipeline: ipu3: Ensure that IPU3Frames::info is not used after deleteKieran Bingham
When a frame completes, the IPU3Frames helper deletes the internal info storage. This storage contains the pointer to the Request, but in some cases the pointer was being accessed after the info structure had been removed. Ensure that the Request is obtained before the info is deleted, so that completion uses a valid pointer. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
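The shape of the fix, sketched with hypothetical types (not the actual IPU3 pipeline handler code): take the Request pointer before the info structure is removed.

    #include <map>

    struct Request {};

    struct Info {
        unsigned int id;
        Request *request;
    };

    struct FrameInfos {
        std::map<unsigned int, Info *> infos;

        void remove(Info *info)
        {
            infos.erase(info->id);
            delete info;            /* info must not be used after this call */
        }
    };

    void completeRequest(Request *request);

    void onFrameComplete(FrameInfos &frameInfos, Info *info)
    {
        /* Grab the Request pointer first, then delete the info. */
        Request *request = info->request;
        frameInfos.remove(info);
        completeRequest(request);
    }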
2021-03-08libcamera: pipeline: ipu3: Fix spelling errorKieran Bingham
Fix trivial spelling mistake. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08tracing: pipeline_handler: Queue RequestsKieran Bingham
Add tracing to the base pipeline handler class to track when requests are queued. Tracing is already available for other Request operations, but queuing a Request is not an operation handled by the Request itself. Add the tracepoint to the PipelineHandler::queueRequest() so the lifetime of a Request can be viewed when tracing. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
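A hedged sketch of the kind of call this adds; the tracepoint name comes from this change, but the surrounding function is illustrative only and not the real queueRequest() signature.

    namespace libcamera { class Request; }

    /* Emit the request_queue tracepoint before handing the request to the
     * pipeline-specific implementation. LIBCAMERA_TRACEPOINT() is provided
     * by libcamera's tracing support (header omitted here). */
    void queueRequestTraced(libcamera::Request *request)
    {
        LIBCAMERA_TRACEPOINT(request_queue, request);

        /* ... existing queueing logic follows ... */
    }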
2021-03-08README: Update tracing dependenciesKieran Bingham
The packages required for tracing libcamera projects were incorrectly specified. Update the list with the correct name for the development package, liblttng-ust-dev, and also add the lttng-tools package, which provides the utilities required to handle tracing. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08libcamera: pipeline_handler: Update request usage commentKieran Bingham
When a pipeline handler completes a request, the request itself is not deleted by libcamera, and the application regains control over the object. It may choose to delete the Request, or re-use it. Clarify this in the comment by removing the declaration that the Request is deleted, but state that it is no longer managed by the pipeline handler and must not be accessed further after this function returns. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Umang Jain <email@uajain.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08libcamera: Request: validate state on completeKieran Bingham
Requests should only be completed from the RequestPending state. A Request completed from the RequestCancelled or RequestComplete state indicates either a double-complete, or that the Request has been used internally after it was given back to the application. Ensure that this can be caught early if it occurs by enforcing the required state with an assert. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Umang Jain <email@uajain.com> Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
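A minimal stand-in for the check, using <cassert> in place of libcamera's own assertion macro (names hypothetical):

    #include <cassert>

    enum Status { RequestPending, RequestComplete, RequestCancelled };

    /* Hypothetical stand-in for Request::complete(). */
    void complete(Status &status)
    {
        /* Completing from any state other than RequestPending means a
         * double-complete or a use after hand-back to the application. */
        assert(status == RequestPending);
        status = RequestComplete;
    }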
2021-03-05android: camera_device: Update gralloc usage flags for streamsLaurent Pinchart
When configuring streams, the camera HAL is supposed to update the gralloc usage flags to reflect the operations it will need to do on the stream buffers. Failure to do so leads to incorrect format selection by gralloc for the HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED format, as gralloc will not take into consideration the need of the camera to access the buffers. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Jacopo Mondi <jacopo@jmondi.org> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
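Roughly, this means OR-ing the camera's access needs into each stream's usage flags during stream configuration; a hedged sketch using standard gralloc/camera3 flag names, with flag choices that are illustrative rather than the exact ones the HAL sets.

    #include <hardware/camera3.h>
    #include <hardware/gralloc.h>

    void updateStreamUsage(camera3_stream_t *stream)
    {
        switch (stream->stream_type) {
        case CAMERA3_STREAM_OUTPUT:
            /* The camera writes into output buffers, and the HAL may also
             * read them in software (e.g. for JPEG encoding). */
            stream->usage |= GRALLOC_USAGE_HW_CAMERA_WRITE |
                             GRALLOC_USAGE_SW_READ_OFTEN;
            break;
        case CAMERA3_STREAM_INPUT:
            stream->usage |= GRALLOC_USAGE_HW_CAMERA_READ;
            break;
        }
    }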
2021-03-05libcamera: media_device: Fix typoLaurent Pinchart
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-04libcamera: pipeline: simple: Support camera sensors that contain an ISPLaurent Pinchart
Camera sensors can include an ISP. For instance, the AP1302 external ISP can be connected to up to two raw camera sensors, and the combination of the sensors and ISP is considered as a (smart) camera sensor from libcamera's point of view. The CameraSensor class has limited support for this already. Extend the simple pipeline handler to support such sensors, by using the media entity corresponding to the ISP instead of the raw camera sensor's entity. We don't need to handle the case where an entity in the SoC would expose the MEDIA_ENT_F_PROC_VIDEO_ISP function, as pipelines containing an ISP would have a dedicated pipeline handler. The implementation is limited as it won't support other multi-entity camera sensors (such as CCS). While this would be worth supporting, we don't have a test platform with a CCS-compatible sensor at this point, so let's not over-engineer the solution. Extending support to CCS (and possibly other sensor topologies) will likely involve helpers that can be used by other pipeline handlers (such as generic graph walk helpers for instance) and extensions to the CameraSensor class. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-04libcamera: pipeline: simple: Walk the pipeline by following the first linkDafna Hirschfeld
When walking the pipeline, follow the first link of each source pad. This patch removes a redundant condition for choosing the link: "(link->flags() & MEDIA_LNK_FL_ENABLED) || !(link->flags() & MEDIA_LNK_FL_IMMUTABLE)" since it always returns true: the only case that could make it false, a disabled immutable link, does not occur as immutable links are always created enabled. Signed-off-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-04utils: ipc: templates: Drop unused variableLaurent Pinchart
The has_input variable is unused, drop it. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
2021-03-04libcamera: v4l2_videodevice: Print fd value in log prefixLaurent Pinchart
When opening a V4L2VideoDevice multiple times, for instance to run multiple jobs on an M2M device, it's useful to attribute log messages to a particular instance. Include the device fd in the log prefix. This turns the existing output

    [1:43:01.958321522] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[cap]: Queueing buffer 0
    [1:43:01.958350060] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[cap]: Queueing buffer 1
    [1:43:01.958365137] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[cap]: Queueing buffer 2

into

    [1:43:01.958321522] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[14:cap]: Queueing buffer 0
    [1:43:01.958350060] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[14:cap]: Queueing buffer 1
    [1:43:01.958365137] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[14:cap]: Queueing buffer 2

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
2021-03-04libcamera: v4l2_device: Make fd() function constLaurent Pinchart
There are use cases for getting the file descriptor of a const V4L2Device instance, for instance to print it in a log. Make the function const. There's little risk of abuse here (as in code then performing operations on the file descriptors that conceptually modify the V4L2 device), as the fd() function is protected. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
2021-03-04pipeline: raspberrypi: Only enable embedded stream when availableNaushir Patuck
The pipeline handler would enable and use the Unicam embedded data stream even if the sensor did not support it. This was to allow a means to pass exposure and gain values for the frame to the IPA in a synchronised way. The recent changes to get the pipeline handler to pass a ControlList with exposure and gain values means this is no longer required. Disable the use of the embedded data stream when a sensor does not support it. This change also removes the mappedEmbeddedBuffers_ map as it is no longer used. Signed-off-by: Naushir Patuck <naush@raspberrypi.com> Tested-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-04ipa: raspberrypi: Remove MdParserRPiNaushir Patuck
With the recent change to pass a ControlList to the IPA with exposure and gain values used for a frame, RPiController::MdParserRPi is not needed any more. Remove all traces of it. The derived CamHelper objects now pass nullptr values for the parser to the base CamHelper class when sensors do not use metadata. Signed-off-by: Naushir Patuck <naush@raspberrypi.com> Tested-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-04pipeline: ipa: raspberrypi: Pass exposure/gain values to IPA through controlsNaushir Patuck
When running with sensors that had no embedded data, the pipeline handler would fill a dummy embedded data buffer with gain/exposure values, and pass this buffer to the IPA together with the bayer buffer. The IPA would extract these values for use in the controller algorithms. Rework this logic entirely by having a new RPiCameraData::BayerFrame queue to replace the existing bayer queue. In addition to storing the FrameBuffer pointer, this also stores all the controls tracked by DelayedControls for that frame in a ControlList. This includes exposure and gain values. On signalling the RPi::IPA_EVENT_SIGNAL_ISP_PREPARE IPA event, the pipeline handler now passes this ControlList from the RPiCameraData::BayerFrame queue. The IPA now extracts the gain and exposure values from the ControlList instead of using RPiController::MdParserRPi to parse the embedded data buffer. Signed-off-by: Naushir Patuck <naush@raspberrypi.com> Tested-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
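A minimal sketch of the per-frame record described above (hypothetical struct, not the actual RPiCameraData definition):

    #include <queue>

    #include <libcamera/controls.h>

    namespace libcamera { class FrameBuffer; }

    /* Stand-in for the RPiCameraData::BayerFrame record. */
    struct BayerFrame {
        libcamera::FrameBuffer *buffer;   /* the raw Bayer frame */
        libcamera::ControlList controls;  /* exposure/gain reported by DelayedControls */
    };

    std::queue<BayerFrame> bayerQueue;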
2021-03-03libcamera: camera_sensor: Accept entities exposing the ISP functionLaurent Pinchart
Camera sensors can include an ISP, which may be reported as a separate entity from the pixel array in the media graph. Support such sensors by accepting MEDIA_ENT_F_PROC_VIDEO_ISP as a valid entity type. This allows using sensors that can be fully (or at least meaningfully) configured through the ISP's source pad only. Sensors that require further configuration, on the ISP sink pad and/or on the pixel array's source pad, will require further extension to the CameraSensor class. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
2021-03-03libcamera: camera_sensor: Use active area size as resolutionLaurent Pinchart
When a sensor can upscale the image, the native sensor resolution isn't equal to the largest size reported by the sensor. Use the active area size instead, and default it to the largest enumerated size if the crop rectangle targets are not supported. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
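The active area is typically read from the sensor subdevice's selection rectangles; a hedged sketch of the fallback logic, where the selection target name comes from the V4L2 subdev API but the helper and surrounding types are assumptions, not the actual CameraSensor::init() code.

    Size activeAreaOrLargestSize(V4L2Subdevice *subdev, const std::vector<Size> &sizes)
    {
        Rectangle rect;

        /* V4L2_SEL_TGT_CROP_DEFAULT reports the active pixel area. */
        if (subdev->getSelection(0, V4L2_SEL_TGT_CROP_DEFAULT, &rect) == 0)
            return rect.size();

        /* Fall back to the largest enumerated frame size. */
        return sizes.back();
    }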
2021-03-03ipa: rkisp1: Update to kernel header changesLaurent Pinchart
The rkisp1 driver has received support for newer ISP versions, which changes its userspace API and ABI. Adapt to the API change. This requires kernel v5.11 or newer, or backporting the corresponding rkisp1 changes to older kernels. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03include: linux: Update Linux headers to v5.12-rc1Laurent Pinchart
Update Linux headers to v5.12-rc1, to provide the MEDIA_ENT_F_PROC_VIDEO_ISP entity function. The DRM FourCC and modifiers that were manually added in commits 9db0ed5e2065, 38f2efb05cef and 90c793c6989f are kept. New Intel DRM format modifiers are conflicting with IPU3_FORMAT_MOD_PACKED, which is updated as a result. The V4L2 controls and formats that were manually added in commit 43d81d43fe91 are kept. This causes a conflict in the V4L2 control base for V4L2_CID_USER_BCM2835_ISP_BASE that needs to be resolved in the downstream Raspberry Pi kernel first. The intel-ipu3.h header is manually exported with the scripts/headers_install.sh script. The script complained about a missing "WITH Linux-syscall-note" license extension, which has been worked around manually. The issue has been reported upstream in [1]. [1] https://lore.kernel.org/linux-media/20210207235610.15687-1-laurent.pinchart@ideasonboard.com/T/#u Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03cros: Support the new cros camera API with set_up and tear_downPaul Elder
Implement and expose the symbol and functions that the new cros camera API requires. Since we don't actually need them, leave them empty. Update meson accordingly. Signed-off-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03android: Introduce Chromium OS buffer managerJacopo Mondi
Introduce the CameraBuffer backend for the Chromium OS operating system and the associated meson option. The Chromium OS CameraBuffer implementation uses the cros::CameraBufferManager class to perform mapping of single-plane and multi-plane buffers and to retrieve size information. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03android: mm: Provide helper macro for PIMPLJacopo Mondi
Each memory backend has to declare a CameraBuffer class implementation that bridges the API calls to each CameraBuffer::Private implementation. As the code is likely the same for most (if not all) backends, provide a convenience macro that expands to the CameraBuffer class declaration. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03android: jpeg: Use CameraBuffer::jpegBufferSize()Jacopo Mondi
Use the newly introduced function to retrieve the size of the JPEG encoding destination buffer, in order to calculate where the JPEG_BLOB_ID should be placed. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03android: camera_buffer: Add method to get the JPEG blob sizeJacopo Mondi
To maintain compatibility with platforms that do not provide a memory backend implementation, add a method that returns the size of the buffer used for JPEG encoding, capped to a maximum size. Platforms that implement a memory backend will always calculate the correct buffer size. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
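The capping itself is straightforward; a hedged sketch with hypothetical names, not the actual CameraBuffer API:

    #include <algorithm>
    #include <cstddef>

    /* Size available for the JPEG blob: the real plane size when a memory
     * backend can report it, capped to a platform maximum otherwise. */
    size_t jpegBufferSize(size_t planeSize, size_t maxJpegBufferSize)
    {
        return std::min(planeSize, maxJpegBufferSize);
    }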
2021-03-03android: post_processor: Use CameraBuffer APIJacopo Mondi
Use the newly introduced CameraBuffer class as the type for the destination buffer in the PostProcessor class hierarchy in place of the libcamera::MappedFrameBuffer one and use its API to retrieve the length and the location of the CameraBuffer plane allocated for JPEG post-processing. Remove all the assumptions about the underlying memory storage and only go through the CameraBuffer API when dealing with memory buffers. To do so rework the Encoder interface to use a raw pointer and an explicit size to remove access to the Span<uint8_t> maps that serve as memory storage for the current implementation but might not be ideal for other memory backends. Now that the whole PostProcessor hierarchy has been converted to use the CameraBuffer API, remove libcamera::MappedBuffer as the base class of the CameraBuffer interface and only rely on its interface. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03android: camera_buffer: Implement libcamera::ExtensibleJacopo Mondi
In order to prepare to support more memory backends, make the CameraBuffer class implement the PIMPL (pointer-to-implementation) pattern by inheriting from the libcamera::Extensible class. Temporarily keep libcamera::MappedBuffer as the CameraBuffer base class to maintain compatibility with the CameraStream::process() interface, which requires a MappedBuffer * as second argument and will be converted to use a CameraBuffer in the next patch. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
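A simplified sketch of the PIMPL shape this moves towards; libcamera's Extensible class provides macros for this boilerplate, which are not reproduced exactly here, and the member names are assumptions.

    #include <memory>

    /* 'AndroidBufferHandle' stands in for the real buffer_handle_t type. */
    using AndroidBufferHandle = const void *;

    class CameraBuffer
    {
    public:
        class Private;                  /* per-backend implementation */

        explicit CameraBuffer(AndroidBufferHandle handle);
        ~CameraBuffer();

        bool isValid() const;           /* forwards to the Private class */

    private:
        std::unique_ptr<Private> d_;
    };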
2021-03-03android: Move buffer mapping to CameraStreamJacopo Mondi
The destination buffer for the post-processing component is currently first mapped in the CameraDevice class and then passed to CameraStream which simply calls the post-processor interface. Move the mapping to CameraStream::process() to tie the buffer mapping to the lifetime of the CameraBuffer instance. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03android: camera_device: Rename buffer fieldsJacopo Mondi
The buffers passed to the post processor are currently named 'buffer' and 'mapped', names that do not convey their role. Use 'src' and 'dest' instead. Cosmetic change only. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03android: camera_buffer: Drop 'const' from buffer_handle_tJacopo Mondi
The buffer_handle_t type is defined as 'const native_handle_t*'. Drop the 'const' specifier from the parameter of the CameraBuffer class constructor and in the Android generic memory backend. Also rename 'camera3buffer' to 'camera3Buffer' to comply with the coding style guidelines. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03android: Introduce CameraBuffer interfaceJacopo Mondi
In order to provide support for different memory backends, move the MappedCamera3Buffer class definition outside of the CameraDevice class to its own file and rename it to CameraBuffer. The interface defined in camera_buffer.h will be implemented by different backends that will be placed in the src/android/mm subdirectory. Provide a first implementation for the 'generic android' backend which matches the existing one. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03meson: options: Add option to select the Android platformJacopo Mondi
The Android Camera3 HAL implementation requires platform specific extensions, such as the selection of the memory backend to use and additional constraints depending on the target device. Define a combo option to select which platform to target and define the currently existing implementation as 'generic'. Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03libcamera: pipeline: simple: Enable multiple streams for compatible devicesLaurent Pinchart
Allow support for multiple streams on a per-device basis. The decision should be made based on the ability of the converter to run multiple times within the duration of one frame. Hardcode it in SimplePipelineInfo for now. We may later compute the number of supported streams dynamically based on the requested configuration, using converter bandwidth information instead of a hardcoded fixed value. All platforms are currently limited to a single stream until they get successfully tested with multiple streams. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03libcamera: pipeline: simple: Support usage of multiple streamsLaurent Pinchart
To extend the multi-stream support to runtime operation of the pipeline, expand the converter queue to store multiple output buffers, and update the request queuing and buffer completion handlers accordingly. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03libcamera: pipeline: simple: Support configuration of multiple streamsLaurent Pinchart
Extend the SimpleCameraConfiguration to support multiple streams, using the multi-stream capability of the SimpleConverter class. Wiring up multi-stream support in the other pipeline handler operations will come in further commits. To keep the code simple, require all streams to use the converter if any stream needs it. It would be possible to generate one stream without conversion (provided the format and size match what the capture device can generate), and this is left as a future optimization. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03libcamera: pipeline: simple: Hardcode the number of internal buffersLaurent Pinchart
The number of internal buffers, used between the capture device and the converter, doesn't need to depend on the number of buffers allocated for the output stream of the pipeline. Hardcode it to a fixed value. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>