path: root/src
2021-03-09  android: camera_device: Use AE FPS range in template  (Jacopo Mondi)
The request template returned by requestTemplatePreview() uses an arbitrary {15, 30} Auto-Exposure algorithm FPS range. Use the one calculated at static metadata creation time, which is consistent with the camera limits. Once template generation is performed by inspecting the requested capture intent, the FPS range available to the AE algorithm shall be tuned accordingly. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
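As a rough illustration of the change, the template entry could be filled in as in the sketch below; the aeFpsRange_ member and the exact CameraMetadata::addEntry() call shown here are assumptions rather than the actual HAL code.

    /*
     * Sketch only: populate the template with the AE FPS range computed
     * at static metadata creation time instead of a hardcoded {15, 30}.
     */
    const int32_t aeFpsRange[] = { aeFpsRange_[0], aeFpsRange_[1] };
    requestTemplate->addEntry(ANDROID_CONTROL_AE_TARGET_FPS_RANGE,
                              aeFpsRange, 2);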
2021-03-09  android: camera_device: Compute frame durations  (Jacopo Mondi)
Use the FrameDuration control reported by pipeline handlers to register the Auto-Exposure routine FPS range, the minimum stream frame durations and the sensor maximum frame duration. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
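As a sketch of the arithmetic involved (variable names assumed, durations in microseconds), the AE FPS range follows from the frame duration limits by inversion:

    /* Shorter frame durations give higher frame rates and vice versa. */
    int32_t maxFps = static_cast<int32_t>(1e6 / minFrameDuration);
    int32_t minFps = static_cast<int32_t>(1e6 / maxFrameDuration);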
2021-03-09  libcamera: ipu3: Register FrameDurations control  (Jacopo Mondi)
Register the FrameDurations control in the IPU3 pipeline handler, computed from the vertical blanking limits and the sensor pixel rate. The FrameDurations control limits should be updated every time a new configuration is applied to the sensor. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
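A minimal sketch of the computation, with illustrative variable names (pixel rate in Hz, durations in microseconds); the actual IPU3 code may differ:

    /* Duration of a single sensor line. */
    double lineDuration = (sensorWidth + hblank) / (pixelRate / 1e6);

    /* The frame duration limits follow from the vertical blanking limits. */
    int64_t minFrameDuration = lineDuration * (sensorHeight + vblankMin);
    int64_t maxFrameDuration = lineDuration * (sensorHeight + vblankMax);

    ctrls.emplace(&controls::FrameDurations,
                  ControlInfo(minFrameDuration, maxFrameDuration));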
2021-03-09  libcamera: ipu3: Initialize controls using sensor resolution  (Jacopo Mondi)
The controls' limits initialized by the IPU3 pipeline handler depend on the sensor configuration. In order to compute the controls from a known state, apply to the sensor a configuration equal to its own resolution. Move the \todo note regarding the controls' limits dependency on the sensor configuration to the beginning of the function and remove the other redundant ones in the function body. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-09  ipa: raspberrypi: Use direct return value for configure()  (Paul Elder)
Now that we support returning int directly in addition to other output parameters, improve the configure() function in the raspberrypi IPA interface. Signed-off-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-09  ipa: raspberrypi: Rename vblank field in SensorConfig to vblankDelay  (David Plowman)
The name vblankDelay is clearer. Signed-off-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-09  ipa: raspberrypi: Add support for imx290/imx327 sensors  (David Plowman)
imx290 and imx327 share the same kernel driver (imx290.c) and are therefore both recognised here as "imx290". We add the necessary CamHelper for these sensors, as well as a camera tuning file. The tuning was done with an Innomaker STARVIS IMX327LQR module. These have no IR cut filter so there is no proper colour tuning. However, you should obtain reasonable results for most modules using this sensor. Specific tunings for further modules can always be added subsequently. To use this sensor on the Raspberry Pi platform, please add dtoverlay=imx290,clock-frequency=74250000 into your /boot/config.txt file. Signed-off-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-09  ipa: raspberrypi: Make CamHelpers return the frame delay for vblanking  (David Plowman)
For some sensors (e.g. imx477) we need to update the vblanking on the frame before the exposure. For this reason the GetDelays method must also return the number of frame delays for the vblanking control. Signed-off-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
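A sketch of the extended helper interface; the parameter names are illustrative and may not match the exact CamHelper declaration.

    /*
     * Each CamHelper now also reports the frame delay of the vblank
     * control, so that a helper such as the imx477 one can ask for the
     * vblanking to be applied on the frame before the exposure.
     */
    virtual void GetDelays(int &exposureDelay, int &gainDelay,
                           int &vblankDelay) const;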
2021-03-08  libcamera: camera_sensor: Cap resolution to max frame size  (Jacopo Mondi)
Since commit 96aecfe36508 ("libcamera: camera_sensor: Use active area size as resolution") the CameraSensor::resolution() method has returned the sensor's active pixel area size. As the CameraSensor::resolution() method is widely used in the library code base to retrieve the maximum frame size the sensor can produce, the returned size cannot be used to configure the sensor correctly when the maximum frame size is smaller than the pixel area size. Fix this by returning the maximum frame resolution the sensor can produce, or the pixel area size in case the sensor embeds an ISP that can upscale and the supported maximum frame size is thus larger than the pixel array size. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
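A sketch of the capping logic, assuming the class keeps the active pixel area size and the largest enumerated frame size in activeAreaSize_ and maxFrameSize_ members (illustrative names):

    Size CameraSensor::resolution() const
    {
            /*
             * Return the largest frame size the sensor can produce, unless
             * an embedded ISP can upscale beyond the pixel array, in which
             * case cap the result to the active pixel area size.
             */
            if (maxFrameSize_.width <= activeAreaSize_.width &&
                maxFrameSize_.height <= activeAreaSize_.height)
                    return maxFrameSize_;

            return activeAreaSize_;
    }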
2021-03-08  libcamera: pipeline: ipu3: Ensure that IPU3Frames::info is not used after delete  (Kieran Bingham)
When IPU3Frames completes a frame, it deletes the internal info storage. This storage contains the pointer to the Request, but in some cases the pointer was being accessed after the info structure was removed. Ensure that the Request pointer is obtained before the info structure is removed, so that a valid pointer is used when completing the request. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
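A sketch of the reordering; the frameInfos_ container, its remove() helper and the completion call are named after the commit description and may not match the exact pipeline handler code.

    /* Take the Request pointer before the info entry is deleted. */
    Request *request = info->request;

    frameInfos_.remove(info);       /* info is no longer valid here */

    /* Complete the request using the locally held pointer only. */
    pipe_->completeRequest(request);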
2021-03-08  libcamera: pipeline: ipu3: Fix spelling error  (Kieran Bingham)
Fix trivial spelling mistake. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08  tracing: pipeline_handler: Queue Requests  (Kieran Bingham)
Add tracing to the base pipeline handler class to track when requests are queued. Tracing is already available for other Request operations, but queuing a Request is not an operation handled by the Request itself. Add the tracepoint to the PipelineHandler::queueRequest() so the lifetime of a Request can be viewed when tracing. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08  libcamera: pipeline_handler: Update request usage comment  (Kieran Bingham)
When a pipeline handler completes a request, the request itself is not deleted by libcamera, and the application regains control over the object. It may choose to delete the Request, or re-use it. Clarify this in the comment by removing the declaration that the Request is deleted, but state that it is no longer managed by the pipeline handler and must not be accessed further after this function returns. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Umang Jain <email@uajain.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-08  libcamera: Request: validate state on complete  (Kieran Bingham)
Requests should only be completed from the RequestPending state. A Request completed from the RequestCancelled or RequestComplete state indicates either a double completion, or internal use of the Request after it has been handed back to the application. Ensure that such errors are caught early by enforcing the required state with an assertion. Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Umang Jain <email@uajain.com> Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se> Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
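A minimal sketch of the check, assuming it sits at the top of Request::complete():

    /*
     * Completion is only valid from the pending state; anything else
     * points at a double completion or at internal use of the Request
     * after it has been handed back to the application.
     */
    ASSERT(status_ == RequestPending);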
2021-03-05  android: camera_device: Update gralloc usage flags for streams  (Laurent Pinchart)
When configuring streams, the camera HAL is supposed to update the gralloc usage flags to reflect the operations it will need to do on the stream buffers. Failure to do so leads to incorrect format selection by gralloc for the HAL_PIXEL_FORMAT_IMPLEMENTATION_DEFINED format, as gralloc will not take into consideration the need of the camera to access the buffers. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Jacopo Mondi <jacopo@jmondi.org> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-05  libcamera: media_device: Fix typo  (Laurent Pinchart)
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-04  libcamera: pipeline: simple: Support camera sensors that contain an ISP  (Laurent Pinchart)
Camera sensors can include an ISP. For instance, the AP1302 external ISP can be connected to up to two raw camera sensors, and the combination of the sensors and ISP is considered as a (smart) camera sensor from libcamera's point of view. The CameraSensor class has limited support for this already. Extend the simple pipeline handler to support such sensors, by using the media entity corresponding to the ISP instead of the raw camera sensor's entity. We don't need to handle the case where an entity in the SoC would expose the MEDIA_ENT_F_PROC_VIDEO_ISP function, as pipelines containing an ISP would have a dedicated pipeline handler. The implementation is limited as it won't support other multi-entity camera sensors (such as CCS). While this would be worth supporting, we don't have a test platform with a CCS-compatible sensor at this point, so let's not over-engineer the solution. Extending support to CCS (and possibly other sensor topologies) will likely involve helpers that can be used by other pipeline handlers (such as generic graph walk helpers for instance) and extensions to the CameraSensor class. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-04  libcamera: pipeline: simple: Walk the pipeline by following the first link  (Dafna Hirschfeld)
When walking the pipeline, follow the first link of each source pad. This patch removes a redundant condition for choosing the link: "(link->flags() & MEDIA_LNK_FL_ENABLED) || !(link->flags() & MEDIA_LNK_FL_IMMUTABLE)" since it always returns true. Signed-off-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-04  libcamera: v4l2_videodevice: Print fd value in log prefix  (Laurent Pinchart)
When opening a V4L2VideoDevice multiple times, for instance to run multiple jobs on a M2M device, it's useful to attribute log messages to a particular instance. Include the device fd in the log prefix. This turns the existing output

[1:43:01.958321522] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[cap]: Queueing buffer 0
[1:43:01.958350060] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[cap]: Queueing buffer 1
[1:43:01.958365137] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[cap]: Queueing buffer 2

into

[1:43:01.958321522] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[14:cap]: Queueing buffer 0
[1:43:01.958350060] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[14:cap]: Queueing buffer 1
[1:43:01.958365137] [277] DEBUG V4L2 v4l2_videodevice.cpp:1440 /dev/video0[14:cap]: Queueing buffer 2

Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
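A sketch of what the prefix construction could look like; the helper names and the capture/output flag used here are assumptions rather than the exact V4L2VideoDevice code.

    std::string V4L2VideoDevice::logPrefix() const
    {
            /* e.g. "/dev/video0[14:cap]" */
            return deviceNode() + "[" + std::to_string(fd()) +
                   (isCapture_ ? ":cap]" : ":out]");
    }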
2021-03-04  pipeline: raspberrypi: Only enable embedded stream when available  (Naushir Patuck)
The pipeline handler would enable and use the Unicam embedded data stream even if the sensor did not support it. This provided a means to pass exposure and gain values for the frame to the IPA in a synchronised way. The recent changes that make the pipeline handler pass a ControlList with exposure and gain values mean this is no longer required. Disable the use of the embedded data stream when a sensor does not support it. This change also removes the mappedEmbeddedBuffers_ map as it is no longer used. Signed-off-by: Naushir Patuck <naush@raspberrypi.com> Tested-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-04  ipa: raspberrypi: Remove MdParserRPi  (Naushir Patuck)
With the recent change to pass a ControlList to the IPA with exposure and gain values used for a frame, RPiController::MdParserRPi is not needed any more. Remove all traces of it. The derived CamHelper objects now pass nullptr values for the parser to the base CamHelper class when sensors do not use metadata. Signed-off-by: Naushir Patuck <naush@raspberrypi.com> Tested-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2021-03-04  pipeline: ipa: raspberrypi: Pass exposure/gain values to IPA through controls  (Naushir Patuck)
When running with sensors that had no embedded data, the pipeline handler would fill a dummy embedded data buffer with gain/exposure values, and pass this buffer to the IPA together with the bayer buffer. The IPA would extract these values for use in the controller algorithms. Rework this logic entirely by having a new RPiCameraData::BayerFrame queue that replaces the existing bayer queue. In addition to storing the FrameBuffer pointer, this also stores, in a ControlList, all the controls tracked by DelayedControls for that frame. This includes exposure and gain values. When signalling the RPi::IPA_EVENT_SIGNAL_ISP_PREPARE IPA event, the pipeline handler now passes this ControlList from the RPiCameraData::BayerFrame queue. The IPA now extracts the gain and exposure values from the ControlList instead of using RPiController::MdParserRPi to parse the embedded data buffer. Signed-off-by: Naushir Patuck <naush@raspberrypi.com> Tested-by: David Plowman <david.plowman@raspberrypi.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
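A sketch of the new per-frame bookkeeping described above; the exact definition in the pipeline handler may differ slightly.

    struct BayerFrame {
            FrameBuffer *buffer;
            /* Exposure, gain, ... as tracked by DelayedControls. */
            ControlList controls;
    };

    std::queue<BayerFrame> bayerQueue_;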
2021-03-03  libcamera: camera_sensor: Accept entities exposing the ISP function  (Laurent Pinchart)
Camera sensors can include an ISP, which may be reported as a separate entity from the pixel array in the media graph. Support such sensors by accepting MEDIA_ENT_F_PROC_VIDEO_ISP as a valid entity type. This allows using sensors that can be fully (or at least meaningfully) configured through the ISP's source pad only. Sensors that require further configuration, on the ISP sink pad and/or on the pixel array's source pad, will require further extension to the CameraSensor class. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
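The entity validation change amounts to something like the sketch below; the surrounding code and the log message are illustrative.

    /* Accept an ISP entity in addition to a raw camera sensor. */
    switch (entity_->function()) {
    case MEDIA_ENT_F_CAM_SENSOR:
    case MEDIA_ENT_F_PROC_VIDEO_ISP:
            break;
    default:
            LOG(CameraSensor, Error)
                    << "Invalid sensor function " << entity_->function();
            return -EINVAL;
    }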
2021-03-03  libcamera: camera_sensor: Use active area size as resolution  (Laurent Pinchart)
When a sensor can upscale the image, the native sensor resolution isn't equal to the largest size reported by the sensor. Use the active area size instead, and default it to the largest enumerated size if the crop rectangle targets are not supported. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  ipa: rkisp1: Update to kernel header changes  (Laurent Pinchart)
The rkisp1 driver has received support for newer ISP versions, which changes its userspace API and ABI. Adapt to the API change. This requires kernel v5.11 or newer, or backporting the corresponding rkisp1 changes to older kernels. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Dafna Hirschfeld <dafna.hirschfeld@collabora.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03  cros: Support the new cros camera API with set_up and tear_down  (Paul Elder)
Implement and expose the symbol and functions that the new cros camera API requires. Since we don't actually need them, leave them empty. Update meson accordingly. Signed-off-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  android: Introduce Chromium OS buffer manager  (Jacopo Mondi)
Introduce the CameraBuffer backend for the Chromium OS operating system and the associated meson option. The Chromium OS CameraBuffer implementation uses the cros::CameraBufferManager class to perform mapping of single-plane and multi-plane buffers and to retrieve size information. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  android: mm: Provide helper macro for PIMPL  (Jacopo Mondi)
Each memory backend has to declare a CameraBuffer class implementation that bridges the API calls to each CameraBuffer::Private implementation. As the code is likely the same for most (if not all) backends, provide a convenience macro that expands to the CameraBuffer class declaration. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  android: jpeg: Use CameraBuffer::jpegBufferSize()  (Jacopo Mondi)
Use the newly introduced function to retrieve the size of the JPEG encoding destination buffer, in order to calculate where the JPEG_BLOB_ID should be placed. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  android: camera_buffer: Add method to get the JPEG blob size  (Jacopo Mondi)
To maintain compatibility with platforms that do not provide a memory backend implementation, add a method that returns the size of the buffer used for JPEG encoding, capped to a maximum size. Platforms that implement a memory backend will always calculate the correct buffer size. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
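A sketch of the intended behaviour; the plane size accessor is an assumption, and backends that can compute the exact allocation size would return it directly.

    size_t CameraBuffer::jpegBufferSize(size_t maxJpegBufferSize) const
    {
            /*
             * Cap the reported size so that platforms without a memory
             * backend still get a usable JPEG destination buffer size.
             */
            return std::min<size_t>(planeSize(0), maxJpegBufferSize);
    }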
2021-03-03  android: post_processor: Use CameraBuffer API  (Jacopo Mondi)
Use the newly introduced CameraBuffer class as the type for the destination buffer in the PostProcessor class hierarchy, in place of libcamera::MappedFrameBuffer, and use its API to retrieve the length and the location of the CameraBuffer plane allocated for JPEG post-processing. Remove all assumptions about the underlying memory storage and only go through the CameraBuffer API when dealing with memory buffers. To do so, rework the Encoder interface to use a raw pointer and an explicit size, removing access to the Span<uint8_t> maps that serve as memory storage for the current implementation but might not be ideal for other memory backends. Now that the whole PostProcessor hierarchy has been converted to use the CameraBuffer API, remove libcamera::MappedBuffer as the base class of the CameraBuffer interface and only rely on its interface. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  android: camera_buffer: Implement libcamera::Extensible  (Jacopo Mondi)
In order to prepare to support more memory backends, make the CameraBuffer class implement the PIMPL (pointer-to-implementation) pattern by inheriting from the libcamera::Extensible class. Temporarily keep libcamera::MappedBuffer as the CameraBuffer base class to maintain compatibility with the CameraStream::process() interface, which requires a MappedBuffer * as its second argument and will be converted to use a CameraBuffer in the next patch. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  android: Move buffer mapping to CameraStream  (Jacopo Mondi)
The destination buffer for the post-processing component is currently first mapped in the CameraDevice class and then passed to CameraStream which simply calls the post-processor interface. Move the mapping to CameraStream::process() to tie the buffer mapping to the lifetime of the CameraBuffer instance. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  android: camera_device: Rename buffer fields  (Jacopo Mondi)
The buffers passed to the post processor are currently named 'buffer' and 'mapped', names that do not convey their role. Use 'src' and 'dest' instead. Cosmetic change only. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  android: camera_buffer: Drop 'const' from buffer_handle_t  (Jacopo Mondi)
The buffer_handle_t type is defined as 'const native_handle_t*'. Drop the 'const' specifier from the parameter of the CameraBuffer class constructor and in the Android generic memory backend. Also rename 'camera3buffer' to 'camera3Buffer' to comply with the coding style guidelines. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  android: Introduce CameraBuffer interface  (Jacopo Mondi)
In order to provide support for different memory backends, move the MappedCamera3Buffer class definition out of the CameraDevice class to its own file and rename it to CameraBuffer. The interface defined in camera_buffer.h will be implemented by different backends that will be placed in the src/android/mm subdirectory. Provide a first implementation for the 'generic android' backend which matches the existing one. Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2021-03-03  libcamera: pipeline: simple: Enable multiple streams for compatible devices  (Laurent Pinchart)
Allow support for multiple streams on a per-device basis. The decision should be made based on the ability of the converter to run multiple times within the duration of one frame. Hardcode it in SimplePipelineInfo for now. We may later compute the number of supported streams dynamically based on the requested configuration, using converter bandwidth information instead of a hardcoded fixed value. All platforms are currently limited to a single stream until they get successfully tested with multiple streams. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: Support usage of multiple streams  (Laurent Pinchart)
To extend the multi-stream support to runtime operation of the pipeline, expand the converter queue to store multiple output buffers, and update the request queuing and buffer completion handlers accordingly. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: Support configuration of multiple streams  (Laurent Pinchart)
Extend the SimpleCameraConfiguration to support multiple streams, using the multi-stream capability of the SimpleConverter class. Wiring up multi-stream support in the other pipeline handler operations will come in further commits. To keep the code simple, require all streams to use the converter if any stream needs it. It would be possible to generate one stream without conversion (provided the format and size match what the capture device can generate), and this is left as a future optimization. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: Hardcode the number of internal buffers  (Laurent Pinchart)
The number of internal buffers, used between the capture device and the converter, doesn't need to depend on the number of buffers allocated for the output stream of the pipeline. Hardcode it to a fixed value. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: Move converter data to camera data  (Laurent Pinchart)
Converter usage is a per-camera property; move its data to the SimpleCameraData class. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: Add output formats to Configuration  (Laurent Pinchart)
Store the list of converter output formats in the Configuration structure, to be used to implement multi-stream support. As the Configuration structure grows bigger, avoid duplicating it in the formats_ map for each supported pixel format by storing it in a configs_ vector instead, and storing pointers only in the map. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
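The storage change amounts to something like the following sketch (member types simplified):

    /* configs_ owns the Configuration entries... */
    std::vector<Configuration> configs_;
    /* ...and the per-format map only points into it. */
    std::map<PixelFormat, const Configuration *> formats_;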
2021-03-03  libcamera: pipeline: simple: Cache pipeline config in SimpleCameraConfiguration  (Laurent Pinchart)
As the pipeline configuration is selected in SimpleCameraConfiguration::validate() already, cache it in the SimpleCameraConfiguration instead of looking it up in SimplePipelineHandler::configure(). This makes little difference at the moment, but will save duplication of more complex logic between validate() and configure() when adding support for multiple streams. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: Rename Configuration::pixelFormat  (Laurent Pinchart)
The Configuration::pixelFormat field stores the pixel format at the output of the capture part of the pipeline. Rename it to captureFormat, to match the related captureSize field. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: Document the pipeline handler design  (Laurent Pinchart)
The simple pipeline handler has grown over time, and is no longer simple enough to be easily understood by an unfamiliar reader. Document the design to explicitly state the expectations of the pipeline handler, and to explain how it operates. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: Drop unused members of configuration  (Laurent Pinchart)
The SimpleCameraConfiguration class has a sensorFormat_ member variable and a corresponding accessor that are never used. Drop them. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: Store streams in a vector  (Laurent Pinchart)
To prepare for multiple streams support, store the streams in a vector in the SimpleCameraData class. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: converter: Add multi-stream support  (Laurent Pinchart)
While the M2M device backing the converter doesn't support multiple streams natively, it can be run once per stream to produce multiple outputs from the same input, with different output formats and sizes. To support this, create a class to model a stream and move control of the M2M device to the Stream class. The SimpleConverter class then creates stream instances and iterates over them. Each stream needs its own instance of the V4L2M2MDevice, to support different output configurations. The SimpleConverter class retains a device instance to support the query operations. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
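A rough sketch of the resulting converter layout; member names are illustrative and the real class may differ:

    class SimpleConverter
    {
    private:
            /*
             * One Stream per output, each owning its own M2M device so it
             * can hold a distinct output format and size.
             */
            class Stream
            {
            private:
                    std::unique_ptr<V4L2M2MDevice> m2m_;
            };

            std::vector<Stream> streams_;

            /* A separate device instance kept for format/size queries. */
            std::unique_ptr<V4L2M2MDevice> m2m_;
    };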
2021-03-03  libcamera: pipeline: simple: converter: Decouple input and output completion  (Laurent Pinchart)
The SimpleConverter API signals completion of input and output buffer pairs. This unnecessarily delays requeueing the input buffer to the video capture queue until the output buffer completes, and also delays signalling request completion until the input buffer completes. While this shouldn't cause large delays in practice, it will also not scale when multi-stream support is added to the converter class. To address the current issue and prepare for the future, decouple the signalling of input and output buffer completion. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
2021-03-03  libcamera: pipeline: simple: converter: Replace open() with isValid()  (Laurent Pinchart)
Simplify the SimpleConverter interface by opening the M2M device in the constructor. The explicit call to open() is replaced by a check through a new isValid() function, and the unused close() function is removed. Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com> Tested-by: Phi-Bang Nguyen <pnguyen@baylibre.com> Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com> Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>