path: root/src/libcamera/pipeline
2023-01-29  fixup! libcamera: imx8-isi: Break out YUV format selection  [isi/raw_sensor_v2]  (Jacopo Mondi)
2023-01-29  libcamera: imx8-isi: Remove mbusCode from formatsMap_  (Jacopo Mondi)
Now that the media bus code selection procedure does not depend on ISICameraConfiguration::formatsMap_, remove the association between the PixelFormats supported by the ISI and the media bus codes produced by the sensor.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
2023-01-29  libcamera: imx8-isi: Split Bayer/YUV config generation  (Jacopo Mondi)
In generateConfiguration() a YUV/RGB pixel format is preferred for the StillCapture/VideoRecording/Viewfinder roles, but there is currently no guarantee that the sensor provides a non-Bayer bus format from which YUV/RGB can be generated. As a result, the default configuration generated for those roles does not work if the sensor is a RAW-only one. To improve the situation, split the configuration generation in two: one path for YUV modes and one for raw Bayer modes. Stream roles assigned to a YUV mode first try to generate a YUV configuration, and fall back to RAW if that is what the sensor can provide. As an additional requirement, for YUV streams the generated mode has to be validated with the sensor to confirm the desired sizes can be produced. In order to test a format on the sensor, introduce CameraSensor::tryFormat().
Reported-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
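A minimal sketch of how a pipeline handler could use the newly introduced CameraSensor::tryFormat() during validation; the member names (sensor_, mbus_code) and the surrounding validation logic are assumptions for illustration, not the actual imx8-isi code.

    /* Ask the sensor whether it can produce the candidate mode without applying it. */
    V4L2SubdeviceFormat format{};
    format.mbus_code = yuvMbusCode;   /* code selected by getYuvMediaBusFormat() */
    format.size = cfg.size;           /* size requested by the YUV stream */

    if (sensor_->tryFormat(&format) < 0 || format.size != cfg.size)
            return CameraConfiguration::Invalid;   /* the sensor cannot produce this mode */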
2023-01-29  libcamera: imx8-isi: Automatically select media bus code  (Jacopo Mondi)
The ISICameraConfiguration::validate() function selects which media bus format to configure the sensor with based on the pixel format of the first configured stream, using the media bus code associated with it in the formatsMap_ map. In order to remove the PixelFormat-to-mbus-code association in formatsMap_, provide a wrapper function for the newly introduced getRawMediaBusFormat() and getYuvMediaBusFormat() that automatically selects which media bus format to use based on the first stream's pixel format.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
2023-01-29  libcamera: imx8-isi: Break out YUV format selection  (Jacopo Mondi)
As with the RAW format selection, the media bus format selection procedure relies on the direct association of PixelFormat and media bus code in the formatsMap_ map. As the ISI can generate YUV and RGB formats from any non-Bayer media bus format, break out the YUV/RGB media bus format selection to a separate function. The newly introduced getYuvMediaBusFormat() tests a list of known-supported media bus formats against the list of media bus formats supported by the sensor, preferring media bus codes with the same encoding as the requested PixelFormat. Use the newly introduced function in ISICameraConfiguration::validateYuv() to make sure the sensor can produce a YUV/RGB media bus format.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
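A standalone sketch of the selection strategy described above: intersect a preference-ordered list of media bus codes with the codes the sensor advertises, trying same-encoding codes first. The code lists and the helper name are hypothetical; the real getYuvMediaBusFormat() lives in the imx8-isi pipeline handler.

    #include <algorithm>
    #include <cstdint>
    #include <vector>

    /* Pick the first preferred code the sensor supports, same-encoding codes first. */
    uint32_t selectYuvMbusCode(const std::vector<uint32_t> &sensorCodes,
                               const std::vector<uint32_t> &sameEncoding,
                               const std::vector<uint32_t> &otherEncodings)
    {
            for (const auto *prefs : { &sameEncoding, &otherEncodings }) {
                    for (uint32_t code : *prefs) {
                            if (std::find(sensorCodes.begin(), sensorCodes.end(),
                                          code) != sensorCodes.end())
                                    return code;
                    }
            }

            return 0; /* no usable non-Bayer code: the caller falls back to RAW or fails */
    }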
2023-01-21  libcamera: imx8-isi: Break-out RAW format selection  (Jacopo Mondi)
The current implementation of the ISI pipeline handler handles the translation of PixelFormat to sensor media bus formats through a centralized map. As the criteria to select the correct media bus code depend on whether the output PixelFormat is a RAW Bayer format or not, start by splitting out the RAW media bus code selection procedure, adding a method for that purpose to the ISICameraData class. Add the function to ISICameraData and not to ISICameraConfiguration because:
- The sensor is a property of CameraData
- The same function will be re-used by the ISIPipelineHandler during CameraConfiguration generation.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
2023-01-16  libcamera: rkisp1: Re-sort includes  (Jacopo Mondi)
Comply with a checkstyle suggestion and separate inclusion directives.
Signed-off-by: Jacopo Mondi <jacopo.mondi@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-12-24  libcamera: Simple typo fixes  (Pavel Machek)
Fix various typos in the code base.
Signed-off-by: Pavel Machek <pavel@ucw.cz>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-12-14  libcamera: pipeline: simple: converter: Use generic converter interface  (Xavier Roumegue)
Move the simple converter implementation to a generic V4L2 M2M class derived from the converter interface. The latter could be used by other pipeline implementations and as a base class for customized V4L2 M2M converters.
Signed-off-by: Xavier Roumegue <xavier.roumegue@oss.nxp.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2022-11-29  pipeline: ipa: raspberrypi: Use IPA cookies  (Naushir Patuck)
Pass an IPA cookie from the pipeline handler to the IPA and eventually back to the pipeline handler through the setDelayedControls signal. This cookie is used to index the RPiController::Metadata object to be used for the frame. The IPA cookie is then returned from DelayedControls when the frame with the applied controls has been returned from the sensor, and eventually passed back to the IPA from the signalIspPrepare signal.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-11-29  pipeline: raspberrypi: delayed_controls: Add user cookie to DelayedControls  (Naushir Patuck)
Allow the caller to provide a cookie value to DelayedControls::reset and DelayedControls::push. This cookie value is returned from DelayedControls::get for the frame that has the control values applied to it. The cookie value is useful in tracking when a set of controls has been applied to a frame. In a subsequent commit, it will be used by the Raspberry Pi IPA to track the IPA context used when setting the control values.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
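A hedged sketch of the cookie round trip described above; the method signatures shown are assumptions based on this description, not the real RPi::DelayedControls header.

    /* When queuing sensor controls for a future frame, attach the IPA context id
     * (hypothetical signature: push(const ControlList &, unsigned int cookie)). */
    delayedCtrls_->push(sensorControls, ipaContextCookie);

    /* When the frame arrives, recover which context those controls belonged to
     * (hypothetical signature: get(sequence) returning { controls, cookie }). */
    auto [controls, cookie] = delayedCtrls_->get(buffer->metadata().sequence);
    /* 'cookie' now identifies the IPA context / RPiController::Metadata for this frame. */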
2022-11-29  pipeline: raspberrypi: delayed_controls: Template the ControlRingBuffer class  (Naushir Patuck)
Convert ControlRingBuffer to a templated class to allow arbitrary ring buffer array types to be defined. Rename ControlRingBuffer to RingBuffer to indicate this.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
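A standalone sketch of the kind of generalisation described above: a fixed-size ring indexed by frame number, templated on the element type instead of being hard-coded to control values. The class below is illustrative only and does not mirror the real Raspberry Pi implementation.

    #include <array>
    #include <cstddef>

    /* Fixed-size ring buffer indexed by an ever-increasing frame number. */
    template<typename T, std::size_t N>
    class RingBuffer
    {
    public:
            T &operator[](unsigned int frame) { return data_[frame % N]; }
            const T &operator[](unsigned int frame) const { return data_[frame % N]; }

    private:
            std::array<T, N> data_{};
    };

    /* Usage: one ring per payload type, e.g. a ring of user cookies. */
    RingBuffer<unsigned int, 16> cookies;   /* cookies[sequence] = ipaContextCookie; */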
2022-11-29  pipeline: raspberrypi: Switch to RPi::DelayedControls  (Naushir Patuck)
Switch the Raspberry Pi pipeline handler to use the DelayedControls implementation in the RPi:: namespace. This will allow us to use Raspberry Pi specific API changes in future commits.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-11-29  pipeline: raspberrypi: Fork DelayedControls  (Naushir Patuck)
Fork the libcamera::DelayedControls implementation in the RPi:: namespace, with the intention of updating the API for Raspberry Pi specific features.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-11-25  pipeline: rkisp1: Support raw Bayer capture configuration  (Florian Sylvestre)
Implement support for raw Bayer capture during configuration generation, validation and camera configuration. While at it, fix a typo in a comment.
Signed-off-by: Florian Sylvestre <fsylvestre@baylibre.com>
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2022-11-25  pipeline: rkisp1: Support raw Bayer capture at runtime  (Florian Sylvestre)
Implement support for raw Bayer capture at runtime, from start() to stop(). Support for raw formats in the camera configuration is split out to a subsequent change to ease review. In raw mode, the ISP is bypassed. There is no need to provide parameter buffers, and the ISP will not generate statistics. This requires multiple changes in the buffer handling:
- The params and stats buffers don't need to be allocated, and the corresponding video nodes don't need to be started or stopped.
- The IPA module fillParamsBuffer() operation must not be called in queueRequestDevice(). As a result, the IPA module doesn't emit the paramsBufferReady signal, so the main and self path video buffers must be queued directly in queueRequestDevice().
- The tryCompleteRequest() function must not wait until the params buffer has been dequeued.
- When the frame buffer has been captured, the IPA module processStatsBuffer() operation must be called directly to fill the request metadata.
Signed-off-by: Florian Sylvestre <fsylvestre@baylibre.com>
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Tested-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2022-11-25  pipeline: rkisp1: Fix stream size validation  (Laurent Pinchart)
Unlike RkISP1Path::generateConfiguration(), the validate() function doesn't take the camera sensor resolution into account but only considers the absolute minimum and maximum sizes supported by the ISP to validate the stream size. Fix it by using the same logic as when generating the configuration. Instead of passing the sensor resolution to the validate() function, pass the CameraSensor pointer to prepare for subsequent changes that will require access to more camera sensor data. While at it, update the generateConfiguration() function similarly for the same reason.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2022-11-25  pipeline: rkisp1: Query the driver for formats  (Paul Elder)
Query the driver for the output formats and sizes that it supports, instead of hardcoding them. This future-proofs against formats that are supported by some but not all versions of the driver. As the rkisp1 driver currently does not support VIDIOC_ENUM_FRAMESIZES, fall back to the hardcoded list of supported formats and frame sizes. This feature will be added to the driver in parallel, though we cannot guarantee that users will have a new enough kernel for it.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2022-11-25  libcamera: stream: Turn StreamRole into scoped enumeration  (Laurent Pinchart)
The StreamRole enum has enumerators such as 'Raw' that are too generic to be in the global libcamera namespace. Turn it into a scoped enum to avoid namespace clashes, and update users accordingly.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
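For illustration, the change amounts to switching from a plain enum to an enum class, so callers must qualify the enumerators; a minimal sketch (the enumerator list reflects the roles mentioned elsewhere in this log):

    /* StreamRole is now a scoped enumeration: the enumerators no longer leak
     * into the libcamera namespace as bare names such as 'Raw'. */
    enum class StreamRole {
            Raw,
            StillCapture,
            VideoRecording,
            Viewfinder,
    };

    /* Callers now qualify the role, e.g.: */
    std::unique_ptr<CameraConfiguration> config =
            camera->generateConfiguration({ StreamRole::Viewfinder });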
2022-11-23  ipa: rkisp1: add FrameDurationLimits control  (Nicholas Roth)
Currently, the Android HAL does not work on rkisp1-based devices because required FrameDurationLimits metadata is missing from the IPA implementation. This change sets FrameDurationLimits for rkisp1 based on the existing ipu3 implementation, using the sensor's reported range of vertical blanking intervals with the minimum reported horizontal blanking interval.
Signed-off-by: Nicholas Roth <nicholas@rothemail.net>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
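As a reminder of the arithmetic involved (a sketch of the general sensor-timing formula, not the literal rkisp1 IPA code): line length is the output width plus horizontal blanking, frame length is the output height plus vertical blanking, and the frame duration is their product divided by the pixel rate.

    #include <cstdint>

    /* Frame duration in microseconds from sensor geometry and blanking values. */
    uint64_t frameDurationUs(uint32_t width, uint32_t hblank,
                             uint32_t height, uint32_t vblank,
                             uint64_t pixelRateHz)
    {
            uint64_t lineLength = width + hblank;   /* pixels per line */
            uint64_t frameLines = height + vblank;  /* lines per frame */
            return lineLength * frameLines * 1000000ULL / pixelRateHz;
    }

    /* The minimum duration uses the minimum vblank (and the minimum hblank, as
     * described above); the maximum duration uses the maximum vblank. */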
2022-11-23  ipa: rkisp1: Fail hard on empty CameraSensorInfo  (Jacopo Mondi)
The RkISP1 pipeline and IPA module allow the CameraSensorInfo to be empty, probably to accommodate a sensor used in a test platform that does not provide the mandatory libcamera requirements. As the \todo item in the IPA reports, the received CameraSensorInfo might be empty and should be checked before being accessed, but currently no such check is enforced in the code. It is now safe to assume that all the test platforms in use have successfully moved their sensor drivers to comply with the minimum requirements and provide a populated CameraSensorInfo to the IPA. As the safety check is not enforced, and as we don't want to allow faulty sensors to send an empty CameraSensorInfo to the IPA, remove the \todo item in the IPA and fail hard in the pipeline handler if the sensor does not comply with libcamera requirements.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-11-23  ipa: rkisp1: Use IPAConfig in IPA::configure()  (Jacopo Mondi)
The RkISP1 implementation of IPA::configure() still uses the legacy interface where sensor controls (and eventually lens controls) are passed from the pipeline handler to the IPA in a map. Since the introduction of the mojom-based IPA interface definition, it is possible to define custom data types and use them in the interface definition between the pipeline handler and the IPA. Align the RkISP1 IPA::configure() implementation with the one in the IPU3 IPA module by using a custom data type instead of relying on a map to pass controls to the IPA.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-11-23  pipeline: raspberrypi: Remove enum BufferMask from the mojom interface  (Naushir Patuck)
The BufferMask enum provides a way of identifying which stream a frame buffer belongs to. This enum is defined in the raspberrypi.mojom interface file. However, the IPA does not need these enum definitions to mmap the buffers that it uses. Move this enum out of the raspberrypi.mojom interface file and put it into the RPi namespace, visible only to the pipeline handler. This removes the need to include the auto-generated IPA interface header in the RPi::Stream definition.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-11-21  pipeline: imx8-isi: Set SensorTimestamp metadata  (Laurent Pinchart)
Report the sensor timestamp in metadata. Use the timestamp from the first buffer. Accuracy could be improved by using the frame start event from the CSI-2 receiver, but the kernel driver doesn't support it yet.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
2022-11-21  libcamera: imx8-isi: Enumerate supported stream formats  (Jacopo Mondi)
Add to the formats map all the supported ISI video capture stream formats. This makes it possible to populate the list of stream formats for all the non-RAW use cases, as the ISI can perform colorspace conversion between YUV and RGB.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-11-17  libcamera: pipeline: Add IMX8 ISI pipeline  (Jacopo Mondi)
Add a pipeline handler for the ISI capture interface found on several versions of the i.MX8 SoC generation. The pipeline handler supports capturing multiple streams from the same camera in YUV, RGB or RAW formats. The number of streams is limited by the number of ISI pipelines, and is currently hardcoded to 2 as the code has been tested on the i.MX8MP only. Further development will make this dynamic to support other SoCs.
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
2022-11-14  libcamera: pipeline: raspberrypi: Free buffers when a camera is released  (David Plowman)
Implement the PipelineHandlerRPi::releaseDevice method, which allows us to free any allocated buffers when a camera is released.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2022-11-14  Revert "pipeline: raspberrypi: Do not unconditionally free buffers on close"  (David Plowman)
This reverts commit 30d704732badc675f72fe73d14749669cb645c23. It turns out that this commit causes some regressions and is in fact unnecessary, because the related commit "libcamera: v4l2_videodevice: Guard against releasing unallocated buffers" (a2bdff6d0b67475492ac7cf9318866b6d89a28fd) fixes the problem completely (if the buffers were never allocated, the video device avoids trying to free them even if the pipeline handler asks). The reason for the regressions is that in this new (broken) scheme we would never call clearBuffers() on all the streams if the internal buffers were never allocated (i.e. buffersAllocated_ is never set). This causes the stream's bufferMap_ list to get longer and longer if there are multiple back-to-back calls to configure, and dev_->importBuffers() will ultimately fail. So either we need to think more carefully about how to stop the pipeline handler from freeing buffers that it doesn't own, or we just leave it as is, since the other commit resolves the problem on its own. In the interim, simply reverting this commit certainly seems like the best solution.
Fixes: 30d704732bad ("pipeline: raspberrypi: Do not unconditionally free buffers on close")
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2022-11-09  pipeline: rkisp1: Pass info pointer to tryCompleteRequest()  (Laurent Pinchart)
The tryCompleteRequest() function looks up the RkISP1FrameInfo that all but one of its callers already look up. Remove the double look-up by passing the info pointer to tryCompleteRequest().
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
2022-11-09  libcamera: pipeline_handler: Return unique_ptr from generateConfiguration()  (Laurent Pinchart)
The PipelineHandler::generateConfiguration() function allocates a CameraConfiguration instance and returns it. The ownership of the instance is transferred to the caller. This is a perfect match for a std::unique_ptr<>, which the Camera::generateConfiguration() function already returns. Update PipelineHandler::generateConfiguration() to match it. This fixes a memory leak in one of the error return paths in the IPU3 pipeline handler. While at it, update the Camera::generateConfiguration() function documentation to drop the sentence that describes the ownership transfer, as that is implied by the usage of std::unique_ptr<>.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
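A hedged sketch of the signature change this implies for pipeline handlers; the parameter list is abbreviated and approximate, the point is the switch of the return type to std::unique_ptr<> so that ownership transfer is expressed in the type system and early error returns cannot leak.

    /* Before: raw pointer, ownership transfer only documented in prose. */
    CameraConfiguration *generateConfiguration(Camera *camera,
                                               const StreamRoles &roles);

    /* After: ownership transfer is explicit and leak-proof on error paths. */
    std::unique_ptr<CameraConfiguration> generateConfiguration(Camera *camera,
                                                               const StreamRoles &roles);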
2022-10-28  pipeline: raspberrypi: Do not unconditionally free buffers on close  (Naushir Patuck)
When a camera is terminated, do not unconditionally free buffers in the RPiCameraData destructor. Otherwise, harmless error log messages are displayed if no buffers have previously been allocated.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2022-10-18  ipa: vimc: Add Flags to parameters  (Paul Elder)
For the purpose of testing serializing/deserializing Flags in function parameters, add an enum class TestFlags and Flags<TestFlags> to some function parameters, both for input and output, as well as to Signals. While at it, update the ipa_interface_test.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-10-18  ipa: vimc: Add IPAOperationCode to init() parameter list  (Paul Elder)
For the purpose of testing serializing/deserializing enums in function parameters, add IPAOperationCode to the parameter list of init().
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-10-18  pipeline: ipa: raspberrypi: Add HBLANK control to DelayedControls  (Naushir Patuck)
Update CamHelper::getDelays() to return the sensor HBLANK delay. The HBLANK delay is set to the same value as the VBLANK delay for all sensors in the Raspberry Pi IPA. Return the HBLANK gain delay from the IPA to the pipeline handler, and initialise DelayedControls to handle V4L2_CID_HBLANK with this delay value. As a drive-by, check that the V4L2_CID_HBLANK control is available when calling IPARPi::configure().
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: Dave Stevenson <dave.stevenson@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-10-10  pipeline: rkisp1: Set bytesused before queuing parameters buffer  (Laurent Pinchart)
The bytesused value for the parameters buffer is initialized to 0 and never set. The V4L2 API specification indicates that, for an output video device, the driver will set the bytesused value to the size of the plane in that case. The videobuf2 framework does so, but considers this as deprecated and prints a warning:
    [   54.375534] use of bytesused == 0 is deprecated and will be removed in the future,
    [   54.388026] use the actual size instead.
Fix it by setting bytesused to the correct value before queuing the parameters buffer.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Tested-by: Naushir Patuck <naush@raspberrypi.com>
2022-10-10  pipeline: ipu3: Set bytesused before queuing parameters buffer  (Laurent Pinchart)
The bytesused value for the parameters buffer is initialized to 0 and never set. The V4L2 API specification indicates that, for an output video device, the driver will set the bytesused value to the size of the plane in that case. The videobuf2 framework does so, but considers this as deprecated and prints a warning:
    [   54.375534] use of bytesused == 0 is deprecated and will be removed in the future,
    [   54.388026] use the actual size instead.
Fix it by setting bytesused to the correct value before queuing the parameters buffer.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Tested-by: Naushir Patuck <naush@raspberrypi.com>
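A hedged illustration of the general V4L2 rule both commits apply, written against the plain V4L2 UAPI rather than libcamera's buffer classes (the buffer type and memory mode are placeholders): an output-direction buffer should carry the number of bytes the application actually filled in, instead of relying on the deprecated bytesused == 0 behaviour.

    #include <cstring>
    #include <linux/videodev2.h>
    #include <sys/ioctl.h>

    /* Queue an output (application-to-driver) buffer with an explicit payload size. */
    int queueParamsBuffer(int fd, unsigned int index, __u32 payloadSize)
    {
            struct v4l2_buffer buf;
            memset(&buf, 0, sizeof(buf));
            buf.type = V4L2_BUF_TYPE_META_OUTPUT;  /* placeholder buffer type */
            buf.memory = V4L2_MEMORY_MMAP;         /* placeholder memory mode */
            buf.index = index;
            buf.bytesused = payloadSize;           /* avoid the deprecated bytesused == 0 path */

            return ioctl(fd, VIDIOC_QBUF, &buf);
    }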
2022-10-05  pipeline: raspberrypi: Update naming convention for tuning files  (Naushir Patuck)
Append "_mono" to the sensor name when generating the tuning filename for monochrome sensor variants. The new naming convention is as follows:
<sensor_name>.json - Standard colour sensor variant
<sensor_name>_mono.json - Monochrome sensor variant
Rename the existing imx296.json file to imx296_mono.json, as this tuning file is based on the monochrome variant of the IMX296.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-10-04  pipeline: raspberrypi: Detect monochrome "R" formats as being raw  (David Plowman)
The "R" pixel formats (R8, R10, R10_CSI2P etc.) record the associated colour space as being YUV rather than RAW, meaning that the code was not detecting them as being raw formats. In the case of Raspberry Pi, we deal only with raw formats, so the revised test must work correctly for both these and the standard Bayer formats.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-09-23  pipeline: raspberrypi: Improve Unicam timeout handling  (Naushir Patuck)
Currently, if a Unicam timeout is signalled, the pipeline handler only raises an error message. Update the error handling to put the pipeline handler in an internal error state, disable all device streams, and return all outstanding requests as cancelled. Any subsequent requests that come into the pipeline handler will also be returned as cancelled. Any further error handling (e.g. a reset with camera stop()/start()) is up to the application to perform as it requires.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2022-09-23  pipeline: raspberrypi: Add an error state  (Naushir Patuck)
Add an error state used internally in the Raspberry Pi pipeline handler. Currently this state is never set, but it will be in a subsequent commit when a device timeout has been signalled. Add an isRunning() helper to identify if the state machine is in a stopped/error state.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
2022-09-07  pipeline: uvcvideo: Fail match() if the camera has no supported format  (Laurent Pinchart)
A UVC device could expose only formats that are not supported by libcamera. The pipeline handler doesn't currently consider this as an error, and happily creates a camera. The camera won't be usable, and worse, generateConfiguration() and validate() will crash as those functions assume that at least one format is supported. Fix this by failing match() if none of the formats exposed by the camera are supported. Log an error message in that case to notify the user.
Bug: https://bugs.libcamera.org/show_bug.cgi?id=145
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: Christian Rauch <Rauch.Christian@gmx.de>
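A hedged sketch of the shape of such a guard in the pipeline handler's match() path; the member names (formats_, id_) and the log category are assumptions for illustration, not copied from the uvcvideo source.

    /* Refuse to register a camera that exposes no format libcamera can use. */
    if (data->formats_.empty()) {
            LOG(UVC, Error)
                    << "Camera " << data->id_
                    << " has no supported format, not registering it";
            return false;   /* match() fails, so no Camera instance is created */
    }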
2022-09-07  pipeline: uvcvideo: Cache supported formats in UVCCameraData  (Laurent Pinchart)
Populate and cache the list of supported formats in UVCCameraData::init(), to avoid repeating the operation every time generateConfiguration() is called. Combine this with the search for the largest size advertised by the camera to avoid iterating over the formats twice in init().
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: Christian Rauch <Rauch.Christian@gmx.de>
2022-09-07  pipeline: uvcvideo: Move camera ID generation to UVCCameraData class  (Laurent Pinchart)
Move the camera ID generation to UVCCameraData, and cache the ID in that class. This will be useful to access the ID in multiple locations, such as when printing error messages.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: Christian Rauch <Rauch.Christian@gmx.de>
2022-09-01  pipeline: uvcvideo: Add color space support  (Laurent Pinchart)
Add support for color space to the uvcvideo pipeline handler. UVC devices have a fixed color space per format, so only the validate() function needs to be extended to retrieve the color space from the kernel. There is no need to pass the value back to the driver in configure().
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
2022-08-30  libcamera: pipeline: rkisp1: Implement color space support  (Laurent Pinchart)
Implement color space support in the rkisp1 pipeline handler, in the configuration generation, configuration validation and camera configuration. As all the processing related to the color space is performed in the part of the pipeline shared by all streams, a single color space must cover all stream configurations. This is enforced manually when generating the configuration, and with the validateColorSpaces() helper when validating it. Only the Y'CbCr encoding and quantization range are currently taken into account, and they are programmed through the V4L2 color space API. The primary colors chromaticities and the transfer function need to be configured in the ISP parameters buffer, and thus conveyed to the IPA, but the rkisp1 driver does not currently support tone mapping, so this will be done later.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Florian Sylvestre <fsylvestre@baylibre.com>
2022-08-25  libcamera: color_space: Rename Jpeg to Sycc  (Laurent Pinchart)
The JPEG color space is badly named, as the JPEG specification (ITU-T T.81) doesn't define any particular color space:
    The interchange format does not specify a complete coded image representation. Application-dependent information, e.g. colour space, is outside the scope of this Specification.
The JFIF specification (ITU-T T.871) is clearer, as it requires ITU-R BT.601 YCbCr encoding and a full quantization range:
    The interpretations of Y, CB, and CR are derived from the E'Y, E'Cb, and E'Cr signals defined in the 625-line specification of Rec. ITU-R BT.601, but these signals are normalized so as to permit the usage of the full range of 256 levels of the 8-bit binary encoding of the Y component.
It however doesn't specify color primaries or a transfer function explicitly. It only mentions the latter when describing the conversion from YCbCr to RGB:
    The inverse relationship for computing full scale 8-bit per colour channel gamma pre-corrected RGB values (following Rec. ITU-R BT.601 gamma pre-correction and colour primary specifications) from YCbCr colours (with 256 levels per component) can be computed as follows: [...]
Given that ITU-R BT.601-5 (1995) didn't specify color primaries or a transfer function, and that the later ITU-R BT.601-7 (2011) version specifies color primaries for the 625-line variant that do not match sRGB, the JPEG color space in libcamera is badly named. This is confirmed by ITU-T T.871:
    As this Recommendation | International Standard is based on the prior informally-circulated JFIF version 1.02 specification that was produced in 1992, which referenced Rec. ITU-R BT.601 (formerly CCIR 601), it references that specification for definition of the E'Y, E'Cb, and E'Cr signals that correspond to the YCBCR values specified herein. However, since the development of the prior JFIF version 1.02 specification, additional industry specifications have been developed, Rec. ITU-R BT.601 has been updated, and common industry practice has emerged which often follows the sYCC specification in IEC 61966-2-1/Amd.1. The difference between the use of the colour interpretation specification in this Recommendation | International Standard and that of the sYCC specification may be considered negligible in practice.
Rename the color space to sYCC, as its definition matches the sYCC standard, and indicate that it is typically used to encode JPEG images.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Umang Jain <umang.jain@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
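For reference, a sketch of what the renamed constant plausibly looks like; the member values are stated here as an assumption consistent with the sYCC description above, not copied from color_space.cpp.

    /* sYCC: sRGB primaries and transfer function, BT.601 Y'CbCr encoding, full
     * quantization range. Typically used to encode JPEG images. */
    const ColorSpace ColorSpace::Sycc = {
            ColorSpace::Primaries::Rec709,
            ColorSpace::TransferFunction::Srgb,
            ColorSpace::YcbcrEncoding::Rec601,
            ColorSpace::Range::Full,
    };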
2022-08-25  libcamera: ipu3: Use std::max() instead of expandTo() to get the max resolution  (Han-Lin Chen via libcamera-devel)
Using Size::expandTo() to find the max resolution might generate a non-existent resolution. For example, when an application requests streams of 1920x1080 and 1600x1200, the max resolution will wrongly be 1920x1200, and the configuration fails.
Bug: https://bugs.libcamera.org/show_bug.cgi?id=139
Signed-off-by: Han-Lin Chen <hanlinchen@chromium.org>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
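A small standalone illustration of the difference, using a simplified Size type (libcamera's own Size orders sizes differently in detail): expanding to a bounding box can produce a resolution neither stream asked for, while std::max() picks one of the requested sizes.

    #include <algorithm>
    #include <cassert>

    struct Size {
            unsigned int width, height;
            /* Bounding box of two sizes, as an expandTo()-style operation does. */
            Size expandedTo(const Size &o) const
            {
                    return { std::max(width, o.width), std::max(height, o.height) };
            }
            /* Simplified ordering by area, so std::max() picks the larger request. */
            bool operator<(const Size &o) const
            {
                    return width * height < o.width * o.height;
            }
    };

    int main()
    {
            Size a{ 1920, 1080 }, b{ 1600, 1200 };

            Size bbox = a.expandedTo(b);   /* 1920x1200: a mode nobody requested */
            Size pick = std::max(a, b);    /* 1920x1080: one of the requested modes */

            assert(bbox.width == 1920 && bbox.height == 1200);
            assert(pick.width == 1920 && pick.height == 1080);
            return 0;
    }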
2022-08-25  libcamera: Use const reference for range loops  (Christian Rauch via libcamera-devel)
Use a const reference in range-based for loops to avoid copies of the loop elements. While at it, change the loop over controls in PipelineHandlerUVC::processControls to use structured bindings.
Signed-off-by: Christian Rauch <Rauch.Christian@gmx.de>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
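A self-contained illustration of both idioms mentioned above, using a std::map standing in for a control list.

    #include <iostream>
    #include <map>
    #include <string>

    int main()
    {
            std::map<unsigned int, std::string> controls = {
                    { 1, "Brightness" },
                    { 2, "Contrast" },
            };

            /* Const reference: no per-iteration copy of the pair. */
            for (const auto &ctrl : controls)
                    std::cout << ctrl.first << ": " << ctrl.second << '\n';

            /* Structured bindings: name the key and value directly. */
            for (const auto &[id, value] : controls)
                    std::cout << id << ": " << value << '\n';

            return 0;
    }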
2022-08-21  libcamera: pipeline: rkisp1: Remove unused assignment  (Christian Rauch)
The variable 'ret' is assigned but not read until after the next assignment.
Signed-off-by: Christian Rauch <Rauch.Christian@gmx.de>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
2022-08-09  pipeline: ipu3: Support IPA tuning file  (Laurent Pinchart)
Pass the path name of the YAML IPA tuning file to the IPA module. The file name is derived from the sensor name ("${sensor_name}.yaml"), with a fallback to "uncalibrated.yaml".
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
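A standalone sketch of the file name derivation with fallback; the directory handling and helper name are hypothetical, only the "${sensor_name}.yaml" / "uncalibrated.yaml" scheme comes from the commit above.

    #include <filesystem>
    #include <string>

    /* Pick the tuning file for a sensor, falling back to the uncalibrated set. */
    std::string tuningFilePath(const std::filesystem::path &dir,
                               const std::string &sensorName)
    {
            std::filesystem::path file = dir / (sensorName + ".yaml");
            if (!std::filesystem::exists(file))
                    file = dir / "uncalibrated.yaml";

            return file.string();
    }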