Age | Commit message | Author |
|
Set the sensor model property from the model reported in the media
graph.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
UVCCameraData::init() is the only consumer of the default entry in the
media graph. Move the lookup into the init() function and pass it the
MediaDevice. This is done in preparation for extending the CameraData
initialization to consume more information from the MediaDevice.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Set the sensor model property.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add method that removes non-ASCII characters from a string.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Umang Jain <email@uajain.com>
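A minimal sketch of such a helper, assuming a simple byte-wise filter (the
function name is illustrative, not necessarily the one added by this commit):
  #include <string>

  /* Return a copy of the input with all non-ASCII characters removed. */
  std::string removeNonAscii(const std::string &str)
  {
      std::string ret;
      ret.reserve(str.size());

      for (unsigned char c : str) {
          if (c < 128)
              ret.push_back(static_cast<char>(c));
      }

      return ret;
  }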
|
|
The model name must, to the extent possible, describe the sensor. For
most devices this is the model name of the sensor. For some devices the
sensor model is unavailable, as the sensor or the entire camera is part
of a larger unit and exposed as a black box to the system. In such cases
the model name of the smallest component closest to the sensor must be
used.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The exif object sets the byte ordering on construction, and then
re-states the byte ordering during later calls when setting values.
It could be argued that this ordering should already be known to the exif
library and is redundant, but even so we must provide it.
Ensure we are consistent in always using the same byte ordering by storing
it in a private class member and re-using that single value.
Reviewed-by: Umang Jain <email@uajain.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
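A rough sketch of the idea against the libexif API (the wrapper class and
member names are illustrative, and allocation/error handling is omitted; this
is not the actual implementation):
  #include <libexif/exif-data.h>
  #include <libexif/exif-utils.h>

  class Exif
  {
  public:
      Exif()
          : order_(EXIF_BYTE_ORDER_INTEL)
      {
          data_ = exif_data_new();
          /* State the byte ordering once, at construction time. */
          exif_data_set_byte_order(data_, order_);
      }

      ~Exif() { exif_data_unref(data_); }

      void setShort(ExifEntry *entry, ExifShort value)
      {
          /* Re-use the stored byte order instead of restating it. */
          exif_set_short(entry->data, order_, value);
      }

  private:
      ExifByteOrder order_;
      ExifData *data_;
  };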
|
|
The self path supports XRGB8888, list it as a supported format.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add support for XRGB8888 and XBGR8888 formats.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Both the main and self path support R8, list it as a supported format.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The self path supports RGB565, list it as a supported format.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The upstream driver has changed how the link formats are validated when
starting to stream [1]. This revealed that libcamera did not adjust the
media bus format on the link between the {main,self} resizer source pad
and the capture video device as expected by the driver.
The media bus code was hardcoded to MEDIA_BUS_FMT_YUYV8_2X8 for all
pixel formats, while it must be adjusted to YUYV8_1_5X8 for NV12 and
NV21; fix this.
1. 6803a9e0e1e43e9e ("media: staging: rkisp1: cap: simplify link validation by comparing media bus code")
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
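Conceptually the fix selects the media bus code from the pixel format instead
of hardcoding it; a hedged sketch (the helper name and exact call site are
illustrative):
  #include <linux/media-bus-format.h>

  #include <libcamera/formats.h>

  using namespace libcamera;

  /*
   * Pick the resizer source media bus code from the capture pixel format:
   * NV12/NV21 need the YUV 4:2:0 bus format, everything else keeps 4:2:2.
   */
  unsigned int mediaBusCodeFor(const PixelFormat &pixelFormat)
  {
      if (pixelFormat == formats::NV12 || pixelFormat == formats::NV21)
          return MEDIA_BUS_FMT_YUYV8_1_5X8;

      return MEDIA_BUS_FMT_YUYV8_2X8;
  }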
|
|
The upstream driver has dropped support for YVYU and VYUY [1], remove
support from the pipeline handler.
1. 3acb3e06baf64e28 ("media: staging: rkisp1: cap: remove unsupported formats")
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Instead of manually tracking whether a path is enabled, use the media
graph link status. There is no functional change.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
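The idea, sketched against libcamera's MediaLink API (the helper name and the
internal header path are assumptions):
  #include <linux/media.h>

  #include "libcamera/internal/media_object.h"

  using namespace libcamera;

  /* The media graph already records whether a link is enabled. */
  bool isPathEnabled(const MediaLink *link)
  {
      return link->flags() & MEDIA_LNK_FL_ENABLED;
  }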
|
|
Move the path link handling to RkISP1Path. There is no functional
change.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Move the start and stop of a path to RkISP1Path. This allows the
importing of buffers to be moved closer to the path start/stop,
simplifying the code. Also, by adding a simple running tracker, the
error logic in PipelineHandlerRkISP1 can be simplified as stop() can
always be called.
This also removes all external users of RkISP1Path::video_ so it can be
made private.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
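A much-simplified sketch of what such start()/stop() wrappers with a running
flag could look like (not the actual RkISP1Path implementation; the buffer
count and error handling are illustrative):
  #include <cerrno>

  #include "libcamera/internal/v4l2_videodevice.h"

  using namespace libcamera;

  class RkISP1Path
  {
  public:
      int start()
      {
          if (running_)
              return -EBUSY;

          /* Import buffers right where the path is started. */
          int ret = video_->importBuffers(bufferCount_);
          if (ret)
              return ret;

          ret = video_->streamOn();
          if (ret) {
              video_->releaseBuffers();
              return ret;
          }

          running_ = true;
          return 0;
      }

      void stop()
      {
          /* The running flag makes stop() safe to call unconditionally. */
          if (!running_)
              return;

          video_->streamOff();
          video_->releaseBuffers();
          running_ = false;
      }

  private:
      V4L2VideoDevice *video_ = nullptr;
      unsigned int bufferCount_ = 4;
      bool running_ = false;
  };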
|
|
As a step towards making RkISP1Path::video_ private, add simple
wrappers for buffer handling. There is no functional change.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Move the path configuration generation and validation to RkISP1Path.
This is done to increase code reuse and to encapsulate the main and self
path differences inside the RkISP1Path class. There is no functional
change.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Move the path configuration to RkISP1Path to increase code reuse and
make the V4L2 subdevice resizer private to the path. There is no
functional change.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The self and main paths are very similar, and the introduction of support
for two simultaneous streams has made it clear that their handling could
be abstracted into a separate class.
This is the first step towards creating such a class by breaking out the
initialization and storage of the video and subdevices. There is no
functional change in this patch.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
With the buffer copy removed from all pipelines for raw capture,
rename StillCaptureRaw to Raw to better describe the role.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
|
|
Be more precise in commenting why the ImgU shall not be configured
if only the RAW stream is requested.
As an example, if the ImgU gets unnecessarily configured:
cam -srole=viewfinder -c2 -C -> WORKS
cam -srole=stillraw -c2 -C -> WORKS
cam -srole=viewfinder -c2 -C -> Failed to queue buffer 0: Invalid argument
Since commit ("libcamera: ipu3: Fix RAW+YUV capture") the ImgU
configuration procedure also correctly implements the assumption that at
least one of the YUV outputs is being operated. If that's not the case,
as in the RAW-only capture use case, the code tries to access a
non-existent configuration. One more reason to exit early if no YUV
stream is requested.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Since the re-implementation of the IPU3 pipeline handler
configuration procedure, the main output is always assigned whenever
any YUV stream is requested.
Remove a dead code block that checks whether the main output is
valid.
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
When requesting one RAW stream and one YUV stream the
StreamConfiguration assigned to the RAW stream is the first one
added to the CameraConfiguration, while the YUV stream gets assigned to
the main output.
At configure() time the viewfinder output needs to be configured with
the same format as the main output, but since the introduction of RAW
capture support, the pipeline has not been updated and still assumes
the main output configuration is the first one in the
CameraConfiguration. This causes the viewfinder to be configured
with the same format as the raw stream, breaking capture operations.
Before this commit the following command fails and the ImgU does not
produce frames:
cam -srole=stillraw -srole=viewfinder -c2 -C
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Change variable names to camel case to be consistent with the rest of
the source files. Remove #define consts and replace with constexpr.
Add some newlines to make the code more readable.
There are no functional changes in this commit.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
[Kieran: Rebase merge conflicts resolved]
[Kieran: Fix checkstyle line under 80 chars]
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
All IPA related types/params are now moved to the RPi namespace.
There are no functional changes in this commit.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Acked-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
[Kieran: Rebase merge conflicts fixed]
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This avoids a namespace clash with the RPi namespace used by the ipa and
pipeline handlers, and cleans up the syntax slightly.
There are no functional changes in this commit.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Rename RPi::RPiStream -> RPi::Stream and RPi::RPiDevice -> RPi::Device.
There are no functional changes in this commit.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Acked-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Update ALSC (Auto Lens Shading Correction) to correctly handle the
user transform now passed in the camera mode.
The user transform is applied directly in the sensor so the image
statistics already incorporate it, and the adaptive algorithm is
entirely agnostic towards it, so all we have to do is flip the
calibrated tables to match. (These tables will have been calibrated
without the user transform.)
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
This commit plumbs the user transform from the Raspberry Pi pipeline
handler through to the IPA. Note that the transform is actually
handled in the sensor (by setting the h/v flip bits), so the IPAs need
to understand the orientation of the image they receive.
Once in the IPA we add it to the CameraMode description, so that it
becomes automatically available to all the individual control
algorithms.
The IPA configure method has to be reordered just a little so as to
fill in the transform in the camera mode before calling SwitchMode.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The Raspberry Pi pipeline handler allows all transforms except those
involving a transpose. The user transform is combined with any
inherent rotation of the camera, and the camera's H and V flip bits
are set accordingly.
Note that the validate() method has to work out what the final Bayer
order of any raw streams will be, before configure() actually applies
the transform to the sensor. We make a note of the "native"
(untransformed) Bayer order when the system starts, so that we can
deduce transformed Bayer orders more easily.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
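A hedged sketch of the final step, mapping the combined transform to the
sensor flip controls (the helper name and the Transform enumerator names are
assumptions; control handling in the real pipeline handler is more involved):
  #include <cstdint>

  #include <linux/v4l2-controls.h>

  #include <libcamera/controls.h>
  #include <libcamera/transform.h>

  using namespace libcamera;

  /*
   * Program the sensor H/V flip bits from the combined (user + inherent
   * rotation) transform. Only the mapping is shown here.
   */
  void applyTransform(ControlList &sensorCtrls, Transform combined)
  {
      sensorCtrls.set(V4L2_CID_HFLIP,
                      static_cast<int32_t>(!!(combined & Transform::HFlip)));
      sensorCtrls.set(V4L2_CID_VFLIP,
                      static_cast<int32_t>(!!(combined & Transform::VFlip)));
  }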
|
|
Add a field to the CameraConfiguration (including the necessary
documentation) to represent a 2D transform requested by the
application. All pipeline handlers are amended to coerce this to the
Identity, marking the configuration as "adjusted" if something
different had been requested.
Pipeline handlers that support Transforms can be amended subsequently.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
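In a pipeline handler without transform support, validate() could coerce the
field roughly as follows (the configuration class name and the field name
transform are assumptions):
  #include <libcamera/camera.h>
  #include <libcamera/transform.h>

  using namespace libcamera;

  /* Hypothetical configuration class for a pipeline without transform support. */
  class SimpleCameraConfiguration : public CameraConfiguration
  {
  public:
      Status validate() override
      {
          Status status = Valid;

          /* Coerce any requested transform to Identity and flag it. */
          if (transform != Transform::Identity) {
              transform = Transform::Identity;
              status = Adjusted;
          }

          /* ... stream validation would follow here ... */

          return status;
      }
  };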
|
|
This type encodes Bayer formats in an explicit way that makes them
easier to use than some of the other, more opaque, format types. This
makes BayerFormat useful for editing or manipulating Bayer types.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
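The shape of such a type might look roughly like this (field and enumerator
names are illustrative, not necessarily those of the actual class):
  #include <cstdint>

  /*
   * An explicit description of a Bayer format: colour order, bit depth
   * and packing, instead of a single opaque code.
   */
  struct BayerFormat {
      enum Order : uint8_t {
          BGGR,
          GBRG,
          GRBG,
          RGGB,
      };

      enum Packing : uint8_t {
          None,
          CSI2Packed,
      };

      Order order;
      uint8_t bitDepth;
      Packing packing;
  };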
|
|
We implement 2D transforms as an enum class with 8 elements,
consisting of the usual 2D plane transformations (flips, rotations
etc.).
The transform is made up of 3 bits, indicating whether it includes a
transpose, a horizontal flip (mirror) and a vertical flip.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
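A sketch of how the three bits compose into the eight transforms (enumerator
names and bit assignments follow the description above but should be treated
as illustrative):
  /*
   * Bit 0: horizontal flip, bit 1: vertical flip, bit 2: transpose.
   * The eight combinations cover all 2D plane transformations.
   */
  enum class Transform : int {
      Identity = 0,
      HFlip = 1 << 0,
      VFlip = 1 << 1,
      Rot180 = HFlip | VFlip,
      Transpose = 1 << 2,
      Rot270 = HFlip | Transpose,
      Rot90 = VFlip | Transpose,
      Rot180Transpose = HFlip | VFlip | Transpose
  };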
|
|
The V4L2Device::controlInfo method simply returns a pointer to the
v4l2_query_ext_ctrl structure for the given control, which has already
been retrieved and stored.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
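A hedged usage example (the control ID, the return convention and the
internal header path are assumptions):
  #include <cstdint>

  #include <linux/v4l2-controls.h>
  #include <linux/videodev2.h>

  #include "libcamera/internal/v4l2_device.h"

  using namespace libcamera;

  /* Look up the queried control information cached by the device. */
  int64_t maxExposure(V4L2Device *device)
  {
      const struct v4l2_query_ext_ctrl *info =
          device->controlInfo(V4L2_CID_EXPOSURE);
      if (!info)
          return -1;

      return info->maximum;
  }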
|
|
This reverts commit 1e8c91b65695449c5246d17ba7dc439c8058b781, which set
the sensor orientation before configure().
Now that we shall be implementing application-defined 2D transforms
it's no longer possible to set the sensor orientation so early on. We
have to wait until we have the CameraConfiguration object as that's
where the application puts its choice of transform.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The EXIF specification defines three timezone-related tags, namely
OffsetTime, OffsetTimeOriginal and OffsetTimeDigitized. However,
these are not supported by libexif (as of v0.6.21); hence, carry
the tags' positional values in our implementation until libexif
itself gains support for them.
Since these tags were introduced in EXIF specification v2.31, set
the exif version number explicitly too.
Signed-off-by: Umang Jain <email@uajain.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
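Carrying the tag numbers locally could look like this (tag values as given in
the EXIF 2.31 specification; the constant names are our own):
  #include <cstdint>

  /*
   * Timezone offset tags introduced in EXIF 2.31, unknown to libexif
   * 0.6.21, so their numeric values are carried here for now.
   */
  constexpr uint16_t EXIF_TAG_OFFSET_TIME = 0x9010;
  constexpr uint16_t EXIF_TAG_OFFSET_TIME_ORIGINAL = 0x9011;
  constexpr uint16_t EXIF_TAG_OFFSET_TIME_DIGITIZED = 0x9012;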
|
|
The EXIF standard states that EXIF_FORMAT_UNDEFINED entries shall not be
NULL-terminated. This patch implements that detail and pads one extra
byte for EXIF_FORMAT_ASCII to null-terminate strings.
Signed-off-by: Umang Jain <email@uajain.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
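A rough sketch of the rule with a hypothetical helper (not the actual patch;
allocation and error handling are simplified):
  #include <cstdlib>
  #include <cstring>
  #include <string>

  #include <libexif/exif-data.h>

  /*
   * ASCII entries get one extra byte so the string is NUL-terminated;
   * UNDEFINED entries are stored without any terminator.
   */
  void setEntryData(ExifEntry *entry, ExifFormat format, const std::string &str)
  {
      size_t size = str.size() + (format == EXIF_FORMAT_ASCII ? 1 : 0);

      entry->format = format;
      entry->components = size;
      entry->size = size;
      entry->data = static_cast<unsigned char *>(malloc(size));
      if (!entry->data)
          return;

      memcpy(entry->data, str.c_str(), size);
  }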
|
|
Refresh the RkISP1 user-space header to match the latest state in the
media-tree [1]. This requires updating symbol names in the RkISP1 IPA,
but there is no functional change.
Unfortunately the upstream header has a few problems that need to be
fixed before it can be used.
1. The SPDX header does not satisfy the Linux scripts/headers_install.sh,
so the installation step has to be done manually (dropping the _UAPI
prefix from the header include guard). The issue has been reported
upstream.
2. The BIT() macro is used in the header but unfortunately this macro
is not accessible in user-space headers. Fix this by reverting to
open-coding the bit setting without the macro. A fix has been submitted
upstream and acked by the maintainer.
1. d7a81a5b07313535 ("media: staging: rkisp1: uapi: remove __packed")
2. [PATCH v2] staging: rkisp1: uapi: Do not use BIT() macro
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Acked-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Acked-by: Jacopo Mondi <jacopo@jmondi.org>
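The BIT() workaround amounts to open-coding the shift, for example (the module
flag shown is illustrative):
  /* Kernel header:       #define RKISP1_CIF_ISP_MODULE_DPCC BIT(0) */
  /* User-space friendly: open-code the bit instead of using BIT()  */
  #define RKISP1_CIF_ISP_MODULE_DPCC (1U << 0)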
|
|
Do not depend on other headers to pull in the V4L2 controls header.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Acked-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Expose the self stream to applications and prefer it for the viewfinder
and video roles, as it can be extended to produce RGB. Keep preferring
the main path for still capture, as it could be extended to support RAW
formats, which make most sense for still capture.
With this change the self path becomes available to applications and a
camera backed by this pipeline can produce two streams simultaneously.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Extend the format validation to work with both the main and self paths.
The heuristic gives the first stream in the configuration the highest
priority while still examining both streams for the best match.
It is not possible to capture from the self path as the self stream is
not yet exposed to applications.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In preparation for supporting both the main and self paths, extend
RkISP1FrameInfo to track buffers from the self path stream.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Allow both the main and self path streams to be configured. This
change adds the self path as an internal stream to the pipeline handler.
It is not exposed as a Camera stream, so it cannot yet be used.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Add the V4L2 device nodes needed to operate the self path.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In preparation for supporting both the main and self paths, prefix the
main path specific variables with mainPath.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Changing resolutions back and forth can cause the crop rectangle to go
out of sync; set it as part of the format configuration.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
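A hedged sketch of resetting the crop rectangle together with the format (the
pad number, selection target and internal header path are assumptions):
  #include <linux/videodev2.h>

  #include <libcamera/geometry.h>

  #include "libcamera/internal/v4l2_subdevice.h"

  using namespace libcamera;

  /* Reset the crop rectangle every time the path format is configured. */
  int configureCrop(V4L2Subdevice *resizer, const Size &size)
  {
      Rectangle rect = {};
      rect.x = 0;
      rect.y = 0;
      rect.width = size.width;
      rect.height = size.height;

      return resizer->setSelection(0, V4L2_SEL_TGT_CROP, &rect);
  }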
|
|
The information about the stream format is available but not exported
to applications; fix this.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Create RkISP1Frames from the camera data instead of picking information
out of it. This is done to prepare for multi-stream support, where more
information from the camera data will be needed.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The buffer ready handlers are designed for a single application-facing
stream from the main path. To prepare for multiple application-facing
streams from the main and/or self path, the handlers need to be
reworked.
The data keeping track of the frame number and advancing the timeline
can be moved from the application-facing buffer ready handler to the
statistics handler. For each request processed there will always be a
statistics buffer, and as the ISP is inline and is the source of the
main, self and statistics paths, there is no change in behavior.
The application-facing handler no longer needs a special case for
cancelled frames and can be made simpler. With this change the handlers
are ready to deal with any combination of application-facing streams.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
In preparation for supporting both the main and self paths, configure
all the media graph links as part of the configuration step. Before this
change the link between the ISP and the DMA engine was set up at match
time, as the only supported path was the main path, and only the link
between the sensor and the ISP was updated as part of the configuration
step.
The main path is still the only path between the ISP and the DMA engine
that can be enabled.
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|