Recent doxygen versions don't appreciate unquoted PROJECT_NUMBER values
that contain spaces. Fix this by quoting the string.
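A before/after illustration of the Doxyfile setting (the version string
here is made up for the example):

  # An unquoted value containing spaces confuses recent doxygen releases:
  PROJECT_NUMBER = 0.0.1 +git
  # Quoting keeps the full value intact:
  PROJECT_NUMBER = "0.0.1 +git"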
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Let's try not to mix draft controls and regular controls.
Draft controls are unstable by definition, and removing or adding them
should not impact the enumeration of stable controls.
Keep draft controls at the end of the control_ids.yaml file and add a
comment to make it clear where the draft controls section begins.
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
The description text lists "stillraw" as a stream role option. This is
incorrect; the role should be listed as "raw" instead.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
With the recent change in the bayer/embedded buffer matching code, a
condition would cause the bayer buffer to be requeued back to the device
even though it could still have been queued for matching. This would
cause unnecessary frame drops, as sync would be lost.
The fix is to ensure the bayer buffer only gets requeued if the embedded
data buffer queue is not empty, i.e. the buffer truly is orphaned.
Additionally, we perform this test before deciding to flush either of
the two queues of all their buffers.
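A sketch of the matching logic described above, using hypothetical types
and names rather than the actual pipeline handler code:

#include <optional>
#include <queue>
#include <utility>

struct Buffer {
    unsigned int sequence;
};

/*
 * Return a bayer/embedded pair with matching sequence numbers, pushing
 * buffers that can never match onto requeueToDevice. The bayer buffer is
 * only requeued when the embedded queue is not empty, i.e. when it is
 * known to be orphaned; otherwise it is kept for future matching.
 */
std::optional<std::pair<Buffer, Buffer>>
findMatch(std::queue<Buffer> &bayerQueue, std::queue<Buffer> &embeddedQueue,
          std::queue<Buffer> &requeueToDevice)
{
    while (!bayerQueue.empty()) {
        Buffer bayer = bayerQueue.front();

        /* Drop embedded buffers that are older than the bayer buffer. */
        while (!embeddedQueue.empty() &&
               embeddedQueue.front().sequence < bayer.sequence) {
            requeueToDevice.push(embeddedQueue.front());
            embeddedQueue.pop();
        }

        if (!embeddedQueue.empty() &&
            embeddedQueue.front().sequence == bayer.sequence) {
            std::pair<Buffer, Buffer> match{ bayer, embeddedQueue.front() };
            bayerQueue.pop();
            embeddedQueue.pop();
            return match;
        }

        /* Embedded queue empty: a match may still arrive, keep the bayer buffer. */
        if (embeddedQueue.empty())
            break;

        /* Newer embedded data exists, so this bayer buffer is truly orphaned. */
        requeueToDevice.push(bayer);
        bayerQueue.pop();
    }

    return std::nullopt;
}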
Fixes: 909882b (pipeline: raspberrypi: Rework bayer/embedded data buffer matching)
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Introduce a pipeline model that lists the operations applied by the
camera pipeline. This is a first step towards defining explicitly how
the camera processes images, and how the libcamera controls affect the
processing.
The initial list of operations is meant to be expanded, and possibly
refactored (a block diagram should also be considered to make this
easier to read). How the controls affect the pipeline is largely missing
at this stage, with only a short explanation of the digital zoom to show
how this is meant to be documented. More documentation will be added
over time.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Sebastian Fricke <sebastian.fricke.linux@gmail.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
A few grammatical errors remain in the Control class documentation.
Fix them.
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
Previously we required that the sensor absolutely reach the target
exposure, but this can fail if frame rates or analogue gains are
limited. Instead, insist only that we get several frames with the same
exposure time and analogue gain, and that the algorithm's target
exposure hasn't changed either.
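A minimal sketch of the idea, with hypothetical names rather than the
actual AGC code: convergence is declared once several consecutive frames
report the same exposure time and analogue gain while the algorithm's
target has not changed.

#include <chrono>

class ConvergenceCheck
{
public:
    bool update(std::chrono::microseconds exposure, double gain,
                std::chrono::microseconds target)
    {
        if (exposure == lastExposure_ && gain == lastGain_ &&
            target == lastTarget_) {
            stableCount_++;
        } else {
            stableCount_ = 0;
            lastExposure_ = exposure;
            lastGain_ = gain;
            lastTarget_ = target;
        }

        /* A handful of identical frames is enough to declare convergence. */
        return stableCount_ >= kStableFrames;
    }

private:
    static constexpr unsigned int kStableFrames = 4;
    std::chrono::microseconds lastExposure_{};
    double lastGain_ = 0.0;
    std::chrono::microseconds lastTarget_{};
    unsigned int stableCount_ = 0;
};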
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
partly saturated images
When parts of an image saturate, the image brightness no longer
increases linearly with increased exposure/gain. Having calculated a
linear gain value, it's therefore better to try it while allowing for
the saturating regions, and if necessary increase the gain some more.
This is repeated several times.
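An illustrative sketch of the iteration (not the actual Raspberry Pi AGC
code): predict the brightness a candidate gain would produce while
clipping saturated pixels, and increase the gain until the target is
met.

#include <algorithm>
#include <vector>

double refineGain(const std::vector<double> &pixels, double target,
                  double initialGain, double maxValue = 1.0)
{
    double gain = initialGain;
    if (pixels.empty())
        return gain;

    for (unsigned int i = 0; i < 8; i++) {
        /* Predict the mean brightness with this gain, clipping saturated pixels. */
        double sum = 0.0;
        for (double p : pixels)
            sum += std::min(p * gain, maxValue);
        double predicted = sum / pixels.size();

        if (predicted >= target || predicted <= 0.0)
            break;

        /* Saturated regions absorbed part of the gain; push a little harder. */
        gain *= target / predicted;
    }

    return gain;
}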
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
constructor
Use memset in the constructor for embedded structures; it is tidier and
initialises everything. Other members are initialised through the
initialiser list.
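A minimal sketch of the pattern, with hypothetical struct and member
names:

#include <cstring>

struct ExampleStatus {
    double totalExposure;
    double analogueGain;
};

class ExampleAlgorithm
{
public:
    ExampleAlgorithm()
        : frameCount_(0), speed_(0.1)
    {
        /* Zero the embedded structure in one go. */
        std::memset(&status_, 0, sizeof(status_));
    }

private:
    ExampleStatus status_;
    unsigned int frameCount_;
    double speed_;
};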
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
SwitchMode
When an application has specified a fixed exposure time and/or gain,
they must be programmed into the sensor immediately, even before the
sensor has been started. For this to happen, they must be written into
the image metadata when the SwitchMode method is invoked.
We also make the default exposure/gain, when nothing has been set,
customisable in the tuning file.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The Awb class now implements a SwitchMode method which outputs its
AwbStatus for other algorithms to read, should they be interested.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Introduce a function to fetch the AwbStatus (fetchAwbStatus), and call
it unconditionally at the top of Prepare so that both Prepare and
Process know thereafter that it's been done.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Previously the calculation computed Y for each region before returning
the weighted average, which "baked in" the over-importance of small
statistics regions. The revised calculation will treat all pixels
equally when the region weights are the same, making it easier to
use. With the previous scheme, proper "average" metering was difficult
to implement.
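A sketch of the two calculations, with a hypothetical Region type. The
old version averaged per-region means, so a small region counted as much
as a large one; the new version weights the raw sums, so equal region
weights give every pixel equal influence.

#include <vector>

struct Region {
    double ySum;         /* sum of Y over the region's pixels */
    unsigned int pixels; /* number of pixels in the region */
    double weight;       /* metering weight of the region */
};

double oldAverageY(const std::vector<Region> &regions)
{
    double num = 0.0, den = 0.0;
    for (const Region &r : regions) {
        num += r.weight * (r.ySum / r.pixels);
        den += r.weight;
    }
    return num / den;
}

double newAverageY(const std::vector<Region> &regions)
{
    double num = 0.0, den = 0.0;
    for (const Region &r : regions) {
        num += r.weight * r.ySum;
        den += r.weight * r.pixels;
    }
    return num / den;
}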
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The method formerly known as divvyupExposure is given a more
understandable name.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
On the libcamera/VC4 platform the AGC Prepare/Process methods, and any
changes to the AGC settings, run synchronously - so a number of
mutexes and copies are unnecessary and can be removed.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Replace Raspberry Pi debug with libcamera debug.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The crop rectangle was not being configured on the isp output pad or on
the resizer input pad, causing an unnecessary crop of the image and
unnecessary scaling by the resizer when streaming with a resolution
higher than the default 800x600.
Example:
cam -c 1 -C -s width=3280,height=2464
In the pipeline:
sensor->isp->resizer->dma_engine
the isp output crop is set to 800x600, which limits the output format to
800x600; this is propagated to the resizer input format, also set to
800x600, while the resizer output format is set to the desired end
resolution of 3280x2464.
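A sketch of configuring the crop rectangle on a subdevice pad through
the V4L2 subdev selection API; the file descriptor, pad number and sizes
are illustrative.

#include <sys/ioctl.h>
#include <linux/v4l2-subdev.h>

int setPadCrop(int fd, unsigned int pad, unsigned int width, unsigned int height)
{
    struct v4l2_subdev_selection sel = {};

    sel.which = V4L2_SUBDEV_FORMAT_ACTIVE;
    sel.pad = pad;
    sel.target = V4L2_SEL_TGT_CROP;
    sel.r.left = 0;
    sel.r.top = 0;
    sel.r.width = width;
    sel.r.height = height;

    return ioctl(fd, VIDIOC_SUBDEV_S_SELECTION, &sel);
}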
Signed-off-by: Helen Koike <helen.koike@collabora.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
The names have to match for the setting to work. Use the libcamera
terminology for consistency (even though it touches more files).
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Support the 'cloudy' AWB mode which was left out when the AwbModeTable
was introduced.
Signed-off-by: David Plowman <david.plowman@raspberrypi.com>
Reviewed-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
There is a condition that would cause the buffers to never be in sync
when using only a single buffer in the bayer and embedded data streams.
This occurred because, even though both streams would get flushed to
resync, one stream's only buffer was already queued in the device and
would end up never matching.
Rework the buffer matching logic by combining updateQueue() and
tryFlushQueue() into a single findMatchingBuffers() function. This
allows the queues to be flushed at the same time as buffers are matched,
avoiding the above condition.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Tested-by: David Plowman <david.plowman@raspberrypi.com>
Acked-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|
|
The JPEG post-processor is marked as licensed under GPL-2.0-or-later.
This is an unintentional oversight. License it under LGPL-2.1-or-later,
like the rest of the camera HAL implementation.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Acked-by: Umang Jain <email@uajain.com>
Acked-by: Hirokazu Honda <hiroh@chromium.org>
|
|
Instead of directly mapping and unmapping the buffers passed to the IPA,
use a MappedFrameBuffer. The latter is a cleaner interface and avoids
code duplication.
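An illustrative RAII sketch (not the actual MappedFrameBuffer class):
mapping in the constructor and unmapping in the destructor removes the
need for explicit mmap()/munmap() calls at every use site.

#include <sys/mman.h>
#include <unistd.h>

class MappedBuffer
{
public:
    MappedBuffer(int fd, size_t length)
        : length_(length)
    {
        mem_ = mmap(nullptr, length_, PROT_READ | PROT_WRITE,
                    MAP_SHARED, fd, 0);
        if (mem_ == MAP_FAILED)
            mem_ = nullptr;
    }

    ~MappedBuffer()
    {
        if (mem_)
            munmap(mem_, length_);
    }

    MappedBuffer(const MappedBuffer &) = delete;
    MappedBuffer &operator=(const MappedBuffer &) = delete;

    bool isValid() const { return mem_ != nullptr; }
    void *data() const { return mem_; }
    size_t length() const { return length_; }

private:
    void *mem_;
    size_t length_;
};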
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Use a MappedFrameBuffer to mmap the embedded data buffers that the
pipeline handler uses in the cases where the sensor does not fill them
in. This avoids the need to mmap and unmap on every frame.
Signed-off-by: Naushir Patuck <naush@raspberrypi.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Signed-off-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
When configuring the converter, the format is first set on the output
side based on the format of the camera pipeline output, and then the
format is set on the capture side to match the desired stream
configuration. The format parameter passed to
V4L2VideoDevice::setFormat() uses the same variable for both calls,
which has the unwanted side effect of carrying plane configuration from
the output side to the capture side of the converter. In particular, the
stride or plane size requested on the capture side can become
unnecessarily large when converting to a format with a lower number of
bits per pixel (for instance converting YUYV to NV12).
Fix this by resetting the format variable before using it to configure
the capture side.
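A minimal sketch of the fix; the include path and the
V4L2VideoDevice::setFormat(V4L2DeviceFormat *) signature are assumed to
match libcamera's internal API, and the pixel format handling is left
out for brevity.

#include <libcamera/internal/v4l2_videodevice.h>

using namespace libcamera;

int configureConverter(V4L2VideoDevice *output, V4L2VideoDevice *capture,
                       const Size &inputSize, const Size &outputSize)
{
    V4L2DeviceFormat format;
    format.size = inputSize;
    /* ... fill in the output-side pixel format ... */
    int ret = output->setFormat(&format);
    if (ret < 0)
        return ret;

    /*
     * setFormat() has filled in the plane sizes and strides negotiated on
     * the output side; reset the variable so they don't carry over to the
     * capture side.
     */
    format = {};
    format.size = outputSize;
    /* ... fill in the capture-side pixel format ... */
    return capture->setFormat(&format);
}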
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The request completion handler is invoked in the camera manager thread,
which shouldn't be blocked for large amounts of time. As writing the
frames to disk can be a time-consuming process, move request processing
to the main thread by queueing an event to the event loop.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
Add a deferred calls queue to the EventLoop class to support queuing
calls from a different thread and processing them in the event loop's
thread.
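An illustrative sketch of a deferred-call queue (not the actual cam
EventLoop implementation): callers on any thread push a std::function,
and the event loop thread pops and runs the calls when it dispatches
them.

#include <deque>
#include <functional>
#include <mutex>

class DeferredCallQueue
{
public:
    /* May be called from any thread. */
    void callLater(std::function<void()> call)
    {
        std::lock_guard<std::mutex> lock(mutex_);
        calls_.push_back(std::move(call));
        /* A real event loop would also wake its dispatcher up here. */
    }

    /* Called from the event loop thread. */
    void dispatchCalls()
    {
        for (;;) {
            std::function<void()> call;
            {
                std::lock_guard<std::mutex> lock(mutex_);
                if (calls_.empty())
                    break;
                call = std::move(calls_.front());
                calls_.pop_front();
            }
            call();
        }
    }

private:
    std::mutex mutex_;
    std::deque<std::function<void()>> calls_;
};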
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
There's no user of the EventDispatcher (and the related EventNotifier
and Timer classes) outside of libcamera. Move those classes to the
internal API.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
To prepare for removal of the EventDispatcher from the libcamera public
API, switch to libevent to handle the event loop.
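A minimal libevent sketch (not the cam application's actual code):
create an event base, register a persistent read event on a file
descriptor, and run the dispatch loop.

#include <event2/event.h>

static void fdReady(evutil_socket_t fd, short events, void *arg)
{
    /* Handle activity on fd, e.g. a completed-request notification. */
    (void)fd;
    (void)events;
    (void)arg;
}

int runLoop(int fd)
{
    struct event_base *base = event_base_new();
    if (!base)
        return -1;

    struct event *ev = event_new(base, fd, EV_READ | EV_PERSIST,
                                 fdReady, nullptr);
    event_add(ev, nullptr);

    /* Blocks until event_base_loopbreak() or event_base_loopexit() is called. */
    event_base_dispatch(base);

    event_free(ev);
    event_base_free(base);
    return 0;
}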
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
Get the event dispatcher from the current thread instead of the camera
manager. This prepares for the removal of
CameraManager::eventDispatcher().
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
To enable reusing Request objects, we kept a pool of free Requests. This
pool was not cleared upon stopping capture, however, which caused a
segfault when switching to another camera. Fix this by clearing the
Request pool on stopCapture().
Fixes: c753223ad6b9 ("libcamera, android, cam, gstreamer, qcam, v4l2: Reuse Request")
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Tested-by: Jean-Michel Hautbois <jeanmichel.hautbois@ideasonboard.com>
|
|
Add licensing information for mojo in dep5.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Specify in the readme and meson file that we depend on python3-ply and
python3-jinja2 for generating the IPA interface.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
Import mojo from the Chromium repository, so that we can use it for
generating code for the IPC mechanism. The commit from which this was
taken is:
a079161ec8c6907b883f9cb84fc8c4e7896cb1d0 "Add PPAPI constructs for
sending focus object to PdfAccessibilityTree"
This tree has been pruned to remove directories that didn't have any
necessary code:
- mojo/* except for mojo/public
- mojo core, docs, and misc files
- mojo/public/* except for mojo/public/{tools,LICENSE}
- language bindings for IPC, tests, and some mojo internals
- mojo/public/tools/{fuzzers,chrome_ipc}
- mojo/public/tools/bindings/generators
- code generation for other languages
No files were modified.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Acked-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Acked-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Acked-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The V4L2_EVENT_FRAME_SYNC event may occur on both V4L2 video-devices
(V4L2VideoDevice) and sub-devices (V4L2Subdevice). Move the start of
frame detection to the common base class of the two, V4L2Device.
There is no functional change.
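A sketch of the underlying V4L2 ioctls for subscribing to and dequeueing
frame sync events; the same calls work on a video device or a subdevice
file descriptor, which is why the handling fits in the common V4L2Device
base class.

#include <sys/ioctl.h>
#include <linux/videodev2.h>

int subscribeFrameSync(int fd)
{
    struct v4l2_event_subscription sub = {};
    sub.type = V4L2_EVENT_FRAME_SYNC;

    return ioctl(fd, VIDIOC_SUBSCRIBE_EVENT, &sub);
}

int readFrameSyncSequence(int fd, unsigned int *sequence)
{
    struct v4l2_event event = {};
    int ret = ioctl(fd, VIDIOC_DQEVENT, &event);
    if (ret < 0)
        return ret;

    if (event.type == V4L2_EVENT_FRAME_SYNC)
        *sequence = event.u.frame_sync.frame_sequence;

    return 0;
}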
Signed-off-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Use the d-pointer infrastructure offered by the Extensible class to
replace the custom implementation.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
Use the d-pointer infrastructure offered by the Extensible class to
replace the custom implementation.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
The d-pointer design pattern helps create public classes that can be
extended without breaking their ABI. To facilitate usage of the pattern
in libcamera, create a base Extensible class with associated macros.
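A generic sketch of the d-pointer pattern (illustrative only, not the
actual Extensible class or its macros): the public class stores a single
pointer to a private implementation, so members can be added to the
Private class without changing the public ABI.

#include <memory>

/* Public header. */
class Example
{
public:
    Example();
    ~Example();

    int start();

private:
    class Private;
    std::unique_ptr<Private> d_;
};

/* Implementation file. */
class Example::Private
{
public:
    bool running_ = false;
    /* New members can be added here without breaking the ABI. */
};

Example::Example()
    : d_(std::make_unique<Private>())
{
}

Example::~Example() = default;

int Example::start()
{
    d_->running_ = true;
    return 0;
}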
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Jacopo Mondi <jacopo@jmondi.org>
|
|
Add a formatter to ensure consistent naming of 'd' and 'o' variables
related to the d-pointer design pattern, as implemented by the
Extensible class. The formatter also ensures that the pointer is always
const. const-correctness issues related to the data pointed to will be
caught by the compiler, and thus don't need to be checked here.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
When closing the libcamerasrc, the reference to the camera is released
and the camera manager is stopped. However, the camera configuration still
exists at that point, and holds a reference to the camera. This leads to
a warning from the device enumerator complaining that the media devices
are still in use:
[1:53:48.792327560] [408] ERROR DeviceEnumerator device_enumerator.cpp:165 Removing media device /dev/media1 while still in use
[1:53:48.792354022] [408] ERROR DeviceEnumerator device_enumerator.cpp:165 Removing media device /dev/media0 while still in use
A crash follows when the libcamerasrc is finalized, as deleting the
camera configuration will then release the last reference to the camera,
which attempts to delete the camera object with deleteLater() without an
event dispatcher.
Fix it by deleting the camera configuration before stopping the camera
manager.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Nicolas Dufresne <nicolas.dufresne@collabora.com>
|
|
The V4L2DeviceFormat class now has default initializers for all
members; explicit initialization isn't needed anymore.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
When setting (or trying) a format with a multiplanar device, the
V4L2VideoDevice::trySetFormatMeta() function iterates over all planes
available in the V4L2DeviceFormat structure. The caller is responsible
for setting the plane count, and failure to do so properly may result in
memory corruption. This can lead to a crash long after the function
returns, making the problem difficult to debug.
As the issue is caused by a bug in the caller, use an assertion to catch
it.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
|
|
The V4L2DeviceFormat class doesn't have a default constructor, nor does
it specify default member initializers for the plane-related members.
This results in the planes array and planesCount members being
uninitialized by default, leading to undefined behaviour if the user of
the class doesn't initialize it explicitly.
Most users initialize V4L2DeviceFormat instances, but some don't. We
could fix them, but that would likely turn into a game of whack-a-mole.
As there's no use case for instantiating a large number of
V4L2DeviceFormat instances in a performance-critical code path, let's
instead add default initializers to avoid future issues.
While at it, define a type for the structures containing plane
information, and use a std::array.
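An illustrative sketch of the result (not the exact libcamera
definition): default member initializers plus a named Plane type held in
a std::array guarantee the structure is fully initialized even when its
user doesn't set it up explicitly.

#include <array>
#include <cstdint>

class ImageFormat
{
public:
    struct Plane {
        uint32_t size = 0;
        uint32_t bpl = 0; /* bytes per line */
    };

    uint32_t fourcc = 0;
    unsigned int width = 0;
    unsigned int height = 0;

    std::array<Plane, 3> planes{};
    unsigned int planesCount = 0;
};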
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
Add support for 24-bit and 32-bit RGB formats. The fragment shader
samples the texture and reorders the components, using a pattern set
through the RGB_PATTERN macro. The pattern stores the shader vec4
element indices (named {r, g, b, a} by convention, for elements 0 to 3)
to be extracted from the texture samples, when interpreted by OpenGL as
RGBA.
Note that, as textures are created with GL_UNSIGNED_BYTE, the RGBA order
corresponds to bytes in memory, while the libcamera formats are named
based on the component order in a 32-bit word stored in memory in
little-endian format.
An alternative to manual reordering in the shader would be to set the
texture swizzling mask. This is however not available in OpenGL ES
before version 3.0, which we don't mandate at the moment.
Only the BGR888 and RGB888 formats have been tested, with the vimc
pipeline handler.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Andrey Konovalov <andrey.konovalov@linaro.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
In preparation for RGB formats support, store the three Y, U and V
textures in an array. This makes the code more generic, and avoids
referring to an RGB texture as textureY_.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Andrey Konovalov <andrey.konovalov@linaro.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
In preparation for RGB formats support, rename the pointer to image data
from yuvData_ to data_.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Andrey Konovalov <andrey.konovalov@linaro.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
In preparation for RGB formats support, rename the identity vertex
shader from YUV.vert to identity.vert.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Andrey Konovalov <andrey.konovalov@linaro.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
There's no need to cast the yuvData_ unsigned char pointer to a char
pointer before performing pointer arithmetic. Drop the unneeded casts.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Andrey Konovalov <andrey.konovalov@linaro.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
When ViewFinderGL::setFormat() is called, the fragment shader is deleted
and recreated for the new format. This results in unnecessary shader
recompilation if only the size is changed and the pixel format remains
the same. Keep the existing shader in that case.
The null test for fragmentShader_ can be removed: if the shader program
is linked, the fragment shader is guaranteed to exist.
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Andrey Konovalov <andrey.konovalov@linaro.org>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
When setting a new format, the existing fragment shader is deleted and
a new shader should be created. However, the shader pointer isn't set to
nullptr after deletion, resulting in the deleted shader being reused.
Fix it by managing shader pointers with std::unique_ptr<> to prevent
similar bugs from happening in the future.
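A sketch of the std::unique_ptr<> approach with Qt's QOpenGLShader (the
surrounding class and method are illustrative): assigning a new
unique_ptr deletes the old shader and updates the pointer in a single
step, so a stale pointer can no longer be reused.

#include <memory>

#include <QByteArray>
#include <QOpenGLShader>
#include <QOpenGLShaderProgram>

class Renderer
{
public:
    bool setFragmentShader(const QByteArray &source)
    {
        if (fragmentShader_)
            program_.removeShader(fragmentShader_.get());

        /* Assignment releases the previous shader automatically. */
        fragmentShader_ = std::make_unique<QOpenGLShader>(QOpenGLShader::Fragment);
        if (!fragmentShader_->compileSourceCode(source))
            return false;

        program_.addShader(fragmentShader_.get());
        return program_.link();
    }

private:
    std::unique_ptr<QOpenGLShader> fragmentShader_;
    QOpenGLShaderProgram program_;
};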
Signed-off-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
Reviewed-by: Kieran Bingham <kieran.bingham@ideasonboard.com>
Reviewed-by: Niklas Söderlund <niklas.soderlund@ragnatech.se>
|
|
Add SPDX headers and copyright notices to the tracepoints definition
files.
Signed-off-by: Paul Elder <paul.elder@ideasonboard.com>
Reviewed-by: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
|