.. SPDX-License-Identifier: CC-BY-SA-4.0

===========
 libcamera
===========

**A complex camera support library for Linux, Android, and ChromeOS**

Cameras are complex devices that need heavy hardware image processing
operations. Control of the processing is based on advanced algorithms that
must run on a programmable processor. This has traditionally been implemented
in a dedicated MCU in the camera, but in embedded devices algorithms have been
moved to the main CPU to save cost. Blurring the boundary between camera
devices and Linux often left the user with no other option than a
vendor-specific closed-source solution.

To address this problem the Linux media community has very recently started
collaboration with the industry to develop a camera stack that will be
open-source-friendly while still protecting vendor core IP. libcamera was born
out of that collaboration and will offer modern camera support to Linux-based
systems, including traditional Linux distributions, ChromeOS and Android.

.. section-begin-getting-started

Getting Started
---------------

To fetch the sources, build and install:

.. code::

  git clone https://git.libcamera.org/libcamera/libcamera.git
  cd libcamera
  meson setup build
  ninja -C build install

Dependencies
~~~~~~~~~~~~

The following Debian/Ubuntu packages are required for building libcamera.
Other distributions may have differing package names:

A C++ toolchain: [required]
        Either {g++, clang}

Meson Build system: [required]
        meson (>= 0.60) ninja-build pkg-config

for the libcamera core: [required]
        libyaml-dev python3-yaml python3-ply python3-jinja2

for IPA module signing: [recommended]
        Either libgnutls28-dev or libssl-dev, openssl

        Without IPA module signing, all IPA modules will be isolated in a
        separate process. This adds an unnecessary extra overhead at runtime.

for improved debugging: [optional]
        libdw-dev libunwind-dev

        libdw and libunwind provide backtraces to help debugging assertion
        failures. Their functions overlap, libdw provides the most detailed
        information, and libunwind is not needed if both libdw and the glibc
        backtrace() function are available.

for device hotplug enumeration: [optional]
        libudev-dev

for documentation: [optional]
        python3-sphinx doxygen graphviz texlive-latex-extra

for gstreamer: [optional]
        libgstreamer1.0-dev libgstreamer-plugins-base1.0-dev

for Python bindings: [optional]
        libpython3-dev pybind11-dev

for cam: [optional]
        libevent-dev is required to support cam, however the following
        optional dependencies bring more functionality to the cam test tool:

        - libdrm-dev: Enables the KMS sink
        - libjpeg-dev: Enables MJPEG on the SDL sink
        - libsdl2-dev: Enables the SDL sink

for qcam: [optional]
        libtiff-dev qt6-base-dev qt6-tools-dev-tools

for tracing with lttng: [optional]
        liblttng-ust-dev python3-jinja2 lttng-tools

for android: [optional]
        libexif-dev libjpeg-dev

for Python bindings: [optional]
        pybind11-dev

for lc-compliance: [optional]
        libevent-dev libgtest-dev

for abi-compat.sh: [optional]
        abi-compliance-checker

Basic testing with cam utility
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~

The ``cam`` utility can be used for basic testing. You can list the cameras
detected on the system with ``cam -l``, and capture ten frames from the first
camera and save them to disk with ``cam -c 1 --capture=10 --file``.

See ``cam -h`` for more information about the ``cam`` tool.
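
As a quick sanity check, the two commands mentioned above can be run directly
(a minimal sketch; the cameras listed and the files produced will depend on
your system):

.. code::

  cam -l
  cam -c 1 --capture=10 --file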
In case of problems, a detailed debug log can be obtained from libcamera by
setting the ``LIBCAMERA_LOG_LEVELS`` environment variable:

.. code::

  :~$ LIBCAMERA_LOG_LEVELS=*:DEBUG cam -l

Using GStreamer plugin
~~~~~~~~~~~~~~~~~~~~~~

To use the GStreamer plugin from the source tree, use the meson ``devenv``
command. This will create a new shell instance with the ``GST_PLUGIN_PATH``
environment variable set accordingly.

.. code::

  meson devenv -C build

The debugging tool ``gst-launch-1.0`` can be used to construct a pipeline and
test it. The following pipeline will stream from the camera named "Camera 1"
onto the OpenGL accelerated display element on your system.

.. code::

  gst-launch-1.0 libcamerasrc camera-name="Camera 1" ! queue ! glimagesink

To show the first camera found you can omit the camera-name property, or you
can list the cameras and their capabilities using:

.. code::

  gst-device-monitor-1.0 Video

This will also show the supported stream sizes, which can be manually selected
if desired with a pipeline such as:

.. code::

  gst-launch-1.0 libcamerasrc ! 'video/x-raw,width=1280,height=720' ! \
    queue ! glimagesink

The libcamerasrc element has two log categories, named libcamera-provider (for
the video device provider) and libcamerasrc (for the operation of the camera).
All corresponding debug messages can be enabled by setting the ``GST_DEBUG``
environment variable to ``libcamera*:7``.

Presently, to prevent element negotiation failures it is required to specify
the colorimetry and framerate as part of your pipeline construction. For
instance, to capture and encode as a JPEG stream and receive on another device
the following example could be used as a starting point:

.. code::

  gst-launch-1.0 libcamerasrc ! \
    video/x-raw,colorimetry=bt709,format=NV12,width=1280,height=720,framerate=30/1 ! \
    queue ! jpegenc ! multipartmux ! \
    tcpserversink host=0.0.0.0 port=5000

Which can be received on another device over the network with:

.. code::

  gst-launch-1.0 tcpclientsrc host=$DEVICE_IP port=5000 ! \
    multipartdemux ! jpegdec ! autovideosink

The GStreamer element also supports multiple streams. This is achieved by
requesting additional source pads. Downstream caps filters can be used to
choose specific parameters like resolution and pixel format. The pad property
``stream-role`` can be used to select a role.

The following example displays a 640x480 view finder while streaming JPEG
encoded 800x600 video. You can use the receiver pipeline above to view the
remote stream from another device.

.. code::

  gst-launch-1.0 libcamerasrc name=cs src::stream-role=view-finder src_0::stream-role=video-recording \
    cs.src ! queue ! video/x-raw,width=640,height=480 ! videoconvert ! autovideosink \
    cs.src_0 ! queue ! video/x-raw,width=800,height=600 ! videoconvert ! \
    jpegenc ! multipartmux ! tcpserversink host=0.0.0.0 port=5000

.. section-end-getting-started

Troubleshooting
~~~~~~~~~~~~~~~

Several users have reported issues with the meson installation; the crux of
the issue is a potential version mismatch between the meson version used by
root and the version used by the normal user. When calling ``ninja -C build``,
ninja cannot find the ``build.ninja`` file. This is a snippet of the error
message.

::

  ninja: Entering directory `build'
  ninja: error: loading 'build.ninja': No such file or directory

This can be solved in two ways:

1. Don't install meson again if it is already installed system-wide.

2. If a version of meson which is different from the system-wide version is
   already installed, uninstall that meson using pip3, and install again
   without the ``--user`` argument, as sketched below.
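
A rough sketch of option 2, assuming meson was originally installed with
``pip3 install --user meson`` (adapt the commands to your setup):

.. code::

  pip3 uninstall meson
  sudo pip3 install meson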
/* SPDX-License-Identifier: BSD-2-Clause */
/*
* Copyright (C) 2019, Raspberry Pi (Trading) Limited
*
* histogram.cpp - histogram calculations
*/
#include "histogram.h"
#include <cmath>
#include <libcamera/base/log.h>
/**
* \file histogram.h
* \brief Class to represent Histograms and manipulate them
*/
namespace libcamera {
namespace ipa {
/**
* \class Histogram
* \brief The base class for creating histograms
*
* This class stores a cumulative frequency histogram, which is a mapping that
* counts the cumulative number of observations in all of the bins up to the
* specified bin. It can be used to find quantiles and averages between quantiles.
*/
/**
* \brief Create a cumulative histogram
 * \param[in] data A pre-sorted histogram, i.e. the per-bin counts ordered
 * from the lowest bin to the highest
*/
Histogram::Histogram(Span<uint32_t> data)
{
	cumulative_.reserve(data.size() + 1);
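	/*
	 * A leading zero entry makes cumulative_[i] the number of pixels in
	 * all bins strictly below bin i.
	 */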
cumulative_.push_back(0);
for (const uint32_t &value : data)
cumulative_.push_back(cumulative_.back() + value);
}
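/*
 * Illustrative usage sketch (not part of the upstream file), assuming a Span
 * can be constructed from a plain array in the same way as std::span:
 *
 *   uint32_t data[] = { 4, 2, 6, 8 };
 *   Histogram hist(data);
 *   double median = hist.quantile(0.5);               // fractional bin of the median
 *   double mean = hist.interQuantileMean(0.25, 0.75); // mean bin between the quartiles
 */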
/**
* \fn Histogram::bins()
* \brief Retrieve the number of bins currently used by the Histogram
* \return Number of bins
*/
/**
* \fn Histogram::total()
* \brief Retrieve the total number of values in the data set
* \return Number of values
*/
/**
* \brief Cumulative frequency up to a (fractional) point in a bin.
* \param[in] bin The bin up to which to cumulate
*
 * With F(p) the cumulative frequency of the histogram, F(p) is 0 at the
 * bottom of the histogram and reaches the total pixel count at the top, as p
 * runs from 0 to the number of bins. The pixels are spread evenly throughout
 * the “bin” in which they lie, so that F(p) is a continuous (monotonically
 * increasing) function. For example, with bin counts {4, 2, 6, 8},
 * F(2.5) = 4 + 2 + 0.5 * 6 = 9.
*
* \return The cumulative frequency from 0 up to the specified bin
*/
uint64_t Histogram::cumulativeFrequency(double bin) const
{
if (bin <= 0)
return 0;
else if (bin >= bins())
return total();
int b = static_cast<int32_t>(bin);
return cumulative_[b] +
(bin - b) * (cumulative_[b + 1] - cumulative_[b]);
}
/**
 * \brief Return the (fractional) bin at which the quantile point \a q falls
 * \param[in] q The desired quantile (0 <= q <= 1)
 * \param[in] first Low limit of the search range (default is 0)
 * \param[in] last High limit of the search range (default is UINT_MAX)
*
* A quantile gives us the point p = Q(q) in the range such that a proportion
* q of the pixels lie below p. A familiar quantile is Q(0.5) which is the median
* of a distribution.
*
* \return The fractional bin of the point
*/
double Histogram::quantile(double q, uint32_t first, uint32_t last) const
{
if (last == UINT_MAX)
last = cumulative_.size() - 2;
ASSERT(first <= last);
uint64_t item = q * total();
/* Binary search to find the right bin */
while (first < last) {
int middle = (first + last) / 2;
/* Is it between first and middle ? */
if (cumulative_[middle + 1] > item)
last = middle;
else
first = middle + 1;
}
ASSERT(item >= cumulative_[first] && item <= cumulative_[last + 1]);
double frac;
if (cumulative_[first + 1] == cumulative_[first])
frac = 0;
else
		frac = static_cast<double>(item - cumulative_[first]) /
		       (cumulative_[first + 1] - cumulative_[first]);
return first + frac;
}
/**
* \brief Calculate the mean between two quantiles
 * \param[in] lowQuantile The low quantile
 * \param[in] highQuantile The high quantile
 *
 * Quantiles are not ideal for metering as they suffer several limitations.
 * Instead, a concept is introduced here: the inter-quantile mean.
 * It returns the mean bin value of all pixels that lie between lowQuantile
 * and highQuantile.
*
* \return The mean histogram bin value between the two quantiles
*/
double Histogram::interQuantileMean(double lowQuantile, double highQuantile) const
{
ASSERT(highQuantile > lowQuantile);
	/* Fractional bin below which a proportion lowQuantile of the pixels lie */
	double lowPoint = quantile(lowQuantile);
	/* Fractional bin below which a proportion highQuantile of the pixels lie */
double highPoint = quantile(highQuantile, static_cast<uint32_t>(lowPoint));
double sumBinFreq = 0, cumulFreq = 0;
for (double p_next = floor(lowPoint) + 1.0;
p_next <= ceil(highPoint);
lowPoint = p_next, p_next += 1.0) {
int bin = floor(lowPoint);
double freq = (cumulative_[bin + 1] - cumulative_[bin])
* (std::min(p_next, highPoint) - lowPoint);
/* Accumulate weighted bin */
sumBinFreq += bin * freq;
/* Accumulate weights */
cumulFreq += freq;
}
/* add 0.5 to give an average for bin mid-points */
return sumBinFreq / cumulFreq + 0.5;
}
} /* namespace ipa */
} /* namespace libcamera */