.. SPDX-License-Identifier: CC-BY-SA-4.0

.. include:: documentation-contents.rst

.. _camera-sensor-model:

.. todo: Move to Doxygen-generated documentation

The libcamera camera sensor model
=================================

libcamera defines an abstract camera sensor model in order to provide a
description of each of the processing steps that result in image data being
sent on the media bus and that form the image stream delivered to
applications.

Applications should use the abstract camera sensor model defined here to
precisely control the operations of the camera sensor.

The libcamera camera sensor model targets image sensors producing frames in
RAW format, delivered through a MIPI CSI-2 compliant bus implementation.

The abstract sensor model maps libcamera components to the characteristics
and operations of an image sensor, and serves as a reference to model the
libcamera CameraSensor and SensorConfiguration classes and their operations.

In order to control the configuration of the camera sensor through the
SensorConfiguration class, applications should understand this model and map
it to the combination of image sensor and kernel driver in use.

The camera sensor model defined here is based on the *MIPI CCS
specification*, particularly on *Section 8.2 - Image readout* of *Chapter 8 -
Video Timings*.

Glossary
--------

.. glossary::

   Pixel array
      The full grid of pixels, active and inactive ones

   Pixel array active area
      The portion(s) of the pixel array that contains valid and readable
      pixels; corresponds to the libcamera properties::PixelArrayActiveAreas

   Analog crop rectangle
      The portion of the *pixel array active area* which is read out and
      passed to further processing stages

   Subsampling
      Pixel processing techniques that reduce the image size by binning or by
      skipping adjacent pixels

   Digital crop
      Crop of the sub-sampled image data before scaling

   Frame output
      The frame (image) as output on the media bus by the camera sensor

Camera sensor model
-------------------

The abstract sensor model is described in the following diagram.

.. figure:: sensor_model.svg

1. The sensor reads pixels from the *pixel array*. The pixels being read out
   are selected by the *analog crop rectangle*.

2. The pixels can be subsampled to reduce the image size without affecting
   the field of view. Two subsampling techniques can be used:

   - Binning: combines adjacent pixels of the same colour by averaging or
     summing their values, in the analog domain and/or the digital domain.

     .. figure:: binning.svg

   - Skipping: skips the read out of a number of adjacent pixels.

     .. figure:: skipping.svg

3. The output of the optional sub-sampling stage is then cropped after the
   conversion of the analogue pixel values to the digital domain.

4. The resulting output frame is sent on the media bus by the sensor.

Camera Sensor configuration parameters
--------------------------------------

The libcamera camera sensor model defines parameters that allow users to
control:

1. The image format bit depth

2. The size and position of the *Analog crop rectangle*

3. The subsampling factors used to downscale the pixel array readout data to
   a smaller frame size without reducing the image *field of view*.
   Two configuration parameters are made available to control the downscaling
   factor:

   - binning

     A vertical and horizontal binning factor can be specified; the image
     will be downscaled in its vertical and horizontal sizes by the specified
     factor.

     .. code-block:: c
        :caption: Definition: The horizontal and vertical binning factors

        horizontal_binning = xBin;
        vertical_binning = yBin;

   - skipping

     Skipping reduces the image resolution by skipping the read-out of a
     number of adjacent pixels. The skipping factor is specified by the
     'increment' number (number of pixels to 'skip') in the vertical and
     horizontal directions and for even and odd rows and columns.

     .. code-block:: c
        :caption: Definition: The horizontal and vertical skipping factors

        horizontal_skipping = (xOddInc + xEvenInc) / 2;
        vertical_skipping = (yOddInc + yEvenInc) / 2;

   Different sensors perform the binning and skipping stages in different
   orders. For the sake of computing the final output image size the order of
   execution is not relevant.

   The overall down-scaling factor is obtained by combining the binning and
   skipping factors.

   .. code-block:: c
      :caption: Definition: The total scaling factor (binning + sub-sampling)

      total_horizontal_downscale = horizontal_binning + horizontal_skipping;
      total_vertical_downscale = vertical_binning + vertical_skipping;

4. The output size is used to specify any additional cropping on the
   sub-sampled frame.

5. The total line length and frame height (*visible* pixels + *blankings*) as
   sent on the MIPI CSI-2 bus.

6. The pixel transmission rate on the MIPI CSI-2 bus.

The above parameters are combined to obtain the following high-level
configurations:

- **frame output size**

  Obtained by applying a crop to the physical pixel array size in the analog
  domain, followed by optional binning and sub-sampling (in any order),
  followed by an optional crop step in the output digital domain.
- **frame rate**

  The combination of the *total frame size*, the image format *bit depth* and
  the *pixel rate* of the data sent on the MIPI CSI-2 bus makes it possible
  to compute the image stream frame rate. The equation is the well-known:

  .. code-block:: c

     frame_duration = total_frame_size / pixel_rate;
     frame_rate = 1 / frame_duration;

  where the *pixel_rate* parameter is the result of the sensor's
  configuration of the MIPI CSI-2 bus *(the following formula applies to MIPI
  CSI-2 when used on the MIPI D-PHY physical protocol layer only)*

  .. code-block:: c

     pixel_rate = csi_2_link_freq * 2 * nr_of_lanes / bits_per_sample;
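As a numerical cross-check of the definitions above, the following shell
sketch combines the downscaling and frame rate formulas. Every value in it
(analog crop size, binning and skipping factors, link frequency, lane count,
bit depth and blanking figures) is a hypothetical assumption chosen for
illustration, not taken from any real sensor.

```shell
# All numbers below are hypothetical example values: a 3280x2464 analog crop,
# 2x2 binning, no skipping, a 456 MHz CSI-2 link on 2 data lanes, 10 bits per
# sample, and arbitrary total line length / frame height including blankings.
analog_crop_width=3280
analog_crop_height=2464

horizontal_binning=2
vertical_binning=2
horizontal_skipping=0
vertical_skipping=0

# Total downscaling factors, combining binning and skipping as defined above.
total_horizontal_downscale=$((horizontal_binning + horizontal_skipping))
total_vertical_downscale=$((vertical_binning + vertical_skipping))

# Frame output size, assuming no additional digital crop.
output_width=$((analog_crop_width / total_horizontal_downscale))
output_height=$((analog_crop_height / total_vertical_downscale))

# Pixel rate on a MIPI D-PHY CSI-2 bus: two bits per link clock cycle, per lane.
csi_2_link_freq=456000000
nr_of_lanes=2
bits_per_sample=10
pixel_rate=$((csi_2_link_freq * 2 * nr_of_lanes / bits_per_sample))

# The total frame size includes the horizontal and vertical blankings.
total_line_length=1896
total_frame_height=1300
frame_rate=$(awk -v rate="$pixel_rate" -v w="$total_line_length" \
	-v h="$total_frame_height" 'BEGIN { printf "%.2f", rate / (w * h) }')

echo "output size: ${output_width}x${output_height}"
echo "pixel rate:  $pixel_rate pixels/s"
echo "frame rate:  $frame_rate fps"
```

With these example values the 2x2 binning halves both dimensions of the
analog crop, and the frame rate follows directly from the pixel rate divided
by the total frame size.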
#!/bin/bash
# SPDX-License-Identifier: GPL-2.0-or-later
# Copyright (C) 2019, Google Inc.
#
# Author: Laurent Pinchart <laurent.pinchart@ideasonboard.com>
#
# rkisp-capture.sh - Capture processed frames from cameras based on the
# Rockchip ISP1
#
# The script makes use of the following tools, which are expected to be
# executable from the system-wide path or from the local directory:
#
# - media-ctl (from v4l-utils git://linuxtv.org/v4l-utils.git)
# - raw2rgbpnm (from git://git.retiisi.org.uk/~sailus/raw2rgbpnm.git)
# - yavta (from git://git.ideasonboard.org/yavta.git)
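#
# Example invocation (the sensor name below is hypothetical; use the sensor
# entity name present on the system, as listed by 'media-ctl -p'):
#
#   ./rkisp-capture.sh --count 5 --size 1920x1080 imx219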

# Locate the sensor entity
find_sensor() {
	local bus
	local sensor_name=$1

	bus=$(grep "$sensor_name " /sys/class/video4linux/v4l-subdev*/name | cut -d ' ' -f 2)
	if [[ -z $bus ]]; then
		echo "Sensor '$sensor_name' not found." >&2
		exit 1
	fi

	echo "$sensor_name $bus"
}

# Locate the media device
find_media_device() {
	local mdev
	local name=$1

	for mdev in /dev/media* ; do
		media-ctl -d $mdev -p | grep -q "^driver[ \t]*$name$" && break
		mdev=
	done

	if [[ -z $mdev ]] ; then
		echo "$name media device not found." >&2
		exit 1
	fi

	echo $mdev
}

# Get the sensor format
get_sensor_format() {
	local format
	local sensor=$1

	format=$($mediactl --get-v4l2 "'$sensor':0" | sed 's/\[\([^ ]*\).*/\1/')
	sensor_mbus_code=$(echo $format | sed 's/fmt:\([A-Z0-9_]*\).*/\1/')
	sensor_size=$(echo $format | sed 's/[^\/]*\/\([0-9x]*\).*/\1/')

	echo "Capturing ${sensor_size} from sensor $sensor in ${sensor_mbus_code}"
}
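# As a hypothetical example of the parsing above: for a pad format string of
#   fmt:SBGGR10_1X10/3280x2464 field:none
# (after the surrounding brackets have been stripped), the two sed
# expressions in get_sensor_format() yield sensor_mbus_code=SBGGR10_1X10 and
# sensor_size=3280x2464.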

# Configure the pipeline
configure_pipeline() {
	local format="fmt:$sensor_mbus_code/$sensor_size"
	local capture_mbus_code=$1
	local capture_size=$2

	echo "Configuring pipeline for $sensor in $format"

	$mediactl -r

	$mediactl -l "'$sensor':0 -> 'rockchip-sy-mipi-dphy':0 [1]"
	$mediactl -l "'rockchip-sy-mipi-dphy':1 -> 'rkisp1-isp-subdev':0 [1]"
	$mediactl -l "'rkisp1-isp-subdev':2 -> 'rkisp1_mainpath':0 [1]"

	$mediactl -V "\"$sensor\":0 [$format]"
	$mediactl -V "'rockchip-sy-mipi-dphy':1 [$format]"
	$mediactl -V "'rkisp1-isp-subdev':0 [$format crop:(0,0)/$sensor_size]"
	$mediactl -V "'rkisp1-isp-subdev':2 [fmt:$capture_mbus_code/$capture_size crop:(0,0)/$capture_size]"
}

# Capture frames
capture_frames() {
	local file_op
	local capture_format=$1
	local capture_size=$2
	local frame_count=$3
	local save_file=$4

	if [[ $save_file -eq 1 ]]; then
		file_op="--file=/tmp/frame-#.bin"
	fi

	yavta -c$frame_count -n5 -I -f $capture_format -s $capture_size \
		$file_op $($mediactl -e "rkisp1_mainpath")
}

# Convert captured files to ppm
convert_files() {
	local format=$1
	local size=$2
	local frame_count=$3

	echo "Converting ${frame_count} frames (${size})"

	for i in $(seq 0 $((frame_count - 1))); do
		i=$(printf %06u $i)
		raw2rgbpnm -f $format -s $size /tmp/frame-$i.bin /tmp/frame-$i.ppm
	done
}

# Print usage message
usage() {
	echo "Usage: $1 [options] sensor-name"
	echo "Supported options:"
	echo "-c, --count n     Number of frames to capture"
	echo "--no-save         Do not save captured frames to disk"
	echo "-r, --raw         Capture RAW frames"
	echo "-s, --size wxh    Frame size"
}

# Parse command line arguments
capture_size=1024x768
frame_count=10
raw=false
save_file=1

while [[ $# -ne 0 ]] ; do
	case $1 in
	-c|--count)
		frame_count=$2
		shift 2
		;;
	--no-save)
		save_file=0
		shift
		;;

	-r|--raw)
		raw=true
		shift
		;;
	-s|--size)
		capture_size=$2
		shift 2
		;;
	-*)
		echo "Unsupported option $1" >&2
		usage $0
		exit 1
		;;
	*)
		break
		;;
	esac
done

if [[ $# -ne 1 ]] ; then
	usage $0
	exit 1
fi

sensor_name=$1

modprobe mipi_dphy_sy
modprobe video_rkisp1

sensor=$(find_sensor $sensor_name) || exit
mdev=$(find_media_device rkisp1) || exit
mediactl="media-ctl -d $mdev"

get_sensor_format "$sensor"
if [[ $raw == true ]] ; then
	capture_format=$(echo $sensor_mbus_code | sed 's/_[0-9X]*$//')
	capture_mbus_code=$sensor_mbus_code
else
	capture_format=YUYV
	capture_mbus_code=YUYV8_2X8
fi

configure_pipeline $capture_mbus_code $capture_size
capture_frames $capture_format $capture_size $frame_count $save_file
[[ $save_file -eq 1 ]] && convert_files $capture_format $capture_size $frame_count