
Technology Guide

Sony Event-based Vision Sensor (EVS)


An Event-based Vision Sensor (EVS) realizes high-speed, low-latency data output by limiting the output to luminance changes detected at each pixel, combined with the pixel coordinates and a timestamp.

EVS captures movements (luminance changes)

The EVS is designed to emulate how the human eye senses light.
In the eye, the receptors on the retina convert incoming light into visual signals to be sent to the brain. Subsequent neuronal cells identify light and shade, and the information is conveyed to the visual cortex of the brain via the retinal ganglion cells.

In the EVS, the incident light is converted into electric signals by the imager's light-receiving circuit. The signals pass through an amplifier and reach a comparator, where the differential luminance data is separated into positive and negative signals, which are then processed and output as events.

evs 1.PNG
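As a rough illustration of this signal chain, the sketch below models one pixel in software: a converted luminance value is compared against a memorized reference, and the comparator splits the difference into a positive or negative event. The function names, gain, and threshold value are illustrative assumptions, not Sony's implementation.

```python
# Minimal conceptual sketch of the per-pixel signal chain described above.
# Names, the gain, and the threshold value are illustrative assumptions;
# voltages are in arbitrary units.

def comparator(diff, threshold=0.2):
    """Split the amplified luminance difference into positive/negative events."""
    if diff > threshold:
        return +1    # positive event (became brighter)
    if diff < -threshold:
        return -1    # negative event (became darker)
    return None      # change too small: nothing is output

def pixel_output(converted_voltage, reference_voltage, gain=1.0):
    """Light-receiving circuit output -> amp unit -> comparator -> event."""
    amplified_diff = gain * (converted_voltage - reference_voltage)
    return comparator(amplified_diff)
```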

About EVS (videos)

Technology part

Application part

EVS mechanism

In the Event-based Vision Sensor, the luminance changes detected by each pixel are filtered to extract only those that exceed a preset threshold. This event data is then combined with the pixel coordinates, time, and polarity information before being output. Each pixel operates asynchronously, independently of all others.
The figures below illustrate how the sensor captures the movement of a ball.

evs 2.PNG
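The output of each pixel can therefore be thought of as a small record combining the pixel coordinates, a timestamp, and the polarity. The field names and the microsecond timestamp below are illustrative assumptions; the actual output format is sensor-specific.

```python
from dataclasses import dataclass

@dataclass
class Event:
    """One event as described above: coordinates, time, and polarity.
    Field names and the microsecond timestamp are illustrative assumptions."""
    x: int          # pixel column
    y: int          # pixel row
    t_us: int       # timestamp in microseconds
    polarity: int   # +1 = brighter, -1 = darker

# Example: a positive event at pixel (640, 360), 1.5 ms into the recording.
ev = Event(x=640, y=360, t_us=1500, polarity=+1)
```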

Each pixel consists of a light-receiving unit and a luminance detection unit. The incident light is converted into a voltage in the light-receiving unit. The differential detection circuit in the luminance detection unit detects the change between the reference voltage and the voltage converted from the incident light. If the change exceeds the set threshold in either the positive or the negative direction, the comparator identifies it as an event and this data is output.

evs 3.PNG

The circuit is reset with the luminance at the detected event as the new reference, and the threshold values are set from this new reference voltage in the positive (light) and negative (dark) directions. If the incident light changes in luminance by more than the positive threshold (i.e. the output voltage surpasses the positive threshold), a positive event is output; conversely, if the voltage falls below the negative threshold, a negative event is output.

products_evs04.gif
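This reset behaviour can be pictured as a small state machine: each event moves the reference to the level at which the event fired, and new thresholds are measured from that reference. The class and numeric values below are a hedged illustration of that loop, not the actual circuit.

```python
class PixelDetector:
    """Toy model of the reset-and-threshold loop described above.
    Voltages are in arbitrary units; threshold values are illustrative."""

    def __init__(self, initial_voltage, pos_threshold=0.2, neg_threshold=0.2):
        self.reference = initial_voltage      # current reference voltage
        self.pos_threshold = pos_threshold    # threshold in the light direction
        self.neg_threshold = neg_threshold    # threshold in the dark direction

    def update(self, voltage):
        """Return +1, -1, or None, resetting the reference on each event."""
        diff = voltage - self.reference
        if diff > self.pos_threshold:
            self.reference = voltage          # reset: new reference = event level
            return +1                         # positive (light) event
        if diff < -self.neg_threshold:
            self.reference = voltage
            return -1                         # negative (dark) event
        return None                           # below threshold: no event

# Example: a pixel watching a rising then falling voltage.
px = PixelDetector(initial_voltage=1.0)
print([px.update(v) for v in (1.1, 1.3, 1.2, 0.9)])  # [None, 1, None, -1]
```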

Because the pixels convert the incident light luminance into voltage logarithmically, the sensor can detect subtle changes in the low-luminance range while still responding to large luminance changes in the high-luminance range without event saturation, as illustrated in the diagram. In this way the sensor realizes a high dynamic range.

evs 5.PNG
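Because the conversion is logarithmic, a fixed threshold in the voltage domain corresponds to a fixed relative (percentage) change in luminance, which is what gives the wide dynamic range. The sketch below checks this numerically; the threshold value and luminance figures are only illustrative.

```python
import math

def log_voltage(luminance):
    """Toy logarithmic photoreceptor response (arbitrary units)."""
    return math.log(luminance)

# A fixed threshold of 0.2 in the log (voltage) domain corresponds to the
# same ~22 % relative luminance change in a dark scene and a bright one:
THRESHOLD = 0.2
for base in (1.0, 10_000.0):                  # low vs. high luminance
    step = base * (math.exp(THRESHOLD) - 1)   # smallest change that triggers
    change = log_voltage(base + step) - log_voltage(base)
    assert change >= THRESHOLD - 1e-9         # crosses the same threshold
```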

This mechanism produces EVS images as shown below (right).

evs 6.PNG

The EVS image appears as if the outline of the moving subject has been extracted, since the brightness of a pixel changes where the subject moves.
(Shooting conditions) The images above were taken by a camera equipped with an EVS mounted on the dashboard of a car.
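Frame-like visualisations such as the one above are typically produced by accumulating events over a short window into an image, for example drawing positive events bright and negative events dark on a grey background. The sketch below makes assumptions about the resolution, colour mapping, and window length, and reuses the illustrative Event record introduced earlier.

```python
import numpy as np

def events_to_frame(events, width=1280, height=720):
    """Accumulate events into a single visualisation frame.
    Grey = no event, white = positive event, black = negative event.
    Resolution and colour mapping are illustrative assumptions."""
    frame = np.full((height, width), 128, dtype=np.uint8)  # neutral grey
    for ev in events:
        frame[ev.y, ev.x] = 255 if ev.polarity > 0 else 0
    return frame

# For example, accumulate all events whose timestamps fall inside one
# ~33 ms window (roughly a 30 fps frame) before rendering the frame.
```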

Core technology behind EVS

The industry’s smallest pixel: miniaturized high-resolution sensor made possible by utilizing a stacked structure

Unlike conventional technology, which places the light-receiving circuit and the luminance detection circuit on the same layer, this technology puts them on different layers: a pixel chip (upper layer) and a logic chip (lower layer) that includes integrated signal-processing circuits. These chips are stacked and connected using Cu-Cu connections within each pixel. The industry's smallest pixel (4.86 μm) is combined with a logic chip fabricated on a 40 nm process, resulting in a 1/2.5-type sensor with HD resolution of 1,280 x 720.

evs 7.PNG

High speed and low latency

Each pixel detects luminance changes asynchronously and outputs event data immediately. When multiple pixels produce events at the same time, the arbitration circuit controls the output order, starting with the earliest-received event. In this way the sensor outputs events as they are generated, making it possible to output only the necessary data with microsecond-order latency while keeping power consumption low.

evs 8.PNG
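The arbitration step can be pictured as a queue that always releases the earliest pending event first. Below is a minimal software analogy using a heap keyed on the timestamp; it reuses the illustrative Event record and is not a description of the actual arbitration circuit.

```python
import heapq

def arbitrate(pending_events):
    """Yield pending events ordered by timestamp, earliest first.
    A software analogy for the arbitration circuit, not its actual design."""
    heap = [(ev.t_us, i, ev) for i, ev in enumerate(pending_events)]
    heapq.heapify(heap)
    while heap:
        _, _, ev = heapq.heappop(heap)
        yield ev
```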

Built-in H/W event filter

To cater to various applications, the sensor is equipped with several filter functions designed specifically for event data.
These filters can remove unnecessary events, such as the periodic events caused by flickering LEDs and other events that are highly unlikely to belong to the outline of a moving object. They can also restrict the data volume when necessary so that it stays below the event rate that downstream systems can process.

evs 9.PNG

Images with event data accumulated for the equivalent of a single frame at 30 fps (left: filter off, right: filter on)
By turning the filter on (right image), the overall data volume can be reduced while retaining the information required for a given application.
In the right image (filter on), the white lines on the road remain as the significant information.
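Two of the described filter types can be approximated in software as follows: an anti-flicker stage that drops events recurring at a near-constant period at the same pixel, and a rate limiter that caps how many events are passed downstream per unit time. The period, tolerance, and rate values are illustrative assumptions, and the event fields follow the earlier illustrative Event record.

```python
def drop_periodic(events, period_us=10_000, tolerance_us=200):
    """Drop events that repeat at ~period_us at the same pixel (e.g. LED flicker).
    Period and tolerance values are illustrative assumptions."""
    last_seen = {}                       # (x, y) -> timestamp of previous event
    for ev in events:
        prev = last_seen.get((ev.x, ev.y))
        last_seen[(ev.x, ev.y)] = ev.t_us
        if prev is not None and abs((ev.t_us - prev) - period_us) <= tolerance_us:
            continue                     # looks periodic: filter it out
        yield ev

def limit_rate(events, max_events_per_ms=1000):
    """Pass at most max_events_per_ms events per millisecond downstream."""
    count, window = 0, None
    for ev in events:
        ms = ev.t_us // 1000             # which millisecond this event falls in
        if ms != window:
            window, count = ms, 0        # new window: reset the counter
        if count < max_events_per_ms:
            count += 1
            yield ev
```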

Use cases where EVS can be leveraged

Areas of application for EVS and use cases

evs 10.PNG

Use cases

evs 11.PNG
evs 12.PNG
evs 13.PNG
evs 14.PNG
evs 15.PNG