This detailed poster presentation from OMNIVISION explores advancements in image reconstruction using hybrid CMOS Image Sensor (CIS) and Event-based Vision Sensor (EVS) technology.

**Section Highlights:**

1. **Motivation:** Discusses the challenges posed by EVS nonidealities, such as pixel and readout latency, especially under low-light conditions, where they lead to ghosting or blurry frames; the EVS pixel refractory period further worsens these artifacts.
2. **Frame Reconstruction:** Frames the pixel-wise photocurrent estimation problem as a graph optimization problem. The equations give the cost function and the CIS and EVS measurement models, illustrating the mathematical framework behind the reconstruction.
3. **Key Highlights:**
   - A joint formulation addressing deblurring, rolling-shutter correction, and frame interpolation using both CIS and EVS sensors.
   - An optimization method that solves the coupled inverse problem, followed by post-processing for artifact reduction.
   - A notable performance gain of up to 4 dB in PSNR and a 12% improvement in LPIPS.
4. **Joint Framework:**
   - Provides the optimization methodology for the coupled inverse problem.
   - A refinement stage reduces noise and improves overall image quality.
5. **Refinement Network:**
   - Adopts the NAFNet basic concept with two modifications:
     - Spatial attention focusing on noisy motion areas.
     - Perceptual loss that preserves image quality by mitigating blurriness and texture degradation.

This presentation marks a step forward in improving image quality and reconstruction by combining cutting-edge sensor technologies with advanced computational methods.

Text transcribed from the image:

OMNIVISION™

Rui Jiang*, Fangwen Tu*, Yixuan Long, Aabhaas Vaish, Bowen Zhou, Qinyi Wang, Wei Zhang, Yuntan Fang, Luis Eduardo

**Motivation**

- EVS nonidealities, such as pixel and readout latency, significantly affect frame interpolation image quality.
- Under low-light conditions such as indoor capture, the EVS response becomes slower, resulting in ghosting or blurry frames.
- The EVS pixel refractory period is inevitable and worsens artifacts.

**Frame Reconstruction**

- The pixel-wise photocurrent estimation problem is modelled as a graph optimization over the states $x = [i_{pd}(k)]^T$, $k = 1, \dots, M$.
- CIS measurement: $z_{CIS} = [DN(l)]^T$; EVS measurement: $z_{EVS} = [c(k+1)]^T$.
- Cost function, where $e_{CIS}$, $e_{EVS}$ are the measurement errors and $N_{CIS}$, $N_{EVS}$ are diagonal weighting matrices (a minimal numerical sketch follows the figure captions below):

$$x^* = \arg\min_x \left( e_{CIS}^{T} N_{CIS}\, e_{CIS} + e_{EVS}^{T} N_{EVS}\, e_{EVS} \right) \tag{6}$$

- CIS measurement model:

$$DN = G \int_{t_s}^{t_e} i_{pd}(t)\, dt \tag{5}$$

- $f_{DC}(i_{pd})$: current-to-voltage conversion.

[Figure: Factor graph over the photocurrent states $i_{pd}(1), \dots, i_{pd}(M)$ and filtered voltages $V_{FE}(1), \dots, V_{FE}(M)$, with CIS measurement factors (Eqn. (5)) and EVS measurement factors (Eqns. (7), (8)).]

[Figure: Hybrid CIS-EVS sensor principle. The shared photodiode feeds the CIS path (light exposure, charge-to-voltage conversion, ADC, DN) and the EVS path (in-pixel $V_{FE}$, event firing and $V_{ref}$ updating, off-pixel events readout), with readout latency $t_{rl}$ and refractory period $t_{rp}$.]

[Figure: Example of $V_{DC}$ and $V_{FE}$ curves and triggered events, with timestamps $t_{in}$ and $t_{off}$. $V_{DC}$ represents $V_{FE}$ under DC excitation.]
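To make the graph-optimization formulation concrete, here is a minimal Python sketch (not the authors' implementation) of the pixel-wise weighted least-squares estimate for a single pixel. It simplifies the measurement models: the $V_{FE}$ low-pass dynamics and the RL/RP compensation of Eqns. (7)-(8) are ignored, $f_{DC}$ is taken as a logarithm, and the gain, time grid, weights, and noise levels are placeholder assumptions.

```python
# Minimal sketch of the pixel-wise photocurrent estimation as weighted least squares.
# Simplified models (assumptions, not the poster's exact formulation):
#   CIS residual: DN - G * dt * sum_k i_pd(k) over the exposure window   (cf. Eqn. (5))
#   EVS residual: c(k+1) - [f_DC(i_pd(k+1)) - f_DC(i_pd(k))], f_DC = log
#                 (cf. Eqns. (7)-(8), LPF dynamics and RL/RP compensation omitted)
import numpy as np
from scipy.optimize import least_squares

M = 50                  # number of photocurrent states i_pd(1..M)
dt = 1e-4               # state sampling interval [s] (assumed)
G = 1e13                # CIS conversion gain [DN/C] (assumed)
f_dc = np.log           # assumed current-to-voltage conversion

rng = np.random.default_rng(0)
i_true = 1e-9 * (1.0 + 0.5 * np.sin(np.linspace(0, 2 * np.pi, M)))   # synthetic ground truth

dn = G * dt * i_true.sum()                    # one CIS exposure covering all M samples
c = np.diff(f_dc(i_true))                     # EVS contrast samples c(k+1)
c += rng.normal(scale=0.01, size=c.shape)     # EVS measurement noise

w_cis, w_evs = 1.0, 10.0                      # diagonal weights (N_CIS, N_EVS), assumed

def residuals(log_x):
    x = np.exp(log_x)                         # optimize in log-space to keep i_pd > 0
    e_cis = dn - G * dt * x.sum()
    e_evs = c - np.diff(f_dc(x))
    return np.concatenate(([np.sqrt(w_cis) * e_cis], np.sqrt(w_evs) * e_evs))

x0 = np.full(M, np.log(dn / (G * dt * M)))    # initialize from the mean CIS level
i_est = np.exp(least_squares(residuals, x0).x)
print("relative RMS error:", np.sqrt(np.mean((i_est - i_true) ** 2)) / i_true.mean())
```

In the full method the EVS residuals would instead be built from the low-pass model of Eqn. (7) and the compensated contrast of Eqn. (8), and the CIS factor would account for rolling-shutter row timing.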
**Highlights**

- The joint formulation combines the deblurring, rolling-shutter correction, and frame interpolation problems using CIS and EVS fusion, explicitly modeling EVS sensor nonidealities.
- An optimization method jointly solves the coupled inverse problem, complemented by a post-processing network for artifact removal.
- The proposed method outperforms state-of-the-art methods in reducing the shadowing effect, with up to 4 dB improvement in PSNR and 12% improvement in LPIPS score.

**Joint Framework**

- Frame reconstruction: provides the optimization methodology for solving the coupled inverse problem.
- Refinement network: focuses on removing noise and improving the image quality.

[Figure: Framework overview. A 120 FPS blurry CIS input and a nonideal EVS input feed the frame reconstruction stage, which produces a 10,000 FPS output (with and without the RS effect); the refinement network then yields the final output.]

- $f_{ts}(i_{pd})$: maps $i_{pd}$ onto the time-varying coefficient of the LPF.

$$V_{FE}(k+1) = V_{FE}(k) + f_{ts}(i_{pd}(k)) \left( f_{DC}(i_{pd}(k+1)) - V_{FE}(k) \right) \tag{7}$$

$$c(k+1) = V_{FE}(k+1) - V_{FE}(k)$$

- When enabling the voltage compensation for readout latency (RL) and refractory period (RP), with $m$ the slope approximation of the $V_{FE}$ curve:

$$\Delta V_{RL}(k) = m(k+1) \left[ t_{off}(k) - t_{in}(k) \right]$$

$$\Delta V_{RP}(k) = m(k+1)\, t_{rp}$$

$$c(k+1) \approx V_{FE}(k+1) - V_{FE}(k) - \Delta V_{RL}(k) - \Delta V_{RP}(k) \tag{8}$$

- The frame estimation problem is formulated as a pixel-wise computation with adjusted CIS timing parameters.

[Figure: Rolling-shutter row timing. Consecutive CIS frames $z_{CIS}(l-1)$, $z_{CIS}(l)$, $z_{CIS}(l+1)$ and their row exposure times are used to produce a deblurred frame with the RS effect corrected: rows are reconstructed either from $z_{CIS}(l-1)$, $z_{CIS}(l)$ and events, or from $z_{CIS}(l)$, $z_{CIS}(l+1)$ and events.]

- Each row is reconstructed by merging consecutive CIS frames and their corresponding events based on the row-specific rolling-shutter timing characteristics.

**Refinement Network**

The NAFNet basic concept is adopted, and two major modifications are made (see the sketch after this section):

- Spatial attention: focuses on noisy motion areas for better denoising.
- Perceptual loss: helps recover image quality and mitigates blurriness or texture degradation.
- Loss function: [illegible in the image].
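As a rough illustration of the two modifications (not the poster's architecture), the PyTorch sketch below adds a spatial-attention gate to a NAFNet-style gated block and trains against a pixel-wise L1 term plus a VGG16-feature perceptual term. The block layout, channel counts, choice of the relu3_3 feature layer, and the loss weight `lambda_perc` are illustrative assumptions.

```python
# Schematic sketch of a refinement stage with (a) a spatial-attention branch that
# emphasizes noisy motion areas and (b) a perceptual loss term, in the spirit of the
# poster's NAFNet-based refinement network. Layout and weights are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import vgg16, VGG16_Weights


class SpatialAttention(nn.Module):
    """Predicts a per-pixel gate from channel-pooled features."""
    def __init__(self, kernel_size: int = 7):
        super().__init__()
        self.conv = nn.Conv2d(2, 1, kernel_size, padding=kernel_size // 2)

    def forward(self, x):
        pooled = torch.cat([x.mean(dim=1, keepdim=True),
                            x.amax(dim=1, keepdim=True)], dim=1)
        return x * torch.sigmoid(self.conv(pooled))


class RefineBlock(nn.Module):
    """Conv -> SimpleGate (channel split + product, as in NAFNet) -> spatial attention."""
    def __init__(self, channels: int = 32):
        super().__init__()
        self.conv1 = nn.Conv2d(channels, channels * 2, 3, padding=1)
        self.conv2 = nn.Conv2d(channels, channels, 3, padding=1)
        self.attn = SpatialAttention()

    def forward(self, x):
        a, b = self.conv1(x).chunk(2, dim=1)
        return x + self.attn(self.conv2(a * b))   # residual path around the block


class RefinementNet(nn.Module):
    def __init__(self, in_ch: int = 3, channels: int = 32, n_blocks: int = 4):
        super().__init__()
        self.head = nn.Conv2d(in_ch, channels, 3, padding=1)
        self.body = nn.Sequential(*[RefineBlock(channels) for _ in range(n_blocks)])
        self.tail = nn.Conv2d(channels, in_ch, 3, padding=1)

    def forward(self, x):
        return x + self.tail(self.body(self.head(x)))  # predict a residual correction


class PerceptualLoss(nn.Module):
    """L1 distance between frozen VGG16 relu3_3 features of prediction and target."""
    def __init__(self):
        super().__init__()
        vgg = vgg16(weights=VGG16_Weights.DEFAULT).features[:16].eval()
        for p in vgg.parameters():
            p.requires_grad_(False)
        self.vgg = vgg

    def forward(self, pred, target):
        # Inputs assumed in [0, 1]; ImageNet normalization omitted for brevity.
        return F.l1_loss(self.vgg(pred), self.vgg(target))


def refinement_loss(pred, target, perc: PerceptualLoss, lambda_perc: float = 0.1):
    # Pixel-wise L1 plus a weighted perceptual term (weight is an assumption).
    return F.l1_loss(pred, target) + lambda_perc * perc(pred, target)


if __name__ == "__main__":
    net, perc = RefinementNet(), PerceptualLoss()
    coarse = torch.rand(1, 3, 64, 64)      # stand-in for a reconstructed frame
    target = torch.rand(1, 3, 64, 64)
    loss = refinement_loss(net(coarse), target, perc)
    loss.backward()
    print(float(loss))
```

The spatial gate simply reweights features per pixel, which is one straightforward way to concentrate denoising capacity on noisy motion areas as described above; the perceptual term penalizes differences in deep features rather than raw pixels, discouraging over-smoothed textures.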