
Optical Flow Reusing for High-Efficiency Space-Time Video Super Resolution

Bibliographic Details
Published in: IEEE Transactions on Circuits and Systems for Video Technology, 2023-05, Vol. 33 (5), pp. 2116-2128
Main Authors: Zhang, Yuantong; Wang, Huairui; Zhu, Han; Chen, Zhenzhong
Format: Article
Language: English
Description
Summary: In this paper, we consider the task of space-time video super-resolution (ST-VSR), which increases the spatial resolution and frame rate of a given video simultaneously. Despite the remarkable progress of recent methods, most still suffer from high computational costs and inefficient use of long-range information. To alleviate these problems, we propose a Bidirectional Recurrence Network (BRN) with an optical-flow-reuse strategy that better exploits temporal knowledge from long-range neighboring frames for high-efficiency reconstruction. Specifically, an efficient and memory-saving multi-frame motion utilization strategy is proposed that reuses the intermediate flow of adjacent frames, considerably reducing the computational burden of frame alignment compared with traditional LSTM-based designs. In addition, the hidden state in BRN is updated by the reused optical flow and refined by a Feature Refinement Module (FRM) for further optimization. Moreover, by utilizing intermediate flow estimation, the proposed method can infer non-linear motion and restore details better. Extensive experiments demonstrate that our optical-flow-reuse-based bidirectional recurrent network (OFR-BRN) is superior to state-of-the-art methods in both accuracy and efficiency. Code is available at https://github.com/hahazh/OFR-BRN
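
Note: The official implementation is at the GitHub link above. As an illustration only, the sketch below shows the general flow-reuse idea described in the summary: optical flow between adjacent frames is estimated once and then reused to align a recurrent hidden state in both temporal directions, instead of re-estimating motion at every propagation step. The module names, shapes, toy flow estimator, and the simple fusion layer standing in for the refinement step are assumptions for illustration, not components taken from the paper.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    def warp(x, flow):
        """Backward-warp a feature map x (N, C, H, W) with a flow field (N, 2, H, W)."""
        n, _, h, w = x.shape
        ys, xs = torch.meshgrid(torch.arange(h), torch.arange(w), indexing="ij")
        base = torch.stack((xs, ys), dim=0).float().to(x.device)    # (2, H, W), (x, y) order
        coords = base.unsqueeze(0) + flow                           # displaced sampling coords
        gx = 2.0 * coords[:, 0] / max(w - 1, 1) - 1.0               # normalize to [-1, 1]
        gy = 2.0 * coords[:, 1] / max(h - 1, 1) - 1.0
        return F.grid_sample(x, torch.stack((gx, gy), dim=-1), align_corners=True)

    class TinyFlowNet(nn.Module):
        """Toy stand-in for a flow estimator; flow maps src's grid into dst."""
        def __init__(self):
            super().__init__()
            self.net = nn.Conv2d(6, 2, 3, padding=1)

        def forward(self, src, dst):
            return self.net(torch.cat([src, dst], dim=1))

    class FlowReuseBidirRecurrent(nn.Module):
        """Bidirectional recurrence that reuses per-pair flows for hidden-state alignment."""
        def __init__(self, channels=32):
            super().__init__()
            self.flow_net = TinyFlowNet()
            self.embed = nn.Conv2d(3, channels, 3, padding=1)
            self.fuse = nn.Conv2d(channels * 2, channels, 3, padding=1)  # crude refinement stand-in

        def forward(self, frames):                       # frames: (N, T, 3, H, W)
            n, t, _, h, w = frames.shape
            feats = [self.embed(frames[:, i]) for i in range(t)]

            # Estimate flow once per adjacent pair and reuse it in both passes.
            to_prev = [self.flow_net(frames[:, i + 1], frames[:, i]) for i in range(t - 1)]
            to_next = [self.flow_net(frames[:, i], frames[:, i + 1]) for i in range(t - 1)]

            # Backward pass: propagate a hidden state from the last frame to the first.
            hidden = torch.zeros_like(feats[0])
            backward = [None] * t
            for i in range(t - 1, -1, -1):
                if i < t - 1:
                    hidden = warp(hidden, to_next[i])    # align state from frame i+1 to frame i
                hidden = self.fuse(torch.cat([feats[i], hidden], dim=1))
                backward[i] = hidden

            # Forward pass: propagate the other way, reusing the same set of flows.
            hidden = torch.zeros_like(feats[0])
            outputs = []
            for i in range(t):
                if i > 0:
                    hidden = warp(hidden, to_prev[i - 1])  # align state from frame i-1 to frame i
                hidden = self.fuse(torch.cat([backward[i], hidden], dim=1))
                outputs.append(hidden)
            return torch.stack(outputs, dim=1)           # (N, T, C, H, W) fused features

    if __name__ == "__main__":
        video = torch.randn(1, 4, 3, 32, 32)
        print(FlowReuseBidirRecurrent()(video).shape)    # torch.Size([1, 4, 32, 32, 32])

In this sketch the per-pair flows are computed once up front and shared by the backward and forward recurrences, which is the efficiency point the summary makes; an upsampling head for the actual super-resolved output is omitted.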
ISSN: 1051-8215, 1558-2205
DOI: 10.1109/TCSVT.2022.3222875