Early Skip Mode Decision of Versatile Video Coding on 8K 360-degree Videos

Ying Lee, Chih Wei Tang

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

Virtual reality (VR) with 8K 360-degree videos provides users with immersive viewing experiences. Although Versatile Video Coding (VVC) achieves superior coding gain, speeding up VVC-based inter-frame coding of 8K content, which carries a huge amount of data, is desirable, yet few related schemes exist. Thus, this paper proposes an early Skip mode decision for inter prediction of VVC on 8K 360-degree videos in the hybrid equiangular cubemap (HEC) format, captured by static cameras. Based on coding statistics of 360-degree videos in the HEC format, the early Skip mode decision is made by referring to the sum of absolute transformed differences (SATD) costs of the Affine mode and the Merge mode. Compared with the VVC test model VTM 7.0, the proposed scheme achieves a time saving of up to 26.7% with a BD-rate increase of 0.34% on 8K 360-degree videos in the HEC format.
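The abstract states that the early Skip decision compares the SATD costs of the Merge and Affine candidates, but it does not give the exact rule or threshold. The following is a hypothetical Python sketch of such a comparison-based test; the function name, the ratio threshold, and the form of the test are all illustrative assumptions, not the paper's actual algorithm.

```python
def early_skip_decision(satd_merge: float, satd_affine: float,
                        ratio_threshold: float = 1.0) -> bool:
    """Hypothetical early Skip test (illustrative only).

    If the best Merge candidate's SATD cost is no worse than the Affine
    candidate's cost scaled by an assumed threshold, terminate the
    remaining inter-mode search early and select Skip/Merge. In the
    paper, the actual rule and its parameters are derived from coding
    statistics of HEC-format 360-degree videos.
    """
    return satd_merge <= ratio_threshold * satd_affine

# Example: Merge cost clearly below Affine cost -> stop the mode search early
print(early_skip_decision(1200.0, 1500.0))  # True
print(early_skip_decision(2000.0, 1500.0))  # False
```

Because 8K 360-degree content captured by static cameras contains large static regions, a cheap test like this can skip the expensive remaining inter-prediction search for many blocks, which is consistent with the reported time saving.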

Original language: English
Title of host publication: 2021 IEEE International Conference on Consumer Electronics, ICCE 2021
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781728197661
DOIs
State: Published - 10 Jan 2021
Event: 2021 IEEE International Conference on Consumer Electronics, ICCE 2021 - Las Vegas, United States
Duration: 10 Jan 2021 → 12 Jan 2021

Publication series

Name: Digest of Technical Papers - IEEE International Conference on Consumer Electronics
Volume: 2021-January
ISSN (Print): 0747-668X

Conference

Conference: 2021 IEEE International Conference on Consumer Electronics, ICCE 2021
Country/Territory: United States
City: Las Vegas
Period: 10/01/21 → 12/01/21

Keywords

  • 360-degree videos
  • 8K video
  • HEC
  • Versatile Video Coding (VVC)
  • early skip mode decision
  • inter prediction
  • virtual reality (VR)
