Motion extrapolation for video story planning

Nick C. Tang, Joseph C. Tsai, Hsing Ying Zhong, Timothy K. Shih, Hong Yuan Mark Liao

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

2 Scopus citations

Abstract

We create video scenes using existing videos. A panorama is generated from the background video, with foreground objects removed by a video inpainting technique. The user provides a video planning script on the panorama with precise timing of the actors. The actions of these actors are extensions of existing cyclic motions, such as walking, which are tracked and extrapolated using our motion analysis techniques. Although the types of video story that can be generated are limited, the proposed mechanism can be used to produce forgery videos. Interested readers can view our tool demonstration and forgery videos at http://member.mine.tku.edu.tw/www/ACMMM08-VideoPlanning.
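As a rough illustration of the extrapolation idea mentioned in the abstract, the Python sketch below repeats the per-frame displacements of one observed cycle (e.g., a tracked walking cycle) to synthesize future object positions. The function name extrapolate_cycle and the simple repeat-the-displacements model are assumptions made here for illustration only; the paper's actual motion analysis techniques are not described in this record.

# Illustrative sketch only: extend a tracked cyclic motion by repeating
# the per-frame displacements observed over one cycle.
from typing import List, Tuple

Point = Tuple[float, float]

def extrapolate_cycle(cycle_positions: List[Point], num_future_frames: int) -> List[Point]:
    """Extrapolate future positions from one tracked motion cycle.

    cycle_positions: object centre per frame over exactly one cycle.
    num_future_frames: number of additional frames to synthesize.
    """
    # Per-frame displacement within the observed cycle.
    deltas = [
        (x2 - x1, y2 - y1)
        for (x1, y1), (x2, y2) in zip(cycle_positions, cycle_positions[1:])
    ]
    extrapolated = []
    x, y = cycle_positions[-1]
    for i in range(num_future_frames):
        # Reuse the cycle's displacements in order, wrapping around,
        # so the object keeps "walking" with the same gait and drift.
        dx, dy = deltas[i % len(deltas)]
        x, y = x + dx, y + dy
        extrapolated.append((x, y))
    return extrapolated

For example, a 30-frame walking cycle tracked as centre coordinates could be extended by another 90 frames with extrapolate_cycle(cycle, 90); the real system would also need to synthesize the actor's appearance, not just its trajectory.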

Original language: English
Title of host publication: MM'08 - Proceedings of the 2008 ACM International Conference on Multimedia, with co-located Symposium and Workshops
Pages: 685-688
Number of pages: 4
DOIs
State: Published - 2008
Event: 16th ACM International Conference on Multimedia, MM '08 - Vancouver, BC, Canada
Duration: 26 Oct 2008 – 31 Oct 2008

Publication series

Name: MM'08 - Proceedings of the 2008 ACM International Conference on Multimedia, with co-located Symposium and Workshops

Conference

Conference: 16th ACM International Conference on Multimedia, MM '08
Country/Territory: Canada
City: Vancouver, BC
Period: 26/10/08 – 31/10/08

Keywords

  • Layer segmentation
  • Special effect
  • Tracking
  • Video inpainting
