
Article


Title

Temporally Aware Objective Quality Metric for Immersive Video

Authors

[1] Institute of Multimedia Telecommunications, Faculty of Computing and Telecommunications, Poznań University of Technology | [P] employee

Scientific discipline (Law 2.0)

[2.3] Information and communication technology

Year of publication

2026

Published in

Applied Sciences

Journal year: 2026 | Journal volume: vol. 16 | Journal number: iss. 1

Article type

scientific article / paper

Publication language

English

Keywords
Abstract

State-of-the-art objective quality metrics designed for immersive content typically prioritize spatial distortions and can therefore miss temporal artifacts introduced by view synthesis and dynamic scene rendering. Consequently, metrics such as the widely used peak signal-to-noise ratio for immersive video (IV-PSNR) are “temporally blind”: they cannot distinguish temporally stable distortions from disruptive temporal flickering. To address this limitation, we propose a temporal extension of the IV-PSNR metric that incorporates motion information into the quality assessment process. The method augments the traditional Y, U, and V color components with a fourth channel representing motion vectors (M), enabling the proposed four-component IV-PSNRYUVM metric to account for dynamic distortions introduced by view rendering. To evaluate the effectiveness of the proposed approach, multiple configurations of motion integration were tested, including metrics based solely on motion consistency, metrics combining motion with texture, and several dense optical flow algorithms with different parameter settings. Extensive experiments performed on immersive video sequences demonstrate that the proposed four-component IV-PSNRYUVM achieves the highest correlation with subjectively perceived video quality. These results confirm the benefit of combining texture with motion information, making the proposal a valuable addition to real-world immersive video systems.
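The abstract's core idea — extending a YUV PSNR with a fourth, motion-based component — can be sketched as follows. This is a minimal illustration, not the paper's IV-PSNRYUVM: the absolute frame-difference motion proxy stands in for the dense optical flow the authors use, and the component weights are invented for the example.

```python
import numpy as np


def psnr(ref, dist, peak=255.0):
    """Standard per-channel PSNR in dB."""
    mse = np.mean((ref.astype(np.float64) - dist.astype(np.float64)) ** 2)
    if mse == 0:
        return float("inf")
    return 10.0 * np.log10(peak ** 2 / mse)


def motion_channel(prev_luma, luma):
    # Illustrative stand-in for a dense optical-flow magnitude:
    # the absolute temporal difference of the luma channel.
    return np.abs(luma.astype(np.float64) - prev_luma.astype(np.float64))


def psnr_yuvm(ref_yuv, dist_yuv, ref_prev_yuv, dist_prev_yuv,
              weights=(6.0, 1.0, 1.0, 4.0)):
    """Weighted four-component PSNR over Y, U, V and a motion channel M.

    Inputs are (3, H, W) arrays for the current and previous frames of the
    reference and distorted sequences. The weights are hypothetical, chosen
    only to show how a motion component could be folded into the score.
    """
    comps = [psnr(ref_yuv[c], dist_yuv[c]) for c in range(3)]
    m_ref = motion_channel(ref_prev_yuv[0], ref_yuv[0])
    m_dist = motion_channel(dist_prev_yuv[0], dist_yuv[0])
    comps.append(psnr(m_ref, m_dist))  # fourth component: motion fidelity
    w = np.asarray(weights, dtype=np.float64)
    return float(np.dot(w, comps) / w.sum())
```

A temporally stable offset and a flickering one yield identical per-frame Y/U/V PSNR, but the motion component penalizes only the flickering case — which is the "temporal blindness" the abstract describes.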

Date of online publication

26.12.2025

Pages (from - to)

274-1 - 274-18

DOI

10.3390/app16010274

URL

https://www.mdpi.com/2076-3417/16/1/274/pdf

Comments

Article number: 274

License type

CC BY (attribution alone)

Open Access Mode

open journal

Open Access Text Version

final published version

Release date

26.12.2025

Date of Open Access to the publication

at the time of publication

Full text of article

Download file

Access level to full text

public

Ministry points / journal

100
