
© 2023 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.

Abstract

The purpose of no-reference image quality assessment (NR-IQA) is to measure perceived image quality in line with subjective judgments; because no clean reference image is available, this remains a complicated and unresolved challenge. Massive new IQA datasets have facilitated the creation of deep learning-based image quality measures. In this research, we present a unique model for the NR-IQA challenge that employs a hybrid strategy: it leverages a pre-trained CNN model together with a unified learning mechanism that extracts both local and non-local characteristics from the input patch. A deep analysis of the proposed framework shows that its features and learning mechanism improve the monotonicity relationship between objective and subjective ratings. An intermediate objective is mapped to a quality score using a regression architecture. To extract diverse feature maps, a deep architecture with an adaptive receptive field is used. Analyses on the largest NR-IQA benchmark datasets demonstrate that the proposed technique outperforms current state-of-the-art NR-IQA measures.
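The abstract's hybrid pipeline (local CNN-style features, a non-local aggregation mechanism, and a regression head mapping unified features to a quality score) can be sketched roughly as below. This is a minimal illustrative sketch, not the author's implementation: the naive convolution stands in for a pre-trained CNN backbone, the self-attention-style aggregation stands in for the non-local mechanism, and all kernel and regression weights are hypothetical random values.

```python
import numpy as np

def local_features(patch, kernel):
    """Naive 'valid' 2-D convolution: a stand-in for the local
    (pre-trained CNN) feature extraction described in the abstract."""
    H, W = patch.shape
    k = kernel.shape[0]
    out = np.empty((H - k + 1, W - k + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(patch[i:i + k, j:j + k] * kernel)
    return out

def non_local_features(feats):
    """Self-attention-style aggregation: every spatial position attends
    to all others, capturing non-local dependencies in the patch."""
    x = feats.reshape(-1, 1)                      # (N, 1) feature vectors
    sim = x @ x.T                                 # pairwise similarities
    w = np.exp(sim - sim.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)             # softmax over positions
    return (w @ x).reshape(feats.shape)

def quality_score(patch, kernel, reg_w, reg_b):
    """Concatenate local and non-local feature maps into a unified
    representation, then regress to a scalar quality score
    (hypothetical linear regression head)."""
    loc = local_features(patch, kernel)
    nl = non_local_features(loc)
    unified = np.concatenate([loc.ravel(), nl.ravel()])
    return float(unified @ reg_w + reg_b)

# Toy demo on a random 8x8 patch with random (untrained) weights.
rng = np.random.default_rng(0)
patch = rng.random((8, 8))
kernel = rng.standard_normal((3, 3))
reg_w = rng.standard_normal(2 * 6 * 6) * 0.01     # 6x6 maps after 3x3 conv
score = quality_score(patch, kernel, reg_w, 0.5)
print(score)
```

In a real system the regression head would be trained against subjective mean opinion scores, and monotonicity with human ratings would be evaluated with rank correlation (e.g. SROCC).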

Details

Title
Improved Image Quality Assessment by Utilizing Pre-Trained Architecture Features with Unified Learning Mechanism
Author
Ryu, Jihyoung
First page
2682
Publication year
2023
Publication date
2023
Publisher
MDPI AG
e-ISSN
2076-3417
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
2779526752