Library registration is open online via the site: https://biblio.enp.edu.dz
Re-registration takes place at:
• the Annex Library for 2nd-year CPST students
• the Central Library for students in specialization programs
Signal processing. Image communication / European Association for Signal Processing. Vol. 25 N° 7
Issue date: August 2010
Published on: 16/09/2012
Contents
No-reference image and video quality estimation / Sheila S. Hemami, in Signal processing. Image communication, Vol. 25 N° 7 (August 2010)
[article]
in Signal processing. Image communication > Vol. 25 N° 7 (August 2010). - pp. 469–481
Title: No-reference image and video quality estimation: Applications and human-motivated design
Document type: printed text
Authors: Sheila S. Hemami; Amy R. Reibman
Publication year: 2012
Pages: pp. 469–481
General note: Electronics
Language: English (eng)
Keywords: No-reference; Video quality; Quality metrics; Quality estimator; Applications of quality metrics; Blind quality assessment
Abstract: This paper reviews the basic background knowledge necessary to design effective no-reference (NR) quality estimators (QEs) for images and video. We describe a three-stage framework for NR QE that encompasses the range of potential use scenarios for the NR QE and allows knowledge of the human visual system to be incorporated throughout. We survey the measurement stage of the framework, considering methods that rely on the bitstream, on pixels, or on both. By exploring both the accuracy requirements of potential uses and the evaluation criteria used to stress-test a QE, we set the stage for our community to make substantial future improvements to the challenging problem of NR quality estimation.
ISSN: 0923-5965
Online: http://www.sciencedirect.com/science/article/pii/S0923596510000688
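The three-stage structure alluded to in the abstract (measurement, then pooling, then mapping to a quality score) can be pictured with a toy pixel-based sketch. Everything below is an illustrative assumption, not the authors' framework: the gradient-magnitude measurement, the mean pooling, and the saturating mapping are placeholders.

```python
import numpy as np

def measure(frame: np.ndarray) -> np.ndarray:
    """Measurement stage: per-pixel gradient magnitude as a crude sharpness cue."""
    gy, gx = np.gradient(frame.astype(float))
    return np.hypot(gx, gy)

def pool(feature_map: np.ndarray) -> float:
    """Pooling stage: collapse the feature map into one scalar."""
    return float(feature_map.mean())

def map_to_quality(raw: float, scale: float = 50.0) -> float:
    """Mapping stage: saturating map from the raw measurement to a (0, 1) score."""
    return raw / (raw + scale)

frame = np.random.rand(64, 64) * 255            # stand-in for a decoded frame
print(map_to_quality(pool(measure(frame))))     # toy no-reference score
```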
Perceptual-based quality assessment for audio–visual services / Junyong You, in Signal processing. Image communication, Vol. 25 N° 7 (August 2010)
[article]
in Signal processing. Image communication > Vol. 25 N° 7 (August 2010). - pp. 482–501
Title: Perceptual-based quality assessment for audio–visual services: A survey
Document type: printed text
Authors: Junyong You; Ulrich Reiter; Miska M. Hannuksela
Publication year: 2012
Pages: pp. 482–501
General note: Electronics
Language: English (eng)
Keywords: Objective quality metric; Subjective quality assessment; HVS; Psychophysical approach; Engineering approach; Perception; Alignment; PEAQ; Semantic importance
Abstract: Accurate measurement of the perceived quality of audio–visual services at the end-user is becoming a crucial issue in digital applications due to the growing demand for compression and transmission of audio–visual services over communication networks. Content providers strive to offer the best quality of experience to customers, linked to their different quality of service (QoS) solutions. Developing accurate, perceptual-based quality metrics is therefore a key requirement in multimedia services. In this paper, we survey state-of-the-art signal-driven perceptual audio and video quality assessment methods independently, and investigate relevant issues in developing joint audio–visual quality metrics. Experiments against subjective quality results have been conducted to analyze and compare the performance of the quality metrics. We consider emerging trends in audio–visual quality assessment and propose feasible solutions for future work on perceptual-based audio–visual quality metrics.
ISSN: 0923-5965
Online: http://www.sciencedirect.com/science/article/pii/S0923596510000299
No-reference perceptual image quality metric using gradient profiles for JPEG2000 / Luhong Liang, in Signal processing. Image communication, Vol. 25 N° 7 (August 2010)
[article]
in Signal processing. Image communication > Vol. 25 N° 7 (August 2010). - pp. 502–516
Title: No-reference perceptual image quality metric using gradient profiles for JPEG2000
Document type: printed text
Authors: Luhong Liang; Shiqi Wang; Jianhua Chen
Publication year: 2012
Pages: pp. 502–516
General note: Electronics
Language: English (eng)
Keywords: Blur; Ringing; Perceptual quality; Gradient profile; Image compression; JPEG2000
Abstract: No-reference measurement of perceptual image quality is a crucial and challenging issue in modern image processing applications. One of the major difficulties is that some inherent features of natural images and of artifacts can be rather ambiguous. In this paper, we tackle this problem using statistical information on image gradient profiles and propose a novel quality metric for JPEG2000 images. The key part of the metric is a histogram representing the sharpness distribution of the gradient profiles, from which a blur metric that is insensitive to inherently blurred structures in the natural image is established. A ringing metric is then built from the ringing visibility of regions associated with the gradient profiles. Finally, a combination model optimized through extensive experiments is developed to predict the perceived image quality. The proposed metric achieves performance competitive with state-of-the-art no-reference metrics on public datasets and is robust to various image contents.
ISSN: 0923-5965
Online: http://www.sciencedirect.com/science/article/pii/S0923596510000160
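A rough idea of a gradient-profile blur measure can be sketched as follows, under assumptions that are not the paper's formulation: detect strong horizontal gradients, take the width of each profile at half its peak, and summarize the width distribution. The threshold and the use of the mean width are illustrative choices only.

```python
import numpy as np

def edge_widths(image: np.ndarray, grad_thresh: float = 20.0) -> np.ndarray:
    """Width at half maximum of every strong horizontal gradient profile."""
    img = image.astype(float)
    gx = np.abs(np.diff(img, axis=1))               # horizontal gradient magnitude
    widths = []
    for row in gx:
        for c in np.where(row > grad_thresh)[0]:    # columns holding a strong edge
            half = row[c] / 2.0
            left = right = c
            while left > 0 and row[left - 1] >= half:
                left -= 1
            while right < len(row) - 1 and row[right + 1] >= half:
                right += 1
            widths.append(right - left + 1)
    return np.asarray(widths)

def blur_score(image: np.ndarray) -> float:
    """Mean profile width; larger values suggest stronger blur."""
    w = edge_widths(image)
    return float(w.mean()) if w.size else 0.0

sharp = np.zeros((64, 64))
sharp[:, 32:] = 255                                 # ideal step edge
print(blur_score(sharp))                            # 1.0: profiles are one pixel wide
```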
Content-partitioned structural similarity index for image quality assessment / Chaofeng Li, in Signal processing. Image communication, Vol. 25 N° 7 (August 2010)
[article]
in Signal processing. Image communication > Vol. 25 N° 7 (August 2010). - pp. 517–526
Title: Content-partitioned structural similarity index for image quality assessment
Document type: printed text
Authors: Chaofeng Li; Alan C. Bovik
Publication year: 2012
Pages: pp. 517–526
General note: Electronics
Language: English (eng)
Keywords: Four-component image model; Image quality assessment; Structural similarity (SSIM); Multi-scale structural similarity (MS-SSIM); Gradient structural similarity (G-SSIM)
Abstract: The assessment of image quality is important in numerous image processing applications. Two prominent examples, the Structural Similarity (SSIM) index and Multi-scale Structural Similarity (MS-SSIM), operate under the assumption that human visual perception is highly adapted to extracting structural information from a scene. Results from large human studies have shown that these quality indices perform very well relative to other methods. However, SSIM and other Image Quality Assessment (IQA) algorithms are less effective when used to rate blurred and noisy images. We address this shortcoming by considering a four-component image model that classifies local image regions according to edge and smoothness properties. In our approach, SSIM scores are weighted by region type, leading to modified versions of (G-)SSIM and MS-(G-)SSIM, called four-component (G-)SSIM (4-(G-)SSIM) and four-component MS-(G-)SSIM (4-MS-(G-)SSIM). Our experimental results show that the new approach provides results that are highly consistent with human subjective judgments of the quality of blurred and noisy images, and also delivers better overall performance than (G-)SSIM and MS-(G-)SSIM on the LIVE Image Quality Assessment Database.
ISSN: 0923-5965
Online: http://www.sciencedirect.com/science/article/pii/S0923596510000354
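The content-partitioned pooling idea can be sketched as follows, assuming a per-pixel SSIM map from any standard SSIM implementation; the four-way region classification rule and the per-region weights below are illustrative stand-ins, not the values derived in the paper.

```python
import numpy as np
from scipy.ndimage import sobel, uniform_filter

def classify_regions(ref: np.ndarray, edge_t: float = 0.12, smooth_t: float = 0.06) -> np.ndarray:
    """Label each pixel 0: edge, 1: near-edge, 2: textured, 3: smooth (illustrative rule)."""
    g = np.hypot(sobel(ref, axis=0), sobel(ref, axis=1))
    g = g / (g.max() + 1e-12)
    labels = np.full(ref.shape, 2)
    labels[g < smooth_t] = 3
    labels[g >= edge_t] = 0
    near_edge = (uniform_filter((g >= edge_t).astype(float), size=5) > 0) & (labels != 0)
    labels[near_edge] = 1
    return labels

def four_component_pool(ssim_map: np.ndarray, labels: np.ndarray,
                        weights=(0.4, 0.3, 0.2, 0.1)) -> float:
    """Pool a per-pixel SSIM map with one weight per region type."""
    score, total = 0.0, 0.0
    for region, w in enumerate(weights):
        mask = labels == region
        if mask.any():
            score += w * ssim_map[mask].mean()
            total += w
    return score / total

ref = np.random.rand(64, 64)                      # stand-in reference image
ssim_map = np.clip(np.random.rand(64, 64), 0, 1)  # stand-in for a real local SSIM map
print(four_component_pool(ssim_map, classify_regions(ref)))
```

Giving edge regions the largest weight mirrors the abstract's premise that structural information dominates perceived quality, especially for blurred and noisy images.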
Deblocking filtering method using a perceptual map / Aladine Chetouani, in Signal processing. Image communication, Vol. 25 N° 7 (August 2010)
[article]
in Signal processing. Image communication > Vol. 25 N° 7 (August 2010). - pp. 527–534
Title: Deblocking filtering method using a perceptual map
Document type: printed text
Authors: Aladine Chetouani; Ghiles Mostafaoui; Azeddine Beghdadi
Publication year: 2012
Pages: pp. 527–534
General note: Electronics
Language: English (eng)
Keywords: Blocking effect; Deblocking; HVS; Masking effect
Abstract: A new deblocking method is proposed. It aims to reduce the blocking artifacts in compressed images by analyzing their visibility. A perceptual map is obtained using characteristics of the Human Visual System (HVS). This perceptual map is then used as the input to a recursive filter that reduces the blocking effect. The results obtained have been compared with a very recent, efficient method.
ISSN: 0923-5965
Online: http://www.sciencedirect.com/science/article/pii/S0923596509001222
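A hypothetical sketch of visibility-driven deblocking, not the authors' algorithm: the step across each 8x8 block boundary is compared with the local activity on either side (a crude masking term), and boundary pixels are smoothed in proportion to the resulting visibility. Only vertical boundaries are handled here.

```python
import numpy as np

def deblock_vertical_boundaries(img: np.ndarray, block: int = 8, strength: float = 0.5) -> np.ndarray:
    """Smooth pixels on either side of each vertical block boundary, scaled by visibility."""
    out = img.astype(float).copy()
    for c in range(block, out.shape[1] - 1, block):
        step = np.abs(out[:, c] - out[:, c - 1])                     # seam discontinuity
        activity = (np.abs(out[:, c + 1] - out[:, c])
                    + np.abs(out[:, c - 1] - out[:, c - 2]) + 1.0)   # crude masking term
        visibility = np.clip(step / activity, 0.0, 1.0)
        avg = 0.5 * (out[:, c] + out[:, c - 1])
        out[:, c] += strength * visibility * (avg - out[:, c])
        out[:, c - 1] += strength * visibility * (avg - out[:, c - 1])
    return out

blocky = np.kron(np.random.rand(8, 8) * 255, np.ones((8, 8)))        # toy blocky image
print(np.abs(np.diff(blocky, axis=1)).mean(),
      np.abs(np.diff(deblock_vertical_boundaries(blocky), axis=1)).mean())
```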
Modelling of spatio–temporal interaction for video quality assessment / Quan Huynh-Thu, in Signal processing. Image communication, Vol. 25 N° 7 (August 2010)
[article]
in Signal processing. Image communication > Vol. 25 N° 7 (August 2010). - pp. 535–546
Title: Modelling of spatio–temporal interaction for video quality assessment
Document type: printed text
Authors: Quan Huynh-Thu; Mohammed Ghanbari
Publication year: 2012
Pages: pp. 535–546
General note: Electronics
Language: English (eng)
Keywords: Video; Subjective quality; Objective model; Spatial quality; Temporal quality; Spatio–temporal interaction
Abstract: Video services have appeared in recent years due to advances in video coding and the convergence to IP networks. As these emerging services mature, the ability to deliver adequate quality to end-users becomes increasingly important. However, the transmission of digital video over error-prone and bandwidth-limited networks may produce spatial and temporal visual distortions in the decoded video, and both types of impairment affect the perceived video quality. In this paper, we examine the impact of spatio–temporal artefacts in video and, in particular, how the two types of error interact to affect the overall perceived video quality. We show that the impact of spatial quality on overall video quality depends on temporal quality, and vice versa. We observe that introducing a degradation in one modality affects quality perception in the other modality, and that this change is larger for high-quality conditions than for low-quality conditions. The contribution of spatial quality to overall quality is found to be greater than the contribution of temporal quality. Our results also indicate that low-motion talking-head content can be more negatively affected by temporal frame-freezing artefacts than other, higher-motion content. Based on the results of a subjective experiment, we propose an objective model that predicts overall video quality by integrating the contributions of a spatial quality and a temporal quality. The non-linear model shows a very high linear correlation with the subjective data.
ISSN: 0923-5965
Online: http://www.sciencedirect.com/science/article/pii/S0923596510000378
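The kind of non-linear integration described in the abstract can be written as a spatial term, a temporal term, and an interaction term. The functional form and coefficients below are hypothetical placeholders, not the model fitted to the subjective data; they only illustrate the spatial term being weighted more heavily and the two modalities interacting.

```python
def overall_quality(q_spatial: float, q_temporal: float,
                    a: float = 0.6, b: float = 0.25, c: float = 0.15) -> float:
    """Combine spatial and temporal quality scores in [0, 1] with an interaction term."""
    q = a * q_spatial + b * q_temporal + c * q_spatial * q_temporal
    return max(0.0, min(1.0, q))

# Degrading one modality also shrinks the cross term, mimicking the reported interaction.
print(overall_quality(0.9, 0.9))   # both modalities good
print(overall_quality(0.9, 0.3))   # a temporal degradation drags the overall score down
```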
Overt visual attention for free-viewing and quality assessment tasks / O. Le Meur, in Signal processing. Image communication, Vol. 25 N° 7 (August 2010)
[article]
in Signal processing. Image communication > Vol. 25 N° 7 (August 2010). - pp. 547–558
Title: Overt visual attention for free-viewing and quality assessment tasks: Impact of the regions of interest on a video quality metric
Document type: printed text
Authors: O. Le Meur; A. Ninassi; P. Le Callet
Publication year: 2012
Pages: pp. 547–558
General note: Electronics
Language: English (eng)
Keywords: Visual attention; Video quality assessment; Video quality metric; Free-viewing and quality tasks
Abstract: The aim of this study is to understand how people watch a video sequence during free-viewing and quality assessment tasks. To this end, two eye tracking experiments were carried out. The video dataset is composed of 10 original video sequences and 50 impaired video sequences (five levels of impairment obtained by H.264 video compression). The first experiment recorded eye movements in a free-viewing task on the 10 original video sequences. The second experiment was run in the context of a subjective quality assessment: eye movements were recorded while observers judged the quality of the 50 impaired video sequences. The comparison between gaze allocations indicates that the quality task has a moderate impact on the deployment of visual attention, and that this impact increases with the number of presentations of impaired video sequences. The locations of regions of interest remain highly similar after several presentations of the same video sequence, suggesting that eye movements are still driven by low-level visual features after several viewings. In addition, the level of distortion does not significantly alter oculomotor behavior. Finally, we modified the pooling of an objective full-reference video quality metric by adjusting the weight applied to the distortions; this adjustment depends on the visual importance deduced from the eye tracking experiment performed on the impaired video sequences. We observe that a saliency-based distortion pooling does not significantly improve the performance of the video quality metric.
ISSN: 0923-5965
Online: http://www.sciencedirect.com/science/article/pii/S0923596510000561
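Saliency-based distortion pooling, which the abstract reports does not significantly help this metric, amounts to weighting a per-pixel distortion map by a normalized visual-importance map. The maps below are random stand-ins for a real full-reference distortion map and eye-tracking-derived saliency; the pooling rule itself is an assumed, generic form.

```python
import numpy as np

def saliency_weighted_pool(distortion: np.ndarray, saliency: np.ndarray) -> float:
    """Average a distortion map with weights proportional to visual importance."""
    w = saliency / (saliency.sum() + 1e-12)
    return float((w * distortion).sum())

distortion = np.random.rand(72, 128)                # stand-in distortion map
saliency = np.random.rand(72, 128)                  # stand-in visual-importance map
uniform = np.ones_like(saliency)
print(saliency_weighted_pool(distortion, saliency),
      saliency_weighted_pool(distortion, uniform))  # compare with uniform pooling
```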
Copies
Barcode | Call number | Medium | Location | Section | Availability |
---|---|---|---|---|---|
No copies |