<?xml version="1.0" encoding="ISO-8859-1"?>
<metadatalist>
	<metadata ReferenceType="Conference Proceedings">
		<site>sibgrapi.sid.inpe.br 802</site>
		<identifier>8JMKD3MGPEW34M/438DG7H</identifier>
		<repository>sid.inpe.br/sibgrapi/2020/09.11.16.10</repository>
		<lastupdate>2020:10.01.19.25.59 sid.inpe.br/banon/2001/03.30.15.38 rafapires@gmail.com</lastupdate>
		<metadatarepository>sid.inpe.br/sibgrapi/2020/09.11.16.10.02</metadatarepository>
		<metadatalastupdate>2020:10.28.20.46.47 sid.inpe.br/banon/2001/03.30.15.38 administrator {D 2020}</metadatalastupdate>
		<citationkey>PiresSanSanSanPap:2020:ImDeUs</citationkey>
		<title>Image Denoising using Attention-Residual Convolutional Neural Networks</title>
		<format>On-line</format>
		<year>2020</year>
		<date>Nov. 7-10, 2020</date>
		<numberoffiles>1</numberoffiles>
		<size>1980 KiB</size>
		<author>Pires, Rafael Gonçalves,</author>
		<author>Santos, Daniel Felipe Silva,</author>
		<author>Santana, Marcos Cleison Silva,</author>
		<author>Santos, Claudio Filipe Gonçalves dos,</author>
		<author>Papa, João Paulo,</author>
		<affiliation>São Paulo State University (UNESP)</affiliation>
		<affiliation>São Paulo State University (UNESP)</affiliation>
		<affiliation>São Paulo State University (UNESP)</affiliation>
		<affiliation>Federal University of São Carlos (UFSCAR)</affiliation>
		<affiliation>São Paulo State University (UNESP)</affiliation>
		<editor>Musse, Soraia Raupp,</editor>
		<editor>Cesar Junior, Roberto Marcondes,</editor>
		<editor>Pelechano, Nuria,</editor>
		<editor>Wang, Zhangyang (Atlas),</editor>
		<e-mailaddress>rafapires@gmail.com</e-mailaddress>
		<conferencename>Conference on Graphics, Patterns and Images, 33 (SIBGRAPI)</conferencename>
		<conferencelocation>Virtual</conferencelocation>
		<booktitle>Proceedings</booktitle>
		<publisher>IEEE Computer Society</publisher>
		<publisheraddress>Los Alamitos</publisheraddress>
		<documentstage>not transferred</documentstage>
		<transferableflag>1</transferableflag>
		<contenttype>External Contribution</contenttype>
		<tertiarytype>Full Paper</tertiarytype>
		<keywords>image restoration, deep learning.</keywords>
		<abstract>During the image acquisition process, noise is usually added to the data, mainly due to physical limitations of the acquisition sensor and to imprecisions during data transmission and manipulation. Therefore, the resulting image needs to be processed to attenuate its noise without losing details. Non-learning-based strategies, such as filter-based and noise-prior modeling, have been adopted to solve the image denoising problem. Nowadays, learning-based denoising techniques, such as Residual Convolutional Neural Networks, have shown to be much more effective and flexible approaches. Here, we propose a new learning-based non-blind denoising technique named Attention Residual Convolutional Neural Network (ARCNN), and its extension to blind denoising named Flexible Attention Residual Convolutional Neural Network (FARCNN). The proposed methods try to learn the underlying noise expectation using an Attention-Residual mechanism. Experiments on public datasets corrupted by different levels of Gaussian and Poisson noise support the effectiveness of the proposed approaches against some state-of-the-art image denoising methods. ARCNN achieved overall average PSNR gains of around 0.44dB and 0.96dB for Gaussian and Poisson denoising, respectively. FARCNN presented very consistent results, even with slightly worse performance compared to ARCNN.</abstract>
		<language>en</language>
		<targetfile>PID6634881.pdf</targetfile>
		<username>rafapires@gmail.com</username>
		<usergroup>rafapires@gmail.com</usergroup>
		<visibility>shown</visibility>
		<mirrorrepository>sid.inpe.br/banon/2001/03.30.15.38.24</mirrorrepository>
		<nexthigherunit>8JMKD3MGPEW34M/43G4L9S</nexthigherunit>
		<hostcollection>sid.inpe.br/banon/2001/03.30.15.38</hostcollection>
		<agreement>agreement.html .htaccess .htaccess2</agreement>
		<lasthostcollection>sid.inpe.br/banon/2001/03.30.15.38</lasthostcollection>
		<url>http://sibgrapi.sid.inpe.br/rep-/sid.inpe.br/sibgrapi/2020/09.11.16.10</url>
	</metadata>
</metadatalist>