1. Identity statement
Reference Type: Conference Paper (Conference Proceedings)
Site: sibgrapi.sid.inpe.br
Holder Code: ibi 8JMKD3MGPEW34M/46T9EHH
Identifier: 8JMKD3MGPEW34M/43BCFTS
Repository: sid.inpe.br/sibgrapi/2020/09.29.21.31
Last Update: 2020:09.29.21.31.15 (UTC) administrator
Metadata Repository: sid.inpe.br/sibgrapi/2020/09.29.21.31.15
Metadata Last Update: 2022:06.14.00.00.12 (UTC) administrator
DOI: 10.1109/SIBGRAPI51738.2020.00044
Citation Key: MagalhãesSilGomMarSil:2020:EvEmWi
Title: Evaluating the Emergence of Winning Tickets by Structured Pruning of Convolutional Networks
Format: On-line
Year: 2020
Access Date: 2024, Oct. 15
Number of Files: 1
Size: 182 KiB
2. Context
Author: 1 Magalhães, Whendell Feijó
2 Silva, Jeferson Ferreira da
3 Gomes, Herman Martins
4 Marinho, Leandro Balby
5 Silveira, Plínio
Affiliation: 1 Federal University of Campina Grande
2 Federal University of Campina Grande
3 Federal University of Campina Grande
4 Federal University of Campina Grande
5 Hewlett Packard Enterprise, Brazil
Editor: Musse, Soraia Raupp
Cesar Junior, Roberto Marcondes
Pelechano, Nuria
Wang, Zhangyang (Atlas)
e-Mail Address: whendell@copin.ufcg.edu.br
Conference Name: Conference on Graphics, Patterns and Images, 33 (SIBGRAPI)
Conference Location: Porto de Galinhas (virtual)
Date: 7-10 Nov. 2020
Publisher: IEEE Computer Society
Publisher City: Los Alamitos
Book Title: Proceedings
Tertiary Type: Full Paper
History (UTC): 2020-09-29 21:31:15 :: whendell@copin.ufcg.edu.br -> administrator ::
2022-06-14 00:00:12 :: administrator -> whendell@copin.ufcg.edu.br :: 2020
3. Content and structure
Is the master or a copy? is the master
Content Stage: completed
Transferable: 1
Version Type: finaldraft
Keywords: neural network compression
structured pruning
winning tickets
weight rewinding
learning rate rewinding
Abstract: The recently introduced Lottery Ticket Hypothesis has opened a new line of investigation in neural network pruning. The hypothesis states that it is possible to find, within an over-parameterized neural network, subnetworks with high generalization capability (winning tickets). One step of the algorithm implementing the hypothesis requires resetting the weights of the pruned network to their initial random values. More recent variations of this step involve: (i) resetting the weights to the values they had at an early epoch of the unpruned network's training (weight rewinding), or (ii) keeping the final trained weights and resetting only the learning rate schedule (learning rate rewinding). Although some studies have investigated these variations, mostly with unstructured pruning, we are not aware of existing evaluations of structured pruning that consider both local and global pruning variants. In this context, this paper presents novel empirical evidence that winning tickets can be obtained by structured pruning of convolutional neural networks. We set up an experiment using the VGG-16 network trained on the CIFAR-10 dataset and compared networks pruned at different compression levels, obtained by the weight rewinding and learning rate rewinding methods under local and global pruning regimes. We use the unpruned network as a baseline and also compare the resulting pruned networks with versions of them trained from randomly initialized weights. Overall, local pruning failed to find winning tickets for both rewinding methods. Under global pruning, weight rewinding produced a few winning tickets (limited to low pruning levels) and performed about the same as or worse than random initialization. Learning rate rewinding under global pruning produced the best results, finding winning tickets at most pruning levels and outperforming the baseline.
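For orientation, the sketch below is a minimal illustration (not the authors' implementation) of one round of local structured pruning with the two rewinding variants described in the abstract, using PyTorch's torch.nn.utils.prune; the model setup, pruning fraction, and learning rate schedule are illustrative assumptions.

import copy
import torch
import torch.nn.utils.prune as prune
from torchvision.models import vgg16

# Illustrative setup: VGG-16 with a 10-class head for CIFAR-10.
model = vgg16(num_classes=10)

# Snapshot of early-epoch weights, used later for weight rewinding.
early_state = copy.deepcopy(model.state_dict())
# ... train the unpruned network to convergence here ...

# Local structured pruning: drop a fixed fraction of output channels
# (dim=0) in every conv layer, ranked by L1 norm within that layer.
for module in model.modules():
    if isinstance(module, torch.nn.Conv2d):
        prune.ln_structured(module, name="weight", amount=0.2, n=1, dim=0)

# (a) Weight rewinding: reset surviving weights to their early-epoch values.
#     Pruning reparametrizes `weight` as `weight_orig * weight_mask`.
for name, module in model.named_modules():
    if isinstance(module, torch.nn.Conv2d):
        module.weight_orig.data.copy_(early_state[name + ".weight"])

# (b) Learning rate rewinding (alternative): keep the trained weights and
#     simply restart the original learning-rate schedule before retraining.
optimizer = torch.optim.SGD(model.parameters(), lr=0.1, momentum=0.9)
scheduler = torch.optim.lr_scheduler.MultiStepLR(optimizer, milestones=[80, 120])

# Retrain the pruned network and compare its accuracy with the unpruned
# baseline and with the same architecture trained from random initialization.

Global structured pruning (ranking channels across all layers at once) would require a custom importance score, since torch.nn.utils.prune only provides global unstructured pruning out of the box.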
Arrangement 1: urlib.net > SDLA > Fonds > SIBGRAPI 2020 > Evaluating the Emergence...
Arrangement 2: urlib.net > SDLA > Fonds > Full Index > Evaluating the Emergence...
doc Directory Content: access
source Directory Content: there are no files
agreement Directory Content
agreement.html 29/09/2020 18:31 1.2 KiB 
4. Conditions of access and use
data URL: http://urlib.net/ibi/8JMKD3MGPEW34M/43BCFTS
zipped data URL: http://urlib.net/zip/8JMKD3MGPEW34M/43BCFTS
Language: en
Target File: 133.pdf
User Group: whendell@copin.ufcg.edu.br
Visibility: shown
Update Permission: not transferred
5. Allied materials
Mirror Repository: sid.inpe.br/banon/2001/03.30.15.38.24
Next Higher Units: 8JMKD3MGPEW34M/43G4L9S
8JMKD3MGPEW34M/4742MCS
Citing Item List: sid.inpe.br/sibgrapi/2020/10.28.20.46 33
sid.inpe.br/sibgrapi/2022/06.10.21.49 4
sid.inpe.br/banon/2001/03.30.15.38.24 1
Host Collection: sid.inpe.br/banon/2001/03.30.15.38
6. Notes
Empty Fields: archivingpolicy archivist area callnumber contenttype copyholder copyright creatorhistory descriptionlevel dissemination edition electronicmailaddress group isbn issn label lineage mark nextedition notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder schedulinginformation secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url volume
7. Description control
e-Mail (login): whendell@copin.ufcg.edu.br