Metadata

Identity statement area
Reference Type: Conference Paper (Conference Proceedings)
Site: sibgrapi.sid.inpe.br
Identifier: 8JMKD3MGPEW34M/45CUTES
Repository: sid.inpe.br/sibgrapi/2021/09.06.22.34
Last Update: 2021:09.06.22.53.01 (UTC) moacirponti@gmail.com
Metadata Repository: sid.inpe.br/sibgrapi/2021/09.06.22.34.57
Metadata Last Update: 2021:11.12.11.47.14 (UTC) administrator
Citation Key: PontiSantRibeCava:2021:AvPiGo
Title: Training Deep Networks from Zero to Hero: avoiding pitfalls and going beyond
Format: On-line
Year: 2021
Access Date: 2022, Jan. 22
Number of Files: 1
Size: 1275 KiB
Context area
Authors: 1. Ponti, Moacir Antonelli
         2. Santos, Fernando Pereira dos
         3. Ribeiro, Leo Sampaio Ferraz
         4. Cavallari, Gabriel Biscaro
Affiliations: 1. Universidade de São Paulo
              2. Universidade de São Paulo
              3. Universidade de São Paulo
              4. Universidade de São Paulo
Editors: Paiva, Afonso
         Menotti, David
         Baranoski, Gladimir V. G.
         Proença, Hugo Pedro
         Junior, Antonio Lopes Apolinario
         Papa, João Paulo
         Pagliosa, Paulo
         dos Santos, Thiago Oliveira
         e Sá, Asla Medeiros
         da Silveira, Thiago Lopes Trugillo
         Brazil, Emilio Vital
         Ponti, Moacir A.
         Fernandes, Leandro A. F.
         Avila, Sandra
e-Mail Address: moacirponti@gmail.com
Conference Name: Conference on Graphics, Patterns and Images, 34 (SIBGRAPI)
Conference Location: Gramado (Virtual), Brazil
Date: October 18th to October 22nd, 2021
Publisher: IEEE Computer Society
Publisher City: Los Alamitos
Book Title: Proceedings
Tertiary Type: Tutorial
History (UTC): 2021-09-06 22:53:01 :: moacirponti@gmail.com -> administrator :: 2021
               2021-11-12 11:47:14 :: administrator -> moacirponti@gmail.com :: 2021
Content and structure area
Is the master or a copy?: is the master
Content Stage: completed
Transferable: 1
Content Type: External Contribution
Keywords: Deep Learning; Convolutional Networks; Survey; Training
Abstract: Training deep neural networks can be challenging with real-world data. Using models as black boxes, even with transfer learning, can result in poor generalization or inconclusive results on small datasets or in specific applications. This tutorial covers both the basic steps and more recent options for improving models, particularly, but not exclusively, in supervised learning. It can be especially useful for datasets that are not as well prepared as those in challenges, and under scarce annotation and/or small data. We describe basic procedures such as data preparation, optimization, and transfer learning, as well as recent architectural choices, including the use of transformer modules, alternative convolutional layers, activation functions, and width/depth, along with training procedures such as curriculum, contrastive, and self-supervised learning.
Arrangement: urlib.net > SDLA > SIBGRAPI 2021 > Training Deep Networks...
doc Directory Content: access
source Directory Content: there are no files
agreement Directory Content: agreement.html 06/09/2021 19:34 1.3 KiB
Conditions of access and use area
data URL: http://sibgrapi.sid.inpe.br/ibi/8JMKD3MGPEW34M/45CUTES
zipped data URL: http://sibgrapi.sid.inpe.br/zip/8JMKD3MGPEW34M/45CUTES
Language: en
Target File: 2021_sibgrapi__tutorial_CR.pdf
User Group: moacirponti@gmail.com
Visibility: shown
Allied materials area
Mirror Repository: sid.inpe.br/banon/2001/03.30.15.38.24
Next Higher Units: 8JMKD3MGPEW34M/45PQ3RS
Host Collection: sid.inpe.br/banon/2001/03.30.15.38
Notes area
Empty Fields: accessionnumber archivingpolicy archivist area callnumber copyholder copyright creatorhistory descriptionlevel dissemination documentstage doi edition electronicmailaddress group holdercode isbn issn label lineage mark nextedition notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url versiontype volume
Description control area
e-Mail (login): moacirponti@gmail.com