Identity statement area
Reference Type: Conference Proceedings
Last Update: 2018: administrator
Metadata Last Update: 2020: administrator
Citation Key: BentoSouzFray:2018:AuApEv
Title: Multicenter Imaging Studies: Automated Approach to Evaluating Data Variability and the Role of Outliers
Date: Oct. 29 - Nov. 1, 2018
Access Date: 2020, Dec. 02
Number of Files: 1
Size: 1017 KiB
Context area
Author: 1. Bento, Mariana
        2. Souza, Roberto
        3. Frayne, Richard
Affiliation: 1. University of Calgary
        2. University of Calgary
        3. University of Calgary
Editor: Ross, Arun
Gastal, Eduardo S. L.
Jorge, Joaquim A.
Queiroz, Ricardo L. de
Minetto, Rodrigo
Sarkar, Sudeep
Papa, João Paulo
Oliveira, Manuel M.
Arbeláez, Pablo
Mery, Domingo
Oliveira, Maria Cristina Ferreira de
Spina, Thiago Vallin
Mendes, Caroline Mazetto
Costa, Henrique Sérgio Gutierrez
Mejail, Marta Estela
Geus, Klaus de
Scheer, Sergio
Conference Name: Conference on Graphics, Patterns and Images, 31 (SIBGRAPI)
Conference Location: Foz do Iguaçu, PR, Brazil
Book Title: Proceedings
Publisher: IEEE Computer Society
Publisher City: Los Alamitos
History: 2018-08-31 15:33:34 :: -> administrator :: 2018
2020-02-19 03:10:44 :: administrator -> :: 2018
Content and structure area
Is the master or a copy?: is the master
Document Stage: completed
Document Stage: not transferred
Content Type: External Contribution
Tertiary Type: Full Paper
Keywords: multicenter MR data, outlier detection, data variability
Abstract: Magnetic resonance (MR), as well as other imaging modalities, has been used in a large number of clinical and research studies for the analysis and quantification of important structures and the detection of abnormalities. In this context, machine learning is playing an increasingly important role in the development of automated tools for aiding in image quantification, patient diagnosis and follow-up. Normally, these techniques require large, heterogeneous datasets to provide accurate and generalizable results. Large, multicenter studies, for example, can provide such data. Images acquired at different centers, however, can present varying characteristics due to differences in acquisition parameters, site procedures and scanner configurations. While variability in the dataset is required to develop robust, generalizable studies (i.e., independent of the acquisition parameters or center), as in all studies there is also a need to ensure overall data quality by prospectively identifying and removing poor-quality data samples that should not be included, e.g., outliers. We wish to keep image samples that are representative of the underlying population (so-called inliers), while removing those samples that are not. We propose a framework to analyze data variability and identify samples that should be removed in order to have more representative, reliable and robust datasets. Our example case study is based on a public dataset containing T1-weighted volumetric head image data acquired at six different centers, using three different scanner vendors and at two commonly used magnetic field strengths. We propose an algorithm for assessing data robustness and finding the optimal data for study inclusion (i.e., the data size that presents with lowest variability while maintaining generalizability, i.e., using samples from all sites).
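The abstract describes separating representative samples (inliers) from outliers in a multicenter dataset. As an illustration only (the record does not specify the authors' actual algorithm), the core idea can be sketched with a simple z-score rule on a per-sample summary feature; the feature choice and threshold here are assumptions:

```python
# Illustrative sketch, NOT the paper's published method: flag outlier image
# samples by z-scoring one per-sample summary statistic (e.g., mean voxel
# intensity of each head MR volume) and keeping samples within a threshold.
import numpy as np


def find_inliers(features, z_thresh=2.5):
    """Return a boolean mask: True for inliers, False for outliers.

    features: 1-D sequence of per-sample summary statistics.
    z_thresh: hypothetical cutoff on the absolute z-score.
    """
    features = np.asarray(features, dtype=float)
    mu, sigma = features.mean(), features.std()
    if sigma == 0:  # all samples identical: everything is an inlier
        return np.ones(features.shape, dtype=bool)
    z = np.abs(features - mu) / sigma
    return z <= z_thresh


# Usage: nine plausible samples plus one far-off value to be flagged.
feats = [10.1, 9.8, 10.0, 10.3, 9.9, 10.2, 9.7, 10.0, 10.1, 25.0]
mask = find_inliers(feats)  # last entry is flagged as an outlier
```

A multicenter analysis would typically compute such statistics per site so that legitimate between-site variability is not mistaken for outliers.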
source Directory Content: there are no files
agreement Directory Content:
agreement.html 24/08/2018 13:12 1.2 KiB
Conditions of access and use area
Target File: 57_manuscript.pdf
Allied materials area
Next Higher Units: 8JMKD3MGPAW/3RPADUS
Notes area
Empty Fields: accessionnumber archivingpolicy archivist area callnumber copyholder copyright creatorhistory descriptionlevel dissemination doi edition electronicmailaddress group holdercode isbn issn label lineage mark nextedition notes numberofvolumes orcid organization pages parameterlist parentrepositories previousedition previouslowerunit progress project readergroup readpermission resumeid rightsholder secondarydate secondarykey secondarymark secondarytype serieseditor session shorttitle sponsor subject tertiarymark type url versiontype volume