
dc.contributor.author: Fernández-Pena, Alberto
dc.contributor.author: Martín de Blas, Daniel
dc.contributor.author: Navas-Sánchez, Francisco J
dc.contributor.author: Marcos-Vidal, Luis
dc.contributor.author: M Gordaliza, Pedro
dc.contributor.author: Santonja, Javier
dc.contributor.author: Janssen, Joost
dc.contributor.author: Carmona, Susanna
dc.contributor.author: Desco, Manuel
dc.contributor.author: Alemán-Gómez, Yasser
dc.date.accessioned: 2023-03-22T11:27:22Z
dc.date.available: 2023-03-22T11:27:22Z
dc.date.issued: 2023-01
dc.identifier.citation: Neuroinformatics. 2023 Jan;21(1):145-162
dc.identifier.uri: http://hdl.handle.net/20.500.12105/15682
dc.description.abstract: The archetypical folded shape of the human cortex has been a long-standing topic for neuroscientific research. Nevertheless, the accurate neuroanatomical segmentation of sulci remains a challenge. Part of the problem is the uncertainty of where a sulcus transitions into a gyrus and vice versa. This problem can be avoided by focusing on sulcal fundi and gyral crowns, which represent the topological opposites of cortical folding. We present Automated Brain Lines Extraction (ABLE), a method based on Laplacian surface collapse to reliably segment sulcal fundi and gyral crown lines. ABLE is built to work on standard FreeSurfer outputs and avoids delineating anastomotic sulci while maintaining sulcal fundi lines that traverse the regions with the highest depth and curvature. First, it segments the cortex into gyral and sulcal surfaces; then, each surface is spatially filtered. A Laplacian-collapse-based algorithm is applied to obtain a thinned representation of the surfaces. The thinned surfaces are then used for careful detection of the endpoints of the lines. Finally, sulcal fundi and gyral crown lines are obtained by eroding the surfaces while preserving the connectivity between the endpoints. The method is validated by comparing ABLE with three other sulcal extraction methods on the Human Connectome Project (HCP) test-retest database to assess the reproducibility of the different tools. The results confirm ABLE as a reliable method for obtaining sulcal lines with an accurate representation of the sulcal topology while ignoring anastomotic branches and avoiding overestimation of the sulcal fundi lines. ABLE is publicly available at https://github.com/HGGM-LIM/ABLE.
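The abstract describes a multi-stage surface pipeline (gyral/sulcal segmentation, spatial filtering, Laplacian collapse, endpoint detection, erosion with connectivity preservation). The following minimal Python sketch illustrates only the first two stages under stated assumptions; it is not the published ABLE code (see the GitHub repository above). It assumes standard FreeSurfer outputs readable with nibabel, the convention that positive ?h.sulc values mark buried (sulcal) vertices, and illustrative values for the depth threshold, smoothing weight, and iteration count.

import numpy as np
import nibabel.freesurfer.io as fsio

def split_sulcal_gyral(sulc, threshold=0.0):
    # Label each vertex as sulcal (depth above threshold) or gyral,
    # assuming positive ?h.sulc values correspond to buried cortex.
    sulcal = sulc > threshold
    return sulcal, ~sulcal

def laplacian_smooth(coords, faces, iterations=10, lam=0.5):
    # Simple iterative Laplacian smoothing as a stand-in for the spatial
    # filtering step: move each vertex a fraction lam of the way toward
    # the mean position of its mesh neighbours.
    n = coords.shape[0]
    neighbours = [set() for _ in range(n)]
    for a, b, c in faces:
        neighbours[a].update((b, c))
        neighbours[b].update((a, c))
        neighbours[c].update((a, b))
    neighbours = [np.fromiter(s, dtype=int) for s in neighbours]
    smoothed = coords.astype(float).copy()
    for _ in range(iterations):
        means = np.array([smoothed[nb].mean(axis=0) for nb in neighbours])
        smoothed += lam * (means - smoothed)
    return smoothed

if __name__ == "__main__":
    # Hypothetical FreeSurfer subject paths; replace with real data.
    coords, faces = fsio.read_geometry("subject/surf/lh.white")
    sulc = fsio.read_morph_data("subject/surf/lh.sulc")
    sulcal, gyral = split_sulcal_gyral(sulc)
    smooth_coords = laplacian_smooth(coords, faces)
    print(f"{sulcal.sum()} sulcal vertices, {gyral.sum()} gyral vertices")

The remaining stages (thinning by Laplacian collapse, endpoint detection, and connectivity-preserving erosion) are specific to ABLE and are best taken directly from the public repository.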
dc.description.sponsorship: This work was supported by the project exAScale ProgramIng models for extreme Data procEssing (ASPIDE), which has received funding from the European Union’s Horizon 2020 research and innovation program under grant agreement No 801091. This work has also received funding from “la Caixa” Foundation under project code LCF/PR/HR19/52160001. Susanna Carmona is funded by the Instituto de Salud Carlos III, co-funded by the European Social Fund “Investing in your future” (Miguel Servet Type I research contract CP16/00096). The CNIC is supported by the Instituto de Salud Carlos III (ISCIII), the Ministerio de Ciencia e Innovación (MCIN) and the Pro CNIC Foundation, and is a Severo Ochoa Center of Excellence (SEV-2015-0505). Yasser Alemán-Gómez is supported by the Swiss National Science Foundation (185897) and the National Center of Competence in Research (NCCR) SYNAPSY - The Synaptic Bases of Mental Diseases, also funded by the Swiss National Science Foundation (51AU40-1257).
dc.language.iso: eng
dc.publisher: Humana Press
dc.type.hasVersion: VoR
dc.rights.uri: http://creativecommons.org/licenses/by/4.0/
dc.subject.mesh: Magnetic Resonance Imaging
dc.subject.mesh: Connectome
dc.subject.mesh: Humans
dc.subject.mesh: Reproducibility of Results
dc.subject.mesh: Cerebral Cortex
dc.subject.mesh: Brain
dc.title: ABLE: Automated Brain Lines Extraction Based on Laplacian Surface Collapse.
dc.type: journal article
dc.rights.license: Attribution 4.0 International
dc.identifier.pubmedID: 36008650
dc.format.volume: 21
dc.format.number: 1
dc.format.page: 145
dc.identifier.doi: 10.1007/s12021-022-09601-7
dc.contributor.funder: Unión Europea. Comisión Europea. H2020
dc.contributor.funder: Fundación La Caixa
dc.contributor.funder: Instituto de Salud Carlos III
dc.contributor.funder: Unión Europea. Fondo Social Europeo (ESF/FSE)
dc.contributor.funder: Fundación ProCNIC
dc.contributor.funder: Ministerio de Ciencia e Innovación. Centro de Excelencia Severo Ochoa (España)
dc.contributor.funder: Swiss National Science Foundation
dc.description.peerreviewed:
dc.identifier.e-issn: 1559-0089
dc.relation.publisherversion: 10.1007/s12021-022-09601-7
dc.identifier.journal: Neuroinformatics
dc.repisalud.orgCNIC: CNIC::Unidades técnicas::Imagen Avanzada
dc.repisalud.institucion: CNIC
dc.relation.projectID: info:eu-repo/grantAgreement/EC/H2020/801091
dc.rights.accessRights: open access
dc.relation.projectFECYT: info:eu-repo/grantAgreement/ES/LCF/PR/HR19/52160001
dc.relation.projectFECYT: info:eu-repo/grantAgreement/ES/CP16/00096
dc.relation.projectFECYT: info:eu-repo/grantAgreement/ES/SEV-2015-0505

