Today there is no structure in Brainstorm that can handle demographic data such as sex or age. Since we may need such data to perform group analyses, I think it would be a good idea to discuss our needs here before submitting the idea to the Brainstorm developers.
So I thought we could add a new structure inside the subject structure to hold these data. For example:
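Something along these lines (a rough Python sketch only, since the actual structure would be a MATLAB struct in Brainstorm; all field names are hypothetical):

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical demographic record attached to each subject. None of these
# field names exist in Brainstorm today; they only illustrate the idea.
@dataclass
class SubjectDemographics:
    participant_id: str
    age: Optional[float] = None       # age in years at acquisition time
    sex: Optional[str] = None         # e.g. "M", "F", "O"
    handedness: Optional[str] = None  # e.g. "L", "R", "A"
```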
I think we should align with BIDS for this: see http://bids.neuroimaging.io/bids_spec.pdf and look for participants.tsv. Using a single table with all participants is more convenient than splitting the information by subject: it is easier to load the data, to spot missing values, and to review.
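As an illustration, here is a minimal Python/pandas sketch of how a single participants table can be loaded and checked (the file path and column names are dataset-dependent assumptions):

```python
import pandas as pd

# Load the whole participants table in one call (BIDS uses tab-separated values).
participants = pd.read_csv("participants.tsv", sep="\t")

# Review the table at a glance.
print(participants.head())

# Spot missing demographic values per column in one line.
print(participants.isna().sum())

# Example group split for later analyses (assumes a "sex" column exists).
print(participants.groupby("sex")["participant_id"].count())
```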
Brainstorm has made efforts to support BIDS for MEG data (see https://www.nature.com/articles/sdata2018110), but it does not handle the participants table: I checked the code, and it only loads subject-specific data files.
Reading the BIDS spec, it seems there is no standard specification of the columns in participants.tsv. Instead, it relies on a sidecar JSON file that provides some guidelines but does not seem to impose hard formatting constraints.
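For reference, a sidecar describing the columns could look like the following; the column names, descriptions, and levels below are examples, not something the spec requires:

```json
{
  "age": {
    "Description": "Age of the participant at the time of acquisition",
    "Units": "years"
  },
  "sex": {
    "Description": "Sex of the participant",
    "Levels": {
      "M": "male",
      "F": "female"
    }
  }
}
```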
As to where to store this participant information, I would rather create a table in @group, since we'd load everything at once for all subjects.
Let's draft something on our side (wiki + code) and then ask the Brainstorm people whether they'd like to integrate it.
Hi.
Moreover, having access to demographic data would be useful when computing the MBLL (modified Beer-Lambert law).
Best,
Edouard