.. _space_net:

==========================================================
SpaceNet: decoding with spatial structure for better maps
==========================================================

The SpaceNet decoder
====================

:class:`nilearn.decoding.SpaceNetRegressor` and :class:`nilearn.decoding.SpaceNetClassifier`
implement spatial penalties which improve both brain decoding power and the
interpretability of decoder maps (a short usage sketch follows the list below):

* penalty="tvl1": priors inspired by TV (Total Variation) [:footcite:t:`michel:inria-00563468`] and TV-L1 [:footcite:t:`Baldassarre2012`], [:footcite:t:`gramfort:hal-00839984`].

* penalty="graph-net": GraphNet prior [:footcite:t:`GROSENICK2013304`].

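For instance, the penalty is selected through the ``penalty`` parameter of the
estimator. The snippet below is a minimal sketch: ``fmri_imgs`` (a 4D Niimg-like
object) and ``labels`` (one label per volume) are placeholder names assumed to
be loaded beforehand.

.. code-block:: python

    from nilearn.decoding import SpaceNetClassifier

    # Graph-Net penalty; see the list above for the TV-L1 alternative.
    # `fmri_imgs` and `labels` are placeholders for data loaded elsewhere.
    decoder = SpaceNetClassifier(penalty="graph-net")
    decoder.fit(fmri_imgs, labels)

    # The fitted weights are exposed as a Nifti image and can be
    # visualized like any other statistical map.
    coef_img = decoder.coef_img_
    predictions = decoder.predict(fmri_imgs)
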
These penalties regularize :term:`classification` and :term:`regression`
problems in brain imaging. The resulting brain maps are both
sparse (i.e. regression coefficients are zero everywhere, except at
predictive :term:`voxels<voxel>`) and structured (blobby). The superiority of TV-L1
over methods without structured priors, such as the Lasso, :term:`SVM`, :term:`ANOVA` or
Ridge, in yielding more interpretable maps and improved
prediction scores is now well established [:footcite:t:`Baldassarre2012`], [:footcite:t:`gramfort:hal-00839984`], [:footcite:t:`GROSENICK2013304`].

Note that the TV-L1 prior leads to a difficult optimization problem, so fitting can be slow.
Under the hood, a few heuristics are used to make things a bit faster; the
parameters that control them are shown in the sketch after the list below. These include:

- Feature preprocessing, where an F-test is used to eliminate
  non-predictive :term:`voxels<voxel>`, thus reducing the size of the brain
  mask in a principled way.
- Continuation along the regularization path, where the
  solution of the optimization problem for a given value of the
  regularization parameter `alpha` is used as initialization
  for the next (smaller) value of `alpha` on the regularization
  grid.

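Both heuristics can be tuned when constructing the estimator. The sketch below
is illustrative only, assuming the ``screening_percentile`` (univariate F-test
screening) and ``n_alphas`` (length of the regularization grid) parameters; the
values shown are not recommendations.

.. code-block:: python

    from nilearn.decoding import SpaceNetRegressor

    decoder = SpaceNetRegressor(
        penalty="graph-net",
        # Keep only the top 20% of voxels ranked by an F-test
        # before running the solver.
        screening_percentile=20.0,
        # Number of `alpha` values on the grid traversed with
        # warm restarts (continuation).
        n_alphas=10,
    )
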
**Implementation:** See [:footcite:t:`dohmatob:hal-01147731`] and [:footcite:t:`dohmatob:hal-00991743`] for technical details regarding the implementation of SpaceNet.

Related example
===============

:ref:`Age prediction on OASIS dataset with SpaceNet <sphx_glr_auto_examples_02_decoding_plot_oasis_vbm_space_net.py>`.

.. figure:: ../auto_examples/02_decoding/images/sphx_glr_plot_oasis_vbm_space_net_002.png
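
In condensed form, that example predicts each subject's age from gray-matter
density maps with a :class:`nilearn.decoding.SpaceNetRegressor`. The sketch
below is a rough outline only; the full example also splits the data,
evaluates predictions, and plots the weight map.

.. code-block:: python

    from nilearn.datasets import fetch_oasis_vbm
    from nilearn.decoding import SpaceNetRegressor

    # Gray-matter maps and the age of each subject
    # (subject count chosen arbitrarily for this sketch).
    oasis = fetch_oasis_vbm(n_subjects=100)
    gm_imgs = oasis.gray_matter_maps
    age = oasis.ext_vars["age"].astype(float)

    # Graph-Net-penalized regression of age on voxel intensities.
    decoder = SpaceNetRegressor(penalty="graph-net", screening_percentile=20.0)
    decoder.fit(gm_imgs, age)

    # `decoder.coef_img_` then holds the fitted weight map as a Nifti image.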

.. note::

    Empirical comparisons using this method have been removed from the
    documentation in version 0.7 to keep its computational cost low. You can
    easily try SpaceNet instead of FREM in the :ref:`mixed gambles study <sphx_glr_auto_examples_02_decoding_plot_mixed_gambles_frem.py>` or the :ref:`Haxby study <sphx_glr_auto_examples_02_decoding_plot_haxby_frem.py>`.

.. seealso::

    :ref:`FREM <frem>`, a pipeline ensembling many models that yields very
    good decoding performance at a lower computational cost.

References
==========

.. footbibliography::