<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="https://www.nitrc.org/themes/nitrc3.0/css/rss.xsl.php?feed=https://www.nitrc.org/export/rss20_forum.php?forum_id=8139" ?>
<?xml-stylesheet type="text/css" href="https://www.nitrc.org/themes/nitrc3.0/css/rss.css" ?>
<rss version="2.0"> <channel>
  <title>NITRC News Group Forum: combined-fmri--and-eye-movement-based-decoding-of-bistable-plaid-motion-perception.</title>
  <link>http://www.nitrc.org/forum/forum.php?forum_id=8139</link>
  <description>
        &lt;p&gt;&lt;b&gt;Combined fMRI- and eye movement-based decoding of bistable plaid motion perception.&lt;/b&gt;&lt;/p&gt;          
        &lt;p&gt;Neuroimage. 2017 Dec 30.&lt;/p&gt;
        &lt;p&gt;Authors:  Wilbertz G, Ketkar M, Guggenmos M, Sterzer P&lt;/p&gt;
        &lt;p&gt;Abstract&lt;br/&gt;
        The phenomenon of bistable perception, in which perception alternates spontaneously despite constant sensory stimulation, has been particularly useful in probing the neural bases of conscious perception. The study of such bistability requires access to the observer's perceptual dynamics, which is usually achieved via active report. This report, however, constitutes a confounding factor in the study of conscious perception and can also be biased in the context of certain experimental manipulations. One approach to circumvent these problems is to track perceptual alternations using signals from the eyes or the brain instead of observers' reports. Here we aimed to optimize such decoding of perceptual alternations by combining eye and brain signals. Eye-tracking and functional magnetic resonance imaging (fMRI) were performed in twenty participants while they viewed a bistable visual plaid motion stimulus and reported perceptual alternations. Multivoxel pattern analysis (MVPA) for fMRI was combined with eye-tracking in a support vector machine to decode participants' perceptual time courses from fMRI and eye-movement signals. While both measures individually already yielded high decoding accuracies (on average 86% and 88% correct, respectively), classification based on the two measures together further improved the accuracy (91% correct). These findings show that leveraging both fMRI and eye movement data may pave the way for optimized no-report paradigms through improved decodability of bistable motion perception and hence for a better understanding of the neural correlates of consciousness.&lt;br/&gt;
        &lt;/p&gt;&lt;p&gt;PMID: 29294388 [PubMed - as supplied by publisher]&lt;/p&gt;
    </description>
  <language>en-us</language>
  <copyright>Copyright 2000-2026 NITRC OSI</copyright>
  <webMaster></webMaster>
  <lastBuildDate>Sat, 02 May 2026 12:49:09 GMT</lastBuildDate>
  <docs>http://blogs.law.harvard.edu/tech/rss</docs>
  <generator>NITRC RSS generator</generator>
 </channel>
</rss>
