Creating a list of important computational predictions
Dear all,
I believe you would agree that the greatest recognition of our computational work comes with the experimental verification of our predictions.
To highlight the most important outcomes of our theoretical work, and to facilitate closer interaction with experimental labs so as to increase the chances of our predictions being experimentally tested, I would like to encourage you to nominate your favorite prediction via the following Google form:
https://docs.google.com/forms/d/e/1FAIpQLSeJZ2RrXJ0nZonIuyNzN0ClgXDKDb7xTzoJOiqZTGxm9Kh7ow/viewform?c=0&w=1&usp=mail_form_link
This effort started during the CNS 2017 meeting in Antwerp with the support of the OCNS Board. The resulting list will be posted on the CNS 2017 web site and disseminated on social media as a reference for experimentalists.
Thank you very much for supporting this important effort!
Best wishes,
Yiota Poirazi
Neural Computation - Volume 32, Number 12 - December 1, 2020
available online for download now:
http://www.mitpressjournals.org/toc/neco/32/12
http://cognet.mit.edu/content/neural-computation
-----
Articles
Resonator Networks Outperform Optimization Methods at Solving High-dimensional Vector Factorization
Spencer Kent, E. Paxon Frady, Friedrich T. Sommer, and Bruno A. Olshausen
Resonator Networks for Factoring Distributed Representations of Data Structures
E. Paxon Frady, Spencer Kent, Bruno A. Olshausen, and Friedrich T. Sommer
Differential Covariance: A New Method to Estimate Functional Connectivity in fMRI
Tiger W. Lin, Yusi Chen, Qasim Bukhari, Giri P. Krishnan, Maxim Bazhenov, and Terrence J. Sejnowski
Letters
Synchrony and Complexity in State-related EEG Networks: An Application of Spectral Graph Theory
Amir Hossein Ghaderi, Bianca R. Baltaretu, Masood Nemati Andevari, Vishal Bharmauria, and Fuat Balci
Toward a Unified Framework for Cognitive Maps
Woori Kim and Yongseok Yoo
Active Learning for Level Set Estimation Under Input Uncertainty and Its Extensions
Yu Inatsu, Masayuki Karasuyama, Keiichi Inoue, and Ichiro Takeuchi
Redundancy-aware Pruning of Convolutional Neural Networks
GuoTian Xie
Analyzing and Accelerating the Bottlenecks of Training Deep SNNs With Backpropagation
Ruizhi Chen and Ling Li
------------
ON-LINE -- http://www.mitpressjournals.org/neuralcomp
MIT Press Journals, One Rogers Street, Cambridge, MA 02142-1209 Tel: (617) 253-2889 FAX: (617) 577-1545 journals-cs@mit.edu
------------