Hi everyone,

We have started a new collaborative, open computational neuroscience research project (in the spirit of the Polymath Project<https://en.wikipedia.org/wiki/Polymath_Project>, which was a huge success in mathematics). Everyone is invited to participate, and anyone who contributes will get to be an author. Some details are below, but you can jump straight in and take a look at the website here: https://comob-project.github.io/snn-sound-localization/

The idea is to experiment with doing science in a spirit of collaboration rather than competition. We've set up an initial research topic and some code/results; from there, you choose what direction you want to take, get feedback from others, and give feedback on their ideas. Our infrastructure makes all contributions appear automatically on the website. We're also running a discussion group and planning local workshops. The first of those will be in Paris on July 8th, just before FENS.

This project grew out of my Cosyne tutorial<https://neural-reckoning.github.io/cosyne-tutorial-2022/> on using machine learning methods with spiking neural networks (surrogate gradient descent). It's a relatively easy topic to get into, with a lot of low-hanging fruit to pick, so hopefully ideal for this approach. If you watch the tutorial video and run the notebooks, you're already at the boundary of new research.

Contributing is super easy. Go to our GitHub repo, open it up, and start running the code on Google Colab (one click, no local installation required). Use one of the existing notebooks to start your own, commit the changes (via pull request for the first one), and your notebook automatically appears on the website. Detailed instructions are on the website<https://comob-project.github.io/snn-sound-localization/Contributing.html>.

Thanks, and hope to see your contributions and discussion soon!
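For anyone wondering what "surrogate gradient descent" means in practice: spikes are a step function with zero derivative almost everywhere, so during backpropagation the step's derivative is swapped for a smooth surrogate. Here's a minimal sketch using the fast-sigmoid surrogate, a common choice, though not necessarily the exact form used in the tutorial notebooks:

```python
import numpy as np

def spike(v):
    """Heaviside step: the neuron fires when membrane potential crosses 0.
    Its true derivative is zero almost everywhere, so it blocks gradients."""
    return (v >= 0).astype(float)

def surrogate_grad(v, beta=10.0):
    """Fast-sigmoid surrogate derivative, 1 / (1 + beta*|v|)^2.
    Used in place of the step's derivative on the backward pass;
    it peaks at 1 when v == 0 and decays away from threshold."""
    return 1.0 / (1.0 + beta * np.abs(v)) ** 2

v = np.array([-1.0, 0.0, 1.0])
print(spike(v))           # forward pass: [0. 1. 1.]
print(surrogate_grad(v))  # backward pass: largest at v == 0
```

In a real SNN framework this is wired in as a custom autograd function (forward: step, backward: surrogate), which is what lets ordinary gradient descent train spiking networks.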
Dan Goodman<http://neural-reckoning.org/> Tomas Fiers<https://tomasfiers.net/> Marcus Ghosh<https://neural-reckoning.org/marcus_ghosh.html> Rory Byrne<https://twitter.com/ryrobyrne>