{"forum": "BylUXXFI8S", "submission_url": "https://openreview.net/forum?id=BylUXXFI8S", "submission_content": {"abstract": "Flies and mice are species separated by 600 million years of evolution, yet have evolved olfactory systems that share many similarities in their anatomic and functional organization. What functions do these shared anatomical and functional features serve, and are they optimal for odor sensing? In this study, we address the optimality of evolutionary design in olfactory circuits by studying artificial neural networks trained to sense odors. We found that artificial neural networks quantitatively recapitulate structures inherent in the olfactory system, including the formation of glomeruli onto a compression layer and sparse and random connectivity onto an expansion layer. Finally, we offer theoretical justifications for each result. Our work offers a framework to explain the evolutionary convergence of olfactory circuits, and gives insight and logic into the anatomic and functional structure of the olfactory system.", "keywords": ["evolution", "perception", "olfaction", "connectivity"], "title": "Evolving the Olfactory System", "authors": ["Robert Guangyu Yang", "Peter Yiliu Wang", "Yi Sun", "Ashok Litwin-Kumar", "Richard Axel", "LF Abbott"], "TL;DR": "Artificial neural networks evolved the same structures present in the olfactory systems of flies and mice after being trained to classify odors", "pdf": "/pdf/028c1846f01aaba2c4064b8da8d304167e508389.pdf", "authorids": ["gyyang.neuro@gmail.com", "yw2500@columbia.edu", "yisun@math.columbia.edu", "ak3625@columbia.edu", "ra27@columbia.edu", "lfa2103@columbia.edu"], "paperhash": "yang|evolving_the_olfactory_system"}, "submission_cdate": 1568211741924, "submission_tcdate": 1568211741924, "submission_tmdate": 1572303100933, "submission_ddate": null, "review_id": ["S1eZSksPDH", "r1xQZaPFPH"], "review_url": ["https://openreview.net/forum?id=BylUXXFI8S&noteId=S1eZSksPDH", 
"https://openreview.net/forum?id=BylUXXFI8S&noteId=r1xQZaPFPH"], "review_cdate": [1569333049319, 1569451258796], "review_tcdate": [1569333049319, 1569451258796], "review_tmdate": [1570047559484, 1570047553634], "review_readers": [["everyone"], ["everyone"]], "review_writers": [["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper3/AnonReviewer2"], ["NeurIPS.cc/2019/Workshop/Neuro_AI/Paper3/AnonReviewer1"]], "review_reply_count": [{"replyCount": 0}, {"replyCount": 0}], "review_replyto": ["BylUXXFI8S", "BylUXXFI8S"], "review_content": [{"evaluation": "4: Very good", "intersection": "2: Low", "importance_comment": "This work is important because it uses stochastic gradient descent (SGD) as an optimisation tool, rather than a 'learning model' as is so commonly done these days. The authors then argue that certain quantities in the model are robust to optimisation / optimal under certain constraints by converging to these parameter values for a wide range of initial conditions.", "clarity": "4: Well-written", "technical_rigor": "3: Convincing", "intersection_comment": "This is really a theoretical neuroscience paper, I think.", "rigor_comment": "Though the parameter-space search is still numerical, the search methodology (i.e. optimisation) is adequate for the claims made.\nAn interesting step further would be to look at how 'sharp' posteriors over these parameters are (e.g. as in work like [Lueckmann, J. et al. Amortised inference for mechanistic models of neural dynamics; COSYNE2019])", "comment": "This work is interesting and has nice conclusions, though its relevance to this exact workshop may be a little off. It does not specify any sort of \"idea cross pollination\" from AI<->neuroscience. 
Still I think many in the attending audience will find it interesting.", "importance": "3: Important", "title": "Using SGD as an optimisation tool to find robust principles in a neuroscience model", "category": "Not applicable", "clarity_comment": "Nice clear paper."}, {"title": "Understanding the olfactory circuit by perturbing it in silico", "importance": "4: Very important", "importance_comment": "ANNs based on the olfactory circuit, which uses dimensionality compression followed by expansion, have already provided performance on nearest-neighbor lookup comparable with modern hashing algorithms. Continued research into understanding why the olfactory circuit has evolved its unique architecture could provide key insight into designing a new class of biologically inspired neural networks. ", "rigor_comment": "The findings in this paper appear strong, as they match experimental findings of PN-KC connectivity. I do have some questions about the model:\n- Do you see the same connectivity results if you include an APL component in the model? \n- Do you see the same results if you implement divisive normalization in the model? (Something like a softmax function between the ORN and PN)\n- Does each odor class have the same number of odors, so it assumes that the fly encounters all odors evenly, or are some odor classes rarer than others?", "clarity_comment": "This paper is well-written; I have some small questions:\n- Is the data used synthetic data, or is it based on a real dataset? (Hallem?)\n- How was the data partitioned into training/validation/test sets?\n- For the odor classification, what was the precision and recall across odors?\n- \"activities of different PNs are uncorrelated\", how was this determined?\n- At the end of a paragraph, \"The formation of glomeruli is minimally dependent on input noise\", what does this refer to?\n- Fig. 3: Does this include changes to the number of PNs and KCs? 
Or just ORs?", "clarity": "4: Well-written", "evaluation": "5: Excellent", "intersection_comment": "This paper trains an artificial feed-forward neural network inspired by the architecture of the fly olfaction circuit. The authors perform experiments in silico and examine how the architecture of the model makes the network resistant to fluctuations in the weights.", "intersection": "4: High", "comment": "I think this paper is quite good, as it provides a hypothesis on why the olfactory circuit has evolved its distinct architecture. I had some comments on what kind of data was used to train the model (real or synthetic), and also on whether the same connectivity results persist if the model includes components such as divisive normalization and the APL.", "technical_rigor": "4: Very convincing", "category": "AI->Neuro"}], "comment_id": [], "comment_cdate": [], "comment_tcdate": [], "comment_tmdate": [], "comment_readers": [], "comment_writers": [], "comment_reply_content": [], "comment_content": [], "comment_replyto": [], "comment_url": [], "meta_review_cdate": null, "meta_review_tcdate": null, "meta_review_tmdate": null, "meta_review_ddate": null, "meta_review_title": null, "meta_review_metareview": null, "meta_review_confidence": null, "meta_review_readers": null, "meta_review_writers": null, "meta_review_reply_count": null, "meta_review_url": null, "decision": "Accept (Poster)"}