Title: Biological neural network based chemotaxis behaviors modeling of C. elegans
Authors: Xu, J.-X.; Deng, X.
Issue Date: 2010
Citation: Xu, J.-X., Deng, X. (2010). Biological neural network based chemotaxis behaviors modeling of C. elegans. Proceedings of the International Joint Conference on Neural Networks. ScholarBank@NUS Repository. https://doi.org/10.1109/IJCNN.2010.5596961
Abstract: In this work, for the first time, a biologically realistic neural circuit is used to model the chemotaxis behaviors of the nematode Caenorhabditis elegans (C. elegans), such as food attraction, toxin avoidance, and multi-task behaviors. The use of a biological neural network becomes feasible because the structure and connectivity of the C. elegans nervous system have been completely mapped by anatomical research. Several biological neural network structures are extracted from the anatomical wiring diagram of C. elegans, each functionally complete from sensory neurons to motor neurons. In particular, both single-sensor and dual-sensor configurations are considered. The biological neural network is constructed mathematically using the dynamical neural network approach. The real-time recurrent learning (RTRL) algorithm is then applied to train the network to approximate a set of switch functions that describe different chemotaxis behaviors of C. elegans. Simulation results show that the biological neural circuit can be trained by RTRL to successfully capture the chemotaxis behaviors of C. elegans. © 2010 IEEE.
Source Title: Proceedings of the International Joint Conference on Neural Networks
URI: http://scholarbank.nus.edu.sg/handle/10635/69507
ISBN: 9781424469178
DOI: 10.1109/IJCNN.2010.5596961
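The abstract names the real-time recurrent learning (RTRL) algorithm as the training method for the dynamical neural network. As a rough illustration of how RTRL works, the following is a minimal sketch for a small fully-recurrent tanh network tracking a target signal; it is a toy setup and not the paper's actual circuit, and the network size, learning rate, and single-unit readout are assumptions chosen for simplicity.

```python
import numpy as np

def rtrl_train(inputs, targets, n_hidden=5, lr=0.05, epochs=200, seed=0):
    """Online RTRL for a small fully-recurrent network (toy example).

    The network has one scalar input, n_hidden recurrent tanh units, and
    reads out unit 0 as its output. RTRL propagates the sensitivities
    p[k, i, j] = d y_k / d W_ij forward in time alongside the state, so
    the exact gradient of the instantaneous error is available online.
    """
    rng = np.random.default_rng(seed)
    n = n_hidden
    W = rng.normal(0.0, 0.3, (n, n))   # recurrent weights
    w_in = rng.normal(0.0, 0.3, n)     # input weights
    losses = []
    for _ in range(epochs):
        y = np.zeros(n)                # unit activations
        p = np.zeros((n, n, n))        # d y_k / d W_ij
        q = np.zeros((n, n))           # d y_k / d w_in_i
        total = 0.0
        for t in range(len(inputs)):
            s = W @ y + w_in * inputs[t]
            y_new = np.tanh(s)
            d = 1.0 - y_new ** 2       # tanh'(s)
            # Sensitivity recurrences:
            # p_kij(t+1) = f'(s_k) * (delta_ki * y_j(t) + sum_l W_kl p_lij(t))
            p_new = np.einsum('kl,lij->kij', W, p)
            for k in range(n):
                p_new[k, k, :] += y
            p_new *= d[:, None, None]
            # q_ki(t+1) = f'(s_k) * (delta_ki * x(t) + sum_l W_kl q_li(t))
            q_new = np.einsum('kl,li->ki', W, q)
            q_new[np.arange(n), np.arange(n)] += inputs[t]
            q_new *= d[:, None]
            # Squared error on the readout unit; online gradient step
            e = y_new[0] - targets[t]
            total += 0.5 * e * e
            W -= lr * e * p_new[0]
            w_in -= lr * e * q_new[0]
            y, p, q = y_new, p_new, q_new
        losses.append(total)
    return W, w_in, losses
```

RTRL's cost is its main drawback: the sensitivity tensor has O(n^3) entries updated at O(n^4) cost per step, which is why it is practical only for small circuits such as the extracted C. elegans subnetworks.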
Appears in Collections: Staff Publications
Files in This Item:
There are no files associated with this item.