
NI2012 abstract


Reducing duplication and redundancy in declarative model specifications


Robert Cannon (Textensor Limited), Padraig Gleeson (University College London), Sharon Crook (Arizona State University), Angus Silver (University College London)

Methods for storing and sharing biological models, whether as scripts or in declarative languages such as SBML or CellML, tend to focus on directly encoding a particular model and its equations. However, many models share essentially the same equations, differing only in parameter values or in the number of instances of particular processes. We have therefore developed a mechanism within NeuroML [1] whereby the common structural and mathematical features of a family of models can be expressed separately from the parameter values that define a particular member of the family. Specifications in this form correspond closely to the conceptual understanding that modelers have of the systems they work on. Supporting this style of description allows models to be expressed more concisely than with flatter structures, since common elements need not be repeated. It also enables a wide range of model-related processing beyond simulation itself, including systematic comparisons, archiving, model composition, and sensitivity analysis.
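To illustrate the separation of shared structure from parameter values, a schematic LEMS-style fragment might define one component type and two family members. The element and attribute names below follow LEMS conventions, but the fragment is illustrative only, not a validated NeuroML document:

```xml
<!-- Shared structure: state variables and dynamics, with 'tau' left as a parameter -->
<ComponentType name="decayingQuantity">
    <Parameter name="tau" dimension="time"/>
    <Dynamics>
        <StateVariable name="v" dimension="none"/>
        <TimeDerivative variable="v" value="-v / tau"/>
    </Dynamics>
</ComponentType>

<!-- Two members of the family, differing only in the value supplied for 'tau' -->
<Component id="fast" type="decayingQuantity" tau="2ms"/>
<Component id="slow" type="decayingQuantity" tau="20ms"/>
```

The equations appear once, in the type definition; each concrete model contributes only its parameter values.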

The resulting system, known as LEMS (Low Entropy Model Specification), was developed to meet the needs of NeuroML but can also be applied to other declarative model specifications. It makes it possible to retrofit most of the existing high-level concepts in NeuroML version 1 with domain-independent definitions built out of LEMS elements. This preserves the benefits of the existing semantically rich high-level concepts while allowing new concepts to be added at the user level rather than requiring changes to the language itself. Simulator developers can choose between directly supporting the library of core types in NeuroML and supporting the underlying LEMS definitions. Libraries are available in both Java and Python so that simulator developers can work with LEMS models without implementing the low-level support themselves. These include options for flattening models into sets of simultaneous differential equations and for code generation to create standalone executable units.
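The flattening step described above can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the API of the actual LEMS libraries: a hypothetical `ComponentType` holds the shared equations, a `Component` supplies parameter values, and `flatten` turns a component into a plain ODE right-hand side that a simple integrator can drive:

```python
from dataclasses import dataclass

@dataclass
class ComponentType:
    """Shared structure of a model family: states and their derivative expressions."""
    parameters: list           # parameter names, e.g. ["tau"]
    state: dict                # state variable -> initial value
    derivatives: dict          # state variable -> d/dt expression (string)

@dataclass
class Component:
    """One member of the family: a type plus concrete parameter values."""
    ctype: ComponentType
    params: dict

def flatten(comp):
    """Flatten a component into an ODE right-hand side over its state variables."""
    def rhs(state):
        env = dict(comp.params)
        env.update(state)
        return {v: eval(expr, {}, env) for v, expr in comp.ctype.derivatives.items()}
    return rhs

def euler(comp, dt, steps):
    """Integrate the flattened model with forward Euler (illustrative only)."""
    state = dict(comp.ctype.state)
    rhs = flatten(comp)
    for _ in range(steps):
        d = rhs(state)
        state = {v: state[v] + dt * d[v] for v in state}
    return state

# One shared type, two components differing only in the time constant
decay = ComponentType(["tau"], {"v": 1.0}, {"v": "-v / tau"})
fast = Component(decay, {"tau": 2.0})
slow = Component(decay, {"tau": 20.0})
```

Because the equations live only in the type definition, adding another family member is a one-line `Component`, and the same flattening machinery serves all of them.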

1. Gleeson P, Crook S, Cannon RC, Hines ML, Billings GO, et al. NeuroML: A Language for Describing Data-Driven Models of Neurons and Networks with a High Degree of Biological Detail. PLoS Comput Biol 6(6): e1000815, 2010. doi:10.1371/journal.pcbi.1000815
Preferred presentation format: Poster
Topic: General neuroinformatics