Claremont McKenna College - Mathematics
A fast spiking neural network analysis toolkit that combines the ease of use of a Python frontend with the power and speed of a C++ backend.
https://github.com/HRLAnalysis/HRLAnalysis
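As a rough, hypothetical illustration of the kind of workload such a toolkit targets (this is not the HRLAnalysis API; the function and data layout below are assumptions for the sketch), a plain NumPy version of one basic spike-train statistic might look like:

    import numpy as np

    def mean_firing_rates(spike_ids, n_cells, duration_s):
        """Mean firing rate (Hz) per cell from a flat array of spike cell indices."""
        counts = np.bincount(spike_ids, minlength=n_cells)
        return counts / duration_s

    # Hypothetical usage with synthetic data: 5 cells observed for 10 seconds.
    rng = np.random.default_rng(0)
    ids = rng.integers(0, 5, size=400)      # cell index of each recorded spike
    print(mean_firing_rates(ids, n_cells=5, duration_s=10.0))

In a large simulation the same computation runs over hundreds of millions of spike events, which is where a compiled C++ backend behind the Python interface pays off.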
Corey Thibeault
Michael O'Brien
Neural Analytics
Claremont McKenna College
Raytheon
Aerospace Corporation
Reed Institute of Decision Science
HRL Laboratories, LLC
Claremont McKenna College
Los Angeles
CA
Senior Data Scientist
Neural Analytics
Malibu
CA
Member of Center for Neural and Emergent Systems
Information and System Sciences Laboratory
• Work on IARPA’s KRNS project
• Develop optimization techniques for signal processing of fMRI data
• Use Formal Concept Analysis (rooted in lattice theory) to map fMRI data to semantic structures to understand the hierarchical structure of the human brain
• Work on DARPA’s SyNAPSE project
• Contribute to the development of the large-scale neural network simulator HRLSim
• Look for a minimal set of biological characteristics that achieve desired neural network behaviors
• Develop novel neural network models that exemplify self-organization
• Develop reinforcement learning algorithms to train networks to perform desired tasks
• Develop analytical techniques and metrics to quantify the network learning dynamics in concrete terms
• Contribute to the open-source analytics package HRLAnalysis for analyzing large spiking networks
• HRL Outstanding Team Award
Research Scientist
HRL Laboratories, LLC
Investigated several aspects of additive number theory
· Collaborated with a team of undergraduate mathematics students
· Employed creative problem solving to prove an original mathematical theorem
· Presented weekly progress reports to other students and professors
· Wrote a mathematical paper describing the findings (to be published)
Reed Institute of Decision Science
Claremont McKenna College
Claremont
CA
Taught single variable calculus
· Developed class curriculum including daily lectures, homework assignments, and exams
Adjunct Professor, Mathematics
Developed methods to check for errors in telemetry tables
· Wrote software that uses these methods to automatically check the telemetry tables for errors, repair the errors, and write error analysis reports
· Supported the Systems Engineering Department through various data and error analysis tasks
Raytheon
Software Engineering Intern
Model and analyze satellite software systems to predict failure rates, average system availability, and overall readiness of the software package
· Find root causes of problems based on system models
· Advise program offices on the results of software analysis and either recommend software for deployment or estimate the amount of testing needed to bring it to maturity
· Spot Award for excellence in work on SBIRS during the summer of 2006
· Spot Award for excellence in work on AEHF during the summer of 2007
Aerospace Corporation
Claremont McKenna College
Claremont
CA
Taught single variable calculus
· Developed class curriculum including daily lectures, homework assignments, and exams
Adjunct Professor, Mathematics
Master's
Mathematics
UCLA
Doctor of Philosophy (Ph.D.)
Mathematics; Computational Neuroscience
UCLA
Bachelor of Science
Physics; Mathematics
Claremont McKenna College
Neural Analytics Inc. Awarded National Science Foundation Grant of $743,756 for Non-Invasive Monitoring of Intracranial Pressure for Traumatic Brain Injury
LOS ANGELES--(BUSINESS WIRE)-- Neural Analytics, a medical device company focused on developing devices and services to diagnose, monitor, and triage a number of neurological conditions, announced it has been awarded a $743,756 National Science Foundation grant for its Small Business Innovation Research Phase II project to monitor intracranial pressure and improve the management of severe traumatic brain injury (TBI).
Neural Analytics Inc. Raises $10 MM Series A
LOS ANGELES-(BUSINESS WIRE)-December 21, 2015- Neural Analytics Inc., a medical device company focused on developing devices and services to measure, diagnose, and track brain health, announced today that it has secured $10 MM in Series A financing. This brings the total investment in Neural Analytics to $13 MM, including its $3 MM Series Seed financing last year.
UCLA spinout nabs $10M Series A to diagnose, track traumatic brain injury
Neural Analytics' Transcranial Doppler device--Courtesy of Neural Analytics
Neural Analytics is developing portable transcranial Doppler ultrasound devices that it expects could be used by first responders and emergency room physicians to accurately assess mild to severe traumatic brain injury (TBI), including concussions.
Computer Science
Scientific Computing
Linux
Neural Networks
Scientific Writing
LaTeX
Computational Mathematics
Mathematics
Microsoft Office
Windows
Emacs
Matlab
Bash
Programming
Maple
Python
Calculus
Research
Physics
C++
A novel analytical characterization for short-term plasticity parameters in spiking neural networks
Narayan Srinivasa
Corey M. Thibeault
Short-term plasticity (STP) is a phenomenon that widely occurs in the neocortex with implications for learning and memory. Based on a widely used STP model, we develop an analytical characterization of the STP parameter space to determine the nature of each synapse (facilitating, depressing, or both) in a spiking neural network based on presynaptic firing rate and the corresponding STP parameters. We demonstrate consistency with previous work by leveraging the power of our characterization to replicate the functional volumes that are integral for the previous network stabilization results. We then use our characterization to predict the precise transitional point from the facilitating regime to the depressing regime in a simulated synapse, suggesting in vitro experiments to verify the underlying STP model. We conclude the work by integrating our characterization into a framework for finding suitable STP parameters for self-sustaining random, asynchronous activity in a prescribed recurrent spiking neural network. The systematic process resulting from our analytical characterization improves the success rate of finding the requisite parameters for such networks by three orders of magnitude over a random search.
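The abstract names only "a widely used STP model"; assuming the Tsodyks-Markram formulation usually meant by that phrase (the paper's exact conventions may differ), the per-synapse dynamics between presynaptic spikes are

    \frac{du}{dt} = -\frac{u}{\tau_f}, \qquad \frac{dx}{dt} = \frac{1 - x}{\tau_d},

and at each presynaptic spike

    u \leftarrow u + U\,(1 - u), \qquad x \leftarrow x - u\,x, \qquad \text{PSC amplitude} \propto A\,u\,x,

where U is the baseline utilization of synaptic resources, \tau_f and \tau_d are the facilitation and depression time constants, and A is the absolute synaptic efficacy. Loosely, \tau_f \gg \tau_d with small U yields a facilitating synapse and \tau_d \gg \tau_f with large U a depressing one, which is the facilitating/depressing distinction the characterization makes precise as a function of presynaptic rate.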
Corey Thibeault
The additional capabilities provided by high-performance neural simulation environments and modern computing hardware have allowed for the modeling of increasingly larger spiking neural networks. This is important for exploring more anatomically detailed networks, but the corresponding accumulation of data can make analyzing the results of these simulations difficult. This is further compounded by the fact that many existing analysis packages were not developed with large spiking data sets in mind. Presented here is a software suite developed not only to process the increased amount of spike-train data in a reasonable amount of time, but also to provide a user-friendly Python interface. We describe the design considerations, implementation, and features of the HRLAnalysis™ suite. In addition, performance benchmarks demonstrating the speedup of this design compared to a published Python implementation are also presented. The result is a high-performance analysis toolkit that is not only usable and readily extensible, but also straightforward to interface with existing Python modules.
Analyzing large-scale spiking neural data with HRLAnalysis
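As a hedged illustration of why the abstract's benchmark comparison matters (generic code, not HRLAnalysis itself; NumPy stands in for a compiled backend), binning two million spike times in an interpreted Python loop versus a vectorized call:

    import time
    import numpy as np

    rng = np.random.default_rng(1)
    n_spikes, n_bins, bin_width = 2_000_000, 1_000, 0.01
    spike_times = rng.uniform(0.0, n_bins * bin_width, size=n_spikes)

    t0 = time.perf_counter()
    counts = [0] * n_bins
    for t in spike_times:                   # interpreted, per-spike loop
        counts[min(int(t / bin_width), n_bins - 1)] += 1
    t_loop = time.perf_counter() - t0

    t0 = time.perf_counter()                # vectorized equivalent
    counts_vec, _ = np.histogram(spike_times, bins=n_bins,
                                 range=(0.0, n_bins * bin_width))
    t_vec = time.perf_counter() - t0

    print(f"pure Python: {t_loop:.2f} s   vectorized: {t_vec:.3f} s")

The gap between the two timings is the kind of overhead HRLAnalysis avoids by keeping the per-spike loops in C++ while exposing results to Python.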
Corey Thibeault
Modeling of large-scale spiking neural models is an important tool in the quest to understand brain function and subsequently create real-world applications. This paper describes a spiking neural network simulator environment called HRLSim. This simulator is suitable for implementation on a cluster of General Purpose Graphical Processing Units (GPGPUs). Novel aspects of HRLSim are described and an analysis of its performance is provided for various configurations of the cluster. With the advent of inexpensive GPGPU cards and compute power, HRLSim offers an affordable and scalable tool for design, real-time simulation, and analysis of large-scale spiking neural networks.
HRLSim: A High Performance Spiking Neural Network Simulator for GPGPU Clusters
Narayan Srinivasa
In this thesis, we assess the role of short-term synaptic plasticity in an artificial neural network constructed to emulate two important brain functions: self-sustained activity and signal propagation. We employ a widely used short-term synaptic plasticity (STP) model in a symbiotic network, in which two subnetworks with differently tuned STP behaviors are weakly coupled. This enables both self-sustained global network activity, generated by one of the subnetworks, as well as faithful signal propagation within subcircuits of the other subnetwork. Finding the parameters for a properly tuned STP network is difficult. We provide a theoretical argument for a method which boosts the probability of finding the elusive STP parameters by two orders of magnitude, as demonstrated in tests.
We then combine STP with a novel critic-like synaptic learning algorithm, which we call ARG-STDP for attenuated-reward-gating of STDP. STDP refers to a commonly used long-term synaptic plasticity model called spike-timing dependent plasticity. With ARG-STDP, we are able to learn multiple distal rewards simultaneously, improving on the previous reward-modulated STDP (R-STDP) that could learn only a single distal reward. However, we also provide a theoretical upper bound on the number of distal rewards that can be learned using ARG-STDP.
We also consider the problem of simulating large spiking neural networks. We describe an architecture for efficiently simulating such networks. The architecture is suitable for implementation on a cluster of General Purpose Graphical Processing Units (GPGPUs). Novel aspects of the architecture are described and an analysis of its performance is benchmarked on a GPGPU cluster. With the advent of inexpensive GPGPU cards and compute power, the described architecture offers an affordable and scalable tool for the design, real-time simulation, and analysis of large-scale spiking neural networks.
The Role of Short-Term Synaptic Plasticity in Neural Network Spiking Dynamics and in the Learning of Multiple Distal Rewards
Rajan Bhattacharyya
Kendrick Kay
James Benvenuto
Rachel Millin
We present an algorithm, Sparse Atomic Feature Learning (SAFL), that transforms noisy labeled datasets into a sparse domain by learning atomic features of the underlying signal space via gradient minimization. The sparse signal representations are highly compressed and cleaner than the original signals. We demonstrate the effectiveness of our techniques on fMRI activity patterns. We produce low-dimensional, sparse representations which achieve over 98% compression of the original signals. The transformed signals can be used to classify left-out testing data at a higher accuracy than the initial data.
Sparse Atomic Feature Learning via Gradient Regularization: With Applications to Finding Sparse Representations of fMRI Activity Patterns
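The SAFL algorithm itself is not reproduced here; purely as a generic sketch of "learning sparse representations by gradient minimization" (an ISTA-style sparse-coding step over a fixed dictionary, with every name and constant an illustrative assumption), the idea can be written as:

    import numpy as np

    def ista_sparse_code(X, D, lam=0.1, n_iter=200):
        """Sparse codes Z minimizing 0.5*||X - Z @ D||_F^2 + lam*||Z||_1 via ISTA.

        X : (n_samples, n_features) signals, e.g. activity patterns
        D : (n_atoms, n_features) dictionary of atomic features
        """
        L = np.linalg.norm(D, 2) ** 2            # Lipschitz constant of the gradient
        Z = np.zeros((X.shape[0], D.shape[0]))
        for _ in range(n_iter):
            grad = (Z @ D - X) @ D.T             # gradient of the smooth term
            Z = Z - grad / L                     # gradient step ...
            Z = np.sign(Z) * np.maximum(np.abs(Z) - lam / L, 0.0)   # ... then soft-threshold
        return Z

    # Hypothetical usage on random data standing in for fMRI activity patterns.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(20, 50))
    D = rng.normal(size=(8, 50))
    Z = ista_sparse_code(X, D)
    print("fraction of zero coefficients:", float(np.mean(Z == 0.0)))

SAFL additionally learns the dictionary of atomic features from the data; the sketch above shows only the sparse-coding half of such a scheme.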
A Spiking Neural Model for Stable Reinforcement of Synapses Based on Multiple Distal Rewards
Narayan Srinivasa
In this letter, a novel critic-like algorithm was developed to extend the synaptic plasticity rule described in Florian (2007) and Izhikevich (2007) in order to solve the problem of learning multiple distal rewards simultaneously. The system is augmented with short-term plasticity (STP) to stabilize the learning dynamics, thereby increasing the system's learning capacity. A theoretical threshold is estimated for the number of distal rewards that this system can learn. The validity of the novel algorithm was verified by computer simulations.
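For orientation, the reward-modulated STDP baseline that the letter extends (in the spirit of Izhikevich, 2007) can be sketched with a single synapse and an eligibility trace; this is not the ARG-STDP algorithm of the paper, and every constant is an illustrative assumption:

    import numpy as np

    rng = np.random.default_rng(0)
    dt, T = 0.001, 5.0                      # time step (s), simulated duration (s)
    tau_pre = tau_post = 0.020              # STDP trace time constants (s)
    tau_e = 1.0                             # eligibility-trace time constant (s)
    a_plus, a_minus, lr = 1.0, 1.05, 0.05

    w, e = 0.5, 0.0                         # synaptic weight, eligibility trace
    x_pre = x_post = 0.0                    # low-pass traces of pre/post spiking
    for step in range(int(T / dt)):
        pre = rng.random() < 0.02           # Poisson-like pre/post spike trains
        post = rng.random() < 0.02
        x_pre += -dt * x_pre / tau_pre + pre
        x_post += -dt * x_post / tau_post + post
        stdp = a_plus * x_pre * post - a_minus * x_post * pre
        e += -dt * e / tau_e + stdp         # STDP is written into an eligibility trace
        reward = 1.0 if step == int(4.0 / dt) else 0.0   # one distal reward at t = 4 s
        w += lr * reward * e                # the reward, not the spikes, gates the change
    print("final weight:", w)

The letter's contribution is to make this kind of rule stable when several such distal reward signals must be learned at once, with STP keeping the network dynamics in a usable regime.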
Mike O'Neill
For prime p, we classify those pairs of subsets of Z mod p for which equality is attained in Pollard’s theorem. Our result may be considered as an extension of the theorem of Vosper characterizing the critical pairs for the Cauchy-Davenport inequality.
Equality in Pollard's Theorem on Set Addition of Congruence Classes
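For context, the standard statements involved (recalled from the general literature rather than quoted from the paper): for nonempty subsets A, B of Z mod p, the Cauchy-Davenport inequality gives

    |A + B| \ge \min(p,\ |A| + |B| - 1),

and Pollard's theorem generalizes this: for 1 \le t \le \min(|A|, |B|),

    \sum_{i=1}^{t} N_i(A, B) \ge t \cdot \min(p,\ |A| + |B| - t),

where N_i(A, B) counts the residues having at least i representations as a + b with a in A and b in B. Vosper's theorem characterizes the pairs attaining equality in the t = 1 case; the paper carries out the analogous characterization for Pollard's inequality.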
Corey Thibeault
Efficiently passing spiking messages in a neural model is an important aspect of high-performance simulation. As the scale of networks has increased, so has the size of the computing systems required to simulate them. In addition, the information exchange among these resources has become more of an impediment to performance. In this paper we explore spike message passing using different mechanisms provided by the Message Passing Interface (MPI). A specific implementation, MVAPICH, designed for high-performance clusters with InfiniBand hardware, is employed. The focus is on providing information about these mechanisms for users of commodity high-performance spiking simulators. In addition, a novel hybrid method for spike exchange was implemented and benchmarked.
Efficiently passing messages in distributed spiking neural network simulation.
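As a minimal, hedged sketch of one of the simpler exchange patterns such a comparison covers (an allgather of per-rank spike lists each time step; this is generic mpi4py code, not the paper's hybrid method, and the network is stubbed out with random spikes):

    # Run with, e.g.:  mpiexec -n 4 python spike_exchange.py   (file name is illustrative)
    from mpi4py import MPI
    import numpy as np

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    rng = np.random.default_rng(rank)
    neurons_per_rank = 1000

    for step in range(100):
        # Each rank "simulates" its neurons and collects the indices that spiked.
        fired = np.flatnonzero(rng.random(neurons_per_rank) < 0.01)
        local_ids = (rank * neurons_per_rank + fired).tolist()

        # Exchange: every rank receives every other rank's spike list.
        all_spikes = comm.allgather(local_ids)
        global_ids = [i for ids in all_spikes for i in ids]
        # ...global_ids would drive this rank's synaptic input updates.

    if rank == 0:
        print("ranks:", size, "spikes exchanged in last step:", len(global_ids))

Point-to-point sends, non-blocking collectives, and the paper's hybrid scheme trade message count against latency differently; the allgather above is only the simplest baseline.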
http://www.militaryaerospace.com/articles/2012/06/krns-proposers-day.html
http://www.militaryaerospace.com/articles/2013/01/iarpa-krns-ai.html
SyNAPSE
http://www.technologyreview.com/news/532176/a-brain-inspired-chip-takes-to-the-sky/
http://www.technologyreview.com/featuredstory/526506/neuromorphic-chips/
http://www.economist.com/news/science-and-technology/21582495-computers-will-help-people-understand-brains-better-and-understanding-brains