This chapter discusses how signals in space and time, and apertures and arrays, affect array processing, and the role that symbols play in this processing.

A locally competitive algorithm (LCA) is described that solves a family of sparse coding problems by minimizing a weighted combination of mean-squared error and a coefficient cost function; it produces coefficients with sparsity levels comparable to the most popular centralized sparse coding algorithms while being readily suited to neural implementation.
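A minimal sketch of LCA-style dynamics, assuming a soft-threshold activation (the dictionary, threshold `lam`, time constant `tau`, and step count here are illustrative, not the paper's settings):

```python
import numpy as np

def soft_threshold(u, lam):
    """Thresholding nonlinearity: zero below lam, shifted linear above."""
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def lca(y, Phi, lam=0.1, tau=10.0, n_steps=500):
    """Each unit's internal state is driven by its match to the input
    and inhibited by active neighbors through the Gram matrix."""
    G = Phi.T @ Phi - np.eye(Phi.shape[1])  # lateral inhibition weights
    b = Phi.T @ y                           # feed-forward drive
    u = np.zeros(Phi.shape[1])
    for _ in range(n_steps):
        a = soft_threshold(u, lam)
        u += (b - u - G @ a) / tau          # Euler step of the LCA ODE
    return soft_threshold(u, lam)

# toy overcomplete dictionary: 2-D signal, 4 unit-norm atoms
rng = np.random.default_rng(0)
Phi = rng.standard_normal((2, 4))
Phi /= np.linalg.norm(Phi, axis=0)
y = 1.5 * Phi[:, 1]          # signal built from a single atom
a = lca(y, Phi, lam=0.1)
```

At the fixed point these dynamics minimize the stated error-plus-cost objective for the soft-threshold cost; the competition term `G @ a` is what lets active units suppress redundant neighbors.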

An understanding of the convergence and synchronization of statistical signal-processing algorithms in continuous time is developed, linear and nonlinear circuits for analog memory are explored, and the “soft multiplexer” is proposed.

We define a new distance measure, the resistor-average distance, between two probability distributions that is closely related to the Kullback-Leibler distance. While the Kullback-Leibler distance is…
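The resistor-average distance combines the two directed Kullback-Leibler distances the way parallel resistances combine, which is where its name comes from. A small sketch for discrete distributions (the example distributions are illustrative):

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler distance D(p||q) for discrete distributions."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0                      # 0*log(0/q) terms contribute nothing
    return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

def resistor_average(p, q):
    """'Parallel resistor' combination: 1/R = 1/D(p||q) + 1/D(q||p)."""
    d1, d2 = kl(p, q), kl(q, p)
    return d1 * d2 / (d1 + d2)

p = [0.7, 0.2, 0.1]
q = [0.3, 0.4, 0.3]
R = resistor_average(p, q)
```

Unlike either directed KL distance, the result is symmetric in its arguments, and it is always smaller than the smaller of the two directed distances.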

The fundamental limits on how well information can be represented by and extracted from neural discharges are described and illustrated by considering recordings from the lower auditory pathway.

The requirements for representing digital sequences by other digital sequences, and the use of such representations to implement a nonlinear warping of the digital frequency axis, are discussed within the framework of simulating linear time-invariant systems.
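One classical way to realize such a frequency-axis warping is to substitute a first-order all-pass filter for each unit delay; the all-pass's phase then gives the warped frequency. A sketch under that assumption (the warp parameter `a` and test frequency are illustrative):

```python
import numpy as np

def warped_frequency(omega, a):
    """Warped frequency produced by substituting the first-order
    all-pass (z^-1 - a)/(1 - a z^-1) for each unit delay z^-1;
    returns minus the all-pass phase evaluated on the unit circle."""
    z = np.exp(1j * omega)
    allpass = (1 / z - a) / (1 - a / z)
    return -np.angle(allpass)
```

For `a = 0` the all-pass reduces to a pure delay and the mapping is the identity; for `0 < a < 1` low frequencies are stretched and high frequencies compressed, while the warping remains monotone so the frequency axis is not folded.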

We show that an Au nanoshell with a pH-sensitive molecular adsorbate functions as a standalone, all-optical nanoscale pH meter that monitors its local environment through the pH-dependent…

These examples illustrate that neurons can simultaneously represent at least two kinds of information with different levels of fidelity, indicating that it is possible for an evolving neural code to represent information with constant fidelity.

This work extends the general M-hypothesis Bayesian detection problem in which zero cost is assigned to correct decisions, and finds that the Bayesian cost function's exponential decay constant equals the minimum Chernoff distance among all distinct pairs of hypothesized probability distributions.
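For two discrete distributions, the Chernoff distance is the maximum over s in [0, 1] of the negative log of the s-weighted geometric-mean sum. A minimal sketch using a grid search over s (the grid resolution and example distributions are illustrative):

```python
import numpy as np

def chernoff_distance(p, q, grid=1001):
    """Chernoff distance: max over s in [0,1] of
    -log sum_x p(x)^(1-s) q(x)^s."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    s = np.linspace(0.0, 1.0, grid)[:, None]   # one row per candidate s
    vals = -np.log(np.sum(p ** (1 - s) * q ** s, axis=1))
    return float(vals.max())

C = chernoff_distance([0.7, 0.2, 0.1], [0.3, 0.4, 0.3])
```

At the endpoints s = 0 and s = 1 the sum equals one, so the distance is zero there; the maximizing interior s is what sets the exponential decay constant of the error probability.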