Automata

This section is about the computational machinery of a neural circuit. The synapses that occur along the circuit are represented by an ordered sequence. The term "ordered" means that the firing of a synapse depends on the state of the neural circuit before the synapse occurred: the transition into each new state of the system depends on the previous states [1].

The neuron is described by a state function of the form

x' = U(x, q)

where x and q are variables representing sequences of ordered events.

The state transition equation can explicitly represent the propagation, or recurrence, of a state over consecutive observable steps.

x(k+1) = U(k;k+1) x(k)

Here x(k) is the kth state, and U(k;k+1) is the transition from the kth state to the (k+1)th state. This is the recurrence relation for an ordered flow of energy through the system.
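
As a rough sketch of this recurrence in code (Scheme-flavored Lisp; the state representation and the toy transition operator below are assumptions made for illustration), the state history can be generated by applying the transition operator step by step:

   ; A transition operator U takes the step index k and the state x(k)
   ; and returns the next state x(k+1).
   (define (propagate U x0 steps)
     (let loop ((k 0) (x x0) (history (list x0)))
       (if (= k steps)
           (reverse history)
           (let ((x+ (U k x)))
             (loop (+ k 1) x+ (cons x+ history))))))

   ; Example: a toy binary transition that flips the state at each step.
   (propagate (lambda (k x) (- 1 x)) 0 4)   ; => (0 1 0 1 0)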


            |
            |
         n0 |  x00 --> x01 --> x02 --> x03 --> x04
            |    U0(01)  U0(12)  U0(23)  U0(34)
      n  n1 |  x10 --> x11 --> x12 --> x13 --> x14
      e     |    U1(01)  U1(12)  U1(23)  U1(34)
      u  n2 |  x20 --> x21 --> x22 --> x23
      r     |    U2(01)  U2(12)  U2(23)
      o  n3 |  x30 --> x31 --> x32 --> x33
      n     |    U3(01)  U3(12)  U3(23)
      s     |
         nn |  xn0 --> xn1 --> xn2 --> xn3 --> xn4
            |    Un(01)  Un(12)  Un(23)  Un(34)
            |_______________________________________
           0
                         state(t)

          Sequence diagram for n+1 cell automata

In the synthetic neuron model, a group of synchronous neurons is grown by making connections between groups of neurons in a circuit. The cluster of neurons in a circuit is composed of sets, or lists, of artificial neurons. The figure above shows the synapses of the n-th neuron, which can be written as a list of state transitions x_n(i).
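
Concretely, this representation can be sketched as follows (Scheme-flavored Lisp; the state symbols are placeholders copied from the figure): the circuit is a list of neurons, and each neuron is the ordered list of its state transitions x_n(i).

   ; Each neuron is an ordered list of its states x_n(0), x_n(1), ...
   (define circuit
     '((x00 x01 x02 x03 x04)    ; neuron n0
       (x10 x11 x12 x13 x14)    ; neuron n1
       (x20 x21 x22 x23)        ; neuron n2
       (x30 x31 x32 x33)))      ; neuron n3

   ; The list of state transitions of the n-th neuron.
   (define (neuron-states circuit n)
     (list-ref circuit n))

   (neuron-states circuit 2)    ; => (x20 x21 x22 x23)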

The Language of Modelling Neurons

The Lisp language was created with simple yet elegant programming methods for computing lists efficiently. One of these methods is recursion. To create recursive code, we try to break a process down into simpler subprocesses. The process of breaking down a complex set of sequences into a set of simpler sequences of instructions on a computer is called reduction. Reduction is analogous to the clustering used in pattern recognition: as clustering is an aggregative concept that enables the recognition of more complex patterns, reduction simplifies complex sequences by breaking long sequences up into shorter, simpler ones. Lisp processes recursive loops efficiently through tail-call optimization, in which a recursive call in tail position branches back into the code like a "goto" statement; this removes the need to repeatedly branch out of the code by calling a subroutine.
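
As a minimal sketch of the difference (Scheme-flavored Lisp; the procedure names are illustrative only), the same list computation is written below twice: first with plain recursion, which leaves a pending addition on the stack for every call, and then in tail-recursive form, where the recursive call is the last expression and can be compiled into a goto-like jump.

   ; Plain recursion: each call must return before the addition runs,
   ; so the pending (+ ...) frames accumulate on the stack.
   (define (sum-states xs)
     (if (null? xs)
         0
         (+ (car xs) (sum-states (cdr xs)))))

   ; Tail-recursive form: the recursive call is in tail position, so a
   ; Scheme system reuses the current frame -- effectively a goto.
   (define (sum-states-iter xs acc)
     (if (null? xs)
         acc
         (sum-states-iter (cdr xs) (+ acc (car xs)))))

   (sum-states-iter '(1 0 1 1 0) 0)   ; => 3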

Reduction is a way of transforming one long, hard problem into many short, simple ones. Reduction, as used in studying complex problems, has strong parallels with clustering, which is used to reduce complexity in pattern recognition; they are essentially similar paradigms. Our quest is to find intelligent ways to cluster synchronous neural circuits together, in large part by using the experience of building the software in the formalism of Lisp.

State Propagation In Neural Networks

The state transition matrix describes the propagation between consecutive observable states k and k+1. The matrix U(k;k+1) contains the essential information about the neural system.

x(k+1) = U(k;k+1) x(k)

The kernel, U, contains the code for the ordered sequence of state transitions. The form of these transitions has the tiling structure of a binary tree.

A sequence is an ordered binary list. Such a sequence traces an ordered branch of a complete binary tree; the end point of one such branch is the leaf sequence (daad).
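
As a sketch of this idea (Scheme-flavored Lisp; the tree encoding and its labels are assumptions chosen only so that one branch ends in the leaves (d a a d)), a tree can be written as nested lists of the form (label left right), and a branch is recovered as the ordered list of labels met on the way down to a leaf.

   ; A binary tree as nested lists: (label left right); a leaf is (label).
   (define tree
     '(d (a (a (d) (c))
            (b (a) (b)))
         (c (d (b) (a))
            (c (c) (d)))))

   ; Collect the labels along the leftmost branch of the tree.
   (define (leftmost-branch t)
     (if (null? (cdr t))
         (list (car t))
         (cons (car t) (leftmost-branch (cadr t)))))

   (leftmost-branch tree)   ; => (d a a d)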

Footnote

1. Satosi Watanabe wrote [2]:

In passing, it may be noted that the (cognitive version of the) inverse H-theorem also stems from the asymmetric interpretation of the symmetric relation

Namely, when P(F|E) is given, p(E|F) has then to be computed by the Bayes formula,

and the passage from the prior probability p(E) to the ulterior probability p(E|F) results in an entropy decrease.

The fundamental rule of probability, Bayes' rule, constrains the system to be temporally ordered. Applied to each state transition in a sequence of events, Bayes' rule determines the system's directional movement in time. These state transitions are time-ordered, or time-dependent, Markov processes, because the transition probability follows the product of conditional probabilities P(E|F) * P(F|G) * P(G|H) ..., where event E depends on F, F upon G, and G upon H, and so on, in temporal order. Given two state sequences X_1 and X_2 from two neurons N_1 and N_2, we can obtain a measure of their cross-correlation by comparing the similarity of the Markov transition probabilities P(E|F) * P(F|G) ... for neurons N_1 and N_2.
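
A minimal sketch of such a comparison (Scheme-flavored Lisp; the sequences and the choice of a single transition probability as the point of comparison are assumptions for illustration): estimate the Markov transition probability P(b|a) from each neuron's state sequence, then compare the estimates.

   ; Count occurrences of state a followed immediately by state b.
   (define (count-pair a b seq)
     (cond ((or (null? seq) (null? (cdr seq))) 0)
           ((and (equal? (car seq) a) (equal? (cadr seq) b))
            (+ 1 (count-pair a b (cdr seq))))
           (else (count-pair a b (cdr seq)))))

   ; Count occurrences of state a, excluding the final position.
   (define (count-state a seq)
     (cond ((or (null? seq) (null? (cdr seq))) 0)
           ((equal? (car seq) a) (+ 1 (count-state a (cdr seq))))
           (else (count-state a (cdr seq)))))

   ; Estimated Markov transition probability P(b | a) in a sequence.
   (define (p-transition a b seq)
     (let ((n (count-state a seq)))
       (if (zero? n) 0 (/ (count-pair a b seq) n))))

   (define X1 '(0 1 0 1 1 0 1))   ; state sequence of neuron N_1
   (define X2 '(0 1 0 0 1 0 1))   ; state sequence of neuron N_2
   (p-transition 0 1 X1)          ; => 1
   (p-transition 0 1 X2)          ; => 3/4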

2. Satosi Watanabe, "Time and the Probabilistic View of the World," in The Voices of Time, ed. J. T. Fraser, 1966, p. 527.

