How Network Topology Defines its Behavior - Serial Code Detection with Spiking Networks

Dr. Gerd Heinz
Gesellschaft zur Förderung angewandter Informatik e.V.
Berlin-Adlershof

Workshop "Autonomous Systems"
Herwig Unger & Wolfgang Halang
Hotel Sabina Playa, Cala Millor, Mallorca, 13-17 Oct. 2013

[Title figure: sensor and motor homunculus, Natural History Museum, London]
Contents
Abstract
Convolution
A Small Interference Network
Construction of Transfer Functions
Applying a Convolution
Spike Output
Frequency Analysis
Unipolar or Bipolar Signal Levels?
Interpreting Bursts
Examples
heinz@gfai.de, www.gfai.de/~heinz
Abstract

Compared with technical sensors, the sound and code analysis of the nerve system is fascinating. We distinguish between the whisper of the wind and the breaking of waves; we know the songs of birds; we hear the dangerous noises of a defective car engine; we feel when an airplane takes off. And we speak and understand languages. Do we have a chance to interpret the function of a nerve net at the level of net structure?
We analyze a simplest delaying network with a nerve-like structure. Our net consists of delays T and weights W. Based on the Interference Network (IN) abstraction, we transform the net into a transfer function H of a linear time-invariant system (LTI system). We use the convolution of the input time function with the transfer function to find the "behaviour" of the LTI system.
* The work is based on the book "Neuronale Interferenzen", ch. 8b, p. 181; download: www.gfai.de/~heinz/publications/NI/index.htm
Convolution
"Faltung" (term coined by Felix Bernstein, 1922):

    y(t) = x(t) * h(t) = \int_0^{\infty} x(\tau)\, h(t - \tau)\, d\tau

Discrete form (Cauchy product):

    y_n = \sum_{k=0}^{n} h_k\, x_{n-k}

Example: an FIR filter is a direct implementation of convolution, in the form Y = X * S.
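The discrete form above can be sketched in a few lines of Python (used here instead of the talk's Scilab; the filter values are illustrative, not from the talk):

```python
def convolve(x, h):
    """Discrete convolution (Cauchy product): y_n = sum_k h_k * x_(n-k)."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj   # each x sample is weighted and delayed by h
    return y

# A 3-tap FIR filter S applied to an input X, form Y = X * S:
X = [1, 0, 0, 2, 0]
S = [0.5, 1, 0.5]
print(convolve(X, S))  # -> [0.5, 1.0, 0.5, 1.0, 2.0, 1.0, 0.0]
```

Each input sample produces a scaled, shifted copy of the filter; the output is their sum, exactly the FIR structure named above.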
A Small Interference Network
[Figure: a node N connects to a node N' over n parallel paths with delays \tau_1 ... \tau_n and weights w_1 ... w_n; the abstraction maps an input x(t) to an output y(t).]

Delay vector:   T = [\tau_1, \tau_2, ..., \tau_n]

Weight vector:  W = [w_1, w_2, ..., w_n]

Transfer behaviour:

    y(t) = \sum_{i=1}^{n} w_i\, x(t - \tau_i)
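The abstraction can be simulated sample by sample; a minimal Python sketch (the function name and the concrete delays/weights are illustrative assumptions, with delays given in samples):

```python
def in_output(x, delays, weights):
    """Output of the small interference network:
    y(t) = sum_i w_i * x(t - tau_i), delays tau_i in samples."""
    y = [0.0] * (len(x) + max(delays))
    for tau, w in zip(delays, weights):
        for t, xt in enumerate(x):
            y[t + tau] += w * xt   # each path delays by tau and scales by w
    return y

# One input spike fans out into three delayed, weighted copies:
x = [1, 0, 0, 0]
print(in_output(x, delays=[0, 2, 5], weights=[0.5, 1, 1]))
# -> [0.5, 0.0, 1.0, 0.0, 0.0, 1.0, 0.0, 0.0, 0.0]
```

A single spike at the input thus reproduces the delay/weight pattern of the net at the output, which is why this pattern is the pulse response.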
Construction of Transfer Function H
(Transfer function of the LTI system.) The discrete transfer function H is seen as a discrete time function with sample distance ts = 1/fs (i.e. fs = 1/ts) and growing index i:

    i = [... 2, 3, 4, 5, 6, 7, 8, 9, ...]
    H = [... w_{i-1}, w_i, w_{i+1}, w_{i+2}, w_{i+3}, w_{i+4}, w_{i+5}, w_{i+6}, ...]

The length of H is at least the delay difference: length(H) ≥ max(T) - min(T).

The transfer function of the net is constructed by adding the weights at their delay indices, H(j) = H(j) + W(i) with j = T(i), i.e.:

    H(T(i)) = H(T(i)) + W(i)
Get Transfer Function with Scilab
function [H] = trans(T,W,fs);
  if length(T) == length(W) then
    T = T * fs;                 // apply sample rate of H
    T = round(T);               // T becomes index: integer
    H = 1:max(T); H = H * 0;    // create an empty H
    for i = 1:length(T),        // for all T(i), W(i)
      j = T(i),                 // delay becomes the H-index j
      H(j) = H(j) + W(i),       // add the weight to H
    end                         // for
  else                          // if
    printf('\n\nerror: T and W have different size\n');
  end                           // if
endfunction;
H is the transfer function of an LTI system!
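For readers without Scilab, a Python sketch of the same procedure (0-based indexing instead of Scilab's 1-based; the sample values are illustrative):

```python
def trans(T, W, fs):
    """Build the discrete transfer function H from delay vector T
    (in seconds) and weight vector W, at sample rate fs."""
    if len(T) != len(W):
        raise ValueError("T and W have different size")
    idx = [round(t * fs) for t in T]   # delays become integer sample indices
    H = [0.0] * (max(idx) + 1)         # empty H, indices 0 .. max delay
    for j, w in zip(idx, W):
        H[j] += w                      # add each weight at its delay index
    return H

print(trans([0.005, 0.003, 0.008], [1, 0.5, 1], fs=1000))
# -> [0.0, 0.0, 0.0, 0.5, 0.0, 1.0, 0.0, 0.0, 1.0]
```

As in the Scilab original, coincident delays accumulate their weights, so several paths may contribute to the same H sample.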
Applying a Convolution
What is the system answer Y for different input functions X? It is simply the convolution of the input time series with H:

    y(t) = h(t) * x(t)

Using vectors:  Y = X * H
Scilab form:    Y = convol(H,X)
Fourier analysis of H:  F = abs(fft(H))

[Block diagram: X → H → Y]
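The two Scilab calls can be mirrored with a stdlib-only Python sketch (the H and X values are illustrative; `spectrum` is a naive DFT standing in for Scilab's `abs(fft(H))`):

```python
import cmath

def convolve(x, h):
    """Y = convol(H, X): discrete convolution."""
    y = [0.0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

def spectrum(h):
    """Magnitude of the DFT of h (Scilab: abs(fft(H)))."""
    n = len(h)
    return [abs(sum(h[k] * cmath.exp(-2j * cmath.pi * f * k / n)
                    for k in range(n))) for f in range(n)]

H = [0.5, 0, 1, 0, 0, 1]   # transfer function of a small net
X = [1, 0, 0, 1]           # input spike train
print(convolve(X, H))      # system answer Y = X * H
# -> [0.5, 0.0, 1.0, 0.5, 0.0, 2.0, 0.0, 0.0, 1.0]
F = spectrum(H)            # Fourier analysis of H; F[0] is the DC level
```

Each input spike triggers one shifted copy of H; where copies overlap (here at sample 5) the output grows, which is the detection mechanism used below.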
Barker Codes and Spikes
The Hebbian rule in neuroscience shows that a neuron needs highly synchronous emissions to learn. So we need spikes at the output of the neuron. Barker codes, known from RADAR technology, maximize the spike-like output of long sequences.

Example:
    H = [1, 1, 1, -1, 1]   (Barker code no. 5)
    X = rev(H)
    Y = convol(X, H) = [1, 0, 1, 0, 5, 0, 1, 0, 1]

But neurons don't have negative signal values! What can we do?
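The Barker example is easy to verify in Python (standing in for the talk's Scilab `convol`):

```python
def convolve(x, h):
    y = [0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

H = [1, 1, 1, -1, 1]     # Barker code no. 5 as transfer function ("keyhole")
X = H[::-1]              # time-reversed code as input ("key"), X = rev(H)
print(convolve(X, H))    # -> [1, 0, 1, 0, 5, 0, 1, 0, 1]
```

The central value 5 is the spike: all five weighted paths coincide at one instant, while every sidelobe stays at 0 or 1.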
Spectral Analysis of Transfer Function H
The FFT of any unipolar transfer function shows its maximum at frequency f = 0 Hz (DC). It is not possible to learn with a unipolar H; codes are AC.

    Unipolar: {0 ... 1}
    Bipolar:  {-1 ... 1}

    F(e^{j\omega}) = \sum_{n} H(n)\, e^{-j\omega n}

The highest level lies at 0 Hz.
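This can be checked numerically; a stdlib-only Python sketch (the unipolar code here is an assumption: Barker-5 with its -1 clipped to 0). For any non-negative H, |F| at 0 Hz equals the sum of all weights and therefore bounds every other bin:

```python
import cmath

def dft_mag(h):
    """Magnitude spectrum |F(e^{jw})| of h at the n DFT frequencies."""
    n = len(h)
    return [abs(sum(h[k] * cmath.exp(-2j * cmath.pi * f * k / n)
                    for k in range(n))) for f in range(n)]

uni = [1, 1, 1, 0, 1]    # unipolar code {0...1}: Barker-5 with -1 clipped to 0
bi  = [1, 1, 1, -1, 1]   # bipolar Barker-5 code {-1...1}

print([round(v, 3) for v in dft_mag(uni)])  # -> [4.0, 1.0, 1.0, 1.0, 1.0]
print([round(v, 3) for v in dft_mag(bi)])   # -> [3.0, 2.0, 2.0, 2.0, 2.0]
```

In the unipolar case the DC bin (4.0) towers over every AC bin (1.0); with bipolar weights the spectrum is far flatter, leaving energy in the AC bins that carry the code.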
Unipolar or Bipolar Signal Levels?
Unipolar signals, unipolar synapses: {0 ... 1}
Unipolar or Bipolar Signal Levels?
Bipolar signals, bipolar synapses: {-1 ... 1}
Unipolar or Bipolar Signal Levels?
Unipolar signals, bipolar synapses (neuron): signals {0 ... 1}, synapses {-1 ... 1}
Unipolar or Bipolar Signal Levels?
Unipolar signals and bipolar synapses (neuron): X, Y unipolar {0 ... 1}, H bipolar {-1 ... 1}.

Big surprise: using unipolar signals X, Y and a bipolar H, the system is not significantly worse than the best case (uni/uni).

Test it with the related Scilab sources:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv

Conclusion: nerve systems do not need bipolar signals to detect code and sound, if the synapses are bipolar (inhibiting or exciting)!
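A quick Python check of this claim under one assumption: the unipolar input is the time-reversed Barker key with its -1 clipped to 0, while the synapses H stay bipolar:

```python
def convolve(x, h):
    y = [0] * (len(x) + len(h) - 1)
    for i, xi in enumerate(x):
        for j, hj in enumerate(h):
            y[i + j] += xi * hj
    return y

H = [1, 1, 1, -1, 1]                 # bipolar synapses (weights)
X_bi = H[::-1]                       # bipolar key: best detection case
X_uni = [max(0, v) for v in X_bi]    # unipolar spikes: -1 clipped to 0

print(convolve(X_bi, H))   # -> [1, 0, 1, 0, 5, 0, 1, 0, 1]  (peak 5)
print(convolve(X_uni, H))  # -> [1, 1, 2, 1, 4, 1, 1, 0, 1]  (peak 4, still distinct)
```

The detection peak shrinks only from 5 to 4 and still stands clearly above all sidelobes, consistent with the conclusion above.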
Interpreting Bursts
Noisy groups of pulses (bursts) are known from different locations in the nerve system. Is it possible to find the net structure behind them?
The Inverse Procedure
We interpret a burst as a transfer function H (seen as a pulse response) and reconstruct the delays T and weights W of the network behind it:

function [T,W] = net(H,fs);   // returns T and W
  j = 1;                      // W-index j
  for i = 1:length(H)         // H-index i
    if H(i) == 0 then ;       // do nothing
    else                      // write the value to W, the index to T
      W(j) = H(i);            // value to W
      T(j) = i;               // index to T
      j = j + 1;              // increment j
    end;                      // endif
  end;                        // endfor
  T = T ./ fs;                // sample index -> time
  T = T - min(T);             // scale to min: reduced T-vector
endfunction;
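A Python sketch of the same inverse procedure (0-based sample indices instead of Scilab's 1-based; the burst and sample rate in the example are illustrative):

```python
def net(H, fs):
    """Recover reduced delays T (seconds) and weights W from a burst H
    interpreted as a pulse response."""
    T, W = [], []
    for i, h in enumerate(H):
        if h != 0:               # every nonzero sample is one network path
            T.append(i)          # index -> delay (in samples)
            W.append(h)          # value -> weight
    T = [t / fs for t in T]      # samples -> seconds
    tmin = min(T)
    T = [t - tmin for t in T]    # reduced (relative) delay vector
    return T, W

print(net([0.5, 0, 1, 0, 0, 1], fs=1000))
# -> ([0.0, 0.002, 0.005], [0.5, 1, 1])
```

Only relative delays are recoverable from a burst, which is why the T vector is shifted to start at zero.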
Example H = f(T,W)
Delays T, weights W, transfer function H, and the reduced vectors (index R):

Delays:   T = [\tau_1, \tau_2, \tau_3] = [5, 3, 8]
Weights:  W = [w_1, w_2, w_3] = [1, 0.5, 1]

Reduced T, W:
    T_R = [\tau_{R1}, \tau_{R2}, \tau_{R3}] = [0, 2, 5]
    W_R = [w_{R1}, w_{R2}, w_{R3}] = [0.5, 1, 1]

Transfer function:
    H = (w_2, 0, w_1, 0, 0, w_3) = [0.5, 0, 1, 0, 0, 1]
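The example can be checked end to end with Python sketches of the two procedures (fs = 1 sample per time unit, 0-based indices; both function bodies are assumptions mirroring the Scilab originals):

```python
def trans(T, W, fs=1):
    """Delays/weights -> discrete transfer function H."""
    idx = [round(t * fs) for t in T]
    H = [0.0] * (max(idx) + 1)
    for j, w in zip(idx, W):
        H[j] += w
    return H

def net(H, fs=1):
    """Transfer function H -> reduced delays and weights."""
    T = [i / fs for i in range(len(H)) if H[i] != 0]
    W = [h for h in H if h != 0]
    tmin = min(T)
    return [t - tmin for t in T], W

T, W = [5, 3, 8], [1, 0.5, 1]     # original delays and weights
TR, WR = net(trans(T, W), 1)      # reduce via the pulse response
print(TR, WR)                     # -> [0.0, 2.0, 5.0] [0.5, 1.0, 1.0]
print(trans(TR, WR, 1))           # -> [0.5, 0.0, 1.0, 0.0, 0.0, 1.0]
```

The round trip reproduces the reduced vectors T_R, W_R and the transfer function H given above, with the weights sorted into delay order.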
Example
[Figure: key X and keyhole H, both unipolar; max(FFT) lies at 0 Hz (uni/uni case).]
Conclusion
To characterize the time and frequency domain, we transform the delays and weights of a simplest interference network into an LTI transfer function.

A procedure [H] = trans(T,W,fs) calculates the (time-discrete) transfer function H (pulse response) of the net from the delay vector T (delay mask) and the weight vector W.

The FFT shows learning problems for unipolar signals and unipolar H, because the highest level is the DC value.

A mixture of unipolar signals and a bipolar transfer function (weights) is a good alternative (nerve nets).

Interpreting bursts as transfer functions (pulse responses), we design an inverse procedure [T,W] = net(H,fs) that reconstructs the net structure [T,W] from the transfer function H.

Find the Scilab sources and the paper on the web:
www.gfai.de/~heinz/publications/papers/2013_autosys.pdf
www.gfai.de/~heinz/techdocs/index.htm#conv
Relevance for ANN
The transfer function or pulse response H is responsible for all sequential properties of a network: for code and sound generation or detection.

The lecture shows that even the smallest delays and delay differences change the pulse response H of the network.

Recalling the "Neural Networks" (NN, ANN) approach with layers driven by clock cycles, we find that the NN approach completely destroys the sequential structure of each network.

In no case are ANN or NN candidates for understanding the function of nerve-like structures.

Thinking about nerves, we need interferential approaches that do not destroy the delay structure of the net.
Thank you for your attention!

Dr. G. Heinz, GFaI
Volmerstr. 3
12489 Berlin
Tel. +49 (30) 814563-490
www.gfai.de/~heinz
heinz@gfai.de

Successful Google search terms: "Interferenznetze", "Mathematik des Nervensystems", "Heinz", "Akustische Kamera"

And the Lord spoke: "Thus I led you onto the path of knowledge. Go now, and carry the message out into the world!"