Storage of information on a circular structure

"A text is a tissue of white space" which "lives on an added value of meaning put in by the addressee"  Umberto Eco

 

TIMC-IMAG seminar, 7 May 1993

http://www.futura-sciences.com/comprendre/d/dossier48-1.php


Download the article as PDF (in French)

          We propose a model of information storage inspired by a surprising hydrodynamic phenomenon and by a reflection on reverberating loops, possible substrates of short-term memory. It enables content-based retrieval of noisy or incomplete images, which makes it close to associative memories.

          The vectors memorized on this loop structure are quasi-orthogonal. Since orthogonality of the stored images is necessary for good memorization in neural networks, this mode of storage and orthogonalization can be regarded as a pre-processing step for images intended for storage in an associative memory.

           It should be noted that in this case, since the images are stored in chaotic form, the recognition of an image stored on the network necessarily requires a pre-processing phase.

 

 

1- Storage by superposition of vectors transformed without modulation

1.1 General information

      The guiding principle of an associative memory is that any new element is interpreted as noise relative to the set of elements already stored by the structure. Reciprocally, the stored elements form noise for a given vector.

      In an associative memory of the Hopfield type, the vector to be stored modifies a matrix of values formed by the already-stored elements. In our model, the principle remains the same: the vector, after transformation, is superimposed on the already-stored vectors and is regarded as noise by the network.

      The storage section is represented by a certain number of transformed vectors.

      Vectors have 120 components. Each component takes as value a gray level in {0..100} of a 1-D image.

      Example:

1.2 Storage stage

      We transform the origin vector O with components Ci, i ∈ {1..n}, into a vector T with components C'i, i ∈ {1..d}, whose dimension d is much larger than the dimension n of the origin vector.

      The formula (1) for computing the address A'i then becomes:

A'i = i · b mod (d+1)
where i ∈ {1..d}
      b and (d+1) are coprime

      Components C'i with address A'i, i ∈ {1..n}, take as values V'i the permuted values of the origin vector.

      Components C'i with address A'i, i ∈ {n+1..d}, have V'i = 0.

      Different values of b, selected from the sequence of prime numbers, are used to memorize the vectors O: each vector receives its own key.

Example with b = 7 and d = 15:

We store Tk; then we modify d while b, the key assigned to the vector, stays unchanged. With b = 7 and d = 18 we obtain another transformed vector, T'k.
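The transformation above can be sketched as follows (a minimal illustration; the function name and the explicit coprimality assumption are ours, not from the original):

```python
def chaotic_transform(o, b, d):
    """Spread the n components of the origin vector o over a vector of
    dimension d, placing component i at address A'_i = i*b mod (d+1).

    Assumption: b and d+1 are coprime, so the addresses i*b mod (d+1),
    i in 1..d, form a permutation of 1..d and no value is overwritten."""
    n = len(o)
    t = [0.0] * d
    for i in range(1, n + 1):
        a = (i * b) % (d + 1)  # address in 1..d since gcd(b, d+1) == 1
        t[a - 1] = o[i - 1]    # remaining components stay at 0
    return t
```

With b = 7 and d = 15, the first five components land at addresses 7, 14, 5, 12 and 3: the origin values end up permuted and interleaved with zeros.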

We thus obtain, for each vector Ok, k ∈ {1, ..., K}, a set of transforms of Ok, {Tk, T'k, T''k, ...}, of different dimensions d (m1, m2, ..., mM).

        We then add the vectors of the same dimension d.

       The j-th component of a result vector Qd, d ∈ {m1, ..., mM}, is, for a given dimension d, the sum of the j-th components of the stored transforms of that dimension: Qd(j) = Σk Tk,d(j).

      Varying d thus generates, for each storage section, a different transformed vector.

      An origin vector with key "b" is thus registered, in the form of several transformed vectors, within different "noises".

      The vectors are associated in pairs: the dimension of one decreases as the dimension of the other increases. The resulting length and the number D of storage vectors are parameters of the model.

 

1.3 Restitution stage

      Restitution of the vector from its transformed vectors amounts to extracting a signal from noise. We chose a method derived from the one used in neurology for evoked potentials: the noise is eliminated by averaging multiple copies of the signal, each one disturbed in a different way. For a given key, the result vector is obtained by averaging the transformed vectors recorded on the network:
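The averaging step can be sketched as follows (a minimal illustration assuming the addressing A'i = i · b mod (d+1) of section 1.2; function names are ours):

```python
def inverse_transform(q, b, n):
    """Read back the n origin components from a (superposed) storage
    section q of dimension d, using the same addresses i*b mod (d+1)."""
    d = len(q)
    return [q[((i * b) % (d + 1)) - 1] for i in range(1, n + 1)]

def restitute(sections, b, n):
    """Average the copies recovered from each storage section: the
    signal of key b adds coherently, while the other stored vectors,
    read at scrambled addresses, tend to average out as noise."""
    copies = [inverse_transform(q, b, n) for q in sections]
    return [sum(c[i] for c in copies) / len(copies) for i in range(n)]
```

With a single stored vector the restitution is exact; with several, each extra vector contributes residual noise that the averaging attenuates.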

 

2- Model with modulation associated with the chaotic transformation

      The principle is unchanged, but a treatment is introduced before or during storage: amplitude modulation by a sinusoidal signal. In radio, the purpose of amplitude modulation is to shift the natural frequency of the signal to be transmitted within the radio spectrum, which avoids the superposition of broadcasting stations.

Y(t) = A(t) · sin(2π F t)

Y(t) is the modulated signal, A(t) the modulating signal, F the frequency of the carrier wave.

      Here the modulation increases the geometric distance between the vectors to be stored, without loss of information, since demodulation restores the original signal. The carrier wave is generated by sampling a sinusoid. Its period is a submultiple of the dimension of the transformed vectors; in other words, each transformed vector contains an integer number of periods.

       The sampling step is equal to the parameter b, the key assigned to each vector to be stored.
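One plausible reading of this construction (the exact sampling scheme is our assumption): sample the base sinusoid of period d at step b, giving sin(2π · j·b / d), which completes an integer number of periods over the d points of the transformed vector.

```python
import math

def carrier(d, b):
    """Carrier of key b: the base sinusoid of period d sampled with
    step b (assumption on the sampling scheme). Over j = 0..d-1 the
    wave completes a whole number of periods."""
    return [math.sin(2 * math.pi * (j * b) / d) for j in range(d)]

def modulate(t, b):
    """Amplitude-modulate a transformed vector by the carrier of key b."""
    c = carrier(len(t), b)
    return [tj * cj for tj, cj in zip(t, c)]
```

Note that a literal demodulation by division would fail where the carrier is zero; this sketch only illustrates the modulation side.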

       Some carrier waves obtained by sampling the base sinusoid:

 

 

      Example of a modulated signal obtained with b = 7:

      The following figure shows the gain in orthogonalization obtained for the same vector transformed with different primes b ∈ {5..51}.

      The gain in orthogonalization remains weak for the chaotic transformation alone, but this transformation generates a modulation specific to each image which proves very powerful. If we take two different vectors instead of the same one, the transformed vectors become quasi-orthogonal:
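Quasi-orthogonality can be checked with the normalized dot product (cosine), a standard measure; this helper is ours, not part of the original model:

```python
import math

def cosine(u, v):
    """Normalized dot product of two vectors of the same length.
    Values near 0 mean the vectors are quasi-orthogonal; values near
    1 (or -1) mean they are nearly collinear."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return dot / (nu * nv)
```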

3- Studies:

3.1 Theoretical maximum storage capacity:

       The modulation operation is an orthogonalization method. We compared it with the reference method of Gram-Schmidt [KEE]. A first vector to be stored is transformed with a "key" b into a set of chaotic vectors of different sizes d (Q1 to Qd). This set constitutes the base onto which the following sets are added. A second vector is transformed with a different key and generates a second set of chaotic vectors. If we want to add the two sets so that each appears as noise relative to the other, we must orthogonalize the second set with respect to the first. This was the role of the modulation in our method. In the Gram-Schmidt method the set is recomputed: the coefficient applied to each component of a transformed vector of a given dimension d is stored, which allows the origin vector to be rebuilt during the restitution stage. The following vectors are stored in the same way: each new set is recomputed so as to be orthogonal to the already-stored elements. In our experiments the Gram-Schmidt method doubles the storage capacity (20 gray-level test vectors instead of 10).

[Figure: 20 stored images; examples of restitution after superposition]

 

3.2 Network activity:

       The figure below shows the first 100 values of the first 4 storage sections Qm1, Qm2, Qm3, Qm4 after the storage of 6 transformed vectors Tk, k ∈ {1..6}, on the network, together with the corresponding part of the sum vector. Note that the values specific to each image cancel out and that the sampled sinusoidal signal is reconstituted by addition of the carrier waves assigned to the images.

 



  
3.3 Draft studies
       - maximum storage capacity
       - desaturation of the network by extraction of registered images (treated and untreated); influence of the residual noise on the network
       - short-term memorization of images intended for an associative memory

4- References

[DAV89] Eric Davalo, Patrick Naïm, "Le modèle de Hopfield", in Des réseaux de neurones, Eyrolles, 1989, pp. 104-112.

[DEW87] A. Dewdney, "Explorez le monde étrange du chaos", Récréations informatiques, Pour la Science, No. 119, Sept. 1987, pp. 13-16.

[KEE] James P. Keener, "Transformation and approximation", in Principles of Applied Mathematics, Addison-Wesley Publishing Company.

 

6- Curriculum vitae

7- Links (in french)

             ymorere.multimania.com

         Sharkovskii's theorem: search on Google with the keywords "period three implies chaos"

         Chaotic dynamical systems
              http://iridia.ulb.ac.be/~cphilemo/index2.html

         Chaos In Neural Network Group
         a new connectionist approach
              http://iridia.ulb.ac.be/ChINN/home/index.php

         Learning and control by chaotic neural networks
              http://www.etis.ensea.fr/~quoy/

         Chaotic systems and system control
              http://spacetown.free.fr/

         Basic texts:
         K. Fukushima: "A model of associative memory in the brain", Kybernetik, 12[2], pp. 58-63 (Feb. 1973).
         French translation (PDF file): "Un modèle de mémoire associative dans le cerveau"

         
