Associative Fine-Tuning of Biologically Inspired Active Neuro-Associative Knowledge Graphs


  1. Associative Fine-Tuning of Biologically Inspired Active Neuro-Associative Knowledge Graphs
Adrian Horzyk (horzyk@agh.edu.pl, Google: Horzyk) – AGH University of Science and Technology, Krakow, Poland
Janusz A. Starzyk (starzykj@ohio.edu, Google: Janusz Starzyk) – Ohio University, School of Electrical Engineering and Computer Science, Athens, Ohio, U.S.A., and University of Information Technology and Management, Rzeszow, Poland

  2. Research inspired by brains and biological neurons:
 Work in parallel and asynchronously
 Associate stimuli context-sensitively
 Use a time-based approach for computations
 Represent various data and their relations
 Self-organize neurons, developing a very complex structure
 Aggregate representations of similar data
 Integrate memory and procedures
 Provide plasticity to develop a structure representing data and object relations

  3. ACTIVE NEURO-ASSOCIATIVE KNOWLEDGE GRAPHS (ANAKG)  ANAKG produces a complex graph structure of dynamic and reactive neurons and connections to represent a set of training sequences.  Neurons aggregate all instances of the same elements that occur in all sequences.
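To make the aggregation idea concrete, here is a minimal Python sketch (hypothetical names, not the authors' implementation): every distinct sequence element is represented by exactly one neuron, and repeated co-occurrences are aggregated on shared connections. Unlike a full ANAKG, this sketch links only adjacent elements.

```python
# Hypothetical sketch of ANAKG-style aggregation: each distinct element
# of any training sequence gets exactly one neuron (no duplicates), and
# repeated co-occurrences reuse and reinforce one shared connection.

class Neuron:
    def __init__(self, label):
        self.label = label
        self.successors = {}   # postsynaptic neuron -> co-occurrence count

def build_graph(training_sequences):
    neurons = {}               # element -> its single aggregating neuron
    for sequence in training_sequences:
        for prev, curr in zip(sequence, sequence[1:]):
            for label in (prev, curr):
                neurons.setdefault(label, Neuron(label))
            pre, post = neurons[prev], neurons[curr]
            # aggregate repeated co-occurrences on one connection
            pre.successors[post] = pre.successors.get(post, 0) + 1
    return neurons

graph = build_graph([["I", "have", "a", "monkey"],
                     ["My", "monkey", "is", "very", "small"]])
assert len([n for n in graph if n == "monkey"]) == 1   # one neuron per element
```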

  4. Objectives and Contribution
 Construction of a fine-tuning algorithm for synaptic weights to achieve better recall of associatively stored training sequences and better generalization.
 Avoidance of unintended activations to prevent false recall of sequences.
 Construction of a well-aggregating model for storing correlated training sequences.
 Reproduction of the functionality of the biological neural substance.

  5. ASN Neurons
 Connect context-sensitively to emphasize training sequences and automatically develop an ANAKG network structure.
 Aggregate representations of the same elements of the training sentences – no duplicates!
 Work asynchronously and in parallel, because time influences the results of the ANAKG network.
 Integrate memory and associative processes.
GOAL: Reproduce the functionality of the biological neural substance!

  6. Associative Spiking Neurons (ASN)
 Were developed to reproduce the plasticity and associative properties of real neurons that work in time.
 Implement internal neuronal processes (IP) and efficiently manage their processing using internal process queues (IPQ) and a global event queue (GEQ).
 Are updated only at the end of their internal processes (not continuously) to make data processing efficient!

  7. How do ASN neurons work, and how are they modeled? An IPQ represents a short sequence of internal changes of a neuronal state that depend on the external stimuli and the previous internal states of the neuron. The internal states of ASN neurons are updated only at the end of internal processes (IP), which are supervised by the Global Event Queue (GEQ).
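A hedged sketch of this event-driven scheme, assuming a simplified process model (the IPQ/GEQ names follow the slides; everything else is illustrative): neurons update their state only when the GEQ pops the end of one of their internal processes.

```python
# Simplified sketch of event-driven ASN updates: the GEQ orders the end
# times of internal processes, and a neuron's state is updated only at
# those moments, never continuously.

import heapq
import itertools

class GlobalEventQueue:
    def __init__(self):
        self._events = []                # min-heap of (end_time, tie, ...)
        self._tie = itertools.count()    # breaks ties between equal times

    def schedule(self, end_time, neuron, process):
        heapq.heappush(self._events, (end_time, next(self._tie), neuron, process))

    def run(self):
        # pop internal-process end events in time order and let the
        # owning neuron update its state only at that moment
        while self._events:
            end_time, _, neuron, process = heapq.heappop(self._events)
            neuron.finish_process(end_time, process, self)

class ASNNeuron:
    def __init__(self):
        self.ipq = []                    # internal process queue (IPQ)

    def finish_process(self, now, process, geq):
        process(now)                     # update state at process end only
        if self.ipq:                     # chain the next queued process
            duration, next_process = self.ipq.pop(0)
            geq.schedule(now + duration, self, next_process)
```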

  8. Model and Adaptation of Associative Spiking Neurons
How do neurons work? Synaptic efficacy defines how efficiently stimulations of a synapse produce spiking reactions of the postsynaptic neuron. It depends on:
 $\Delta t_A$ – the period of time that elapsed between the stimulation of the synapse between the $N_n$ and $N_{n+s}$ neurons and the activation of the postsynaptic neuron $N_{n+s}$ during training on the training sequence set $\mathbb{T} = \{T_1, \dots, T_N\}$;
 $\Delta t_C$ – the period of time necessary to charge and activate the postsynaptic neuron $N_{n+s}$ after stimulating the synapse between the $N_n$ and $N_{n+s}$ neurons;
 $\Delta t_R = 200\,\text{ms}$ – the maximum period of time during which the postsynaptic neuron $N_{n+s}$ recovers and returns to its resting state after a charging that was not strong enough to activate this neuron;
 $\theta_{N_{n+s}} = 1$ – the activation threshold of the postsynaptic neuron $N_{n+s}$;
 the context influence factor $= 4$ – changes the influence of the previously activated and connected neurons on the postsynaptic neuron $N_{n+s}$.
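The efficacy formula itself is not reproduced in this transcript, so the sketch below only fixes the slide's constants and shows the per-synapse timing bookkeeping such a computation would need; all names are assumptions.

```python
# Bookkeeping sketch for the quantities the efficacy depends on. The
# actual efficacy formula is not reproduced here; only the slide's
# constants and the recorded timing data are shown. Names are assumed.

from dataclasses import dataclass, field

DT_R = 200.0          # ms, maximum recovery period (from the slide)
THETA = 1.0           # activation threshold (from the slide)
CONTEXT_FACTOR = 4.0  # context influence factor (from the slide)

@dataclass
class SynapseTimings:
    dt_A: list = field(default_factory=list)  # stimulation-to-activation times
    dt_C: list = field(default_factory=list)  # charge-and-activate durations

    def record(self, dt_a, dt_c):
        # collect one training observation for this synapse
        self.dt_A.append(dt_a)
        self.dt_C.append(dt_c)
```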

  9. Model and Adaptation of Associative Spiking Neurons
The synaptic efficacy $\delta$ and the number $\eta$ of activations of the presynaptic neuron $N_n$ during training on the training sequence set $\mathbb{T}$ are used to define the synaptic permeability $p$, which is finally used to compute the synaptic weights:
$$w = c \cdot p \cdot m$$
where $c$ is the synaptic influence, excitatory ($c = 1$) or inhibitory ($c = -1$), and $m$ is the multiplication factor modeling the number of synapses connecting the presynaptic and postsynaptic neurons.
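The original slide gave two alternative formulas for the permeability $p$, neither of which survives in this transcript; in the sketch below the ratio delta/eta is an explicitly assumed stand-in for them, while the weight composition w = c·p·m follows the slide.

```python
def permeability(delta, eta):
    # ASSUMPTION: the slide's two permeability formulas are not in the
    # transcript; an efficacy-per-activation ratio stands in for them.
    return delta / eta

def synaptic_weight(delta, eta, c=1, m=1.0):
    # w = c * p * m, as on the slide: c = +1 (excitatory) or -1
    # (inhibitory); m models the number of parallel synapses.
    return c * permeability(delta, eta) * m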

  10. Adaptation and Tuning of Associative Spiking Neurons The weights computed in the presented way are good enough as the primary set of weights in complex graph neural networks: I have a monkey. My monkey is very small. It is very lovely. It is also very clever. The introduced tuning process achieves better recall results thanks to slight modifications of the multiplication factors of the synapses.

  11. Tuning Process of ANAKG
Two repetitive steps of the tuning process:
1. All undesired and premature activations of neurons are avoided for all training sequences by using weakening operations.
2. Conflicts between correlated training sequences are fine-tuned using strengthening operations.
We define:
 $s^{charge}_{last}$ – the strength of the last stimulus;
 $x$ – the charge level at the moment when the last stimulus came;
 $x^{max}_{all}$ – the maximum dynamic charge level of each stimulated neuron:
$$x^{max}_{all} = \begin{cases} x + s^{charge}_{last} & \text{if } x + s^{charge}_{last} > x^{max}_{all} \\ x^{max}_{all} & \text{otherwise} \end{cases}$$
 $x^{max}_{context}$ – the previous maximum charge level establishing the context of the last stimulus that should activate the neuron:
$$x^{max}_{context} = x^{max}_{all} - s^{charge}_{last}$$
The correct activation of the neuron assumes that
$$x^{max}_{context} < \theta \le x^{max}_{context} + s^{charge}_{last}$$
On this basis we can define the strengthening and weakening operations for the tuning process.
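These definitions translate almost line-for-line into code; a minimal sketch (the function names are assumptions):

```python
THETA = 1.0   # activation threshold of the postsynaptic neuron

def update_max_charge(x_all_max, x, s_last_charge):
    # x_all_max takes x + s_last_charge whenever that exceeds the
    # previous maximum, otherwise it is kept unchanged
    return max(x_all_max, x + s_last_charge)

def context_charge(x_all_max, s_last_charge):
    # x_context_max = x_all_max - s_last_charge
    return x_all_max - s_last_charge

def correctly_activated(x_context_max, s_last_charge, theta=THETA):
    # the context alone must stay below the threshold, while the context
    # plus the last stimulus must reach it
    return x_context_max < theta <= x_context_max + s_last_charge
```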

  12. Weakening Operation The weakening operation defines how the multiplication factor $m$ decreases when a neuron is activated in an incorrect context, or prematurely in a reduced context:
$$\gamma = \begin{cases} \dfrac{\theta}{x^{max}_{all} + \epsilon} & \text{for the undesired activations} \\ \dfrac{\theta}{x^{max}_{context} + \epsilon} & \text{for the premature activations} \end{cases}$$
$$m = m \cdot \gamma \qquad w = c \cdot p \cdot m$$
The multiplication factors behind the incorrect activations must be decreased so that the next neurons of the recalled training sequence operate on the right stimulation context. Weakening operations always start and finish the tuning process of the ANAKG network.
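A minimal sketch of the weakening step under these formulas (the margin eps and all names are assumptions):

```python
def weaken(m, x_all_max, x_context_max, premature, theta=1.0, eps=1e-3):
    # gamma < 1 because an undesired or premature activation implies the
    # corresponding charge level already reached the threshold theta
    if premature:
        gamma = theta / (x_context_max + eps)
    else:
        gamma = theta / (x_all_max + eps)
    return m * gamma   # the synaptic weight is then recomputed as w = c * p * m
```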

  13. Strengthening Operation The strengthening operation defines how the multiplication factor $m$ increases when a neuron is not activated in the right context of all predecessors of the training sequence, or is activated too late:
$$\gamma = \frac{\theta}{x^{max}_{all} - \epsilon} \qquad m = m \cdot \gamma \qquad w = c \cdot p \cdot m$$
The strengthening operation always tries to achieve stimulation of the next sequence element. However, it is sometimes not beneficial when the initial context is not unique, e.g., when a few training sequences start from the same subsequences of elements. Strengthening operations allow the following elements of the training sequences to be recalled when the stimulation context is unique.
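And the corresponding strengthening step (again, eps and the names are assumptions):

```python
def strengthen(m, x_all_max, theta=1.0, eps=1e-3):
    # gamma > 1 because a missed or late activation implies that the
    # accumulated charge x_all_max stayed below the threshold theta
    gamma = theta / (x_all_max - eps)
    return m * gamma   # the synaptic weight is then recomputed as w = c * p * m
```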

  14. Experimental Results
TRAINING DATA SET: I have a monkey. My monkey is very small. It is very lovely. It likes to sit on my head. It can jump very quickly. It is also very clever. It learns quickly. My monkey is lovely. I also have a big cat. My son also has a monkey. It likes to sit on his lamp. I have an old sister. She is very lovely. My sister has a small cat. She likes to sit in the library and read books. She quickly learns languages. My sister has a cat. It is very small. You have a cat as well. It is big. I have a young brother. My brother is small. He has a monkey and dogs. His monkey is small as well. We have lovely dogs.
The achieved results confirm that the proposed tuning process is beneficial and produces better-adapted weights, allowing better recalls from the ANAKG network.
