Computer Programming - Java - Neural Network GUI with Joone (2002)


ABOUT THIS DOCUMENT

Joone is a free Neural Network Engine for Java.

You can find this information and the source code at sourceforge.net.

The three PDF manuals were made into a single file in February 2003.

 

Java Object Oriented Neural Engine

GUI Editor User Guide 

 

Joone Editor User Guide

Contents

1 INTRODUCTION
  1.1 INTENDED AUDIENCE
2 DOWNLOAD AND INSTALLATION
  2.1 JAVA
  2.2 JOONE FILES
  2.3 OTHER FILES
    2.3.1 Note
  2.4 RUNNING THE JOONE EDITOR
3 MENU
  3.1 FILE
    3.1.1 New
    3.1.2 Open
    3.1.3 Save
    3.1.4 Save As
    3.1.5 Export Neural Net
    3.1.6 Print
    3.1.7 Exit
  3.2 EDIT
    3.2.1 Cut
    3.2.2 Copy
    3.2.3 Paste
    3.2.4 Duplicate
    3.2.5 Delete
    3.2.6 Group
    3.2.7 Ungroup
    3.2.8 Send to Back
    3.2.9 Bring to Front
  3.3 ALIGN
    3.3.1 Toggle Snap to Grid
    3.3.2 Left
    3.3.3 Center
    3.3.4 Right
    3.3.5 Top
    3.3.6 Middle
    3.3.7 Bottom
  3.4 ATTRIBUTES
    3.4.1 Figure Attributes
    3.4.2 Text Attributes
  3.5 CONTROL
    3.5.1 Control Panel
    3.5.2 Add Noise
    3.5.3 Randomize
    3.5.4 Reset Input Streams
    3.5.5 Macro Editor
  3.6 LOOK'N'FEEL
    3.6.1 Metal
    3.6.2 CDE/Motif
    3.6.3 Windows
  3.7 HELP
    3.7.1 About Joone
    3.7.2 Help Contents
4 TOOLBAR
  4.1 SELECTION TOOL
  4.2 LABEL AND CONNECTED LABEL
  4.3 DRAWING TOOLS
  4.4 INPUT LAYERS
  4.5 LAYERS
  4.6 OUTPUT LAYERS
  4.7 CHARTING COMPONENT
  4.8 SYNAPSES
    4.8.1 The Pop-Up Menu for all Connection Types
  4.9 PLUGINS
5 LAYERS
  5.1 PROCESSING LAYERS
    5.1.1 Linear
    5.1.2 Sigmoid
    5.1.3 Tanh
    5.1.4 Logarithmic
    5.1.5 Context
    5.1.6 Nested ANN
    5.1.7 Delay
  5.2 I/O LAYERS
    5.2.1 File Input
    5.2.2 URL Input
    5.2.3 Excel Input
    5.2.4 Switch Input
    5.2.5 Learning Switch
    5.2.6 File Output
    5.2.7 Excel Output
    5.2.8 Switch Output
    5.2.9 Teacher
6 PLUGINS
  6.1 PRE-PROCESSING PLUGINS
  6.2 MONITOR PLUGINS
    6.2.1 The Annealing Concept
7 BASIC TUTORIAL
8 AN ADVANCED EXAMPLE: THE XOR PROBLEM
  8.1 TESTING THE TRAINED NET
9 THE XML PARAMETER FILE
  9.1 THE <BUTTONS> … </BUTTONS> SECTION
  9.2 THE <OPTIONS> … </OPTIONS> SECTION
  9.3 SEPARATORS
  9.4 TEMPORARILY REMOVING ITEMS
10 ONLINE RESOURCES
  10.1 JOONE
  10.2 ARTIFICIAL NEURAL NETWORKS TECHNOLOGY
  10.3 JAVA
  10.4 JAXP
  10.5 JHOTDRAW
  10.6 SOURCE FORGE
  10.7 SUN MICROSYSTEMS
11 GLOSSARY
  11.1 ANN / NN
  11.2 CLASSPATH
  11.3 GUI
  11.4 JAR FILE
  11.5 JAXP
  11.6 LAYER
  11.7 NEURON
  11.8 NEURAL NETWORK
  11.9 PE
  11.10 SWING
  11.11 XML
  11.12 ZIP FILE

 

Revision history

Revision  Date                Author         Comments
0.1.0     October 12, 2001    Harry Glasgow  Pre-release draft
0.1.5     October 15, 2001    Paolo Marrone  Added a parameter file on the command line to reflect the last change of the editor package
0.1.6     October 21, 2001    Paolo Marrone
0.2.0     November 6, 2001    Paolo Marrone
0.5.7     January 8, 2002     Paolo Marrone
0.5.8     January 13, 2002    Paolo Marrone
0.5.9     January 22, 2002    Harry Glasgow
0.6       February 12, 2002   Harry Glasgow
0.6.5     April 9, 2002       Paolo Marrone
0.6.6     May 15, 2002        Paolo Marrone
0.7.0     September 02, 2002  Harry Glasgow  Added the Export Neural Net menu item; added the <option> section of the XML parameter file

Changes recorded across revisions 0.1.6 through 0.6.6 include:

• Added the description of the XML parameter file
• Added the description of the Teacher component and of the Control Panel
• Added the advanced example based on the XOR problem
• Added page numbers
• Added the drawing tool buttons
• Added the official URL of Joone (www.joone.org)
• Added plugin sections
• Added the use of the Monitor plugins
• Added the 'Add Noise' and 'Reset Input Streams' menu items
• Added the new layers (XL I/O, nested ANN, Switch I/O)
• Updated the list of libraries needed to run the editor
• Enhanced the explanation of the Delay Layer
• Added the Help Contents menu item
• Added the Learning Switch component and the validation parameter in the Control Panel
• Added the use of the Scripting Plugin
• Added details of new sections for the two-part menu, the Macro Editor pane and the charting module
• Added the description of the Context and Logarithmic layers

 


1 Introduction

The Java Object Oriented Neural Engine (Joone) is a system of Java modules that provide a framework for developing, teaching and testing neural networks.

Joone comprises two elements, a core engine and a GUI editor. The core engine is used by the editor to process neural networks. This guide focuses chiefly upon the editor component.

1.1 Intended Audience

This document is intended for non-programmers: persons interested in developing neural networks with the Joone editor.


2 Download and Installation

2.1 Java

To run Joone, Java 2 (release 1.2 or above) needs to be installed on your computer. This is available from Sun Microsystems at http://java.sun.com/j2se/. Installation of Java falls outside the scope of this document but is relatively straightforward.

2.2 Joone Files

The compiled Joone project files are available from Source Forge at http://sourceforge.net/projects/joone/. The Joone engine and editor zip files are available from the project's Summary page. Both these files are required to run the Joone editor and should be included in your computer's Java Classpath.

2.3 Other Files

The Joone editor makes use of the following libraries:

• jhotdraw.jar from JHotDraw 5.2, a Java based drawing package
• xalan.jar and crimson.jar from JAXP, the Java XML processing package
• jh.jar from JavaHelp 1.1 or above
• poi-hssf.jar, poi-poifs.jar, poi-util.jar and hssf-serializer.jar from the Jakarta HSSF project, the library to read/write Excel files
• bsh-core.jar from BeanShell (www.beanshell.org), a Java scripting engine

The joone-ext.zip package contains a version of the above libraries.

2.3.1 Note

• JoonEdit does not work with the new JHotDraw 5.3, only the 5.2 version. A new version of JoonEdit will soon be released that works with JHotDraw 5.3.
• Sun Microsystems has released Java Standard Edition 1.4, which includes JAXP, so it may not be necessary to include this separately for Java editions 1.4 and above.
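All of the jars above, together with the Joone engine and editor jars, need to be visible on the classpath. The sketch below shows one way to assemble a CLASSPATH string in a Unix-style shell; the ./lib directory and the joone-engine.jar name are illustrative assumptions, while joone-edit.jar is the editor jar mentioned later in this guide.

```shell
# Sketch: assemble a CLASSPATH from the Joone jars and the support
# libraries listed above. The ./lib directory and the joone-engine.jar
# name are assumptions for illustration.
LIB=./lib
JARS="joone-engine.jar joone-edit.jar jhotdraw.jar xalan.jar crimson.jar \
jh.jar poi-hssf.jar poi-poifs.jar poi-util.jar hssf-serializer.jar bsh-core.jar"

CLASSPATH=.
for jar in $JARS; do
  CLASSPATH="$CLASSPATH:$LIB/$jar"   # the separator is ';' on Windows
done
export CLASSPATH
echo "$CLASSPATH"
```

On Windows, use ';' rather than ':' between the entries.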

2.4 Running the Joone Editor

To run the Joone editor, put all the above packages on the classpath; the JoonEdit class should then be invoked with the following command:

  java org.joone.edit.JoonEdit <parameters.xml>

where <parameters.xml> is the optional XML parameters file with the complete path (see /org/joone/data/layers.xml for an example). If the parameter is not specified, the editor will use a default set of parameters in the joone-edit.jar file.

On a Windows platform, click Start, then Run, type in the above command, and click OK. Alternatively you can use the RunEditor.bat contained in the executable distribution package.
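The classpath and the launch command can be combined into a small Unix launcher script, in the spirit of the RunEditor.bat shipped with the executable distribution. The jar locations below are illustrative assumptions; the script only prints the assembled command so the paths can be checked before running it.

```shell
# Sketch of a launcher for the Joone editor. Paths are illustrative
# assumptions; adjust them to where the jars actually live.
LIB=./lib
CP=".:$LIB/joone-edit.jar:$LIB/joone-engine.jar:$LIB/jhotdraw.jar"
CP="$CP:$LIB/xalan.jar:$LIB/crimson.jar:$LIB/jh.jar:$LIB/bsh-core.jar"

# The XML parameter file is optional; pass it only if it exists,
# otherwise the editor falls back to its built-in defaults.
PARAMS=org/joone/data/layers.xml

CMD="java -cp $CP org.joone.edit.JoonEdit"
if [ -f "$PARAMS" ]; then
  CMD="$CMD $PARAMS"
fi

echo "$CMD"   # print the command for inspection
# To launch for real, replace the echo above with:  exec $CMD
```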


3 Menu

The menu of the Joone editor accesses standard operations.

3.1 File

Projects built with the Joone editor can be saved and reopened at a later date. Projects are saved as serialized files with the file extension 'ser'. Only one project can be edited at a time. Joone will prompt to save an edited project prior to closing, or opening a new project.

3.1.1 New

Creates a new Joone project. Any existing project changes are lost.

3.1.2 Open

Opens an existing Joone project. This will replace any existing project.

3.1.3 Save

Allows the current Joone project to be saved as a serialized file.

3.1.4 Save As

Allows a Joone project to be saved as a serialized file with a different name or path.

3.1.5 Export Neural Net

Allows the exporting of a neural net in a serialized form (.snet). This is provided for use in the distributed environment (or for other future uses).

3.1.6 Print

Prints a graphical representation of the current project.

3.1.7 Exit

Exits the Joone editor.

3.2   Edit   Joone allows various editing and positional actions to be performed on a component. Edit options may not be available if the required number of components is not selected. Copy


 

and paste operations are only available with drawing tools and not with Joone components.

3.2.1  Cut
Deletes a selected component from the screen but keeps it in memory for Paste operations.

3.2.2  Copy
Copies a selected component from the screen to memory.

3.2.3  Paste
Copies a selected component from memory to the screen.

3.2.4  Duplicate
Duplicates a selected component.

3.2.5  Delete
Deletes a selected component from the screen.

3.2.6  Group
This menu item groups a number of components together so that they can be manipulated as a single component.

3.2.7  Ungroup
This ungroups a previously grouped set of components.

3.2.8  Send to Back
Positions a selected component so that other components that overlap it are drawn over this one.

3.2.9  Bring to Front
Positions a selected component so that this component is drawn over other components that it overlaps.

3.3   Align

Several components can be selected concurrently by either clicking on each required component while holding down the Shift key, or by dragging a rectangle around a group of components. Once a number of components are selected, alignment menu options can be applied to align the controls relative to each other.

3.3.1  Toggle Snap to Grid
Turns on/off the alignment of the components on a fixed grid, facilitating the arrangement of the objects on the drawing area.


 


3.3.2  Left
Aligns selected components vertically along their left-hand edge.

3.3.3  Center
Aligns selected components vertically along their center line.

3.3.4  Right
Aligns selected components vertically along their right-hand edge.

3.3.5  Top
Aligns selected components horizontally along their top edge.

3.3.6  Middle
Aligns selected components horizontally along their middle.

3.3.7  Bottom
Aligns selected components horizontally along their bottom edge.

3.4   Attributes

The attributes of the drawing tools (see below) can be modified with the following commands. The following attributes can be changed:

3.4.1  Figure Attributes
• Fill Color
• Pen Color
• Line’s Arrows

3.4.2  Text Attributes
• Font
• Font Size
• Font Style
• Text Color

These commands do not affect the attributes of the neural network components (layers and connections).


 


3.5   Control

3.5.1  Control Panel

The Control Panel is the tool that controls the behaviour of the neural net. It contains three buttons:

Run:       Starts the neural net beginning from the first pattern of the input data set.
Continue:  Restarts the neural net from the last pattern processed.
Stop:      Stops the neural net.

The Control Panel parameters are:

Epochs:          The total number of cycles for which the net is to be trained.
Input Patterns:  The total number of input rows for which the net is to be trained. This can be different from the number of rows read from the FileInput component (lastRow – firstRow + 1).
Momentum:        The value of the momentum (see the literature about the backpropagation algorithm).
Learning Rate:   The value of the learning rate (see the literature about the backpropagation algorithm).
Learning:        True if the net is to be trained, otherwise set false.
Validation:      True if the net is to be tested on a validation data set. Used ONLY in conjunction with a Learning Switch component inserted in the net.
Pre-Learning:    The number of initial cycles skipped from the learning algorithm. Normally this parameter is zero, and it is used when there is a DelayLayer component in the net. In this case pre-learning must be set equal to the number of taps of that component, allowing its buffer to become filled before the learning cycle starts.


 

 


To better explain the use of the Pre-Learning parameter: it serves to avoid making changes to the values of biases and synapses in the presence of a delayed layer. If these values were altered from the first cycle of the learning process, they would be adjusted using a wrong input pattern, obtained before the full input ‘temporal window’ has been presented to the network. For instance, suppose an input pattern is composed of one column as follows:

0.2
0.5
0.1
...

and an input delay layer with taps = 3 is present. When the network has only read the first two input values (0.2 and 0.5), the output of this first layer would be the pattern {0, 0, 0.2, 0.5}.

[Figure: a delay layer with taps = 3, partially filled with the values 0.2 and 0.5]

In this case the network would learn the wrong {0, 0, 0.2, 0.5} pattern. Thus the Pre-Learning parameter must be set equal to the taps parameter so that the network starts to learn only when all the ‘taps’ values have been read.
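The effect described above can be sketched in plain Java. This is an illustrative delay-line buffer, not Joone's actual DelayLayer code (the class name is hypothetical); it shows that with taps = 3 the first patterns leaving the layer are padded with zeros until the temporal window fills.

```java
// Illustrative sketch (not Joone's DelayLayer): a single-row delay line,
// newest value first, showing the zero padding before the window fills.
import java.util.Arrays;

public class DelayLineDemo {
    private final double[] buffer; // buffer[0] = X(t), buffer[taps] = X(t - taps)

    public DelayLineDemo(int taps) {
        this.buffer = new double[taps + 1];
    }

    /** Shift the buffer by one step and insert the newest input value. */
    public double[] push(double x) {
        System.arraycopy(buffer, 0, buffer, 1, buffer.length - 1);
        buffer[0] = x;
        return buffer.clone();
    }

    public static void main(String[] args) {
        DelayLineDemo delay = new DelayLineDemo(3); // taps = 3
        delay.push(0.2);
        double[] out = delay.push(0.5);
        // After only two inputs the 'temporal window' is incomplete: the
        // network would see {0.5, 0.2, 0.0, 0.0} - partly zero padding,
        // i.e. the wrong pattern described in the text above.
        System.out.println(Arrays.toString(out));
    }
}
```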

3.5.2  Add Noise
This adds a random noise component to the net and is useful for allowing the net to exit from a local minimum. At the end of a round of training, adding noise may ‘jolt’ the network out of a local minimum so that further training produces a better network.

3.5.3  Randomize
This resets the weights of a neural network, initializing it to a random state.


 


3.5.4  Reset Input Streams
This command resets all the buffered input streams in input and teacher layers, permitting the reloading of their buffers with the input data. This is useful after the contents of some files have changed and it is necessary to reload the data.

3.5.5  Macro Editor
A powerful scripting management engine is provided in the Joone Editor. Before describing this feature, it is important to understand the concepts underlying the scripting engine.

There are two types of script in Joone:

Event-driven scripts   These are executed when a neural network’s event is raised.
User-driven scripts    These are executed manually by the user.

Both of the above types of scripts are contained within the neural network, and are serialized, stored and transported along with the neural network that contains them (in the same way that macros defined in an MS Word or Excel document are). In this manner the defined scripts can be executed even when the net is running on remote machines outside the GUI editor.

It is possible to load, edit, save and run any script (in this document also referred to as a Macro) using a user-friendly graphical interface. The Macro Editor can be opened with the Control->Macro Editor menu item. A window will be shown, as depicted in the following figure:


 

In the Macro Editor, the user can define both of the two script types. On the left side of the editor there is the list of the available scripts for the neural network. For a new network, the list already contains the definitions of all the available event-driven scripts. To edit a script, the user simply clicks on the desired script in the list and inserts the code in the text area on the right.

The text area has some useful features to help with the writing of the BeanShell code:

• Coloured code based on the Java syntax (comments, keywords, literals, etc.)
• Highlighting of matching opening/closing brackets and parentheses.
• Highlighting of the currently edited row of code.
• Different colours used to emphasize unclosed strings.

Note in the above figure the bold Java keywords “new” and “for”, the small box indicating the open bracket matching the one near the cursor, the greyed row indicating the currently edited row, and the red colour used to indicate an unterminated string.

The event-driven scripts cannot be renamed nor deleted from the list. If the user does not want to define an event-driven script, s/he can just remove or comment out the corresponding code in the text area.

The user-driven scripts can be added, renamed or deleted by choosing the corresponding

menu item in the ‘Macro’ menu.

3.5.5.1 The Macro Editor Menu
Here are details of all the menu items of the Macro Editor frame.

File
Import Macro     The content of a text file can be imported into the text area of the selected script. The old text will be replaced.
Save Macro as…   The content of the selected script can be exported into a text file.
Close            Closes the Macro Editor window.

Edit
Cut              Copies the selected text into the clipboard and deletes it from the text area (you can also use Ctrl-X).
Copy             Copies the selected text into the clipboard (you can also use Ctrl-C).
Paste            Pastes the content of the clipboard into the text area starting at the cursor position (you can also use Ctrl-V).
Select All       Selects all the content of the text area.

Macro
Enable Scripting If checked, this enables the execution of the event-driven scripts for the edited neural network. If not checked, all the events of the net will be ignored.
Add              Adds a new user-driven script (the user cannot insert new event-driven scripts).
Remove           Removes the selected user-driven script. Disabled for event-driven scripts.
Rename           Permits renaming of the selected user-driven script. Disabled for event-driven scripts.
Run              Runs the selected script.
Set Rate…        Sets the execution rate (the number of free training cycles between two execution calls) for cyclic event-driven scripts. The cyclic event-driven scripts are the ‘cycleTerminated’ and the ‘errorChanged’ scripts. Warning: the default value of rate for a new network is 1 (one), but it is recommended that the value be set to between 10 and 100 (or even more) to ensure that there is sufficient processing time available for the running of the neural network.


 


3.5.5.2 Macro Scripting Features
The following section describes some characteristics of the scripting feature added to Joone’s engine.

How to use internal Joone objects
To obtain a reference to the internal neural network’s objects use:

• jNet to access the edited org.joone.net.NeuralNet object
• jMon to access the contained org.joone.engine.Monitor object

For example:

• jNet.getLayers().size() returns the number of layers contained in the neural network.
• jMon.getGlobalError() returns the last RMSE value from the net.

A list of the callable public methods for the above two objects is available in the project’s javadoc HTML files. To use the objects from the Joone libraries, it is not necessary to import the corresponding packages. The following packages are imported automatically for you by the script engine:

org.joone.engine.*
org.joone.engine.learning.*
org.joone.edit.*
org.joone.util.*
org.joone.net.*
org.joone.io.*
org.joone.script.*

For instance, to create a new sigmoid layer, simply write:

newLayer = new SigmoidLayer();

How to call another script from within a macro
Within a macro it is possible to call another script contained in the neural network (both user-driven and event-driven scripts). To do this, use the following code:

name = "NameOfTheMacroToCall";
macro = jNet.getMacroPlugin().getMacroManager().getMacro(name);
eval(macro);

The scope of the script’s variables
All the scripts defined in a neural network (both user-driven and event-driven scripts) share the same namespace and actual-values storage. Thus a global variable declared and initialised in script_A can be accessed in script_B.

For example:

1. Add a macro named ‘macro1’ and insert into it the code:

myVariable = 5;

2. Add a macro named ‘macro2’ and insert into it the code:

print("The value is: " + myVariable);

Run first macro1 and then macro2; you will see in the standard output console the result: “The value is: 5”. For further details about scope and the use of variables, see the BeanShell manual.

3.6   Look’n’Feel

The default ‘look and feel’ of the Joone editor is Metal. Currently there is no way to make another style the default.

3.6.1  Metal
Selecting this menu option displays the editor in the Metal GUI style.

3.6.2  CDE/Motif
Selecting this menu option displays the editor in the CDE/Motif GUI style.

3.6.3  Windows
Selecting this menu option displays the editor in the Windows GUI style.

3.7   Help

3.7.1  About Joone
This option displays the Joone Editor About screen. The current version of the Editor and of the Engine being used is also displayed. Version numbers are of the form ‘major-release.minor-release.build’, e.g. 1.2.5. If an older, incompatible engine version is detected, a warning will be displayed in the About screen, as backward compatibility is not guaranteed between editor and engine versions.

3.7.2  Help Contents
This option displays the online help of the editor.


 


4  Toolbar

The tool bar buttons are divided into two palettes. One contains all the drawing buttons, while the other contains all the construction components, as shown in the following figure:

The content of the drawing palette is determined by the Joone application, while the component panel is configurable by modifying the layers.xml file. Please note that not all of the following components may appear by default in the Joone Editor application due to limited space on the tool bars. See the chapter on the XML parameter file for details on how to alter the items that appear in the toolbar.

4.1  Selection Tool

The Selection Tool allows Layers to be selected and dragged around the screen. It is also used to create links between Layers.

4.2   Label and Connected Label

The Label tool allows text comments to be placed on the screen. The Connected Label tool allows the addition of text to each drawing tool. The attached text will follow the figure’s movements.

4.3   Drawing Tools

These tools permit the addition of several figures to the drawing panel of the GUI editor. They will be saved along with the neural network (Save menu item) and then restored on the drawing panel (Open menu item).

4.4   Input Layers

The New File Input Layer, New URL Layer, New Excel Input Layer, Switch Input Layer and Learning Switch tools allow new input layers to be added to the screen.

4.5   Layers

The New Linear Layer, New Sigmoid Layer, New Tanh Layer, New Logarithmic Layer, New Context Layer, New Nested ANN Layer and New Delay Layer tools allow new processing layers to be added to the screen.


 


4.6   Output Layers

The New Switch Output Layer, New File Output Layer, New Excel Output Layer and New Teacher Layer tools allow new output layers to be added to the screen.

4.7   Charting Component

This component belongs to the output components family, so it can be used anywhere that it makes sense to insert an output component. For instance, it can be attached to the output of a Layer, or to the output of a Teacher component.

The charting component has the following properties:

maxXaxis:   The maximum value of the X axis. Set this value to the maximum number of samples required to visualise in the chart.
maxYaxis:   The maximum value of the Y axis. Set this value to the maximum value you expect to display in the chart.
name:       The name of the chart component.
resizable:  If true, the window containing the chart will be resizable. Resizing the window will rescale the entire chart, adapting it to the new size of the frame.
show:       Used to show or hide the window containing the chart.
title:      The text shown in the title bar of the window containing the chart.
serie:      Indicates what series (column) in a multicolumn output data is to be displayed.

All the above properties can be updated at any time, including during the running of the network, making it possible to show the chart at several sizes or resolutions. This is an example of the use of the charting component:


 


In the above example, the charting component is attached to the output of a teacher component to capture and display the RMSE values during the training phase. The maxXaxis property is set to the number of the training epochs, while the maxYaxis is set to the maximum value we want to show in the chart. The user can change either of these values at any time to show a zoomed view of the chart.

4.8   Synapses

These components allow the user to choose the type of synapse that connects two layers. In the toolbar there are three buttons as above, representing respectively a Full Synapse, a Direct Synapse and a Delayed Synapse. More types will be added in future versions of Joone.


 


• A Full Synapse connects every output of one layer to the input of every neuron in another layer.
• A Direct Synapse connects each output of one layer to exactly one neuron in the other layer. The number of outputs of the first layer must match the number of neurons in the second layer, or an exception will be generated when the net is started.
• A Delayed Synapse behaves as a Full Synapse where each connection is implemented with a FIRFilter object. This connection implements the temporal backpropagation algorithm by Eric A. Wan, as in ‘Time Series Prediction by Using a Connectionist Network with Internal Delay Lines’ in Time Series Prediction: Forecasting the Future and Understanding the Past, by A. Weigend and N. Gershenfeld, Addison-Wesley, 1994.
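The sizing rules above reduce to simple arithmetic. The following sketch is not the Joone API (the class and method names are hypothetical); it only illustrates the connection counts implied by the Full and Direct synapse descriptions.

```java
// Illustrative sketch (not Joone API): connection counts implied by the
// synapse types described above.
public class SynapseCounts {
    /** A Full Synapse connects every output to every neuron: n * m links. */
    public static int fullSynapseLinks(int outputs, int neurons) {
        return outputs * neurons;
    }

    /** A Direct Synapse is one-to-one; the layer sizes must match. */
    public static int directSynapseLinks(int outputs, int neurons) {
        if (outputs != neurons) {
            // The manual states Joone raises an exception when the net starts.
            throw new IllegalArgumentException("layer sizes must match");
        }
        return outputs;
    }

    public static void main(String[] args) {
        System.out.println(fullSynapseLinks(3, 4));   // 12 weighted links
        System.out.println(directSynapseLinks(4, 4)); // 4 one-to-one links
    }
}
```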

To use these components, the user first selects the tool button corresponding to the synapse required, then drags a line from one layer to another in the drawing area. The newly inserted synapse will be shown as an arrow containing a small box at its centre, as in the following figure:

The box contains a label indicating the type and state of the synapse. The available types are shown in the following table:

F   = Full Synapse
D   = Direct Synapse
-1  = Delayed Synapse

If the little box is greyed, then the synapse is disabled, indicating that this branch of the net is interrupted. To disable a synapse, right-click it and set the ‘Enabled’ property in the property panel to false. This feature is very useful when designing a neural network, to try several architectures on the fly.

The neural network contained in the file org/joone/samples/synapses/synapses.ser contains an example using the synapses currently available in JoonEdit.

Warning:
• Use the above tool buttons ONLY to connect two layers and NOT to connect any other component such as input or output components, plugins, etc. Doing otherwise will cause an exception.
• The basic method to connect two layers is to simply drag an arrow from the right handle of one layer to the other, which defaults to a Full Synapse.


 

• Neural networks saved with a previous version of JoonEdit will continue to work, but the synapses will be shown unlabelled. This should not create confusion because previous releases of the Joone editor created only full synapses. The pop-up menu will work on older versions.

4.8.1  The Pop-Up Menu for all Connection Types

Right-clicking on any connection displays a pop-up menu as for other components.

The menu contains two items:

Properties   Shows the property panel for the selected synapse.
Delete       Deletes the connection.

NB:
• The Properties item is not shown if the selected line does not contain a synapse, for instance the arrow connecting an input component to a layer.
• The Delete item is included with all pop-up menus.

4.9   Plugins

The New Center on Zero Layer, New Normalize Layer and New Turning Point Layer tools, as well as the New Linear Annealing and New Dynamic Annealing Monitor tools, allow new plugin layers to be added to the screen.


 


5  Layers

There are numerous layer types available in the Joone editor.

5.1  Processing Layers

These layers contain processing neurons. They consist of a number of neurons as set by the layer’s rows parameter. The differences between layer types are described in this section.

5.1.1  Linear
The output of a linear layer neuron is the sum of the input values, scaled by the beta parameter. No transfer function is applied to limit the output value; the layer weights are always unity and are not altered during the learning phase.

5.1.2  Sigmoid
The output of a sigmoid layer neuron is the sum of the weighted input values, applied to a sigmoid function. This function is expressed mathematically as:

y = 1 / (1 + e^(-x))

This has the effect of smoothly limiting the output within the range 0 and 1.

5.1.3  Tanh
The tanh layer is similar to the sigmoid layer except that the applied function is a hyperbolic tangent function. This function is expressed as:

y = (e^x – e^(-x)) / (e^x + e^(-x))

This has the effect of smoothly limiting the output within the range –1 and 1.

During training, the weights of the Sigmoid and Tanh layers are adjusted to match the teacher layer data to the network output.

5.1.4  Logarithmic
The logarithmic layer is similar to the sigmoid layer except that the applied function is a logarithmic function. This function is expressed as:

y = log(1 + x)   if x >= 0
y = log(1 – x)   if x < 0

This layer is useful to avoid saturating the input of the layer in the presence of input values near the extreme points 0 and 1.
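The three transfer functions above can be written directly in plain Java. This is an illustrative sketch using the formulas as printed in this chapter, not Joone's Layer classes; the class name is hypothetical.

```java
// Illustrative sketch of the transfer functions described above; plain-Java
// formulas, not Joone's SigmoidLayer/TanhLayer/LogarithmicLayer classes.
public class Transfer {
    /** Sigmoid: y = 1 / (1 + e^-x), output smoothly limited to (0, 1). */
    public static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    /** Tanh: y = (e^x - e^-x) / (e^x + e^-x), output limited to (-1, 1). */
    public static double tanh(double x) {
        double ep = Math.exp(x), en = Math.exp(-x);
        return (ep - en) / (ep + en);
    }

    /** Logarithmic: y = log(1 + x) for x >= 0, y = log(1 - x) for x < 0. */
    public static double logarithmic(double x) {
        return x >= 0 ? Math.log(1.0 + x) : Math.log(1.0 - x);
    }

    public static void main(String[] args) {
        System.out.println(sigmoid(0.0));      // 0.5
        System.out.println(tanh(0.0));         // 0.0
        System.out.println(logarithmic(0.0));  // 0.0
    }
}
```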


 

 


5.1.5  Context
The context layer is similar to the linear layer except that it has an auto-recurrent connection between its output and input, as depicted in the following figure:

[Figure: a processing element PE with input x, output y, and a recurrent connection of weight w from the output back to the input]

Its activation function is expressed as:

y = b * (x + y(t-1) * w)

where:
b = the beta parameter (inherited from the linear layer)
w = the fixed weight of the recurrent connection (not learned)

The w parameter is named ‘timeConstant’ in the property panel because it back-propagates the past output signals and, as its value is less than one, the contribution of the past signals decays slowly toward zero at each cycle. In this manner the context layer has its own embedded ‘memory’ mechanism. This layer is used in recurrent neural networks like the Jordan-Elman NNs.
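The decay behaviour described above can be simulated with a single recurrent neuron. This is an illustrative sketch (the class name is hypothetical, not Joone's ContextLayer); it shows a past input echoing through the output and fading at each cycle when w < 1.

```java
// Illustrative sketch (not Joone's ContextLayer): the recurrent update
// y(t) = b * (x(t) + y(t-1) * w), showing how past output decays when w < 1.
public class ContextNeuron {
    private final double beta;         // b: the beta parameter
    private final double timeConstant; // w: fixed recurrent weight, < 1
    private double lastOutput = 0.0;   // y(t-1)

    public ContextNeuron(double beta, double timeConstant) {
        this.beta = beta;
        this.timeConstant = timeConstant;
    }

    /** Apply one cycle of the activation function and remember the output. */
    public double step(double x) {
        lastOutput = beta * (x + lastOutput * timeConstant);
        return lastOutput;
    }

    public static void main(String[] args) {
        ContextNeuron n = new ContextNeuron(1.0, 0.5);
        n.step(1.0);                     // y = 1.0
        System.out.println(n.step(0.0)); // 0.5  - an echo of the past input
        System.out.println(n.step(0.0)); // 0.25 - decaying toward zero
    }
}
```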

5.1.6  Nested ANN
The nested neural network layer permits an entire neural network to be added to the network being edited. Using this component, it is possible to build modular neural networks constituted of several pre-built neural networks, allowing complex, compound ‘organisms’ to be created. The ‘Nested ANN’ parameter must be filled with the name of a serialized neural network (saved with the File->Export Neural Net menu item).
Note: A neural network to be used in a nested ANN component must be composed solely of processing elements, and not file I/O and/or Teacher layers.

5.1.7  Delay
The delay layer applies the sum of the input values to a delay line, so that the output of each neuron is delayed a number of iterations specified by the taps parameter. To understand the meaning of the taps parameter, consider the following picture, which contains two different delay layers: one with 1 row and 3 taps, and another with 2 rows and 3 taps:


 

 


[Figure: two delay lines. Left: Rows = 1, Taps = 3, with outputs X1(t), X1(t-1), X1(t-2), X1(t-3). Right: Rows = 2, Taps = 3, with outputs Xn(t), Xn(t-1), Xn(t-2), Xn(t-3) for each row n = 1, 2]

The delay layer has:
• a number of inputs equal to the rows parameter
• a number of outputs equal to rows * (taps + 1)

The taps parameter indicates the number of delayed output cycles for each row of neurons, plus one because the delay layer also presents the actual input sum signal Xn(t) to the output. During the training phase, error values are fed backwards through the delay layer as required. This layer is very useful to train a neural network to predict a time series, giving it a ‘temporal window’ of the raw input data.
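The dimension rules above amount to simple arithmetic, sketched here for checking (plain Java, not Joone code; the names are hypothetical).

```java
// Quick check of the delay layer dimensions stated above:
// inputs = rows, outputs = rows * (taps + 1).
public class DelaySizes {
    public static int outputs(int rows, int taps) {
        return rows * (taps + 1);
    }

    public static void main(String[] args) {
        System.out.println(outputs(1, 3)); // 4: X1(t) .. X1(t-3)
        System.out.println(outputs(2, 3)); // 8: X1(t) .. X2(t-3)
    }
}
```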


 


5.2   I/O Layers

The I/O (Input / Output) layers represent interfaces between the processing layers of a neural network and the external environment, providing the net with the data needed for processing and/or training.

5.2.1  File Input
The file input layer allows data in a file to be applied to a network for processing. Data for processing is expected as a number of rows of semicolon-separated columns of values. For example, the following is a set of three rows of four columns:

0.2;0.5;0.6;0.4
0.3;-0.35;0.23;0.29
0.7;0.99;0.56;0.4

Each value in a row will be made available as an output of the file layer, and the rows will be processed sequentially by successive processing steps of the network. As some files may contain information additional to the required data, the parameters firstRow, lastRow, firstCol and lastCol may be used to define the range of useable data. The filename parameter specifies the file that is to be read from.
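A minimal sketch of this row/column selection, assuming 1-based inclusive indices as the parameter names suggest. This is illustrative plain Java, not Joone's actual file input implementation, and the class name is hypothetical.

```java
// Illustrative sketch (not Joone's file input code): reading semicolon-
// separated rows and selecting a range with firstRow/lastRow/firstCol/lastCol
// (assumed here to be 1-based and inclusive).
import java.util.ArrayList;
import java.util.List;

public class SemicolonReader {
    public static List<double[]> select(String text, int firstRow, int lastRow,
                                        int firstCol, int lastCol) {
        List<double[]> rows = new ArrayList<>();
        String[] lines = text.split("\n");
        for (int r = firstRow; r <= lastRow; r++) {
            String[] cols = lines[r - 1].split(";");
            double[] values = new double[lastCol - firstCol + 1];
            for (int c = firstCol; c <= lastCol; c++) {
                values[c - firstCol] = Double.parseDouble(cols[c - 1]);
            }
            rows.add(values);
        }
        return rows;
    }

    public static void main(String[] args) {
        String data = "0.2;0.5;0.6;0.4\n0.3;-0.35;0.23;0.29\n0.7;0.99;0.56;0.4";
        // Take rows 2..3 and columns 1..2 of the sample data set above.
        List<double[]> selected = select(data, 2, 3, 1, 2);
        System.out.println(selected.size());    // 2 rows
        System.out.println(selected.get(0)[1]); // -0.35
    }
}
```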

5.2.2  URL Input  The URL input layer allows data from a remote location to be applied to a network for processing. The allowed protocols are http and ftp. The data format is the same as for the FileInput layer.

5.2.3  Excel Input

The Excel Input layer permits data from an Excel file to be applied to a neural network for processing. Its ‘sheet’ parameter allows the name of the sheet from which the input data is read to be chosen.

5.2.4  Switch Input
The switch input allows the choice of which input component is to be connected to the neural network, choosing between all the input components attached to it. The user, after having attached several input components to its input, can set the ‘active input’ parameter with the name of the chosen component that is to be connected to the net. The ‘default input’ parameter must be filled with the name of the default component (the one activated when the user selects the ‘Control->Reset Input Streams’ menu item).

The switch input component, along with the output switch layer, permits dynamic changing of the architecture of the neural network, changing the input and/or output data layers attached to the neural network at any time. This is useful to switch the input source, for instance, between the file containing the training data set and the file containing the validation data set to test the training of the neural network, as depicted in the following screen shot:

 


5.2.5  Learning Switch
The learning switch is a special implementation of the Switch Input component and can be used to attach both a training data set and a validation data set to the neural net. In this way the user can test the generalization property of a neural network using a different data set to the one used during the training phase.

The training input data set can be attached by dragging an arrow from the input component to the learning switch, while the validation input data set can be attached simply by dragging an arrow from the red square on top of the learning switch to the input component containing the validation data set. To change the switch between them, simply set the value of the 'validation' parameter shown in the Control Panel.

This component has two properties:

trainingPatterns:    Must be set to the number of input patterns constituting the training data set.
validationPatterns:  Must be set to the number of input patterns constituting the validation data set.

Both these parameters are obtained from the following formula:

lastRow - firstRow + 1

where the lastRow/firstRow variables contain the values of the corresponding parameters of the attached input components. The following figure depicts the use of this component:


 


Warning: Because a validation data set will also be required for the desired data, this component must be inserted both before the input layer of the neural network and between the Teacher layer and the desired input data sets.

5.2.6  File Output
The file output layer is used to convert the results of a processing layer to a text file. The filename parameter specifies the file that the results are to be written to. Results are written in the same semicolon-separated form as file input layers.

5.2.7  Excel Output
The Excel output layer is used to write the results of a processing layer to an Excel-formatted file. The filename parameter specifies the file that the results are to be written to. The ‘sheet’ parameter allows the name of the sheet to which the data is to be written to be chosen.

5.2.8  Switch Output
The switch output permits the choice of which output component is to be connected to the neural network, choosing between all the attached output components. The user, after having attached several components to its output, can set the ‘active output’ parameter with the name of the chosen component that is to be connected to the net. The ‘default output’ parameter must be filled with the name of the default component (the one activated when the user selects the ‘Control->Reset Input Streams’ menu item).

5.2.9  Teacher
The Teacher layer allows the training of a neural net using the back-propagation learning algorithm. It calculates the difference between the actual output of the net and the expected value from an input source. It provides this difference to the output layer for the training.

To train a net, add a Teacher component, connecting it to the output layer of the net, and then connect an input layer component to it, linked to a source containing the expected data (see figure).
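The error computation performed by a Teacher-style component can be sketched as follows. This is illustrative plain Java, not Joone's actual Teacher code (the class name is hypothetical); it computes the per-pattern differences and the RMSE that the Control section describes as the global error.

```java
// Illustrative sketch (not Joone's Teacher component): the difference
// between actual and desired outputs, summarised as an RMSE value.
public class TeacherError {
    /** Root mean square error between actual outputs and desired outputs. */
    public static double rmse(double[] actual, double[] desired) {
        double sum = 0.0;
        for (int i = 0; i < actual.length; i++) {
            double diff = actual[i] - desired[i];
            sum += diff * diff;
        }
        return Math.sqrt(sum / actual.length);
    }

    public static void main(String[] args) {
        double[] actual  = {0.9, 0.1, 0.8};
        double[] desired = {1.0, 0.0, 1.0};
        System.out.println(rmse(actual, desired));
    }
}
```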

 



 


6  Plugins

There are three types of pre-processing plugins for the input data, and two monitor plugins. A connection to a plugin can be added by dragging an arrow from the magenta square handle on the bottom side of an input layer, as depicted in the following figure:

6.1  Pre-Processing Plugins
There are three pre-processing plugins implemented, but others can be implemented by extending the org.joone.util.ConverterPlugIn class:

• Centre on Zero: this plugin centres the entire data set around the zero axis by subtracting the average value.
• Normalizer: this plugin can normalize an input data stream within a range determined by its min and max parameters.
• Turning Points Extractor: this plugin extracts the turning points of a time series, generating a useful input signal for a neural net, emphasising the relative max and min of a time series (very useful to extract buy and sell instants for stock forecasting). Its minChangePercentage parameter indicates what the minimum change around a turning point should be to consider it a real change of direction of the time series. Setting this parameter to a relatively high value helps to reject the noise of the input data.

Every plugin has a common parameter named serie. This indicates which series (column) of a multi-column input data set is to be affected (0 = all series). A plugin can be attached to an input layer, or to another plugin, so that pre-processing modules can be cascaded. If both centre-on-zero and normalize processing is required for an input stream, the centre on zero plugin can be connected to a file input layer, and then a normalizer plugin attached to this, as shown in the following figure:
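Conceptually, the first two plugins amount to subtracting the mean and linearly rescaling a series. The following plain-Java sketch (a hypothetical helper, not Joone's actual ConverterPlugIn code) shows the arithmetic involved:

```java
// Illustrative sketch (not Joone's code) of what the Centre on Zero
// and Normalizer plugins do to a single input series.
public class PreProcessingSketch {

    /** Subtracts the average value, centring the series around zero. */
    public static double[] centreOnZero(double[] series) {
        double sum = 0;
        for (double v : series) sum += v;
        double avg = sum / series.length;
        double[] out = new double[series.length];
        for (int i = 0; i < series.length; i++) out[i] = series[i] - avg;
        return out;
    }

    /** Rescales the series linearly into the [min, max] target range. */
    public static double[] normalize(double[] series, double min, double max) {
        double dMin = Double.MAX_VALUE, dMax = -Double.MAX_VALUE;
        for (double v : series) {
            if (v < dMin) dMin = v;
            if (v > dMax) dMax = v;
        }
        double[] out = new double[series.length];
        for (int i = 0; i < series.length; i++)
            out[i] = min + (series[i] - dMin) * (max - min) / (dMax - dMin);
        return out;
    }
}
```

Cascading centre-on-zero and normalization, as described above, then corresponds to normalize(centreOnZero(data), 0, 1).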


 


6.2  Monitor Plugins
There are also two monitor plugins. These are useful for dynamically controlling the behaviour of control panel parameters (parameters contained in the org.joone.engine.Monitor object).
The Linear Annealing plugin changes the values of the learning rate (LR) and momentum parameters linearly during training. The values vary from an initial value to a final value, and the step is determined by the following formulas:
step = (FinalValue - InitValue) / numberOfEpochs
LR = LR - step
The Dynamic Annealing plugin controls the change of the learning rate based on the difference between the last two global error (E) values, as follows:
If E(t) > E(t-1) then LR = LR * (1 - step/100%).
If E(t) <= E(t-1) then LR remains unchanged.
The 'rate' parameter indicates how many epochs occur between annealing changes. These plugins are useful to implement the annealing (hardening) of a neural network, changing the learning rate during the training process. With the Linear Annealing plugin, the LR starts with a large value, allowing the network to quickly find a good minimum, and then the LR reduces, permitting the found minimum to be fine-tuned toward the best value, with little risk of a large LR causing escape from a good minimum.
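The two schedules can be sketched in plain Java as follows. This is a hypothetical helper, not part of the Joone API; note that the step is computed here as (InitValue - FinalValue) / numberOfEpochs, so that subtracting it each epoch moves the LR down from the initial value toward the final one:

```java
// Sketch (hypothetical helper, not the Joone API) of the two annealing
// strategies applied to the learning rate (LR).
public class AnnealingSketch {

    /** Linear annealing: LR moves from initValue to finalValue in equal steps. */
    public static double linearStep(double lr, double initValue,
                                    double finalValue, int numberOfEpochs) {
        double step = (initValue - finalValue) / numberOfEpochs;
        return lr - step; // after numberOfEpochs calls, LR reaches finalValue
    }

    /** Dynamic annealing: reduce LR by step% only when the error got worse. */
    public static double dynamicStep(double lr, double lastError,
                                     double previousError, double stepPercent) {
        if (lastError > previousError)
            return lr * (1 - stepPercent / 100.0);
        return lr; // error improved or unchanged: LR remains the same
    }
}
```

For example, with an initial LR of 0.8, a final LR of 0.3 and 5 epochs, each call to linearStep lowers the LR by 0.1.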

 

 

The Dynamic Annealing plugin is an enhancement of the linear concept, reducing the LR only as required, when the global error of the neural net becomes larger (worse) than the previous step's error. This may at first appear counter-intuitive, but it allows a good minimum to be found quickly and then helps to prevent its loss.



6.2.1  The Annealing Concept 

[Figure: an error surface, showing the actual error of the NN as a ball moving toward the relative and absolute minima]

To explain why the learning rate has to diminish as the error increases, look at the above figure. All the weights of a network represent an error surface of n dimensions (for simplicity, in the figure there are only two dimensions). To train a network means to modify the connection weights so as to find the best group of values that give the minimum error for certain input patterns.
In the above figure, the red ball represents the actual error. It 'runs' on the error surface during the training process, approaching the minimum error. Its velocity is proportional to the value of the learning rate, so if this velocity is too high, the ball can overstep the absolute minimum and become trapped in a relative minimum.
To avoid this side effect, the velocity (learning rate) of the ball needs to be reduced as the error becomes worse (the grey ball).

7  Basic Tutorial

This tutorial creates a simple network connecting a file input layer containing four examples of two input values to a file output layer via a linear layer containing two neurons.
1. Using a text editor, create a new file and add four lines as follows:
0.2;0.3
0.4;0.5
0.6;0.8
0.9;1.0

2. Save the file to disk (e.g. c:\temp\sample.txt).
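If preferred, the same semicolon-separated file can be produced programmatically. This is just a convenience sketch (the output path is an example, not required by Joone):

```java
import java.io.FileWriter;
import java.io.IOException;

// Writes the four sample rows of the tutorial as a semicolon-separated file.
public class WriteSample {

    /** Builds the file content created by hand in steps 1-2. */
    static String sampleContent() {
        String[] rows = {"0.2;0.3", "0.4;0.5", "0.6;0.8", "0.9;1.0"};
        StringBuilder sb = new StringBuilder();
        for (String row : rows) sb.append(row).append('\n');
        return sb.toString();
    }

    public static void main(String[] args) throws IOException {
        try (FileWriter out = new FileWriter("sample.txt")) {
            out.write(sampleContent());
        }
    }
}
```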


 

3. Start JoonEdit and insert a new linear layer. Click on this layer to display the properties page. Set the rows value to 2.
4. Insert a File Input layer to the left of the linear layer, then click on it to display the properties window:
• Set the firstCol parameter to 1.
• Set the lastCol parameter to 2.
• Enter c:\temp\sample.txt in the fileName parameter.
• Leave the firstRow as 1 and the lastRow as 0 so that the input layer will read all the rows in the file.
5. Connect the input layer to the linear layer by dragging a line from the little circle on the right hand side of the input layer, releasing the mouse button when the arrow is on the linear layer.
6. Now insert a File Output layer to the right of the linear layer, click on it and insert into the properties window:
• Enter c:\temp\sampleout.txt in the fileName parameter.
7. Connect the linear layer to the file output layer by dragging a line from the little circle on the right hand side of the linear layer, releasing the mouse button when the arrow is on the file output layer.
8. At this stage the screen should look similar to this:

9. Click on the 'Net->Control Panel' menu item to display the control panel. Insert the following:
• Set the totCicles parameter to 1. This will process the file once.
• Set the patterns parameter to 4. This sets the number of example rows to read.
• Leave the learningRate and the momentum fields unchanged. These parameters are used for training a net.
• Also set the learning parameter to FALSE, as the net is not being trained.
10. Click the START button.
11. A file named c:\temp\sampleout.txt will be written with the results.
12. If you want, you can save the net to disk with the 'File->Save As' menu item, and reload it later with 'File->Open'.


 

 


8  An Advanced Example: The XOR Problem

This example illustrates a more complete construction of a neural net to teach the classical XOR problem. In this example, the net is required to learn the following XOR truth table:

Input 1 | Input 2 | Output
   0    |    0    |   0
   0    |    1    |   1
   1    |    0    |   1
   1    |    1    |   0

So, we must create a file containing these values:
0.0;0.0;0.0
0.0;1.0;1.0
1.0;0.0;1.0
1.0;1.0;0.0

A semicolon needs to separate each column. The decimal point is not mandatory if the numbers are integers. Create this file with a text editor and save it on the file system (for instance c:\joone\xor.txt in a Windows environment). Now we'll build a neural net like this:

Run the editor, and execute the following steps:
1. Add a new sigmoid layer, and set its layerName to 'Input' and the rows parameter to 2.
2. Add a new sigmoid layer, and set its layerName to 'Hidden' and the rows parameter to 3.
3. Add a new sigmoid layer, and set its layerName to 'Output', leaving the rows parameter as 1.


 

4. Connect the input layer to the hidden layer by dragging a line from the little circle on the right hand side of the input layer, releasing the mouse button when the arrow is on the hidden layer.
5. Repeat the above step connecting the hidden layer to the output layer.
At this stage the screen should look similar to this:

6. Insert a File Input layer to the left of the input layer, then click on it to display the properties window:
o Set the firstCol parameter to 1
o Set the lastCol parameter to 2
o Enter c:\joone\xor.txt in the fileName parameter
o Leave the firstRow as 1 and the lastRow as 0 so that the input layer will read all the rows in the file
7. Connect the File Input to the input layer.
8. Insert a Teacher layer to the right of the output layer.
9. Connect the output layer to the Teacher layer.
Now we must provide the desired data to the teacher (the last column of the file xor.txt) to train the net:
10. Insert a File Input layer above the Teacher layer, then click on it to display the properties window:
o Set the firstCol parameter to 3
o Set the lastCol parameter to 3
o Enter c:\joone\xor.txt in the fileName parameter
o Leave the firstRow as 1 and the lastRow as 0 so that the input layer will read all the rows in the file
o Set the name parameter to 'Desired data'
11. Connect the Teacher layer to that last File Input layer by dragging a line from the little red square on the top side of the Teacher layer, releasing the mouse button when the yellow arrow is on the last inserted File Input layer.
At this stage the screen should look similar to this:


 


12. Click on the 'Net->Control Panel' menu item to display the control panel. Insert the following:
o Set the totCicles parameter to 10,000. This will process the file 10,000 times.
o Set the patterns parameter to 4. This sets the number of example rows to read.
o Set the learningRate parameter to 0.8 and the momentum parameter to 0.3.
o Set the learning parameter to TRUE, as the net must be trained.
o Set the validation parameter to FALSE, as the net is not being tested.
13. Click the START button, and you'll see the training process starting.
The Control Panel shows the cycles remaining and the current error of the net. At the end of the last cycle, the error should be very small (less than 0.1); otherwise click on the Net->Randomize menu item (to add a little 'noise' to the weights of the net) and click the START button again. If required, you can save the net with the 'File->Save As…' menu item, so you can reuse the net later by loading it from the file system.


 


8.1  Testing the Trained Net
To test the trained net:
14. Add an Output File layer on the right of the output layer, click on it and insert into the properties window:
o Enter c:\temp\xorout.txt in the fileName parameter
15. Connect the output layer of the net to the File Output layer.
16. Select the line that connects the output layer to the Teacher layer and click on Edit->Delete to disconnect the Teacher from the neural net.
17. On the Control Panel change the following:
o Set the totCicles parameter to 1. This will process the input file once.
o Set the learning parameter to FALSE, as the net is not being trained.
18. Click on the START button.
19. Open the xorout.txt file with an editor, and you'll see a result like this (the values can change from one run to another, depending on the initial random values of the weights):
0.02592995527027603
0.9664584492704527
0.9648193164739925
0.03994103766843536

This result shows that the neural net has learned the XOR problem, providing the correct results:
• a value near zero when the input columns are equal to 0;0 and 1;1
• a value near one when the input columns are equal to 0;1 and 1;0
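In code terms, mapping these raw outputs back to boolean XOR values is a simple threshold at 0.5. This is an illustrative sketch, not part of Joone:

```java
// Maps the raw net outputs in xorout.txt back to XOR truth values
// by thresholding at 0.5 (values taken from the sample run above).
public class XorCheck {

    /** Rounds a net output to the nearest truth value (0 or 1). */
    static int threshold(double output) {
        return output >= 0.5 ? 1 : 0;
    }

    public static void main(String[] args) {
        double[] outputs = {0.02592995527027603, 0.9664584492704527,
                            0.9648193164739925, 0.03994103766843536};
        for (double o : outputs)
            System.out.println(o + " -> " + threshold(o));
    }
}
```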


 


9  The XML Parameter File

This section explains how to modify the tool palette to personalize and extend the editor with new components. The following sections are provided in the layers.xml file, provided with the editor package. By default the Joone Editor will look for the file org/joone/data/layers.xml as the parameter file, but this behaviour can be overridden by specifying the parameter file as a parameter to the run command: java org.joone.edit.JoonEdit /some_path/your_parameter_file.xml

The layers.xml file provided with Joone includes all the available component types in the Joone project, although some of the less-used ones are commented out.

9.1  The <buttons> … </buttons> Section
This section contains all the buttons that are present in the toolbar.
<layer type="…" class="…" image="…" />

This section describes a layer. Its parameters are:
type: The name of the layer that will be shown when the mouse passes over it.
class: The name of the class (complete with the package: xxx.yyy.zzz.classname). The class must extend the org.joone.engine.Layer class and must be in the JVM classpath.
image: The name of the image of the toolbar button. The name of the file should be "org.joone.images." + image_name + ".gif", so the searched file name will be /org/joone/images/image_name.gif (only gif images are allowed in the current version of the editor).

<input_layer type="…" class="…" image="…" />
This section describes an input layer. All parameters are the same as above. The input layer class must extend org.joone.io.StreamInputSynapse and must be in the classpath.
<input_switch type="…" class="…" image="…" />
This section describes an input switch layer. All parameters are the same as above. The input switch layer class must extend org.joone.engine.InputSwitchSynapse and must be in the classpath.
<learning_switch type="…" class="…" image="…" />
This section describes a learning switch layer. All parameters are the same as above. The learning switch layer class must extend org.joone.engine.learning.LearningSwitch and must be in the classpath.
<output_layer type="…" class="…" image="…" />


 

This section describes an output layer. All parameters are the same as above. The output layer class must extend org.joone.io.StreamOutputSynapse and must be in the classpath.
<output_switch type="…" class="…" image="…" />
This section describes an output switch layer. All parameters are the same as above. The output switch layer class must extend org.joone.engine.OutputSwitchSynapse and must be in the classpath.
<teacher_layer type="…" class="…" image="…" />
This section describes a teacher layer. All parameters are the same as above. The teacher layer class must extend org.joone.engine.learning.TeachingSynapse and must be in the classpath.
<input_plugin type="…" class="…" image="…" />
This section describes an input plugin for data pre-processing. All parameters are the same as above. The input plugin class must extend the org.joone.util.ConverterPlugIn class and must be in the classpath.
<monitor_plugin type="…" class="…" image="…" />
This section describes a monitor plugin to control the behaviour of the training process. All parameters are the same as above. The monitor plugin class must extend the org.joone.util.MonitorPlugin class and must be in the classpath.
<synapse type="…" class="…" label="…" image="…" />
This section describes a synapse to connect two layers together. This tag has the same properties as the other components, plus a 'label' property to set the label shown in the little box. The label is not mandatory but its use is recommended to visually distinguish the various types of synapse displayed in the drawing area. If no label is specified, the synapse will be drawn as an arrow. The synapse component must extend the org.joone.engine.Synapse class.
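As an illustrative sketch, a minimal <buttons> section combining some of these tags might look like the following. The class names are real Joone classes, but the type labels and image names here are assumptions for the example:

```xml
<buttons>
  <layer type="Sigmoid Layer" class="org.joone.engine.SigmoidLayer" image="sigmoid" />
  <input_layer type="File Input" class="org.joone.io.FileInputSynapse" image="fileinput" />
  <output_layer type="File Output" class="org.joone.io.FileOutputSynapse" image="fileoutput" />
  <teacher_layer type="Teacher" class="org.joone.engine.learning.TeachingSynapse" image="teacher" />
  <separator/>
  <synapse type="Full Synapse" class="org.joone.engine.FullSynapse" label="F" image="fullsynapse" />
</buttons>
```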

9.2  The <options> … </options> Section
This section contains all the parameters that control the behaviour of the GUI editor. Currently it contains the following tag:
<refreshing_rate value="nnn" />
This parameter controls the refresh rate of the fields shown in the control panel during the training. To speed up the training process, set this value to a high value (100 or more).

9.3  Separators
Small spaces can be inserted between toolbar buttons by inserting separator tags at the required position in the layers file thus:
<separator/>


 


9.4  Temporarily Removing Items
Any item can be temporarily removed from the toolbar by enclosing the item in comment tags <!-- and -->. For example, to temporarily remove a separator from the toolbar, edit the separator tag…
<separator/>
…to…
<!--<separator/>-->
…or alternatively…
<!--<separator/-->

 

 



10  Online Resources
10.1  Joone
Home page: http://www.joone.org/
Summary page: http://sourceforge.net/projects/joone/
10.2  Artificial Neural Networks Technology
http://www.dacs.dtic.mil/techs/neural/neural_ToC.html
10.3  Java
http://java.sun.com/
10.4  JAXP
http://java.sun.com/xml
10.5  JHotDraw
http://sourceforge.net/projects/jhotdraw/
10.6  Source Forge
http://www.sourceforge.net
10.7  Sun Microsystems
http://www.sun.com/


 


11  Glossary
11.1  ANN / NN
Artificial Neural Network. Synonymous throughout this document with the term neural network: an interconnected network of simulated neurons.

11.2  Classpath
The environment variable listing the files and directories that Java will search for runtime classes. On a Windows system, this variable is available by right-clicking the My Computer icon on the desktop, selecting the Advanced tab of properties, then selecting Environment Variables. A semicolon should separate different entries in the Classpath.

11.3  GUI
Graphical User Interface.

11.4  Jar File
Similar to a Zip file.

11.5  JAXP
Java API for XML Parsing. Java's set of packages for processing XML files.

11.6   Layer   A collection of neurons that make up a Neural Network. The output of each neuron in a layer is linked to the input of every neuron in connected layers.  

11.7  Neuron
An independent processing unit, similar to a neuron in the brain. A neuron accepts inputs from the outputs of other neurons and produces a result from these.

11.8  Neural Network
Computer models based on the neural structure of the brain that are able to learn from experience.

11.9   PE   Processing Element, i.e. a node (the neuron) constituting a layer.

11.10  Swing
Java's GUI component set. The Joone editor uses these components as the graphical framework of the GUI.


 


11.11  XML
Extensible Markup Language. The Joone editor uses this language to process some parameter files.

11.12  Zip File
A compressed set of files. Java is able to read zipped files at run time, allowing Java programs to be deployed as a small number of files.


 


 

Java Object Oriented Neural Engine

Joone Core Engine
Developer Guide
Paolo Marrone ([email protected])

 


Summary
Revision ................................................. 3
Overview ................................................. 4
Requirements ............................................. 4
Performance ..............................................
The first neural network ................................. 5
A simple but useless neural network ...................... 5
A real implementation: the XOR problem ................... 6
Saving and restoring a neural network .................... 9
The simplest way ......................................... 9
Using a NeuralNet object ................................ 10
Using the outcome of a neural network ................... 12
Writing the results to an output file ................... 12
Getting the results into an array ....................... 13
Using multiple input patterns ........................... 13
Using only one input pattern ............................ 15

http://www.joone.org


 


Revision
Rev.   Date            Author         Comments
0.1.0  April 11, 2002  Paolo Marrone  • Pre-release draft
0.1.5  April 16, 2002  Paolo Marrone  • Added the overview section
                                      • Added the example about the use of the NeuralNet object
                                      • Fixed a bug in the restoreNeuralNet example
0.1.6  May 6, 2002     Paolo Marrone  • Added the chapter about the use of the outcome of a NN
                                      • Fixed some bugs in the example code
0.1.7  June 4, 2002    Paolo Marrone  • Added the examples about the several methods to interrogate a trained NN


 


 

Overview
This manual is a developer guide that explains how to use the Joone core engine API. It illustrates the use of the engine through code examples, as we are confident that this is the best way to learn the mechanisms underlying this powerful library. To better understand the concepts behind Joone, we recommend you read the Technical Overview, available in the Joone sourceforge download area.

Requirements
To try the samples contained in this manual you need the following packages:
- joone-engine.jar - the Joone core engine library
- JDK 1.1.8 or above - any JDK implementation (from Sun or IBM, for instance)

The core engine doesn't need any other library, and can run on any operating system and hardware platform. We're committed to keeping the above two assertions true for all the classes contained in the org.joone.engine package for all future releases of the Joone core engine. Future compatibility with JVM 1.1.x, however, is not guaranteed.

Performance
To improve the performance of the engine, especially in the presence of large training data sets, we recommend using a JVM 1.2 or above with the JIT compiler enabled (higher performance is obtained using Sun's HotSpot™ technology). Because the engine is built on a multi-threaded design, Joone will benefit from a multiprocessor hardware architecture.


 


The first neural network
A simple (but useless) neural network
Consider a feed-forward neural net composed of three layers like this:

To build this net with Joone, three Layer objects and two Synapse objects are required:

SigmoidLayer layer1 = new SigmoidLayer();
SigmoidLayer layer2 = new SigmoidLayer();
SigmoidLayer layer3 = new SigmoidLayer();
FullSynapse synapse1 = new FullSynapse();
FullSynapse synapse2 = new FullSynapse();

The SigmoidLayer objects and the FullSynapse objects are real implementations of the abstract Layer and Synapse objects.
Set the dimensions of the layers:

layer1.setRows(3);
layer2.setRows(3);
layer3.setRows(3);

Then complete the net, connecting the three layers with the synapses:

layer1.addOutputSynapse(synapse1);
layer2.addInputSynapse(synapse1);
layer2.addOutputSynapse(synapse2);
layer3.addInputSynapse(synapse2);

As you can see, each synapse is both the output synapse of one layer and the input synapse of the next layer in the net. This simple net is ready, but it can't do any useful work because there are no components to read or



write the data. The next example shows how to build a real net that can be trained and used for a real problem.

A real implementation: the XOR problem. Suppose a net to teach on the classical XOR problem is required. In this example, the net has to learn the following XOR ‘truth table’: Input 1 Input 2 Output 0 























Firstly, a file containing these values is created: 0.0;0.0;0.0  0.0;0.0;0.0  0.0;1.0;1.0  0.0;1.0;1.0   1.0;0.0;1.0  1.0;0.0;1.0   1.0;1.0;0.0  1.0;1.0;0.0  

Each column must be separated by a semicolon. The decimal point is not mandatory if the numbers are integer.  integer.  Write this file with a text editor and save it on the file system (for instance c:  \   joone \   \ xor.txt xor.txt  in a Windows environment).  environment).  Now build a neural net that has the following three layers:  layers:   • An input layer with 2 neurons, to map the two inputs of the XOR function • A hidden layer with 3 neurons, a good value to assure a fast convergence • An output layer with 1 neuron, to represent the XOR function's output as shown by the following figure:

First, create the three layers (using the sigmoid transfer function):

SigmoidLayer input = new SigmoidLayer();
SigmoidLayer hidden = new SigmoidLayer();
SigmoidLayer output = new SigmoidLayer();

set their dimensions:

input.setRows(2);
hidden.setRows(3);
output.setRows(1);


 


Now build the neural net connecting the layers by creating the two synapses using the FullSynapse class that connects all the neurons on its input with wit h all the neurons on its output (see the above

figure): FullSynapse synapse_IH = new FullSynapse(); /* Input -> Hidden conn. */  */  FullSynapse synapse_HO = new FullSynapse(); /* Hidden -> Output conn. */  */ 

Next connect the input layer with the hidden layer: input.addOutputSynapse(synapse_IH);   hidden.addInputSynapse(synapse_IH);  

and then, the hidden layer with the output layer: hidden.addOutputSynapse(synapse_HO);   output.addInputSynapse(synapse_HO);  

Now create a Monitor object to provide the net with all the parameters needed for it to work: Monitor monitor monitor = new Monitor(); Monitor();    monitor.setLearningRate(0.8);   monitor.setMomentum(0.3);  

Give the layers a reference to that Monitor: input.setMonitor(monitor);   hidden.setMonitor(monitor);   output.setMonitor(monitor);  

The application registers itself as a monitor's listener, so it can receive the notifications of termination from the net. To do this, the application must implement the org.joone.engine.NeuralNetListener interface. monitor.addNeuralNetListener(this);  

Now define an input for the net, then create an org.joone.io.FileInputStream and give it all the parameters: FileInputSynapse inputStream = new FileInputSynapse();  FileInputSynapse();  /* The first two columns contain the input values */  */  inputStream.setFirstCol(1); inputStrea m.setFirstCol(1);    inputStream.setLastCol(2);   /* This is the file that contains the input data */  */   inputStream.setFileName("c: \\joone\\XOR.txt");

Next add the input synapse to the first layer. The input synapse extends the Synapse object, so it can be attached to a layer like a synapse.  synapse.  

input.addInputSynapse(inputStream);  
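For reference, a training file like c:\joone\XOR.txt holds one row per pattern: the two input values followed, in the third column, by the desired XOR output (used below for the teacher). The semicolon delimiter shown here is an assumption about the file layout, not something fixed by the code above:

```
0.0;0.0;0.0
0.0;1.0;1.0
1.0;0.0;1.0
1.0;1.0;0.0
```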

A neural net can learn from examples, so it needs to be provided with the right responses. For each input, the net must be given the difference between the desired response and the actual response produced by the net. The org.joone.engine.learning.TeachingSynapse is the object that performs this task:

 

 

TeachingSynapse trainer = new TeachingSynapse();
/* Set the file containing the desired responses, provided by a FileInputSynapse */
FileInputSynapse samples = new FileInputSynapse();
samples.setFileName("c:\\joone\\XOR.txt");
trainer.setDesired(samples);
/* The output values are on the third column of the file */
samples.setFirstCol(3);
samples.setLastCol(3);
/* We give it the monitor's reference */
trainer.setMonitor(monitor);


The TeacherSynapse object extends the Synapse object, so the trainer can be added as the output of the last layer of the net:

output.addOutputSynapse(trainer);

Now all the layers must be activated by invoking their start method. The layers implement the java.lang.Runnable interface, so each one runs in a separate thread.

input.start();
hidden.start();
output.start();
trainer.start();

Set all the training parameters of the net:

monitor.setPatterns(4); /* # of rows contained in the input file */
monitor.setTotCicles(20000); /* How many times the net must be trained */
monitor.setLearning(true); /* The net must be trained */
monitor.Go(); /* The net starts the training phase */

Here is an example showing how to handle the netStopped and cicleTerminated events. Remember: to be notified, the main application must implement the org.joone.engine.NeuralNetListener interface and must be registered with the Monitor object by calling the Monitor.addNeuralNetListener(this) method.

public void netStopped(NeuralNetEvent e) {
    System.out.println("Training finished");
    System.exit(0);
}

public void cicleTerminated(NeuralNetEvent e) {
    Monitor mon = (Monitor)e.getSource();
    long c = mon.getCurrentCicle();
    long cl = c / 1000;
    /* We want to print the results every 1000 cycles */
    if ((cl * 1000) == c)
        System.out.println(c + " cycles remaining - Error = " + mon.getGlobalError());
}
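The (cl * 1000) == c test above is just an integer-division way of checking that the cycle counter is an exact multiple of 1000, i.e. it behaves like c % 1000 == 0. A standalone sketch (not Joone code) verifying the equivalence:

```java
public class CycleCheck {
    // The guide's integer test: true only when c is an exact multiple of 1000
    static boolean everyThousand(long c) {
        long cl = c / 1000;
        return (cl * 1000) == c;
    }

    public static void main(String[] args) {
        for (long c = 0; c <= 5000; c++) {
            // The divide-then-multiply test is equivalent to a modulo check
            if (everyThousand(c) != (c % 1000 == 0)) {
                throw new AssertionError("mismatch at " + c);
            }
        }
        System.out.println("equivalent for all tested cycles");
    }
}
```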

(The source code can be found in the CVS repository in the org.joone.samples.xor package)


Saving and restoring a neural network
To reuse a neural network built with Joone, we need to save it in a serialized format. To accomplish this goal, all the core elements of the engine implement the Serializable interface, permitting a neural network to be saved to a byte stream, stored on the file system or in a database, or transported to remote machines using any wired or wireless protocol.

The simplest way
A simple way to save a neural network is to serialize each layer using an ObjectOutputStream object, as illustrated in the following example, which extends the XOR java class:

public void saveNeuralNet(String fileName) {
    try {
        FileOutputStream stream = new FileOutputStream(fileName);
        ObjectOutputStream out = new ObjectOutputStream(stream);
        out.writeObject(input);
        out.writeObject(hidden);
        out.writeObject(output);
        out.writeObject(trainer);
        out.close();
    } catch (Exception excp) {
        excp.printStackTrace();
    }
}

We don’t need to explicitly save the synapses constituting the neural network, because they are linked by the layers. The writeObject method recursively saves all the objects contained in the non-transient variables of the serialized class, also avoiding storing the same object instance twice when it is referenced by two separate objects – for instance a synapse connecting two layers.
We can later restore the above neural network using the following code:

public void restoreNeuralNet(String fileName) {
    Layer input = null;
    Layer hidden = null;
    Layer output = null;
    TeachingSynapse trainer = null;
    try {
        FileInputStream stream = new FileInputStream(fileName);
        ObjectInputStream inp = new ObjectInputStream(stream);
        input = (Layer)inp.readObject();
        hidden = (Layer)inp.readObject();
        output = (Layer)inp.readObject();
        trainer = (TeachingSynapse)inp.readObject();
    } catch (Exception excp) {
        excp.printStackTrace();
        return;
    }
    /*
     * After that, we can restore all the internal variables to manage
     * the neural network and, finally, we can run it.
     */
    /* We restore the monitor of the NN.
     * It doesn't matter which layer we use to do this */
    Monitor monitor = input.getMonitor();
    /* The main application registers itself as a NN's listener */


    monitor.addNeuralNetListener(this);
    /* Now we can run the restored net */
    input.start();
    hidden.start();
    output.start();
    trainer.start();
    monitor.Go();
}
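The save/restore mechanism used here is plain Java object serialization, so it can be tried without Joone at all. In this standalone sketch, the Weights class is an illustrative stand-in (not a Joone class) for any Serializable network component, and roundTrip mirrors the saveNeuralNet/restoreNeuralNet pair above:

```java
import java.io.*;

// Illustrative stand-in for a serializable network component (not a Joone class)
class Weights implements Serializable {
    private static final long serialVersionUID = 1L;
    double[] values;
    Weights(double[] values) { this.values = values; }
}

public class SerializationDemo {
    // Write the object to a file and read it back, as saveNeuralNet/restoreNeuralNet do
    static Weights roundTrip(Weights w) throws Exception {
        File file = File.createTempFile("net", ".snet");
        file.deleteOnExit();
        try (ObjectOutputStream out = new ObjectOutputStream(new FileOutputStream(file))) {
            out.writeObject(w);
        }
        try (ObjectInputStream in = new ObjectInputStream(new FileInputStream(file))) {
            return (Weights) in.readObject();
        }
    }

    public static void main(String[] args) throws Exception {
        Weights restored = roundTrip(new Weights(new double[]{0.8, 0.3}));
        System.out.println(restored.values[0]); // prints 0.8
    }
}
```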

The method illustrated in this chapter is very simple and works well, but it's not flexible enough, because we have to write a different piece of code for each saved neural network, as the number and order of the saved layers are hard-coded in the program.
We now consider a quicker and more flexible method to save and restore a neural network.

Using a NeuralNet object
The org.joone.net.NeuralNet object comes to our aid by offering a simple but powerful mechanism to manage a neural network built with Joone.
We will now try to rewrite the XOR sample using this new component. In any case we must create all the necessary components of the neural network, repeating all the instructions already written for the previous example:

/* The Layers */
SigmoidLayer input = new SigmoidLayer();
SigmoidLayer hidden = new SigmoidLayer();
SigmoidLayer output = new SigmoidLayer();
input.setRows(2);
hidden.setRows(3);
output.setRows(1);
/* The Synapses */
FullSynapse synapse_IH = new FullSynapse(); /* Input -> Hidden conn. */
FullSynapse synapse_HO = new FullSynapse(); /* Hidden -> Output conn. */
input.addOutputSynapse(synapse_IH);
hidden.addInputSynapse(synapse_IH);
hidden.addOutputSynapse(synapse_HO);
output.addInputSynapse(synapse_HO);
/* The I/O components */
FileInputSynapse inputStream = new FileInputSynapse();
inputStream.setFirstCol(1);
inputStream.setLastCol(2);
inputStream.setFileName("c:\\joone\\XOR.txt");
input.addInputSynapse(inputStream);
/* The Trainer and its desired file */
TeachingSynapse trainer = new TeachingSynapse();
FileInputSynapse samples = new FileInputSynapse();
samples.setFileName("c:\\joone\\XOR.txt");
trainer.setDesired(samples);
samples.setFirstCol(3);
samples.setLastCol(3);
output.addOutputSynapse(trainer);

Now we add this structure to a NeuralNet object:

NeuralNet nnet = new NeuralNet();
nnet.addLayer(input, NeuralNet.INPUT_LAYER);


nnet.addLayer(hidden, NeuralNet.HIDDEN_LAYER);
nnet.addLayer(output, NeuralNet.OUTPUT_LAYER);
nnet.addTeacher(trainer);

and we use the Monitor object contained in it instead of creating a new one:

Monitor monitor = nnet.getMonitor();
monitor.setLearningRate(0.8);
monitor.setMomentum(0.3);
monitor.setPatterns(4); /* # of rows contained in the input file */
monitor.setTotCicles(20000); /* How many times the net must be trained */
monitor.setLearning(true); /* The net must be trained */
monitor.addNeuralNetListener(this);

and now we can run the neural network by simply writing:

nnet.start();
nnet.getMonitor().Go();

Where are the differences?
1. We no longer need to set the Monitor object for each component, as the NeuralNet does this task for us;
2. We don't need to invoke the start method on all the layers, but only on the NeuralNet object.
But the main support provided by the NeuralNet object is the ability to easily store and read a neural network with a few generalized lines of code:

public void saveNeuralNet(String fileName) {
    try {
        FileOutputStream stream = new FileOutputStream(fileName);
        ObjectOutputStream out = new ObjectOutputStream(stream);
        out.writeObject(nnet);
        out.close();
    } catch (Exception excp) {
        excp.printStackTrace();
    }
}

public void restoreNeuralNet(String fileName) {
    try {
        FileInputStream stream = new FileInputStream(fileName);
        ObjectInputStream inp = new ObjectInputStream(stream);
        nnet = (NeuralNet)inp.readObject();
    } catch (Exception excp) {
        excp.printStackTrace();
        return;
    }
    /*
     * After that, we can restore all the internal variables to manage
     * the neural network and, finally, we can run it.
     */
    /* The main application registers itself as a NN's listener */

    nnet.getMonitor().addNeuralNetListener(this);
    /* Now we can run the restored net */
    nnet.start();
    nnet.getMonitor().Go();
}


Using the outcome of a neural network
After having learned how to train and save/restore a neural network, we will see how to use the resulting patterns from a trained neural network.
To do this, we must use an object inherited from the OutputStreamSynapse class, so that we can manage all the output patterns of a neural network for both of the following cases:
1. User's needs: to permit a user to read the results of a neural network, we must be able to write them to a file in some useful format, for instance ASCII.
2. Application's needs: to permit an embedding application to read the results of a neural network, we must be able to write them to a memory buffer – a 2D array of type double, for instance – and to read them automatically at the end of the elaboration.
Note: The examples shown in the following two chapters use the serialized form of the XOR neural network. To obtain that file, you must first create the XOR neural network with the editor, as illustrated in the GUI Editor User Guide, and export it using the File->Export menu item.

Writing the results to an output file
The first example shows how to write the results of a neural network to an ASCII file, so a user can read and use them in practice.
To do this, we will use a FileOutputSynapse object, attaching it as the output of the last layer of the neural network. Assume that we have saved the XOR neural net from the previous example in a serialized form named 'xor.snet', so we can use it by simply loading it from the file system and attaching the output synapse to its last layer.
First of all, we write the code necessary to read a serialized NeuralNet object from an external application:

NeuralNet restoreNeuralNet(String fileName) {
    NeuralNet nnet = null;
    try {
        FileInputStream stream = new FileInputStream(fileName);
        ObjectInputStream inp = new ObjectInputStream(stream);
        nnet = (NeuralNet)inp.readObject();
    } catch (Exception excp) {
        excp.printStackTrace();
    }
    return nnet;
}

then we write the code to use the restored neural network:

NeuralNet xorNNet = this.restoreNeuralNet("/somepath/xor.snet");
if (xorNNet != null) {
    Vector layers = xorNNet.getLayers();
    // we get the third layer
    Layer output = (Layer)layers.elementAt(2);
    // we create an output synapse
    FileOutputSynapse fileOutput = new FileOutputSynapse();
    fileOutput.setFileName("/somepath/xor_out.txt");
    // we attach the output synapse to the last layer of the NN
    output.addOutputSynapse(fileOutput);
    // we run the neural network for only one cycle in recall mode
    xorNNet.getMonitor().setTotCicles(1);


    xorNNet.getMonitor().setLearning(false);
    xorNNet.start();
    xorNNet.getMonitor().Go();
}

After the above execution, we can print out the obtained file and, if the net is correctly trained, we will see content like this:

0.016968769233825207
0.9798790621933134
0.9797402885436198
0.024205151360285334

This demonstrates the correctness of the previous training cycles.
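Since the sigmoid output layer produces values in (0, 1), rounding each result against a 0.5 threshold recovers the XOR truth table (0, 1, 1, 0). A small standalone check of the values listed above (the threshold helper is illustrative, not part of Joone):

```java
public class XorOutputCheck {
    // Map a sigmoid network output to the nearest binary class
    static int threshold(double output) {
        return output < 0.5 ? 0 : 1;
    }

    public static void main(String[] args) {
        // The four outputs printed by the trained XOR net above
        double[] results = {
            0.016968769233825207,
            0.9798790621933134,
            0.9797402885436198,
            0.024205151360285334
        };
        for (double r : results) {
            System.out.println(threshold(r)); // prints 0, 1, 1, 0 on separate lines
        }
    }
}
```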

Getting the results into an array
We will now see the use of a neural network from an embedding application that needs to use its results. The obvious approach in this case is to obtain the result of the recall phase as an array of doubles, so the external application can use it as needed.
We will see two usages of a trained neural network:
1. Testing the net using a set of predefined patterns; in this case we want to interrogate the net with several patterns, all collected before querying the net
2. Testing the net using only one input pattern; in this case we need to interrogate the net with a pattern provided by an external asynchronous source of data
We will see an example of both of the above methods.

Using multiple input patterns
To accomplish this goal we will use the org.joone.io.MemoryOutputSynapse object, as illustrated in the following code:

// The input array used for this example

private double[][] inputArray = { {0, 0}, {0, 1}, {1, 0}, {1, 1} };

private void Go(String fileName) {
    // We load the serialized XOR neural net
    NeuralNet xor = restoreNeuralNet(fileName);
    if (xor != null) {
        Vector layers = xor.getLayers();
        /* We get the first layer of the net (the input layer),
           then remove all the input synapses attached to it
           and attach a MemoryInputSynapse */
        Layer input = (Layer)layers.elementAt(0);
        input.removeAllInputs();
        MemoryInputSynapse memInp = new MemoryInputSynapse();
        memInp.setFirstRow(1);
        memInp.setFirstCol(1);
        memInp.setLastCol(2);
        input.addInputSynapse(memInp);
        memInp.setInputArray(inputArray);


        /* We get the last layer of the net (the output layer),
           then remove all the output synapses attached to it
           and attach a MemoryOutputSynapse */
        Layer output = (Layer)layers.elementAt(2);
        // Remove all the output synapses attached to it...
        output.removeAllOutputs();
        // ...and attach a MemoryOutputSynapse
        MemoryOutputSynapse memOut = new MemoryOutputSynapse();
        output.addOutputSynapse(memOut);
        // Now we interrogate the net
        xor.getMonitor().setTotCicles(1);
        xor.getMonitor().setPatterns(4);
        xor.getMonitor().setLearning(false);
        xor.start();
        xor.getMonitor().Go();
        for (int i = 0; i < 4; ++i) {
            // Read the next pattern and print it out
            double[] pattern = memOut.getNextPattern();
            System.out.println("Output Pattern #" + (i+1) + " = " + pattern[0]);
        }
        xor.stop();
        System.out.println("Finished");

    }
}

As illustrated in the above code, we load the serialized neural net (using the same restoreNeuralNet method used in the previous chapter), and then we attach a MemoryInputSynapse to its input layer and a MemoryOutputSynapse to its output layer.
Before that, we removed all the I/O components of the neural network, so as not to depend on the I/O components used in the editor to train the net.
This is a valid example of how to dynamically modify a serialized neural network to use it in a different environment from the one used for its design and training.
To provide the neural network with the input patterns, we call the MemoryInputSynapse.setInputArray method, passing a predefined 2D array of doubles.

To get the resulting patterns from the recall phase we call the MemoryOutputSynapse.getNextPattern method; this synchronized method waits for the next output pattern from the net, returning an array of doubles containing the response of the neural network. This call is made for each input pattern provided to the net.
The above code must be written in the embedding application; to simulate this situation, we can call it from a main() method:

public static void main(String[] args) {
    if (args.length < 1) {
        System.out.println("Usage: EmbeddedXOR XOR.snet");
    } else {
        EmbeddedXOR xor = new EmbeddedXOR();
        xor.Go(args[0]);
    }
}

The complete source code of this example is contained in the EmbeddedXOR.java file in the org.joone.samples.xor package.


Using only one input pattern
We will now see how to interrogate the net using only one input pattern.
We will show only the differences with respect to the previous example:

private void Go(String fileName) {
    // We load the serialized XOR neural net
    NeuralNet xor = restoreNeuralNet(fileName);
    if (xor != null) {
        Vector layers = xor.getLayers();
        /* We get the first layer of the net (the input layer),
           then remove all the input synapses attached to it
           and attach a DirectSynapse */
        Layer input = (Layer)layers.elementAt(0);
        input.removeAllInputs();
        DirectSynapse memInp = new DirectSynapse();
        input.addInputSynapse(memInp);
        ...

As you can see, we now use a DirectSynapse as input instead of the MemoryInputSynapse object.
What are the differences?
1. The DirectSynapse object is not an I/O component, as it doesn't inherit from the StreamInputSynapse class
2. Consequently, it doesn't call the Monitor.nextStep method, so the neural network is no longer controlled by the Monitor's parameters (see the Technical Overview to better understand these concepts). Now the embedding application is responsible for the control of

the neural network (it must know when to start and stop it), while during the training phase the start and stop actions were determined by the parameters of the Monitor object, that process being unsupervised (remember that a neural network can be trained on remote machines without a central control).
3. For the same reasons, we don't need to call the Monitor.Go method, nor to set its 'TotCycles' and 'Patterns' parameters.
Thus, to interrogate the net we can just write, after having invoked the NeuralNet.start method:

for (int i = 0; i < 4; ++i) {
    // Prepare the next input pattern
    Pattern iPattern = new Pattern(inputArray[i]);
    iPattern.setCount(1);
    // Interrogate the net
    memInp.fwdPut(iPattern);
    // Read the output pattern and print it out
    double[] pattern = memOut.getNextPattern();
    System.out.println("Output Pattern #" + (i+1) + " = " + pattern[0]);
}

In the above code we give the net only one pattern for each query, using the DirectSynapse.fwdPut method (note that this method accepts a Pattern object). As in the previous example, to retrieve the output pattern we call the MemoryOutputSynapse.getNextPattern method.
The complete source code of this example is contained in the ImmediateEmbeddedXOR.java file in the org.joone.samples.xor package.



 

Java Object Oriented Neural Engine

Joone Core Engine - Technical Overview

Paolo Marrone ([email protected])

 


Summary

Revision
Introduction
Overview
The Architecture
The Core Engine
  The Layer
    The Recall Phase
    The Learning Phase
    Connecting a Synapse to a Layer
  The Synapse
  The Pattern
  The Matrix
  The Monitor
    The NN Parameters
    The NN control
    Managing the events
  I/O components
    The StreamInputSynapse
    The StreamOutputSynapse
  The Supervised Learning components
    The TeacherSynapse
    The TeachingSynapse
  Using the Neural Network as a Whole
    The NeuralNet


Revision

Revision  Date            Author                        Comments
0.1.0     March 6, 2002   Paolo Marrone, Harry Glasgow  - Pre-release draft
0.2.0     March 7, 2002   Paolo Marrone                 - Updated the I/O components chapter to reflect the new object model
                                                        - Added the Supervised Learning components section
0.3.0     March 26, 2002  Paolo Marrone                 - Added the NeuralNet description section
0.3.5     April 14, 2002  Paolo Marrone                 - Split the document into two papers named 'Technical Overview' and 'Developer Guide'
                                                        - Added the introduction
0.3.6     April 16, 2002  Paolo Marrone                 - Added the description of the mechanism to manage the Monitor's events
                                                        - Added the schema to better understand the input model
0.3.7     May 8, 2002     Paolo Marrone                 - Expanded the introduction

 


I would like to present the objectives I had in mind when I started to write the first lines of code of Joone in early 1996.
My dream was (and still is) to create a framework to implement a new approach to the use of neural networks. I felt this necessity because the biggest (and so far unresolved) problem is to find the fittest network for a given problem without falling into local minima, thus finding the best architecture.

Okay - you'll say 'this is what we can do simply by training some randomly initialised neural network (NN) with a supervised or unsupervised algorithm'.
Yes, it's true, but this is just scholastic theory, because training only one neural network, especially for hard real-life problems, is not enough.
Finding the best neural network is a really hard task, because we need to determine many parameters of the net such as the number of layers, how many neurons for each layer, the transfer function, the value of the learning rate, the momentum, etc., often causing frustrating failures.
The basic idea is to have an environment to easily train many neural networks in parallel, initialised with different weights, parameters or architectures, so the user can find the best NN simply by selecting the fittest neural network after the training process.
Not only that: this process can continue, retraining the selected NNs until some final parameter is reached (e.g. a low RMSE value), like a distillation process. The best architecture is discovered by Joone, not by the user!
Many programs exist today that permit selection of the fittest neural network by applying a genetic algorithm. I want to go beyond this, because my goal is to build a flexible environment programmable by the end user, so any existing or newly discovered global optimisation algorithm can be implemented.
This is why Joone has its own distributed training environment and why it is based on a multithreaded engine.
My dreams aren't finished, because another one was to make a trained NN easily usable and distributable by the end user. For example, I'm imagining an insurance company that continuously trains many neural networks on risk evaluation, distributing the best 'distilled' resulting network to its sales force, so they can use it on their mobile devices.
This is why Joone is serializable and remotely transportable using any wired or wireless protocol, and is easily runnable using a simple, small and generalized program. Moreover, my dream can become a more solid reality thanks to the advent of handheld devices like mobile phones and PDAs with a Java virtual machine inside. Joone is ready to run on them, too. Hoping you'll find Joone interesting and useful, I thank you for your interest in it.

Paolo Marrone

http://www.joone.org


 

Joone Core Engine

Technical Overview 

Introduction

This paper describes the technical concepts underlying the core engine of Joone, explaining in detail the architectural design that is at its foundation. This guide is intended to provide the developer - or anyone interested in using Joone - with knowledge of the basic mechanisms of the core engine, so that anyone can understand how to use it and expand it to resolve their own needs. To see some examples of how to use the engine from within a Java program to embed Joone in various applications, please read the Developer Guide, downloadable from Joone's sourceforge download area.


Overview

Each neural network (NN) is composed of a number of components (layers) connected together by connections (synapses). Depending on how these components are connected, several neural network architectures can be created (feed forward NN, recurrent NN, etc). This document deals with feed forward neural networks (FFNN) for simplicity's sake, but it is possible to build whatever neural network architecture is required with Joone. A FFNN is composed of a number of consecutive layers, each one connected to the next by a synapse. Recurrent connections from a layer to a previous one are not permitted. Consider the following figure:

[Figure: two layers connected by a synapse]

This is a sample FFNN with two layers and one synapse. Each layer is composed of a certain number of neurons, each of which has the same characteristics (transfer function, learning rate, etc). A neural net can be composed of several layers of different kinds. Each layer processes its input signal by applying a transfer function and sending the resulting pattern to the synapses that connect it to the next layer. In this way a neural network can process an input pattern, transferring it from its input layer to the output layer.

The Architecture

To ensure that it is possible to build whatever neural network architecture is required with Joone, a method to transfer the patterns through the net is required without the need for a central point of control. To accomplish this goal, each layer of Joone is implemented as a Runnable object, so each layer runs independently from the other layers (getting the input pattern, applying the transfer function to it and putting the resulting pattern on the output synapses so that the next layers can receive it, process it and so on), as depicted by the following basic scheme:

[Figure: a neuron N with inputs I1 … Ip, weights wN1 … wNP, weighted sum XN, transfer function f(X) and output YN]

Where for each neuron N:

XN = (I1 * wN1) + … + (IP * wNP)
f(X) = the transfer function (depending on the kind of layer)
YN = f(XN)
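The weighted-sum-and-transfer computation above can be sketched in plain Java. This is a standalone illustration, not Joone's actual Layer code; the class and method names are hypothetical, and the sigmoid is used here only as one example of a transfer function f(X):

```java
public class NeuronDemo {
    // XN = (I1 * wN1) + ... + (IP * wNP)
    static double weightedSum(double[] inputs, double[] weights) {
        double x = 0.0;
        for (int i = 0; i < inputs.length; i++)
            x += inputs[i] * weights[i];
        return x;
    }

    // f(X): an example transfer function (the sigmoid)
    static double sigmoid(double x) {
        return 1.0 / (1.0 + Math.exp(-x));
    }

    // YN = f(XN)
    static double neuronOutput(double[] inputs, double[] weights) {
        return sigmoid(weightedSum(inputs, weights));
    }

    public static void main(String[] args) {
        double[] inputs = {1.0, 0.5};
        double[] weights = {0.4, -0.6};
        // XN = 1.0*0.4 + 0.5*(-0.6) = 0.1, YN = sigmoid(0.1)
        System.out.println(neuronOutput(inputs, weights));
    }
}
```

A concrete Joone layer (such as the SigmoidLayer shown later in this document) performs the same computation inside its forward method, over all the neurons of the layer at once.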

Complex neural network architectures can be easily built, either linear or recursive, because there is no necessity for a global controller of the net. Look at the following figure (the arrows represent the synapses):

[Figure: a modular network where an input layer (Layer 1) feeds several hidden layers (Layers 2, 3 and 4), which converge on an output layer (Layer 5)]

In this manner any form of modular neural network can be built. Modular neural networks are a more generalized kind of multi-layered NN that process their input using several parallel sub-NNs. They recombine the results from each module, tending to create structures within the topology which can promote specialization of function in each sub-module. From this point of view, a standard multi-layered neural network is just the simplest kind of modular neural network.

 


Joone allows this kind of net to be built through its modular architecture, like a LEGO bricks system!

To build a neural network, simply connect each layer to another as required using a synapse, and the net will run without problems. Each layer (running in its own thread) will read its input, apply the transfer function, and write the result to its output synapses, to which other layers are connected running on separate threads, and so on. This transport mechanism is also used to bring the error from the output layers to the input layers during the training phases, allowing the weights and biases to be changed according to the chosen learning algorithm (for example the backprop algorithm).
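The thread-per-layer transport just described can be illustrated with a self-contained sketch. All the names here are hypothetical: a java.util.concurrent.BlockingQueue stands in for Joone's synchronized Synapse, and two plain threads stand in for two layers, one transforming the pattern and the other consuming it:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class PipelineDemo {
    // A one-slot blocking queue plays the role of the synapse between two layers
    static BlockingQueue<double[]> synapse = new ArrayBlockingQueue<>(1);
    static BlockingQueue<double[]> result = new ArrayBlockingQueue<>(1);

    static double run() throws InterruptedException {
        // "Input layer": applies a trivial transfer (doubling each value)
        // and puts the resulting pattern on its output synapse
        Thread layer1 = new Thread(() -> {
            try {
                double[] p = {1.0, 2.0, 3.0};
                for (int i = 0; i < p.length; i++) p[i] *= 2;
                synapse.put(p);
            } catch (InterruptedException ignored) { }
        });
        // "Output layer": takes the pattern from its input synapse and sums it
        Thread layer2 = new Thread(() -> {
            try {
                double[] p = synapse.take();
                double sum = 0;
                for (double v : p) sum += v;
                result.put(new double[]{ sum });
            } catch (InterruptedException ignored) { }
        });
        layer1.start();
        layer2.start();
        layer1.join();
        layer2.join();
        return result.take()[0];
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(run()); // prints 12.0
    }
}
```

Neither thread knows about the other; each only reads from and writes to its queues, exactly the decoupling that lets Joone assemble arbitrary topologies without a central controller.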

To accomplish this, each layer has two opposing transport mechanisms, one from the input to the output to transfer the input pattern during the recall phase, and another from the output to the input to transfer the learning error during the training phase, as depicted in the following figure:  

 

 


[Figure: two layers connected by a synapse; the input signal moves through a forward transfer from layer to layer, while the error moves through a backward transfer in the opposite direction, adjusting the weights of the synapse]

Each Joone component (both layers and synapses) has its own pre-built mechanisms to adjust the weights and biases according to the chosen learning algorithm. By this means:

•  The engine is flexible: you can build any architecture you want simply by connecting each layer to another with a synapse, without being concerned about the architecture. Each layer will run independently, processing the signal on its input and writing the results to its output, where the connected synapses will transfer the signal to the next layers, and so on.

•  The engine is scalable: if you need more computation power, simply add more CPUs to the system. Each layer, running on a separate thread, can be processed by a different CPU, enhancing the speed of the computation.

•  The engine closely mirrors reality: conceptually, the net is not far from a real system (the brain), where each neuron works independently from the others without a global control system.

Having seen how each component is implemented in Joone, the following sections look at both the object model and the implementation code.


The Core Engine

The core engine of Joone is composed of a small number of interfaces and abstract classes forming a nucleus of objects that implement the basic behaviours of a neural network illustrated in the previous chapter. The following UML class diagram contains the main objects constituting the model of the core engine of Joone:

To simplify the model, only the relevant properties and methods are shown for each object. As depicted, all the objects implement the java.io.Serializable interface, so each neural network built with Joone can be saved as a byte stream to be stored in a file system or database, or be transported to other machines to be used remotely. The two main components are represented by two abstract classes (both contained in the org.joone.engine package): the Layer and the Synapse objects.


The Layer

The Layer object is the basic element that forms the neural net. It is composed of neurons, all having the same characteristics. This component transfers the input pattern to the output pattern by executing a transfer function. The output pattern is sent to a vector of Synapse objects attached to the layer's output. It is the active element of a neural net in Joone; in fact it runs in a separate thread (it implements the java.lang.Runnable interface) so that it can run independently from the other layers in the neural net. Its heart is represented by the run method:

public void run() {
    while (running) {
        int dimI = getRows();
        int dimO = getDimension();
        // Recall phase
        inps = new double[dimI];
        this.fireFwdGet();
        if (m_pattern != null) {
            forward(inps);
            m_pattern.setArray(outs);
            fireFwdPut(m_pattern);
        }
        if (step != -1) // Checks if the next step is a learning step
            m_learning = monitor.isLearningCicle(step);
        else // Stops the net
            running = false;
        // Learning phase
        if ((m_learning) && (running)) {
            gradientInps = new double[dimO];
            this.fireRevGet();
            backward(gradientInps);
            m_pattern = new Pattern(gradientOuts);
            m_pattern.setCount(step);
            fireRevPut(m_pattern);
        }
    } // END while (running)
    myThread = null;
}

The end of the cycle is controlled by the running variable, so the code loops until some ending event occurs. The two main sections of the code, the recall phase and the learning phase, are described below.

The Recall Phase 

 

 

The code in the first block reads all the input patterns from the input synapses (fireFwdGet), where each input pattern is added to the others to produce the inps vector of doubles. It then calls the forward method, which is an abstract method of the Layer object. In the forward method the inherited classes must implement the required formulas of the transfer function, reading the input values from the inps vector and returning the result in the outs vector of doubles. By using this mechanism, based on the Template pattern, new kinds of layer can easily be built by extending the Layer object. After this, the code calls the fireFwdPut method to write the calculated pattern to the output synapses, from which subsequent layers can process the results in the same manner.
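The Template pattern extension point described here - a fixed transport skeleton around an abstract forward step - can be reduced to a minimal sketch. The class names below are hypothetical and not Joone's actual hierarchy; the process method stands in for the fixed part of the Layer's cycle:

```java
// The abstract "layer": the transport logic is fixed, the transfer step is abstract
abstract class DemoLayer {
    protected double[] inps;
    protected double[] outs;

    // The template method: subclasses never override this
    public final double[] process(double[] pattern) {
        inps = pattern;                   // stands in for reading via fireFwdGet()
        outs = new double[inps.length];
        forward(inps);                    // the step subclasses must implement
        return outs;                      // stands in for writing via fireFwdPut()
    }

    // Subclasses implement the transfer function here
    protected abstract void forward(double[] pattern);
}

// A concrete layer supplying a trivial f(x) = 2x transfer function
class DemoLinearLayer extends DemoLayer {
    @Override
    protected void forward(double[] pattern) {
        for (int i = 0; i < pattern.length; i++)
            outs[i] = 2.0 * pattern[i];
    }
}

public class TemplateDemo {
    public static void main(String[] args) {
        DemoLayer layer = new DemoLinearLayer();
        double[] out = layer.process(new double[]{1.0, 2.0});
        System.out.println(out[0] + " " + out[1]); // prints 2.0 4.0
    }
}
```

This is the same contract Joone imposes: a new layer type only supplies forward (and backward), while the threading and pattern transport are inherited unchanged.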


In simpler terms, the Layer object behaves like a pump that decants the liquid (the pattern) from one recipient (the synapse) to another.

The Learning Phase

After the recall phase, if the neural net is in a training cycle, the code calls the fireRevGet method to read the error obtained on the last pattern from the output synapses, then calls the abstract backward method where, as in the forward method, the inherited classes must implement the processing of the error to modify the biases of the neurons constituting the layer. The code does this task by reading the error pattern from the gradientInps vector and writing the result to the gradientOuts vector. After this, the code writes the error pattern contained in the gradientOuts vector to the input synapses (fireRevPut), from which other layers can subsequently process the back-propagated error signal. In summary, the Layer object alternately 'pumps' the input signal from the input synapses to the output synapses, and the error pattern from the output synapses to the input synapses, as depicted in the following figure (the numbers indicate the sequence of execution):

[Figure: 1 - the Layer reads the input signal from the Input Synapse (fwdGet); 2 - it applies forward() and writes the result to the Output Synapse (fwdPut); 3 - it reads the error from the Output Synapse (revGet); 4 - it applies backward() and writes the error back to the Input Synapse (revPut)]

Connecting a Synapse to a Layer

To connect a synapse to a layer, the program must call the Layer.addInputSynapse method for an input synapse, or the Layer.addOutputSynapse method for an output synapse. These two methods, inherited from the NeuralLayer interface, are implemented in the Layer object as follows:

/** Adds a new input synapse to the layer
 * @param newListener neural.engine.InputPatternListener
 */
public synchronized void addInputSynapse(InputPatternListener newListener) {
    if (aInputPatternListener == null) {
        aInputPatternListener = new java.util.Vector();
    }
    aInputPatternListener.addElement(newListener);
    if (newListener.getMonitor() == null)
        newListener.setMonitor(getMonitor());
    this.setInputDimension(newListener);
    notifyAll();
}


The Layer object has two vectors containing the list of the input synapses and the list of the output synapses connected to it. In the fireFwdGet and fireRevPut methods the Layer scans the input vector and, for each input synapse found, it calls the fwdGet and revPut methods respectively (implemented by the input synapse from the InputPatternListener interface). Look at the following code, which implements the fireFwdGet method:

/**
 * Calls all the fwdGet methods on the input synapses to get the input patterns
 */
protected synchronized void fireFwdGet() {
    double[] patt;
    int currentSize = aInputPatternListener.size();
    InputPatternListener tempListener = null;
    for (int index = 0; index < currentSize; index++) {
        tempListener = (InputPatternListener) aInputPatternListener.elementAt(index);
        if (tempListener != null) {
            m_pattern = tempListener.fwdGet();
            if (m_pattern != null) {
                patt = m_pattern.getArray();
                if (patt.length != inps.length)
                    inps = new double[patt.length];
                sumInput(patt);
                step = m_pattern.getCount();
            }
        }
    }
}

The heart of this method is the loop that scans the vector of input synapses. The same mechanism exists for the fireFwdPut and fireRevGet methods applied to the vector of output synapses implementing the OutputPatternListener interface. This mechanism is derived from the Observer pattern, where the Layer is the Subject and the Synapse is the Observer. Using these two vectors, it is possible to connect many synapses (both input and output) to a Layer, permitting complex neural net architectures to be built.
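The Subject/Observer wiring described here can be reduced to a minimal self-contained sketch. The names are hypothetical, chosen to mirror how the Layer scans its listener vector and sums the patterns returned by each registered input synapse:

```java
import java.util.Vector;

public class ObserverDemo {
    // Mirrors the role of InputPatternListener.fwdGet()
    interface PatternListener {
        double[] fwdGet();
    }

    // The "subject" keeps a vector of registered observers, like the Layer does
    static Vector<PatternListener> listeners = new Vector<>();

    // Scans the listener vector and sums all the returned patterns into one
    static double[] fireFwdGet(int size) {
        double[] inps = new double[size];
        for (PatternListener l : listeners) {
            double[] patt = l.fwdGet();
            for (int i = 0; i < size; i++)
                inps[i] += patt[i];   // each input pattern is added to the others
        }
        return inps;
    }

    public static void main(String[] args) {
        // Two "input synapses" registered on the same subject
        listeners.add(() -> new double[]{1.0, 2.0});
        listeners.add(() -> new double[]{0.5, 0.5});
        double[] r = fireFwdGet(2);
        System.out.println(r[0] + " " + r[1]); // prints 1.5 2.5
    }
}
```

Because the subject only knows the listener interface, any number of observers of any concrete type can be attached or detached without changing the subject's code.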

The Synapse

The Synapse object represents the connection between two layers, permitting a pattern to be passed from one layer to another. The Synapse is also the 'memory' of a neural network. During the training process the weights of the synapse (contained in the Matrix object) are modified according to the implemented learning algorithm. As described above, a synapse is both the output synapse of a layer and the input synapse of the next connected layer in the NN. To do this, the synapse object implements the InputPatternListener and the OutputPatternListener interfaces. These interfaces contain respectively the described methods fwdGet, revPut, fwdPut and revGet. The following code describes how they are implemented in the Synapse object:

public synchronized void fwdPut(Pattern pattern) {
    if (isEnabled()) {
        count = pattern.getCount();
        if ((count > ignoreBefore) || (count == -1)) {
            while (items > 0) {
                try {
                    wait();
                } catch (InterruptedException e) {
                    return;
                }
            }
            m_pattern = pattern;
            inps = (double[]) pattern.getArray();
            forward(inps);
            ++items;
            notifyAll();
        }
    }
}

public synchronized Pattern fwdGet() {
    if (!isEnabled())
        return null;
    while (items == 0) {
        try {
            wait();
        } catch (InterruptedException e) {
            return null;
        }
    }
    --items;
    notifyAll();
    m_pattern.setArray(outs);
    return m_pattern;
}

The Synapse is a shared resource of two Layers that, as already mentioned, run on two separate threads. To avoid a layer trying to read the pattern from its input synapse before the other layer has written it, the shared synapse is synchronized. Looking at the code, the variable called 'items' represents the semaphore of this synchronization mechanism. After the first Layer calls the fwdPut method, the items variable is incremented to indicate that the synapse is 'full'. Conversely, after the subsequent Layer calls the fwdGet method, this variable is decremented, indicating that the synapse is 'empty'. Both the above methods check the 'items' variable when they are invoked. If a layer tries to call the fwdPut method when items is greater than zero, its thread falls into the wait state, because the synapse is already full. In the fwdGet method, if a Layer tries to get a pattern when items is equal to zero (meaning that the synapse does not contain a pattern) then its corresponding thread falls into the wait state. The notifyAll call present at the end of the two methods permits the 'awakening' of the other waiting layer, signalling that the synapse is ready to be read or written. After the notifyAll, at the end of the method, the running thread releases the lock on the object, permitting another waiting thread to take ownership. Note that although all waiting threads are notified by notifyAll, only one will acquire the lock and the other threads will return to the wait state. The synchronizing mechanism is the same in the corresponding revGet and revPut methods used during the training phase of the neural network. The fwdPut method calls the abstract forward method (just as revPut calls the abstract backward method) to permit the inherited classes to implement the recall and the learning formulas respectively, as already described for the Layer object (according to the Template pattern).
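The 'items' semaphore described above is essentially a one-slot producer/consumer buffer. The following self-contained sketch (a hypothetical class, not Joone's Synapse) shows the same wait/notifyAll discipline in isolation:

```java
public class SlotDemo {
    private double[] pattern;
    private int items = 0;  // 0 = empty, 1 = full

    // Producer side, like fwdPut: blocks while the slot is already full
    public synchronized void put(double[] p) throws InterruptedException {
        while (items > 0)
            wait();          // slot full: wait until the consumer empties it
        pattern = p;
        ++items;
        notifyAll();         // wake a reader waiting in get()
    }

    // Consumer side, like fwdGet: blocks while the slot is empty
    public synchronized double[] get() throws InterruptedException {
        while (items == 0)
            wait();          // slot empty: wait until the producer fills it
        --items;
        notifyAll();         // wake a writer waiting in put()
        return pattern;
    }

    public static void main(String[] args) throws InterruptedException {
        SlotDemo slot = new SlotDemo();
        Thread writer = new Thread(() -> {
            try { slot.put(new double[]{3.14}); } catch (InterruptedException ignored) { }
        });
        writer.start();
        System.out.println(slot.get()[0]); // prints 3.14
        writer.join();
    }
}
```

Note the loop around each wait(): testing the condition again after waking up is the standard guarded-block idiom, since notifyAll wakes every waiter but only one will find the condition still true.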
By writing the appropriate code in these two methods, the engine can be extended with new synapses and layers implementing whatever learning algorithm and architecture is required.


The Pattern  


The Pattern object is the container of the data used to interrogate or train a neural network. It is composed of two parameters: an array of doubles to contain the values of the transported pattern, and an integer to contain the sequence number of that pattern. The dimensions of the array are set according to the dimensions of the transported pattern. The Pattern object is also used to 'stop' all the Layers in the neural network. When its 'count' parameter contains the value –1, all the layers that receive that pattern will exit from their 'running' state and stop (the way to stop a thread in Java is to exit from its 'run' method). Using this simple mechanism the threads within which the Layer objects run can easily be controlled. The Pattern object is also cloneable, permitting a duplicate of a pattern to be passed to any layer during its transfer from the first to the last layer within a neural network.

The Matrix

The Matrix object simply contains a matrix of doubles to store the values of the weights of the connections and the biases. An instance of a Matrix object is contained within both the Synapse and Layer objects. Each element of a matrix contains two values: the actual value of the represented weight, and the corresponding delta value. The delta value is the difference between the actual value and the value of the previous cycle. The delta value is useful during the learning phase, permitting the application of momentum to quickly find the best minimum of the error surface. The momentum algorithm adds the previous variation to the newly calculated weight value. See the literature for more information about the algorithm.
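The value/delta pair and the momentum rule described here amount to the following update, sketched for a single weight cell (the learning-rate and momentum values are arbitrary example numbers, not Joone defaults):

```java
public class MomentumDemo {
    // One weight cell: the current value and the previous cycle's variation
    static double value = 0.5;
    static double delta = 0.1;

    // dw = learningRate * gradient + momentum * previousDelta
    static void update(double gradient, double learningRate, double momentum) {
        double dw = learningRate * gradient + momentum * delta;
        value += dw;   // apply the variation to the weight
        delta = dw;    // remember it for the next cycle's momentum term
    }

    public static void main(String[] args) {
        // dw = 0.8 * 0.2 + 0.3 * 0.1 = 0.19, so value becomes 0.69
        update(0.2, 0.8, 0.3);
        System.out.println(value + " " + delta);
    }
}
```

This is the same shape as the bias update in the SigmoidLayer.backward() listing shown later in this document, where bias.value and bias.delta play the roles of value and delta.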

The Monitor

The Monitor object is the container of all the parameters that control the behaviour of the neural net. It controls the start/stop actions and permits net parameters to be set, e.g. learning rate, momentum, etc. Each component of the neural net (Layers and Synapses) is connected to a Monitor object so that it can read the parameters that control its work. The monitor can be different for each component, though normally it is useful to create only one Monitor for a neural net, the reference being set by each component's setMonitor method. The Monitor can also notify a listener when some events occur. In fact, through event handling based on a JavaBeans-like mechanism (the Observer pattern), a listener object that implements the org.joone.engine.NeuralNetListener interface can register itself to receive all the events of the net. The following is a list of the Monitor object's features.

The NN Parameters

The Monitor contains all the parameters needed during the training phases, e.g. the learning rate, the momentum, etc. Each parameter has its own getter and setter method, conforming to the JavaBeans specifications. These parameters are used by an external application, for example, to display them in a user interface, or to calculate the formulas written in a neural network component's backward()


methods, as shown in the following code extracted from the org.joone.engine.SigmoidLayer class:

public void backward(double[] pattern) {
    super.backward(pattern);
    double dw, absv;
    int x;
    int n = getRows();
    for (x = 0; x < n; ++x) {
        gradientOuts[x] = pattern[x] * outs[x] * (1 - outs[x]);
        // bias adjustment
        if (monitor.getMomentum() < 0) {
            if (gradientOuts[x] < 0)
                absv = -gradientOuts[x];
            else
                absv = gradientOuts[x];
            dw = monitor.getLearningRate() * gradientOuts[x] + absv * bias.delta[x][0];
        } else
            dw = monitor.getLearningRate() * gradientOuts[x] + monitor.getMomentum() * bias.delta[x][0];
        bias.value[x][0] += dw;
        bias.delta[x][0] = dw;
    }
}

In this way each component has a standard mechanism for getting the parameters needed for its work.

The NN control

The Monitor object is also the central point for controlling the start/stop times of a neural network. It has some parameters that are useful to control the behaviour of the NN, e.g. the total number of epochs, the total number of input patterns, etc. Before explaining how this works, an explanation is required of how the input components of a neural network work. To provide an input pattern to a neural net, a component must inherit from the org.joone.io.StreamInputSynapse class. This abstract class extends the Synapse object, so it can be connected to the input of a Layer like any other Synapse. When the Layer calls the fwdGet method on the StreamInputSynapse (see the Layer object explained in a previous chapter), this object calls the Monitor.nextStep() method to advise the Monitor that a new cycle must be processed. Look at the implementation of the nextStep method:

public synchronized boolean nextStep() {
    while (run == 0) {
        try {
            if (!firstTime) {
                if (currentCicle > 0) {
                    --currentCicle;
                    fireCicleTerminated();
                    run = patterns;
                }
                if (currentCicle == 0) {
                    fireNetStopped();
                    if (saveRun == 0) {
                        saveRun = patterns;
                        saveCurrentCicle = totCicles;
                    }
                    firstTime = true;
                    return false;
                }
            } else
                /* If it gets here, it means that this method
                 * was called before Go() or runAgain() */
                wait();
        } catch (InterruptedException e) {
            return false;
        }
    }
    if (run > 0)
        --run;
    return true;
}

In the code above, the variable run contains the index of the actual input pattern processed, while currentCicle contains the current epoch (both descending from their max initial value to zero during the work of the neural net). If the run variable is equal to zero, the Monitor calls the fireCicleTerminated method to advise the registered observers that the actual epoch has finished, after which it decreases currentCicle by one. If currentCicle is zero, it calls the fireNetStopped method to indicate that the last epoch has terminated, and returns a FALSE value to the calling object. Otherwise, if the variable run is greater than zero, the Monitor simply decrements it, returning the value TRUE to the calling object. The notification mechanism is obtained by implementing the Observer pattern: the observer objects register themselves with the Monitor by calling the Monitor.addNeuralNetListener method, passing 'this' as a parameter. To receive these notifications, the observer objects must implement the org.joone.engine.NeuralNetListener interface. In this manner the following services are made available using the Monitor object:

1. The StreamInputSynapse knows if it can read and process the next input pattern (otherwise it stops), being advised by the returned Boolean value.

2. An external application can start/stop a neural network simply by setting the run parameter to a value greater than zero (to start) or equal to zero (to stop). To simplify these actions, the methods Go (to start), Stop (to stop) and runAgain (to restore a previously stopped network to running) have been added to the Monitor.

3. The observer objects (e.g. the main application) connected to the Monitor can be advised when a particular event is raised, as when an epoch or the entire training process has finished (for example, to show the user the actual epoch number or the actual training error). To see how to manage the events of the Monitor to read the parameters of the neural network, read the following paragraph.


Managing the events

To explain how the events of the Monitor object can be used by an external application, the following describes in detail what happens when a neural network is trained and the last epoch is reached.

[Figure: a neural network made of an xxxInputSynapse (fed with the training data), an input layer, a hidden layer, an output layer and a TeacherSynapse (fed with the desired data); a Monitor object is referenced by all the components]

Suppose we have a neural network composed, as depicted in the above figure, of three layers: an xxxInputSynapse to read the training data, a TeacherSynapse to calculate the error for the backprop algorithm, and a Monitor object that controls the overall training process. As already mentioned, all the components of a neural network built with Joone obtain a reference to the Monitor object, represented in the figure by the dotted lines. Supposing the net is started in training mode, the following figures show all the phases involved in the process when the end of the last epoch is reached. The numbers in the label boxes indicate the sequence of the processing:

[Figure: 1 - the input layer calls the fwdGet method; 2 - the xxxInputSynapse calls the nextStep method on the Monitor]

When the input layer calls the xxxInputSynapse.fwdGet method (1), the called object calls the Monitor.nextStep method to see if the next pattern must be processed (2).


[Figure: 3 - the Monitor raises the netStopped event; 4 - the Monitor returns a false Boolean value to the xxxInputSynapse; 5 - the xxxInputSynapse creates and injects into the net a 'stop pattern']

Since the last epoch is finished, the Monitor object raises a netStopped event (3) and returns a false Boolean value to the xxxInputSynapse (4). The xxxInputSynapse, because it receives a false value, creates a 'stop pattern' composed of a Pattern object with the counter set to –1, and injects it into the neural network (5).

[Figure: 6 - all the layers stop their running threads when they receive the 'stop pattern']

All the layers of the net stop their threads – simply exiting from the run() method – when they receive a 'stop pattern' (6).

 

 


[Figure: 7 - the TeacherSynapse calculates and sets the global error contained in the Monitor; 8 - the Monitor raises the errorChanged event]

The TeacherSynapse calculates the global error and communicates this value to the Monitor object (7), which raises an errorChanged event to its listeners (8).

Warning: as explained in the above process, the netStopped event raised by the Monitor cannot be used to read the last error value of the net, nor to read the resulting output pattern from a recall phase, because this event could be raised while the last input pattern is still travelling across the layers, before it reaches the last output layer of the neural network. So, to be sure to read the right values from the net, the rules explained below must be followed:

Reading the error: to read the error of the neural network, the errorChanged event must be waited for, so a listener that implements the NeuralNetListener interface must be built, and the code to manage the error written in the inherited errorChanged method.

Reading the outcome: to be sure all the resulting patterns of a cycle have been received from a recall phase, a 'stop pattern' must be waited for from the output layer of the net. To do this, an object that extends the OutputStreamSynapse must be built and the code to manage the output patterns written, implementing the fwdPut method of this class. Appropriate actions can be taken by checking the 'count' parameter of the received Pattern. Some pre-built output synapse classes are provided with Joone, and many others will be released in future versions.


I/O components

The I/O components of the core engine are stored in the org.joone.io package. They permit both the connection of a neural network to external sources of data and the storage of the results of the network to whatever output device is required. The object model is shown in the following figure:

 

 


The abstract StreamInputSynapse and StreamOutputSynapse classes represent the core elements of the I/O package. They extend the abstract Synapse class, so they can be ‘attached’ to the input or the output of a generic Layer object, since they expose the same interface required by any I/O listener of a Layer. Using this simple mechanism, the Layer is not affected by the category of the synapses connected to it: because they all have the same interface, the Layer simply continues to call the xxxGet and xxxPut methods without needing to know anything about their specialization.
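This interface uniformity can be sketched with illustrative stand-ins (not the actual Joone classes): the layer pulls patterns through a single method and never learns the concrete synapse type behind it.

```java
// Illustrative stand-in for the common synapse interface; the real class
// hierarchy is rooted at org.joone.engine.Synapse.
interface SynapseSketch {
    double[] fwdGet(); // a Layer pulls its next input pattern through this call
}

public class LayerSketch {
    // The Layer never needs to know whether its input comes from another
    // layer or from a StreamInputSynapse-like data source.
    static double[] readInput(SynapseSketch input) {
        return input.fwdGet();
    }

    public static double[] demo() {
        // A file-reading input synapse and a layer-to-layer synapse would both
        // look like this to the Layer: just a provider of double[] patterns.
        SynapseSketch fileLike = () -> new double[] {0.1, 0.9};
        return readInput(fileLike);
    }
}
```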

The StreamInputSynapse object is designed to provide a neural network with input data, offering a simple way to manage data organized as rows and columns, for instance semicolon-separated ASCII input data streams. Each value in a row is made available as an output of the input synapse, and the rows are processed sequentially by successive calls to the fwdGet method. As some files may contain information additional to the required data, the parameters firstRow, lastRow, firstCol and lastCol, derived from the InputSynapse interface, may be used to define the range of usable data. The Boolean parameter stepCounter indicates whether the object is to call the Monitor.nextStep() method for each pattern read (see the NN control paragraph). By default it is set to TRUE, but in some cases it must be set to FALSE, for the reason explained below.
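Why this flag matters can be sketched numerically (illustrative code, not Joone's): if two input components both advanced the Monitor's step counter, every training pattern would be counted twice.

```java
// Sketch of why only ONE input synapse may drive the Monitor's counters.
// Method and class names are illustrative, mirroring the role of
// Monitor.nextStep(); this is not Joone code.
public class StepCounterSketch {
    static int countSteps(int patterns, boolean desiredInputAlsoCounts) {
        int steps = 0;
        for (int p = 0; p < patterns; p++) {
            steps++;                             // sample input: stepCounter == TRUE
            if (desiredInputAlsoCounts) steps++; // desired-output input left at TRUE: bug
        }
        return steps;
    }
}
```

With the desired-output input left at TRUE, 10 patterns would register as 20 steps; setting its stepCounter to FALSE restores the correct count.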

In a neural network that is to be trained, there need to be at least two StreamInputSynapse objects: one to give the sample input patterns to the neural network, and another to provide the net with the desired output patterns required by a supervised learning algorithm. Since the Monitor object is shared by all the components of a neural network built with Joone, there can be only one input component that calls the Monitor.nextStep() method; otherwise the counters of the Monitor object would be modified twice (or more) for each cycle. To avoid this side effect, the stepCounter parameter of the StreamInputSynapse that provides the desired output data to the neural network is set to FALSE.

A StreamInputSynapse can store its input data permanently by setting the buffered parameter to TRUE (the default). An input component can thus be saved or transported along with its input data, permitting a neural network to be used without the initial input file. This feature is very useful for remotely training a neural network in a distributed environment, as provided by the Joone framework.

The FileInputSynapse and URLInputSynapse objects are real implementations of the abstract StreamInputSynapse class, which read input patterns from files and from http/ftp sockets respectively. To extract the values from a semicolon-separated input stream, these two classes use the StreamInputTokenizer object, which is able to parse each line of the input data stream, extract the single values from it, and return them via the getTokenAt and getTokensArray methods.

To add a new xxxInputSynapse that reads patterns from a kind of input data other than semicolon-separated values, you must:

 

 

1. Create a new class implementing the PatternTokenizer interface (e.g. xxxInputTokenizer).
2. Write all the code necessary to implement all the public methods of the inherited interface.
3. Create a new class inherited from StreamInputSynapse (e.g. xxxInputSynapse).
4. Override the abstract method initInputStream, writing the code necessary to initialise the ‘token’ parameter of the inherited class. To do this, you must call the method super.setToken from within initInputStream, passing the newly created xxxInputTokenizer

http://www.joone.org   http://www.joone.org

21

Technical Overview 

Joone Core Engine  Engine 

after having initialised it. For more details, see the implementation built into FileInputSynapse.

To better understand the concepts underlying the I/O model of Joone, consider that the I/O component package is based on two distinct tiers that logically separate the neural network from its input data. Since a neural network can natively process only floating point values, the I/O of Joone is based on this assumption; if the nature of the input data is already numeric (integer or float/double), the user doesn’t need to make any further format transformation. The I/O object model is based on two separate levels of abstraction, as depicted in the following figure:

[Figure: the two-tier I/O model. First tier (the driver): the xxxInputTokenizer implements the PatternTokenizer interface and reads any data format, depending on the input device, through the device’s specific interface. Second tier: the xxxInputSynapse extends the StreamInputSynapse class and delivers the data to the input layer of the NN in numeric (double) format only.]

The two colored blocks in the figure represent the objects that must be written to add a new input data format and/or device to the neural network. The first is the ‘driver’, which knows how to read the input data from the specific input device and converts the specific input data format to the numeric double format accepted by the neural network. The existing StreamInputTokenizer is an object that transforms semicolon-separated ASCII values into numeric double values; it was the first implementation made, because the most common format of data is text files. If the input data are already in this ASCII format, you can just use it, without implementing any transformation. For data contained in arrays of doubles (i.e. for input provided by another application), the MemoryInputTokenizer and MemoryInputSynapse classes implement the above two layers to provide the neural network with data contained in a 2D array of doubles. To use them, simply create a new instance of MemoryInputSynapse, set the input array by calling its setInputArray method, then connect it to the input layer of the neural network.
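As a sketch of what such a ‘driver’ looks like, here is a tokenizer for comma-separated text. The interface below is an illustrative stand-in mirroring a subset of the PatternTokenizer role (nextLine, getTokenAt, getTokensArray); the exact Joone contract may differ, so treat the signatures as assumptions:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.StringReader;

// Illustrative stand-in for (a subset of) the PatternTokenizer contract.
interface PatternTokenizerSketch {
    boolean nextLine() throws IOException;  // advance to the next row of data
    double getTokenAt(int posiz);           // value at a given column
    double[] getTokensArray();              // all values of the current row
}

public class CsvTokenizerSketch implements PatternTokenizerSketch {
    private final BufferedReader reader;
    private double[] current = new double[0];

    public CsvTokenizerSketch(BufferedReader reader) { this.reader = reader; }

    @Override
    public boolean nextLine() throws IOException {
        String line = reader.readLine();
        if (line == null) return false; // no more rows
        String[] parts = line.split(",");
        current = new double[parts.length];
        for (int i = 0; i < parts.length; i++)
            current[i] = Double.parseDouble(parts[i].trim());
        return true;
    }

    @Override
    public double getTokenAt(int posiz) { return current[posiz]; }

    @Override
    public double[] getTokensArray() { return current.clone(); }

    // Convenience helper for the demo: tokenize the first row of a CSV string.
    static double[] firstRow(String csv) {
        try {
            CsvTokenizerSketch t = new CsvTokenizerSketch(
                    new BufferedReader(new StringReader(csv)));
            t.nextLine();
            return t.getTokensArray();
        } catch (IOException e) {
            throw new RuntimeException(e);
        }
    }
}
```

A real xxxInputSynapse would then hand an initialised instance of such a tokenizer to super.setToken inside initInputStream, as described in the four steps above.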


The StreamOutputSynapse object allows a neural network to write output patterns. It writes all the values of the pattern passed by each call of the fwdPut method to an output stream. The values are written separated by the character contained in the separator parameter (the default is the semicolon), and each row is terminated by a carriage return. Extending this class allows output patterns to be written to any output device: for example, ASCII files, FTP sites, spreadsheets, charting visual components, etc. Joone has three real implementations of this abstract class: FileOutputSynapse, to write the output to an ASCII file in comma separated format; XLSOutputSynapse, to write the output in Excel format; and MemoryOutputSynapse, to make the output patterns available in memory to another application.
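The write format described above — values joined by the separator, one row per pattern — can be sketched as follows; the class and method names are illustrative, not Joone's:

```java
// Sketch of the StreamOutputSynapse write format: each pattern's values are
// joined by the 'separator' character (semicolon by default) and each row
// ends with a line break. Illustrative stand-in, not Joone code.
public class OutputFormatSketch {
    static String format(double[][] patterns, String separator) {
        StringBuilder sb = new StringBuilder();
        for (double[] row : patterns) {
            for (int i = 0; i < row.length; i++) {
                if (i > 0) sb.append(separator);
                sb.append(row[i]); // doubles rendered via Double.toString
            }
            sb.append('\n'); // one line per output pattern
        }
        return sb.toString();
    }
}
```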
