Analysis

1) For the neural network tests, a custom simulator was written in C++. Three types of neural networks were simulated: a 100%-synapse-connected network with standard backpropagation learning, a 20%-synapse-connected network with standard backpropagation learning, and a network using backpropagation together with the proposed concept of dynamically re-routing synapses to enhance learning. The third network was initialized as a 100%-connected network, and insignificant synapses were randomly removed over time, giving a reasonable simulation of the proposed method.
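The pruning step described above can be sketched as follows. This is an illustrative re-creation, not the simulator's actual code: the `Synapse` struct, the `prune_weakest` name, and the choice of weight magnitude as the "insignificance" measure are all assumptions.

```cpp
#include <algorithm>
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

// Hypothetical sketch: deactivate the weakest fraction of synapses,
// ranked by |weight|, approximating "insignificant synapses were
// removed over time".
struct Synapse {
    double weight;
    bool active;
};

// Deactivates the given fraction of currently active synapses with the
// smallest |weight|; returns how many were pruned.
std::size_t prune_weakest(std::vector<Synapse>& syn, double fraction) {
    std::vector<std::size_t> idx;
    for (std::size_t i = 0; i < syn.size(); ++i)
        if (syn[i].active) idx.push_back(i);
    std::size_t n_prune = static_cast<std::size_t>(idx.size() * fraction);
    // Move the n_prune smallest-magnitude synapses to the front.
    std::partial_sort(idx.begin(), idx.begin() + n_prune, idx.end(),
        [&](std::size_t a, std::size_t b) {
            return std::fabs(syn[a].weight) < std::fabs(syn[b].weight);
        });
    for (std::size_t i = 0; i < n_prune; ++i)
        syn[idx[i]].active = false;
    return n_prune;
}
```

In the simulated method this pruning would be applied gradually across training cycles rather than in one pass.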

All neural networks had an input layer of 10x10 nodes connected to a hidden layer of 50 neurons, and an output layer of 10 neurons. They were trained for optical character recognition: recognizing digits with 5% noise and small position offsets. Training data was generated using Visual Basic to create 5 sets of the 10 digits, and all sets were presented in a random order at each training cycle.
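The corruption applied to each digit image might look like the sketch below: flip roughly 5% of the pixels of a 10x10 binary image and shift it by a small offset. The original data was produced with a Visual Basic tool; this is only an illustrative re-creation, and the `corrupt` function and its parameters are assumptions.

```cpp
#include <array>
#include <cassert>
#include <random>

constexpr int W = 10, H = 10;
using Image = std::array<std::array<int, W>, H>;

// Returns a copy of `src` shifted by (dx, dy), with each pixel flipped
// independently with probability `noise` (e.g. 0.05 for 5% noise).
// Pixels shifted in from outside the image are treated as 0.
Image corrupt(const Image& src, double noise, int dx, int dy,
              std::mt19937& rng) {
    std::bernoulli_distribution flip(noise);
    Image out{};  // zero-initialized
    for (int y = 0; y < H; ++y)
        for (int x = 0; x < W; ++x) {
            int sx = x - dx, sy = y - dy;  // shifted source coordinates
            int v = (sx >= 0 && sx < W && sy >= 0 && sy < H)
                        ? src[sy][sx] : 0;
            out[y][x] = flip(rng) ? 1 - v : v;  // occasional pixel flip
        }
    return out;
}
```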

The mean squared error (MSE) of each neural network was measured over 1000 iterations, after each presentation of the training data. Each neural network test was repeated 10 times to account for the significant and unpredictable impact of random variation, particularly in the initial synapse weights.
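For clarity, the error metric above is the standard one: the squared difference between each output neuron's activation and its target, averaged over the outputs (and, in the experiment, over the training patterns). A minimal sketch:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Mean squared error across the output neurons for one pattern.
double mean_squared_error(const std::vector<double>& output,
                          const std::vector<double>& target) {
    double sum = 0.0;
    for (std::size_t i = 0; i < output.size(); ++i) {
        double d = output[i] - target[i];
        sum += d * d;
    }
    return sum / static_cast<double>(output.size());
}
```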

The MSE vs. training cycle data was exported as a comma-delimited file and analyzed in Stata. The results from each network were averaged and plotted to compare the networks in terms of learning speed and accuracy, and a least-squares regression analysis was performed to compare the learning methods.
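The actual regression was run in Stata; the sketch below only illustrates the least-squares fit itself, assuming a simple linear fit of MSE against training cycle, where a steeper negative slope would indicate faster learning.

```cpp
#include <cassert>
#include <cmath>
#include <utility>
#include <vector>

// Ordinary least-squares fit of y = slope * x + intercept.
// Returns {slope, intercept} using the closed-form normal equations.
std::pair<double, double> fit_line(const std::vector<double>& x,
                                   const std::vector<double>& y) {
    double n = static_cast<double>(x.size());
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (std::size_t i = 0; i < x.size(); ++i) {
        sx += x[i];
        sy += y[i];
        sxx += x[i] * x[i];
        sxy += x[i] * y[i];
    }
    double slope = (n * sxy - sx * sy) / (n * sxx - sx * sx);
    double intercept = (sy - slope * sx) / n;
    return {slope, intercept};
}
```

In practice a learning curve is not linear, so a fit like this only summarizes the overall trend; the averaged and plotted curves carry the detailed comparison.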

2) Results for the circuit were obtained from transistor-level PSPICE simulations. Several graphs were generated to illustrate the circuit’s performance at the specified transistor sizes. Parameters from a previous MOSIS fabrication run of the TSMC 0.35 µm process were used to ensure accurate simulations.

Copyright © Malcolm Stagg 2006. All Rights Reserved.
Website: http://www.virtualsciencefair.org/2006/stag6m2. Email: malcolmst@shaw.ca.