|A Dynamic Analog Concurrently-Processed Adaptive Chip|
|By Malcolm Stagg|
For Backpropagation, each synapse holds a stored weight that is multiplied by the input to produce its output. The weight is adjusted by multiplying the forwards input by the error input and scaling that product by a learning-rate parameter.
In Hebbian learning, the forwards-propagating signal is again multiplied by the stored weight. The weight update is computed as the product of the backwards-propagating signals on each side of the synapse, again scaled by the learning-rate parameter.
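The two update rules above can be sketched side by side. This is only a behavioral sketch: the function names, the learning rate `ETA`, and the example values are illustrative, not taken from the chip.

```python
ETA = 0.01  # learning-rate parameter (illustrative value)

def forward(w, x):
    """Both modes: the forward output is the stored weight times the input."""
    return w * x

def bp_update(w, x_fwd, err_back):
    """Backpropagation: scale the product of the forwards input and the
    error input by the learning rate."""
    return w + ETA * x_fwd * err_back

def hebbian_update(w, b_left, b_right):
    """Hebbian: scale the product of the backwards-propagating signals on
    each side of the synapse by the learning rate."""
    return w + ETA * b_left * b_right
```

In both cases the forward path is identical; only the pair of signals feeding the weight-update multiplier differs.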
The proposed synapse design supports both Backpropagation and Hebbian learning in the same cell. Differential voltages at the input are multiplied by the synaptic weight stored in an analog memory cell. An SRAM cell stores whether BP or Hebbian learning is used. In BP mode, a similar multiplier circuit is connected to the backwards-propagating output through a transmission gate. The weight update multiplies the backwards-propagating input by a value selected by transmission gates: in BP mode this value is the forwards-propagating input; in Hebbian mode, it is the backwards-propagating output.
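The signal flow of the dual-mode cell might be modeled as follows. This is a behavioral sketch of an analog circuit, so all names are illustrative; in particular, the Hebbian-mode backward path is assumed to pass the signal through unchanged, which the description above does not specify.

```python
ETA = 0.01  # learning-rate parameter (illustrative value)

class DualModeSynapse:
    """Behavioral model of the dual-mode synapse cell (names illustrative)."""

    def __init__(self, w, mode):
        self.w = w        # weight held in the analog memory cell
        self.mode = mode  # SRAM bit selecting "bp" or "hebbian"
        self.x = 0.0      # last forwards-propagating input

    def forward(self, x):
        # Forward multiplier: input times stored weight.
        self.x = x
        return self.w * x

    def backward(self, b_in):
        # In BP mode a second multiplier drives the backwards output;
        # assumed pass-through in Hebbian mode (not stated in the text).
        b_out = self.w * b_in if self.mode == "bp" else b_in
        # Transmission gates select the second factor of the weight update:
        # the forwards input (BP) or the backwards output (Hebbian).
        sel = self.x if self.mode == "bp" else b_out
        self.w += ETA * b_in * sel
        return b_out
```

In BP mode, a forward pass followed by a backward pass reproduces the standard rule: the backwards output is the weight times the error input, and the update is the error input times the latched forwards input, scaled by the learning rate.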