Abstract: This article presents the circuit design of Convolutional Neural Network (CNN) activation functions, crucial components of deep learning architectures, alongside Spike-Timing-Dependent Plasticity (STDP) circuitry. Activation functions play a vital role in network performance, and the integration of STDP enhances learning dynamics. By mimicking biological synaptic behavior, STDP facilitates unsupervised learning and improves adaptability to input variations. Several activation functions, including sigmoid, tanh, ReLU (Rectified Linear Unit), and linear, were implemented, revealing varying power consumption levels: sigmoid exhibited the highest power consumption at 65.33 µW, while linear had the lowest at 2.491 µW. Integrating STDP into the activation function architecture enhances the CNN's ability to learn efficiently from input patterns, following a biologically inspired approach. The overarching goal is to improve learning dynamics, robustness, and latency while reducing hardware requirements for both the STDP and activation function circuits within the CNN. Simulations were conducted in the Cadence® Virtuoso ADE L environment with a 90 nm technology node library.
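For reference, the four activation functions named in the abstract can be sketched mathematically as follows; this is a minimal software illustration of their transfer characteristics, not the transistor-level circuits the article designs:

```python
import math

def sigmoid(x):
    # Logistic sigmoid: smooth, output bounded in (0, 1)
    return 1.0 / (1.0 + math.exp(-x))

def tanh(x):
    # Hyperbolic tangent: smooth, output bounded in (-1, 1)
    return math.tanh(x)

def relu(x):
    # Rectified Linear Unit: passes positive inputs, clips negatives to 0
    return max(0.0, x)

def linear(x):
    # Identity (linear) activation: output equals input
    return x
```

The simpler piecewise/identity functions (ReLU, linear) require far fewer transistors than the smooth exponential-based functions, which is consistent with the power ordering the abstract reports.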
Keywords: Activation Function, ReLU Function, Hyperbolic Tangent (tanh), Feature Extraction, Backpropagation
DOI: 10.24874/PES08.01A.007
Received: 06.04.2025 Revised: 08.06.2025 Accepted: 13.07.2025
UDC: