Article

Neuron Circuit Based on a Split-gate Transistor with Nonvolatile Memory for Homeostatic Functions of Biological Neurons

1 School of Electronic and Electrical Engineering, Kyungpook National University, Daegu 41566, Republic of Korea
2 Division of Materials Science and Engineering, Hanyang University, Seoul 04763, Republic of Korea
* Authors to whom correspondence should be addressed.
Biomimetics 2024, 9(6), 335; https://doi.org/10.3390/biomimetics9060335
Submission received: 23 April 2024 / Revised: 24 May 2024 / Accepted: 30 May 2024 / Published: 31 May 2024
(This article belongs to the Special Issue New Insights into Bio-Inspired Neural Networks)

Abstract

To mimic the homeostatic functionality of biological neurons, a split-gate field-effect transistor (S-G FET) with a charge-trap layer is proposed within a neuron circuit. By adjusting the number of charges trapped in the Si3N4 layer, the threshold voltage (Vth) of the S-G FET changes. To prevent degradation of the gate dielectric due to program/erase pulses, the gate for read operations and the gate for Vth control were separated through the fin structure. A circuit that modulates the width and amplitude of the pulse was constructed to generate program/erase pulses for the S-G FET from the output pulse of the neuron circuit. By raising the Vth of neuron circuits with a high firing rate, their firing rate can be lowered. To verify the performance of a neural network based on the S-G FET, online unsupervised learning and classification in a 2-layer SNN were simulated. The results show that the recognition rate was improved by 8% by increasing the threshold of the fired neuron circuits.

1. Introduction

Recently, artificial neural networks (ANNs), inspired by the brain, have been actively researched and developed [1]. Particular attention has been paid to spiking neural networks (SNNs) that utilize biological spike timing-dependent plasticity (STDP) rules to function as event-driven systems for unsupervised learning [2,3,4,5,6,7,8,9,10]. Since such SNNs pursue a more bio-plausible direction than other ANNs do, it is necessary to explore the signal transmission mechanisms of biological neurons and synapses to gain a deeper understanding of these SNNs. Figure 1a illustrates the neurons and synapses; here we focus on the membrane potential and neurotransmission of neurons. Neurons maintain the membrane potential by consuming adenosine triphosphate, and the membrane potential increases when cations are injected into the membrane due to stimuli provided through the dendrites. If the membrane potential exceeds a specific threshold value, the neuron is depolarized. The membrane potential then reaches the action potential, after which the neuron is repolarized and the membrane potential is restored to its initial value. The electrical spikes generated in this process are transmitted along the axon of the neuron in an electrical form and to the post-synaptic neuron in the form of a neurotransmitter through the synapse. Moreover, according to rate-based learning rules such as SRDP and BCM [11,12,13,14], synapses do not simply transmit signals between two neurons; rather, they modulate the strength of the transmitted signal by increasing the synaptic efficiency when receiving high-frequency stimulation and reducing it when receiving low-frequency stimulation. These changes in synaptic strength can be explained primarily by activity-dependent plasticity, such as long-term potentiation and long-term depression, and are known to be closely related to functions performed by the brain, such as learning, memory, and cognition. If synaptic strength changed only in response to activity, neural activity could be driven toward runaway excitation or quiescence. In response, neurons are known to perform homeostatic regulation that stabilizes network activity and maintains their average firing rate within a narrow range. Figure 1b shows the change in the number of AMPA receptors at the post-synaptic surface induced by neural activity. The number of AMPA receptors at the post-synaptic surface is scaled up or down in response to activity deprivation or overexcitation, respectively. In this process, the firing rate of neurons is maintained within a narrow range [15,16,17,18].
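The integrate, fire, and homeostatic-scaling behavior described above can be illustrated with a minimal leaky integrate-and-fire model whose threshold rises when the neuron fires and slowly relaxes otherwise. This is only a conceptual sketch; the parameter values (tau, eta, target_rate) and the input statistics are hypothetical and are not taken from this work.

```python
import numpy as np

def simulate_lif_homeostasis(inputs, dt=1e-3, tau=20e-3,
                             v_th0=1.0, eta=0.05, target_rate=10.0):
    """Leaky integrate-and-fire neuron with a homeostatic (adaptive) threshold."""
    v, v_th = 0.0, v_th0
    spikes = []
    for t, i_in in enumerate(inputs):
        v += dt * (-v / tau + i_in)        # integrate the input; leak pulls v back to 0
        if v >= v_th:                       # fire when the membrane potential crosses the threshold
            spikes.append(t * dt)
            v = 0.0                         # reset the membrane potential
            v_th += eta                     # homeostasis: each spike raises the threshold
        else:
            v_th -= eta * target_rate * dt  # slow decay lowers the threshold between spikes
    return spikes, v_th

# The threshold settles where the firing rate roughly matches target_rate.
rng = np.random.default_rng(0)
spike_times, final_th = simulate_lif_homeostasis(rng.uniform(0, 120, size=5000))
print(len(spike_times), final_th)
```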
Recently, research has focused on integrating neuromorphic systems and hardware-based neural networks with various emerging memory devices to enable energy-efficient operation of artificial intelligence algorithms [9,10,19,20,21,22,23,24,25,26,27,28,29,30,31,32,33,34,35,36,37,38,39,40,41,42,43]. For instance, on-chip learning can reduce power consumption and compensate for the degradation caused by device variation [44,45,46,47,48,49]. Similar to biological neurons, STDP-based neural networks require a homeostasis functionality that controls the firing rate of the output neurons to achieve more accurate pattern recognition. The homeostasis functionality in biological neurons involves reducing the firing rate of dominant (high firing rate) neurons and increasing that of weak (low firing rate) neurons. Various studies have attempted to implement the homeostasis functionality in hardware and software neural systems [50,51,52,53,54,55,56]. To mimic the homeostasis function, each neuron needs peripheral circuits composed of conventional complementary metal-oxide-semiconductor (CMOS) devices and capacitors. However, if every neuron circuit includes at least one large capacitor, the hardware footprint of the neural system can become excessive. In addition, when synaptic weights are updated in SNNs, the value controlling the firing frequency of each neuron circuit is also updated and stored in the peripheral circuits. Due to the volatile memory functionality of these peripheral circuits, the values for the homeostasis functionality cannot be maintained without a supply voltage.
In this paper, a split-gate field-effect transistor (S-G FET) with a charge-storage layer is proposed to mimic the homeostasis functionality. The S-G FET is used to compare the membrane potential (Vmem) with the threshold voltage (Vth) for the fire function in the neuron circuit. By using the fin structure in the S-G FET with independent double gates, stable operation of the neuron circuit is achieved by separating the gate used to implement the homeostatic function through program/erase (P/E) operations from the gate used to read synaptic signals. The Vth of each neuron circuit can be updated selectively using the S-G FET, which has two independent gates. The homeostasis functionality can be achieved by controlling the threshold of the neuron circuit using the program/erase operation in the charge-trap layer. Employing a nonvolatile memory functionality for the charge-storage layer allows the optimized Vth of the neuron circuit to be maintained. A two-layer SNN based on biological STDP learning rules was simulated with the S-G FET to demonstrate the improved pattern-recognition accuracy on the MNIST dataset.

2. Materials and Methods

The S-G FET was fabricated on a 6-inch Si wafer using conventional CMOS technology. Figure 2a–h shows the fabrication steps for integrating the S-G FET, n-type FET, p-type FET, and memory devices. First, a sacrificial SiO2 layer is deposited on the wafer and patterned to obtain the single-gate FETs, and a nitride layer is then deposited. Subsequently, a 250 nm poly-Si layer is deposited and patterned to enable the formation of the Si3N4 spacers. Next, a 35 nm (fin width = 35 nm) Si3N4 layer is deposited and etched back, forming the Si3N4 spacers on the sidewalls of the patterned poly-Si layer (Figure 2a). The poly-Si layer is selectively removed using a solution of HNO3 and HF (Figure 2b), and the Si substrate is etched to a depth of 80 nm (fin height = 80 nm) with the Si3N4 spacers as a hard mask (Figure 2c). Then, a thin SiO2 film is again deposited as a barrier layer for SF6 dry etching, followed by the deposition of a poly-Si layer (Figure 2d). The poly-Si spacers are selectively removed using a mask in the regions of the bulk-fin FETs, followed by anisotropic etching of SiO2 (Figure 2e). Thereafter, isotropic Si etching using SF6 gas is performed by controlling the etching thickness (Figure 2f). Si fins without the SiO2/poly-Si sidewalls are separated from the Si substrate, as shown on the left in Figure 2g. Notably, both ends of the separated fins are connected to the substrate, as shown on the right in Figure 2g. By wet-etching SiO2 in a buffered hydrogen fluoride solution, the Si3N4 spacers on the sacrificial SiO2 are removed. The gap between the Si fins is filled with SiO2 via a high-density plasma chemical vapor deposition process (Figure 2h). After the SiO2 is etched to a certain thickness (down to the top of the Si fins), boron and phosphorus ions are implanted for the field and channel doping of the n-type and p-type FETs, respectively. Then, the SiO2/Si3N4/SiO2 (2/4.2/9 nm) gate dielectric layer, called the O/N/O layer, is deposited for the memory function, and a SiO2 film (10 nm) is deposited as the gate dielectric layer of the conventional MOSFETs (Figure 2i). Then, an n+-doped poly-Si layer is deposited as the gate material. To form the independent split gates using the Si3N4 spacers, the wafer is coated with a diluted photoresist (PR); a thin PR layer is then formed only on the top of the Si3N4 spacers (Figure 2j). By etching the PR to a specific thickness, only the n+-doped poly-Si on the Si3N4 spacers is exposed (Figure 2k). The exposed n+-doped poly-Si is then etched to form the independent split gates. The remaining PR is removed, followed by the patterning of the n+-doped poly-Si for the gates. Subsequently, the Si3N4 spacer is stripped (Figure 2l). After arsenic and boron ions are implanted for the source/drain regions of the n-type and p-type FETs, respectively, rapid thermal annealing is conducted at 1050 °C (5 s) to activate the dopants. Then, an interlayer dielectric is deposited, and the contact holes and metal are patterned.
Figure 3a,b show a 3D schematic view and a cross-sectional transmission electron microscopy (TEM) image, respectively, of the fabricated S-G FET with a floating fin body. The split gates (G1 and G2) are n+-doped poly-Si, and the thickness of the gate oxide (SiO2) of the conventional CMOSFET is 9 nm. The tunneling layer, charge-trap layer, and blocking layer form a SiO2/Si3N4/SiO2 stack with thicknesses of 2 nm, 4.2 nm, and 9 nm, respectively, providing the nonvolatile memory for the homeostatic function. The width (Wfin) and height (Hfin) of the floating fin body are 35 nm and 80 nm, respectively. The split gates on both sides of the floating fin body can be used to modulate the threshold voltage (Vth) of the S-G FET, thereby enabling neuron circuits to be controlled using the S-G FET. The doping concentrations of the n-type channel, p-type channel, source region, and drain region are 1 × 10¹⁸ cm⁻³, 1 × 10¹⁸ cm⁻³, 2 × 10²⁰ cm⁻³, and 2 × 10²⁰ cm⁻³, respectively.

3. Results and Discussion

3.1. Basic Device Characteristics

Figure 4 shows the ID–VG characteristics of the CMOS devices and the S-G FET. The entire measurement process was conducted using a Keysight B1500A parameter analyzer with 10 ms intervals. Figure 4a shows the measured ID–VG curves of the n-type and p-type FETs fabricated on the same wafer. The gate width (W) and length (L) of the conventional CMOS devices are 35 nm and 1 μm, respectively. The subthreshold swings (SS) of the conventional n-type and p-type MOSFETs are 70 mV/decade and 160 mV/decade at VD = 0.1 V, respectively. The conventional CMOS devices are used for the neuron circuits, the self-controller (including a pulse generator for the program/erase operation of the synaptic devices and the S-G FET), and the connecting parts (current mirror and switch circuits) between the synaptic array and the neuron circuit. Figure 4b shows the measured ID–VG1 curves of the S-G FET for VG2 values between −2 V and 2 V. The solid and hollow symbols represent the transfer curves at VD values of 0.1 V and 1 V, respectively. For both sets of curves, the ID values are almost identical at the same VG1 due to the low drain-induced barrier lowering of the fabricated S-G FET. As the gate voltage applied to G2 decreases from positive to negative, the potential in the channel region increases, and the electron energy barrier from the source region to the channel region increases as well. This results in an increase in the Vth of the S-G FET. Since the 35 nm channel region between the split gates is fully depleted, the ΔVth of the S-G FET is almost linearly related to ΔVG2 [20]. In previous studies, neuron circuits were designed with homeostatic circuits consisting of current mirrors and a single capacitor to mimic the biological homeostatic function. However, since this involves storing the homeostatic data of neurons in capacitors to control the Vth of each neuron, a risk of data loss exists due to the volatile memory characteristics of the capacitors. Additionally, the large size of such capacitors (with a 1 pF capacitor occupying 100 μm² at a SiO2 thickness of 10 nm) causes the neuron circuit array to occupy a significant area. As described below, however, utilizing the nonvolatile memory characteristics of the charge-trap layer allows the homeostatic function of biological neurons to be implemented at high density, along with permanent storage of the homeostatic data.
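The near-linear relation between ΔVth and ΔVG2 can be rationalized with a simple capacitive-coupling estimate for a fully depleted body with two independent gates. The sketch below is only an order-of-magnitude illustration: it assumes both gates see the same O/N/O stack and uses an idealized Lim–Fossum-style coupling expression; the measured curves in Figure 4b remain the authoritative data.

```python
# Rough estimate of the back-gate coupling |dVth/dVG2| for a fully depleted 35 nm fin.
EPS0 = 8.854e-12                      # vacuum permittivity, F/m
EPS_SIO2, EPS_SI3N4, EPS_SI = 3.9, 7.5, 11.7

# Equivalent oxide thickness of the SiO2/Si3N4/SiO2 (2/4.2/9 nm) gate stack
eot = 2e-9 + 9e-9 + 4.2e-9 * (EPS_SIO2 / EPS_SI3N4)
c_ox = EPS0 * EPS_SIO2 / eot          # gate-stack capacitance per unit area
c_si = EPS0 * EPS_SI / 35e-9          # capacitance of the fully depleted 35 nm body

# Threshold shift seen at G1 per volt applied to G2 (magnitude, ideal coupling model)
coupling = (c_si * c_ox) / (c_ox * (c_si + c_ox))
print(f"estimated |dVth/dVG2| ~ {coupling:.2f} V/V")   # on the order of 0.5 V/V
```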
When the S-G FET with the two independent gates (G1 and G2) is used in a neuron circuit, one gate (G1) reads input signals from a synaptic array, while the other gate (G2) modulates the Vth of the S-G FET through the amount of charge in the charge-trap layer via program/erase operations. The program/erase operations are executed through the tunneling layer via the Fowler–Nordheim (FN) tunneling mechanism when high positive/negative voltages are applied across the gate dielectric layers. Figure 5a shows the measured ID–VG1 curves of the S-G FET as a function of the number of program pulses applied to G2. When a positive pulse (VPGM of 7 V, tPGM of 100 μs) is applied to G2 for the program operation, electrons accumulate in the charge-trap layer adjacent to G2; this has the same effect as applying a negative voltage to G2, resulting in an increase in the Vth of the S-G FET. By contrast, when a negative pulse (VERS of −7.5 V, tERS of 100 μs) is applied to G2 for the erase operation, the accumulated electrons are de-trapped, resulting in a decrease in the Vth of the S-G FET. Figure 5b shows the measured Vth changes in the ID–VG1 curves of the S-G FET with the number of program/erase pulses applied to G2; Vth is measured at an ID of 10 nA. As the number of program pulses applied to G2 increases, Vth increases from 0.6 V to 0.83 V at 50 pulses. Conversely, when erase pulses are applied to G2, Vth gradually decreases and returns to its initial value of 0.6 V when more than 50 pulses are applied. The Vth values in the program and erase states were measured at room temperature with the terminals open. The Vth values of the S-G FET in both states are maintained for well over 10⁴ s (Vth variation < 10%), as shown in Figure 5c; this nonvolatile memory function enables the optimized Vth of the S-G FET to be maintained. If the S-G FET is used to determine the threshold of each neuron circuit during the learning process of an SNN, the thresholds of all neuron circuits can be optimized through the program/erase (PGM/ERS) operation. Moreover, because G1 and G2 are independent gates, the thresholds of all neuron circuits can be updated selectively. No change in the SS is observed until 10⁴ program/erase cycles have been executed. However, beyond 10⁵ cycles, the tunneling layer on G2 becomes degraded, resulting in an increase in the SS corresponding to the VG2 sweep of the S-G FET. Notably, since the program/erase pulses are applied to G2, the gate dielectric of G1 does not degrade, even after more than 10⁵ program/erase cycles, as seen in Figure 5d. In addition, the |7 V| program/erase pulses are applied to G2 of the S-G FET to modulate the threshold value of the neuron circuit only during the learning process. After learning, only small signals (synaptic signals, membrane potential) are applied to G1 of the S-G FET for classification and pattern recognition. Therefore, when implementing biological homeostasis functions using the nonvolatile memory characteristics, signals transmitted from the synaptic array can be read reliably.
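The pulse-by-pulse Vth trajectory can be summarized with a simple saturating-update model. The sketch below is purely behavioral: the update form and the rate constant k are assumptions chosen to reproduce the reported trend (0.60 V to about 0.83 V after 50 program pulses, and back after about 50 erase pulses); it is not an FN-tunneling physics model.

```python
# Behavioral sketch of Vth versus the number of program/erase pulses on G2.
VTH_ERASED, VTH_PROGRAMMED, K = 0.60, 0.83, 0.06   # V, V, fraction per pulse (assumed)

def apply_pulses(vth, n_pulses, target, k=K):
    """Each pulse moves Vth a fixed fraction of the way toward its saturation value."""
    for _ in range(n_pulses):
        vth += k * (target - vth)
    return vth

vth = VTH_ERASED
vth = apply_pulses(vth, 50, VTH_PROGRAMMED)   # 50 program pulses
print(f"after program: {vth:.3f} V")          # approaches ~0.82 V
vth = apply_pulses(vth, 50, VTH_ERASED)       # 50 erase pulses
print(f"after erase:   {vth:.3f} V")          # returns toward ~0.61 V
```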

3.2. A Neuron Circuit Using the S-G FET

Figure 6a shows a schematic of an integrate-and-fire (IF) neuron circuit integrated with the S-G FET. The S-G FET is designed using a Sentaurus TCAD simulation, and the modulation of the Vth through the charge-trap layer (Si3N4) of the S-G FET was performed via program and erase operations using the Fowler–Nordheim (FN) tunneling model. For carrier mobility, the velocity saturation model, the doping dependence model, and the Lombardi surface mobility model were used. The old Slotboom model was used to calculate bandgap narrowing due to doping. The simulation of the neuron circuit with the S-G FET was conducted using the mixed mode provided in the Sentaurus TCAD simulation. The parameters of the CMOS devices and capacitors are as follows: L = 0.5 μm, W = 0.1 μm, Cmem = 0.5 pF, and Creset = 0.05 pF. The supply voltage (VDD) is 1.0 V. Cmem is used for the integrate function, while the S-G FET compares the membrane potential (Vmem) with Vth. When Vmem exceeds the Vth of the S-G FET due to the charge stored in Cmem, the S-G FET triggers the fire function, and the state of node 1 (N1) changes from high to low. Then, the output node (Vout) of INV1 changes from a low to a high state, and the charges accumulated on Cmem are drained through Mreset. Finally, the neuron circuit resets to the initial state via a feedback signal. At this point, M1 and M2 temporarily enhance the pull-down operation, whereby the S-G FET is activated and lowers the voltage at N1. M4 is intended to prevent the drain current from flowing through the S-G FET when VPGM (a positive voltage) is applied to G2 for the program operation. Figure 6b demonstrates the IF operation of the neuron circuit using the S-G FET during the learning process. Through this operation, the output signal is transmitted to the extended pulse generator and voltage level shifter, where it can be modulated into the pulses required for the program operation. When a program pulse (VPGM of 7 V, tPGM of 100 μs) is applied to G2, 0 V is simultaneously applied to the gate of M4 to perform the program operation through FN tunneling. The neuron circuit then performs the IF operation at progressively higher Vmem values, starting from 0.56 V, as shown in Figure 6b. Through the program operation of the S-G FET, the Vth of the neuron circuit is increased; after successive program operations, the Vth of the neuron circuit was 0.558 V, 0.60 V, 0.618 V, 0.637 V, 0.66 V, 0.675 V, and 0.71 V. Consequently, the firing rate of the neuron circuit decreases almost linearly to 1902 Hz, 1754 Hz, 1694 Hz, 1639 Hz, 1587 Hz, 1538 Hz, and 1449 Hz, respectively. Raising the threshold of frequently firing neuron circuits gives other neurons a greater chance of firing.
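The inverse relation between the neuron threshold and firing rate follows from the integrate function itself: with a roughly constant input current I charging Cmem, the firing period is approximately T = Cmem·Vth/I. The sketch below is a first-order estimate only; the constant current value is an assumption chosen so that the numbers land near the simulated rates quoted above, and it neglects the reset interval captured by the TCAD mixed-mode simulation.

```python
# First-order firing-rate estimate for the IF neuron circuit: rate ~ I / (Cmem * Vth).
CMEM = 0.5e-12          # membrane capacitor from the circuit parameters, F
I_IN = 0.53e-9          # assumed constant synaptic input current, A (hypothetical)

for vth in (0.558, 0.60, 0.618, 0.637, 0.66, 0.675, 0.71):
    rate = I_IN / (CMEM * vth)          # Hz, reset time neglected
    print(f"Vth = {vth:.3f} V -> ~{rate:.0f} Hz")
```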
An output spike of 1 V and 2 μs generated by the neuron circuit is insufficient for the program/erase operations required to inject charge into the charge-trap layer for implementing the homeostatic functions. Hence, to generate the required program/erase pulses, an extended pulse generator and a voltage level shifter were designed (Figure 7a) [57]. The extended pulse generator consists of a NOR gate and buffer inverters to extend the width of the pulse, while the voltage level shifter comprises an inverter and a differential amplifier to raise the output signal of the neuron to the VPGM level. In Figure 7a, the parameters of the CMOS devices, excluding M5, M6, and M7, are as follows: L = 0.5 μm, W = 0.1 μm, C1 = 1 pF, VDD1 = 1 V, and VDD2 = 7 V. M5 is designed with L = 2 μm and W = 0.1 μm to slowly pull up the V2 node; the width of the extended pulse is determined by M5 and the capacitor C1. Standard CMOS devices are unsuitable for handling a high supply voltage of about 7 V; therefore, M6 to M9, which must withstand the high voltage, were simulated as devices with a lightly doped drain (LDD) with a doping concentration of 1 × 10¹⁹ cm⁻³ [58,59]. M6 and M7 are designed with L = 5 μm and W = 0.1 μm to ensure an output voltage of 0 V in the off state, even when the applied VDD2 is 7 V or higher. A high supply voltage of 7 V is applied to the gates of M6 and M9, which can cause damage to or degradation of the gate oxide. To ensure stable operation of the circuit, it is necessary to improve the quality and increase the thickness of the CMOS gate oxide during fabrication.
When the output spike is transmitted to the Vn.out node, the NOR gate lowers V1 from 1 V to 0 V. Since V2 is coupled to V1 by C1, V2 is also set to the low state. Consequently, Vextended is raised to the high state by the inverter. Even though Vn.out changes from 1 V to 0 V after 2 μs (tspike = 2 μs), V1 remains 0 V because Vextended, another input voltage to the NOR gate, is 1 V. However, when V2 is 0 V, M5 is activated, which causes V2 to increase gradually from 0 V to 1 V. As shown in Figure 7b, when V2 is close to 1 V after 50 μs, Vextended decreases from 1 V to 0 V. Since all the input voltages and the output voltage of the NOR gate become 0 V, both V1 and V2 return to the initial value of 1 V. Thus, the extended pulse generator allows for extending the output spike of the neuron circuit to the time required for the program operation, and the extension duration is determined by the capacitance of C1 and the current at M5. Next, the process of modulating the amplitude of the extended pulse from 1 V to 7 V is described. The output voltage of the extended pulse generator is transmitted to the input node of the voltage level shifter and the gate of M8. When Vextended is raised, M8 is activated, and the inverter deactivates M9. Thus, M7 is activated, and a VDD2 of 7 V is applied as the output voltage of the voltage level shifter, which is used as the VPGM for the program operation of the synaptic devices and the S-G FET. Conversely, lowering Vextended deactivates M8, causes the inverter to activate M9, and returns all nodes to their initial condition, as shown in Figure 7b.
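Because the extension time is set by C1 being charged through M5, it can be estimated with a simple t = C·ΔV/I relation. In the sketch below, the effective charging current is an assumed value chosen so that the estimate matches the roughly 50 μs extension seen in Figure 7b; the actual waveform is determined by the mixed-mode TCAD simulation.

```python
# Rough timing estimate for the extended pulse generator.
C1 = 1e-12        # coupling capacitor, F
DV = 1.0          # V2 swing from 0 V back up to VDD1 = 1 V
I_M5 = 20e-9      # assumed average pull-up current of the long-channel M5, A (hypothetical)

t_extend = C1 * DV / I_M5
print(f"estimated pulse extension: {t_extend * 1e6:.0f} us")   # ~50 us
```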

3.3. Pattern Recognition in SNN with Homeostasis Function

To verify the performance of a neural network based on the S-G FET, online unsupervised learning and classification were performed in SNNs on the MNIST dataset using a Python simulator, as shown in Figure 8a. Each MNIST image consists of 28 × 28 pixels, and the 60,000 training and 10,000 test images of the MNIST dataset were used for training and verification, respectively. The simulated SNN consists of a 784-neuron input layer and an output layer of 50 to 500 neurons; to analyze the change in recognition rate according to the number of output neurons, the number of output neurons was varied from 50 to 500. The synapse array for training the SNN used thin-film transistor (TFT)-type NOR flash memory, and the memory characteristics used for training follow a simplified STDP rule, as shown in Figure 8b [60]. When Vpre and Vpost are applied to the gate and source of each TFT synaptic device, respectively, the program (LTD) and erase (LTP) operations in the charge-storage layer are performed according to the potential difference (Vpre − Vpost) between the gate and source [60]. Figure 8c shows a block diagram of the proposed system, consisting of a synapse array, neuron circuits, and a common controller (including switch circuits, the extended pulse generator, and the voltage level shifter). To ensure that other neurons have the opportunity to fire, the thresholds of frequently firing neurons were increased by applying VPGM_SGFET whenever a neuron fired and transmitted an output signal. To accelerate the learning, the thresholds of all neuron circuits were reduced by periodically applying VERS_SGFET to all the neuron circuits. As described earlier, this operation imitates the homeostasis functionality that controls the firing rate of neuron circuits. The initial threshold voltage of each neuron in the proposed neural network is 0.7 V. Figure 8d,e show the optimized Vth of the output neurons and the recognition rate of the proposed system, respectively, for the given MNIST dataset. The recognition rate is averaged over 10 runs. The MNIST digits can be distinguished better by increasing the number of output neurons in a neural computing system, which leads to more accurate pattern recognition [20]. The maximum accuracy of the SNN based on the proposed homeostasis functionality reached 91.84% with 200 output neurons. The proposed SNN achieved an approximately 8% higher recognition rate than an SNN without the homeostasis function. In addition, although the conductance fluctuation of the synaptic devices is large (σ/μ > 0.5), as shown in Figure 8f, the degradation of the recognition rate in the proposed system was low (~3%) due to the homeostasis function. However, the recognition rate without the homeostasis functionality was severely compromised as the fluctuation of the synaptic devices increased.
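The homeostatic bookkeeping during training can be summarized as follows: when an output neuron fires, a program pulse raises that neuron's threshold, and a global erase pulse periodically lowers all thresholds. The sketch below illustrates only this threshold-update logic; the per-pulse step sizes, the erase period, and the winner selection are hypothetical placeholders, since the actual values follow from the device program/erase characteristics and the STDP-trained synapse array.

```python
import numpy as np

N_OUT = 200
V_TH0, DV_PGM, DV_ERS = 0.7, 0.005, 0.02   # initial Vth and per-pulse steps, V (assumed)
V_TH_MIN, V_TH_MAX = 0.56, 0.83            # range reachable by the charge-trap layer
ERASE_PERIOD = 1000                         # apply a global erase every N input patterns (assumed)

v_th = np.full(N_OUT, V_TH0)
rng = np.random.default_rng(1)

for step in range(10_000):                  # stand-in for the training loop over input patterns
    winner = rng.integers(N_OUT)            # placeholder for the output neuron that fired
    # Program pulse (VPGM_SGFET): raise the fired neuron's threshold
    v_th[winner] = min(v_th[winner] + DV_PGM, V_TH_MAX)
    # Periodic erase pulse (VERS_SGFET): lower all thresholds to accelerate learning
    if (step + 1) % ERASE_PERIOD == 0:
        v_th = np.maximum(v_th - DV_ERS, V_TH_MIN)

print(v_th.mean(), v_th.min(), v_th.max())
```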

4. Conclusions

An S-G FET was designed and fabricated in this study to control the threshold of neuron circuits for spiking neural networks. The Vth variation of the S-G FET was verified experimentally by applying a VPGM of 7 V and a VERS of −7.5 V to the G2 gate. Even after more than 10⁵ program/erase cycles, the gate dielectric of G1 did not degrade; thus, no change in the SS of G1 was observed. In neuron circuits with the homeostatic function, the recognition rate was improved by 8% by increasing the firing threshold of the circuits. Furthermore, the nonvolatile memory functionality of the charge-storage layer in the S-G FET could maintain the thresholds of the neuron circuits optimized during the SNN's learning process for over 10⁴ s. The proposed homeostatic neuron functionality yielded high classification accuracy on the MNIST dataset, despite large fluctuations in the synaptic devices in the two-layer SNN.

Author Contributions

Conceptualization, H.K. (Hyungjin Kim); methodology, H.K. (Hansol Kim); validation, H.K. (Hansol Kim) and H.K. (Hyungjin Kim); formal analysis, H.K. (Hansol Kim) and S.Y.W.; investigation, H.K. (Hansol Kim) and S.Y.W.; data curation, H.K. (Hyungjin Kim); writing—original draft preparation, H.K. (Hansol Kim) and S.Y.W.; writing—review and editing, S.Y.W. and H.K. (Hyungjin Kim); supervision, H.K. (Hyungjin Kim). All authors have read and agreed to the published version of the manuscript.

Funding

This work was supported in part by NRF funded by the Korean government (RS-2024-00405200, 50%, RS-2024-00406790, 20%, RS-2023-00270126, 20%), in part by the MSIT (Ministry of Science and ICT), Korea, under the ITRC (Information Technology Research Center) support program (IITP-2021-0-02052, 10%) supervised by the IITP (Institute for Information & Communications Technology Planning & Evaluation), and in part by the Brain Korea 21 Four Program. The EDA tool was provided by the IC Design Education Center (IDEC), Korea.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

The original contributions presented in the study are included in the article; further inquiries can be directed to the corresponding authors.

Conflicts of Interest

The authors declare no conflicts of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript; or in the decision to publish the results.

References

  1. Yu, S. Neuro-inspired computing with emerging nonvolatile memories. Proc. IEEE 2018, 106, 260–285. [Google Scholar] [CrossRef]
  2. Masquelier, T.; Thorpe, S.J. Unsupervised learning of visual features through spike timing dependent plasticity. PLoS Comput. Biol. 2007, 3, 247–257. [Google Scholar] [CrossRef] [PubMed]
  3. Wu, X.; Saxena, V.; Zhu, K. Homogeneous Spiking Neuromorphic System for Real-World Pattern Recognition. IEEE J. Emerg. Sel. Top. Circuits Syst. 2015, 5, 254–266. [Google Scholar] [CrossRef]
  4. Bichler, O.; Querlioz, D.; Thorpe, S.J.; Bourgoin, J.-P.; Gamrat, C. Unsupervised features extraction from asynchronous silicon retina through Spike-Timing-Dependent Plasticity. In Proceedings of the 2011 International Joint Conference on Neural Networks, San Jose, CA, USA, 31 July–5 August 2011; pp. 859–866. [Google Scholar]
  5. Qiao, N.; Mostafa, H.; Corradi, F.; Osswald, M.; Stefanini, F.; Sumislawska, D.; Indiveri, G. A reconfigurable on-line learning spiking neuromorphic processor comprising 256 neurons and 128K synapses. Front. Neurosci. 2018, 9, 141. [Google Scholar] [CrossRef] [PubMed]
  6. Ferre, P.; Mamalet, F.; Thorpe, S.J. Unsupervised Feature Learning with Winner-Takes-All Based STDP. Front. Comput. Neurosci. 2018, 12, 24. [Google Scholar] [CrossRef] [PubMed]
  7. Diehl, P.U.; Cook, M. Unsupervised learning of digit recognition using spike-timing-dependent plasticity. Front. Comput. Neurosci. 2015, 9, 99. [Google Scholar] [CrossRef] [PubMed]
  8. Querlioz, D.; Bichler, O.; Dollfus, P.; Gamrat, C. Immunity to device variations in a spiking neural network with memristive nanodevices. IEEE Trans. Nanotechnol. 2013, 12, 288–295. [Google Scholar] [CrossRef]
  9. Park, J.; Kim, S.; Song, M.S.; Youn, S.; Kim, K.; Kim, H. Implementation of convolutional neural networks in memristor crossbar arrays with binary activation and weight quantization. ACS Appl. Mater. Interfaces 2024, 16, 1054–1065. [Google Scholar] [CrossRef] [PubMed]
  10. Wang, Y.; Yin, L.; Huang, W.; Li, Y.; Huang, S.; Zhu, Y.; Yang, D.; Pi, X. Optoelectronic synaptic devices for neuromorphic computing. Adv. Intell. Syst. 2021, 3, 2000099. [Google Scholar] [CrossRef]
  11. Florini, D.; Gandolfi, D.; Mapelli, J.; Benatti, L.; Pavan, P.; Puglisi, F.M. A Hybrid CMOS-Memristor Spiking Neural Network Supporting Multiple Learning Rules. IEEE Trans. Neural Netw. Learn. Syst. 2024, 35, 5117–5129. [Google Scholar] [CrossRef]
  12. Milo, V.; Pedretti, G.; Carboni, R.; Calderoni, A.; Ramaswamy, N.; Ambrogio, S.; Ielmini, D. Demonstration of hybrid CMOS/RRAM neural networks with spike time/rate-dependent plasticity. In Proceedings of the 2016 IEEE International Electron Devices Meeting (IEDM), San Francisco, CA, USA, 3–7 December 2016. [Google Scholar]
  13. Cooper, L.; Bear, M. The BCM theory of synapse modification at 30: Interaction of theory with experiment. Nat. Rev. Neurosci. 2012, 13, 798–810. [Google Scholar] [CrossRef]
  14. Ahmed, T.; Walia, S.; Mayes, E.L.H.; Ramanathan, R.; Bansal, V.; Bhaskaran, M.; Sriram, S.; Kavehei, O. Time and rate dependent synaptic learning in neuro-mimicking resistive memories. Sci. Rep. 2019, 9, 15404. [Google Scholar] [CrossRef]
  15. Fernandes, D.; Carvalho, A.L. Mechanisms of homeostatic plasticity in the excitatory synapse. J. Neurochem. 2016, 139, 973–996. [Google Scholar] [CrossRef]
  16. Wang, G.; Gilbert, J.; Man, H.-Y. AMPA Receptor Trafficking in Homeostatic Synaptic Plasticity: Functional Molecules and Signaling Cascades. Neural Plast. 2012, 2012, 825364. [Google Scholar] [CrossRef]
  17. Tien, N.W.; Kerschensteiner, D. Homeostatic plasticity in neural development. Neural Dev. 2018, 13, 9. [Google Scholar] [CrossRef]
  18. Chowdhury, D.; Hell, J.W. Homeostatic synaptic scaling: Molecular regulators of synaptic AMPA-type glutamate receptors. F1000Research 2018, 7, 234. [Google Scholar] [CrossRef] [PubMed]
  19. Kim, K.; Song, M.S.; Hwang, H.; Hwang, S.; Kim, H. A comprehensive review of advanced trends: From artificial synapses to neuromorphic systems with consideration of non-ideal effects. Front. Neurosci. 2024, 18, 1279708. [Google Scholar] [CrossRef] [PubMed]
  20. Cho, S. Volatile and nonvolatile memory devices for neuromorphic and processing-in-memory applications. J. Semicond. Technol. Sci. 2022, 22, 30–46. [Google Scholar] [CrossRef]
  21. Lee, G.H.; Song, M.S.; Kim, S.; Yim, J.; Hwang, S.; Yu, J.; Kwon, D.; Kim, H. Ferroelectric field-effect transistors for binary neural network with 3-D NAND architecture. IEEE Trans. Electron Devices 2022, 69, 6438–6445. [Google Scholar] [CrossRef]
  22. Li, Z.; Meng, J.; Yu, J.; Liu, Y.; Wang, T.; Liu, P.; Chen, S.; Zhu, H.; Sun, Q.; Zhang, D.W.; et al. CMOS compatible low power consumption ferroelectric synapse for neuromorphic computing. IEEE Electron Device Lett. 2023, 44, 532–535. [Google Scholar] [CrossRef]
  23. Kumar, A.; Krishnaiah, M.; Park, J.; Mishra, D.; Dash, B.; Jo, H.-B.; Lee, G.; Youn, S.; Kim, H.; **, S.H. Multibit, lead-free Cs2SnI6 resistive random access memory with self-compliance for improved accuracy in binary neural network application. Adv. Funct. Mater. 2024, 34, 2310780. [Google Scholar] [CrossRef]
  24. Kim, J.P.; Kim, S.K.; Park, S.; Kuk, S.-H.; Kim, T.; Kim, B.H.; Ahn, S.-H.; Cho, Y.-H.; Jeong, Y.; Choi, S.-Y.; et al. Dielectric-engineered high-speed, low-power, highly reliable charge trap flash-based synaptic device for neuromorphic computing beyond inference. Nano Lett. 2023, 23, 451–461. [Google Scholar] [CrossRef] [PubMed]
  25. Youn, S.; Lee, J.; Kim, S.; Park, J.; Kim, K.; Kim, H. Programmable threshold logic implementations in a memristor crossbar array. Nano Lett. 2024, 24, 3581–3589. [Google Scholar] [CrossRef] [PubMed]
  26. Han, X.; Zhang, J.; Zeng, T.; Zhao, X.; Li, J.; Sun, H.; Cao, Y.; Tong, Y.; Tang, Q.; Liu, Y. Super-flexible, transparent synaptic transistors based on pullulan for neuromorphic electronics. IEEE Electron Device Lett. 2023, 44, 606–609. [Google Scholar] [CrossRef]
  27. Kim, S.; Park, J.; Kim, T.-H.; Hong, K.; Hwang, Y.; Park, B.G.; Kim, H. 4-bit Multilevel Operation in Overshoot Suppressed Al2O3/TiOx Resistive Random-Access Memory Crossbar Array. Adv. Intell. Syst. 2022, 4, 2100273. [Google Scholar] [CrossRef]
  28. Hwang, S.; Yu, J.; Song, M.S.; Hwang, H.; Kim, H. Memcapacitor crossbar array with charge trap NAND flash structure for neuromorphic computing. Adv. Sci. 2023, 10, 2303817. [Google Scholar] [CrossRef] [PubMed]
  29. Kim, T.-H.; Kim, S.; Hong, K.; Park, J.; Youn, S.; Lee, J.-H.; Park, B.-G.; Kim, H. Effect of program error in memristive neural network with weight quantization. IEEE Trans. Electron Devices 2022, 69, 3151–3157. [Google Scholar] [CrossRef]
  30. Werner, T.; Vianello, E.; Bichler, O.; Grossi, A.; Nowak, E.; Nodin, J.-F.; Yvert, B.; DeSalvo, B.; Perniola, L. Experimental demonstration of short and long term synaptic plasticity using OxRAM multi k-bit arrays for reliable detection in highly noisy input data. In Proceedings of the IEEE International Electron Devices Meeting (IEDM), San Francisco, CA, USA, 3–7 December 2016. [Google Scholar]
  31. Wang, Z.Q.; Xu, H.Y.; Li, X.H.; Yu, H.; Liu, Y.C.; Zhu, X.J. Synaptic learning and memory functions achieved using oxygen ion migration/diffusion in an amorphous InGaZnO memristor. Adv. Funct. Mater. 2012, 22, 2759–2765. [Google Scholar] [CrossRef]
  32. Oh, S.; Huang, Z.; Shi, Y.; Kuzum, D. The impact of resistance drift of phase change memory (PCM) synaptic devices on artificial neural network performance. IEEE Electron Device Lett. 2019, 40, 1325–1328. [Google Scholar] [CrossRef]
  33. Bianchi, S.; Munoz-Martin, I.; Hashemkhani, S.; Pedretti, G.; Ielmini, D. A bio-inspired recurrent neural network with self-adaptive neurons and PCM synapses for solving reinforcement learning tasks. In Proceedings of the IEEE International Symposium on Circuits and Systems (ISCAS), Seville, Spain, 10–21 October 2020; pp. 1–5. [Google Scholar]
  34. Kang, M.; Park, J. Peripheral circuit optimization with precharge technique of spin transfer torque MRAM synapse array. In Proceedings of the International Technical Conference on Circuits/Systems, Computers and Communications (ITC-CSCC), Jeju, Republic of Korea, 25–28 June 2021; pp. 1–3. [Google Scholar]
  35. Feng, Y.; Huang, P.; Zhao, Y.; Shan, Y.; Zhang, Y.; Zhou, Z.; Liu, L.; Liu, X.; Kang, J. Improvement of state stability in multi-level resistive random-access memory (RRAM) array for neuromorphic computing. IEEE Electron Device Lett. 2021, 42, 1168–1171. [Google Scholar] [CrossRef]
  36. Li, Y.; Ang, K.-W. Hardware implementation of neuromorphic computing using large-scale memristor crossbar arrays. Adv. Intell. Syst. 2021, 3, 2000137. [Google Scholar] [CrossRef]
  37. Jeon, K.; Kim, J.; Ryu, J.J.; Yoo, S.-J.; Song, C.; Yang, M.K.; Jeong, D.S.; Kim, G.H. Self-rectifying resistive memory in passive crossbar arrays. Nat. Commun. 2021, 12, 2968. [Google Scholar] [CrossRef] [PubMed]
  38. Hsieh, E.; Zheng, X.; Le, B.; Shih, Y.; Radway, R.; Nelson, M.; Mitra, S.; Wong, S. Four-bits-per-memory one-transistor-and-eight-resistive-random-access-memory (1T8R) array. IEEE Electron Device Lett. 2021, 42, 335–338. [Google Scholar] [CrossRef]
  39. Rajendran, B.; Alibart, F. Neuromorphic Computing Based on Emerging Memory Technologies. IEEE J. Emerg. Sel. Top. Circuits Syst. 2016, 6, 198–211. [Google Scholar] [CrossRef]
  40. Yang, J.Q.; Zhou, Y.; Han, S.T. Functional Applications of Future Data Storage Devices. Adv. Electron Mater. 2021, 7, 2001181. [Google Scholar] [CrossRef]
  41. Li, S.; Lyu, H.; Li, J.; He, Y.; Gao, X.; Wan, Q.; Shi, Y.; Pan, L. Multiterminal ionic synaptic transistor with artificial blink reflex function. IEEE Electron Device Lett. 2021, 42, 351–354. [Google Scholar] [CrossRef]
  42. Lee, S.-T.; Lim, S.; Choi, N.Y.; Bae, J.-H.; Kwon, D.; Park, B.-G.; Lee, J.-H. Operation scheme of multi-layer neural networks using nand flash memory as high-density synaptic devices. IEEE J. Electron Devices Soc. 2019, 7, 1085–1093. [Google Scholar] [CrossRef]
  43. Park, Y.J.; Kwon, H.T.; Kim, B.; Lee, W.J.; Wee, D.H.; Choi, H.-S.; Park, B.-G.; Lee, J.-H.; Kim, Y. 3-D stacked synapse array based on charge-trap flash memory for implementation of deep neural networks. IEEE Trans. Electron Devices 2018, 66, 420–427. [Google Scholar] [CrossRef]
  44. Woo, S.Y.; Choi, K.-B.; Kim, J.; Kang, W.-M.; Kim, C.-H.; Seo, Y.-T.; Bae, J.-H.; Park, B.-G.; Lee, J.-H. Implementation of homeostasis functionality in neuron circuit using double-gate device for spiking neural network. Solid-State Electron. 2020, 165, 107741. [Google Scholar] [CrossRef]
  45. Wu, H.; Yao, P.; Gao, B.; Wu, W.; Zhang, Q.; Zhang, W.; Deng, N.; Wu, D.; Wong, H.-S.P.; Yu, S.; et al. Device and circuit optimization of RRAM for neuromorphic computing. In Proceedings of the 2017 IEEE International Electron Devices Meeting (IEDM), San Francisco, CA, USA, 2–6 December 2017. [Google Scholar]
  46. Lim, S.; Bae, J.-H.; Eum, J.-H.; Lee, S.; Kim, C.-H.; Kwon, D.; Park, B.-G.; Lee, J.-H. Adaptive learning rule for hardware-based deep neural networks using electronic synapse devices. Neural Comput. Appl. 2018, 31, 8101–8116. [Google Scholar] [CrossRef]
  47. Kwon, D.; Lim, S.; Bae, J.-H.; Lee, S.-T.; Kim, H.; Kim, C.-H.; Park, B.-G.; Lee, J.-H. Adaptive Weight Quantization Method for Nonlinear Synaptic Devices. IEEE Trans. Electron Devices 2019, 66, 395–401. [Google Scholar] [CrossRef]
  48. Woo, S.Y.; Choi, K.-B.; Lim, S.; Lee, S.-T.; Kim, C.-H.; Kang, W.-M.; Kwon, D.; Bae, J.-H.; Park, B.-G.; Lee, J.-H. Synaptic device using a floating fin-body MOSFET with memory functionality for neural network. Solid-State Electron. 2019, 156, 23–27. [Google Scholar] [CrossRef]
  49. Bartolozzi, C.; Nikolayeva, O.; Indiveri, G. Implementing homeostatic plasticity in VLSI networks of spiking neurons. In Proceedings of the 15th IEEE International Conference on Electronics, Circuits and Systems, Saint Julian’s, Malta, 31 August–3 September 2008; pp. 682–685. [Google Scholar]
  50. Bartolozzi, C.; Indiveri, G. Global scaling of synaptic efficacy: Homeostasis in silicon synapses. Neurocomputing 2009, 72, 726–731. [Google Scholar] [CrossRef]
  51. Rovere, G.; Ning, Q.; Bartolozzi, C.; Indiveri, G. Ultra Low Leakage Synaptic Scaling Circuits for Implementing Homeostatic Plasticity in Neuromorphic Architectures. In Proceedings of the 2014 IEEE International Symposium on Circuits and Systems (ISCAS), Melbourne, VIC, Australia, 1–5 June 2014; pp. 2073–2076. [Google Scholar]
  52. Qiao, N.; Indiveri, G.; Bartolozzi, C. Automatic gain control of ultra-low leakage synaptic scaling homeostatic plasticity circuits. In Proceedings of the 2016 IEEE Biomedical Circuits and Systems Conference (BioCAS), Shanghai, China, 17–19 October 2016; pp. 156–159. [Google Scholar]
  53. Shi, X.; Zeng, Z.; Yang, L.; Huang, Y. Memristor-Based Circuit Design for Neuron with Homeostatic Plasticity. IEEE Trans. Emerg. Top. Comput. Intell. 2018, 2, 359–370. [Google Scholar] [CrossRef]
  54. Zjajo, A. Dynamic Homeostatic Regulation in Energy-Efficient Time-Locked Neuromorphic Systems. In Proceedings of the 2020 IEEE 20th International Conference on Bioinformatics and Bioengineering (BIBE), Cincinnati, OH, USA, 26–28 October 2020; pp. 719–722. [Google Scholar]
  55. Zhao, Z.; Qu, L.; Wang, L.; Deng, Q.; Li, N.; Kang, Z.; Guo, S.; Xu, W. A Memristor-Based Spiking Neural Network with High Scalability and Learning Efficiency. IEEE Trans. Circuits Syst. II Express Briefs 2020, 67, 931–935. [Google Scholar] [CrossRef]
  56. Johnson, A.P.; Liu, J.; Millard, A.G.; Karim, S.; Tyrrell, A.M.; Harkin, J.; Timmis, J.; McDaid, L.J.; Halliday, D.M. Homeostatic Fault Tolerance in Spiking Neural Networks: A Dynamic Hardware Perspective. IEEE Trans. Circuits Syst. I Regul. Pap. 2018, 65, 687–699. [Google Scholar] [CrossRef]
  57. Kang, W.-M.; Kim, C.-H.; Lee, S.; Woo, S.Y.; Bae, J.-H.; Park, B.-G.; Lee, J.-H. A spiking neural network with a global self-controller for unsupervised learning based on spike-timing-dependent plasticity using flash memory synaptic devices. In Proceedings of the 2019 International Joint Conference on Neural Networks (IJCNN), Budapest, Hungary, 14–19 July 2019; pp. 1–7. [Google Scholar]
  58. Goda, A. Recent Progress on 3D NAND Flash Technologies. Electronics 2021, 10, 3156. [Google Scholar] [CrossRef]
  59. Woo, S.Y.; Kang, W.M.; Seo, Y.T.; Lee, S.; Kwon, D.; Oh, S.; Bae, J.-H.; Lee, J.-H. Demonstration of integrate-and-fire neuron circuit for spiking neural networks. Solid-State Electron. 2022, 198, 108481. [Google Scholar] [CrossRef]
  60. Kim, C.-H.; Lee, S.; Woo, S.Y.; Kang, W.-M.; Lim, S.; Bae, J.-H.; Kim, J.; Lee, J.-H. Demonstration of Unsupervised Learning with Spike-Timing-Dependent Plasticity Using a TFT-Type NOR Flash Memory Array. IEEE Trans. Electron Devices 2018, 65, 1774–1780. [Google Scholar] [CrossRef]
Figure 1. (a) Simple diagram of a biological neuron and synapse. The neuron is composed of a nucleus, dendrites, an axon, and axon terminals. A synapse is a connection between a pre-synaptic neuron and a post-synaptic neuron. (b) A model of homeostatic regulation, which stabilizes the firing rate of neurons by synaptic scaling.
Figure 2. (a–l) Steps for integration of the fabricated S-G FET, n-type FET, p-type FET, and memory devices on the same wafer.
Figure 3. (a) 3D schematic view and (b) cross-sectional TEM image of the fabricated S-G FET.
Figure 4. (a) The measured ID–VG curves of the n-type FET and p-type FET fabricated on the same wafer as the S-G FET. (b) The measured ID–VG1 curves of the S-G FET with VG2 as a parameter, varied from −2 V to 2 V. The solid and open symbols represent transfer curves at VD of 0.1 V and 1 V, respectively.
Figure 5. (a) ID-VG1 curves of the S-G FET as a function of the number of pulses applied to G2. (b) Vth variation in ID-VG1 curves of the S-G FET with the number of program/erase pulses applied to G2 at an ID of 10 nA. (c) Retention characteristics (Vth change) of the S-G FET over time. (d) SS variation in the G1 and G2 DC sweep of the S-G FET with the number of program/erase cycles applied to G2.
Figure 6. (a) Schematic of IF neuron circuit integrated with S-G FET. (b) Simulated operational characteristics demonstrating the increase in the neuron circuit threshold during the learning process.
Figure 7. (a) Basic configuration circuit, including the extended pulse generator and voltage level shifter, for program/erase operations. (b) Simulated voltage–time plots for modulation of the VPGM pulse from the output pulse of the neuron circuit using the basic configuration circuit.
Figure 8. (a) A diagram of a fully connected neural network with a synapse array and neurons. (b) Synaptic weight updates (LTP/LTD) and operation scheme of a TFT memory device [60]. (c) A block diagram of the neuromorphic system consisting of a synapse array, neurons, and a common controller. (d) The optimized threshold voltages of output neurons with the input patterns. (e) Comparison of the pattern recognition accuracy in the STDP-based SNN system with/without the homeostasis functionality. (f) The degradation of the recognition rate with increasing conductance variation (σ/μ) of the synaptic devices.
