Resistive Switching Devices and Their Applications for Computing Beyond von Neumann Architecture
As the demand for processing artificial intelligence (AI) and cognitive tasks increases, new devices and computing architectures that can alleviate the cost of the memory bottleneck have gained significant interest. One emerging device that can enable non-von Neumann architectures such as neuromorphic computing and in-memory computing, resistive random-access memory (RRAM), has been extensively studied. In this thesis, I will discuss the optimization of RRAM devices as well as their application to machine learning tasks and combinatorial optimization problems.
I will first introduce an experimental demonstration of feature extraction and dimensionality reduction using tantalum oxide (TaOx)-based analog RRAM devices. In the second project, an RRAM structure that offers very low power consumption and a large on/off ratio is developed using a copper active electrode and atomic-layer-deposited Al2O3 layers, targeting low-power in-memory computation and digital neuromorphic computing applications. Beyond device optimization, I will present two projects demonstrating applications of RRAM devices: RRAM-based hardware acceleration of simulated annealing for the two-dimensional spin glass problem, and stochastic learning of deep neural networks. Finally, a general application of RRAM arrays to combinatorial optimization problems is proposed as future work.
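The hardware implementation details are beyond the scope of this abstract, but the algorithm being accelerated can be sketched in software. Below is a minimal, illustrative Metropolis-style simulated annealer for a two-dimensional Edwards-Anderson spin glass with random ±1 nearest-neighbor couplings; all function names, the lattice size, and the cooling schedule are illustrative assumptions, not taken from the thesis itself:

```python
import math
import random

def anneal_spin_glass(n=8, steps=20000, t_start=3.0, t_end=0.05, seed=0):
    """Anneal an n x n spin glass with random +/-1 couplings (open boundaries)."""
    rng = random.Random(seed)
    # Random horizontal bonds Jh[i][j] couple (i, j)-(i, j+1);
    # vertical bonds Jv[i][j] couple (i, j)-(i+1, j).
    Jh = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
    Jv = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(n)]
    s = [[rng.choice([-1, 1]) for _ in range(n)] for _ in range(n)]

    def local_field(i, j):
        # Sum of J * s over the (up to four) lattice neighbors.
        h = 0
        if j + 1 < n:  h += Jh[i][j] * s[i][j + 1]
        if j - 1 >= 0: h += Jh[i][j - 1] * s[i][j - 1]
        if i + 1 < n:  h += Jv[i][j] * s[i + 1][j]
        if i - 1 >= 0: h += Jv[i - 1][j] * s[i - 1][j]
        return h

    def energy():
        # Ising energy E = -sum_{<ab>} J_ab * s_a * s_b over all bonds.
        e = 0
        for i in range(n):
            for j in range(n):
                if j + 1 < n: e -= Jh[i][j] * s[i][j] * s[i][j + 1]
                if i + 1 < n: e -= Jv[i][j] * s[i][j] * s[i + 1][j]
        return e

    e_initial = energy()
    for k in range(steps):
        t = t_start * (t_end / t_start) ** (k / steps)  # geometric cooling
        i, j = rng.randrange(n), rng.randrange(n)
        dE = 2 * s[i][j] * local_field(i, j)  # energy cost of flipping s[i][j]
        # Metropolis rule: accept downhill moves always, uphill with prob e^(-dE/T).
        if dE <= 0 or rng.random() < math.exp(-dE / t):
            s[i][j] = -s[i][j]
    return e_initial, energy()
```

In an RRAM-accelerated version, the point of the hardware is to evaluate the local fields and spin updates in parallel across the array rather than one spin at a time as in this serial sketch.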