An overview of processing-in-memory circuits for artificial intelligence and machine learning

Kim et al., 2022

Document ID: 4969503506706290698
Authors: Yu C, Xie S, Chen Y, Kim J, Kim B, Kulkarni J, Kim T
Publication year: 2022
Publication venue: IEEE Journal on Emerging and Selected Topics in Circuits and Systems

Snippet

Artificial intelligence (AI) and machine learning (ML) are revolutionizing many fields of study, such as visual recognition, natural language processing, autonomous vehicles, and prediction. Traditional von-Neumann computing architecture with separated processing …

Continue reading at ieeexplore.ieee.org (PDF)
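The snippet breaks off where the abstract introduces the von Neumann bottleneck that processing-in-memory (PIM) circuits target. As context, below is a minimal Python sketch (not from the paper; the function name, toy matrix, and transfer count are illustrative assumptions) of the matrix-vector multiply that dominates ML inference, written as an explicit fetch-then-compute loop. In a conventional architecture every weight crosses the memory/processor boundary once per use; a PIM array instead performs the multiply-accumulate inside the memory, so only inputs and results move.

```python
# Minimal sketch, not the paper's model: count how many weight words must be
# fetched from "memory" to compute y = W @ x in a fetch-then-compute loop.
# A processing-in-memory array would compute the per-row accumulation in place,
# avoiding these per-weight transfers.

def mvm_fetch_then_compute(W, x):
    """Return (W @ x, number of weight words moved across the memory boundary)."""
    transfers = 0
    y = []
    for row in W:
        acc = 0
        for w, xi in zip(row, x):
            transfers += 1          # one weight fetched per multiply-accumulate
            acc += w * xi
        y.append(acc)
    return y, transfers

W = [[1, 2, 3], [4, 5, 6]]          # toy 2x3 weight matrix (illustrative)
x = [7, 8, 9]
y, moved = mvm_fetch_then_compute(W, x)
print(y, "weights moved:", moved)   # [50, 122] weights moved: 6
```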

Classifications

    • G06F 7/53 Multiplying only in parallel-parallel fashion, i.e. both operands being entered in parallel
    • G06N 3/06 Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06F 7/505 Adding; Subtracting in bit-parallel fashion, i.e. having a different digit-handling circuit for each denomination
    • G11C 11/412 Digital stores using transistors forming static cells with positive feedback, i.e. cells not needing refreshing or charge regeneration, using field-effect transistors only
    • G06F 15/78 Architectures of general purpose stored programme computers comprising a single central processing unit
    • G06F 17/50 Computer-aided design
    • G06F 9/06 Arrangements for programme control using stored programme, i.e. using internal store of processing equipment to receive and retain programme
    • G06N 3/126 Genetic algorithms, i.e. information processing using digital simulations of the genetic system
    • G06N 99/005 Learning machines, i.e. computers in which a programme is changed according to experience gained by the machine itself during a complete run
    • G06F 12/02 Addressing or allocation; Relocation (within memory systems or architectures)
    • G11C 11/56 Digital stores using storage elements with more than two stable states represented by steps, e.g. of voltage, current, phase, frequency
    • G11C 15/04 Associative or content-addressed stores using semiconductor elements

Similar Documents

Publication | Publication date | Title
Kim et al. 2022 An overview of processing-in-memory circuits for artificial intelligence and machine learning
Agrawal et al. 2019 Xcel-RAM: Accelerating binary neural networks in high-throughput SRAM compute arrays
Kim et al. 2021 Colonnade: A reconfigurable SRAM-based digital bit-serial compute-in-memory macro for processing neural networks
Jhang et al. 2021 Challenges and trends of SRAM-based computing-in-memory for AI edge devices
Wang et al. 2019 A 28-nm compute SRAM with bit-serial logic/arithmetic operations for programmable in-memory vector computing
US10346347B2 (en) 2019-07-09 Field-programmable crossbar array for reconfigurable computing
Bavikadi et al. 2020 A review of in-memory computing architectures for machine learning applications
Bojnordi et al. 2016 Memristive Boltzmann machine: A hardware accelerator for combinatorial optimization and deep learning
Yin et al. 2019 Vesti: Energy-efficient in-memory computing accelerator for deep neural networks
Knag et al. 2020 A 617-TOPS/W all-digital binary neural network accelerator in 10-nm FinFET CMOS
Lee et al. 2021 A charge-domain scalable-weight in-memory computing macro with dual-SRAM architecture for precision-scalable DNN accelerators
Chang et al. 2019 PXNOR-BNN: In/with spin-orbit torque MRAM preset-XNOR operation-based binary neural networks
Yue et al. 2022 STICKER-IM: A 65 nm computing-in-memory NN processor using block-wise sparsity optimization and inter/intra-macro data reuse
Seo et al. 2022 Digital versus analog artificial intelligence accelerators: Advances, trends, and emerging designs
Ankit et al. 2020 Circuits and architectures for in-memory computing-based machine learning accelerators
Zhang et al. 2022 PIMCA: A programmable in-memory computing accelerator for energy-efficient DNN inference
Kang et al. 2020 Deep in-memory architectures in SRAM: An analog approach to approximate computing
Yavits et al. 2021 GIRAF: General purpose in-storage resistive associative framework
Jaiswal et al. 2020 I-SRAM: Interleaved wordlines for vector Boolean operations using SRAMs
Ali et al. 2022 Compute-in-memory technologies and architectures for deep learning workloads
Heo et al. 2022 T-PIM: An energy-efficient processing-in-memory accelerator for end-to-end on-device training
Shanbhag et al. 2022 Benchmarking in-memory computing architectures
Agrawal et al. 2020 CASH-RAM: Enabling in-memory computations for edge inference using charge accumulation and sharing in standard 8T-SRAM arrays
Angizi et al. 2022 Pisa: A binary-weight processing-in-sensor accelerator for edge image processing
Kang et al. 2021 S-FLASH: A NAND flash-based deep neural network accelerator exploiting bit-level sparsity