
Full metadata record

DC Field                    Value
dc.contributor.author       반효경
dc.date.accessioned         2023-10-26T16:30:03Z
dc.date.available           2023-10-26T16:30:03Z
dc.date.issued              2023
dc.identifier.issn          1546-2218
dc.identifier.issn          1546-2226
dc.identifier.other         OAK-33831
dc.identifier.uri           https://dspace.ewha.ac.kr/handle/2015.oak/266467
dc.description.abstract     Due to the recent trend of software intelligence in the Fourth Industrial Revolution, deep learning has become a mainstream workload for modern computer systems. Since the data size of deep learning increasingly grows, managing the limited memory capacity efficiently for deep learning workloads becomes important. In this paper, we analyze memory accesses in deep learning workloads and find out some unique characteristics differentiated from traditional workloads. First, when comparing instruction and data accesses, data access accounts for 96%-99% of total memory accesses in deep learning workloads, which is quite different from traditional workloads. Second, when comparing read and write accesses, write access dominates, accounting for 64%-80% of total memory accesses. Third, although write access makes up the majority of memory accesses, it shows a low access bias of 0.3 in the Zipf parameter. Fourth, in predicting re-access, recency is important in read access, but frequency provides more accurate information in write access. Based on these observations, we introduce a Non-Volatile Random Access Memory (NVRAM)-accelerated memory architecture for deep learning workloads, and present a new memory management policy for this architecture. By considering the memory access characteristics of deep learning workloads, the proposed policy improves memory performance by 64.3% on average compared to the CLOCK policy.
dc.language                 English
dc.publisher                TECH SCIENCE PRESS
dc.subject                  Memory access
dc.subject                  deep learning
dc.subject                  machine learning
dc.subject                  memory management
dc.subject                  CLOCK
dc.title                    Characterization of Memory Access in Deep Learning and Its Implications in Memory Management
dc.type                     Article
dc.relation.issue           1
dc.relation.volume          76
dc.relation.index           SCIE
dc.relation.index           SCOPUS
dc.relation.startpage       607
dc.relation.lastpage        629
dc.relation.journaltitle    CMC-COMPUTERS MATERIALS & CONTINUA
dc.identifier.doi           10.32604/cmc.2023.039236
dc.identifier.wosid         WOS:001018251400003
dc.author.google            Lee, Jeongha
dc.author.google            Bahn, Hyokyung
dc.contributor.scopusid     반효경(7003994561)
dc.date.modifydate          20240315133816
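
As background for the comparison baseline named in the abstract, the following is a minimal sketch of the classic CLOCK page-replacement policy in Python. It is a generic textbook implementation, not the NVRAM-aware policy proposed in the paper; the cache capacity and the synthetic access trace are illustrative assumptions.

# Minimal sketch of the classic CLOCK page-replacement policy (the baseline
# the abstract compares against). Generic illustration only; the capacity and
# trace below are assumed for demonstration, not taken from the paper.

class ClockCache:
    def __init__(self, capacity):
        self.capacity = capacity   # number of page frames
        self.frames = []           # resident pages, in clock order
        self.ref_bit = {}          # page -> reference bit
        self.hand = 0              # current clock-hand position

    def access(self, page):
        """Access a page; return True on a hit, False on a miss."""
        if page in self.ref_bit:
            self.ref_bit[page] = 1         # hit: set the reference bit
            return True
        if len(self.frames) < self.capacity:
            self.frames.append(page)       # free frame still available
            self.ref_bit[page] = 1
            return False
        # Miss with a full cache: sweep the hand, clearing reference bits
        # until a page with bit 0 is found, then evict that page.
        while self.ref_bit[self.frames[self.hand]] == 1:
            self.ref_bit[self.frames[self.hand]] = 0
            self.hand = (self.hand + 1) % self.capacity
        victim = self.frames[self.hand]
        del self.ref_bit[victim]
        self.frames[self.hand] = page
        self.ref_bit[page] = 1
        self.hand = (self.hand + 1) % self.capacity
        return False

if __name__ == "__main__":
    # Tiny synthetic trace (assumed), just to exercise the policy.
    cache = ClockCache(capacity=3)
    trace = [1, 2, 3, 1, 4, 1, 2, 5, 1, 2]
    hits = sum(cache.access(p) for p in trace)
    print(f"hit ratio: {hits / len(trace):.2f}")

Note that CLOCK keeps only a single reference bit per page (a coarse recency signal) and treats read and write accesses identically; the abstract's findings indicate that write-dominant deep learning workloads are better served when write re-accesses are predicted by frequency, which is the gap the proposed policy addresses.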
Appears in Collections:
College of Artificial Intelligence > Department of Computer Science & Engineering > Journal papers
Files in This Item:
There are no files associated with this item.