Comparing Traditional Caching Algorithms and AI-assisted Techniques in Modern CPU Environments

Abstract

Modern central processing units (CPUs) rely on caching mechanisms to mitigate the performance gap between fast processors and comparatively slower main memory. Traditional caching algorithms such as Least Recently Used (LRU), Least Frequently Used (LFU), and First-In First-Out (FIFO) have long served as standard approaches due to their simplicity and predictable behavior. However, increasingly dynamic workloads driven by artificial intelligence, cloud computing, and multi-core architectures challenge the effectiveness of these static methods. Recent research suggests that artificial intelligence–based caching techniques may improve cache efficiency by adapting to evolving access patterns, yet few studies have directly compared these approaches to traditional algorithms within CPU environments. This study proposes a quantitative experimental investigation comparing classical and AI-driven caching methods using a controlled CPU simulation environment. Performance will be evaluated based on cache hit ratio, memory access latency, and computational overhead, with statistical analysis conducted using independent-samples t-tests. By examining both efficiency gains and associated costs, the study aims to determine whether AI-assisted caching offers practical advantages over traditional techniques under realistic processor constraints. The findings are expected to contribute to computer architecture research by providing empirical evidence regarding the feasibility and trade-offs of integrating adaptive learning mechanisms into CPU cache management.
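To make the baseline policies named in the abstract concrete, the sketch below compares the cache hit ratios of LRU and FIFO replacement on a synthetic, skewed access trace. It is an illustrative example only, not the study's simulator: the cache capacity (256 lines), trace length, hot-set size, and skew probability are assumed values chosen for demonstration.

```python
# Illustrative sketch (not the study's simulator): hit ratios of LRU vs. FIFO
# replacement on a synthetic, skewed memory-access trace. All parameters below
# (capacity, trace length, hot-set size, skew) are assumed for demonstration.
from collections import OrderedDict, deque
import random

def simulate_lru(trace, capacity):
    """Return the hit ratio of an LRU cache of the given capacity."""
    cache = OrderedDict()  # keys ordered from least to most recently used
    hits = 0
    for addr in trace:
        if addr in cache:
            hits += 1
            cache.move_to_end(addr)        # mark as most recently used
        else:
            if len(cache) >= capacity:
                cache.popitem(last=False)  # evict the least recently used line
            cache[addr] = True
    return hits / len(trace)

def simulate_fifo(trace, capacity):
    """Return the hit ratio of a FIFO cache of the given capacity."""
    cache = set()
    order = deque()  # insertion order used for eviction
    hits = 0
    for addr in trace:
        if addr in cache:
            hits += 1
        else:
            if len(cache) >= capacity:
                cache.discard(order.popleft())  # evict the oldest line
            cache.add(addr)
            order.append(addr)
    return hits / len(trace)

if __name__ == "__main__":
    random.seed(42)
    # Skewed synthetic trace: a small "hot" set of addresses is accessed
    # far more often than the rest of the address space.
    trace = [random.randrange(64) if random.random() < 0.8
             else random.randrange(4096) for _ in range(100_000)]
    for name, sim in (("LRU", simulate_lru), ("FIFO", simulate_fifo)):
        print(f"{name} hit ratio: {sim(trace, capacity=256):.3f}")
```

Per-trial hit ratios of this kind, gathered over repeated runs for each policy, are the sort of samples an independent-samples t-test (for example, scipy.stats.ttest_ind) could compare, in line with the statistical analysis the abstract describes.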

Start Time

April 15, 2026, 3:30 PM

End Time

April 15, 2026, 4:30 PM

Room Number

219

Presentation Type

Oral Presentation

Presentation Subtype

UG Orals

Presentation Category

Science, Technology, and Engineering

Student Type

Undergraduate Student

Faculty Mentor

Chandler Scott
