Degree Name

MS (Master of Science)

Program

Mathematical Sciences

Date of Award

12-2025

Committee Chair or Co-Chairs

Jeff Knisley

Committee Members

Rodney Keaton, Robert A. Beeler

Abstract

This thesis investigates grokking, the delayed transition from memorization to generalization in neural networks trained on deterministic chaotic data. Using an integer-arithmetic discretization of the logistic map, y_{n+1} = a·y_n(p − y_n)/p², bounded aperiodic sequences were generated for control parameter values α ranging from 3.0 to 4.0. Transformer-based models displayed characteristic grokking curves: in the periodic and chaotic regimes, validation accuracy rose suddenly after long plateaus, while at the Feigenbaum boundary (α ≈ 3.57) generalization failed completely. Increasing data diversity restored learning in chaotic domains, and explicit α-conditioning enabled a single network to generalize across all regimes. A bifurcation diagram constructed from model-predicted data reproduced the main features of the true logistic map, confirming that the network captured the underlying dynamics. These results link delayed generalization in deep learning to structural transitions in deterministic chaos.
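For illustration, the following minimal sketch iterates the integer-arithmetic map stated in the abstract, y_{n+1} = a·y_n(p − y_n)/p². The scale p = 10**6, the encoding a = round(α·p), and the function name are assumptions for this sketch; the abstract does not specify the thesis's actual parameters or implementation.

```python
# Minimal sketch of the integer-arithmetic logistic map from the abstract.
# ASSUMPTIONS (not stated in the abstract): the integer parameter a encodes
# the control parameter as a = round(alpha * p), and p = 10**6.

def integer_logistic_sequence(alpha: float, p: int, y0: int, length: int) -> list[int]:
    """Generate a length-`length` orbit of y_{n+1} = a*y_n*(p - y_n) // p**2.

    alpha: control parameter in [3.0, 4.0]
    p:     integer scale, so y/p approximates the usual state x in [0, 1]
    y0:    initial state, an integer in (0, p)
    """
    a = round(alpha * p)  # assumed integer encoding of alpha
    seq = [y0]
    y = y0
    for _ in range(length - 1):
        # Exact integer update; floor division keeps the state in [0, p]
        # for alpha <= 4, mirroring the bounded orbits of the logistic map.
        y = (a * y * (p - y)) // (p * p)
        seq.append(y)
    return seq

# Example: a chaotic orbit at alpha = 3.9 (hypothetical settings).
orbit = integer_logistic_sequence(alpha=3.9, p=10**6, y0=123_456, length=10)
print(orbit)
```

Working in exact integer arithmetic makes the sequences reproducible token targets for a transformer, since no floating-point rounding enters the dynamics.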

Document Type

Thesis - unrestricted

Copyright

Copyright by the author.
