Degree Name

MS (Master of Science)

Program

Mathematical Sciences

Date of Award

12-2023

Committee Chair or Co-Chairs

Jeff R. Knisley

Committee Members

Michele L. Joyner, Robert B. Gardner

Abstract

Graph Neural Networks (GNNs) and Transformers are powerful frameworks for machine learning tasks. Although they evolved separately in different fields, recent research has revealed similarities and connections between them. This work bridges the gap between GNNs and Transformers by offering a unified framework that highlights their similarities and distinctions. We compute positional encodings and identify the key properties that allow them to serve as node embeddings, finding that expressiveness, efficiency, and interpretability are achieved in the process. We show that positional encodings can be used as node embeddings for machine learning tasks such as node classification, graph classification, and link prediction. We conclude by discussing open challenges and directions for future work.
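
To illustrate the idea of positional encodings serving as node embeddings, the following is a minimal sketch, not the thesis's exact method. It assumes the Laplacian eigenvector positional encoding, one common choice in work connecting GNNs and Transformers; the abstract does not specify which encoding is used, and the library choices (networkx, numpy, scikit-learn) and the toy karate-club classification task are likewise assumptions for illustration only.

```python
# Sketch: Laplacian eigenvector positional encodings used as node
# embeddings for a toy node-classification task. This is an assumed
# illustrative setup, not the method from the thesis.
import numpy as np
import networkx as nx
from sklearn.linear_model import LogisticRegression

def laplacian_positional_encoding(G: nx.Graph, k: int) -> np.ndarray:
    """Return the k eigenvectors of the normalized graph Laplacian
    with the smallest nonzero eigenvalues, one row per node."""
    L = nx.normalized_laplacian_matrix(G).toarray()
    eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    return eigvecs[:, 1 : k + 1]           # skip the trivial constant eigenvector

# Toy example: Zachary's karate club, predicting each member's club.
G = nx.karate_club_graph()
X = laplacian_positional_encoding(G, k=4)  # positional encodings as node features
y = np.array([G.nodes[v]["club"] == "Officer" for v in G.nodes])

clf = LogisticRegression().fit(X, y)
print("train accuracy:", clf.score(X, y))
```

The same feature matrix X could equally be fed to a GNN or a Transformer as initial node representations, or reused for graph classification (by pooling rows) and link prediction (by combining the rows of an edge's endpoints).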

Document Type

Thesis - unrestricted

Copyright

Copyright by the author.
