Degree Name
MS (Master of Science)
Program
Mathematical Sciences
Date of Award
12-2023
Committee Chair or Co-Chairs
Jeff R. Knisley
Committee Members
Michele L. Joyner, Robert B. Gardner
Abstract
Graph Neural Networks (GNNs) and Transformers are powerful frameworks for machine learning tasks. Although they were developed separately in different fields, recent research has revealed similarities and connections between them. This work bridges the gap between GNNs and Transformers by offering a unified framework that highlights their similarities and differences. We construct positional encodings and identify the key properties that allow them to serve as node embeddings, showing that these encodings achieve expressiveness, efficiency, and interpretability. The resulting embeddings can be used for machine learning tasks such as node classification, graph classification, and link prediction. We discuss remaining challenges and outline directions for future work.
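To illustrate the central idea of the abstract, a common way to obtain positional encodings that double as node embeddings is to use Laplacian eigenvectors, a standard choice in the GNN/Transformer literature. The sketch below is not the thesis's implementation; it is a minimal example assuming NumPy and NetworkX, with the function name chosen for illustration.

```python
# Minimal sketch: Laplacian eigenvector positional encodings used as node embeddings.
# Assumes NumPy and NetworkX; function name is illustrative, not from the thesis.
import numpy as np
import networkx as nx

def laplacian_positional_encoding(G, k=4):
    """Return the k lowest non-trivial Laplacian eigenvectors as per-node embeddings."""
    L = nx.normalized_laplacian_matrix(G).toarray()
    eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues returned in ascending order
    # Skip the trivial constant eigenvector (eigenvalue ~ 0); keep the next k columns.
    return eigvecs[:, 1:k + 1]

G = nx.karate_club_graph()
pe = laplacian_positional_encoding(G, k=4)
print(pe.shape)  # (34, 4): one 4-dimensional embedding per node
```

Such embeddings can then be fed to a downstream classifier or attention layer for node classification, graph classification, or link prediction.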
Document Type
Thesis - unrestricted
Recommended Citation
Manu, Bright Kwaku, "A Bridge between Graph Neural Networks and Transformers: Positional Encodings as Node Embeddings" (2023). Electronic Theses and Dissertations. Paper 4308. https://dc.etsu.edu/etd/4308
Copyright
Copyright by the author.
Included in
Artificial Intelligence and Robotics Commons, Data Science Commons, Discrete Mathematics and Combinatorics Commons, Theory and Algorithms Commons