The recent PSU-OSPO meeting covered a range of topics, from the intricacies of working with large language models (LLMs) to plans for upcoming events like the Penn State hackathon. Notable presentations included Matt Viana's overview of fine-tuning LLMs and Dr. Carl Cotner's insights into the role of vectors in AI.

Attendees

Key Discussion Points

Working Memory, Short-Term Memory, Long-Term Memory

The meeting opened with a discussion of the different memory models and their applications in computing and artificial intelligence.

Presentation by Matt Viana: Fine-tuning Large Language Models (LLMs)

  • Context: Emphasis was placed on the importance of context in the functioning of LLMs.
  • Retrieval Augmentation: The use of external data sources to extend what an LLM can draw on was discussed (a minimal sketch follows this list).
  • The Model Itself: An overview of the underlying structure and functioning of LLMs was provided.
  • Memory: Working memory, short-term memory, and long-term memory were revisited in the context of how LLMs handle information.
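
A minimal sketch of the retrieval-augmentation idea discussed above, assuming a toy bag-of-words "embedding" in place of a real embedding model; the corpus, function names, and prompt format are illustrative, not from the presentation:

```python
import math
from collections import Counter

# Toy corpus standing in for an external knowledge source (illustrative only).
DOCS = [
    "HackPSU is Penn State's student-run hackathon.",
    "Word2Vec maps words to dense vectors.",
    "A torus wraps around at its edges like a Space Invaders screen.",
]

def embed(text: str) -> Counter:
    """Bag-of-words stand-in for a real embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    return sorted(DOCS, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

query = "What is HackPSU?"
context = "\n".join(retrieve(query))
# Retrieved text is prepended to the prompt, so the model answers from
# supplied context rather than from its weights alone.
prompt = f"Answer using this context:\n{context}\n\nQuestion: {query}"
print(prompt)
```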

Presentation by Dr. Carl Cotner: The Role of Vectors in AI

  • Vector Space Generalization:
    • Explained foundational concepts such as space, direction, magnitude, and clock (modular) arithmetic.
    • Clarified that vector spaces are not confined to three-dimensional space.
    • Discussed the relevance of real numbers versus integers when working in a vector space.
  • Applications and Examples:
    • A torus is analogous to a Space Invaders board: moving off one edge wraps around to the opposite edge (see the first sketch after this list).
    • A tic-tac-toe board is a useful way to picture a small, discrete vector space.
  • The Curse of Dimensionality:
    • Even three- and four-dimensional spaces are challenging to work with.
    • Spaces with five or more dimensions are difficult to conceptualize at all.
    • Example: a tesseract (a 4-dimensional hypercube) has 2^4 = 16 corners, since an n-dimensional hypercube has 2^n corners.
  • Deep Learning in High-Dimensional Spaces:
    • Discussed how deep learning thrives in high-dimensional spaces, where cosine similarity is used to compare vectors (see the second sketch after this list).
    • Noted that there appears to be an optimal embedding size for the vector spaces used with LLMs.
    • Word2Vec was cited as an example of converting objects (words) into vectors.
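
The wraparound analogy above can be made concrete. Here is a minimal sketch, assuming a small grid whose dimensions and move function are hypothetical: positions live in the discrete vector space (Z mod W) x (Z mod H), so stepping off one edge wraps to the opposite edge, exactly like a ship on a Space Invaders screen.

```python
W, H = 10, 8  # illustrative board width and height (not from the talk)

def move(pos: tuple[int, int], step: tuple[int, int]) -> tuple[int, int]:
    """Add two integer vectors componentwise, wrapping each coordinate
    with clock (modular) arithmetic; this gives the grid a torus topology."""
    return ((pos[0] + step[0]) % W, (pos[1] + step[1]) % H)

pos = (9, 7)             # bottom-right corner of the board
pos = move(pos, (1, 1))  # step off both edges at once
print(pos)               # (0, 0): wrapped around to the opposite corner
```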
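
Two of the points above are easy to verify in a few lines: the 2^n corner count of a hypercube, and the behavior of cosine similarity in high-dimensional spaces. The vector dimension of 300 below is an assumption chosen only for illustration (it happens to match a common Word2Vec embedding size):

```python
import itertools
import math
import random

# An n-dimensional hypercube has 2**n corners, since each coordinate is 0 or 1;
# for a tesseract (n = 4) that gives 16.
corners = list(itertools.product([0, 1], repeat=4))
print(len(corners))  # 16

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity: compare vectors by angle rather than by length."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

# In high dimensions, two random vectors are nearly orthogonal on average,
# so a clearly nonzero cosine similarity signals a meaningful relationship.
random.seed(0)
u = [random.gauss(0, 1) for _ in range(300)]
v = [random.gauss(0, 1) for _ in range(300)]
print(round(cosine(u, v), 3))  # close to 0.0
```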

Upcoming Events

  • Penn State Spring Hackathon (HackPSU): Scheduled for March 16th.