We introduce Graph of Thoughts (GoT): a framework that
advances prompting capabilities in large language models
(LLMs) beyond those offered by paradigms such as
Chain-of-Thought or Tree of Thoughts (ToT). The key idea and
primary advantage of GoT is the ability to model the information
generated by an LLM as an arbitrary graph, where units of
information ("LLM thoughts") are vertices, and edges correspond
to dependencies between these vertices. This approach enables
combining arbitrary LLM thoughts into synergistic outcomes,
distilling the essence of whole networks of thoughts,
or enhancing thoughts using feedback loops. We illustrate
that GoT offers advantages over the state of the art on different
tasks, for example increasing the quality of sorting by 62%
over ToT, while simultaneously reducing costs by >31%.
We ensure that GoT is extensible with new thought
transformations and thus can be used to spearhead new prompting
schemes. This work brings LLM reasoning closer to human
thinking or brain mechanisms such as recurrence, both
of which form complex networks.
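The abstract describes thoughts as vertices and dependencies as edges, with aggregation and feedback loops as the transformations that distinguish a graph from a chain or tree. A minimal sketch of that structure is below; the names (`Thought`, `ThoughtGraph`, `aggregate`, `refine`) are illustrative assumptions, not the paper's official API.

```python
from dataclasses import dataclass, field

# Sketch of a "graph of thoughts": vertices hold LLM-generated text
# ("thoughts"); directed edges record which thoughts a new thought
# depends on. All class and method names are hypothetical.

@dataclass
class Thought:
    id: int
    text: str
    parents: list = field(default_factory=list)  # ids of dependency thoughts

class ThoughtGraph:
    def __init__(self):
        self.thoughts = {}
        self._next_id = 0

    def add(self, text, parents=()):
        t = Thought(self._next_id, text, list(parents))
        self.thoughts[t.id] = t
        self._next_id += 1
        return t

    def aggregate(self, parent_ids, combine):
        # Merge several thoughts into one: a vertex with in-degree > 1,
        # which a tree-shaped scheme like ToT cannot express.
        texts = [self.thoughts[p].text for p in parent_ids]
        return self.add(combine(texts), parent_ids)

    def refine(self, thought_id, improve):
        # Feedback loop: a new version of a thought depending on the old one.
        old = self.thoughts[thought_id]
        return self.add(improve(old.text), [thought_id])

# Toy usage mirroring the sorting example: split, solve, merge, refine.
g = ThoughtGraph()
a = g.add("sort [3,1]")
b = g.add("sort [2,4]")
merged = g.aggregate([a.id, b.id], lambda ts: " + ".join(ts))
better = g.refine(merged.id, lambda t: t + " (checked)")
```

In a real system each transformation would call an LLM; here plain functions stand in so the graph bookkeeping is visible on its own.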
Academic Publication: Graph of Thoughts: Solving Elaborate Problems with Large Language Models
Research Abstract & Technology Focus
Commercial Realization
Startups and open-source tools closely associated with the concepts explored in this paper.
- GitHub: World-Open-Graph/br-acc
- GitHub: Lum1104/Understand-Anything
- Product Hunt: HelixDB
- Product Hunt: NotebookLM Custom Infographic Styles
Market Trends