
Abstract

To address the surge in inference costs that chain-of-thought reasoning imposes on large language models, this research proposes a sparse attention mechanism that attends to only a small number of relevant tokens. The researcher constructed the new attention mechanism and used GiantRabbit, a model trained with custom GPTs, as an experimental tool. The experiment compared this model against o1 Preview on reasoning time, correctness score, and chain-of-thought length when solving linear algebra test questions from MIT OpenCourseWare. The results show that GiantRabbit's reasoning time and chain-of-thought length are significantly lower than those of o1 Preview, verifying the feasibility of the sparse attention mechanism for optimizing chain-of-thought reasoning. Detailed architectural and experimental details have been uploaded to GitHub: https://github.com/brucewang123456789/GeniusTrail.git.
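The record's abstract does not spell out the mechanism's internals, but a top-k sparse attention, where each query attends only to its k highest-scoring keys, is one minimal sketch consistent with "attends to only a small number of relevant tokens." The function name topk_sparse_attention, the choice of k, and the single-head NumPy formulation below are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def topk_sparse_attention(Q, K, V, k=8):
    """Single-head attention that keeps only each query's top-k keys.

    Q: (n_q, d), K: (n_k, d), V: (n_k, d_v); requires k <= n_k.
    Hypothetical sketch of a top-k sparse attention, not the paper's code.
    """
    d = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d)  # scaled dot-product scores, (n_q, n_k)
    # Threshold at each query's k-th largest score; everything below is masked.
    kth = np.partition(scores, -k, axis=-1)[:, -k:].min(axis=-1, keepdims=True)
    masked = np.where(scores >= kth, scores, -np.inf)
    # Softmax over the surviving scores; masked entries receive zero weight.
    w = np.exp(masked - masked.max(axis=-1, keepdims=True))
    w /= w.sum(axis=-1, keepdims=True)
    return w @ V

# Toy usage: 16 tokens, 32-dim head, each query attends to only 4 keys.
rng = np.random.default_rng(0)
Q = rng.standard_normal((16, 32))
K = rng.standard_normal((16, 32))
V = rng.standard_normal((16, 32))
out = topk_sparse_attention(Q, K, V, k=4)
print(out.shape)  # (16, 32)
```

Under these assumptions, each query's effective context shrinks from n keys to k, which is the kind of reduction the abstract credits for GiantRabbit's shorter reasoning time and chain-of-thought length.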

Details

Identifier / keyword: 1009240
Title: Reducing Reasoning Costs -- The Path of Optimization for Chain of Thought via Sparse Attention Mechanism
Author:
Publication title: arXiv.org; Ithaca
Publication year: 2024
Publication date: Dec 11, 2024
Section: Computer Science
Publisher: Cornell University Library, arXiv.org
Source: arXiv.org
Place of publication: Ithaca
Country of publication: United States
University/institution: Cornell University Library, arXiv.org
e-ISSN: 2331-8422
Source type: Working Paper
Language of publication: English
Document type: Working Paper

Publication history
Online publication date: 2024-12-12
Milestone dates: 2024-11-14 (Submission v1); 2024-11-15 (Submission v2); 2024-12-01 (Submission v3); 2024-12-11 (Submission v4)
First posting date: 12 Dec 2024

ProQuest document ID: 3128887362
Document URL: https://www.proquest.com/working-papers/reducing-reasoning-costs-path-optimization-chain/docview/3128887362/se-2?accountid=208611
Copyright: © 2024. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated: 2024-12-13
Database: ProQuest One Academic