Abstract

We introduce MLE-bench, a benchmark for measuring how well AI agents perform at machine learning engineering. To this end, we curate 75 ML engineering-related competitions from Kaggle, creating a diverse set of challenging tasks that test real-world ML engineering skills such as training models, preparing datasets, and running experiments. We establish human baselines for each competition using Kaggle's publicly available leaderboards. We use open-source agent scaffolds to evaluate several frontier language models on our benchmark, finding that the best-performing setup, OpenAI's o1-preview with AIDE scaffolding, achieves at least the level of a Kaggle bronze medal in 16.9% of competitions. In addition to our main results, we investigate various forms of resource scaling for AI agents and the impact of contamination from pre-training. We open-source our benchmark code (github.com/openai/mle-bench/) to facilitate future research in understanding the ML engineering capabilities of AI agents.
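The headline metric above, the fraction of competitions in which an agent's submission scores at least at the level of a Kaggle bronze medal, can be sketched as follows. This is an illustrative sketch only: the `results` structure, field names, and the assumption that higher scores are better are hypothetical and do not reflect the benchmark's actual data format.

```python
# Hedged sketch: compute the share of competitions where an agent's
# score clears the bronze-medal threshold. Field names are illustrative,
# and we assume higher scores are better for every competition.

def medal_rate(results):
    """results: list of dicts with 'score' (agent's leaderboard score)
    and 'bronze_threshold' (score of the last bronze medalist)."""
    if not results:
        return 0.0
    medals = sum(1 for r in results if r["score"] >= r["bronze_threshold"])
    return medals / len(results)

# Example with made-up numbers:
sample = [
    {"score": 0.92, "bronze_threshold": 0.90},  # medal
    {"score": 0.70, "bronze_threshold": 0.85},  # no medal
    {"score": 0.50, "bronze_threshold": 0.60},  # no medal
    {"score": 0.99, "bronze_threshold": 0.95},  # medal
]
print(medal_rate(sample))  # 2 of 4 competitions clear the threshold -> 0.5
```

In practice, Kaggle medal thresholds vary by competition size and metric direction, so a real implementation would need per-competition threshold and ordering information.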

Details

Identifier / keyword
1009240
Title
MLE-bench: Evaluating Machine Learning Agents on Machine Learning Engineering
Publication title
arXiv.org; Ithaca
Publication year
2024
Publication date
Dec 20, 2024
Section
Computer Science
Publisher
Cornell University Library, arXiv.org
Source
arXiv.org
Place of publication
Ithaca
Country of publication
United States
University/institution
Cornell University Library, arXiv.org
e-ISSN
2331-8422
Source type
Working Paper
Language of publication
English
Document type
Working Paper
Online publication date
2024-12-23
Milestone dates
2024-10-09 (Submission v1); 2024-10-24 (Submission v2); 2024-12-11 (Submission v3); 2024-12-16 (Submission v4); 2024-12-20 (Submission v5)
Publication history
First posting date
23 Dec 2024
ProQuest document ID
3115222993
Document URL
https://www.proquest.com/working-papers/mle-bench-evaluating-machine-learning-agents-on/docview/3115222993/se-2?accountid=208611
Copyright
© 2024. This work is published under http://arxiv.org/licenses/nonexclusive-distrib/1.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
Last updated
2024-12-24
Database
ProQuest One Academic