
Abstract

To enhance the software implementation process, developers frequently reuse preexisting code snippets by searching extensive codebases. Existing code search tools often rely on keyword-based or syntactic methods and struggle to fully capture the semantics and intent behind code snippets. In this paper, we propose C2B, a novel hybrid model that combines CodeT5 and a bidirectional long short-term memory (Bi-LSTM) network for source code search and recommendation. The proposed C2B hybrid model leverages CodeT5’s domain-specific pretraining and the Bi-LSTM’s contextual understanding to improve code representations and capture sequential dependencies. As a proof of concept, we implemented C2B as a deep neural code search tool and empirically evaluated it on the large-scale CodeSearchNet dataset. The experimental results show that our approach retrieves relevant code snippets effectively and outperforms prior state-of-the-art techniques.
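
The following is a minimal sketch of the hybrid idea the abstract describes: a CodeT5 encoder whose token-level states are refined by a Bi-LSTM and then pooled into a single vector, so that natural-language queries and code snippets can be ranked by embedding similarity. The layer sizes, mean pooling, and similarity ranking below are illustrative assumptions, not the authors' reported configuration or training setup.

# Sketch of a CodeT5 + Bi-LSTM retrieval encoder (assumed configuration, not the paper's exact model).
import torch
import torch.nn as nn
from transformers import AutoTokenizer, T5EncoderModel

class C2BEncoder(nn.Module):
    def __init__(self, model_name: str = "Salesforce/codet5-base", lstm_hidden: int = 256):
        super().__init__()
        # CodeT5 encoder provides domain-specific, pretrained token representations.
        self.backbone = T5EncoderModel.from_pretrained(model_name)
        # Bi-LSTM re-reads the token states in both directions to capture sequential dependencies.
        self.bilstm = nn.LSTM(
            input_size=self.backbone.config.d_model,
            hidden_size=lstm_hidden,
            batch_first=True,
            bidirectional=True,
        )

    def forward(self, input_ids, attention_mask):
        hidden = self.backbone(input_ids=input_ids, attention_mask=attention_mask).last_hidden_state
        lstm_out, _ = self.bilstm(hidden)
        # Mean-pool over non-padding tokens to obtain one embedding per sequence.
        mask = attention_mask.unsqueeze(-1).float()
        return (lstm_out * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)

tokenizer = AutoTokenizer.from_pretrained("Salesforce/codet5-base")
model = C2BEncoder().eval()

query = "read a file line by line"
snippets = [
    "def read_lines(path):\n    with open(path) as f:\n        return f.readlines()",
    "def add(a, b):\n    return a + b",
]

with torch.no_grad():
    q = model(**tokenizer(query, return_tensors="pt"))
    c = model(**tokenizer(snippets, return_tensors="pt", padding=True, truncation=True))
    # Rank candidate snippets by cosine similarity to the query embedding.
    scores = torch.nn.functional.cosine_similarity(q, c)
print(scores)

In practice such an encoder would be fine-tuned on query-code pairs (e.g., with a contrastive or ranking loss) so that relevant snippets score higher than irrelevant ones; the snippet above only illustrates the forward pass and retrieval scoring.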

Details

Title
C2B: A Semantic Source Code Retrieval Model Using CodeT5 and Bi-LSTM
Author
Bibi, Nazia; Maqbool, Ayesha; Rana, Tauseef; Afzal, Farkhanda; Khan, Adnan Ahmed
First page
5795
Publication year
2024
Publication date
2024
Publisher
MDPI AG
e-ISSN
2076-3417
Source type
Scholarly Journal
Language of publication
English
ProQuest document ID
3078995757
Copyright
© 2024 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.