This study proposes and implements a German learning system based on a hybrid fuzzy-neural model, aiming to enhance the language acquisition efficiency of German learners by integrating the strengths of fuzzy logic in handling uncertainty with those of deep neural networks for complex pattern recognition. Through detailed computational experiments, the hybrid model achieved significant improvements over traditional and baseline methods, with key results including vocabulary acquisition accuracy of 90.5% ± 1.2%, syntactic analysis accuracy of 88.7% ± 1.6%, sentiment analysis accuracy of 92.1% ± 1.3%, and a reading comprehension BLEU score of 42.3 ± 1.5. Students in the experimental group showed substantial gains from pre-test (75.8 ± 5.2) to post-test (88.3 ± 4.1), achieving an average improvement of 12.5 points compared to the control group's 5.9-point increase. Additionally, the experimental group rated the teaching content as rich and diverse (4.7/5), found the teaching methods interesting and effective (4.5/5), felt it helped improve their language skills (4.8/5), and considered it easy to learn independently (4.6/5), with overall satisfaction at 4.7/5. These findings highlight the hybrid fuzzy-neural model's effectiveness in enhancing both learning outcomes and student engagement in German language education.
Keywords: Hybrid fuzzy neural model, German learning, natural language processing, personalized learning, social interaction
Povzetek: An original hybrid model combines fuzzy logic and deep learning to improve the efficiency of learning German, enhancing vocabulary acquisition, syntactic analysis, and sentiment analysis and increasing learner engagement.
1 Introduction
In today's digital age, the rapid development of natural language processing (NLP) technology is changing the way people learn foreign languages at an unprecedented speed [1]. In the wave of globalization, the demand for second language learning is becoming increasingly strong. As one of the main international communication languages, the importance of German is self-evident. German is not only one of the most important business languages in Europe, but also plays a key role in many fields such as science, technology and art. Therefore, developing a set of efficient and personalized German learning tools has become an urgent task in the field of educational technology [2].
Traditional German teaching methods often rely on fixed textbooks and teachers' personal experience. This model largely ignores individual differences among learners, resulting in unsatisfactory learning results. However, with the continuous breakthroughs in big data analysis, machine learning algorithms, and especially deep learning technology, computer-assisted language learning (CALL) systems have brought new possibilities to German learning [3]. These systems break the shackles of traditional teaching and provide learners with a more flexible and personalized learning experience. Although there are many CALL systems on the market, most of them use static rules or simple statistical models to process language data. This processing method makes it difficult to capture the rich connotations and subtle differences in natural language, especially in terms of grammatical structure, vocabulary usage, and cultural background. In addition, these traditional methods often ignore the differences between learners, resulting in low learning efficiency and uneven learning outcomes [4].
In response to these problems, this study proposed an innovative solution - a hybrid model that combines fuzzy logic and neural network technology. This model aims to provide more accurate and efficient learning support for German learners by deeply analyzing a large-scale real German text corpus. The application of fuzzy logic enables the model to better handle uncertainty and ambiguity in natural language, thereby achieving a more delicate grasp of grammar, vocabulary and culture [5]. The integration of neural network technology further enhances the model's autonomous learning ability and adaptability. By continuously learning a large amount of real corpus, the neural network can automatically extract the language features of German and adjust the learning strategy according to learners' feedback to achieve personalized teaching. In this process, learners can break free from the constraints of traditional teaching and learn according to their own needs and progress, greatly improving learning efficiency. In addition, this study also focuses on the interaction and cooperation between learners. By introducing social elements, the hybrid model encourages learners to communicate with each other and share their learning experiences, thereby forming an active German learning community. In this community, learners can encourage each other and make progress together, further improving learning effects [6].
This research has important theoretical significance and practical application value. In theory, it expands the knowledge boundary of how to effectively use imprecise information for language modeling in the field of NLP; in practice, it helps to create smarter and more adaptive German learning software, thereby improving learners' interest, participation and ultimate language mastery [7]. In particular, considering that fuzzy logic is good at processing uncertain information, combining it with neural networks with powerful pattern recognition capabilities can better simulate the process of human understanding of language and promote a more natural and smooth learning experience.
The main goal of this paper is to explore and implement a novel hybrid fuzzy neural model to support corpus-based German language learning activities. Specifically, we will build a comprehensive German text database, develop an effective preprocessing process including text cleaning and word segmentation, and design and train a learning system that combines fuzzy logic and deep neural network architecture. In addition, we will verify the effectiveness of the proposed model through a series of experiments and compare its performance with existing solutions. Finally, we will analyze the results and discuss the practical contribution of the model to improving German learning results [8].
Although our work focuses on German as the target language, the proposed framework is also applicable to learning scenarios of other languages. However, in order to ensure the feasibility and operability of the research, we only selected a limited number of specific application scenarios for testing, such as vocabulary acquisition, syntactic analysis, etc. At the same time, due to the limitation of computing resources, we can only use a relatively small-scale dataset for preliminary experiments. Future work will further expand the amount of data and explore more diverse functional modules.
2 Literature review
2.1 Relevant theories of German language learning
As a language with complex grammatical structure and rich vocabulary, German has attracted a large number of learners around the world. Traditional German teaching methods focus on grammar-translation method, direct method and communicative method. In recent years, with the development of cognitive science, more and more research has begun to focus on how to promote students' language acquisition process by building an effective learning environment. For example, sociocultural theory emphasizes the importance of social interaction for language learning [8]; the input hypothesis points out that comprehensible input is one of the key factors in language acquisition [9]. These theories provide a solid theoretical foundation for this study and guide us on how to design more effective German learning tools.
2.2 Application of corpus in language learning
A corpus is a collection of real texts collected according to certain principles. It can be used for linguistic research or to support language learning. In the field of German teaching, the use of corpora can help students get in touch with real language usage, thereby improving their understanding and application ability [10]. For example, by analyzing large-scale written or spoken materials, teachers can discover valuable information such as specific vocabulary collocation patterns and common error types, and then develop targeted teaching strategies. In addition, some advanced CALL systems have begun to try to integrate corpus resources to provide a richer learning experience. However, how to efficiently process and utilize this data remains a challenge [11, 12].
2.3 Basic concepts of fuzzy logic and its role in language processing
In the field of NLP, fuzzy logic is widely used in sentiment analysis, machine translation and many other aspects. Especially for those problems that are difficult to define with clear rules, fuzzy logic provides a flexible and powerful solution. Combined with the German learning scenario, it can be used to better simulate the way humans understand language, especially when faced with polysemous words or complex syntactic structures [13, 14].
2.4 Development of neural networks and their application in natural language processing
In the context of decision-making and assessment methods, Albadán et al. (2018) proposed fuzzy logic models for non-programmed decision-making in personnel selection processes, incorporating gamification to enhance decision-making in complex environments [15]. Gül and Aydoğdu (2021) introduced novel entropy measure definitions and applied them to the combinative distance-based assessment (CODAS) method under a picture fuzzy environment, improving the precision and applicability of the method in various decision-making scenarios [16]. These works demonstrate the increasing importance and versatility of fuzzy logic and entropy measures in optimizing decision-making processes.
Artificial neural network technology has undergone a transformation from shallow structures to deep architectures. Especially after the rise of deep learning, various new network models have emerged, greatly promoting the development of the NLP field. Convolutional neural networks (CNNs), recurrent neural networks (RNNs), and variants such as long short-term memory networks (LSTMs) and gated recurrent units (GRUs) have all demonstrated excellent performance on different tasks. For morphologically rich languages such as German, sequence-based RNNs and their variants are particularly useful because they can capture dependencies within sentences. At the same time, the introduction of the attention mechanism enables the model to focus on the most relevant parts of the input sequence when generating outputs, further improving performance [17, 18].
2.5 Comparison and evaluation of related work
In recent years, many scholars have tried to combine fuzzy logic with neural networks and apply them to NLP tasks. For example, a study proposed a text classification algorithm based on a fuzzy reasoning system. This method achieved a high classification accuracy by fusing information from multiple feature subspaces. Another work explored the use of fuzzy clustering technology to preprocess training samples in sentiment analysis tasks. The results showed that this method helps alleviate the problem of class imbalance [19, 20]. Despite this, there are not many application cases specifically for German learning. Most of the existing research focuses on English or other mainstream languages, which shows that our work is of great significance in filling this gap [21].
2.6 Positioning and innovation of this study
This study aims to develop a novel hybrid fuzzy neural model to improve the corpus-based German language learning effect. Compared with the traditional single technology route, our proposed method combines the advantages of fuzzy logic in dealing with uncertainty and the powerful representation learning ability of neural networks. Specifically, we will first establish a German text database containing various text types, and then ensure the data quality through a series of preprocessing steps. Next, we will design and implement a learning platform that combines fuzzy systems and deep learning frameworks, in which the fuzzy system is responsible for identifying and processing fuzzy information in language, while the neural network focuses on learning context-related features. Finally, through comparative experiments, we verify the advantages of the proposed model over existing solutions and explore its potential application value in depth. We believe that this integrated approach can not only improve the efficiency of German learning, but also provide new ideas for other language education projects.
Table 1 summarizes several state-of-the-art methods in the field of corpus-based German language learning, highlighting key results such as accuracy, F1 scores, and BLEU scores. The listed methods encompass traditional teaching approaches like the Grammar-Translation method, as well as more contemporary machine learning techniques including fuzzy reasoning systems, fuzzy clustering, and neural network models such as RNNs, LSTMs, and attention-based models.
Traditional methods, although rooted in established practice, have limitations in terms of interactivity and scope, which modern machine learning techniques aim to address. For instance, fuzzy reasoning systems and fuzzy clustering methods have shown high accuracy in tasks like text classification and sentiment analysis but encounter challenges related to generalization and the requirement for large-scale labeled data. On the other hand, neural network approaches like RNNs and LSTMs have demonstrated strong performance in modeling sequential dependencies and complex structures within the German language, though they are computationally intensive and can struggle with long-term dependencies.
3 Hybrid fuzzy neural model design
3.1 Model idea
When designing the hybrid fuzzy neural model, our core idea is to combine the advantages of fuzzy logic and deep neural networks to more effectively handle the complexity and uncertainty in German language learning. Fuzzy logic, as a powerful tool, can handle common ambiguity and uncertainty problems in natural language, such as polysemy, emotional expressions, and subtle differences in grammatical structure. By introducing fuzzy logic, the model can better simulate the way humans understand language, thereby providing more delicate and accurate language processing capabilities. At the same time, deep neural networks have performed well in natural language processing tasks with their powerful representation learning and pattern recognition capabilities. In particular, for sequence data such as text, recurrent neural networks (RNNs) and their variants (such as LSTM and GRU) can capture long-term dependencies within sentences, which is crucial for understanding complex syntactic structures and contextual information. In addition, the introduction of the attention mechanism enables the model to focus on the most relevant parts of the input sequence when generating outputs, further improving performance.
In this study, we combined fuzzy logic with deep neural networks to build an integrated hybrid model. Specifically, the fuzzy system will be used to process uncertainty and ambiguity in language, such as the polysemy of words and subtle changes in emotions. By defining appropriate fuzzy rules and membership functions, the fuzzy system can provide a flexible and powerful processing method for the model. On the other hand, the deep neural network focuses on learning the context-related features of the text and automatically extracts key information that helps to understand and generate German texts.
This design exploits the complementary strengths of the two technologies: fuzzy logic handles uncertainty, and neural networks handle contextual dependencies. In this way, we aim to develop a learning system that accurately captures the nuances of language while efficiently learning from and adapting to large collections of real corpora. Such a system can not only improve the language acquisition efficiency of German learners, but also provide a more personalized and interactive learning experience that meets the needs of different learners. To further enhance the model's practicality, we also consider introducing social elements to encourage interaction and cooperation among learners. By establishing an active learning community, learners can exchange experiences and share resources, forming a positive learning environment and promoting common progress. The design therefore targets not only individual learning outcomes but also group effects, aiming to improve the overall effect of German learning at multiple levels.
3.2 Specific implementation
In this section, we will introduce the implementation of the hybrid fuzzy neural model in detail. To build this model, we need to combine fuzzy logic and deep learning techniques to process German language data. We will discuss the following aspects: the design of the fuzzy system, the selection and optimization of the neural network architecture, and how to organically combine the two to form a unified learning framework.
This study integrates fuzzy logic and neural networks through modular design. First, the fuzzy system receives input data and converts it into fuzzy membership values through fuzzification to capture the ambiguity in the language. Then, the membership values are inferred with the fuzzy rule base to generate fuzzy outputs. Finally, these outputs are converted into numerical representations through defuzzification as input features of the neural network. The neural network further performs deep learning on these features to extract the contextual information of the language, thereby completing complex task modeling.
As shown in Fig. 1, the overall design idea is to build a hybrid model that combines fuzzy logic and deep learning technology to improve the performance of natural language processing tasks. The core of this model is to combine the flexibility of fuzzy systems with the powerful representation ability of neural networks.
First, in the fuzzy system design stage, fuzzy sets and membership functions are defined to capture the uncertainty in language. For the polysemy problem of words, a membership function based on contextual features is used to quantify the applicability of different meanings in a specific sentence. In addition, a series of fuzzy rules are formulated to handle complex language phenomena, such as emotional expression. These rules can infer possible emotional tendencies based on the emotional intensity of the input text and the appearance of evaluation words. Next, a neural network architecture based on LSTM units and combined with an attention mechanism is designed. The LSTM structure is good at capturing long-distance dependencies, while the attention mechanism enables the model to better focus on the key parts of the input sequence. This combination helps to enhance the model's ability to understand the deep meaning of the text.
The above two parts are integrated into a unified framework. Specifically, the output of the fuzzy system can be incorporated into the neural network training process as an additional feature, or the behavior of certain layers (such as weight allocation) can be directly adjusted. At the same time, an end-to-end learning strategy is adopted, that is, the fuzzy parameters and the neural network parameters are optimized simultaneously to ensure that the entire system works together. To this end, a composite loss function needs to be carefully designed, which not only considers the goals of the task itself (such as classification accuracy), but also takes into account the consistency and effectiveness of the fuzzy rules. In this way, we hope to develop a more intelligent and adaptable language processing solution.
Model optimization uses the cross entropy loss function (classification task) or mean square error (regression task) as the main indicator to measure the deviation between the predicted value and the target value. Hyperparameter adjustment is completed by combining grid search and random search to optimize key parameters such as learning rate, batch size, and number of fuzzy rules. In the evaluation stage, performance is comprehensively evaluated using indicators such as accuracy, F1 value, and BLEU score to verify the generalization ability and actual effect of the model from multiple dimensions.
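To make the tuning procedure concrete, the following Python sketch illustrates a combined grid and random search over the hyperparameters named above (learning rate, batch size, number of fuzzy rules). The value ranges and the train_and_evaluate callback are illustrative assumptions, not the paper's actual settings.

```python
import itertools
import random

def search(train_and_evaluate, n_random=10, seed=0):
    """Return the best hyperparameter setting found by grid plus random search.
    train_and_evaluate(params) is a hypothetical callback returning
    validation accuracy for a single configuration."""
    rng = random.Random(seed)
    grid = {
        "learning_rate": [1e-3, 5e-4, 1e-4],   # illustrative values
        "batch_size": [16, 32, 64],
        "n_fuzzy_rules": [5, 10, 20],
    }
    # Grid search: every combination of the listed values.
    candidates = [dict(zip(grid, vals)) for vals in itertools.product(*grid.values())]
    # Random search: sample the continuous learning rate log-uniformly.
    for _ in range(n_random):
        candidates.append({
            "learning_rate": 10 ** rng.uniform(-4, -2),
            "batch_size": rng.choice(grid["batch_size"]),
            "n_fuzzy_rules": rng.randint(5, 20),
        })
    return max(candidates, key=train_and_evaluate)
```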
3.2.1 Fuzzy system design
We define a set of fuzzy sets and membership functions to represent uncertainty and ambiguity in language. Taking the polysemy of words as an example, we can define multiple meanings for each word and quantify the applicability of these meanings in a specific context through membership functions. Suppose there is a word W with n different meanings. For a given sentence X, the membership of W to the meaning S_i in sentence X can be expressed as formula (1).
$\mu_{S_i}(W, X) = f\big(W, S_i, \mathrm{context}(X)\big), \quad i = 1, \dots, n$  (1)
Here, f is a feature-based function that estimates, from the contextual information of sentence X, the possibility that W takes each meaning. We use fuzzy rules to handle language phenomena in different situations. For example, for emotional expression, we can define the following fuzzy rules:
If a sentence has a high sentiment intensity and contains positive evaluation words (P_positive), then the sentence is likely to be positive. If a sentence has a low sentiment intensity and contains negative evaluation words, then the sentence is likely to be negative. These rules can be encoded through a fuzzy inference system. Ultimately, the results obtained through fuzzy inference can be used to adjust the input weights of the neural network, or directly participate in the training process as part of the network.
The definition of fuzzy membership functions is mainly based on a combination of expert experience and data-driven methods. First, by observing the distribution patterns in actual language data, a triangular or trapezoidal function is preliminarily designed; then, optimization algorithms (such as particle swarm optimization) are used to further adjust the parameters to improve accuracy. In sentiment analysis, rule examples include: "If the sentiment intensity is medium and the sentence contains negative words, the sentiment tendency is negative." This rule flexibly handles sentiment ambiguity and supports fine-grained analysis.
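As an illustration of this design, the sketch below implements a triangular membership function and the quoted sentiment rule in Python. The breakpoints of the fuzzy sets and the min-operator used as the fuzzy AND are illustrative assumptions; the paper's actual parameters are obtained by expert design followed by particle swarm optimization.

```python
import numpy as np

def tri_mf(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def intensity_memberships(x):
    """Membership of a normalized sentiment intensity x in three fuzzy sets.
    Breakpoints are illustrative, not the optimized values from the paper."""
    return {
        "low":    tri_mf(x, -0.01, 0.0, 0.5),
        "medium": tri_mf(x, 0.0, 0.5, 1.0),
        "high":   tri_mf(x, 0.5, 1.0, 1.01),
    }

def negative_rule(intensity, has_negative_word):
    """Rule: 'If the sentiment intensity is medium AND the sentence contains
    a negative word, then the tendency is negative.'  Returns the rule's
    firing strength, using min as the fuzzy AND."""
    mu = intensity_memberships(intensity)
    return min(mu["medium"], 1.0 if has_negative_word else 0.0)
```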
3.2.2 Neural network architecture
The basic LSTM unit structure includes three gating mechanisms: input gate, forget gate, and output gate, as well as a cell state. These components work together to enable LSTM to effectively store and access long-distance information. The state update formula of the LSTM unit is shown in Formula (2)-(6).
$i_t = \sigma(W_i [h_{t-1}, x_t] + b_i)$  (2)
$f_t = \sigma(W_f [h_{t-1}, x_t] + b_f)$  (3)
$o_t = \sigma(W_o [h_{t-1}, x_t] + b_o)$  (4)
$C_t = f_t \odot C_{t-1} + i_t \odot \tanh(W_C [h_{t-1}, x_t] + b_C)$  (5)
$h_t = o_t \odot \tanh(C_t)$  (6)
where $\sigma$ is the sigmoid function, $x_t$ is the input at time step $t$, $h_t$ is the hidden state, $C_t$ is the cell state, $\odot$ denotes element-wise multiplication, and the $W$ and $b$ terms are the learnable weights and biases of the input, forget, and output gates and the candidate cell.
The network training configuration uses a learning rate of 0.001, 50 training epochs, the Adam optimizer, and the composite loss function (see Formula (9)). LSTM was chosen for its strong performance in capturing sequence dependencies, whereas Transformers are relatively inefficient on small datasets. The attention mechanism further improves the model's ability to focus on the important features of long sentences. Combining these techniques achieves a balance between performance and efficiency.
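A minimal PyTorch sketch of such an LSTM classifier with the stated training configuration (Adam, learning rate 0.001, 50 epochs) is given below. The vocabulary size, embedding and hidden dimensions, and number of classes are illustrative assumptions, and the attention pooling over the LSTM states is a simplified stand-in for the decoder attention detailed next.

```python
import torch
import torch.nn as nn

class LstmAttnClassifier(nn.Module):
    """Sketch of an LSTM encoder with attention pooling; sizes are illustrative."""
    def __init__(self, vocab_size, embed_dim=128, hidden_dim=256, n_classes=3):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.attn_score = nn.Linear(hidden_dim, 1)   # scores each time step
        self.out = nn.Linear(hidden_dim, n_classes)

    def forward(self, token_ids):
        h, _ = self.lstm(self.embed(token_ids))              # (B, T, H)
        weights = torch.softmax(self.attn_score(h), dim=1)   # (B, T, 1)
        context = (weights * h).sum(dim=1)                   # weighted sum over time
        return self.out(context)

# Training configuration stated in the text: Adam, learning rate 0.001, 50 epochs.
model = LstmAttnClassifier(vocab_size=20000)
optimizer = torch.optim.Adam(model.parameters(), lr=0.001)
n_epochs = 50
```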
In addition, it is necessary to introduce the attention mechanism, which can help the model focus more on the important parts. The attention mechanism is usually applied to the encoder-decoder architecture, where the encoder is responsible for reading the entire input sequence and generating a fixed-length vector, and the decoder uses the vector to generate the output sequence. The attention mechanism allows the decoder to focus on different parts of the input sequence at each step, thereby better understanding the context. The calculation formula of the attention weight is Formula (7) and Formula (8).
$e_{ij} = v^{\top} \tanh(W_s s_{i-1} + W_h h_j + b_{attn})$  (7)
$\alpha_{ij} = \dfrac{\exp(e_{ij})}{\sum_{k} \exp(e_{ik})}$  (8)
where $\alpha_{ij}$ is the attention weight of decoder time step $i$ over the $j$-th position of the input sequence, $e_{ij}$ is the corresponding unnormalized alignment score, $s_{i-1}$ is the hidden state of the decoder at the previous step, $h_j$ is the hidden state of the encoder at the corresponding position, and $v$, $W_h$, $W_s$, and $b_{attn}$ are the learnable parameters.
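The module below is a direct PyTorch transcription of formulas (7) and (8), with W_s, W_h, b_attn, and v realized as linear layers; the layer dimensions are illustrative.

```python
import torch
import torch.nn as nn

class AdditiveAttention(nn.Module):
    """Bahdanau-style attention: e_ij = v^T tanh(W_s s_{i-1} + W_h h_j + b_attn),
    normalized with a softmax over the input positions."""
    def __init__(self, dec_dim, enc_dim, attn_dim):
        super().__init__()
        self.W_s = nn.Linear(dec_dim, attn_dim, bias=False)
        self.W_h = nn.Linear(enc_dim, attn_dim, bias=True)   # its bias plays the role of b_attn
        self.v = nn.Linear(attn_dim, 1, bias=False)

    def forward(self, s_prev, enc_states):
        # s_prev: (B, dec_dim); enc_states: (B, T, enc_dim)
        scores = self.v(torch.tanh(self.W_s(s_prev).unsqueeze(1)
                                   + self.W_h(enc_states)))     # (B, T, 1), eq. (7)
        alpha = torch.softmax(scores, dim=1)                    # eq. (8)
        context = (alpha * enc_states).sum(dim=1)               # attention-weighted context
        return alpha.squeeze(-1), context
```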
3.2.3 Model integration
We combine the fuzzy system with the neural network to build a complete hybrid fuzzy-neural model. Specifically, the output of the fuzzy system can be fed into the neural network as an additional feature, or it can be used to adjust the weights or activation values of certain layers. For example, when dealing with polysemous words, we can adjust the word embedding vector according to the fuzzy membership. Training the two components jointly requires a suitable loss function, usually a composite loss that combines the task objective with fuzzy-rule constraints. For example, if our goal is to minimize the classification error while keeping the predictions of the fuzzy rules consistent with the actual labels, the total loss function may be as shown in Equation (9).
$L = L_{task} + \lambda L_{fuzzy}$  (9)
Here, $L_{task}$ is the loss term for the specific task (such as the cross-entropy loss), $L_{fuzzy}$ is the penalty term for deviation from the fuzzy rules, and $\lambda$ is a hyperparameter that balances the two contributions.
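A minimal PyTorch sketch of Equation (9) is shown below; the particular form chosen for L_fuzzy (a mean-squared deviation between the fuzzy-rule class scores and the network's predicted distribution) is an illustrative assumption.

```python
import torch
import torch.nn.functional as F

def composite_loss(logits, labels, fuzzy_scores, lam=0.5):
    """Composite loss of formula (9): L = L_task + lambda * L_fuzzy.
    fuzzy_scores holds per-class scores produced by the fuzzy rules
    (same shape as the softmax output); lam is the balancing weight."""
    l_task = F.cross_entropy(logits, labels)          # task term (classification)
    probs = F.softmax(logits, dim=-1)
    l_fuzzy = F.mse_loss(probs, fuzzy_scores)         # deviation from fuzzy predictions
    return l_task + lam * l_fuzzy
```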
The interaction process between the fuzzy logic module and the neural network consists of three parts: first, the input language data is fuzzified to generate fuzzy features; then, the neural network extracts the contextual semantic representation; finally, the fuzzy output and the neural network prediction are combined to complete the reasoning. The loss function contains the data error term and the regularization penalty term. The penalty weight selects the best value through grid search to achieve a balance between performance and generalization ability. Pseudocode and flowchart are shown in the appendix.
The following pseudo code shows the interaction process between fuzzy rules and neural networks. The data is first preprocessed and fuzzified, and converted into membership values to capture fuzzy information. Subsequently, reasoning is performed through the fuzzy rule base to generate fuzzy outputs, which are then defuzzified and converted into numerical features. The neural network uses these features to complete deep learning tasks, including forward propagation, loss calculation, and backpropagation optimization. This process effectively combines the flexibility of fuzzy logic with the powerful representation ability of neural networks, providing an efficient solution for dealing with uncertainty in language.
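Since the pseudocode itself is relegated to the appendix, the following Python sketch only mirrors the flow described above; fuzzify, infer_rules, defuzzify, model, and optimizer are hypothetical stand-ins, and composite_loss refers to the sketch given after Equation (9).

```python
import torch

def train_step(batch_token_ids, batch_texts, batch_labels, model, optimizer, lam=0.5):
    """One hedged training step of the described fuzzy-neural interaction loop."""
    # 1. Fuzzification: map each raw text to membership values.
    memberships = [fuzzify(text) for text in batch_texts]
    # 2. Fuzzy inference: apply the rule base to the membership values.
    fuzzy_outputs = [infer_rules(m) for m in memberships]
    # 3. Defuzzification: convert fuzzy outputs into numeric per-class scores.
    fuzzy_scores = torch.stack([defuzzify(o) for o in fuzzy_outputs])
    # 4. Forward pass: the network sees token ids plus the fuzzy features.
    logits = model(batch_token_ids, fuzzy_scores)
    # 5. Composite loss (Equation (9)) and backpropagation.
    loss = composite_loss(logits, batch_labels, fuzzy_scores, lam)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```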
4 Experimental evaluation
4.1 Experimental setup
In order to verify the effectiveness and superiority of the proposed hybrid fuzzy neural model in German language learning, we designed a series of experiments to evaluate the performance of the model on different tasks and compare it with existing methods.
4.1.1 Dataset
The foundation of the experiment is to build a comprehensive and diverse database of German texts. We collected a variety of text types, including news articles, literary works, textbooks, and social media posts to ensure coverage of different topics and styles. In the data preprocessing stage, we performed text cleaning to remove irrelevant characters, punctuation, and other noise to improve data quality. The word segmentation tool was used to split the text into words or phrases for further processing. Stemming restores words to their basic form, reduces vocabulary diversity, and improves the generalization ability of the model.
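A minimal preprocessing sketch along these lines is shown below; the cleaning pattern, whitespace tokenization, and the use of NLTK's German Snowball stemmer are illustrative choices, since the paper does not name its exact tools.

```python
import re
from nltk.stem.snowball import SnowballStemmer

stemmer = SnowballStemmer("german")

def preprocess(text):
    """Cleaning, tokenization, and stemming for German text (illustrative)."""
    text = re.sub(r"[^A-Za-zÄÖÜäöüß\s]", " ", text)   # strip punctuation and noise
    tokens = text.lower().split()                      # simple whitespace tokenization
    return [stemmer.stem(tok) for tok in tokens]       # reduce words to base forms

# Example usage:
# preprocess("Die Häuser standen am Fluss.")
```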
The data used for the experiment was divided into training, validation, and test sets in a ratio of 7:2:1 to ensure that the performance of the model can be fully evaluated. The training set is used for model learning, the validation set is used for parameter adjustment and to prevent overfitting, and the test set is used for final evaluation. The experiment used 5-fold cross-validation, and different data subsets were randomly selected for training and validation in each division to ensure the stability and reliability of the results. Cross-validation not only improves the comprehensiveness of model evaluation, but also helps determine the optimal hyperparameter combination, thereby improving the generalization ability of the model.
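The sketch below reproduces this protocol with scikit-learn: a 7:2:1 train/validation/test split followed by 5-fold cross-validation on the training portion. The random seed and the use of scikit-learn itself are assumptions.

```python
from sklearn.model_selection import train_test_split, KFold

def split_and_folds(samples, labels, seed=42):
    """7:2:1 train/validation/test split plus 5-fold CV on the training set."""
    # Hold out 10% as the test set, then 2/9 of the remainder as validation
    # (0.9 * 2/9 = 0.2 of the full data), leaving 70% for training.
    x_rest, x_test, y_rest, y_test = train_test_split(
        samples, labels, test_size=0.1, random_state=seed)
    x_train, x_val, y_train, y_val = train_test_split(
        x_rest, y_rest, test_size=2 / 9, random_state=seed)
    folds = KFold(n_splits=5, shuffle=True, random_state=seed).split(x_train)
    return (x_train, y_train), (x_val, y_val), (x_test, y_test), folds
```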
The impact of cultural and contextual biases on language processing is explicitly considered in model training, for example by balancing the distribution of the corpus to reduce gender or regional biases. In addition, the model must be applied cautiously in practice to avoid misunderstandings or discrimination caused by semantic misjudgments in sensitive scenarios. These considerations give the model a higher degree of social responsibility and practical value.
4.1.2 Task definition
When evaluating the hybrid fuzzy neural model, we designed four main tasks to comprehensively test its performance. The vocabulary acquisition task evaluates the model's ability to help learners master new vocabulary: based on contextual information, the model must accurately identify the meaning of a word and its usage in the sentence. The syntactic analysis task involves understanding sentence structure and grammatical rules: the model must correctly parse sentence components and identify key parts such as subject, predicate, and object. The remaining two tasks, sentiment analysis and reading comprehension, test the model's ability to judge the emotional tendency of a text and to understand and generate German text, respectively.
4.1.3 Baseline model
To compare the performance of the proposed hybrid fuzzy neural model, we selected several common baseline models as references. Traditional rule-based methods rely on hand-written rules for vocabulary and grammar analysis, which are specific and precise, but are incapable of dealing with complex language phenomena. Statistical machine learning methods such as support vector machines (SVM) and random forests make predictions by learning patterns in data, but have limitations in dealing with uncertainty and ambiguity in natural language. Pure deep learning methods such as LSTM-based sequence-to-sequence models (Seq2Seq) and BERT pre-trained models perform well in processing large-scale text data and learning context-related features, but there is still room for improvement in dealing with subtle language differences and uncertainty. By comparing with these baseline models, we can more clearly understand the advantages and improvements of the hybrid fuzzy neural model.
4.1.4 Evaluation metrics
To comprehensively evaluate the performance of the model, we selected several commonly used evaluation indicators. Accuracy measures the proportion of correct predictions on the classification task; it is an intuitive and easy-to-understand indicator that is suitable for most classification tasks. The F1 score takes into account both precision and recall, and is particularly suitable for unbalanced data sets. It can balance the performance of the model on different types of errors to a certain extent. For tasks that generate text, such as reading comprehension and question answering, the BLEU score is used to evaluate the quality of the generated text. The BLEU score measures the fluency and accuracy of the generated results by calculating the n-gram overlap between the generated text and the reference text.
In addition to accuracy and F1 score, logarithmic loss and precision-recall curves are introduced to evaluate model performance. Logarithmic loss reflects the degree of calibration of the model's predicted probability, and the PR curve shows the model performance under different thresholds, which is especially valuable on unbalanced datasets. The introduction of these indicators makes the model evaluation more comprehensive and reliable.
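For concreteness, the metrics listed above can be computed as in the following sketch, using scikit-learn for accuracy, F1, log loss, and the precision-recall curve, and NLTK for sentence-level BLEU; the helper names are illustrative.

```python
from sklearn.metrics import (accuracy_score, f1_score, log_loss,
                             precision_recall_curve)
from nltk.translate.bleu_score import sentence_bleu, SmoothingFunction

def classification_metrics(y_true, y_pred, y_prob):
    """Accuracy, macro F1, and log loss for the classification tasks;
    y_prob holds the predicted class probabilities."""
    return {
        "accuracy": accuracy_score(y_true, y_pred),
        "f1_macro": f1_score(y_true, y_pred, average="macro"),
        "log_loss": log_loss(y_true, y_prob),
    }

def bleu(reference_tokens, hypothesis_tokens):
    """Sentence-level BLEU with smoothing for the text-generation tasks."""
    return sentence_bleu([reference_tokens], hypothesis_tokens,
                         smoothing_function=SmoothingFunction().method1)

# Precision-recall pairs at different thresholds (binary case shown):
# precision, recall, thresholds = precision_recall_curve(y_true_binary, y_scores)
```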
4.2 Experimental results
The experimental results in Table 2 were subjected to analysis of variance (ANOVA) and significance tests (such as Tukey HSD or t-test), verifying the significant performance improvement of the hybrid fuzzy neural model in multiple tasks. In particular, in terms of vocabulary acquisition accuracy, grammatical analysis F1 score, and sentiment analysis accuracy, the results of the model were significantly better than those of rule-based and statistical learning methods (p < 0.01), and showed similar or slightly better results than pure deep learning (LSTM Seq2Seq) and BERT pre-trained models (no statistically significant difference, p > 0.05). In addition, in terms of the BLEU score and perplexity for reading comprehension, the hybrid fuzzy neural model showed a smaller fluctuation range, further illustrating its advantages in stability and robustness.
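A sketch of such a significance analysis is shown below, using SciPy's one-way ANOVA and statsmodels' Tukey HSD on hypothetical per-fold scores; it illustrates the procedure rather than reproducing the paper's computation.

```python
import numpy as np
from scipy.stats import f_oneway, ttest_ind
from statsmodels.stats.multicomp import pairwise_tukeyhsd

def compare_models(scores_by_model, alpha=0.05):
    """One-way ANOVA across models followed by Tukey HSD post-hoc comparisons.
    scores_by_model maps a model name to its list of per-fold scores."""
    groups = list(scores_by_model.values())
    f_stat, p_value = f_oneway(*groups)                      # overall ANOVA
    endog = np.concatenate(groups)
    labels = np.concatenate([[name] * len(s) for name, s in scores_by_model.items()])
    tukey = pairwise_tukeyhsd(endog, labels, alpha=alpha)    # pairwise comparisons
    return f_stat, p_value, tukey

# Pairwise t-test between two specific models, as also mentioned in the text:
# t, p = ttest_ind(scores_by_model["hybrid"], scores_by_model["bert"])
```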
Table 2 shows the evaluation performance of different models on tasks related to German learning, including vocabulary acquisition, syntactic analysis, sentiment analysis, and reading comprehension. From the data, it can be seen that the hybrid fuzzy neural model performs well on all these tasks, especially in vocabulary acquisition accuracy (90.5%) and sentiment analysis accuracy (92.1%), which is ahead of other methods. Compared with traditional techniques such as rule-based methods and statistical machine learning (SVM), it not only provides higher accuracy, but also the F1 score shows its good balance. Although pure deep learning models such as LSTM Seq2Seq and BERT pre-trained models also show strong competitiveness, they are slightly inferior to the hybrid fuzzy neural model in most indicators.
4.3 Case analysis
4.3.1 Data collection and processing
In order to comprehensively evaluate the effect of the hybrid fuzzy neural model in German learning, this study adopted a variety of data collection methods. First, all 120 students were given German proficiency tests before and after the experiment to assess their initial level and final progress. These tests covered multiple aspects such as vocabulary, grammar, reading comprehension and writing to ensure that the students' language ability was fully reflected. By comparing the scores of the pre-test and post-test, the changes in the German proficiency of the students in the experimental group and the control group can be intuitively seen.
During the experiment, we also recorded the students' learning behavior data in detail. Specifically, it includes the number of logins for each student, the learning time for each login, and the number of tasks completed. These data are automatically collected through the online platform and statistically analyzed. By analyzing these quantitative data, we can understand the students' learning frequency, duration, and task completion, and thus evaluate their use and participation in the learning system. For example, the average number of logins and learning time can help us determine whether students actively use the system, while the number of completed tasks reflects their level of involvement in each learning module. In addition, in order to gain a deeper understanding of students' satisfaction and experience with the learning system, we conducted a questionnaire survey at the end of the experiment. The questionnaire design includes multiple dimensions, such as the richness of teaching content, the effectiveness and fun of teaching methods, the effect of language skill improvement, and the convenience of self-learning. Students were asked to rate each dimension (out of 5 points) based on their actual experience and provide specific feedback and suggestions. The results of the questionnaire survey not only provide quantitative satisfaction scores, but also contain rich qualitative data. These qualitative data were coded and thematically analyzed to extract key opinions and suggestions, helping us better understand students' needs and improvement directions.
As shown in Table 3, there are significant differences in the performance of the experimental group and the control group in the pre-test and post-test scores. The average score of the students in the experimental group was 75.8 points (standard deviation 5.2) at the beginning of the experiment, and the average score increased to 88.3 points (standard deviation 4.1) after the experiment, an average increase of 12.5 points. In contrast, the average score of the control group students in the pre-test was 76.2 points (standard deviation 4.9), and the average score of the post-test was 82.1 points (standard deviation 5.3), an average increase of 5.9 points. From the data, it can be seen that the German level of the students in the experimental group has improved significantly more than that of the control group. This shows that the learning system based on the hybrid fuzzy neural model has a significant effect on improving students' German level. Through personalized learning experience, highly interactive teaching methods and the system's self-adjustment function, the students in the experimental group can more effectively master German knowledge, thereby making great progress in a short period of time.
Fig. 2 compares the performance of the two groups on grammatical analysis and sentiment analysis. The experimental group scored higher on grammatical analysis, 92 points, and scored 88 points on sentiment analysis. The control group scored lower on grammatical analysis, 86 points, and scored 84 points on sentiment analysis.
Fig. 3 records the changes in students' average scores every two months from October 2019 to August 2020. As can be seen from the Figure, the average scores of students showed an upward trend during this period, with the fastest growth from December 2019 to February 2020.
As shown in Table 4, in the performance comparison of different tasks, the experimental group students performed better than the control group in the four tasks of vocabulary acquisition, syntactic analysis, sentiment analysis and reading comprehension. Specifically, in the vocabulary acquisition task, the experimental group had an accuracy of 90.5% and an F1 score of 0.90, while the control group had 82.3% and 0.82 respectively; in the syntactic analysis task, the experimental group had an accuracy of 88.7% and an F1 score of 0.89, while the control group had 80.2% and 0.80; in the sentiment analysis task, the experimental group had an accuracy of 92.1% and an F1 score of 0.92, while the control group had 84.5% and 0.85; in the reading comprehension task, the experimental group had a BLEU score of 42.3 and a perplexity of 59.8, while the control group had a BLEU score of 35.2 and a perplexity of 65.4. These results show that the hybrid fuzzy neural model can better capture contextual information and provide more accurate language understanding and generation capabilities when dealing with complex language tasks, thereby helping students achieve better results in various language skills.
Fig. 4 is a box plot showing the distribution of four indicators. From left to right, they are Accuracy, F1, BLEU, and Perplexity. The values of Accuracy and F1 are concentrated between 75 and 85, with no outliers; the values of BLEU are distributed between 40 and 60, with one outlier below the lower quartile; the values of Perplexity are relatively scattered, mostly between 20 and 40, with two outliers above the upper limit.
As shown in Table 5, the difference in learning behavior records between the experimental group and the control group is also very obvious. The average weekly login times of students in the experimental group are 4.5 times (standard deviation is 1.2), the average weekly study time is 5.3 hours (standard deviation is 1.1), and the average number of tasks completed per week is 12.8 (standard deviation is 2.3). In contrast, the average weekly login times of students in the control group are only 2.8 times (standard deviation is 0.9), the average weekly study time is 3.5 hours (standard deviation is 0.8), and the average number of tasks completed per week is 8.5 (standard deviation is 1.5). Students in the experimental group showed higher learning enthusiasm and participation. They used the learning system more frequently, spent more time on learning, and completed more tasks. This shows that the personalized learning experience and more interactive teaching methods provided by the hybrid fuzzy neural model have stimulated students' interest in learning and made them more engaged and proactive in the learning process.
As shown in Table 6, the survey on students' satisfaction with teaching content, teaching methods, language skills improvement and self-learning convenience shows that the students in the experimental group highly evaluated the learning system based on the hybrid fuzzy neural model. In terms of whether the teaching content is rich and diverse, the average score of the experimental group is 4.7 points (out of 5 points), while the control group is 3.9 points; in terms of whether the teaching method is interesting and effective, the experimental group is 4.5 points, and the control group is 3.5 points; in terms of whether it helps to improve language skills, the experimental group is 4.8 points, and the control group is 3.7 points; in terms of whether it is convenient for self-learning, the experimental group is 4.6 points, and the control group is 3.4 points; in terms of overall satisfaction, the experimental group is 4.7 points, and the control group is 3.6 points. These data show that the students in the experimental group generally believe that the hybrid fuzzy neural model provides rich and diverse teaching content, adopts interesting and effective teaching methods, helps to significantly improve their language skills, and facilitates self-learning. In contrast, the students in the control group are less satisfied with the traditional teaching methods, especially in terms of the fun and effectiveness of the teaching methods.
4.4 Discussion
In this section, we compare the performance of our hybrid fuzzy-neural model with existing state-of-the-art (SOTA) methods and analyze why our model outperforms others in certain tasks, as well as address potential limitations.
(1) Comparison with SOTA Methods
Our model combines fuzzy logic and neural networks, offering flexibility and accuracy compared to traditional methods like grammar-translation or communicative methods, which are more rigid. While fuzzy systems excel at handling linguistic uncertainty, they struggle with sequential dependencies, a challenge addressed by integrating Long Short-Term Memory (LSTM) networks. LSTMs capture long-term dependencies and enhance the model's ability to process complex German sentence structures. Compared to RNN-based models and models like the Transformer, our hybrid approach balances ambiguity handling with sequence learning, achieving better results on corpus-based language tasks.
(2) Why Our Hybrid Model Performs Better
The success of our model lies in combining fuzzy logic's ability to manage linguistic ambiguity with the LSTM's strength in processing sequential data. This combination allows our model to handle complex German grammar and vocabulary more effectively than traditional approaches. The fuzzy system manages uncertainty, while the neural network extracts relevant features from large corpora, improving accuracy and adaptability in learning.
(3) Limitations and Potential Reasons for Underperformance
Despite its advantages, our model faces challenges in computational complexity due to the combined use of fuzzy logic and LSTM networks, leading to longer training times and higher resource consumption. Additionally, the quality of the training corpus significantly affects performance, and insufficiently diverse data may hinder generalization. The model also may not perform as well on tasks with highly deterministic relationships, such as spelling correction, where simpler models might be more efficient.
(4) Future Directions
Future work could focus on incorporating the attention mechanism to enhance the model's ability to focus on relevant input sequences, further improving performance. Expanding the corpus with more diverse text types could help the model generalize better, while optimizing computational efficiency through techniques like pruning could make the model more scalable and suitable for real-time applications.
5 Conclusion
This study designed and implemented a German learning system based on a hybrid fuzzy neural model, and verified its effectiveness and superiority in improving the language acquisition efficiency of German learners. The experimental results show that the experimental group students performed significantly better than the control group and other baseline models on multiple tasks. Specifically, in the vocabulary acquisition task, the accuracy and F1 score of the experimental group were 90.5% and 0.90, respectively, which were significantly higher than the 82.3% and 0.82 of the control group. In the syntactic analysis task, the accuracy and F1 score of the experimental group were 88.7% and 0.89, respectively, while the control group was 80.2% and 0.80. In the sentiment analysis task, the accuracy and F1 score of the experimental group were 92.1% and 0.92, respectively, while the control group was 84.5% and 0.85. In the reading comprehension task, the BLEU score of the experimental group was 42.3 and the perplexity was 59.8, while the BLEU score of the control group was 35.2 and the perplexity was 65.4. These results show that the hybrid fuzzy neural model can better capture contextual information and provide more accurate language understanding and generation capabilities when dealing with complex language tasks. In addition, students in the experimental group showed higher learning enthusiasm and participation, with an average of 4.5 logins per week, an average weekly learning time of 5.3 hours, and 12.8 tasks completed, which were significantly higher than those in the control group. The student satisfaction survey showed that students in the experimental group gave high praise to the teaching content, teaching methods, language skills improvement, and self-learning convenience, with an overall satisfaction score of 4.7 (out of 5).
References
[1] Tacke T, Nohl-Deryk P, Lingwal N, Reimer LM, Starnecker F, Güthlin C, et al. The German version of the mHealth App Usability Questionnaire (GERMAUQ): Translation and validation study in patients with cardiovascular disease. Digital Health. 2024; 10: 20552076231225168. https://doi.org/10.1177/20552076231225168
[2] Müller F, Hummers E, Noack EM. Medical characteristics of foreign language patients in paramedic care. International Journal of Environmental Research and Public Health. 2020; 17(17):6306. https://doi.org/10.3390/ijerph17176306
[3] Mustonen J, Henttonen H, Vaheri A, Zöller L, Krüger DH. Infection outbreak among German and Finnish troops in Eastern Lapland during World War II: first description of hantavirus disease in the German language area. Deutsche Medizinische Wochenschrift. 2022; 147(24/25):1629-1634. https://doi.org/10.1055/a-1817-5129
[4] Neumann K, Arnold B, Baumann A, Bohr C, Euler HA, Fischbach T, et al. New terminology for developmental language disorders. Monatsschrift Kinderheilkunde. 2021; 169(9):837-842. https://doi.org/10.1007/s00112-021-01148-2
[5] Kuschmann A, Schölderle T, Haas E. Clinical practice in childhood dysarthria: an online survey of German-speaking speech-language pathologists. American Journal of Speech-Language Pathology. 2023; 32(6):2802-2826. https://doi.org/10.1044/2023_ajslp-23-00076
[6] Liu SQ, Chen Y, Shen Q, Gao XS. Sustainable professional development of german language teachers in china: research assessment and external research funding. Sustainability. 2022; 14(16):9910. https://doi.org/10.3390/su14169910
[7] Lo JJH. Between ah(m) and Euh(m): The distribution and realization of filled pauses in the speech of german-french simultaneous bilinguals. Language and Speech. 2020; 63(4):746-768. https://doi.org/10.1177/0023830919890068
[8] Ludusan B, Schuppler B. An analysis of prosodic boundaries across speaking styles in two varieties of German. Speech Communication. 2022; 141:93-106. https://doi.org/10.1016/j.specom.2022.05.002
[9] Lüke C, Kauschke C, Dohmen A, Haid A, Leitinger C, Männel C, et al. Definition and terminology of developmental language disorders-Interdisciplinary consensus across German-speaking countries. Plos One. 2023; 18(11): e0293736. https://doi.org/10.1371/journal.pone.0293736
[10] Marchenko TV. The Russian language in Germany: its life, adventures, and learning specifics. Herald of the Russian Academy of Sciences. 2020; 90(1):116- 121. https://doi.org/10.1134/s1019331620010104
[11] Stasche N, Bärmann M. History of the German-language ENT journals. German version. HNO. 2021; 69(5):385-415. https://doi.org/10.1007/s00106-021-01035-y
[12] Stockert A, Wawrzyniak M, Klingbeil J, Wrede K, Kümmerer D, Hartwigsen G, et al. Dynamics of language reorganization after left temporo-parietal and frontal stroke. Brain. 2020; 143:844-861. https://doi.org/10.1093/brain/awaa023
[13] Stroh AL, Rösler F, Dormal G, Salden U, Skotara N, Hänel-Faulhaber B, et al. Neural correlates of semantic and syntactic processing in German Sign Language. Neuroimage. 2019; 200:231-241. https://doi.org/10.1016/j.neuroimage.2019.06.025
[14] Tayir T, Li L. Unsupervised multimodal machine translation for low-resource distant language Pairs. ACM Transactions on Asian and Low-Resource Language Information Processing. 2024; 23(4):1-22. https://doi.org/10.1145/3652161
[15] Albadán J, Gaona P, Montenegro C, González- Crespo, R., & Herrera-Viedma, E. Fuzzy logic models for non-programmed decision-making in personnel selection processes based on gamification. Informatica, 2018, 29(1): 1-20. https://doi.org/10.15388/Informatica.2018.155
[16] Gül S, Aydoğdu A. Novel entropy measure definitions and their uses in a modified combinative distance-based assessment (CODAS) method under picture fuzzy environment. Informatica, 2021, 32(4): 759-794. https://doi.org/10.15388/21-INFOR458
[17] Titze L, Lutz M, Franke I, Streb J, Dudeck M. German language skills of forensic psychiatric inpatients with a history of migration: a baseline study. Recht & Psychiatrie. 2021; 39(3):140-146. https://doi.org/10.1486/rp-2021-03_140
[18] Tuerk S, Domahs U. Orthographic influences on spoken word recognition in bilinguals are dependent on the orthographic depth of the target language not the native language. Brain and Language. 2022; 235:105186. https://doi.org/10.1016/j.bl.2022.105186
[19] Ulrich L, Thies P, Schwarz A. Availability, quality, and evidence-based content of mHealth apps for the treatment of nonspecific low back pain in the German language: systematic assessment. Jmir Mhealth and Uhealth. 2023; 11: e47502. https://doi.org/10.2196/47502
[20] Lindemann J, Goldberg-Bockhorn E, Stupp F, Scheithauer M, Sieron HL, Hoffmann TK, et al. Adaption of the "empty nose 6 item questionnaire" (ENS6Q) into German language. Laryngo-Rhino-Otologie. 2022; 101(12):979-986. https://doi.org/10.1055/a-1841-6542
[21] Müller F, Holman H, Hummers E, Schröder D, Noack EM. Multilingual competencies among ambulatory care providers in three German Federal States. BMC Primary Care. 2022; 23(1):315. https://doi.org/10.1186/s12875-022-01926-1