1. Introduction
Molodtsov [1,2] proposed soft set theory as a novel concept in 1999 to handle vagueness and uncertainty. Mathematical tools using a combination of the soft set model and other mathematical models have since been developing rapidly, such as the fuzzy soft set [3], the intuitionistic fuzzy soft set [4], the interval-valued fuzzy soft set [5], the interval-valued intuitionistic fuzzy soft set [6,7], the belief interval-valued soft set [8], the confidence soft set [9], the linguistic value soft set [10], separable fuzzy soft sets [11], dual hesitant fuzzy soft sets [12], the Z-soft fuzzy rough set [13], the fuzzy parameterized fuzzy soft set [14,15,16,17], interval-valued q-rung orthopair fuzzy soft sets [18], the interval-valued multi-polar fuzzy soft set [19], the soft rough set [20], etc. The fuzzy soft set is one of the most important branches of soft sets. Maji et al. [21] were the first to combine a fuzzy set with a soft set and put forward the idea of a fuzzy soft set. This concept has been further developed in [22]. There are many practical and valuable applications based on fuzzy soft sets. Sadiq et al. [23] proposed an approach for ranking the functional requirements of software using fuzzy soft set theory. A novel time-varying weight determination method based on a fuzzy soft set was given in [24]. A combination of the association rule method and the fuzzy soft set model was proposed in [25]. It is worth mentioning that the concept of a fuzzy soft set has been widely used in the field of decision making. Maji and Roy [26] proposed a target-recognition method based on a fuzzy soft set for imprecise multi-observer data, which was improved in [27]. The authors of [28] discussed the fuzzy soft aggregation operator, which supports the creation of more effective decision-making approaches. Using level soft sets, an adjustable decision method based on the fuzzy soft set was proposed in [29].
The authors of [30] described the concept of fuzzy soft matrices and their related operations, which allowed them to propose a new decision-making method. The authors of [31] showed a process of information fusion that provides a more reliable resultant fuzzy soft set from an input dataset. Tang et al. [32] proposed the gray relational analysis method based on a fuzzy soft set in decision making. Deng et al. [33] proposed an object-parameter method to predict missing data in incomplete fuzzy soft sets. Uncertainty handling is one of the most important and difficult tasks in medical decision-making. The authors of [34] improved the decision algorithm based on a fuzzy soft set using a fuzzy measure and D-S evidence theory, which is often applied to medical diagnoses. The authors of [35] proposed a chest X-ray-enhanced diagnosis method for pneumonia malformations based on a fuzzy soft set and D-S evidence theory. Chen et al. [36] proposed a group decision-making algorithm based on an extended fuzzy soft set in order to identify cognitive differences among different decision makers.
Nevertheless, there are some redundant parameters in the actual decision-making process. A parameter-reduction set is the smallest subset of parameters that exhibits the same reduction results or descriptions as the original parameter set. Parameter reduction is one of the important research issues involving applications of these tools that deal with uncertainty [37,38]. Kong et al. [39] first proposed the normal parameter reduction of fuzzy soft set theory. Ma et al. [40] proposed an efficient distance-based parameter-reduction algorithm for this model. In [41], a reduction in the parameters of fuzzy soft sets was studied from a new perspective on scoring criteria and was improved in [42]. However, the two existing algorithms do not consider newly added parameters and require more computation, which leads to low extendibility.
To address these issues, we propose an S-Score table-based parameter-reduction method for fuzzy soft sets. Our contributions are as follows:
(1). A new S-Score table-based parameter-reduction method for fuzzy soft sets is presented.
(2). The proposed approach requires relatively less computation in comparison with the two existing algorithms in [41,42].
(3). The proposed approach considers newly added parameters. Due to this consideration of the added parameters, our proposed approach has much better flexibility and is beneficial to the extension of fuzzy soft sets and the combination of multiple fuzzy soft sets.
(4). The experimental results on two real-life applications show the validity and feasibility of our approach.
The rest of this paper is organized as follows. Section 2 reviews the basic concepts of soft set theory and fuzzy soft set theory and discusses the two parameter-reduction methods for fuzzy soft sets proposed in [41,42]. In Section 3, our parameter-reduction algorithm for fuzzy soft sets based on an S-Score table is proposed. In Section 4, this newly proposed algorithm is compared with the two existing methods in two real-life applications. Finally, Section 5 concludes the paper.
2. Related Work
In the current section, we briefly recall the basic notions of soft sets, fuzzy soft sets, and related parameter-reduction methods for fuzzy soft sets.
2.1. Basic Notions
First, we recall the basic definition of fuzzy sets initially developed by Zadeh [43] in 1965.
([43]). A fuzzy set F in the universe U is defined as F = {(x, μ_F(x)) | x ∈ U}. μ_F: U → [0, 1] is called the membership function of F, and μ_F(x) indicates the membership degree of x, ranging from 0 to 1. The family of all fuzzy sets on U is denoted by F(U).
Molodtsov [1] defined soft sets in the following way. Let U be an initial universe of objects and E be the set of parameters in relation to objects in U.
Parameters are often regarded as attributes, characteristics, or properties of objects. Let P(U) denote the power set of U and A ⊆ E.
A pair (F, A) is called a soft set over U, where F is a mapping given by F: A → P(U).
Maji et al. [21] initiated the study on hybrid structures involving both fuzzy sets and soft sets. They introduced the notion of fuzzy soft sets, which can be seen as the fuzzy generalization of a classical soft set. Maji et al. [21] proposed the concept of a fuzzy soft set as follows.
(See [21]). Let U be an initial universe of objects, E be a set of parameters in relation to objects in U, A ⊆ E, and F(U) be the set of all fuzzy subsets of U. A pair (F, A) is called a fuzzy soft set over U, where F is a mapping given by F: A → F(U).
2.2. Existing Parameter-Reduction Methods for Fuzzy Soft Sets
Parameter reduction is an important process of decision-making applications for fuzzy soft sets. Here, we mainly recall two existing methods of parameter reduction.
Kong et al. [41] combined the score decision criterion with the standard parameter reduction method of a soft set and developed a score decision criterion parameter-reduction algorithm based on a fuzzy soft set, which is abbreviated as the S-normal reduction algorithm (SNR).
However, this method is computationally expensive. To simplify the calculation, an improved parameter-reduction method based on the score decision criteria of a fuzzy soft set (ISNR) was presented in [42].
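To make the recalled score criterion concrete, the comparison table and scores used by SNR can be sketched as follows. This is a minimal Python sketch; the dictionary layout and function names are illustrative assumptions, and the data are the candidate evaluations given later in Table 5.

```python
def comparison_table(data):
    """c[i][k] = number of parameters on which object i's membership
    value is greater than or equal to object k's."""
    objs = list(data)
    return {i: {k: sum(a >= b for a, b in zip(data[i], data[k])) for k in objs}
            for i in objs}

def snr_scores(data):
    """Score of an object = row sum minus column sum of the comparison table."""
    c = comparison_table(data)
    objs = list(data)
    return {i: sum(c[i][k] for k in objs) - sum(c[k][i] for k in objs)
            for i in objs}

# Membership values of the five candidates (Table 5)
candidates = {
    "P1": [0.18, 0.82, 0.45, 0.45, 0.15, 0.55, 0.85],
    "P2": [0.54, 0.75, 0.70, 0.25, 0.45, 0.35, 0.55],
    "P3": [0.62, 0.50, 0.30, 0.75, 0.15, 0.85, 0.85],
    "P4": [0.85, 0.45, 0.78, 0.60, 0.35, 0.45, 0.80],
    "P5": [0.89, 0.32, 0.89, 0.50, 0.80, 0.25, 0.25],
}
print(snr_scores(candidates))  # matches Table 6: -2, -6, 4, 4, 0
```

The diagonal of the comparison table equals the number of parameters (an object always dominates itself), which is why Table 6 carries 7 on its diagonal.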
However, the two existing methods (Algorithms 1 and 2) presented above do not consider newly added parameters and require more computation, which leads to low extendibility when multiple datasets are combined. As a result, we propose a new parameter-reduction method that takes the added parameters into account.
Algorithm 1: S-normal reduction algorithm (SNR) [41].
Step 1: Input a fuzzy soft set (F, E);
Algorithm 2: ISNR [42].
Step 1: Input a fuzzy soft set (F, E);
3. Our Proposed Method
By introducing a new concept called the S-Score table in this section, we provide a new approach that successfully overcomes the limitations of the SNR and ISNR methods.
Let U = {h_1, h_2, …, h_n} be the universe and E = {e_1, e_2, …, e_m} be the attribute set. μ_{e_j}(h_i) is the membership value of object h_i for parameter e_j. c_ij is the number of objects h_k for which the membership value of h_i is equal to or greater than the membership value of h_k, and d_ij is the number of objects h_k for which the membership value of h_i is equal to or less than the membership value of h_k.
The S-Score value of object h_i on e_j is denoted by s_ij and defined by
s_ij = c_ij − d_ij. (1)
The overall S-Score of object h_i is denoted by S_i and defined as
S_i = Σ_{j=1}^{m} s_ij. (2)
The S-Score table is a table in which the rows are labeled by the objects h_1, h_2, …, h_n and the columns are labeled by the attributes e_1, e_2, …, e_m. The entry corresponding to object h_i and attribute e_j is denoted by s_ij.
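Equations (1) and (2) translate directly into code. The following is a minimal Python sketch (the function name and data layout are illustrative assumptions); each row of the returned table holds s_i1, …, s_im followed by the overall S-Score S_i, and it is applied here to the data of Table 1.

```python
def s_score_table(data):
    """s_ij = c_ij - d_ij per Equation (1); the last entry of each
    row is the overall S-Score S_i per Equation (2)."""
    objs = list(data)
    m = len(next(iter(data.values())))
    table = {}
    for i in objs:
        row = []
        for j in range(m):
            c = sum(data[i][j] >= data[k][j] for k in objs)  # c_ij (ties included)
            d = sum(data[i][j] <= data[k][j] for k in objs)  # d_ij (ties included)
            row.append(c - d)
        table[i] = row + [sum(row)]  # append S_i
    return table

# Membership values from Table 1
example = {
    "P1": [0.3, 0.1, 0.8, 0.3, 0.2, 0.4, 0.7],
    "P2": [0.3, 0.3, 0.6, 0.2, 0.5, 0.2, 0.3],
    "P3": [0.4, 0.2, 0.7, 0.8, 0.2, 0.9, 0.9],
    "P4": [0.7, 0.4, 0.5, 0.5, 0.4, 0.3, 0.8],
    "P5": [0.2, 0.5, 0.4, 0.4, 0.7, 0.1, 0.2],
    "P6": [0.6, 0.7, 0.3, 0.1, 0.6, 0.5, 0.4],
}
print(s_score_table(example)["P1"])  # [-2, -5, 5, -1, -4, 1, 1, -5]
```

Note that ties count on both sides: for the two objects sharing 0.3 on e_1, both c and d include each other, giving the repeated entry −2 in Table 2.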
To illustrate our method in Algorithm 3, we give the following example.
Algorithm 3: The proposed reduction algorithm based on the S-Score table.
Step 1: Input a fuzzy soft set (F, E);
There are six objects U = {P1, P2, P3, P4, P5, P6}, and E = {e1, e2, e3, e4, e5, e6, e7} is a collection of parameters. The fuzzy soft set is shown in Table 1.
According to our Algorithm 3, the following steps are given:
Step 1: Input a fuzzy soft set as shown in Table 1;
Step 2: Compute the S-Score value s_ij for each object h_i on the attribute e_j using Equation (1);
Hence, we can obtain the following:
Similarly, we calculate the remaining S-Score values, as shown in Table 2.
Step 3: Compute S_i^T = Σ_{e_j ∈ T} s_ij for any subset T of E; if S_i^T = 0 for all objects, then T is called a non-essential set in E.
In this process, we find that S_i^T = 0 for all of the objects when T = {e2, e3}, which is illustrated in Table 3.
Step 4: We find that T = {e2, e3} is the largest non-essential subset. Therefore, the remaining parameter subset E − T = {e1, e4, e5, e6, e7} is the final reduction result.
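Steps 2 to 4 above can be sketched end to end as a brute-force search over candidate subsets (a Python sketch with illustrative names; Table 1 serves as input, and scanning the larger subsets first guarantees that the first hit is the largest non-essential set):

```python
from itertools import combinations

def s_scores(data):
    # s_ij = c_ij - d_ij, Equation (1)
    objs = list(data)
    m = len(next(iter(data.values())))
    return {i: [sum(data[i][j] >= data[k][j] for k in objs)
                - sum(data[i][j] <= data[k][j] for k in objs)
                for j in range(m)] for i in objs}

def reduce_parameters(data, params):
    """Find the largest non-essential subset T (every object's sum of
    s_ij over T is zero) and return the remaining parameters E - T."""
    s = s_scores(data)
    for size in range(len(params) - 1, 0, -1):  # largest subsets first
        for T in combinations(range(len(params)), size):
            if all(sum(row[j] for j in T) == 0 for row in s.values()):
                return [p for j, p in enumerate(params) if j not in T]
    return list(params)  # nothing reducible

# Membership values from Table 1
example = {
    "P1": [0.3, 0.1, 0.8, 0.3, 0.2, 0.4, 0.7],
    "P2": [0.3, 0.3, 0.6, 0.2, 0.5, 0.2, 0.3],
    "P3": [0.4, 0.2, 0.7, 0.8, 0.2, 0.9, 0.9],
    "P4": [0.7, 0.4, 0.5, 0.5, 0.4, 0.3, 0.8],
    "P5": [0.2, 0.5, 0.4, 0.4, 0.7, 0.1, 0.2],
    "P6": [0.6, 0.7, 0.3, 0.1, 0.6, 0.5, 0.4],
}
params = ["e1", "e2", "e3", "e4", "e5", "e6", "e7"]
print(reduce_parameters(example, params))  # ['e1', 'e4', 'e5', 'e6', 'e7']
```

The exhaustive subset scan is only for illustration on this small example; the paper's point is that the per-parameter S-Score columns make the zero-sum test itself cheap.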
We also apply SNR and ISNR to this example. As a result, the parameter-reduction results are the same: {e1, e4, e5, e6, e7}. That is, the three methods provide equivalent reduction results. In order to verify this point, we give the following theorem.
Suppose that (F, E) is a fuzzy soft set on U, T ⊆ E is a non-essential set, and E − T is the remaining parameter set. For any object h_i, calculating its score r_i and priority ranking R using the SNR algorithm, its score r′_i and priority ranking R′ using the ISNR algorithm, and its overall S-Score S_i and priority ranking R″ using our algorithm, we have r_i = r′_i = S_i and R = R′ = R″.
Since the SNR score r_i of h_i is the row sum minus the column sum of the comparison table, and each entry of that table counts, parameter by parameter, whether the membership value of h_i is greater than or equal to that of another object, r_i equals the sum over all parameters of the number of objects whose membership value h_i dominates minus the number of objects whose membership value dominates that of h_i; that is, r_i equals the overall S-Score S_i.
In the same way, we can obtain that the ISNR score r′_i equals S_i. To sum up, we have r_i = r′_i = S_i.
We use Equations (1) and (2) to create the S-Score table of the fuzzy soft set, which is shown in Table 4. We can obtain the rank of objects according to their overall S-Scores. □
According to our method, we should find a subset T with S_1^T = S_2^T = … = S_n^T = 0. Because every S_i^T = 0, it is clear that the rank of objects based on the overall S-Scores over E is the same as the rank of objects based on the overall S-Scores over E − T. That is, the object priority remains unchanged after the redundant parameter set is reduced, so the three algorithms produce the same ranking. This completes the proof.
From the above theorem, we can conclude that the three reduction algorithms provide equivalent reduction results.
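The score equality underlying the theorem can be checked numerically: the SNR score (row sum minus column sum of the comparison table) coincides with the overall S-Score of Equation (2) on the data of Table 1. This is a Python sketch with illustrative function names.

```python
def snr_scores(data):
    # SNR: c[i][k] = number of parameters where object i's membership
    # value >= object k's; score = row sum - column sum.
    objs = list(data)
    c = {i: {k: sum(a >= b for a, b in zip(data[i], data[k])) for k in objs}
         for i in objs}
    return {i: sum(c[i][k] for k in objs) - sum(c[k][i] for k in objs)
            for i in objs}

def overall_s_scores(data):
    # Equations (1) and (2): S_i = sum over j of (c_ij - d_ij)
    objs = list(data)
    m = len(next(iter(data.values())))
    return {i: sum(sum(data[i][j] >= data[k][j] for k in objs)
                   - sum(data[i][j] <= data[k][j] for k in objs)
                   for j in range(m)) for i in objs}

# Membership values from Table 1
example = {
    "P1": [0.3, 0.1, 0.8, 0.3, 0.2, 0.4, 0.7],
    "P2": [0.3, 0.3, 0.6, 0.2, 0.5, 0.2, 0.3],
    "P3": [0.4, 0.2, 0.7, 0.8, 0.2, 0.9, 0.9],
    "P4": [0.7, 0.4, 0.5, 0.5, 0.4, 0.3, 0.8],
    "P5": [0.2, 0.5, 0.4, 0.4, 0.7, 0.1, 0.2],
    "P6": [0.6, 0.7, 0.3, 0.1, 0.6, 0.5, 0.4],
}
assert snr_scores(example) == overall_s_scores(example)  # same scores, same ranking
```

The equality holds for any fuzzy soft set, not just this example, because both sides sum the same pairwise dominance counts, only grouped differently (by object pair versus by parameter).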
4. Comparison Results among Three Methods
In this section, first, we compare the proposed algorithm with the two existing methods, SNR (Algorithm 1) and ISNR (Algorithm 2), on two real-life applications. We then summarize the comparison among the three methods in terms of consideration of the added parameters, flexibility, and computational complexity.
4.1. Case 1: Personal Postgraduate Enrollment for the Supervisors
After the postgraduate entrance examination, one supervisor who works at Northwest Normal University plans to recruit one graduate student majoring in computer science. He receives five emails with resumes. That is, there are five candidates. Furthermore, this supervisor examines the five resumes and summarizes seven appraisal items as diverse as "reputation of the university at which the candidate studied their bachelor's degree", "international ranking of their computer science major", "GPA", "English reading ability", "English writing ability", "academic performance", and "internship experience" to evaluate the five candidates. We apply a fuzzy soft set to display the performances of the five candidates in the seven aspects. Suppose that U = {P1, P2, P3, P4, P5} is the set of five candidates and E = {e1, e2, e3, e4, e5, e6, e7} is the set of seven appraisal items. Table 5 presents the data records for the personal postgraduate enrollment system used by the supervisor as a fuzzy soft set (F, E).
4.1.1. Three Methods on the Original Dataset
SNR
According to the algorithm of SNR, first, we calculate the comparison and score table of the fuzzy soft set (F, E), which is shown in Table 6. We can see that the scores of the objects are −2, −6, 4, 4, and 0, respectively. The priority ranking is P3 = P4 ≻ P5 ≻ P1 ≻ P2.
We check the matrix W_E − W_{E−T} for the subset T = {e1, e2, e5, e7} and find that this matrix is symmetric. As a result, the final S-normal reduction result is E − T = {e3, e4, e6}.
In this algorithm, we first calculate the comparison table of the fuzzy soft set; the number of elements accessed is 25 × 7 = 175. Next, we check the matrix W_E − W_{E−T} for the subset; in this step, the number of elements accessed is 2 × 25 × 5 + 6 × 25 = 400. From the above steps, we can conclude that the total number of elements accessed for SNR is 575.
ISNR
First, the special subset T = {e1, e2, e5, e7} is found; then, we calculate the comparison table and the score table of this subset, which are shown in Table 7. The number of elements accessed for this step is 100. The difference is then calculated from the score table, and the number of elements accessed is 50. From the above steps, we can obtain that the total number of elements accessed by ISNR is 150.
The New Proposed Algorithm
According to our proposed algorithm, the following steps are given: the special subset T = {e1, e2, e5, e7} is found, and the number of elements accessed when comparing the membership degrees of different objects using Equation (1) in this step is 80. Additionally, the number of elements accessed to obtain Table 8 using Equation (2) is 20. From the above steps, the total number of elements accessed is 100.
Compared with the SNR algorithm, it is clear that the improvement in the total number of elements accessed is up to 82.6% in this process, while the improvement in the total number of elements accessed is 33.3% compared with ISNR.
4.1.2. Three Methods on the Extended Dataset
Suppose that (F, E) is an original fuzzy soft set and that a new attribute set T′ = {e8, e9} should be added to it. If parameter reduction is performed using SNR, one has to assemble the two parameter sets into one parameter set and compute a new comparison table for the new fuzzy soft set. Here, after face-to-face interviews with the five candidates, the supervisor considers adding two new attributes, "expression ability" (e8) and "interest in research" (e9), to evaluate the applicants, as shown in Table 9. However, for the newly added parameters, the three methods have different reduction processes and numbers of elements accessed.
SNR
According to SNR, the following steps are given:
Step 1: Combine Table 5 and Table 9 into a new fuzzy soft set, as shown in Table 10.
Step 2: Table 11 presents a comparison table of the fuzzy soft set , and the number of elements accessed for this step is 250.
Step 3: Calculate the score table according to the comparison table in Step 2, the number of elements accessed for this step is 2 × 25 = 50.
As you can see from Table 11, after adding the new attributes, the object score list and prioritization are consistent with the results of the original dataset, so the newly added attribute set is not a necessary set and can be reduced. On the extended dataset, the total number of elements accessed for SNR is 300.
ISNR
According to ISNR, the following steps are given:
Step 1: Calculate the comparison and score tables of the new attribute sets, as shown in Table 12, and the number of elements accessed in this step is 50;
Step 2: Calculate the difference based on the score table in Step 1, and the number of elements accessed is 50;
Step 3: As you can see from Table 12, the newly added attribute set can be reduced because all of the object scores are 0. Therefore, the total number of elements accessed is 100.
Our Proposed Algorithm
According to our proposed algorithm, the following steps are given:
Step 1: Compute the S-Score table for the two newly added attributes, as shown in Table 13;
Step 2: For the newly added parameters T′ = {e8, e9}, obtain S_i^T′ = 0 for all of the objects, so the newly added attributes can be reduced;
Using this method, we create an S-Score table for the two newly added attributes, and the number of elements accessed in this step is 40. Then, we obtain the zero sums for the two added parameters, and the number of elements accessed is 10. Finally, the total number of elements accessed for the extended dataset is 40 + 10 = 50.
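Because the S-Score of a parameter column depends only on that column, the check for newly added parameters touches nothing but the new columns. The following Python sketch (illustrative names; the two new attribute columns of Table 9 as input) reproduces the S-Scores of Table 13 and the zero row sums that justify the reduction:

```python
def s_score_columns(new_cols):
    """S-Scores of newly added parameter columns only (Equation (1));
    the new columns are reducible when each object's row sum is zero."""
    objs = list(new_cols)
    t = len(next(iter(new_cols.values())))
    return {i: [sum(new_cols[i][j] >= new_cols[k][j] for k in objs)
                - sum(new_cols[i][j] <= new_cols[k][j] for k in objs)
                for j in range(t)] for i in objs}

# Table 9: e8 = "expression ability", e9 = "interest in research"
new_cols = {
    "P1": [0.70, 0.20],
    "P2": [0.35, 0.80],
    "P3": [0.80, 0.15],
    "P4": [0.60, 0.50],
    "P5": [0.40, 0.60],
}
s = s_score_columns(new_cols)
print(s["P1"])  # [2, -2], as in Table 13
print(all(sum(row) == 0 for row in s.values()))  # True: the new attributes reduce
```

No table over the original seven parameters is recomputed, which is exactly why only 50 element accesses are needed here.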
Table 14 shows the comparative results of the three reduction algorithms. For both the original dataset and the extended dataset, the three methods have the same reduction results, so the three reduction algorithms are equivalent.
Compared with SNR, the improvement in the total number of element accesses is up to 83% in this decision process after adding new parameters, while the improvement in the total number of element accesses is up to 50% compared with ISNR. The proposed approach considers the newly added parameters. Due to this consideration of the added parameters, our proposed approach has much higher flexibility and is beneficial to the extension of fuzzy soft sets and the combination of multiple fuzzy soft sets.
4.2. Case 2: Evaluation for Academic Papers
Researchers usually use many measurement indicators to evaluate academic papers. A beginning researcher wants to read an excellent scientific journal paper on the research topic of "data mining", so he collects five academic papers from Baidu Scholar. He cares about the performance of these academic papers in seven aspects: "downloads", "cited frequency", "number of results", "H-index", "number of cited", "reading volume", and "impact factor". Here, we employ a fuzzy soft set model to describe the five academic papers. U is a collection of five academic papers, U = {"Design and Application of Teaching Quality Monitoring and Evaluation System Based on Data Mining", "Study on Data Mining for Combat Simulation", "Research on Intrusion Detection of Data Mining Model Based on Improved Apriori Algorithm", "Construction of Cloud Service Platform for Network Big Data Mining", "Overview of Data Mining"} [42], and E = {e1, e2, e3, e4, e5, e6, e7} is the set of parameters. All of the data are normalized into the unit interval between 0 and 1. The specific data are shown in Table 15 below. However, because some measurements tend to be similar, there are some redundant data in the evaluation. We use the three methods to obtain the parameter-reduction results.
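As an illustration of the zero-sum test on this dataset, applying Equation (1) to the data of Table 15 shows that the S-Score sums of the column pair {e1, e6} vanish for every paper, so that pair is non-essential under our criterion (a Python sketch with illustrative names):

```python
def s_scores(data):
    # s_ij = c_ij - d_ij, Equation (1)
    objs = list(data)
    m = len(next(iter(data.values())))
    return {i: [sum(data[i][j] >= data[k][j] for k in objs)
                - sum(data[i][j] <= data[k][j] for k in objs)
                for j in range(m)] for i in objs}

# Normalized indicator values of the five academic papers (Table 15)
papers = {
    "P1": [0.05, 0.09, 0.12, 0.16, 0.05, 0.95, 0.44],
    "P2": [0.17, 0.95, 0.95, 0.95, 0.23, 0.75, 0.95],
    "P3": [0.22, 0.40, 0.11, 0.61, 0.23, 0.55, 0.05],
    "P4": [0.72, 0.43, 0.08, 0.50, 0.05, 0.15, 0.18],
    "P5": [0.95, 0.05, 0.05, 0.05, 0.95, 0.05, 0.63],
}
s = s_scores(papers)
T = (0, 5)  # columns e1 and e6
print(all(sum(row[j] for j in T) == 0 for row in s.values()))  # True
```

Intuitively, e1 and e6 rank the five papers in exactly opposite orders here, so their S-Scores cancel object by object.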
By comparing the above three algorithms, the following can be found:
(1). The three algorithms obtain the same reduction results;
(2). In this case, the numbers of elements accessed by SNR, ISNR, and our method are 350, 100, and 60, respectively;
(3). In terms of the number of element accesses, the newly proposed reduction algorithm improves on SNR and ISNR by 83% and 40%, respectively.
The comparison results of the three reduction algorithms on the evaluation of academic papers are shown in Table 16 below.
4.3. Computational Complexity
Assume a fuzzy soft set (F, E) with initial universe U, where the objects and parameters form n rows and m columns, respectively. There are non-essential parameter sets in the original dataset E. The three reduction algorithms access elements only for a special subset of the original dataset, whose number of columns is recorded as t. The number of elements accessed by the three reduction algorithms on the original dataset is analyzed as follows:
4.3.1. SNR
If a special subset T exists in the original parameter set E, t represents the number of columns of the special subset T. SNR must compute the comparison table over all m parameters, so, using big O notation, the computational complexity of SNR is O(n^2 m).
4.3.2. ISNR
For ISNR, if the special subset T has t columns, only the comparison and score tables of T are computed; using big O notation, the computational complexity of ISNR is O(n^2 t).
4.3.3. The Newly Proposed Algorithm
For the reduction algorithm of this study, only the S-Score table of the special subset T is calculated. If T has t columns, the number of elements accessed when comparing membership values with Equation (1) is n(n − 1)t, and summing up the score table with Equation (2) takes a further nt accesses. In summary, the total number of elements accessed by the reduction algorithm in this study is n(n − 1)t + nt = n^2 t. Using big O notation, the computational complexity of the proposed algorithm is O(n^2 t), with fewer total element accesses than ISNR (e.g., 100 versus 150 in Case 1).
Finally, we summarize the comparison results among the three methods as shown in Table 17.
5. Conclusions
In this paper, we proposed a new parameter-reduction method for fuzzy soft sets. As can be learned from the above two datasets, our proposed method has the same reduction results as the two existing parameter-reduction approaches for fuzzy soft sets, SNR and ISNR. However, it is clear that our method outperforms SNR and ISNR in terms of the number of elements accessed. When new parameters have to be added, our method takes the added parameters into account. Hence, our method has better flexibility and extendibility compared with SNR and ISNR. It can be applied to the extension of fuzzy soft sets and the combination of multiple evaluation systems. However, the proposed approach has limitations regarding computation when the number of attributes is very large. In future work, we will extend this parameter-reduction method to other mathematical models such as the interval-valued fuzzy soft set, the soft rough set, etc.
Conceptualization, H.Q. and X.M.; methodology, W.W.; software, C.G.; validation, X.M. and Y.W.; investigation, C.G.; data curation, C.G.; writing—original draft preparation, C.G. and H.Q.; writing—review and editing, X.M. and Y.W.; supervision, H.Q.; project administration, H.Q.; funding acquisition, H.Q. All authors have read and agreed to the published version of the manuscript.
The authors declare no conflict of interest.
Table 1. Fuzzy soft set (S, P).
U | e 1 | e 2 | e 3 | e 4 | e 5 | e 6 | e 7 |
---|---|---|---|---|---|---|---|
P 1 | 0.3 | 0.1 | 0.8 | 0.3 | 0.2 | 0.4 | 0.7 |
P 2 | 0.3 | 0.3 | 0.6 | 0.2 | 0.5 | 0.2 | 0.3 |
P 3 | 0.4 | 0.2 | 0.7 | 0.8 | 0.2 | 0.9 | 0.9 |
P 4 | 0.7 | 0.4 | 0.5 | 0.5 | 0.4 | 0.3 | 0.8 |
P 5 | 0.2 | 0.5 | 0.4 | 0.4 | 0.7 | 0.1 | 0.2 |
P 6 | 0.6 | 0.7 | 0.3 | 0.1 | 0.6 | 0.5 | 0.4 |
Table 2. The S-Score table of (S, P).
U | e 1 | e 2 | e 3 | e 4 | e 5 | e 6 | e 7 |
---|---|---|---|---|---|---|---|
P 1 | −2 | −5 | 5 | −1 | −4 | 1 | 1 |
P 2 | −2 | −1 | 1 | −3 | 1 | −3 | −3 |
P 3 | 1 | −3 | 3 | 5 | −4 | 5 | 5 |
P 4 | 5 | 1 | −1 | 3 | −1 | −1 | 3 |
P 5 | −5 | 3 | −3 | 1 | 5 | −5 | −5 |
P 6 | 3 | 5 | −5 | −5 | 3 | 3 | −1 |
Table 3. The reduced parameters T = {e2, e3}.
U | e 2 | e 3 | ST |
---|---|---|---|
P 1 | −5 | 5 | 0 |
P 2 | −1 | 1 | 0 |
P 3 | −3 | 3 | 0 |
P 4 | 1 | −1 | 0 |
P 5 | 3 | −3 | 0 |
P 6 | 5 | −5 | 0 |
Table 4. S-Score table of the fuzzy soft set (F, E).
U | e1 | e2 | … | em | S |
---|---|---|---|---|---|
P1 | s11 | s12 | … | s1m | S1 |
P2 | s21 | s22 | … | s2m | S2 |
… | … | … | … | … | … |
Pn | sn1 | sn2 | … | snm | Sn |
Table 5. Fuzzy soft set (F, E) of the five candidates.
U | e 1 | e 2 | e 3 | e 4 | e 5 | e 6 | e 7 |
---|---|---|---|---|---|---|---|
P 1 | 0.18 | 0.82 | 0.45 | 0.45 | 0.15 | 0.55 | 0.85 |
P 2 | 0.54 | 0.75 | 0.70 | 0.25 | 0.45 | 0.35 | 0.55 |
P 3 | 0.62 | 0.50 | 0.30 | 0.75 | 0.15 | 0.85 | 0.85 |
P 4 | 0.85 | 0.45 | 0.78 | 0.60 | 0.35 | 0.45 | 0.80 |
P 5 | 0.89 | 0.32 | 0.89 | 0.50 | 0.80 | 0.25 | 0.25 |
Table 6. The comparison and score table of the fuzzy soft set (F, E).
U | P1 | P2 | P3 | P4 | P5 | Row-Sum | Column-Sum | Score |
---|---|---|---|---|---|---|---|---|
P 1 | 7 | 4 | 4 | 3 | 3 | 21 | 23 | −2 |
P 2 | 3 | 7 | 3 | 2 | 3 | 18 | 24 | −6 |
P 3 | 5 | 4 | 7 | 4 | 4 | 24 | 20 | 4 |
P 4 | 4 | 5 | 3 | 7 | 4 | 23 | 19 | 4 |
P 5 | 4 | 4 | 3 | 3 | 7 | 21 | 21 | 0 |
Table 7. The comparison and score table of the special subset T = {e1, e2, e5, e7}.
U | P1 | P2 | P3 | P4 | P5 | Row-Sum | Column-Sum | Score |
---|---|---|---|---|---|---|---|---|
P 1 | 4 | 2 | 1 | 2 | 2 | 11 | 11 | 0 |
P 2 | 2 | 4 | 2 | 2 | 2 | 12 | 12 | 0 |
P 3 | 1 | 2 | 4 | 2 | 2 | 11 | 11 | 0 |
P 4 | 2 | 2 | 2 | 4 | 2 | 12 | 12 | 0 |
P 5 | 2 | 2 | 2 | 2 | 4 | 12 | 12 | 0 |
Table 8. The S-Score table of the special subset T = {e1, e2, e5, e7}.
U | e1 | e2 | e5 | e7 | Si |
---|---|---|---|---|---|
P 1 | −4 | 4 | −3 | 3 | 0 |
P 2 | −2 | 2 | 2 | −2 | 0 |
P 3 | 0 | 0 | −3 | 3 | 0 |
P 4 | 2 | −2 | 0 | 0 | 0 |
P 5 | 4 | −4 | 4 | −4 | 0 |
Table 9. Fuzzy soft set of the two newly added attributes.
U | e8 | e9 |
---|---|---|
P1 | 0.70 | 0.20 |
P2 | 0.35 | 0.80 |
P3 | 0.80 | 0.15 |
P4 | 0.60 | 0.50 |
P5 | 0.40 | 0.60 |
Table 10. Fuzzy soft set after combining Table 5 and Table 9.
U | e1 | e2 | e3 | e4 | e5 | e6 | e7 | e8 | e9 |
---|---|---|---|---|---|---|---|---|---|
P 1 | 0.18 | 0.82 | 0.45 | 0.45 | 0.15 | 0.55 | 0.85 | 0.70 | 0.20 |
P 2 | 0.54 | 0.75 | 0.70 | 0.25 | 0.45 | 0.35 | 0.55 | 0.35 | 0.80 |
P 3 | 0.62 | 0.50 | 0.30 | 0.75 | 0.15 | 0.85 | 0.85 | 0.80 | 0.15 |
P 4 | 0.85 | 0.45 | 0.78 | 0.60 | 0.35 | 0.45 | 0.80 | 0.60 | 0.50 |
P 5 | 0.89 | 0.32 | 0.89 | 0.50 | 0.80 | 0.25 | 0.25 | 0.40 | 0.60 |
Table 11. The comparison and score table of the extended fuzzy soft set in Table 10.
U | P1 | P2 | P3 | P4 | P5 | Row-Sum | Column-Sum | Score |
---|---|---|---|---|---|---|---|---|
P 1 | 9 | 5 | 5 | 4 | 4 | 32 | 34 | −2 |
P 2 | 5 | 9 | 5 | 4 | 5 | 29 | 35 | −6 |
P 3 | 7 | 6 | 9 | 6 | 6 | 35 | 31 | 4 |
P 4 | 6 | 7 | 5 | 9 | 6 | 34 | 30 | 4 |
P 5 | 6 | 6 | 5 | 5 | 9 | 32 | 32 | 0 |
Table 12. The comparison and score table of the newly added attributes {e8, e9}.
U | P1 | P2 | P3 | P4 | P5 | Row-Sum | Column-Sum | Score |
---|---|---|---|---|---|---|---|---|
P 1 | 2 | 1 | 1 | 1 | 1 | 6 | 6 | 0 |
P 2 | 1 | 2 | 1 | 1 | 1 | 6 | 6 | 0 |
P 3 | 1 | 1 | 2 | 1 | 1 | 6 | 6 | 0 |
P 4 | 1 | 1 | 1 | 2 | 1 | 6 | 6 | 0 |
P 5 | 1 | 1 | 1 | 1 | 2 | 6 | 6 | 0 |
Table 13. The S-Score table of the newly added attributes {e8, e9}.
U | e8 | e9 | Si |
---|---|---|---|
P 1 | 2 | −2 | 0 |
P 2 | −4 | 4 | 0 |
P 3 | 4 | −4 | 0 |
P 4 | 0 | 0 | 0 |
P 5 | −2 | 2 | 0 |
Table 14. Comparison results among the three algorithms for case 1.
Comparison | SNR | ISNR | Our Algorithm | Improvement vs. SNR/ISNR |
---|---|---|---|---|
Reduction result | {e3, e4, e6} | {e3, e4, e6} | {e3, e4, e6} | The same |
Flexibility and extendibility | Weak | Weak | Strong | Stronger |
Considering the added parameters | No | No | Yes | - |
The total number of element access on the original dataset | 575 | 150 | 100 | 82.6%/33.3% |
The total number of element access on the extended dataset | 300 | 100 | 50 | 83.3%/50% |
Table 15. Fuzzy soft set of the evaluation system for academic papers [42].
U | e 1 | e 2 | e 3 | e 4 | e 5 | e 6 | e 7 |
---|---|---|---|---|---|---|---|
P 1 | 0.05 | 0.09 | 0.12 | 0.16 | 0.05 | 0.95 | 0.44 |
P 2 | 0.17 | 0.95 | 0.95 | 0.95 | 0.23 | 0.75 | 0.95 |
P 3 | 0.22 | 0.40 | 0.11 | 0.61 | 0.23 | 0.55 | 0.05 |
P 4 | 0.72 | 0.43 | 0.08 | 0.50 | 0.05 | 0.15 | 0.18 |
P 5 | 0.95 | 0.05 | 0.05 | 0.05 | 0.95 | 0.05 | 0.63 |
Table 16. Comparison results for case 2.
Algorithm Comparison | SNR | ISNR | Our Algorithm |
---|---|---|---|
Parameter reduction results |
|
|
|
Number of element access | 350 | 100 | 60 |
Table 17. Summary of comparison results.
Comparison | SNR | ISNR | Our Algorithm |
---|---|---|---|
Reduction result |
|
|
|
Flexibility and extendibility | Weak | Weak | Strong |
Considering the added parameters | No | No | Yes |
Computational complexity | O(n^2 m) | O(n^2 t) | O(n^2 t) |
References
1. Molodtsov, D. Soft set theory-first results. Comput. Math. Appl.; 1999; 37, pp. 19-31. [DOI: https://dx.doi.org/10.1016/S0898-1221(99)00056-5]
2. Han, B.; Li, Y.; Geng, S. 0–1 Linear programming methods for optimal normal and pseudo parameter reductions of soft sets. Appl. Soft Comput.; 2017; 54, pp. 467-484. [DOI: https://dx.doi.org/10.1016/j.asoc.2016.08.052]
3. Yang, Y.; Tan, X.; Meng, C. The multi-fuzzy soft set and its application in decision making. Appl. Math. Model.; 2013; 37, pp. 4915-4923. [DOI: https://dx.doi.org/10.1016/j.apm.2012.10.015]
4. Agarwal, M.; Biswas, K.K.; Hanmandlu, M. Generalized intuitionistic fuzzy soft sets with applications in decision-making. Appl. Soft Comput.; 2013; 13, pp. 3552-3566. [DOI: https://dx.doi.org/10.1016/j.asoc.2013.03.015]
5. Ma, X.; Qin, H.; Sulaiman, N.; Herawan, T.; Abawajy, J. The parameter reduction of the interval-valued fuzzy soft sets and its related algorithms. IEEE Trans. Fuzzy Syst.; 2014; 22, pp. 57-71. [DOI: https://dx.doi.org/10.1109/TFUZZ.2013.2246571]
6. Jiang, Y.; Tang, Y.; Chen, Q.; Liu, H.; Tang, J. Interval-valued intuitionistic fuzzy soft sets and their properties. Comput. Math. Appl.; 2010; 60, pp. 906-918. [DOI: https://dx.doi.org/10.1016/j.camwa.2010.05.036]
7. Ma, X.; Qin, H.J.; Abawajy, J. Interval-valued intuitionistic fuzzy soft sets based decision making and parameter reduction. IEEE Trans. Fuzzy Syst.; 2022; 30, pp. 357-369. [DOI: https://dx.doi.org/10.1109/TFUZZ.2020.3039335]
8. Vijayabalaji, S.; Ramesh, A. Belief interval-valued soft set. Expert Syst. Appl.; 2019; 119, pp. 262-271. [DOI: https://dx.doi.org/10.1016/j.eswa.2018.10.054]
9. Aggarwal, M. Confidence soft sets and applications in supplier selection. Comput. Ind. Eng.; 2019; 127, pp. 614-624. [DOI: https://dx.doi.org/10.1016/j.cie.2018.11.005]
10. Sun, B.; Ma, W.; Li, X. Linguistic value soft set-based approach to multiple criteria group decision-making. Appl. Soft Comput.; 2017; 58, pp. 285-296. [DOI: https://dx.doi.org/10.1016/j.asoc.2017.03.033]
11. Alcantud, J.C.R.; Mathew, T.J. Separable fuzzy soft sets and decision making with positive and negative attributes. Appl. Soft Comput.; 2017; 59, pp. 586-595. [DOI: https://dx.doi.org/10.1016/j.asoc.2017.06.010]
12. Arora, R.; Garg, H. A robust correlation coefficient measure of dual hesitant fuzzy soft sets and their application in decision making. Eng. Appl. Artif. Intell.; 2018; 72, pp. 80-92. [DOI: https://dx.doi.org/10.1016/j.engappai.2018.03.019]
13. Zhan, J.; Irfan Ali, M.; Mehmood, N. On a novel uncertain soft set model: Z-soft fuzzy rough set model and corresponding decision making methods. Appl. Soft Comput.; 2017; 56, pp. 446-457. [DOI: https://dx.doi.org/10.1016/j.asoc.2017.03.038]
14. Memis, S.; Enginoglu, S.; Erkan, U. Numerical Data Classification via Distance-Based Similarity Measures of Fuzzy Parameterized Fuzzy Soft Matrices. IEEE Access; 2021; 9, pp. 88583-88601. [DOI: https://dx.doi.org/10.1109/ACCESS.2021.3089849]
15. Memi, S.; Enginolu, S.; Erkan, U. A Classification Method in Machine Learning Based on Soft Decision-Making via Fuzzy Parameterized Fuzzy Soft Matrices. Soft Comput.; 2021; 26, pp. 1165-1180. [DOI: https://dx.doi.org/10.1007/s00500-021-06553-z]
16. Memiş, S.; Enginoğlu, S.; Erkan, U. Fuzzy Parameterized Fuzzy Soft k-Nearest Neighbor Classifier. Neurocomputing; 2022; 500, pp. 351-378. [DOI: https://dx.doi.org/10.1016/j.neucom.2022.05.041]
17. Memiş, S.; Enginoğlu, S.; Erkan, U. A New Classification Method Using Soft Decision-Making Based on an Aggregation Operator of Fuzzy Parameterized Fuzzy Soft Matrices. Turk. J. Electr. Eng. Comput. Sci.; 2022; 3, pp. 871-890. [DOI: https://dx.doi.org/10.55730/1300-0632.3816]
18. Ghous, A.; Muhammad, A.; Muhammad, A.; Adeel, S. Attribute reduction approaches under interval-valued q-rung orthopair fuzzy soft framework. Appl. Intell.; 2022; 52, pp. 8975-9000.
19. Akram, M.; Ali, G.; Alcantud, J.C.R. Parameter reduction analysis under interval-valued m-polar fuzzy soft information. Artif. Intell. Rev.; 2021; 54, pp. 5541-5582. [DOI: https://dx.doi.org/10.1007/s10462-021-10027-x]
20. Zhan, J.; Liu, Q.; Herawan, T. A novel soft rough set: Soft rough hemirings and corresponding multicriteria group decision making. Appl. Soft Comput.; 2017; 54, pp. 392-402. [DOI: https://dx.doi.org/10.1016/j.asoc.2016.09.012]
21. Maji, P.; Biswas, K.R.; Roy, A.R. Fuzzy soft sets. J. Fuzzy Math.; 2001; 9, pp. 589-602.
22. Majumdar, P.; Samanta, S.K. Generalized fuzzy soft sets. Comput. Math. Appl.; 2010; 59, pp. 1425-1432. [DOI: https://dx.doi.org/10.1016/j.camwa.2009.12.006]
23. Sadiq, M.; Devi, V. Fuzzy-soft set approach for ranking the functional requirements of software. Expert Syst. Appl.; 2022; 193, 116452. [DOI: https://dx.doi.org/10.1016/j.eswa.2021.116452]
24. Li, H.; Xiong, S. Time-varying weight coefficients determination based on fuzzy soft set in combined prediction model for travel time. Expert Syst. Appl.; 2021; 189, 115198. [DOI: https://dx.doi.org/10.1016/j.eswa.2021.115998]
25. Dede, R.; Noor, A.S.; Mustafa, M.D. Association rules of fuzzy soft set based classification for text classification problem. J. King Saud Univ.-Comput. Inf. Sci.; 2020; 34, pp. 801-812.
26. Maji, P.; Roy, A.R. A fuzzy soft set theoretic approach to decision making problems. J. Comput. Appl. Math.; 2007; 203, pp. 412-418.
27. Kong, Z.; Gao, L.; Wang, L. Comment on ‘a fuzzy soft set theoretic approach to decision making problems’. J. Comput. Appl. Math.; 2009; 223, pp. 540-542. [DOI: https://dx.doi.org/10.1016/j.cam.2008.01.011]
28. Çağman, N.; Enginoğlu, S.; Çıtak, F. Fuzzy Soft Set Theory and Its Applications. Iran. J. Fuzzy Syst.; 2011; 8, pp. 137-147.
29. Feng, F.; Jun, Y.; Liu, X.; Li, L. An adjustable approach to fuzzy soft set based decision making. J. Comput. Appl. Math.; 2010; 234, pp. 10-20. [DOI: https://dx.doi.org/10.1016/j.cam.2009.11.055]
30. Çağman, N.; Enginoğlu, S. Fuzzy Soft Matrix Theory and Its Application in Decision Making. Iran. J. Fuzzy Syst.; 2012; 9, pp. 109-119.
31. Alcantud, J.C.R. A novel algorithm for fuzzy soft set based decision making from multiobserver input parameter data set. Inf. Fusion; 2016; 29, pp. 142-148.
32. Tang, H. A novel fuzzy soft set approach in decision making based on grey relational analysis and Dempster–Shafer theory of evidence. Appl. Soft Comput.; 2015; 31, pp. 317-325. [DOI: https://dx.doi.org/10.1016/j.asoc.2015.03.015]
33. Deng, T.; Wang, X. An object-parameter approach to predicting unknown data in incomplete fuzzy soft sets. Appl. Math. Model.; 2013; 37, pp. 4139-4146. [DOI: https://dx.doi.org/10.1016/j.apm.2012.09.010]
34. Wang, J.; Hu, Y.; Xiao, F.; Deng, X.; Deng, Y. A novel method to use fuzzy soft sets in decision making based on ambiguity measure and Dempster–Shafer theory of evidence: An application in medical diagnosis. Artif. Intell. Med.; 2016; 69, pp. 1-11. [DOI: https://dx.doi.org/10.1016/j.artmed.2016.04.004] [PubMed: https://www.ncbi.nlm.nih.gov/pubmed/27235800]
35. Biswas, B.; Ghosh, S.K.; Bhattacharyya, S.; Platos, J.; Snasel, V.; Chakrabarti, A. Chest X-ray enhancement to interpret pneumonia malformation based on fuzzy soft set and Dempster–Shafer theory of evidence. Appl. Soft Comput.; 2020; 86, 105889.
36. Chen, W.; Zou, Y. Group decision making under generalized fuzzy soft sets and limited cognition of decision makers. Eng. Appl. Artif. Intell.; 2020; 87, 103344. [DOI: https://dx.doi.org/10.1016/j.engappai.2019.103344]
37. Akram, M.; Ali, G.; Alcantud, J.C.R. Attributes reduction algorithms for m-polar fuzzy relation decision systems. Int. J. Approx. Reason.; 2022; 140, pp. 232-254. [DOI: https://dx.doi.org/10.1016/j.ijar.2021.10.005]
38. Akram, M.; Ali, G.; Alcantud, J.C.R.; Fatimah, F. Parameter reductions in N-soft sets and their applications in decision-making. Expert Syst.; 2021; 38, e12601. [DOI: https://dx.doi.org/10.1111/exsy.12601]
39. Kong, Z.; Gao, L.; Wang, L.; Li, S. The normal parameter reduction of soft sets and its algorithm. Comput. Math. Appl.; 2008; 56, pp. 3029-3037. [DOI: https://dx.doi.org/10.1016/j.camwa.2008.07.013]
40. Ma, X.; Qin, H. A distance-based parameter reduction algorithm of fuzzy soft sets. IEEE Access; 2018; 6, pp. 10530-10539. [DOI: https://dx.doi.org/10.1109/ACCESS.2018.2800017]
41. Kong, Z.; Ai, J.; Wang, L.; Li, P.; Ma, L.; Lu, F. New normal parameter reduction method in fuzzy soft set theory. IEEE Access; 2018; 7, pp. 2986-2998. [DOI: https://dx.doi.org/10.1109/ACCESS.2018.2888878]
42. Ma, X.; Fei, Q.; Qin, H.; Zhou, X.; Li, H. New Improved Normal Parameter Reduction Method for Fuzzy Soft Set. IEEE Access; 2019; 7, pp. 154912-154921. [DOI: https://dx.doi.org/10.1109/ACCESS.2019.2949142]
43. Zadeh, L.A. Fuzzy sets. Inf. Control.; 1965; 8, pp. 338-353. [DOI: https://dx.doi.org/10.1016/S0019-9958(65)90241-X]
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Abstract
A fuzzy soft set is a mathematical tool for dealing with vagueness and uncertainty. Parameter reduction is an important issue when applying fuzzy soft sets to decision making; however, existing methods neglect newly added parameters and incur high computational complexity. In this paper, we propose a new S-Score table-based parameter-reduction approach for fuzzy soft sets. Compared with two existing parameter-reduction methods for fuzzy soft sets, our method takes newly added parameters into account, which provides greater flexibility and benefits both the extension of a fuzzy soft set and the combination of multiple fuzzy soft sets. Additionally, our method accesses fewer elements of the dataset, resulting in lower computational cost than the two existing approaches. Experimental results from two applications demonstrate the feasibility and effectiveness of our approach.
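To make the notion of parameter reduction concrete, the sketch below models a fuzzy soft set as a membership matrix (objects × parameters) and naively checks whether dropping a parameter subset leaves the decision ranking of objects unchanged; this is the standard invariance criterion behind soft set parameter reduction, not the paper's S-Score algorithm, and all names (`F`, `is_reducible`, etc.) are illustrative assumptions.

```python
# Illustrative sketch (not the paper's S-Score method): a fuzzy soft set
# as a membership matrix, with a naive ranking-preservation test for
# removing a subset of parameter columns.

def choice_values(matrix, keep):
    """Sum of membership degrees over the kept parameter columns."""
    return [sum(row[j] for j in keep) for row in matrix]

def ranking(scores):
    """Object indices ordered by descending choice value."""
    return sorted(range(len(scores)), key=lambda i: -scores[i])

def is_reducible(matrix, drop):
    """True if removing the parameter columns in `drop` leaves the
    decision ranking of the objects unchanged."""
    all_params = list(range(len(matrix[0])))
    keep = [j for j in all_params if j not in drop]
    return ranking(choice_values(matrix, keep)) == \
           ranking(choice_values(matrix, all_params))

# Three objects evaluated against four parameters (degrees in [0, 1]).
F = [
    [0.9, 0.2, 0.5, 0.5],
    [0.6, 0.2, 0.8, 0.1],
    [0.3, 0.2, 0.4, 0.9],
]

print(is_reducible(F, drop={1}))  # → True: column 1 is constant, so the ranking is preserved
```

A brute-force reduction would test every candidate subset this way, which is exactly the exponential cost that table-based methods such as the one proposed here aim to avoid.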