
Appendix

Appendix 1

An Improved Rough Set Approach to Design of Gating Scheme for Injection Moulding

F. Shi,1 Z. L. Lou,1 J. G. Lu2 and Y. Q. Zhang1
1Department of Plasticity Engineering, Shanghai Jiaotong University, P. R. China; and 2Center of CAD, Nanjing University of Chemical Technology, P. R. China

The gate is one of the most important functional structures in an injection mould, as it has a direct influence on the quality of the injection products. The design of a gating scheme includes the selection of the types of gate, calculation of the sizes and determination of the location, which depends heavily on prior experience and knowledge and involves a trial-and-error process. Due to the vagueness and uncertainty in the design of a gating scheme, classical rough set theory is not effective …

Keywords: Fuzzy rough set; Gating scheme; Injection mold; Intelligent design; Knowledge acquisition

1. Introduction

The manufacturing industry for plastic products has been growing rapidly in recent years, and plastics are used widely to substitute for metals. The injection moulding process is the
most popular moulding process for making thermoplastic parts. The feeding system, which is one of the important functional structures, comprises a sprue, a primary runner, a secondary runner and a gate. The molten plastic flows from the machine nozzle through the sprue and runner system and into the cavities through the gate.

The design of a gate includes the selection of the type of gate, calculation of the size and determination of the location, and it is based on the experience and knowledge of the designers. The determinations of the location and sizes are made by a trial-and-error process. In recent years, a feature-modelling environment and intelligent technology have been introduced for gate design. Lee and Kim investigated gate locations using the evaluation criteria of warpage, weld line …

Deng used ID3 and
its modified algorithms to generate the rule set for the selection of the gate types [7]. However, there are many fuzzy or vague attributes in the selection of the types, such as the attribute loss of pressure, which has two fuzzy linguistic values, i.e. "can be high" and "must be low". The ID3 algorithms cannot deal with fuzzy or "noisy" information efficiently. It is also difficult to control the size of the decision tree extracted by the algorithms, and sometimes very large trees are generated.

Rough set theory provides a new mathematical approach to vague and uncertain data analysis [9,10]. This paper introduces the theory of rough sets for the
design of a gating scheme. The selection of the type of gate is based on the theory of rough sets. Considering the limitations of rough sets, this paper proposes an improved approach based on rough set theory for the design of the gating scheme. The improved rough set approach to the scheme design will be given first, and a fuzzy rough-set-based inductive learning algorithm will then be presented.

Table 1. Classification criteria.

2. A Rough Set Approach to Gating Scheme Design

2.1 Design of the Gating Scheme

The model of the gating scheme design can be described as follows. A decision table with 4-tuples can be represented as T = (U, C, D, T), where U is the universe; C = {C1, C2, …, Ck} is the set of condition attributes, each of which measures some important feature of an object in the universe U; and T(Ck) = {Tk1, Tk2, …, TkSk} is
the set of discrete linguistic terms. In other words, T(Ck) is the value set of the condition attributes. D = {D1, D2, …, Dl} is the set of decision attributes, that is, …

Generally, the condition attributes can be classified into five sets: style of plastic parts, number of cavities, loss of pressure, condition of separating gate from parts, and machine performance. The details of the five condition attributes and the corresponding fuzzy linguistic variables are shown in Table 1.

From the table, it can be seen that most of the attributes are vague, since they represent a human perception and desire.
For instance, shell, tube and ring are selected for the classification of plastic parts, and their fuzzy linguistic values are "deep", "middle" and "shallow", respectively. For the attribute loss of pressure, "can be high" and "must be low" are selected to approximate the fuzzy attribute.

A fuzzy rule for gating scheme design can be written in the following form:

IF (C1 is T1i1) AND … AND (Ck is Tkik) THEN (D is Dj)   (1)

where Tkik is the linguistic term of condition attribute Ck, and Dj is a class term of the decision attribute D.

Fuzzy rules with the form of Eq. (1) are used to perform min–max fuzzy inference. Let ck be the membership value of an object in Tkik and d be the forecast value of Dj, where d = min(c1, c2, …, ck) and min is the minimum operator. If two or more rules have the same conclusion, the conclusion with the largest value of d, which is also named the certainty factor, is chosen.

For the problem of the gating scheme design, a fuzzy design rule can be described as follows.

IF (Type of plastic part = middle shell)
AND (Number of cavities = single)
AND (Condition of separating gate from part = not requested especially)
THEN (Gating scheme = straight gate)   (2)
CF = 0.825

From the above rule, the gating scheme of the straight gate will be selected with a certainty factor of 0.825 if the type of part is middle shell, the number of cavities is single, and separating the gate from the part is not especially requested. The above rule reads like human language and is easy to understand.
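The min–max inference described above can be sketched in a few lines of Python. This is a minimal sketch, not the paper's implementation: the triangular membership parameters, attribute names and rule base below are invented stand-ins for Table 1 (not reproduced here); only the inference pattern (AND via the min operator, then choosing the conclusion with the largest certainty factor d) follows the text.

```python
# Minimal sketch of the min-max fuzzy inference described above.
# The membership parameters, attribute names and rules are illustrative
# stand-ins for the paper's Table 1; they are not taken from the paper.

def tri(x, a, b, c):
    """Triangular membership function rising on [a, b] and falling on [b, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

# Hypothetical linguistic terms (a, b, c) for two condition attributes.
TERMS = {
    "depth": {"shallow": (0.0, 0.0, 20.0),
              "middle": (10.0, 30.0, 50.0),
              "deep": (40.0, 60.0, 60.0)},
    "pressure_loss": {"must_be_low": (0.0, 0.0, 50.0),
                      "can_be_high": (30.0, 100.0, 100.0)},
}

# Rules in the form of Eq. (1): a conjunction of terms implies a scheme.
RULES = [
    ({"depth": "middle", "pressure_loss": "can_be_high"}, "straight gate"),
    ({"depth": "shallow", "pressure_loss": "must_be_low"}, "pin-point gate"),
]

def infer(x):
    """Fire each rule with min (AND); keep the largest d per conclusion."""
    best = {}
    for conditions, scheme in RULES:
        d = min(tri(x[attr], *TERMS[attr][term])
                for attr, term in conditions.items())   # d = min_k(ck)
        best[scheme] = max(best.get(scheme, 0.0), d)
    return max(best.items(), key=lambda kv: kv[1])      # largest certainty factor

scheme, cf = infer({"depth": 28.0, "pressure_loss": 75.0})
# scheme == "straight gate", cf == min(0.9, 45/70)
```

The certainty factor plays the same role as the CF = 0.825 in rule (2): when several rules point at the same scheme, only the strongest firing is kept.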

2.2 Basic Concepts of Rough Sets

In recent years, rough set (RS) theory, proposed by Pawlak, has been attracting the attention of researchers. The basic idea of RS is to classify the objects of interest into similarity classes (equivalence classes) containing indiscernible objects, via the analysis of attribute dependency and attribute reduction. The rule induction from the original data model is data-driven, without any additional assumptions. Rough sets have been applied in medical diagnosis, pattern recognition, machine …

A decision table with a 4-tuple can be represented as T = <U, A, V, f>,
where U is the universe; A = C ∪ D, where C and D are the sets of condition and decision attributes, respectively; V is the value set of the attribute a in A; and f is an information function.

Assuming P is a subset of the set of attributes A, two objects x and y in U are indiscernible with respect to P if and only if f(x, a) = f(y, a) for every a ∈ P. The indiscernibility relation is written as IND(P), and U/IND(P) is used to denote the partition of U given the indiscernibility relation IND(P).

A rough set approximates a traditional set by a pair of sets, which are the lower and the upper approximations of the set. The lower and upper approximations of a set Y ⊆ U given an equivalence relation IND(P) are defined as follows:

P(Y) = ∪{X ∈ U/IND(P) : X ⊆ Y}
P̄(Y) = ∪{X ∈ U/IND(P) : X ∩ Y ≠ ∅}

The definition of the lower approximation of a set involves an inclusion relation, whereby the objects in an equivalence class of the attributes are entirely contained in the equivalence class for the decision category. This is the case of a perfect or unambiguous classification. For the upper approximation, the objects are possibly classified using the information in attribute set P.

Attribute reduction is important in rough set theory. Based on the above definitions, the concept of reduction, denoted by RED(P), is defined as follows: Q ⊆ P is
a reduction of P if and only if IND(P) = IND(Q).

2.3 An Improved Rough Set Approach

In the design of the gating scheme, it is crucial to acquire the fuzzy rules efficiently; knowledge acquisition is the bottleneck. A rough set is applied to solve this problem for the design of the gating scheme. The block diagram of gating scheme design with the rough set is shown in Fig. 1. The case library is obtained from the experience and knowledge of experts and some reference books. A rough-set-based inductive learning algorithm is adopted to identify the hidden patterns and r…

The knowledge is represented as a set of fuzzy "if–then" rules. During the design stage, the system employs the fuzzy rules to perform fuzzy inference according to the design requirements. Then the appropriate gating scheme can be obtained.

Although the rough set is efficient for knowledge acquisition, there are some limitations to the application of the original rough set in the selection of the gating scheme.

1. The original rough set is efficient for problems with discrete attributes, but it cannot deal with fuzzy attributes efficiently. For fuzzy attributes, the traditional decision table is normally transformed into a binary table by taking the α-cut set of the fuzzy set. Obviously, there is no crisp boundary between the fuzzy attributes.

2. The original rough set is based on the indiscernibility relation. The universe is classified into a set of equivalence classes with the indiscernibility relation, and the lower and upper approximations are generated in terms of the equivalence classes. In practice, the original rough set classifies the knowledge too finely, which leads to the complexity of the problem.

The fuzzy set and rough set theories are generalisations of classical set theory for modelling vagueness and uncertainty. Pawlak and Dubois proposed that the two theories were not competitive but complementary [11,16]. The two theories are usually applied to model different types of uncertainty: rough set theory takes into consideration the indiscernibility between objects, whereas fuzzy set theory deals with the ill-definition of the boundary of a class through the membership function.

A fuzzy rough set model is presented based on an extension of the classical rough set theory. The continuous attributes are fuzzified with proper fuzzy membership functions, and the indiscernibility relation is generalised to the fuzzy similarity relation. An inductive learning algorithm based on the fuzzy rough set model (FRILA) is then proposed. The fuzzy design rules are extracted by the proposed FRILA, and the gate design scheme is then obtained after fuzzy inference. The detailed implementation will be given in Section 3.

Fig. 1. Block diagram of the gating scheme design with RS.

3. Implementation of FRILA

A fuzzy rough-set-based inductive learning algorithm consists of three steps: fuzzification of the attributes, attribute reduction based on the fuzzy similarity relation,
and fuzzy rule induction.

3.1 Fuzzifying the Attributes

Generally, there are some fuzzy attributes in the decision table, such as loss of pressure. These attributes should be fuzzified into linguistic terms, such as high, average and low. In other words, each attribute a is fuzzified into k linguistic values Ti, i = 1, …, k. The membership function of Ti can be subjectively assigned or transferred from numerical values by a membership function. A triangular membership function is shown in Fig. 2, where μ(x) is the membership value and x is the attribute value.

3.2 Attribute Reduction Based on the Fuzzy Similarity Relation

In order to construct the fuzzy similarity relation, the measurement of the fuzzy similarity relation should be
introduced first.

Generally, the max–min method, the relational factor method and the Minkowski distance-based closeness degree method are used to calculate the similarity factor rij. Considering that R is a fuzzy similarity matrix and λ is the level value, the matrix Rλ is called the normal similarity relation matrix after the following operation: the entry rij(λ) is set to 1 if rij ≥ λ, and to 0 otherwise.

The matrix Rλ has the properties of reflexivity and symmetry. In order to obtain the partition of U given the fuzzy similarity relation Rλ, an algorithm is given as follows.

Input: fuzzy similarity matrix R and level value λ.
Output: U/IND(Rλ), which is a partition of U given the fuzzy similarity relation R and the level value λ.

Step 1. Calculate the normal similarity relation matrix Rλ in terms of Definition 3.
Step 2. Select xi ∈ U, and let X ← {xi} and Y ← {xi}.
Step 3. j ← 0.
Step 4. If rij(λ) = 1 and xj ∉ X, then X ← X ∪ {xj} and Y ← Y ∪ {xj}.
Step 5. j ← j + 1.
Step 6. If j < n, then GOTO Step 4; otherwise, GOTO the next step.
Step 7. If card(Y) ≥ 1, then select xi ∈ Y, let Y ← Y − {xi}, and GOTO Step 3; otherwise, GOTO the next step.
Step 8. Output the set X and let U ← U − X.
Step 9. If U = ∅, then end; otherwise, GOTO Step 2.

In Step 7, card(Y) denotes the cardinality of set Y.

According to the algorithm, the partition U/IND(R{ai}) given a single attribute ai ∈ A with the level value λi is calculated. The partition of U given the attribute set A with the level value set Λ can then be defined as follows:

U/IND(RA) = ⊗{U/IND(R{ai}) : ai ∈ A, λi ∈ Λ}   (3)

where A and Λ are the attribute set and the level value set, respectively, and the operator ⊗ is defined as follows:

P ⊗ Q = {Xi ∩ Yj : Xi ∈ P, Yj ∈ Q, Xi ∩ Yj ≠ ∅}   (4)

Considering a subset X ⊆ U and a fuzzy similarity relation RA defined on U, the lower approximation of X, denoted by RA(X), and the upper approximation of X, denoted by R̄A(X), are respectively defined as follows:

RA(X) = ∪{Y ∈ U/IND(RA) : Y ⊆ X}   (5)
R̄A(X) = ∪{Y ∈ U/IND(RA) : Y ∩ X ≠ ∅}   (6)
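As a concrete reading of the partition algorithm and of the lower and upper approximations above, the sketch below λ-cuts a fuzzy similarity matrix, chains objects together breadth-first to form U/IND(Rλ), and then takes the unions of the classes contained in, or overlapping, a set X. The 5×5 matrix is invented for illustration; in the paper the factors rij would come from the max–min or closeness-degree measures.

```python
# Sketch of the partition algorithm above (Steps 1-9) and of the lower and
# upper approximations, Eqs. (5) and (6). The 5x5 similarity matrix is
# invented for illustration only.

def partition(R, lam):
    """U/IND(R_lambda): chain together objects whose similarity >= lam."""
    n = len(R)
    remaining = set(range(n))                # U
    classes = []
    while remaining:                         # Step 2 / Step 9
        frontier = [min(remaining)]
        block = set()                        # X
        while frontier:                      # Steps 3-7: breadth-first closure
            i = frontier.pop()
            block.add(i)
            for j in range(n):
                if R[i][j] >= lam and j not in block:   # normal-matrix entry = 1
                    frontier.append(j)
        classes.append(block)                # Step 8: output X,
        remaining -= block                   # U <- U - X
    return classes

def lower_upper(classes, X):
    """Eq. (5): classes contained in X; Eq. (6): classes meeting X."""
    lower = set().union(*([c for c in classes if c <= X] or [set()]))
    upper = set().union(*([c for c in classes if c & X] or [set()]))
    return lower, upper

R = [[1.0, 0.9, 0.2, 0.1, 0.1],
     [0.9, 1.0, 0.3, 0.2, 0.1],
     [0.2, 0.3, 1.0, 0.8, 0.2],
     [0.1, 0.2, 0.8, 1.0, 0.2],
     [0.1, 0.1, 0.2, 0.2, 1.0]]

classes = partition(R, lam=0.8)      # {0, 1}, {2, 3}, {4}
low, up = lower_upper(classes, X={0, 1, 2})
# low == {0, 1}: only the class {0, 1} fits entirely inside X
# up == {0, 1, 2, 3}: {2, 3} overlaps X through object 2
```

Note that the breadth-first closure is what makes the λ-cut relation behave like an equivalence relation: fuzzy similarity is reflexive and symmetric but not transitive, so chained objects are merged into one class.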

Assuming U/IND(RC) and Y are two partitions on U, where U/IND(RC) = {X1, X2, …, Xk} and Y = {Y1, Y2, …, Yr}, the positive region POSC(Y) is defined as follows:

POSC(Y) = ∪{Xi ∈ U/IND(RC) : Xi ⊆ Yj for some Yj ∈ Y}

The amount of data is normally very large and there is a lot of redundant information. Attribute reduction can remove the redundant or noisy information successfully. The attribute reduction set is not unique, and the cardinality of the reduction set determines the dimensionality of the problem, so it is important to select a minimal reduction. The minimal reduction can be defined as follows.

Assuming a subset C′ ⊆ C, where C is the attribute set, C′ is the minimal reduction if and only if C′ is characterised by the following two properties: (1) POSC′(D) = POSC(D); (2) no subset C″ ⊆ C with card(C″) < card(C′) satisfies property (1).

Assuming a condition attribute set C and a decision attribute set D, the degree of dependency of C on D, denoted by γ(C, D), is defined as

γ(C, D) = card(POSC(D)) / card(U)

where card(X) denotes the cardinality of set X and 0 ≤ γ(C, D) ≤ 1. According to the definition of the degree of dependency, the attribute significance for every attribute a ∈ C − R can be defined as follows:

SIG(a, R, D) = γ(R ∪ {a}, D) − γ(R, D)
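Under the standard rough-set forms of these two quantities, γ(C, D) = card(POSC(D)) / card(U) and SIG(a, R, D) = γ(R ∪ {a}, D) − γ(R, D), the significance-driven selection can be sketched as below. The crisp toy decision table is invented; in FRILA the equivalence classes would come from the fuzzy similarity relation rather than from exact value matches.

```python
# Sketch of the degree of dependency and attribute significance, using the
# standard crisp rough-set forms gamma(C, D) = card(POS_C(D)) / card(U) and
# SIG(a, R, D) = gamma(R + {a}, D) - gamma(R, D). The toy decision table is
# invented for illustration.

from itertools import groupby

def partition_by(table, attrs):
    """Equivalence classes of row indices agreeing on all attrs."""
    key = lambda i: tuple(table[i][a] for a in attrs)
    idx = sorted(range(len(table)), key=key)
    return [set(g) for _, g in groupby(idx, key=key)]

def gamma(table, cond, dec):
    """Fraction of rows whose cond-class lies inside one dec-class (POS_C(D))."""
    if not cond:
        return 0.0                    # convention: empty set classifies nothing
    dec_classes = partition_by(table, [dec])
    pos = sum(len(c) for c in partition_by(table, cond)
              if any(c <= d for d in dec_classes))
    return pos / len(table)

def greedy_reduct(table, conds, dec):
    """Repeatedly add the attribute with the highest SIG, as in Steps 2-3."""
    red = []
    while True:
        base = gamma(table, red, dec)
        sig, best = max(((gamma(table, red + [a], dec) - base, a)
                         for a in conds if a not in red),
                        default=(0.0, None))
        if best is None or sig <= 0:
            return red
        red.append(best)

table = [{"cavities": "single", "depth": "middle", "scheme": "straight"},
         {"cavities": "multi",  "depth": "middle", "scheme": "side"},
         {"cavities": "single", "depth": "deep",   "scheme": "straight"},
         {"cavities": "multi",  "depth": "deep",   "scheme": "side"}]

reduct = greedy_reduct(table, ["cavities", "depth"], "scheme")
# reduct == ["cavities"]: cavities alone already determines the scheme
```

The greedy loop stops as soon as adding an attribute no longer raises the dependency degree, which is the same stopping idea as the hierarchy reduction algorithm described next.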

In order to obtain the minimal reduction, a hierarchy attribute reduction algorithm is proposed as follows.

Step 1. …
Step 2. Compute the attribute significance SIG(x, R, D) for every attribute x ∈ C − R.
Step 3. Select the attribute x with the highest value of SIG.

The computational complexity of the algorithm is O(m²), where m is the number of attributes. In the algorithm, attribute reduction can be treated as a tree traversal: each node of the tree represents a condition attribute, and calculating the minimal reduction is transformed into picking the best node based on some heuristic information. The operator can reduce the computation by using …

3.3 Fuzzy Rules Induction

Based on the above fuzzy rough set model, the rule inductive learning algorithm is proposed, and is described as follows.

1. Fuzzify the attributes and
compute the fuzzy distribution of them.

2. Calculate the fuzzy similarity matrix for every attribute.

3. Calculate the fuzzy partition U/IND(RA) given the fuzzy similarity relation RA with the level value set Λ, based on Algorithm 1.

4. Calculate the minimal attribute reduction based on Algorithm 2.

5. Calculate the attribute core of the condition attributes with respect to the decision attribute, obtain the minimal reduction of the condition attributes, and then delete the redundant objects.

6. For every object, calculate the value core of the condition attributes, and then delete the redundant attribute values and objects.

7. Delete the identical objects in the decision table and translate the decision rules.

4. A Case Study

In order to evaluate the effectiveness of the proposed method, an example shown in Fig. 3 is chosen in this section. The design requirements are given as follows:

Part style: middle shell
Number of cavities: single
Loss of pressure: may be high
Condition of separating gate from parts: must be easy
Machining performance: must be easy
Part material: ABS

4.1 Fuzzy Knowledge Acquisition

Eliciting knowledge from any source of data is notoriously difficult, so knowledge acquisition is a bottleneck. There are five condition attributes for the gating scheme design, as shown in Table 1. The attribute number of cavities has no fuzziness, for its value is either "single" or "multiple", so it is represented as {0, 1}; the other four attributes are fuzzy ones and are represented by membership functions. The decision attribute of the gating scheme has ni…

Fig. 3. An example part.

… by d, is 0.8. Fourth,
calculate the attribute reduction so that there is no redundant attribute. Finally, 22 fuzzy rules with the form of Eq. (2) are obtained.

According to the different level values, different numbers of fuzzy rules can be obtained. In practice, it is shown that the value of d has the largest effect on the number of rules.

If the level values of the condition attributes are given as follows: …, the value of d and the number of rules (num) obtained are shown in Fig. 4.

4.2 Discussion

As stated previously, different numbers of fuzzy rules can be obtained in terms of the level values. In reference [7], the ID3 algorithm and ID3-like algorithms are used to extract rules. However, those algorithms tend to involve more attributes than FRILA, because of the hierarchical structure of their output decision rules. In other words, the rules induced by the ID3-like algorithms have redundant attributes and are not more concise than the rules induced by FRILA. In the gating scheme design, on one hand …

It is seen from Table 2 that a higher λ may lead to a bigger rule set with a higher accuracy rate; moreover, when the accuracy rate is 100%, the number of rules induced by FRILA is smaller than that induced by the ID3-like algorithms. Theref…
