Discovering Representation Space Transformations for Learning Concept Descriptions Combining DNF and M-of-N Rules

Janusz Wnek and Ryszard S. Michalski

George Mason University, 4400 University Dr., Fairfax, VA 22030

{wnek, michalski}@aic.gmu.edu

Abstract

This paper addresses a class of learning problems that require a construction of descriptions that combine both M-of-N rules and traditional Disjunctive Normal Form (DNF) rules. The presented method learns such descriptions, which we call conditional M-of-N rules, using the hypothesis-driven constructive induction approach. In this approach, the representation space is modified according to patterns discovered in the iteratively generated hypotheses. The need for M-of-N rules is detected by observing "exclusive-or" or "equivalence" patterns in the hypotheses. These patterns indicate symmetry relations among pairs of attributes. Symmetrical attributes are combined into maximal symmetry classes. For each symmetry class, the method constructs a "counting attribute" that adds a new dimension to the representation space. The search for a hypothesis in the iteratively modified representation spaces is done by the standard AQ inductive rule learning algorithm. It is shown that the proposed method is capable of solving problems that would be very difficult to tackle by any of the traditional symbolic learning methods.

1 INTRODUCTION

Constructive induction (CI) can be viewed as a process of conducting two intertwined searches: one for the most appropriate representation space, and the other for the "best" inductive hypothesis in that space. The search for the representation space involves applying constructive induction operators ("CI operators") that modify the representation space by generating new dimensions, removing dimensions, or changing their quantization level. Since there is no limit on what operators can be applied, the search space is very large. Therefore, a central research issue in constructive induction is the development of rules and heuristics for applying CI operators.

This paper concerns a class of learning problems that cannot be satisfactorily solved by symbolic methods that construct DNF-type descriptions using original attributes, because such descriptions would be prohibitively long. Specifically, such problems require the construction of descriptions that involve "counting properties" (e.g., that M properties out of N possible properties are present in an object), which may additionally be combined with logical conditions (e.g., in DNF form). Problems of this type occur in many real-world applications (e.g., Spackman, 1988; Towell & Shavlik, 1994).

The proposed solution is based on the application of a new type of constructive induction rule, the "counting attribute generation rule," which exploits attribute symmetry in generated hypotheses. Such a symmetry is indicated by the presence of "exclusive-or" or "equivalence" patterns in the hypothesis.

It is shown that M-of-N concepts are easy to detect in descriptions generated by the AQ learning system, because they form exclusive-or patterns (XOR-patterns). An XOR-pattern characterizes a relationship between two binary attributes in which exactly one of the two can be present. This translates to exactly 1-of-2, which is a specific form of M-of-N concept. Therefore, the presence of an XOR (or non-equivalence) relation among two or more binary attributes may indicate that counting them (i.e., how many of them are true) is likely to produce a new relevant descriptor. On the other hand, the presence of the equivalence symmetry (EQ) among two or more attributes may indicate that they can be replaced by only one attribute.

By observing the relationships among attributes, the proposed method combines ideas of logic and arithmetic. The introduced "counting attribute" captures a conceptual transition from logic to arithmetic.

This work was inspired by the failure of well-known symbolic learning systems in solving the MONK2 problem used in an international competition of learning programs (Thrun et al., 1991). The MONK2 problem was to learn the concept "exactly two of the six attributes have their first value," which is a special case of the M-of-N concept.

There have been several efforts concerned with learning M-of-N concepts. For example, the system CRLS learns M-of-N rules by employing a non-equivalence symmetry bias and criteria tables (Spackman, 1988), ID2-of-3 incorporates M-of-N tests in decision trees (Murphy & Pazzani, 1991), AQ17-DCI (Bloedorn & Michalski, 1991) employs a variety of operators to construct new attributes, and NEITHER-MofN (Baffes & Mooney, 1993) is able to refine M-of-N rules by increasing or decreasing either M or N. The idea of "counting attributes" and the "counting arguments rule" is related to research on detecting symmetry in Boolean functions of many variables (Michalski, 1969; 1983), and to the implementation of the SYM programs (Jensen, 1975). More recently, Fawcett and Utgoff (1991) used counting attributes to expand the representation space by summing up the number of distinct values of a set of variables that can satisfy a set of conditions. The modified representation allows expressing a "number of ways" in which conditions can be satisfied.

Callan and Utgoff (1991) used the counting arguments rule to create a numeric function from a Boolean expression that begins with a universal quantifier. The function calculates the percentage of permutations of variable bindings satisfying the Boolean expression. Such a function is useful because it indicates the degree to which a subgoal is satisfied in a given state.

2 CLASS-PATTERNS

The hypothesis-driven constructive induction (AQ-HCI) method combines an inductive rule learning algorithm (AQ) with a procedure for iteratively transforming the representation space. The method is based on detecting patterns in the hypotheses generated by the learning system in one iteration, and then using them for transforming the representation space for the next iteration. A pattern is a component of a description in a given knowledge representation language that accounts for a significant number of training examples. The significance is defined by a user-specified threshold.

Our earlier work on the AQ-HCI method involved a search for three types of patterns: value-patterns, condition-patterns, and rule-patterns. Value-patterns aggregate subsets of attribute values that often co-occur in a given description. Condition-patterns represent a conjunction of two or more elementary conditions that frequently occur in a ruleset of a given concept. A rule-pattern is a rule or a subset of rules. Detecting such patterns and using them for expanding the representation space has been shown to be effective in improving performance accuracy in DNF-type problems (Wnek & Michalski, 1994b).

Following the scheme of different types of patterns, it was conjectured that there may exist class-patterns. Such patterns would represent relations that are common to subsets of learned classes (concepts). The XOR-patterns satisfy this definition: they apply both to examples of an M-of-N concept and to its negation. This implies that a new descriptor is needed to discriminate between classes that have such a property.

Following the notation introduced by Michalski (1983), we will formulate a rule for generating a "counting attribute." To this end, we need to introduce some notation. We use the general term "symmetry" for two relations: "exclusive-or" and "equivalence." Since one is the negation of the other, from now on we will discuss primarily the "exclusive-or" relation.

2.1 RELATIONAL CONDITIONS

Let Ci be relational conditions on values of single attributes (selectors), such as xi = 5, xi > 3, or xi = 2..5. Relational conditions evaluate to true (1) or false (0). Such conditions are building blocks in the rules generated by the rule learning system AQ15 employed in the proposed method (Michalski et al., 1986).

2.2 BINARY SYMMETRY CLASS (BSC)

Let R1 and R2 be two conjunctive rules in a ruleset representing a hypothesis. Suppose further that R1 can be represented as Ci & ~Cj & CTX1 and R2 as ~Ci & Cj & CTX2, where CTX1 and CTX2 are "context" conditions, expressed in the form of a conjunction of zero or more relational conditions. It is said that Ci and Cj represent a "binary symmetry class" if CTX1 and CTX2 are in a subsumption relation, that is, CTX1 = CTX2 & CTX3 or CTX2 = CTX1 & CTX3, where CTX3 is a context condition. In other words, two attributes or relational conditions constitute a symmetry class if an XOR relationship binding them constitutes a pattern.

2.3 MAXIMUM SYMMETRY CLASS (MSC)

Given k binary symmetry classes, they can be combined into a k-ary symmetry class if they have non-empty intersections with each other. The resulting class consists of n relational conditions. For example, if BSC1 = {c1, c3}, BSC2 = {c1, c2}, and BSC3 = {c2, c3}, then they can be combined into MSC = {c1, c2, c3}.

The maximum symmetry class contains the maximum number of relational conditions that can be combined together.
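As an illustration of the combination step (a sketch, not code from the original system), the following merges overlapping binary symmetry classes into larger classes; it merges transitively, whereas a stricter implementation would also check that every pair of merged classes intersects, as the definition above requires.

```python
# Illustrative sketch: merge binary symmetry classes (BSCs) that share
# conditions into larger symmetry classes. Classes are merged transitively;
# a stricter implementation would also verify pairwise intersections.

def merge_symmetry_classes(bscs):
    """bscs: list of 2-element sets of condition names, e.g. [{'c1', 'c3'}, ...]."""
    classes = []
    for bsc in bscs:
        group = set(bsc)
        overlapping = [c for c in classes if c & group]   # classes sharing a condition
        for c in overlapping:
            group |= c
            classes.remove(c)
        classes.append(group)
    return classes

# The example from the text: BSC1 = {c1, c3}, BSC2 = {c1, c2}, BSC3 = {c2, c3}.
print(merge_symmetry_classes([{'c1', 'c3'}, {'c1', 'c2'}, {'c2', 'c3'}]))
# -> [{'c1', 'c2', 'c3'}]
```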

2.4 COUNTING ATTRIBUTE GENERATION RULE

Given a k-ary symmetry class, generate a counting attribute CA that stands for the expression [C1 + C2 + ... + Ck], which sums up the evidence given by the relational conditions. The domain of the attribute is an integer interval from 0 to k. Values of the counting attribute represent the number of relational conditions (attributes) that hold for the given concept example.

The counting attribute represents an arithmetic sum of two or more relational conditions. (Each relational condition evaluates to 1 or 0.) When the relational conditions Ci use binary attributes, the "counting attribute" agglomerates evidence described by those attributes. If the conditions use multivalued attributes, the evidence agglomeration is carried over whole relational conditions.
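The following sketch (illustrative only, not the AQ15 implementation) shows the counting attribute as a plain sum of relational conditions evaluated on one example; the conditions mirror the MONK2 attributes c1..c6 defined later in Figure 5, using that figure's value codes.

```python
# Illustrative sketch: a counting attribute CA = [C1 + C2 + ... + Ck], where
# each relational condition Ci evaluates to 1 (true) or 0 (false) on an example.

def counting_attribute(example, conditions):
    """example: dict of attribute values; conditions: predicates on the example."""
    return sum(1 if condition(example) else 0 for condition in conditions)

# Conditions mirroring the attributes c1..c6 of Figure 5 (value codes as used there).
conditions = [
    lambda e: e['HS'] == 'r',   # c1
    lambda e: e['BS'] == 'r',   # c2
    lambda e: e['SM'] == 'y',   # c3
    lambda e: e['HO'] == 's',   # c4
    lambda e: e['JC'] == 'r',   # c5
    lambda e: e['TI'] == 'y',   # c6
]

example = {'HS': 'r', 'BS': 's', 'SM': 'y', 'HO': 'f', 'JC': 'r', 'TI': 'n'}
print(counting_attribute(example, conditions))   # 3: three of the six conditions hold
```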

Note: Systems that use different representational formalisms for data and hypotheses may not be able to detect class-patterns in hypotheses. In such cases, data can be a source for finding these patterns. In the case of the AQ learning method that uses the VL1 representational formalism (a form of propositional calculus), class-patterns can be detected both in data and hypotheses. An initial examination of descriptions generated by FOIL (Quinlan, 1990) indicates that they also contain XOR-patterns. The class-patterns are different from the intra-construction and inter-construction operators used in Duce and CIGOL (Muggleton, 1987; Muggleton & Buntine, 1988).

2.5 LEARNING M-of-N CONCEPTS

M-of-N concepts are easy to detect in descriptions generated by the AQ learning system (or in data), because they form XOR-patterns. Figure 1 gives an outline of the algorithm for changing the representation space based on the detection of XOR-patterns. The DNF concept representation is determined using the AQ algorithm. By examining the learned concept descriptions, XOR-patterns are detected, and XOR-related attributes are grouped into maximum symmetry classes.

1. Determine a DNF concept representation.
   If the expression is sufficiently simple, STOP.
2. Detect XOR-patterns in the learned concept description.
3. If XOR-patterns do not exist, then STOP. Otherwise:
   Build maximum symmetry classes (MSC).
   For each MSC-class, generate a "counting attribute."
   Project data to the new representation space.
   Go to 1.

Figure 1: Algorithm for Changing the Representation Space Based on XOR-patterns

For example, if the following patterns were detected: x1 XOR x3, x1 XOR x5, x1 XOR x7, x3 XOR x5, x3 XOR x7, x5 XOR x7, then the MSC-class {x1, x3, x5, x7} is created. Next, for each MSC-class a "counting attribute" is created. The name of such an attribute reflects the names of the attributes from the XOR-class. Values of the counting attribute represent the number of properties that hold for the learned concept; they are established by directly counting the values of the attributes from the MSC-class. For the above example, the attribute CA <:: x1+x3+x5+x7 is created. Its domain is an integer interval from 0 to 4.

Counting attributes used within DNF expressions allow representing various forms of M-of-N concepts. Fig. 2 shows examples of such concepts. Note that if the target concept includes an M-of-N rule as a part of its description, the above method will generate a conditional M-of-N rule. Such a rule contains an M-of-N rule with additional conditions represented as a rule set (DNF expression). This is a consequence of using AQ.
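To make the link between counting attributes and M-of-N conditions concrete, the sketch below (an illustration, not taken from the paper) writes the kinds of concepts listed in Figure 2 as predicates over a counting attribute CA.

```python
# Illustrative sketch: M-of-N style concepts written as predicates over a
# counting attribute CA computed from binary attributes x1..xk.

def ca(bits):
    return sum(bits)                              # CA <:: x1 + x2 + ... + xk

even_parity     = lambda bits: ca(bits) in (0, 2, 4)   # [CA = 0, 2, 4]
at_least_3_of_6 = lambda bits: ca(bits) >= 3           # [CA >= 3]
exactly_3_of_6  = lambda bits: ca(bits) == 3           # [CA = 3]
monk2           = lambda bits: ca(bits) == 2           # [CA = 2]
xor_1_of_2      = lambda bits: ca(bits) == 1           # [CA = 1]

print(monk2([1, 0, 1, 0, 0, 0]))             # True: exactly two attributes hold
print(at_least_3_of_6([1, 1, 1, 0, 1, 0]))   # True: four attributes hold
```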

Parity 5 expressed using 5 binary attributes: x1-x5.

[CA = 0, 2, 4], where CA <:: x1+x2+x3+x4+x5

3-of-6

[CA >= 3], where CA <:: x1+x2+x3+x4+x5+x6

Exactly 3-of-6

[CA = 3], where CA <:: x1+x2+x3+x4+x5+x6

MONK2 problem \"exactly two of six attributes

have their first value\"

[CA = 2], where CA <:: c1+c2+c3+c4+c5+c6

XOR (exactly 1-of-2)

[CA = 1], where CA <:: x1+x2

Figure 2: Concept Representation Using Counting Attributes in DNF
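Before turning to the MONK2 illustration in the next section, the following toy run sketches the Figure 1 loop end to end. It is a reconstruction, not the authors' AQ-HCI code: in particular, the "learner" below simply keeps one maximally specific rule per positive example as a stand-in for AQ.

```python
from itertools import combinations, product

# Toy end-to-end run of the Figure 1 loop on the target "exactly two of six
# binary attributes are 1" (the MONK2 concept after booleanization).

ATTRS = ['c1', 'c2', 'c3', 'c4', 'c5', 'c6']

def learn_trivial_rules(positives):
    """One rule per positive example (a placeholder for the AQ step)."""
    return [dict(zip(ATTRS, example)) for example in positives]

def xor_pairs(rules):
    """Attribute pairs that form an XOR-pattern in some pair of rules."""
    pairs = set()
    for r1, r2 in combinations(rules, 2):
        diff = [a for a in ATTRS if r1[a] != r2[a]]
        if len(diff) == 2 and r1[diff[0]] != r1[diff[1]]:  # complementary within a rule
            pairs.add(frozenset(diff))
    return pairs

def merge_into_msc(pairs):
    """Union all XOR-related attributes into one maximal symmetry class."""
    msc = set()
    for pair in pairs:
        msc |= pair
    return msc

space = list(product([0, 1], repeat=6))              # 64 instances over c1..c6
positives = [ex for ex in space if sum(ex) == 2]     # the target concept

rules = learn_trivial_rules(positives)               # 15 rules in this space
msc = merge_into_msc(xor_pairs(rules))               # all six attributes
ca_values = {sum(ex[ATTRS.index(a)] for a in msc) for ex in positives}
print(len(rules), sorted(msc), ca_values)            # 15 ['c1', ..., 'c6'] {2}
# After projecting onto the counting attribute, the concept is just [CA = 2].
```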

3 ILLUSTRATIVE EXAMPLE: MONK2 PROBLEM

The concept to be learned is the MONK2 problem (Thrun et al., 1991; Wnek & Michalski, 1994a). Figure 3a shows a diagram visualizing the problem. The total number of possible instances in the representation space is 432. In the diagram, the target concept is represented by 142 instances (shaded area). The remaining 290 instances represent the negation of the concept. The training set is represented by positive (+) and 105 negative (-) examples. The data contains no noise.

3.1 LEARNING IN THE ORIGINAL REPRESENTATION SPACE

The MONK2 problem is hard for symbolic learning systems. In fact, none of the 18 symbolic learners taking part in the international competition learned the MONK2 concept (Thrun et al., 1991). This problem is also hard for the AQ15 program.

Figure 4: The MONK2 Concept Learned in the Original Representation Space (a ruleset of conjunctive rules over the attributes HS, BS, SM, HO, JC, and TI, each annotated with coverage weights t and u)

(c1=1) <:: [HS=r]      (c1=0) <:: [HS=s,o]
(c2=1) <:: [BS=r]      (c2=0) <:: [BS=s,o]
(c3=1) <:: [SM=y]      (c3=0) <:: [SM=n]
(c4=1) <:: [HO=s]      (c4=0) <:: [HO=f,b]
(c5=1) <:: [JC=r]      (c5=0) <:: [JC=y,g,b]
(c6=1) <:: [TI=y]      (c6=0) <:: [TI=n]

Figure 5: Attributes Constructed From Value-patterns

Figure 6: Concept Learned in the Representation Space Developed in Iteration #1 (a ruleset of 14 conjunctive rules over the binary attributes c1-c6, each annotated with coverage weights t and u)

Rule No:   3   6     3   5     2   7     6  10    10  14
c1:        1   0     1   1     1   0     0   0     0   1
c2:        0   1     0   0     1   1     1   0     0   0
c3:        1   1     1   0     0   0     1   1     1   0
c4:        0   0     0   0     0   1     0   0     0   0
c5:        0   0     0   0     0   0     0   1     1   1
c6:        0   0     0   1     0   0     0   0     0   0

Figure 7: An Example of XOR-patterns Leading to Creation of the Attribute CA <:: c1+c2+c3+c4+c5+c6

3.2 REPRESENTATION SPACE TRANSFORMATION—ITERATION #1

The number of instances representing the target concept is 15; therefore, in the worst case, the number of rules required to describe the concept is 15. This is a reduction in description complexity in comparison to the original representation space. Each instance in the new space represents from 1 to 24 instances that were mapped from the original space. The transformation does not cause ambiguity in the new representation space, i.e., each new instance represents instances of the same class, either positive or negative. For more details see (Wnek, 1993). In this representation space, all possible positive examples are present, and only 13 negative examples are missing (in the original space only a fraction of the 142 positive examples are present). It seems that learning should give better results. However, the AQ15 learning program still generates a long and inaccurate description of the concept (Fig. 6). Errors are caused by the overly general rule #1, which covers not only two positive examples but also two negative instances.
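As a quick check (not part of the paper), the sketch below re-derives several of the counts used above from the attribute domains visible in Figure 5: the new space has 64 instances, each covering between 1 and 24 original instances, and 15 of them represent the target concept.

```python
from itertools import product
from math import prod

# Per Figure 5, each binary attribute ci stands for one value of an original
# MONK2 attribute (ci = 1) versus all its other values (ci = 0).
# Original domain sizes for (HS, BS, SM, HO, JC, TI):
domain_sizes = [3, 3, 2, 3, 4, 2]          # 3 * 3 * 2 * 3 * 4 * 2 = 432 instances

new_space = list(product([0, 1], repeat=6))            # 64 instances over c1..c6
covered = [prod(1 if b else d - 1 for b, d in zip(inst, domain_sizes))
           for inst in new_space]

print(sum(covered))                        # 432: the whole original space is covered
print(min(covered), max(covered))          # 1 24: each new instance maps 1..24 originals
print(sum(1 for inst in new_space if sum(inst) == 2))   # 15 target-concept instances
```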

3.3 REPRESENTATION SPACE TRANSFORMATION—ITERATION #2

The description obtained after the first transformation of the representation space is more accurate but still very complex (Figs. 6, 3b). Therefore, the search for a better representation is continued, and XOR-patterns are found. Fig. 7 lists examples of pairs of rules exhibiting some of these XOR-patterns. All six attributes form an MSC class. From this class a new counting attribute is constructed, defined as CA <:: c1+c2+c3+c4+c5+c6. Its domain is an integer interval from 0 to 6. Summing up the values in the XOR-patterns always gives exactly 2. The final concept description is [CA = 2], i.e., exactly two of the six attributes are present. Fig. 3c visualizes the final representation space and the concept learned.
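The following sketch (an illustration based on the BSC definition of Section 2.2, not the authors' code) detects the attribute pair on which two rules form an XOR-pattern; the sample rules correspond to two of the rule pairs listed in Figure 7.

```python
# Illustration of the BSC definition from Section 2.2: rules R1 = Ci & ~Cj & CTX1
# and R2 = ~Ci & Cj & CTX2 form a binary symmetry class when one context
# subsumes the other. Rules are represented as dicts over c1..c6.

def xor_pattern(rule1, rule2):
    """Return the attribute pair on which the two rules form an XOR-pattern,
    or None if they do not."""
    common = set(rule1) & set(rule2)
    flipped = [a for a in common if rule1[a] != rule2[a]]
    if len(flipped) != 2:
        return None
    a, b = flipped
    if rule1[a] == rule1[b]:                  # same direction: an equivalence, not an XOR
        return None
    ctx1 = {k: v for k, v in rule1.items() if k not in flipped}
    ctx2 = {k: v for k, v in rule2.items() if k not in flipped}
    if not (ctx1.items() <= ctx2.items() or ctx2.items() <= ctx1.items()):
        return None                           # contexts not in a subsumption relation
    return tuple(sorted(flipped))

# Two of the rule pairs listed in Figure 7 (rules 3 and 6, and rules 2 and 7):
r3 = {'c1': 1, 'c2': 0, 'c3': 1, 'c4': 0, 'c5': 0, 'c6': 0}
r6 = {'c1': 0, 'c2': 1, 'c3': 1, 'c4': 0, 'c5': 0, 'c6': 0}
r2 = {'c1': 1, 'c2': 1, 'c3': 0, 'c4': 0, 'c5': 0, 'c6': 0}
r7 = {'c1': 0, 'c2': 1, 'c3': 0, 'c4': 1, 'c5': 0, 'c6': 0}

print(xor_pattern(r3, r6))   # ('c1', 'c2')
print(xor_pattern(r2, r7))   # ('c1', 'c4')
```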

4 RELATED WORK

Seshu (1989) studied the problem of learning concept descriptions that combine DNF and M-of-N rules in the context of decision tree learning. He calls such learning problems unsplittability or parity problems. Results presented in that paper suggest that solving the parity problem is important, because it "causes a significant amount of error." Moreover, "we cannot entirely eliminate the error due to the parity problem except by using an exponentially large number of description language attributes or by tolerating exponentially large decision trees."

Seshu's solution is based on generating new binary attributes from a randomly selected subset of original attributes and taking the XOR operation over all subsets of those features. With this method, the splittability of a decision tree is improved and the error associated with the parity problem is reduced to low levels. However, it is not clear what the meaning of the new attributes is, nor of the concept descriptions generated using such attributes. Therefore, even though the method is able to learn conditional M-of-N rules, it produces no evidence about the kind of underlying M-of-N conditions.

5 THE RELATIONSHIP BETWEEN LOGIC AND ARITHMETIC

For several decades, logic circuits have been used in digital computers. It is well known that such circuits facilitate not only logical operations but also arithmetic ones. In fact, all arithmetic operations, from addition through multiplication to the most complex functions, are ultimately realized by logic circuits. A link between logic and arithmetic can be expressed by two logical operators: XOR and AND. Fig. 8 shows the relationship between the "XOR" and "ADD" operators.

(a) Truth table for the addition of two binary digits:

 x  y | Carry/AND | Sum/XOR
 0  0 |     0     |    0
 0  1 |     0     |    1
 1  0 |     0     |    1
 1  1 |     1     |    0

Figure 8: (a) Truth Table for Addition of Two Binary Digits, (b) Visualization of the ADD Operator in a Binary Domain, (c) Visualization of an XOR-pattern

The carry can be represented as logical AND, and the sum can be represented as logical XOR. Given the "ADD" operator, we can implement multiplication and other arithmetic operators.
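As a brief illustration of this link (not from the paper), the sketch below builds multi-bit binary addition out of exactly these two operators: Sum is XOR and Carry is AND in a half adder, and chaining adders gives ripple-carry addition.

```python
# Illustrative sketch: a half adder computes Sum as XOR and Carry as AND, and
# chaining such adders (ripple carry) yields multi-bit binary addition.

def half_adder(x, y):
    return x ^ y, x & y          # (sum, carry)

def full_adder(x, y, carry_in):
    s1, c1 = half_adder(x, y)
    s2, c2 = half_adder(s1, carry_in)
    return s2, c1 | c2           # (sum, carry_out)

def ripple_add(a_bits, b_bits):
    """Add two little-endian bit lists of equal length."""
    result, carry = [], 0
    for a, b in zip(a_bits, b_bits):
        s, carry = full_adder(a, b, carry)
        result.append(s)
    return result + [carry]

# 6 + 3 = 9, with bits listed least-significant first.
print(ripple_add([0, 1, 1, 0], [1, 1, 0, 0]))   # [1, 0, 0, 1, 0]  ->  9
```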

In the context of concept learning, the presence of an XOR relationship among two or more binary attributes may indicate that counting them (i.e., how many of them are true) is likely to produce a relevant new attribute.

6 CONCLUSIONS

This paper demonstrates that M-of-N relations are easy to detect and learn from descriptions generated by the AQ inductive learning system, because they form XOR-patterns. The existence of such patterns in learned descriptions speeds their detection, but it is not a necessary condition for applying the method with other learning systems. If a learning system does not form XOR-patterns in descriptions, because of a different syntax in different representational formalisms, then the input data should be examined for XOR-patterns.

The counting attribute generation rule was introduced and applied within an algorithm for changing the representation space based on XOR-patterns. By observing the relationships among attributes, the proposed method combines ideas of logic and arithmetic: the introduced "counting attribute" captures a conceptual transition from logic to arithmetic. It is shown that the proposed method is capable of learning conditional M-of-N rules using the hypothesis-driven constructive induction approach. As an illustrative example, the MONK2 learning problem was solved with 100% accuracy and "minimal" complexity of the description. The method shows good promise for applicability to real-world problems carrying M-of-N relationships. A study confirming such applicability is in progress.

Acknowledgments

We thank Zenon Kulpa for his comments on this work. This research was conducted in the Center for Artificial Intelligence at George Mason University. The Center's research is supported in part by the National Science Foundation under Grants No. CDA-9309725, IRI-9020266, and DMI-9496192, in part by the Advanced Research Projects Agency under Grant No. N00014-91-J-1854, administered by the Office of Naval Research, and under Grant No. F49620-92-J-0549, administered by the Air Force Office of Scientific Research, and in part by the Office of Naval Research under Grant No. N00014-91-J-1351.

References

Baffes, P. T. and Mooney, R. J., "Symbolic Revision of Theories with M-of-N Rules," Proceedings of the 2nd International Workshop on Multistrategy Learning, Harpers Ferry, WV, pp. 69-75, 1993.

Bloedorn, E. and Michalski, R.S., "Data-Driven Constructive Induction in AQ17-PRE: A Method and Experiments," Proceedings of the Third International Conference on Tools for AI, San Jose, CA, 1991.

Callan, J.P. and Utgoff, P.E., "A Transformational Approach to Constructive Induction," Proceedings of the Eighth International Workshop on Machine Learning, Evanston, IL, pp. 122-126, 1991.

Fawcett, T.E. and Utgoff, P.E., "A Hybrid Method for Feature Generation," Proceedings of the Eighth International Workshop on Machine Learning, Evanston, IL, pp. 137-141, 1991.

Jensen, G.M., "Determination of Symmetric VL1 Formulas: Algorithm and Program SYM4," M.S. Thesis, Report No. UIUCDCS-R-75-774, Department of Computer Science, University of Illinois, Urbana-Champaign, December 1975.

Michalski, R.S., "Recognition of Total or Partial Symmetry in a Completely or Incompletely Specified Switching Function," Proceedings of the IV Congress of the International Federation on Automatic Control (IFAC), Vol. 27, pp. 109-129, Warsaw, June 16-21, 1969.

Michalski, R.S., "A Theory and Methodology of Inductive Learning," in Machine Learning: An Artificial Intelligence Approach, R.S. Michalski, J.G. Carbonell and T.M. Mitchell (Eds.), TIOGA Publishing, Palo Alto, CA, 1983.

Michalski, R.S., Mozetic, I., Hong, J. and Lavrac, N., "The Multi-Purpose Incremental Learning System AQ15 and its Testing Application to Three Medical Domains," Proceedings of AAAI-86, pp. 1041-1045, Morgan Kaufmann, San Mateo, CA, 1986.

Muggleton, S., "Duce, an Oracle-Based Approach to Constructive Induction," Proceedings of IJCAI-87, pp. 287-292, Morgan Kaufmann, San Mateo, CA, 1987.

Muggleton, S. and Buntine, W., "Machine Invention of First Order Predicates by Inverting Resolution," Proceedings of the 5th International Conference on Machine Learning, pp. 339-352, Morgan Kaufmann, San Mateo, CA, 1988.

Murphy, P. M. and Pazzani, M. J., "ID2-of-3: Constructive Induction of M-of-N Concepts for Discriminators in Decision Trees," Proceedings of the 8th International Workshop on Machine Learning, Evanston, IL, pp. 183-187, 1991.

Quinlan, J.R., "Learning Logical Definitions from Relations," Machine Learning, Vol. 5, No. 3, pp. 239-266, 1990.

Seshu, R., "Solving the Parity Problem," Proceedings of EWSL-89, Montpellier, France, pp. 263-271, 1989.

Spackman, K.A., "Learning Categorical Decision Criteria in Biomedical Domains," Proceedings of the 5th International Conference on Machine Learning, Morgan Kaufmann, San Mateo, CA, pp. 36-46, 1988.

Thrun, S.B., Bala, J., Bloedorn, E., Bratko, I., Cestnik, B., Cheng, J., DeJong, K.A., Dzeroski, S., Fahlman, S.E., Hamann, R., Kaufman, K., Keller, S., Kononenko, I., Kreuziger, J., Michalski, R.S., Mitchell, T., Pachowicz, P., Vafaie, H., Van de Velde, W., Wenzel, W., Wnek, J. and Zhang, J., "The MONK's Problems: A Performance Comparison of Different Learning Algorithms," Technical Report, Carnegie Mellon University, December 1991.

Towell, G. G. and Shavlik, J. W., "Refining Symbolic Knowledge Using Neural Networks," in Machine Learning: A Multistrategy Approach, Vol. IV, R.S. Michalski and G. Tecuci (Eds.), Morgan Kaufmann, San Mateo, CA, 1994.

Wnek, J., Hypothesis-driven Constructive Induction, Ph.D. Dissertation, School of Information Technology and Engineering, George Mason University, Fairfax, VA, University Microfilms Int., Ann Arbor, MI, 1993.

Wnek, J. and Michalski, R.S., "Comparing Symbolic and Subsymbolic Learning: Three Studies," in Machine Learning: A Multistrategy Approach, Vol. 4, R.S. Michalski and G. Tecuci (Eds.), Morgan Kaufmann, San Mateo, CA, 1994a.

Wnek, J. and Michalski, R.S., "Hypothesis-driven Constructive Induction in AQ17-HCI: A Method and Experiments," Machine Learning, Vol. 14, No. 2, pp. 139-168, 1994b.
