Implementing a decision tree

Date: 2016-10-03 08:10:15

Tags: c# machine-learning decision-tree

I am learning how to implement a simple decision tree in C#. Can someone explain what it looks like in pseudocode, or point me to a simple tutorial for implementing one in C#?

I have this data set:

(image: the weather.nominal data set)

(from here: http://storm.cis.fordham.edu/~gweiss/data-mining/weka-data/weather.nominal.arff)

I have already drawn the decision tree graphically: (image: the resulting decision tree)

(sorry for my English)

My only idea so far is this:

if outlook = "overcast" then yes
if outlook = "sunny" and humidity = "normal" then yes
if outlook = "sunny" and humidity = "high" then no
if outlook = "rain" and wind = "true" then no
if outlook = "rain" and wind = "false" then yes

I really don't know how to continue.
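As a first step, those rules can be written down directly as a small C# method. This is only a sketch, and the class, method, and parameter names are my own choices. (Note: in the standard weather.nominal data, outlook = "overcast" always means play = yes, so the sketch returns "yes" for that case.)

```csharp
using System;

// A hard-coded version of the rules above; all names are my own choices.
// Note: in the weather.nominal data, outlook = "overcast" always means "yes".
static class WeatherRules
{
    public static string Play(string outlook, string humidity, bool windy)
    {
        if (outlook == "overcast") return "yes";
        if (outlook == "sunny")    return humidity == "normal" ? "yes" : "no";
        if (outlook == "rainy")    return windy ? "no" : "yes";
        throw new ArgumentException("unknown outlook: " + outlook);
    }
}
```

The obvious drawback is that the tree is fixed in the code; to learn it from the data you need an algorithm such as ID3.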

2 answers:

Answer 0 (score: 2)

To partially answer the question: the concept of a decision tree is described here. To implement a decision tree for the data above, you could declare a class whose members match the columns of the table in your question. Based on that type, you then need a tree data structure in which the number of children per node is unbounded. Although the actual data is only contained in the leaves, it is best to declare every member of the record type as nullable; that way, each inner node only sets the member corresponding to the specific value it branches on for its children. In addition, each node should keep a count of the no and yes outcomes below it.
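As a sketch of what that could look like for the weather table, assuming my own field and class names (the answer does not prescribe any):

```csharp
using System.Collections.Generic;

// One record of the weather table; every member stays null/unset except
// the one a given tree node branches on, as the answer suggests.
class WeatherRecord
{
    public string Outlook;   // "sunny", "overcast" or "rainy"; null if not set
    public string Humidity;  // "high" or "normal"; null if not set
    public bool?  Windy;     // null if not set
    public string Play;      // the target value: "yes" or "no"
}

// A tree node with an unbounded number of children.
class TreeNode
{
    public WeatherRecord Value = new WeatherRecord();
    public List<TreeNode> Children = new List<TreeNode>();
    public int YesCount;     // how many "yes" examples fall below this node
    public int NoCount;      // how many "no" examples fall below this node
}
```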

Answer 1 (score: 2)

If you want to build a decision tree based on the ID3 algorithm, you can refer to this pseudocode:

ID3 (Examples, Target_Attribute, Attributes)
Create a root node for the tree
If all examples are positive, Return the single-node tree Root, with label = +.
If all examples are negative, Return the single-node tree Root, with label = -.
If number of predicting attributes is empty, then Return the single node tree Root,
with label = most common value of the target attribute in the examples.
Otherwise Begin
    A ← The Attribute that best classifies examples.
    Decision Tree attribute for Root = A.
    For each possible value, vi, of A,
        Add a new tree branch below Root, corresponding to the test A = vi.
        Let Examples(vi) be the subset of examples that have the value vi for A
        If Examples(vi) is empty
            Then below this new branch add a leaf node with label = most common target value in the examples
        Else below this new branch add the subtree ID3 (Examples(vi), Target_Attribute, Attributes – {A})
End
Return Root
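The pseudocode above can be sketched in C# along the following lines. This is a simplified sketch under my own modeling assumptions: each example is a Dictionary&lt;string, string&gt; from attribute name to value, "best classifies" is taken to mean highest information gain, and the "Examples(vi) is empty" branch never occurs because branches are only created for values that actually appear in the data. All type and method names are mine.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// A rough C# sketch of the ID3 pseudocode above; names are my own.
class Node
{
    public string Label;     // leaf label, or the attribute this node tests
    public Dictionary<string, Node> Children = new Dictionary<string, Node>();
}

static class Id3
{
    public static Node Build(List<Dictionary<string, string>> examples,
                             string target, List<string> attributes)
    {
        var labels = examples.Select(e => e[target]).Distinct().ToList();
        // All examples share one label: return a single-node tree with it.
        if (labels.Count == 1) return new Node { Label = labels[0] };
        // No attributes left: return a leaf with the most common label.
        if (attributes.Count == 0)
            return new Node { Label = MostCommon(examples, target) };

        // A <- the attribute with the highest information gain.
        string best = attributes
            .OrderByDescending(a => Gain(examples, target, a)).First();
        var root = new Node { Label = best };
        // One branch per value of A that occurs in the examples.
        foreach (var group in examples.GroupBy(e => e[best]))
        {
            var rest = attributes.Where(a => a != best).ToList();
            root.Children[group.Key] = Build(group.ToList(), target, rest);
        }
        return root;
    }

    static string MostCommon(List<Dictionary<string, string>> ex, string target) =>
        ex.GroupBy(e => e[target]).OrderByDescending(g => g.Count()).First().Key;

    // Shannon entropy of the target label distribution, in bits.
    static double Entropy(List<Dictionary<string, string>> ex, string target) =>
        ex.GroupBy(e => e[target])
          .Select(g => (double)g.Count() / ex.Count)
          .Sum(p => -p * Math.Log(p, 2));

    // Information gain of splitting on attr.
    static double Gain(List<Dictionary<string, string>> ex, string target, string attr) =>
        Entropy(ex, target) - ex.GroupBy(e => e[attr])
            .Sum(g => (double)g.Count() / ex.Count * Entropy(g.ToList(), target));
}
```

For the weather data, calling Build with target "play" and the four attribute names should select outlook at the root, matching the well-known tree for this data set.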

If you want to learn more about the ID3 algorithm, follow the ID3 algorithm link.