"Recursive partitioning regression tree?"

tmanasa (Member, Posts: 13, Contributor II)
edited June 2019 in Help
I would like to discover explanatory variables for a numeric dependent variable (which conditions result in the shortest commute times on any given weekday). Some of the explanatory variables are numeric, others are categorical. Does RapidMiner have any algorithms that can handle that?

I understand that recursive partitioning regression trees can. Is there an equivalent or alternative in RM?

Answers

  • MariusHelf (RapidMiner Certified Expert, Member, Posts: 1,869, Unicorn)
    Hi Ted,

    RapidMiner's Decision Trees can only be used for classification problems, not for regression tasks. However, they can handle both nominal and numeric explanatory attributes.

    If you want to stick to decision trees you can discretize the label (as we call the dependent variable) with one of the Discretize operators.
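    A minimal sketch of that step, assuming the label is a numeric attribute named "commute_time" (a hypothetical name) and using the Discretize by Binning operator (`discretize_by_bins`):

    ```xml
    <!-- Sketch only: bin the numeric label into 3 classes so a
         classification Decision Tree can be trained on it.
         "commute_time" is a hypothetical attribute name. -->
    <operator activated="true" class="discretize_by_bins" name="Discretize">
      <parameter key="attribute_filter_type" value="single"/>
      <parameter key="attribute" value="commute_time"/>
      <parameter key="include_special_attributes" value="true"/>
      <parameter key="number_of_bins" value="3"/>
    </operator>
    ```

    Setting `include_special_attributes` to true matters here, because the label is a special attribute and would otherwise be skipped by the attribute filter.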

    However, decision trees are probably not the best way to assess the explanatory power of variables. For that you could, for example, try Linear Regression and look at the p-values and attribute weights of the resulting model, or use a Forward Selection around a decision tree.
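    For the regression route, a rough sketch of the relevant process fragment (operator class name as in RapidMiner 5+; the source operator "Retrieve commutes" and the exact wiring are assumptions for illustration):

    ```xml
    <!-- Sketch: train a Linear Regression model; its output lists a
         coefficient, standard error, and p-value for each attribute. -->
    <operator activated="true" class="linear_regression" name="Linear Regression"/>
    <connect from_op="Retrieve commutes" from_port="output"
             to_op="Linear Regression" to_port="training set"/>
    <connect from_op="Linear Regression" from_port="model" to_port="result 1"/>
    ```

    Attributes with small p-values and large absolute coefficients are the strongest explanatory candidates.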

    Best regards,
    Marius
  • RalfKlinkenberg (Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, RM Researcher, Member, University Professor, Posts: 68, RM Founder)
    Hi Ted,

    If you install the Weka Extension for RapidMiner, you can use regression tree learners from Weka seamlessly within RapidMiner:






    <connect from_op="用" from_port="output 1" to_op="W-REPTree" to_port="training set"/>
    <connect from_op="用" from_port="output 2" to_op="W-M5P" to_port="training set"/>
    To install the Weka Extension, start RapidMiner, open the Help menu, and select the "Updates and Extensions (Marketplace)" submenu.

    Cheers,
    Ralf