"Recursive partitioning regression tree?"
I would like to discover explanatory variables for a numeric dependent variable (which conditions result in the shortest commute times on any given weekday). Some of the explanatory variables are numeric, others are categorical. Does RapidMiner have any algorithms that can handle that?
I understand that recursive partitioning regression trees can. Is there an equivalent or alternative in RM?
Answers
RapidMiner's Decision Trees can only be used for classification problems, not for regression tasks. However, they can deal with both nominal and numeric explanatory attributes.
If you want to stick with decision trees, you can discretize the label (as we call the dependent variable) with one of the Discretize operators, turning the regression task into a classification task.
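Outside RapidMiner, the same idea can be sketched in plain Python: bin the numeric label into a small number of classes so a classification tree can learn it. This mirrors what an equal-width Discretize operator does; the commute values below are made-up illustration data, not from the thread.

```python
# Sketch (not RapidMiner): equal-width binning of a numeric label so a
# classification tree can handle it, analogous to a Discretize operator.
import numpy as np

def discretize(values, n_bins=3):
    """Assign each value to an equal-width bin labeled 0..n_bins-1."""
    values = np.asarray(values, dtype=float)
    edges = np.linspace(values.min(), values.max(), n_bins + 1)
    # np.digitize against the interior edges yields 0..n_bins; clip the
    # maximum value into the last bin.
    return np.clip(np.digitize(values, edges[1:-1]), 0, n_bins - 1)

commute_minutes = [12, 18, 25, 33, 41, 58]  # hypothetical commute times
print(discretize(commute_minutes, n_bins=3))
```

The resulting bin labels ("short", "medium", "long" commutes) can then serve as a nominal label for any classification learner.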
However, decision trees are probably not the best way to assess the explanatory power of variables. For that you could, for example, try Linear Regression and look at the p-values and attribute weights of the resulting model, or wrap a Forward Selection around a decision tree.
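To make the Forward Selection idea concrete, here is a minimal greedy sketch in Python: at each step, add the attribute that most improves the fit of an ordinary least-squares model. This is only an illustration of the principle, not RapidMiner's operator; the attribute names (`rain_mm`, `departure_hour`) and data are invented for the example.

```python
# Sketch of greedy forward selection around a least-squares model.
import numpy as np

def r_squared(X, y):
    """Coefficient of determination of an OLS fit with intercept."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    tot = y - y.mean()
    return 1.0 - (resid @ resid) / (tot @ tot)

def forward_select(X, y, names, k):
    """Greedily pick k columns of X that best explain y."""
    chosen, remaining = [], list(range(X.shape[1]))
    while len(chosen) < k:
        best = max(remaining, key=lambda j: r_squared(X[:, chosen + [j]], y))
        chosen.append(best)
        remaining.remove(best)
    return [names[j] for j in chosen]

# Synthetic example: commute time depends on departure hour, not on rain.
rng = np.random.default_rng(0)
hour = rng.uniform(6, 10, 50)
rain = rng.normal(size=50)                      # pure noise attribute
commute = 5 * hour + rng.normal(scale=0.5, size=50)
X = np.column_stack([rain, hour])
print(forward_select(X, commute, ["rain_mm", "departure_hour"], k=1))
```

The informative attribute is selected first, which is exactly the kind of ranking the Forward Selection operator produces around whatever learner you nest inside it.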
Best regards,
Marius
If you install the Weka Extension for RapidMiner, you can use regression tree learners from Weka seamlessly within RapidMiner. To install the Weka Extension, start RapidMiner, go to the Help menu, and then to the "Updates and Extensions (Marketplace)" submenu.
Cheers,
Ralf