About the LinearRegression operator

Legacy User Member Posts: 0 Newbie
edited November 2018 in Help
Hi,

I am a new user of RapidMiner and I am currently trying to train a model. I used the LinearRegression operator and ran it on my dataset, but in the end I did not get an equation.

My process is as follows:

Root

>CSV Example Source

>Attribute Filter

>RemoveUselessAttributes

>Genetic Algorithm

>>Operator Chain

>>>X-Validation

>>>>LinearRegression

>>>>OperatorChain(2)

>>>>>>ModelApplier

>>>>>>RegressionPerformance

>>>>ProcessLog

>CSV Example Set Writer


Thanks!

Answers

  • haddock Member Posts: 849 Maven
    Hi there,

    The optimisation produces a parameter set for the learning operator (and anything else you tweak), rather than a model. To keep it simple, the process is like this:

    1. Generate a parameter set - you've done that.

    2. Apply that set using the parameter setter operator. You'll need to map between the optimised parameters and the ones you will use. So if I had optimised an SVM and some Validation settings, I would map the names of those operators in the parameter set to the corresponding operators in the process (the sketch after step 3 shows the same flow in code).

    3. Run the same learner on an appropriate dataset (careful here). That will produce the optimised model, which has this sort of form:

    31.736 * a1 + 42.948 * a2 + 23.773 * a3 + 3.706 * a4 - 4.184 * a5 - 304.228

    Where a1 etc. are the attribute names.
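
    If it helps to see the same three steps outside the GUI, here is a rough scikit-learn sketch of the idea. This is not RapidMiner itself; the synthetic data, the attribute names a1..a5, and the choice of Ridge as a linear learner with a tunable alpha are just assumptions for illustration.

    # Rough scikit-learn analogue of the three steps above (illustration only):
    # 1) a cross-validated search produces a parameter set, not a model,
    # 2) that parameter set is applied to a fresh learner (what the parameter
    #    setter does in RapidMiner), and
    # 3) the learner is rerun to get the optimised model and its equation.
    import numpy as np
    from sklearn.linear_model import Ridge
    from sklearn.model_selection import GridSearchCV

    # Made-up data with five attributes a1..a5.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = X @ np.array([31.7, 42.9, 23.8, 3.7, -4.2]) - 304.2 + rng.normal(scale=0.1, size=200)

    # Step 1: the optimisation yields a parameter set.
    search = GridSearchCV(Ridge(), param_grid={"alpha": [0.01, 0.1, 1.0, 10.0]}, cv=5)
    search.fit(X, y)
    best_params = search.best_params_          # e.g. {"alpha": 0.01}

    # Step 2: apply that parameter set to a fresh copy of the same learner.
    model = Ridge(**best_params)

    # Step 3: rerun the learner on an appropriate dataset ...
    model.fit(X, y)

    # ... and the fitted model is exactly an equation of the form shown above.
    terms = "".join(f"{w:+.3f} * a{i + 1} " for i, w in enumerate(model.coef_))
    print(f"{terms}{model.intercept_:+.3f}")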

    With RM a big skill is keeping track of the inputs and outputs of the operators, because sometimes they are not what you think!

  • Legacy User Member Posts: 0 Newbie
    Hi haddock,

    Thanks for your suggestion.

    So the parameter setter operator should include two learner operators and be placed after the genetic algorithm?

    Thanks
  • haddock Member Posts: 849 Maven
    No, just the one learner after setting the parameters.