implement this algorithm in rapidminer

Melody · Member · Posts: 9 · Contributor I
edited December 2018 in Help

Hi, I want to implement an algorithm like the one shown below in RapidMiner, but I do not know how. Please guide me.

[Attachment: Untitled.png]


Answers

  • Thomas_Ott · RapidMiner Certified Analyst, RapidMiner Certified Expert, Member · Posts: 1,761 · Unicorn

    Based on your picture, you need a Read operator to load in your data, a Set Role operator to set your label, then a Sample operator, a Cross Validation (CV) operator, and a Stacking operator on the training side of the CV operator. You embed the different machine learners in the Stacking operator.
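    For readers who prefer to see the idea in code, below is a rough scikit-learn sketch of the process described above (load the data, set the label, sample, then cross-validate a stacking model with several embedded learners). It is not part of the original reply; the file name, label column, and choice of learners are placeholders.

    # Conceptual sketch only; the real process is built from RapidMiner operators.
    import pandas as pd
    from sklearn.ensemble import StackingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.naive_bayes import GaussianNB
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score

    # "Read" + "Set Role": load the data and declare the label column (names assumed).
    data = pd.read_csv("data.csv")
    X, y = data.drop(columns=["label"]), data["label"]

    # "Sample": optionally work on a subset of the data.
    subset = data.sample(frac=0.5, random_state=42)
    X_s, y_s = subset.drop(columns=["label"]), subset["label"]

    # "Stacking": base learners plus a meta-learner that combines their predictions.
    stack = StackingClassifier(
        estimators=[("nb", GaussianNB()), ("tree", DecisionTreeClassifier())],
        final_estimator=LogisticRegression(),
    )

    # "Cross Validation": estimate the stacked model's performance.
    print(cross_val_score(stack, X_s, y_s, cv=10).mean())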

  • Melody · Member · Posts: 9 · Contributor I

    Hi, thank you for your reply.
    For the sampling operator, should I use the Bootstrapping operator or Bagging?


    The error shown below occurred in the process I built. What does this error mean, and what should I do?

    [Attachment: p1.png]

  • Thomas_Ott · RapidMiner Certified Analyst, RapidMiner Certified Expert, Member · Posts: 1,761 · Unicorn

    Well, that depends on what you want to do with sampling as you balance your classes. Is it better to bootstrap (i.e. upsample) or to downsample? Have you considered weighting the classes instead, using Generate Weight (Stratification)?

    Your other error means that you can't deliver an example set (EXA) from that operator; instead you need an operator that delivers a model (MOD), something like Naive Bayes, Decision Tree, etc.
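    As a side note for readers comparing the balancing options in code (not from the thread; scikit-learn and pandas stand in for RapidMiner, and the file name, label column, and class values are assumptions), the three choices look roughly like this:

    # Three ways to handle class imbalance: upsample, downsample, or weight.
    import pandas as pd
    from sklearn.utils.class_weight import compute_sample_weight

    data = pd.read_csv("data.csv")                 # hypothetical file
    minority = data[data["label"] == "positive"]   # hypothetical class values
    majority = data[data["label"] == "negative"]

    # Option 1: bootstrap / upsample the minority class to the majority size.
    upsampled = pd.concat([majority,
                           minority.sample(n=len(majority), replace=True, random_state=1)])

    # Option 2: downsample the majority class to the minority size.
    downsampled = pd.concat([minority,
                             majority.sample(n=len(minority), random_state=1)])

    # Option 3: keep every row but weight it inversely to its class frequency,
    # which is the idea behind RapidMiner's Generate Weight (Stratification).
    weights = compute_sample_weight(class_weight="balanced", y=data["label"])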

  • Melody · Member · Posts: 9 · Contributor I

    I want to build an ensemble model that achieves higher classification accuracy on imbalanced data by combining two ensemble methods, bagging and boosting, and using a genetic programming model as the learning algorithm for classifying the imbalanced data. The idea is to use bagging for sampling and feed the sampled data into boosting for training, so that the reweighting produces a better model.

    I want to use genetic programming to improve this model. How do you think I can build it? Is this idea feasible?
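    Purely as a sketch of the idea, not of Melody's actual process: in code, "bagging over boosting" can be expressed by nesting a boosting learner inside a bagging learner. The example below uses scikit-learn (1.2 or later) with a decision tree standing in for the genetic-programming learner, and synthetic imbalanced data so the snippet runs on its own.

    # Bagging wrapped around boosting; a genetic-programming learner could
    # replace the decision tree used here as a stand-in base model.
    from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.datasets import make_classification

    # Synthetic imbalanced data (about 90% negative, 10% positive).
    X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)

    booster = AdaBoostClassifier(estimator=DecisionTreeClassifier(max_depth=1),
                                 n_estimators=50, random_state=0)
    bagged_boosting = BaggingClassifier(estimator=booster, n_estimators=10,
                                        random_state=0)

    # Balanced accuracy is more informative than plain accuracy on imbalanced data.
    print(cross_val_score(bagged_boosting, X, y, cv=5,
                          scoring="balanced_accuracy").mean())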

  • Thomas_Ott · RapidMiner Certified Analyst, RapidMiner Certified Expert, Member · Posts: 1,761 · Unicorn

    Yup, you can do that in RapidMiner. Post your process when you're ready and we can troubleshoot.

  • Melody · Member · Posts: 9 · Contributor I

    Thank you,

    Should I post my process here, or email it to you?

  • Thomas_Ott · RapidMiner Certified Analyst, RapidMiner Certified Expert, Member · Posts: 1,761 · Unicorn

    Please post it to the thread, thanks.

  • Melody · Member · Posts: 9 · Contributor I

    Hi, Mr. Ott.

    Is my process correct?
    Do you think it matches the model I described?
    Is the sampling done the right way?
    How can I give the minority (positive) class a higher weight so that it shows up more strongly in the predictions?

  • Thomas_Ott · RapidMiner Certified Analyst, RapidMiner Certified Expert, Member · Posts: 1,761 · Unicorn

    I'm guessing that the positive class is the minority class. I would handle it by overweighting the minority class and underweighting the majority class. Something like this.

    Then I would use a Cross Validation (not a Split Validation) inside the Optimize Weights operator.
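    For readers who want the equivalent idea in code, here is a hedged sketch (not the attached RapidMiner process; the learner and the synthetic data are placeholders):

    # Overweight the minority class, underweight the majority class, then cross-validate.
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import cross_val_score
    from sklearn.datasets import make_classification

    # Synthetic imbalanced data (about 90% negative, 10% positive).
    X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)

    # class_weight="balanced" assigns each class the weight n_samples / (n_classes * class_count),
    # so the rare (positive) class is weighted up and the common class is weighted down.
    model = DecisionTreeClassifier(class_weight="balanced", random_state=0)

    # Cross Validation rather than a single Split Validation: average over 10 folds.
    print(cross_val_score(model, X, y, cv=10, scoring="balanced_accuracy").mean())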
    [Process XML was attached here but was garbled in the paste. The recoverable fragments correspond to:
    <parameter key="root_relative_squared_error" value="true"/>
    <connect from_port="example set" to_op="Cross Validation" to_port="example set"/>]
  • Melody · Member · Posts: 9 · Contributor I

    I used Cross Validation, but why doesn't the number of wrong predictions in the confusion matrix (the false positives and false negatives) match the error reported by Optimize Weights? Or am I misreading it?

    The visualization is not consistent with the confusion matrix. How can this be corrected?

    [Attachments: 1.png, 2.png]

    And how can I extract the tree from the output of this process?
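    Regarding the confusion-matrix question above, a hedged side note rather than an answer from the thread: in a k-fold cross validation each example is predicted exactly once, in the fold where it is held out, so a confusion matrix aggregated over all folds should sum to the total number of examples, whereas a reported error is typically an average over the per-fold performances. Recomputing both from the same predictions is a useful sanity check when two displayed numbers disagree; a small scikit-learn sketch (synthetic data, stand-in learner):

    # Aggregated out-of-fold confusion matrix vs. averaged per-fold error.
    from sklearn.model_selection import cross_val_predict, cross_val_score
    from sklearn.metrics import confusion_matrix
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.datasets import make_classification

    X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
    model = DecisionTreeClassifier(random_state=0)

    # One out-of-fold prediction per example; matrix entries sum to len(y).
    pred = cross_val_predict(model, X, y, cv=10)
    print(confusion_matrix(y, pred))

    # Average of the per-fold accuracies; 1 - accuracy is the reported error rate.
    print(1 - cross_val_score(model, X, y, cv=10).mean())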
