Bagging optimization reduced performance

ravtal · Member · Posts: 9 · Contributor II
edited January 2020 in Help
Hi,

I am running the attached dataset to measure the performance of my model. A Decision Tree alone gave me a good accuracy value; however, when I wrapped it in a Bagging operator to improve the model, the reported accuracy actually went down.
Could anyone tell me what changes I need to make to the process so that accuracy is optimized?

Note: the dataset has no attribute names, so uncheck "first row as names" and set the column separator to "," while importing the dataset.

Regards,
RT
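
For readers who want to try the comparison outside the RapidMiner GUI, here is a minimal sketch in Python with scikit-learn. It assumes the data sits in a file called data.csv with no header row and the label in the last column, and it uses 10-fold cross-validation and 10 bagged trees; all of these are assumptions, since the thread does not name the file, the label column, or the bagging settings.

    # Sketch only: compare a single decision tree against a bagged ensemble
    # of the same tree using 10-fold cross-validation.
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import BaggingClassifier
    from sklearn.model_selection import cross_val_score

    # "data.csv" and the label position are assumptions; the thread's import
    # settings (no attribute names, "," separator) map to header=None here.
    df = pd.read_csv("data.csv", header=None)
    X, y = df.iloc[:, :-1], df.iloc[:, -1]

    tree = DecisionTreeClassifier(random_state=42)
    # The first argument is the base learner (a decision tree is also the default).
    bagged = BaggingClassifier(DecisionTreeClassifier(random_state=42),
                               n_estimators=10, random_state=42)

    print("Decision tree accuracy:", cross_val_score(tree, X, y, cv=10).mean())
    print("Bagging accuracy:      ", cross_val_score(bagged, X, y, cv=10).mean())

Looking at the two cross-validated scores side by side makes it easier to tell whether the drop is consistent or just split-to-split noise.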

Answers

  • MartinLiebig · Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor · Posts: 3,404 · RM Data Scientist
    Hi @ravtal,
    this sounds like you have overtrained your decision tree. Did you check for it? (A sketch of such a check appears below the thread.)

    Best,
    Martin
    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
  • ravtal · Member · Posts: 9 · Contributor II
    You mean I should reduce the depth?
  • varunm1 · Moderator, Member · Posts: 1,207 · Unicorn
    Hello @ravtal

    As @mschmitz said, it might be due to overfitting. Did you try hyperparameter optimization using the "Optimize Parameters (Grid)" operator? You can search for the best hyperparameters for your algorithm and reduce overfitting. (A rough sketch of this kind of grid search also appears below the thread.)
    Regards,
    Varun
    https://www.varunmandalapu.com/

    Be Safe. Follow precautions and Maintain Social Distancing

  • ravtalravtal MemberPosts:9Contributor II
    @varunm1 OK, but do you think the model design is good?
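
To illustrate the overtraining check Martin suggests, one option is to compare accuracy on the training data with accuracy on held-out data (in RapidMiner, for example, via a Split Validation). Below is a rough Python sketch of that idea; the file name, label column, and 70/30 split are assumptions, not details from the thread.

    # Sketch of an overfitting check: a large gap between training accuracy
    # and test accuracy suggests the tree is overtrained.
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import train_test_split

    df = pd.read_csv("data.csv", header=None)    # assumed file name
    X, y = df.iloc[:, :-1], df.iloc[:, -1]       # assumed label column
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=42)

    tree = DecisionTreeClassifier(random_state=42).fit(X_train, y_train)

    print("Training accuracy:", tree.score(X_train, y_train))
    print("Test accuracy:    ", tree.score(X_test, y_test))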
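
Varun's suggestion corresponds to RapidMiner's "Optimize Parameters (Grid)" operator; the same idea expressed in scikit-learn is a cross-validated grid search over the tree's complexity parameters. The grid values below are purely illustrative, and the file name and label column are again assumptions.

    # Sketch of grid-based hyperparameter optimization for the decision tree.
    import pandas as pd
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.model_selection import GridSearchCV

    df = pd.read_csv("data.csv", header=None)    # assumed file name
    X, y = df.iloc[:, :-1], df.iloc[:, -1]       # assumed label column

    param_grid = {
        "max_depth": [2, 4, 6, 8, 10, None],     # shallower trees overfit less
        "min_samples_leaf": [1, 5, 10, 20],
    }
    search = GridSearchCV(DecisionTreeClassifier(random_state=42),
                          param_grid, cv=10, scoring="accuracy")
    search.fit(X, y)

    print("Best parameters: ", search.best_params_)
    print("Best CV accuracy:", search.best_score_)

Restricting max_depth or min_samples_leaf is the usual way to rein in the overfitting suspected above; None is kept in the grid so the unrestricted tree stays in the comparison.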