"Optimization of SVM"

funnyhat Member Posts: 13 Contributor II
edited June 2019 in Help
Hi,
I have been spending quite a while on the SVM optimization (or maybe wasting it). The question is more or less the same as http://rapid-i.com/rapidforum/index.php/topic,1573.0.html from the old Rapid-I forum.

The idea is:
1. generate data (binominal label)
2. transform the data
3. optimize the parameters with X-Validation, using AUC as the main criterion
4. draw the ROC for the best parameter set and show the AUC

But when I run the process, the prediction is always negative (the model classifies everything as negative). This really confuses me... Thanks in advance.
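To make the setup concrete, here is a rough scikit-learn sketch of the same idea (only an illustration; the data size, kernel and parameter grid are placeholders, not my actual RapidMiner settings):

# Rough scikit-learn analogue of the RapidMiner process:
# 1. generate data with a binominal label, 2. z-transform the attributes,
# 3. grid-search the SVM parameters with cross-validation using AUC,
# 4. report the best AUC.
import numpy as np
from sklearn.model_selection import GridSearchCV, StratifiedKFold
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))                 # attributes
y = (rng.random(100) < 0.42).astype(int)      # random binominal label, ~42% positive

pipe = Pipeline([("ztransform", StandardScaler()),
                 ("svm", SVC(kernel="rbf"))])
grid = {"svm__C": [0.01, 0.1, 1, 10, 100],
        "svm__gamma": [0.001, 0.01, 0.1, 1]}

search = GridSearchCV(pipe, grid, scoring="roc_auc",
                      cv=StratifiedKFold(n_splits=10))
search.fit(X, y)

print("best cross-validated AUC:", search.best_score_)
# On purely random data the AUC stays around 0.5: there is nothing to learn,
# and the best constant strategy is to always predict the majority class.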



- <operator activated="true" class="process" expanded="true" name="Root">
  <parameter key="process_duration_for_mail" value="30"/>
- <operator activated="true" class="generate_data" expanded="true" height="60" name="TrainData" width="90" x="45" y="30">
- <operator activated="true" class="normalize" expanded="true" height="94" name="Ztransformation" width="90" x="180" y="30">
- <operator activated="true" class="remember" expanded="true" height="60" name="IOStorer" width="90" x="313" y="30">
- <operator activated="true" class="optimize_parameters_grid" expanded="true" height="148" name="ParameterOptimization" width="90" x="313" y="120">
- <operator activated="true" class="x_validation" expanded="true" height="112" name="Validation" width="90" x="112" y="30">
- <operator activated="true" class="support_vector_machine_libsvm" expanded="true" height="76" name="Training" width="90" x="82" y="30">
- <operator activated="true" class="apply_model" expanded="true" height="76" name="Test" width="90" x="45" y="30">
- <operator activated="true" class="performance_binominal_classification" expanded="true" height="76" name="Performance (2)" width="90" x="112" y="210">
- <operator activated="true" class="log" expanded="true" height="112" name="Log" width="90" x="313" y="120">
  <connect from_op="ParameterOptimization" from_port="performance" to_port="result 1"/>
  <connect from_op="ParameterOptimization" from_port="parameter" to_port="result 2"/>
  <connect from_op="ParameterOptimization" from_port="result 1" to_port="result 3"/>
  <connect from_op="ParameterOptimization" from_port="result 2" to_port="result 4"/>
  <connect from_op="ParameterOptimization" from_port="result 3" to_port="result 5"/>

Answers

  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    please switch to the XML view of RapidMiner to see the process as text, then copy it to the clipboard and paste it here; the Internet Explorer representation contains these "-" signs, which are not allowed in XML. I will then try to reproduce the behavior and check whether there is a bug.

    Greetings,
    Sebastian
  • funnyhat Member Posts: 13 Contributor II
    <parameter key="process_duration_for_mail" value="30"/>
    ...
    <connect from_op="ParameterOptimization" from_port="performance" to_port="result 1"/>
    <connect from_op="ParameterOptimization" from_port="parameter" to_port="result 2"/>
    <connect from_op="ParameterOptimization" from_port="result 1" to_port="result 3"/>
    <connect from_op="ParameterOptimization" from_port="result 2" to_port="result 4"/>
    <connect from_op="ParameterOptimization" from_port="result 3" to_port="result 5"/>
    I hope it works this time. Many thanks!!
  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    that seems to me to deliver the best performance anyway, doesn't it? Or what do you want the algorithm to do if you feed it completely random data? Learn the random seed? Possible, but more complex...

    Greetings,
    Sebastian
  • funnyhat Member Posts: 13 Contributor II
    Very, very strange. I am using version 5.0.005 without any extensions.

    I cannot upload the result pictures here, but the accuracy table is:

    accuracy: 58.00% +/- 16.61%

                        true negative    true positive
    pred. negative            58               42
    pred. positive             0                0
    class recall          100.00%            0.00%

    So every example is predicted as negative, and the accuracy is just the frequency of the negative class.

    I am really confused... thanks.
  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    it's really simple: the data you are trying to learn from is completely random, so there is no statistical dependency between the attribute values and the label. Without such a dependency you cannot predict the label from the attribute values, because they are simply independent of the label value. So the best thing you can do is to always predict the most frequent class.
    And that is exactly what the SVM does.
    To get nicer results, change the parameter of the data generator to something that does not contain "random" in its name.
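    For illustration, a minimal sketch of the same effect outside RapidMiner (plain scikit-learn on made-up data; the two target functions here are just stand-ins for the data generator's options):

# A cross-validated SVM on random labels vs. on a label that really depends
# on the attributes (a rough stand-in for RapidMiner's data generator).
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC

rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))

y_random = (rng.random(200) < 0.42).astype(int)     # label independent of X
y_signal = (X[:, 0] + X[:, 1] > 0).astype(int)      # label depends on X

for name, y in [("random", y_random), ("signal", y_signal)]:
    pred = cross_val_predict(SVC(kernel="rbf", C=1.0, gamma="scale"), X, y, cv=5)
    counts = np.bincount(pred, minlength=2)
    print(f"{name}: predicted negative/positive = {counts[0]}/{counts[1]}, "
          f"accuracy = {(pred == y).mean():.2f}")
# Typically the "random" case collapses to (almost) only the majority class with
# accuracy near its base rate, while "signal" predicts both classes with a much
# higher accuracy.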

    Greetings,
    Sebastian
  • funnyhat Member Posts: 13 Contributor II
    OK, thanks!! I overestimated the capability of the SVM; I thought it could find something out anyway. I will try it with some other dataset anyhow. By the way, I hope the SVM can pick out some attribute weights and generate a better model... Also, I hope my 5.0.005 version is not too bad compared to the current version.
  • funnyhat Member Posts: 13 Contributor II
    One more thing: I noticed a warning in the log file saying "performance criterion AUC was already part of performance vector, overwritten"... I hope this does no harm.
  • land RapidMiner Certified Analyst, RapidMiner Certified Expert, Member Posts: 2,531 Unicorn
    Hi,
    what exactly is the problem with the current 5.0.006 version?

    Greetings,
    Sebastian
  • funnyhat Member Posts: 13 Contributor II
    I cannot upgrade because of administration rights. It is just a local problem, which I could solve by buying a new computer. Anyhow, if you have time, please check whether AUC is shown properly in the performance results when it is selected as the main criterion during the optimization; I am not so sure about it so far. Have a nice weekend.
  • IngoRM Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, Community Manager, RM Researcher, Member, University Professor Posts: 1,751 RM Founder
    Hi,

    just quickly jumping in:

    "I overestimated the capability of the SVM. I thought it could find something out anyway."
    I think it is exactly the strength of the SVM to not fit the model to random data; this reduces the risk of overfitting. A neural net, for example, can easily be tuned to learn the random data (or to "memorize" it...), but this is exactly the reason why I prefer SVMs over NNs ;D
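    As a small sketch of that memorization effect (scikit-learn again, made-up random data, an MLP as the flexible learner):

# A flexible neural net can "memorize" purely random labels: the training
# accuracy becomes very high while the cross-validated accuracy stays at chance.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(200, 5))
y = (rng.random(200) < 0.5).astype(int)          # random labels, nothing to learn

net = MLPClassifier(hidden_layer_sizes=(100,), max_iter=5000, random_state=0)
print("training accuracy:       ", net.fit(X, y).score(X, y))
print("cross-validated accuracy:", cross_val_score(net, X, y, cv=5).mean())
# The gap between the two numbers is exactly the overfitting described above.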

    Just my 2c. Cheers,
    Ingo