Confidence interval calculation on performance
Hello,
I see that for many classification performance metrics, RapidMiner provides an estimation which I interpret as a confidence interval around the value reported. However, I have failed to find the exact calculation that is performed, in particular for the kappa value.
To be clear, my question seems similar to the one from @taghaddo last December (see the post here: https://community.www.turtlecreekpls.com/discussion/54694/error-range-of-classifier), which, as far as I can tell, hasn't been answered. Would anyone be able to clarify this point? Maybe by posting a snippet of the actual source code for that calculation, as seems to be done sometimes (e.g. for the kappa calculation here: https://community.www.turtlecreekpls.com/discussion/54909/regarding-kappa-value-in-cross-validation)?
Many thanks,
François
Best Answer
lionelderkrikor Moderator, RapidMiner Certified Analyst, Member Posts: 1,195 Unicorn
Hi François,
It is because RapidMiner is using, in this particular case, a Cross Validation (see the Help section of this operator).
With this technique, RM trains and tests k models (according to the number of folds you set).
Thus it obtains k performances. Then RM calculates the average and the standard deviation of these k performances.
So the displayed values are: mean +/- standard deviation
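To illustrate, here is a minimal Python sketch of that calculation, assuming you have the k per-fold metric values (the `fold_kappas` numbers below are hypothetical, and I'm not certain whether RapidMiner uses the population or the sample standard deviation internally; this sketch uses the population form):

```python
# Hypothetical kappa value obtained on each of k = 5 folds
fold_kappas = [0.72, 0.68, 0.75, 0.70, 0.71]

k = len(fold_kappas)
mean = sum(fold_kappas) / k

# Population standard deviation over the k fold results
variance = sum((x - mean) ** 2 for x in fold_kappas) / k
std = variance ** 0.5

# Displayed as: mean +/- standard deviation
print(f"kappa: {mean:.3f} +/- {std:.3f}")
```

So the interval is not a formal confidence interval in the statistical sense; it is simply the spread of the metric across the cross-validation folds.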
Regards,
Lionel