"Neural Networks: Nominal Class"

mksaad Member Posts: 42 Guru
edited May 2019 in Help
Hi,

I tried to train a neural network with a nominal class label dataset (iris dataset), but I received the following error:
This learning scheme does not have sufficient capabilities for the given data set: polynominal label not supported
How can I train a neural network with a nominal class label dataset?

Greetings,
--
Motaz K. Saad

Answers

  • IngoRM Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, Community Manager, RMResearcher, Member, University Professor Posts: 1,751 RM Founder
    Hi,

    Just add the NeuralNet operator inside the operator "Binary2MultiClassLearner", which can be used to transform any binominal learning scheme into one that can be applied to multiple classes. Here is an example:
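
    A minimal sketch of such a process in RapidMiner's XML format might look like the following (the iris.aml location is only a placeholder for your own attribute description file):

    <operator name="Root" class="Process">
        <!-- Load the Iris data set; point the attributes parameter at your .aml file. -->
        <operator name="ExampleSource" class="ExampleSource">
            <parameter key="attributes" value="iris.aml"/>
        </operator>
        <!-- Wrap the binominal learner so it can handle all three Iris classes. -->
        <operator name="Binary2MultiClassLearner" class="Binary2MultiClassLearner">
            <operator name="NeuralNet" class="NeuralNet"/>
        </operator>
    </operator>

    With a one-against-all strategy the wrapper internally trains one binary net per class label (three for Iris) and combines their confidences into a single polynominal prediction.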

    Cheers,
    Ingo
  • mksaad Member Posts: 42 Guru
    Hi,
    Thanks for replying

    Hmmm ::), it builds 3 NNs, one for each class value!


    Do you recommend other solutions rather than the "Binary2MultiClassLearner" operator, like replacing each class label value with a numeric value?
    For example, for the Iris dataset, I did the following:
    replaced the "Iris Setosa" class with 10
    replaced the "Iris Versicolour" class with 20
    replaced the "Iris Virginica" class with 30

    After training the NN for 1000 training cycles, with a learning rate of 0.3, momentum of 0.2, and error epsilon of 0.00, I got the following results:
    root_mean_squared_error: 1.802 +/- 0.000
    squared_error: 3.246 +/- 12.872


    When I tried replacing the nominal class values with smaller numbers, I got different results for the same number of training cycles (1000), like the following:
    replaced "Iris Setosa" class with 0
    replaced "Iris Versicolour" class with 1
    replaced "Iris Virginica" class with 2

    I got the following results
    root_mean_squared_error: 0.180 +/- 0.000
    squared_error: 0.032 +/- 0.129

    What do you think?!

    Another question, please: what does the +/- 0.129 in the squared_error results stand for?

    Warm greetings,
    --
    Motaz K. Saad
  • TobiasMalbrecht Moderator, Employee, Member Posts: 294 RM Product Management
    Hi Motaz,
    mksaad wrote:

    What do you think?!
    Hmm, what do you think?

    Let me try to help you reach a conclusion yourself by mirroring back what you did:

    First of all, you had a classification problem. Then you transformed this to a regression problem by arbitrarily mapping the three classes to real values. Afterwards you compared the (regression!) errors you obtained with two different mappings. You observed that the errors were different. Actually, the errors seem to be scaled as your label values imply.
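
    To make that scaling concrete (under the simplifying assumption that the network's predictions transform the same way as the labels): your first mapping (10, 20, 30) is just an affine rescaling of your second one (0, 1, 2), namely y' = 10*y + 10, and rescaling both labels and predictions by a factor a scales every residual by a. Hence RMSE' = |a| * RMSE and squared_error' = a^2 * squared_error; with a = 10 that predicts 10 * 0.180 = 1.80 and 100 * 0.032 = 3.2, which is almost exactly what you reported (1.802 and 3.246).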

    But what happens if you map the three classes to the values 2, 1, 0? ... or 0.1, 38297159 and 7? Or -4, 328 trillion and pi? Well, the errors will surely be different again. But the example mappings I just mentioned are no less reasonable than the mappings you tried. The point is: turning a classification problem into a regression problem and examining the regression errors gives you almost no information at all. If you want to use a regression learner on a classification problem, you at least have to map the predictions back to the classes and examine the classification errors. Nevertheless, this will still depend heavily on the mapping you have chosen.

    Hence, to cut a long story short: the method Ingo proposed is the appropriate way to use neural nets for multi-class classification problems.

    Hope I could clarify things a bit.

    Regards,
    Tobias
  • mksaad Member Posts: 42 Guru
    Hi Tobias,

    Thanks for your reply.

    What does the +/- 0.129 in the squared_error results stand for?


    Thanks in advance,
    --
    Motaz
  • TobiasMalbrecht Moderator, Employee, Member Posts: 294 RM Product Management
    Hi Motaz,

    Sorry, I forgot to answer that question! ;)

    That value is the standard deviation of the error in the folds (assuming you have done a cross validation).
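
    Concretely (assuming a k-fold cross-validation): if e_1, ..., e_k are the error values measured on the individual folds, the report shows mean(e_1, ..., e_k) +/- stdev(e_1, ..., e_k). So 0.032 +/- 0.129 means the average squared error over the folds was 0.032, and the per-fold values spread around that average with a standard deviation of 0.129.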

    Regards,
    Tobias