NeuralNet LVQ

Gottfried Member Posts: 17 Maven
edited December 2018 in Knowledge Base

Hello all!

This post is a shout-out to the developers of the "Information Selection" extension (and to Teuvo Kohonen, the inventor of learning vector quantization). The NeuralNet LVQ operator does a great job. It is impressive to see how predictive models - whether logistic regression, neural networks, deep learning, or even naïve Bayes - keep their accuracy even when chained behind this NeuralNet LVQ operator. Try it out: insert it just in front of your favorite predictive learner, connecting the "prototypes" output port of the NN-LVQ to the training port of your learner. Even with as few as 100 prototypes drawn from thousands of rows in your initial training set, your learner will perform well against your test set. Obviously, this helps a great deal in reducing the learning time. It also demonstrates the relevance of learning vector quantization...
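The extension's internals aren't shown in this thread, but the idea of replacing a large training set with a few learned prototypes can be sketched with a minimal LVQ1 loop. This is a hypothetical illustration in plain NumPy, not the operator's actual implementation; the function name `lvq1` and all parameters are my own:

```python
import numpy as np

def lvq1(X, y, n_prototypes_per_class=3, lr=0.1, epochs=30, seed=0):
    """Minimal LVQ1 sketch: learn a small set of labeled prototypes.

    Prototypes start as random training points of each class, then get
    nudged toward same-class samples and away from other-class samples.
    """
    rng = np.random.default_rng(seed)
    protos, labels = [], []
    for c in np.unique(y):
        idx = rng.choice(np.flatnonzero(y == c),
                         n_prototypes_per_class, replace=False)
        protos.append(X[idx].astype(float))
        labels.extend([c] * n_prototypes_per_class)
    P, pl = np.vstack(protos), np.array(labels)
    for epoch in range(epochs):
        rate = lr * (1 - epoch / epochs)  # linearly decaying learning rate
        for i in rng.permutation(len(X)):
            j = np.argmin(((P - X[i]) ** 2).sum(axis=1))  # nearest prototype
            step = rate * (X[i] - P[j])
            P[j] += step if pl[j] == y[i] else -step  # attract or repel
    return P, pl

# Toy data: two well-separated 2-D blobs, 1000 rows total.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (500, 2)), rng.normal(4, 1, (500, 2))])
y = np.array([0] * 500 + [1] * 500)

P, pl = lvq1(X, y)  # 6 prototypes stand in for 1000 rows
# Any learner could now be trained on (P, pl); here a 1-NN on the
# prototypes already classifies the original data well.
pred = pl[np.argmin(((X[:, None, :] - P[None, :, :]) ** 2).sum(-1), axis=1)]
print(P.shape, (pred == y).mean())
```

In RapidMiner terms, `(P, pl)` plays the role of the example set coming out of the "prototypes" port: a much smaller labeled dataset that downstream learners train on instead of the full data.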


Comments

  • MartinLiebig Administrator, Moderator, Employee, RapidMiner Certified Analyst, RapidMiner Certified Expert, University Professor Posts: 3,400 RM Data Scientist

    @marcin_blachnik, I think this goes to you :)

    - Sr. Director Data Solutions, Altair RapidMiner -
    Dortmund, Germany
  • marcin_blachnik Member Posts: 61 Guru

    Thanks for the post!

    I'm glad you are using my extension and that it helps in your work.

    Wishing you all the best!

    Marcin
