As an alternative to looping, you can use the Append (superset) operator to turn your object collection into a single example set, connect it to the Data to JSON operator with "generate array" checked, and finally connect that to Write Document, specifying your file (e.g. a path such as C:\DATA\youfile.json), setting the encoding to UTF-8 and checking "overwrite".
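For comparison, here is a rough pandas sketch of the end result of that operator chain, using a hypothetical DataFrame in place of the appended example set and a local file name instead of the full path:

import pandas as pd

# Hypothetical stand-in for the single example set produced by Append.
df = pd.DataFrame({"id": [1, 2, 3], "name": ["a", "b", "c"]})

# orient="records" serializes the rows as a JSON array of objects,
# comparable to Data to JSON with "generate array" checked.
json_array = df.to_json(orient="records", force_ascii=False)

# Write the array as UTF-8 and overwrite any existing file, mirroring
# the Write Document settings described above.
with open("youfile.json", "w", encoding="utf-8") as f:
    f.write(json_array)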
Answers
RapidMiner doesn't have this kind of JSON processing capability.
You could, for example, put the documents one by one into a JSON-capable database like PostgreSQL and use its built-in functionality to group the JSON documents. Alternatively, you could use one of the scripting languages supported by RapidMiner: Groovy, Python or R.
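A minimal sketch of the scripting route, assuming the Execute Python operator hands the documents over as an example set with a text attribute named "document" (a hypothetical attribute name):

import json
import pandas as pd

def rm_main(data):
    # data arrives from the Execute Python operator as a pandas DataFrame;
    # "document" is an assumed attribute holding one JSON object per row.
    objects = [json.loads(text) for text in data["document"]]

    # Merge the individual objects into one JSON array string.
    merged = json.dumps(objects)

    # Hand the array back as a one-row, one-column example set.
    return pd.DataFrame({"json_array": [merged]})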
If the number of documents is not too large, you could try processing the collection with Loop Collection and adding all the elements to a macro with Generate Macro.
Create the macro with "[" as the starting point, then append each element from the collection followed by "," (but not after the last element), and finally close the string with "]". The macro would then contain a valid JSON array.
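The concatenation logic itself is simple; as a plain Python illustration (docs is a hypothetical list of JSON document strings, standing in for the values gathered by Loop Collection):

# Each JSON document as a string, e.g. one per iteration of Loop Collection.
docs = ['{"id": 1}', '{"id": 2}', '{"id": 3}']

# Start with "[", join the elements with "," (so no comma follows the
# last element), and close with "]", the same steps as the macro approach.
json_array = "[" + ",".join(docs) + "]"

print(json_array)  # [{"id": 1},{"id": 2},{"id": 3}]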
Regards,
Balázs