

Overview

Skill Level: Beginner


This recipe shows how to build a general-purpose architecture for voice interaction on Bluemix. Audio or text from a device reaches Node-RED through the IBM IoT Platform, MQTT, HTTP, WebSocket, or TCP, is transcribed with the Speech to Text service where needed, is classified through a tree of Natural Language Classifier classifiers, and is then routed to an output such as an IoT command, a push notification, a WebSocket message, or an HTTP response.

Ingredients

  • Bluemix

  • Node-RED

  • Speech to Text / Text to Speech services

  • Natural Language Classifier service

  • IBM IoT Platform

 

Step-by-step

  1. Kickstart

    Node-RED Boilerplate

    (Skip this section if you already have a Node-RED starter application running.)

    In your Bluemix account, navigate to the catalog and select the Node-RED Starter under Boilerplates.

    Follow the instructions to create an instance of the Node-RED Starter app on Bluemix.

    GitHub

    You may also choose to link your new Bluemix app to GitHub. This is not required for our prototyping environment, but it can be a very useful tool.

    For documentation on the Node-RED Starter feel free to visit https://www.ng.bluemix.net/docs/#starters/Node-RED/nodered.html#nodered

  2. Node-RED Sample Flow

    Launch the Node-RED flow editor from the App overview page.

    Feel free to access the original flow at http://generalclassifierapp.mybluemix.net/red/# You can also copy the nodes from that flow and export them to your clipboard.

    Enter the Node-RED flow editor and use the menu in the top right corner to import the following from the clipboard:

    [{“id”:”27b68ad3.846706″,”type”:”websocket-listener”,”z”:””,”path”:”/ws/test”,”wholemsg”:”false”},{“id”:”538a8f23.25ac6″,”type”:”subflow”,”name”:”Context Classifier Tier1″,”info”:””,”in”:[{“x”:42.5,”y”:212.5,”wires”:[{“id”:”89e2f248.48e9f8″}]}],”out”:[{“x”:1085,”y”:126.24996948242188,”wires”:[{“id”:”f8595231.ad40f”,”port”:0}]},{“x”:1086.25,”y”:187.49996948242188,”wires”:[{“id”:”f8595231.ad40f”,”port”:1}]},{“x”:1087.5,”y”:249.99996948242188,”wires”:[{“id”:”f8595231.ad40f”,”port”:2}]},{“x”:1086.25,”y”:316.25,”wires”:[{“id”:”f8595231.ad40f”,”port”:3}]}]},{“id”:”89e2f248.48e9f8″,”type”:”http request”,”z”:”538a8f23.25ac6″,”name”:”Tier1 Classifier”,”method”:”GET”,”ret”:”txt”,”url”:”https://gateway.watsonplatform.net/natural-language-classifier/api/v1/classifiers//classify?text=”{{{payload}}}””,”x”:209,”y”:212.75,”wires”:[[“dce33bec.cf186”]]},{“id”:”f8595231.ad40f”,”type”:”switch”,”z”:”538a8f23.25ac6″,”name”:”Tier1 Switch”,”property”:”globalclass”,”rules”:[{“t”:”eq”,”v”:”Insert”},{“t”:”eq”,”v”:”Your”},{“t”:”eq”,”v”:”Classifications”},{“t”:”eq”,”v”:”Here”}],”checkall”:”true”,”outputs”:4,”x”:897.2999267578125,”y”:213.04998779296875,”wires”:[[],[],[],[]]},{“id”:”cb8e10e6.126eb8″,”type”:”function”,”z”:”538a8f23.25ac6″,”name”:”Tier1 Classification Extractor”,”func”:”msg.tier1=msg.payload.top_class;n//node.warn will show the chosen class in the debug tab for testingnnode.warn(msg.tier1);nmsg.payload=msg.inputtext;nreturn msg;”,”outputs”:1,”noerr”:0,”x”:664.7999267578125,”y”:213.04998779296875,”wires”:[[“f8595231.ad40f”]]},{“id”:”dce33bec.cf186″,”type”:”json”,”z”:”538a8f23.25ac6″,”name”:”NLC to JSON “,”x”:417.300048828125,”y”:213.04998779296875,”wires”:[[“cb8e10e6.126eb8”]]},{“id”:”36d2d9b4.72b2e6″,”type”:”subflow”,”name”:”Global Classifier”,”info”:”This is the first classifiernit makes global decisions basednon what classifications you need.nnThere can be many outputs.nIf you continue using the nmsg.inputtext to cary the message nto be classified.nThe only nodes you need to modifynfor it to work correctly are thenGlobal Context Classifer, and thenGlobal Switch node.nnThe Global Classification is storednas msg.globalclass in case you wantnto use it later in the flow.nnYou can build as many outputs as you nwould like, but keep in mind thatnthe more classes in your classifiernthe more complex your dataset must be.nnThere is a ballence to be struck betweennaccuracy, speed, and simplicity.n”,”in”:[{“x”:76.4500732421875,”y”:226.20001220703125,”wires”:[{“id”:”2936fd9e.a278d2″}]}],”out”:[{“x”:1132.5,”y”:122.49996948242188,”wires”:[{“id”:”942a347d.0bd6f”,”port”:0}]},{“x”:1132.5,”y”:186.24996948242188,”wires”:[{“id”:”942a347d.0bd6f”,”port”:1}]},{“x”:1135,”y”:247.5,”wires”:[{“id”:”942a347d.0bd6f”,”port”:2}]},{“x”:1136.25,”y”:311.25,”wires”:[{“id”:”942a347d.0bd6f”,”port”:3}]}]},{“id”:”2936fd9e.a278d2″,”type”:”http request”,”z”:”36d2d9b4.72b2e6″,”name”:”Global Context Classifier”,”method”:”GET”,”ret”:”txt”,”url”:”https://gateway.watsonplatform.net/natural-language-classifier/api/v1/classifiers//classify?text=”{{{payload}}}””,”x”:261.9500732421875,”y”:226.20001220703125,”wires”:[[“1ecff052.89d338”]]},{“id”:”942a347d.0bd6f”,”type”:”switch”,”z”:”36d2d9b4.72b2e6″,”name”:”Global Switch”,”property”:”globalclass”,”rules”:[{“t”:”eq”,”v”:”Insert”},{“t”:”eq”,”v”:”Your”},{“t”:”eq”,”v”:”Classifications”},{“t”:”eq”,”v”:”Here”}],”checkall”:”true”,”outputs”:4,”x”:950.25,”y”:226.5,”wires”:[[],[],[],[]]},{“id”:”ae2499e1.693c2″,”type”:”function”,”z”:”36d2d9b4.72b2e6″,”name”:”Global Classification 
Extractor”,”func”:”msg.globalclass=msg.payload.top_class;n//node.warn will show the chosen class in the debug tab for testingnnode.warn(msg.globalclass);nmsg.payload=msg.inputtext;nreturn msg;”,”outputs”:1,”noerr”:0,”x”:717.75,”y”:226.5,”wires”:[[“942a347d.0bd6f”]]},{“id”:”1ecff052.89d338″,”type”:”json”,”z”:”36d2d9b4.72b2e6″,”name”:”NLC to JSON “,”x”:470.2501220703125,”y”:226.5,”wires”:[[“ae2499e1.693c2”]]},{“id”:”b8877e3e.42195″,”type”:”inject”,”z”:”3f44d95b.e1bca6″,”name”:”Test Text Input to Classifier Tree”,”topic”:””,”payload”:””,”payloadType”:”string”,”repeat”:””,”crontab”:””,”once”:false,”x”:175.03326416015625,”y”:159.533353805542,”wires”:[[“6aec91d4.989ab”]]},{“id”:”6aec91d4.989ab”,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Input stored as msg.inputtext”,”func”:”msg.inputtext=msg.payload;nreturn msg;”,”outputs”:1,”noerr”:0,”x”:1031.9166946411133,”y”:155.25000762939453,”wires”:[[“b33f5aef.019ae”]]},{“id”:”de1fbf8a.c99628″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Global Context Classifier”,”info”:”This is the first classifiernit makes global decisions basednon what classifications you need.nnThere can be many outputs.nIf you continue using the nmsg.inputtext to cary the message nto be classified.nThe only nodes you need to modifynfor it to work correctly are thenGlobal Context Classifer, and thenGlobal Switch node.nnThe Global Classification is storednas msg.globalclass in case you wantnto use it later in the flow.nnYou can build as many outputs as you nwould like, but keep in mind thatnthe more classes in your classifiernthe more complex your dataset must be.nnThere is a ballence to be struck betweennaccuracy, speed, and simplicity.n”,”x”:1287.333351135254,”y”:58.999982833862305,”wires”:[]},{“id”:”b33f5aef.019ae”,”type”:”subflow:36d2d9b4.72b2e6″,”z”:”3f44d95b.e1bca6″,”name”:”Global Classification”,”x”:1273.3917922973633,”y”:153.22501373291016,”wires”:[[],[],[],[“34c60b1f.46d3ac”]]},{“id”:”34c60b1f.46d3ac”,”type”:”subflow:538a8f23.25ac6″,”z”:”3f44d95b.e1bca6″,”x”:1539.4167556762695,”y”:247.75000762939453,”wires”:[[],[],[],[“6b9a4732.76af8”]]},{“id”:”dff37716.704d18″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Tier1 Classifier”,”info”:”Typically this is the level of depth required nfor basic actions or information lookup to benidentified accurately.”,”x”:1513.583351135254,”y”:58.999982833862305,”wires”:[]},{“id”:”6384520b.e13bec”,”type”:”ibmiot in”,”z”:”3f44d95b.e1bca6″,”authentication”:”quickstart”,”inputType”:”evt”,”deviceId”:””,”applicationId”:””,”deviceType”:””,”eventType”:””,”commandType”:””,”format”:””,”name”:”IBM IoT”,”service”:””,”allDevices”:””,”allApplications”:””,”allDeviceTypes”:””,”allEvents”:””,”allCommands”:””,”allFormats”:””,”x”:105.33329010009766,”y”:274.1666793823242,”wires”:[[“f8d65074.30ec28”]]},{“id”:”1b2a65be.0e83b2″,”type”:”watson-speech-to-text”,”z”:”3f44d95b.e1bca6″,”name”:””,”lang”:”en-US”,”band”:”BroadbandModel”,”x”:661.27783203125,”y”:468.1166687011719,”wires”:[[“fc6fe658.de03″,”a05d1df7.ddb9b8”]]},{“id”:”9217ebdc.6eae8″,”type”:”ibmiot in”,”z”:”3f44d95b.e1bca6″,”authentication”:”quickstart”,”apiKey”:””,”inputType”:”evt”,”deviceId”:””,”applicationId”:””,”deviceType”:”HarmanTest”,”eventType”:”audio”,”commandType”:””,”format”:”JSON”,”name”:”IBM IoT”,”service”:”registered”,”allDevices”:true,”allApplications”:””,”allDeviceTypes”:true,”allEvents”:””,”allCommands”:””,”allFormats”:””,”x”:102.6943359375,”y”:469.7500915527344,”wires”:[[“1bbe3528.1ceff3”]]},{“id”:”1bbe3528.1ceff3″,”type”:”switch”,”z”:”3f44d95b.e1bca6″,”name”:”switch msg 
type”,”property”:”payload.type”,”rules”:[{“t”:”eq”,”v”:”start”},{“t”:”eq”,”v”:”end”},{“t”:”eq”,”v”:”body”}],”checkall”:”true”,”outputs”:3,”x”:278.4943542480469,”y”:469.7500457763672,”wires”:[[“79cce4c1.f7291c”],[“7d907e33.6bee2”],[“1ac8be7a.f75b3a”]]},{“id”:”4e8330bb.19ab28″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Message Formats”,”info”:”Messages are published to IoTF on event `event_type`nnA new audio transmission always begins with a `start` message containing some metadata. nThe wav file is then base64 encoded and sent in chunks as `body` messages. nFinally, an `end` message is sent to identify the end of an audio transmission. nnStart Message:n————–n“`jsonn{n “type”:”start”,n “id”:”12345″,n “format”:”audio/wav”,n “size”:125234n}n“`nnBody Message:n————-n“`jsonn{n “type”:”body”,n “id”:”12345″,n “body”:”YWJj”n}n“`nnEnd Message:n————n“`jsonn{n “type”:”end”,n “id”:”12345″n}n“`”,”x”:530.4943542480469,”y”:381.7500305175781,”wires”:[]},{“id”:”fc6fe658.de03″,”type”:”debug”,”z”:”3f44d95b.e1bca6″,”name”:””,”active”:true,”console”:”false”,”complete”:”transcription”,”x”:848.8277587890625,”y”:428.4166564941406,”wires”:[]},{“id”:”f8d65074.30ec28″,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Extract Assumptions From IoT”,”func”:”//Depending on the information sent to IoT some n//Assumptions can be captured at this stagenn//uncomment this code if you have informationn//to store from the device sent to the IoTFn//msg.assumptions=msg.payload.;nn//Store the main user text in msg.payloadn//This code is completed by establishing the n//structure of your uploaded data through IoTFnmsg.payload=msg.payload.;nreturn msg;”,”outputs”:1,”noerr”:0,”x”:323.27759552001953,”y”:318.6166763305664,”wires”:[[“6aec91d4.989ab”]]},{“id”:”79cce4c1.f7291c”,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Start”,”func”:”nreturn msg;”,”outputs”:1,”noerr”:0,”x”:494.9999694824219,”y”:420.5000305175781,”wires”:[[]]},{“id”:”7d907e33.6bee2″,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”End”,”func”:”nreturn msg;”,”outputs”:1,”noerr”:0,”x”:494.9999694824219,”y”:469.2500305175781,”wires”:[[“1b2a65be.0e83b2”]]},{“id”:”1ac8be7a.f75b3a”,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Body”,”func”:”nreturn msg;”,”outputs”:1,”noerr”:0,”x”:494.9999694824219,”y”:513.0000305175781,”wires”:[[]]},{“id”:”7e183f9.54e55c”,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Basic Input Types”,”info”:””,”x”:101.41668701171875,”y”:27.333328247070312,”wires”:[]},{“id”:”4597c07f.e8deb8″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Test”,”info”:”If you want to test your own input. 
This can ncontain a string to be classified.”,”x”:86.33334350585938,”y”:111,”wires”:[]},{“id”:”5df02fa5.4ae848″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”IoT Message”,”info”:”If the string is sent to our IoT platform alongnwith the device metadata that we will use.nnWe can exctract the string to be classified andnstore other message data in this function.”,”x”:96.66665649414062,”y”:207.66668701171875,”wires”:[]},{“id”:”a05d1df7.ddb9b8″,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Transcription to Payload”,”func”:”//This will send information from an instance n//of speech to text to the payloadnmsg.payload=msg.transcription;nreturn msg;”,”outputs”:1,”noerr”:0,”x”:871.5277099609375,”y”:507.6167297363281,”wires”:[[“6aec91d4.989ab”]]},{“id”:”b562bb62.5fecf8″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Streaming Audio to IoT”,”info”:””,”x”:146.39999389648438,”y”:422.4000549316406,”wires”:[]},{“id”:”ae2b6da.a0b9d1″,”type”:”mqtt in”,”z”:”3f44d95b.e1bca6″,”name”:””,”topic”:””,”x”:98.39999389648438,”y”:652.4000244140625,”wires”:[[“90b9be33.a19fb8”]]},{“id”:”90b9be33.a19fb8″,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Prep MQTT as Input to Speech to Text”,”func”:”//Prep MQTT message from your devicen//as a stream or recording for n//speech to textnreturn msg;”,”outputs”:1,”noerr”:0,”x”:368.3999938964844,”y”:652.4000244140625,”wires”:[[“1b2a65be.0e83b2”]]},{“id”:”22bb8df0.7408da”,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Message over MQTT”,”info”:”This can be either a stream or a file sentnover mqtt”,”x”:132.39999389648438,”y”:592.4000244140625,”wires”:[]},{“id”:”e288c4a5.d4aab”,”type”:”http in”,”z”:”3f44d95b.e1bca6″,”name”:””,”url”:””,”method”:”get”,”x”:98.39999389648438,”y”:758.3999938964843,”wires”:[[“2046b7e8.2034f”]]},{“id”:”f0f137ba.ffc02″,”type”:”websocket in”,”z”:”3f44d95b.e1bca6″,”name”:””,”x”:114.39999389648438,”y”:884.4000244140625,”wires”:[[“f877a742.4fe968”]]},{“id”:”f877a742.4fe968″,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Extract from Websocket”,”func”:”//Recordingn//msg.payload=msg.payload.;nn//Streamn//msg.payload=msg.payload.;nreturn msg;”,”outputs”:1,”noerr”:0,”x”:356.3999938964844,”y”:884.4000244140625,”wires”:[[“1b2a65be.0e83b2”]]},{“id”:”bf0baef1.d9091″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Websocket”,”info”:””,”x”:108.39999389648438,”y”:834.3999938964841,”wires”:[]},{“id”:”2928494c.fb3e0e”,”type”:”tcp in”,”z”:”3f44d95b.e1bca6″,”server”:”client”,”host”:””,”port”:””,”datamode”:”stream”,”datatype”:”buffer”,”newline”:””,”topic”:””,”name”:””,”base64″:false,”x”:106.39999389648438,”y”:1012.4000244140625,”wires”:[[“55ab6bb0.f57ffc”]]},{“id”:”ee07d28c.73f0f”,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”TCP”,”info”:””,”x”:100.39999389648438,”y”:958.4000244140625,”wires”:[]},{“id”:”2046b7e8.2034f”,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Extract Recording from HTTP”,”func”:”//msg.payload=msg.req.body;nreturn msg;”,”outputs”:1,”noerr”:0,”x”:376.3999938964844,”y”:758.4000244140625,”wires”:[[“1b2a65be.0e83b2”]]},{“id”:”99ad85de.aab01″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Message over HTTP”,”info”:””,”x”:126.39999389648438,”y”:706.4000244140625,”wires”:[]},{“id”:”55ab6bb0.f57ffc”,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Extract Recording/Stream from TCP”,”func”:”nreturn msg;”,”outputs”:1,”noerr”:0,”x”:376.3999938964844,”y”:1012.4000244140625,”wires”:[[“1b2a65be.0e83b2”]]},{“id”:”1ed936a6.4241b1″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Tier2-N Classifiers”,”info”:”These classifiers are used until the levelnof detail is 
sufficient to take actions.nTypically there are no more than three if ndesigned appropriately.”,”x”:1802.39990234375,”y”:230.39999389648438,”wires”:[]},{“id”:”6b9a4732.76af8″,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Placeholder”,”func”:”nreturn msg;”,”outputs”:1,”noerr”:0,”x”:1786.39990234375,”y”:260.3999938964844,”wires”:[[“71e618.ed8fd9e8″,”362bc9e0.a3065e”,”6e18d196.152c78″,”1e4d94bd.552543″]]},{“id”:”71e618.ed8fd9e8″,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Command/Action”,”func”:”//prep message for IoT Foundation herenreturn msg;”,”outputs”:1,”noerr”:0,”x”:2166.39990234375,”y”:260.3999938964844,”wires”:[[“1f0e9a6a.5aba76”]]},{“id”:”392009c6.c3dc76″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Possible Outputs “,”info”:”Each of these example outputs can be configurednto be triggered after the classifiers are nselected. They will need to be changed to fitnyour application.nnIf you application needs conversation, these ncan have a question to the user posed over textnor speaker, and a new branch of classifiers fornthe user's response.n”,”x”:2164.39990234375,”y”:66.39999389648438,”wires”:[]},{“id”:”9971cba6.e0c768″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Command to IoT”,”info”:””,”x”:2158.39990234375,”y”:212.39999389648438,”wires”:[]},{“id”:”1f0e9a6a.5aba76″,”type”:”ibmiot out”,”z”:”3f44d95b.e1bca6″,”authentication”:”quickstart”,”outputType”:””,”deviceId”:””,”deviceType”:””,”eventCommandType”:””,”format”:””,”data”:””,”name”:”IBM IoT”,”service”:””,”x”:2390.39990234375,”y”:260.3999938964844,”wires”:[]},{“id”:”457f2e23.0922a8″,”type”:”ibmpush”,”z”:”3f44d95b.e1bca6″,”name”:””,”ApplicationID”:””,”ApplicationRoute”:””,”identifiers”:””,”notification”:”broadcast”,”x”:2384.39990234375,”y”:448.3999938964844,”wires”:[]},{“id”:”4d55ff41.20bc8″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”IBM Push Notifications”,”info”:””,”x”:2178.39990234375,”y”:344.3999938964844,”wires”:[]},{“id”:”362bc9e0.a3065e”,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Prep Push”,”func”:”msg.url=”insert url here”;nmsg.payload=”your message goes here”;nreturn msg;”,”outputs”:1,”noerr”:0,”x”:2148.399993896484,”y”:448.39999389648426,”wires”:[[“457f2e23.0922a8”]]},{“id”:”dfc3878b.766b08″,”type”:”websocket out”,”z”:”3f44d95b.e1bca6″,”name”:””,”server”:”27b68ad3.846706″,”client”:””,”x”:2446.39990234375,”y”:608.4000244140625,”wires”:[]},{“id”:”6e18d196.152c78″,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:”Prep Message for Websocket”,”func”:”msg.payload=”your message goes here”nreturn msg;”,”outputs”:1,”noerr”:0,”x”:2206.39990234375,”y”:608.4000244140625,”wires”:[[“dfc3878b.766b08”]]},{“id”:”4b52504c.243df8″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”Websocket output”,”info”:””,”x”:2168.39990234375,”y”:542.4000244140625,”wires”:[]},{“id”:”2bbd4ea9.89b02a”,”type”:”http response”,”z”:”3f44d95b.e1bca6″,”name”:””,”x”:2396.39990234375,”y”:742.4000244140625,”wires”:[]},{“id”:”a4cd5e35.3cbaf8″,”type”:”comment”,”z”:”3f44d95b.e1bca6″,”name”:”HTTP Message “,”info”:””,”x”:2160.39990234375,”y”:692.4000244140625,”wires”:[]},{“id”:”1e4d94bd.552543″,”type”:”function”,”z”:”3f44d95b.e1bca6″,”name”:””,”func”:”msg.payload=”Body of response goes here”;nmsg.headers=”headers go here”;nmsg.statusCode=”if applicable put status code here”;nreturn msg;”,”outputs”:1,”noerr”:0,”x”:2140.39990234375,”y”:742.4000244140625,”wires”:[[“2bbd4ea9.89b02a”]]}]

    Once imported, the flow contains the input branches on the left, the Global and Tier1 classifier subflows in the middle, and the example outputs on the right.

  3. How to use the General Architecture

    General Architecture

    Inputs

    The inputs should be defined by your use case, but starting points have been provided to facilitate prototyping. There are other ways to send input; however, these are the most common. Each will need to be configured to work with the devices that you intend to use.
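
    As a concrete starting point, the "Extract Assumptions From IoT" function node in the sample flow only needs a couple of lines once you know what your device publishes. A minimal sketch, assuming a hypothetical device event of the form { "text": "...", "room": "kitchen" } (the property names are placeholders, not part of the original flow):

        // Pull out the text to classify and keep any useful device metadata.
        // "text" and "room" are assumed field names; match them to your device.
        msg.assumptions = { room: msg.payload.room };  // optional metadata for later in the flow
        msg.payload = msg.payload.text;                // the string handed to the classifier tree
        return msg;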

    Classifier Tree

    The classification of transcribed speech is done by analyzing the natural language through multiple tiers.

    Each step in the process represents a classifier on the NLC Bluemix service (one service, many classifiers). For most applications two tiers are enough to reach actionable information, but the tree can grow deeper depending on the application. The number of outputs should be changed to match the number of classes; this can be done after you have built your global classifier on the Bluemix service.

    This architecture makes each step simple to solve, and in turn provides high accuracy with limited training data. This is most appropriate when we know the interactions that we want to support between user and device.
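
    For orientation, the "Classification Extractor" function nodes inside the subflows do little more than pull the winning class out of the NLC response (already parsed by the preceding "NLC to JSON" node) and put the original utterance back into the payload for the next tier:

        // Body of the Global Classification Extractor function node.
        msg.globalclass = msg.payload.top_class;  // the class the Global Switch node routes on
        node.warn(msg.globalclass);               // shows the chosen class in the debug tab
        msg.payload = msg.inputtext;              // restore the raw text for the next classifier
        return msg;

    The tier 1 extractor is identical except that it stores the result as msg.tier1.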

    You will need to configure each of the nodes in this flow to fit with the classifier that you built.
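
    In particular, the http request node inside each subflow needs your own classifier ID in its URL (the sample flow leaves it blank) along with your NLC service credentials, and each switch node needs one rule per class in place of the "Insert" / "Your" / "Classifications" / "Here" placeholders. Because the sample flow passes the text to NLC as a query parameter, you may also want a small function node in front of the request to make the text URL-safe; this is an optional sketch, not part of the original flow:

        // Optional: URL-encode the utterance before it is substituted into the
        // http request node's ?text="{{{payload}}}" query string.
        msg.inputtext = msg.inputtext || msg.payload;   // keep the raw text around
        msg.payload = encodeURIComponent(msg.payload);  // make it safe for the query string
        return msg;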

    For more information on how to build a classifier with the NLC service feel free to visit:

    https://www.ibm.com/smarterplanet/us/en/ibmwatson/developercloud/nl-classifier.html

    When building your classifier tree, remember that each category should be obvious, so that each individual classifier stays small and accurate. Your classification categories should be as different from each other as possible.

    As an example, a global classification could be food, a tier 1 classification could be meats, and a tier 2 classification could be fish. After the tier 2 classification is identified, we can parse the string for words like tuna or salmon. Thus, in three layers, machine and human can speak the same language. More tiers mean more specificity, but also more outcomes.
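
    A minimal sketch of that final parsing step, assuming the raw utterance is still carried in msg.inputtext as in the sample flow (the keyword list is only an illustration):

        // After a tier 2 classification of "fish", look for specific items.
        var text = (msg.inputtext || "").toLowerCase();
        var keywords = ["tuna", "salmon"];           // extend for your own domain
        msg.keyword = null;
        for (var i = 0; i < keywords.length; i++) {
            if (text.indexOf(keywords[i]) !== -1) {
                msg.keyword = keywords[i];
                break;
            }
        }
        return msg;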

    Outputs

    These are a few of many possible outputs. Each demonstrates the capacity of Bluemix and Node-RED to facilitate communication from human to device to cognitive cloud. Like the inputs, they will need to be configured for your environment and devices.
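
    For example, the function node feeding the http response output only has to shape the reply; a minimal sketch (the response fields shown are placeholders, not prescribed by the flow):

        // Build an HTTP reply from whatever the classifier tree produced.
        msg.statusCode = 200;
        msg.headers = { "Content-Type": "application/json" };
        msg.payload = {
            globalclass: msg.globalclass,   // set by the Global Classifier subflow
            tier1: msg.tier1,               // set by the Tier1 Classifier subflow
            text: msg.inputtext             // the original utterance
        };
        return msg;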

4 comments on "General Architecture for Voice Interaction Quickstart guide (outdated)"

  1. Import from Clipboard in Node-RED did not work – I think there is an issue with the JSON, since I also tried it in an editor that formatted the JSON and it was not formatted correctly.

  2. @kwiatks: Did you get this import to work?
    I am also having this problem of Node-RED not accepting the cut and paste of the above code.

  3. The very first node listed in the code above, "websocket-listener", is not available in the Node-RED editor.
    I used the websocket input node, then 'exported' and 'imported' the code with websocket-in changed to websocket-listener. Listed below is what the Node-RED editor shows in the info pane.

    What is the solution for this?

    ——————————————–
    Name websocket listener
    Type unknown
    ID 83ef25df.7a2c98
    Properties
    This node is a type unknown to your installation of Node-RED.

    If you deploy with the node in this state, its configuration will be preserved, but the flow will not start until the missing type is installed.

    It is possible this node type is already installed, but is missing a dependency. Check the Node-RED start-up log for any error messages associated with the missing node type. Use npm install to install any missing modules and restart Node-RED and reimport the nodes.

    Otherwise, you should contact the author of the flow to obtain a copy of the missing node type
