Watson Assistant services on IBM Cloud are a set of REST APIs, which makes them straightforward to use as one piece of a larger application. It also means they need to be integrated with the other parts of the solution so that your users can interact with your instance of Watson.
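To make that concrete, here is a minimal sketch of calling the Assistant v1 message endpoint directly over REST. The workspace ID, credentials and API version date below are placeholders; substitute the values from your own service instance.

```python
# A minimal sketch of calling the Watson Assistant v1 message endpoint
# directly over REST. The workspace ID, credentials and version date are
# placeholders; use the values from your own service instance.
import requests

WORKSPACE_ID = "YOUR_WORKSPACE_ID"
URL = ("https://gateway.watsonplatform.net/assistant/api/v1/workspaces/"
       + WORKSPACE_ID + "/message")

response = requests.post(
    URL,
    params={"version": "2018-02-16"},           # API version date
    auth=("YOUR_USERNAME", "YOUR_PASSWORD"),    # service credentials
    json={"input": {"text": "Hello Watson"}},   # the user's utterance
)
print(response.json()["output"]["text"])        # Assistant's reply text
```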

While we have published a number of assets to assist in this effort, such as the Watson SDKs, the API reference, sample applications and user-contributed GitHub repos, many of our users still ask how to integrate with specific external systems. If you are not yet familiar with the Watson APIs, you can sign up for IBM Cloud to try them out.

A general outline for integration

While it would be impossible for us to provide instructions for every integration point on the planet, there is a general outline that nearly all integrations follow, and it should set developers up for success when they need to integrate with something new.

There are essentially three major components to a solution.

[Image: Sample architecture diagram for integrations]

The left-hand side of the solution is typically the front end, or channel. This could be a webpage or an application window where the user types questions, responses are shown, images are displayed and so on. It may be a messaging channel, an embedded chat widget, a mobile app or even SMS messaging.

A Watson Assistant service provides the brains behind the interaction. Watson Assistant takes the inputs, understands them and drives what happens next. That can be as simple as displaying a response, disambiguating what's being asked, playing a video or using a multi-modal interaction like showing a map, or it can be something more complex like reading from or writing to a database, or even calling an enterprise service.

These first two pieces are typically fairly standard. The left-hand side can be customized if you are using your own website, but existing messaging channels like Slack or Facebook can't be customized; you can only connect to them. The right-hand side can be trained on your content, but the interaction follows a defined structure. Depending on the content source or type, you may need some data transformation or connectivity patterns.

The varying application layer

The middle layer is the application layer, and it is typically the piece that varies the most. If you look at any of our sample applications, you will see there is one job the middle layer must accomplish: passing information from the left side to the right side, including system context, and passing it back from right to left to carry the conversation. It is simply a translation layer moving data from one side to the other and back.
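As a sketch, assuming the middle layer talks to the message endpoint shown earlier, that whole job fits in one small function: send the user's text with the current context, hand back the reply plus the updated context, and feed that context into the next turn.

```python
# A minimal translation layer: send the user's text plus the conversation
# context to Watson Assistant, and carry the returned context into the
# next turn. URL and AUTH are the placeholder endpoint and credentials
# from the first snippet.
import requests

URL = ("https://gateway.watsonplatform.net/assistant/api/v1/workspaces/"
       "YOUR_WORKSPACE_ID/message")
AUTH = ("YOUR_USERNAME", "YOUR_PASSWORD")

def converse(user_text, context=None):
    """One conversation turn: returns (reply text, updated context)."""
    payload = {"input": {"text": user_text}}
    if context:
        payload["context"] = context              # state carried between turns
    result = requests.post(URL, params={"version": "2018-02-16"},
                           auth=AUTH, json=payload).json()
    reply = " ".join(result["output"]["text"])    # output text may be a list
    return reply, result["context"]

# Each turn feeds the returned context back in to continue the conversation:
reply, ctx = converse("I'd like to order a pizza")
reply, ctx = converse("Large, pepperoni please", context=ctx)
```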

Where this gets more complex is when you have additional integrations you want to work with. Let's say you want to add Tone Analyzer so that you have an empathetic chatbot. We typically call this pre-processing because it happens before calling Watson Assistant: your application takes the user input, runs it through the pre-processor, in this case to get the tone of the user's statement, and then attaches the result as context before passing the input on to Assistant.
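A minimal sketch of that pre-processing step, assuming the Tone Analyzer v3 tone endpoint with placeholder credentials; the `user_tones` context variable is a name made up here for illustration.

```python
# Pre-processing sketch: detect the tone of the user's statement with
# Tone Analyzer, then attach it as context before calling Assistant.
# The version date, credentials and the "user_tones" context variable
# are assumptions for illustration.
import requests

TONE_URL = "https://gateway.watsonplatform.net/tone-analyzer/api/v3/tone"
TONE_AUTH = ("TONE_USERNAME", "TONE_PASSWORD")

def detect_tones(text):
    """Return the tone IDs (e.g. 'anger', 'joy') detected in the text."""
    result = requests.post(TONE_URL, params={"version": "2017-09-21"},
                           auth=TONE_AUTH, json={"text": text}).json()
    return [t["tone_id"] for t in result["document_tone"]["tones"]]

user_text = "This is the third time my order has arrived cold!"
context = {"user_tones": detect_tones(user_text)}  # e.g. ["anger"]
# Pass this context to Assistant (as in the earlier snippet) so dialog
# nodes can branch on $user_tones.
```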

The counterpart to pre-processing is a post-processing step, where the logic necessary to respond to the user's query resides; it happens after calling Watson Assistant but before returning a response to the front end. In the Assistant with Discovery example, we illustrate using Watson Discovery Service as a post-processing service.
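As a sketch of that pattern, suppose a dialog node sets a hypothetical `call_discovery` action when Assistant has no direct answer; the application then falls back to a Discovery natural-language query. The environment and collection IDs, credentials and the action name are all assumptions.

```python
# Post-processing sketch: when Assistant signals (via a hypothetical
# "call_discovery" action) that it has no direct answer, fall back to a
# Watson Discovery natural-language query over your document collection.
import requests

DISC_URL = ("https://gateway.watsonplatform.net/discovery/api/v1"
            "/environments/ENV_ID/collections/COLL_ID/query")
DISC_AUTH = ("DISC_USERNAME", "DISC_PASSWORD")

def search_discovery(question, count=3):
    """Return the text of the top documents matching the question."""
    params = {"version": "2018-03-05",
              "natural_language_query": question,
              "count": count}
    result = requests.get(DISC_URL, auth=DISC_AUTH, params=params).json()
    return [doc.get("text", "") for doc in result.get("results", [])]

def post_process(assistant_response, user_text):
    """Route to Discovery when the dialog asks for it."""
    if assistant_response.get("output", {}).get("action") == "call_discovery":
        return search_discovery(user_text)
    return " ".join(assistant_response["output"]["text"])
```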

Another use case might be writing information to a database. Let's say a user orders a large pepperoni pizza. Your application would potentially need to make two callouts: the first would place the order in your POS system to actually get them the pizza, and the second might write the order to a database. That way, the next time the user logs in, they could simply say, "Order my usual," or something similar. Watson Assistant would typically return an action tag as documented here, along with some text. Your application could carry out the activities as defined, then show a message such as, "I'll remember that's your favorite, and it's on the way. Thank you for your order."
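Here is a sketch of acting on such a tag. The `place_order` action name and the POS and database helpers are hypothetical; your own dialog nodes define the tags your application watches for, and your application defines what each one does.

```python
# Sketch of acting on an action tag in the Assistant response. The
# "place_order" action and the helper functions are hypothetical; your
# dialog defines the actual tags and your application defines the actions.
def handle_response(assistant_response):
    output = assistant_response.get("output", {})

    if output.get("action") == "place_order":
        order = assistant_response.get("context", {}).get("order", {})
        submit_to_pos(order)      # callout 1: place the order in the POS system
        save_favorite(order)      # callout 2: remember it for "order my usual"

    return " ".join(output.get("text", []))

def submit_to_pos(order):
    ...  # POST the order to your point-of-sale system

def save_favorite(order):
    ...  # write the order to a database, keyed by the user
```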

We publish various samples for things such as connecting to channels using Botkit or using our new serverless architecture built on IBM Cloud Functions. We have demos for connecting to Tone Analyzer, Natural Language Understanding and Discovery, but these are just samples; you will probably find other unique and powerful things to integrate with your virtual agent. Use the pattern established above to swap in a new front end or messaging channel that we may not support out of the box. Using actions, you can call out to other services to enrich your conversation or let users actually complete activities through post-processing. You can also add pieces alongside Assistant to make your Watson solution more powerful.

This article was originally published on Medium.

Additional credit to Laksh Krishnamurthy for his contributions to the content and diagram.
