What is the Microsoft Bot Framework?
The Microsoft Bot Framework is a set of APIs that simplifies the process of creating a chat bot using C#. The framework sets up a web service or Azure Function that lets a bot interact with a user via chat. It includes several features for integrating chat bots with Microsoft Teams, Skype, and other Microsoft products. While the framework is geared toward Teams integration, it can also be used as a standalone bot system that controls user flow, group chats, and more.
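To make the web-service part concrete, here is a rough sketch of the controller the bot templates generate for you. The names below mirror the standard template, but treat this as illustrative rather than the exact generated file:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.Integration.AspNet.Core;

//every chat message arrives as an HTTP POST to this single endpoint
[Route("api/messages")]
[ApiController]
public class BotController : ControllerBase
{
    private readonly IBotFrameworkHttpAdapter adapter;
    private readonly IBot bot;

    public BotController(IBotFrameworkHttpAdapter adapter, IBot bot)
    {
        this.adapter = adapter;
        this.bot = bot;
    }

    [HttpPost]
    public async Task PostAsync()
    {
        //the adapter deserializes the incoming activity JSON and routes it to the bot
        await adapter.ProcessAsync(Request, Response, bot);
    }
}
```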
Bots exchange JSON messages with the web service, and clients like Teams will recognize certain JSON objects, such as cards, and render them natively without any custom styling. The framework also supports conversation flow through a feature called dialogs, which manages a user's conversation state and enables more complex interactions like booking a hotel room or implementing a pizza-ordering system.
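To give a feel for dialogs (we will not need them for this demo), here is a minimal sketch of a two-step waterfall dialog. The class name and prompt text are made up for illustration, and it assumes the Microsoft.Bot.Builder.Dialogs package:

```csharp
using System.Threading;
using System.Threading.Tasks;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Builder.Dialogs;

public class OrderDialog : ComponentDialog
{
    public OrderDialog() : base(nameof(OrderDialog))
    {
        //steps run in order; each step can prompt the user and read the previous answer
        AddDialog(new TextPrompt(nameof(TextPrompt)));
        AddDialog(new WaterfallDialog(nameof(WaterfallDialog), new WaterfallStep[]
        {
            AskSizeStepAsync,
            ConfirmStepAsync
        }));
        InitialDialogId = nameof(WaterfallDialog);
    }

    private static async Task<DialogTurnResult> AskSizeStepAsync(WaterfallStepContext step, CancellationToken token)
    {
        //ask the first question and wait for a reply
        return await step.PromptAsync(nameof(TextPrompt),
            new PromptOptions { Prompt = MessageFactory.Text("What size pizza would you like?") }, token);
    }

    private static async Task<DialogTurnResult> ConfirmStepAsync(WaterfallStepContext step, CancellationToken token)
    {
        //the answer from the previous step is available on step.Result
        string size = (string)step.Result;
        await step.Context.SendActivityAsync(MessageFactory.Text($"Ordering a {size} pizza."), token);
        return await step.EndDialogAsync(null, token);
    }
}
```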
Create a Bot in 5 Minutes
This example builds on a previous post where we developed an NLP FAQ application. If you have not read the NLP example, please do so before beginning, since we will reuse much of its codebase. The NLP example can be found here:
https://fiveminutecoder.blogspot.com/2020/08/using-mlnet-to-create-natural-language.html. Before creating our chatbot, we need to get the bot templates. There are several starting templates to choose from; for this example we will use the echo bot. This template simply writes back whatever you type into the chat, which is perfect for our demo since we do not need any dialog flow. To install the template and create the project, run the following commands.
dotnet new -i Microsoft.Bot.Framework.CSharp.EchoBot
dotnet new echobot -n BotFrameworkFAQBot
Once we have our project set up, we need to install the NuGet package Microsoft.ML, which allows us to use ML.NET to process our questions and predict an answer:
dotnet add package Microsoft.ML
From the NLP example we will want to bring over several items. The first is our trained machine learning model, which we saved earlier as FAQModel.zip. We will also need our prediction data model and our FAQ data model.
using Microsoft.ML.Data;

namespace EchoBot.Bots
{
    public class FAQModel
    {
        [ColumnName("Question"), LoadColumn(0)]
        public string Question { get; set; }

        [ColumnName("Answer"), LoadColumn(1)]
        public string Answer { get; set; }
    }
}
using Microsoft.ML.Data;

namespace EchoBot.Bots
{
    public class PredictionModel
    {
        [ColumnName("PredictedAnswer")]
        public string PredictedAnswer { get; set; }

        [ColumnName("Score")]
        public float[] Score { get; set; }
    }
}
With our data models moved over from the previous project, we can rename EchoBot.cs to FAQBot.cs. This will break the dependency injection set up by our service, so we need to go to Startup.cs and change services.AddTransient&lt;IBot, Bots.EchoBot&gt;() to services.AddTransient&lt;IBot, Bots.FAQBot&gt;()
// This method gets called by the runtime. Use this method to add services to the container.
public void ConfigureServices(IServiceCollection services)
{
    services.AddControllers().AddNewtonsoftJson();

    // Create the Bot Framework Adapter with error handling enabled.
    services.AddSingleton<IBotFrameworkHttpAdapter, AdapterWithErrorHandler>();

    // Create the bot as a transient. In this case the ASP Controller is expecting an IBot.
    services.AddTransient<IBot, Bots.FAQBot>();
}
Our machine learning model is pretrained, so since this information is static we will create a single static instance of our prediction engine. This way we do not have to read the model file from disk every time we want to predict an answer. (One caveat: ML.NET's PredictionEngine is not thread-safe, so for a production web service Microsoft recommends the PredictionEnginePool service instead; a static instance is fine for this demo.)
//context for our machine learning model
static MLContext context;

//used to read our questions and predict answers
static PredictionEngine<FAQModel, PredictionModel> predictionEngine;

//static constructor so the model is loaded only once.
//Our model is not changing, so for performance it doesn't make sense to keep opening and reading the zip.
static FAQBot()
{
    //structure of our data model
    DataViewSchema modelSchema;

    //the model loaded for prediction
    ITransformer trainedModel;

    context = new MLContext();

    //load our model file
    using(Stream s = File.Open("FAQModel.zip", FileMode.Open))
    {
        trainedModel = context.Model.Load(s, out modelSchema);
    }

    //create our prediction engine
    predictionEngine = context.Model.CreatePredictionEngine<FAQModel, PredictionModel>(trainedModel);
}
Now that our constructor and class fields are in place, all we need to do is call the Predict function on our question and display the output. We want to ensure some confidence in our predictions, so we will check the score and only display the answer if the prediction is over 60% confident. If not, we will give the user a list of questions to choose from.
protected override async Task OnMessageActivityAsync(ITurnContext<IMessageActivity> turnContext, CancellationToken cancellationToken)
{
    //creates our FAQ model from the incoming message
    FAQModel question = new FAQModel()
    {
        Question = turnContext.Activity.Text, //gets the text sent to the bot
        Answer = ""
    };

    //uses our trained model to predict our answer
    PredictionModel prediction = FAQBot.predictionEngine.Predict(question);

    //accuracy of the prediction (Max() requires a using System.Linq; directive)
    float score = prediction.Score.Max() * 100;

    //check how accurate we were; if below the threshold, ask the user to clarify
    if(score > 60)
    {
        //sends our answer back to the user
        await turnContext.SendActivityAsync(MessageFactory.Text(prediction.PredictedAnswer), cancellationToken);
        await turnContext.SendActivityAsync(MessageFactory.Text($"We think our answer to your question is this accurate: {score}%"), cancellationToken);
    }
    else
    {
        //sends suggestions that are clickable
        await turnContext.SendActivityAsync(MessageFactory.Text("Sorry, we didn't understand the question. Please try selecting a question below."), cancellationToken);
        string[] actions = {"What are your hours?", "How can I reach you?", "What payments do you accept?"};
        await turnContext.SendActivityAsync(MessageFactory.SuggestedActions(actions), cancellationToken);
    }
}
Our chatbot is now finished. SendActivityAsync lets us send a message back to the user; it can be called at any time during processing, which is helpful for long-running operations. Microsoft also provides a set of activity helpers, found in MessageFactory, for interacting with users. We can easily send images, cards, or attachments using the different methods found in the factory.
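For example, a hero card, one of the card types Teams renders natively, can be sent through MessageFactory.Attachment. This sketch assumes the same turnContext and cancellationToken as in the handler above, and the card text is just illustrative:

```csharp
using System.Collections.Generic;
using Microsoft.Bot.Builder;
using Microsoft.Bot.Schema;

//build a card with a title, some text, and a clickable button
HeroCard card = new HeroCard
{
    Title = "FAQ Bot",
    Text = "Click below to ask another question.",
    Buttons = new List<CardAction>
    {
        new CardAction(ActionTypes.ImBack, "Ask again", value: "What are your hours?")
    }
};

//wrap the card in an attachment activity and send it back to the user
await turnContext.SendActivityAsync(MessageFactory.Attachment(card.ToAttachment()), cancellationToken);
```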
Testing the Chatbot
With our bot endpoint setup, we now need to test it out. Microsoft provides a tool for emulating a bot system which you can find here:
https://github.com/microsoft/BotFramework-Emulator/releases. The emulator simulates a Teams chat so you can see how the responses will look in Teams and other Microsoft products. Download the latest version and run the application. Once the emulator is running, connect to your bot at the URL http://localhost:{port}/api/messages. You should see a successful connection and the message "Hello and welcome!" This comes from our bot's "OnMembersAddedAsync" function found in the FAQBot.cs file. The final step is to ask the bot a question and test out the functionality.
Clone the project