Who am I?
Let’s break down the word ‘Chatbot’, beginning with ‘bot’. Technically, a bot (short for robot) is an automated program that runs over the internet. Some bots run automatically, while others execute commands only when they receive specific input.
A chatbot is a kind of bot that looks for certain patterns in the text or voice messages submitted by users and responds with an automated action.
A chatbot can talk to you through different channels, such as Facebook Messenger, Siri, WeChat, Telegram, SMS, Slack, Skype and many others. Nowadays, Amazon Alexa, Google Home and Microsoft Cortana are examples of assistants that even respond with automated actions and voice messages.
It’s also important to note that consumers now spend more time in messaging applications than on social media. That’s why companies deliver chatbot experiences to consumers through messaging applications.
Many businesses want to build their own bot, but when they decide to develop one, they get confused about which platform to choose. Plenty of chatbot development platforms are available, which makes the decision difficult for a business unit.
Below are a few frameworks commonly used to build chatbots:
1. IBM Watson Conversation
2. Microsoft Bot Framework
3. Amazon Lex
4. RASA Stack
Today, we are going to discuss the RASA Stack chatbot framework.
RASA Stack — A chatbot Solution
The Rasa Stack is a pair of open source libraries (Rasa NLU and Rasa Core) that allow developers to expand chatbots and voice assistants beyond answering simple questions.
Using state-of-the-art machine learning, bots can hold contextual conversations with users. Rasa is production-ready and used by large companies worldwide.
Rasa NLU: A natural language understanding solution which takes the user input and tries to infer the intent and extract the available entities.
Rasa Core: A dialogue management solution that builds a probabilistic model to decide which set of actions to perform based on the previous user inputs.
Rasa NLU’s job is to interpret messages, and Rasa Core’s job is to decide what should happen next.
Rasa NLU performs Natural Language Understanding, which means taking free-form text like Please send the confirmation to firstname.lastname@example.org and turning it into structured data. Rasa Core performs Dialog Management, which means keeping track of a conversation, and deciding how to proceed. Both Rasa Core and NLU use Machine Learning to learn from real example conversations.
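To make “structured data” concrete, here is a hand-written sketch of the kind of parse result Rasa NLU produces for the example message above. The intent name and confidence values are made up for illustration; the field layout follows the Rasa NLU 0.x result format.

```python
# A hand-written sketch of a Rasa NLU parse result for
# "Please send the confirmation to firstname.lastname@example.org".
# The intent name and confidence values are illustrative, not real output.
parsed = {
    "text": "Please send the confirmation to firstname.lastname@example.org",
    "intent": {"name": "request_confirmation", "confidence": 0.91},
    "entities": [
        {"entity": "email",
         "value": "firstname.lastname@example.org",
         "start": 32, "end": 62}
    ],
}

# Downstream code (e.g. Rasa Core) reads the intent name and the entities:
print(parsed["intent"]["name"])        # request_confirmation
print(parsed["entities"][0]["value"])  # firstname.lastname@example.org
```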
This Weatherbot with Slack walkthrough is based on Justina Petraitytė’s blog post From zero to hero: Creating a chatbot with Rasa NLU and Rasa Core, where she explains in detail how to create a weatherbot from scratch in an approximately two-hour video.
She has also provided a GitHub link to the source code for further reference.
Weatherbot with Slack — Walk Through:
Now that we understand the basic architecture of the RASA Stack, let’s create a weatherbot with Slack as the user interface, which will tell us the weather conditions for a city.
Whenever we give input text about the weather along with a city name, the chatbot should understand the text and identify the city. The city is then passed to an API, which returns the temperature, humidity and wind speed for that city.
I have divided the creation of the weatherbot into 4 different levels:
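The flow described above can be sketched in a few lines: pick the city out of a Rasa-NLU-style parse result and turn it into query parameters for a weather API. The function name and the query-parameter shape are hypothetical placeholders, not the actual API used in the tutorial.

```python
# A minimal sketch of the flow above: extract the 'location' entity from a
# Rasa-NLU-style parse result and build the query for a weather API.
# build_weather_query and the {"q": ...} parameter shape are hypothetical.
def build_weather_query(parsed):
    """Return weather-API query params for the extracted city, or None."""
    for ent in parsed.get("entities", []):
        if ent["entity"] == "location":
            return {"q": ent["value"]}
    return None

# Example: the kind of parse result Rasa NLU would hand us.
parsed = {"intent": {"name": "inform"},
          "entities": [{"entity": "location", "value": "Bengaluru"}]}
print(build_weather_query(parsed))  # {'q': 'Bengaluru'}
```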
LEVEL 1: Creating a simple program using RASA NLU to get an understanding of defining intents and entities.
LEVEL 2: Creating an intermediate program using RASA Core and its dialogue management model, where the chatbot interacts at the command prompt.
LEVEL 3: Integrating the RASA Core dialogue management model with the Slack Messenger user interface.
LEVEL 4: Understanding spaCy and the different algorithms underneath RASA NLU and RASA Core.
LEVEL 1: Creating a simple program using RASA NLU to get an understanding of defining intent and entities:
1. Installation of Libraries: List all the libraries required for creating a chatbot with RASA NLU.
Create a requirements.txt file listing the required libraries along with their versions.
Command: pip install -r requirements.txt
During the installation of libraries, multiple issues might occur due to operating system (Windows, Mac, Linux) compatibility and default system configuration.
For example: C++ compiler issues, NumPy version conflicts, etc. It is good practice to search for the error message and find a resolution so that the installation of all libraries completes.
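For reference, a requirements.txt file of this kind might look like the sketch below. The exact package list and pinned versions should be taken from the tutorial’s GitHub repository; the versions here are only illustrative of the Rasa 0.x era the walkthrough uses.

```
# requirements.txt — illustrative sketch only; take the exact package
# list and pinned versions from the tutorial's GitHub repository
rasa_nlu==0.12.3
rasa_core==0.9.6
spacy==2.0.11
sklearn-crfsuite==0.3.6
```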
2. Download the spaCy language model to parse text messages and extract the necessary information.
Command: python -m spacy download en
3. Install npm and then RASA NLU Trainer.
Command: npm i -g rasa-nlu-trainer
4. The RASA NLU Trainer has a cool UI which makes generating training data for the NLU model a lot easier. We can add any number of intents and example texts to the data.json file, which will be used for training the model. If we need to take an action based on user input, we can also define an entity by selecting the relevant word.
For example: “What’s the weather in Berlin now?” Here, we highlight “Berlin” and give this entity a name like location, so that from then on RASA NLU understands the same kind of context and marks other city names as the location entity.
· How’z the weather in Ranchi?
· Can you please tell weather condition in Bengaluru?
Here, RASA NLU automatically interprets Ranchi and Bengaluru as the location entity.
5. Creating the config_spacy.json file: This file is required as part of the RASA NLU configuration; a detailed explanation is given in LEVEL 4. For now, you need to understand that it defines how the necessary information is extracted, provides the path of data.json, and specifies where the generated NLU model files are stored.
pipeline defines which features we are going to use to crunch messages and extract the necessary information. Details of the pipeline are explained later.
path is the directory where we keep the model after it is trained.
data is the directory of the data.json file, created using rasa-nlu-trainer, in which we define intents and entities.
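Putting these three settings together, a minimal config_spacy.json in the Rasa NLU 0.x JSON configuration format might look like the following (spacy_sklearn is one of Rasa NLU’s predefined pipelines; the paths match those used in this walkthrough):

```
{
  "language": "en",
  "pipeline": "spacy_sklearn",
  "path": "./models/nlu",
  "data": "./data/data.json"
}
```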
6. RASA NLU TRAINER:
· Creation of the data.json file: Create a new, empty data.json file at data/data.json.
· Open the data.json file from rasa-nlu-trainer.
· Specify the proper text, intents and entities.
data.json: You can create this file manually first and then add the contents through rasa-nlu-trainer, or edit it entirely by hand. This file is in JSON format and consists mainly of key-value pairs. The following keys are mainly used in this file:
· text: The user input is mentioned in this key.
· intent: Captures what the bot should respond to based on the text. Intents can be divided into categories like greet, goodbye, inform, positive, negative, status, etc. If a user says, “What’s the weather in Bengaluru?”, the intent is to find the weather details for Bengaluru.
· entity: The useful information extracted from the user input. From the previous example, the intent tells us the aim is to find the weather, but for which city? If we extract “Bengaluru” as the entity, we can act on the inform intent.
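A minimal data.json combining these keys, in the Rasa NLU training-data format (rasa_nlu_data / common_examples), might look like the sketch below; the example sentences are illustrative, and entity offsets are character positions into the text:

```
{
  "rasa_nlu_data": {
    "common_examples": [
      {
        "text": "What's the weather in Berlin now?",
        "intent": "inform",
        "entities": [
          {"start": 22, "end": 28, "value": "Berlin", "entity": "location"}
        ]
      },
      {"text": "Hello there", "intent": "greet", "entities": []},
      {"text": "Bye bye", "intent": "goodbye", "entities": []}
    ]
  }
}
```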
7. Training the ML Model: In this step, the loaded training data is fed into an NLP pipeline and converted into an ML model.
· Now, it’s time for execution. We have already created the data.json (./data/data.json) and config_spacy.json files. We are going to use these files to train our model and store the model files in the ./models/nlu folder.
· First, we need to run the train_nlu(‘./data/data.json’, ‘config_spacy.json’, ‘./models/nlu’) function, which persists (stores) the model information into the mentioned path ./models/nlu.
· Second, we need to comment out the train_nlu call, uncomment run_nlu(), and execute the Python file with the command given below.
· In run_nlu we have mentioned a text message which RASA NLU interprets using data.json and config_spacy.json; it returns the different intents along with their confidence levels, so we can see how RASA NLU calculates accuracy and confidence.
· OUTPUT: The output below shows what happens when we execute python nlu_model.py with run_nlu() enabled. RASA NLU interprets the text message “I am planning my holiday to Bercelona. I wonder what is the weather out there.” and calculates the confidence level for each of the intents we defined in data.json:
o The inform intent has the highest confidence, at 73%.
o The greet intent has a confidence of 16%.
o The goodbye intent has a confidence of 11%.
· In the end, RASA NLU classifies the intent as inform, with entity value ‘bercelona’ and entity type location.
Command: python nlu_model.py
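The train_nlu/run_nlu workflow described above can be sketched as an nlu_model.py along the following lines. This follows the Rasa NLU 0.11-era API used in the original tutorial (newer Rasa versions have a different API), and the imports are kept inside the functions so the sketch can be read even without rasa_nlu installed; refer to the tutorial’s GitHub repository for the exact code.

```python
# nlu_model.py — a sketch of the train/run workflow described above,
# following the Rasa NLU 0.11-era API used in the original tutorial.

def train_nlu(data, configs, model_dir):
    """Train an NLU model from data.json and persist it under model_dir."""
    from rasa_nlu.converters import load_data
    from rasa_nlu.config import RasaNLUConfig
    from rasa_nlu.model import Trainer

    training_data = load_data(data)
    trainer = Trainer(RasaNLUConfig(configs))
    trainer.train(training_data)
    # persist() writes the trained model files into model_dir
    trainer.persist(model_dir, fixed_model_name="weathernlu")

def run_nlu():
    """Load the persisted model and parse a sample message."""
    from rasa_nlu.config import RasaNLUConfig
    from rasa_nlu.model import Interpreter

    interpreter = Interpreter.load("./models/nlu/default/weathernlu",
                                   RasaNLUConfig("config_spacy.json"))
    # parse() returns a dict with "intent", "intent_ranking" and "entities"
    print(interpreter.parse(u"I am planning my holiday to Bercelona. "
                            u"I wonder what is the weather out there."))

# To train: uncomment train_nlu(...) and run `python nlu_model.py`.
# To test:  comment train_nlu(...) out again and uncomment run_nlu().
# train_nlu("./data/data.json", "config_spacy.json", "./models/nlu")
# run_nlu()
```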
NOTE: The code snippets added to this article are not properly aligned. Kindly refer to GitHub for the actual code.
Kindly share your feedback or suggestions to make this article better (along with your claps!), and let me know if you face any issues creating the weatherbot.