Getting Started With RASA-ChatBot And Telegram
This article gives a basic introduction to RASA and shows how all the pieces of the puzzle fit together, from training to testing to using the bot on third-party channels (in this case, Telegram). If you are new to RASA, I highly recommend checking out their docs first; they are quite easy to follow.
Why RASA?
I chose RASA because it is open source, so I can install it on my own local machine or a cloud server. The main advantage of RASA over similar offerings like Google Dialogflow and Amazon Lex is that your data remains private to you.
Why Telegram?
Because of its simple setup. You can use any other channel you like; the process of integration remains the same.
Let's dive into RASA. The RASA stack contains rasa_nlu (to understand message intent) and rasa_core (to take action based on intent). The two libraries are independent; you can find detailed docs here.
Steps
- Clone the repo from my GitHub; the code is similar to that in the RASA docs. If you are familiar with entity extraction, you can use the rasa starter pack instead.
- Unzip it, install the packages listed in requirements.txt, and install the spaCy English language model using the following commands:
pip install -r requirements.txt
python -m spacy download en
Explanation of files
I assume you have downloaded code from my GitHub and installed all the dependencies.
In rasa_telegram/nlu you will see two files that are used to train rasa_nlu
nlu_config.yml — It defines how the model will be trained and how features will be extracted. We will use the predefined spacy_sklearn pipeline since we have very few training examples. More info here.
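As a rough idea, a minimal nlu_config.yml using this pipeline might look like the following (the exact file in the repo may contain more options):

```yaml
language: "en"
pipeline: "spacy_sklearn"
```

The spacy_sklearn pipeline relies on pre-trained spaCy word vectors, which is why it works reasonably well even with very few training examples.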
nlu_data.md — It contains the NLU training data. You can feed data in .md or .json format; we will be using .md for simplicity.
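To give a feel for the .md format, here is a small illustrative snippet (the intent names and examples below are made up, not necessarily the ones in the repo):

```md
## intent:greet
- hey
- hello
- hi there

## intent:goodbye
- bye
- see you later
```

Each `## intent:` heading names an intent, and the bullet points below it are example user messages for that intent.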
In rasa_telegram/core you will see two files that are used to train rasa_core
domain.yml — The domain defines the universe your bot lives in — what user inputs it should expect to get, what actions it should be able to predict, how to respond and what information to store. (source)
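A stripped-down sketch of a domain.yml (the greet/goodbye names are illustrative; only the utter_ convention and action_callapi come from this project):

```yaml
intents:
  - greet
  - goodbye

actions:
  - utter_greet
  - utter_goodbye
  - action_callapi

templates:
  utter_greet:
    - text: "Hello! How can I help you?"
  utter_goodbye:
    - text: "Goodbye!"
```

Utterance actions (utter_*) just need a template here; custom actions like action_callapi are implemented in code, as we will see below.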
stories.md — Rasa Core models learn from real conversational data in the form of training “stories”. A story is a real conversation between a user and a bot where user inputs are expressed as intents and the responses of the bot are expressed as action names. (source)
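A story in stories.md might look like this sketch (intent and action names are illustrative):

```md
## greet and say goodbye
* greet
  - utter_greet
* goodbye
  - utter_goodbye
```

Lines starting with `*` are user intents, and lines starting with `-` are the actions the bot should take in response.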
The rest of the files will be explained as we move on.
Training And Testing rasa_nlu
python -m rasa_nlu.train -c nlu/nlu_config.yml --data nlu/nlu_data.md -o models --fixed_model_name nlu --project current --verbose
This command trains rasa_nlu using the config file specified by "-c" and the data specified by "--data", and stores the model in the directory specified by "-o".
To test the model you can run python3 test_nlu_model.py
Actions
If you look at the domain.yml file you will find various actions that can be performed by the bot. The actions starting with utter_ just send a message to the user. However, you can write custom actions that run arbitrary code; we have specified a custom action named action_callapi.
The code that your custom actions execute needs to be hosted on a server, and the endpoints have to be provided via endpoints.yml. You can create an action server in Node.js, Java, or any other language and define your actions there. If you choose Python, RASA provides rasa_core_sdk to simplify your work; we will use it here. Make sure to install it using "pip install rasa_core_sdk" (more info).
Our custom actions are written in actions.py, and the endpoints are specified in endpoints.yml.
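As a sketch of what such a custom action looks like with rasa_core_sdk (only the action name action_callapi comes from this project; the class name and the reply text are illustrative, and the try/except fallback exists only so the snippet runs standalone without the SDK installed):

```python
try:
    from rasa_core_sdk import Action
except ImportError:
    # Minimal stand-in so this sketch runs without rasa_core_sdk installed
    class Action:
        def name(self):
            raise NotImplementedError


class ActionCallApi(Action):
    def name(self):
        # Must match the action name listed in domain.yml
        return "action_callapi"

    def run(self, dispatcher, tracker, domain):
        # Call your API here, then send the result back to the user
        dispatcher.utter_message("Here is the result from the API")
        return []
```

The action server discovers classes like this in actions.py and exposes them at the /webhook endpoint that endpoints.yml points to.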
Training And Testing rasa_core
python -m rasa_core.train -d core/domain.yml -s core/stories.md -o models/current/dialogue
This command trains rasa_core using the domain file specified by "-d" and the stories specified by "-s", and stores the model in the directory specified by "-o".
To test the model, first start the action server:
python -m rasa_core_sdk.endpoint --actions actions
Then run the following command to start chatting with the bot from the terminal:
python3 -m rasa_core.run -d models/current/dialogue -u models/current/nlu --endpoints endpoints.yml
Congrats, you have successfully built a chatbot using the RASA stack. Now let's deploy it on Telegram.
Setting Up Telegram
Step 1. Create a bot using BotFather and get the access token.
Step 2. Set up a webhook using this URL:
https://api.telegram.org/bot{token}/setWebhook?url={webhook url}
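You can open that URL in a browser, or build and call it from Python; a minimal standard-library sketch (the token and ngrok URL below are placeholders):

```python
import urllib.parse
import urllib.request

def build_set_webhook_url(token, webhook_url):
    # Telegram Bot API: request /bot<token>/setWebhook?url=<https url>
    query = urllib.parse.urlencode({"url": webhook_url})
    return "https://api.telegram.org/bot{}/setWebhook?{}".format(token, query)

url = build_set_webhook_url("yourbottoken", "https://example.ngrok.io/")
# urllib.request.urlopen(url)  # uncomment to actually register the webhook
```

Note that urlencode percent-encodes the webhook URL, which is required since it is passed as a query parameter.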
The webhook URL needs to be https. If you are following along on a local machine, you can use ngrok or any similar service to create a tunnel. We will be using Flask and ngrok, since they are easy to get started with.
pip3 install flask
By default, Flask runs on port 5000, so tunnel that port:
ngrok http 5000
You can test everything out by running main.py. Don't forget to paste in your bot token, and make sure your action server is also running.
main.py
Import all the dependencies:
import requests
import json
from flask import Flask
from flask import request
from flask import Response
from rasa_core.agent import Agent
from rasa_core.interpreter import RasaNLUInterpreter
from rasa_core.utils import EndpointConfig

token = 'yourbottoken'
app = Flask(__name__)
Load the trained NLU model:
interpreter = RasaNLUInterpreter('./models/current/nlu')
While loading the agent, don't forget to pass the action_endpoint:
agent = Agent.load('./models/current/dialogue', interpreter=interpreter,
                   action_endpoint=EndpointConfig(url="http://localhost:5055/webhook"))
The function to interact with our model:
def applyAi(message):
    responses = agent.handle_message(message)
    text = []
    if responses:
        for response in responses:
            text.append(response["text"])
    return text
A few helper functions:
def parse_msg(message):
    chat_id = message['message']['chat']['id']
    txt = message['message']['text']
    return chat_id, txt

def send_message(chat_id, messages=[]):
    url = 'https://api.telegram.org/bot' + token + '/sendMessage'
    if messages:
        for message in messages:
            payload = {'chat_id': chat_id, 'text': message}
            requests.post(url, json=payload)
    return True
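To make parse_msg concrete, here is the shape of the update JSON that Telegram POSTs to the webhook and what the helper extracts from it (the chat id and text below are made-up values):

```python
def parse_msg(message):
    # Same helper as in main.py: pull out the chat id and the message text
    chat_id = message['message']['chat']['id']
    txt = message['message']['text']
    return chat_id, txt

# A trimmed-down Telegram update, as delivered to the webhook
update = {
    "message": {
        "chat": {"id": 123456789},
        "text": "hello bot",
    }
}

chat_id, text = parse_msg(update)
# chat_id -> 123456789, text -> "hello bot"
```

The chat_id is what send_message passes back to the Bot API so the reply lands in the right conversation.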
The function to handle POST requests from Telegram:
@app.route('/', methods=['POST', 'GET'])
def index():
    if request.method == 'POST':
        msg = request.get_json()
        chat_id, message = parse_msg(msg)
        response_messages = applyAi(message)
        send_message(chat_id, response_messages)
        return Response('ok', status=200)
    else:
        return '<h1>HELLO</h1>'

if __name__ == '__main__':
    app.run(port=5000)
Let's run the code and check.
Conclusion
Congrats, we have successfully created and deployed our bot. In this article, we learned what RASA is, how to set it up, the difference between rasa_nlu and rasa_core, how to set up custom actions, how to train and test rasa_nlu and rasa_core models, and finally how to use the bot via third-party channels like Telegram.
If you face any issues or have any queries, ping me on Telegram: @ManaanAnsari.