Rapidly prototyping conversational UIs with Telegram using Python, AWS Lambda, and the AWS API Gateway

Conversational UIs, chat bots, design as a conversation, messaging as a platform: all of these terms -- which, quite frankly, mean the exact same thing to me -- have been hotly discussed over the past year. Plenty of silicon oracles have written about the impending world takeover of chat bots, so I won't bore you with more postulating, but there are a few important things you should know on the topic before diving into prototyping:

  • Messaging is the dominant digital experience for most young people
  • Chat bots were all the rage back in the day -- think Microsoft's Clippy and AOL's SmarterChild -- but are making a roaring comeback on the heels of Apple's Siri and Amazon's Alexa
  • Bots today have the potential to be infinitely more powerful than previous ones because they don't need to be standalone services offered only to your own user base. Messaging apps such as Slack, Facebook Messenger, Telegram, and even Twitter to some extent are opening up APIs that enable you to inject your own services into their platforms and thus reach far greater scale than you could with a standalone bot
  • At this point, the majority of bots are still either backed by human operators or are glorified lists of if-this-then-that statements
  • Progress lies in improvements in machine learning that will make it possible to automate an increasingly large share of today's human-to-bot conversations

Now that you are an expert in chat bots, and have brainstormed all sorts of ways you can deliver value to your customers via a bot, it's time to actually build one using Telegram. I'm a fan of the Telegram approach because it enables me to quickly test the underlying value prop of a conversational UI with customers without needing to build out a custom command handler engine and visual layer -- both important parts of a bot, but complex enough that you want to avoid them until you prove value. Telegram has clients on pretty much every platform, even Linux, and all of them are incredibly polished products -- some of the best I've ever used -- so it's not hard to quickly get your testers up and running on their platform of choice.

Let's get cracking.

The Telegram Bot API is the magic that enables us to rapidly prototype something useful. For each bot I build, I follow the same steps to get it up and running and into the hands of users all over the world:

  • Create the bot
  • Configure dev environment
  • Configure an AWS IAM role with permissions to execute bot-related things
  • Write an AWS Lambda function in Python to handle incoming bot messages
  • Deploy the Lambda function
  • Test the Lambda function
  • Create an AWS API Gateway endpoint that, when hit, activates your Lambda function
  • Deploy and test the API endpoint
  • Set the bot's webhook to the API Gateway endpoint

Do that, and your bot should be up and running.

Creating the bot

You can easily create a bot by talking to the BotFather on Telegram. I won't rewrite the instructions, because they are already on the Telegram website, and the BotFather also provides pretty solid instructions as you go. Just make sure you copy your API key down once it's given to you.
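
If you want to sanity-check the key right away, a quick call to the Bot API's getMe method should echo back your bot's name and username (swap in your own key, just like in the webhook command later on)

curl https://api.telegram.org/bot{your-api-key}/getMe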

Configuring dev environment

To quickly create working bots, I use the Telepot Python framework instead of hitting the Telegram HTTP API directly with the Requests library. Telepot is a light framework and takes care of many annoying things right out of the box. Other than using Telepot, the only important requirement is that the dev environment runs Python 2.7, because that is what AWS Lambda supports. If you are mainly operating in Python 3+ at this point like I am, be sure to create your virtual environment like this

virtualenv -p /usr/bin/python2.7 venv
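
Then activate the environment and install Telepot. (The boilerplate further down ships a requirements.txt that covers this, so the manual install is only needed if you're starting from scratch.)

source venv/bin/activate
pip install telepot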

Configuring an AWS IAM role

As with anything AWS-related, you need an IAM role with access to specific resources, and you must give it a set of execution permissions before anything starts working properly. If you are like me, configuring IAM permissions is by far the most annoying part and is always where I stumble on new projects. To set up a role that can run your Telegram bot, log in to the IAM console and navigate to the "Roles" section. Create a new role; I name mine "lambda-gateway-execution-role." In the permissions section, attach the following policies to the role:

  • AWSLambdaBasicExecutionRole
  • AmazonAPIGatewayInvokeFullAccess
  • AmazonAPIGatewayPushToCloudWatchLogs
  • CloudWatchFullAccess
  • CloudWatchLogsFullAccess
  • AmazonAPIGatewayAdministrator
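
Besides the attached policies, the role also needs a trust relationship that lets Lambda assume it. The console sets this up for you when you create the role as a Lambda service role, but for reference, the trust policy should end up looking roughly like this

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": { "Service": "lambda.amazonaws.com" },
            "Action": "sts:AssumeRole"
        }
    ]
}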

Writing the Lambda function that handles incoming bot messages

Allow me to briefly nerd out about AWS Lambda, which is just a really cool, awesome, and useful service. Lambda functions are executed based on event triggers, which is great, but an even better thing is that you don't need to configure or provision any servers. Amazon only charges you for the compute time you consume, which for a basic Telegram bot is basically nothing. Event-driven architectures are da bomb, yo!!

There are two important parts to getting a Telegram bot to work as an AWS Lambda Function:

  1. A default Lambda handler to deal with incoming events (you need to specify the name of this handler when you create your Lambda function)
  2. A function to parse the incoming messages and return the appropriate response

The Lambda event handler I use is dead simple. I print the incoming event to the console then pass it through to my parsing function. It looks like this

import json

def my_handler(event, context):
    # Log the raw Telegram update for debugging, then hand the message off to the parser
    print("Received event: " + json.dumps(event, indent=2))
    handle(event['message'])

The parsing function -- "handle(event['message'])" in the above code snippet -- will totally differ depending on what you actually want your bot to do. Most of my bots just parse various commands and then call helper functions to generate the appropriate responses. Regardless, Telegram recommends that every bot start with support for at least three commands: start, help, and settings. For those, the simple skeleton below will work.

def handle(msg):
    flavor = telepot.flavor(msg)
    # normal message
    if flavor == "normal":
        content_type, chat_type, chat_id = telepot.glance2(msg)
        print("Normal Message:", content_type, chat_type, chat_id)
        command = msg["text"]
        if command == "/start":
            bot.sendMessage(chat_id, text='Hi! I am a Telegram Bot!!')
        elif command == "/help":
            bot.sendMessage(chat_id, text="I don't have any help commands yet!")
        elif command == "/settings":
            bot.sendMessage(chat_id, text="I cannot be configured via any settings yet. Check back soon!")
        else:
            bot.sendMessage(chat_id, text="Sorry, I didn't understand that command.")
        return("Message sent")
    else:
        raise telepot.BadFlavor(msg)

I've created a simple boilerplate that you can use to get a bot supporting the start, settings, and help commands up and running in seconds.

git clone https://github.com/mamcmanus/telegram-awslambda-bot-boilerplate.git telegram-bot
cd telegram-bot
virtualenv -p /usr/bin/python2.7 venv
source venv/bin/activate
pip install -r requirements.txt

Open bot.py and add your API key on line 34.

bot = telepot.Bot('BOT KEY')
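
If you'd rather not hard-code the key (say, because you plan to push your bot to a public repo), a small optional tweak is to read it from an environment variable instead. TELEGRAM_BOT_TOKEN is just a name I'm making up here, and you'd also have to set it in your Lambda configuration if you deploy this way.

import os
import telepot

# Assumes you've exported TELEGRAM_BOT_TOKEN yourself; not part of the boilerplate
bot = telepot.Bot(os.environ['TELEGRAM_BOT_TOKEN'])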

Then, make sure you've added the bot on Telegram, and run the bot locally to start chatting with it!

python bot.py

Deploying the Lambda function

To deploy, you first need to create a .ZIP containing your Lambda function, a requirements.txt file that lists any dependencies, and all of the contents of your venv/lib/python2.7/site-packages directory.
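
There's no one blessed way to build the package; a couple of zip commands from the project root work for me, assuming the directory layout from the boilerplate above

cd venv/lib/python2.7/site-packages
zip -r9 ../../../../bot.zip .
cd ../../../../
zip -g bot.zip bot.py requirements.txt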

You can then create a function using the AWS Lambda web console, and upload the .ZIP file as your source package. Follow these steps

  • Sign into AWS and open the Lambda console. You can find the link for Lambda in the upper left corner under the "Compute" section.
  • Create a new Lambda function
  • Skip over the first step of selecting a blueprint -- you already have one
  • Name your function "bot" and set the runtime to Python 2.7
  • Choose "Upload a .ZIP file" and upload the .ZIP of your function that you created
  • Set the Handler to "bot.my_handler" and create a basic execution role if you don't have one already
  • Review and submit
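
Once the function exists, re-uploading through the console every time you tweak the bot gets old. If you do have the AWS CLI configured, something like this should push a fresh package, assuming you named the function "bot" and the archive bot.zip as above

aws lambda update-function-code --function-name bot --zip-file fileb://bot.zip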

Testing the Lambda function

To test the Lambda function, you need data that simulates an actual person sending a message to the bot -- the JSON body of the update that Telegram POSTs to your webhook, which looks like this

{
    "update_id": 8888,
    "message": {
        "chat": {
            "first_name": "Matt",
            "id": put_your_id_here,
            "last_name": "McManus",
            "type": "private",
            "username": "mcman_s"
        },
        "date": 1453851465,
        "from": {
            "first_name": "Matt",
            "id": put_your_id_here,
            "last_name": "McManus",
            "username": "mcman_s"
        },
        "message_id": 2,
        "text": "/start"
    }
}

You need to replace "put_your_id_here" with your chat ID. You can get that ID by running the bot locally, sending it a message, and copying the id from the output in your terminal window.
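
If you'd rather not run the bot locally, you can also send your bot a message and then hit the Bot API's getUpdates method; your ID shows up under message.chat.id in the response. (This only works before you've set a webhook, which is fine at this point in the process.)

curl https://api.telegram.org/bot{your-api-key}/getUpdates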

Once you have your test JSON blob, click the "Actions" dropdown and choose "Configure test event." Paste in the JSON blob and hit "Save and test."

You should get a message from your bot as if you had sent it the "/start" command.

Creating an AWS API Gateway endpoint to activate the Lambda function

Go back to the AWS console, find "API Gateway" under the "Application Services" section and click on it. Then do some stuff:

  • Create a new API and name it "TelegramWebHook"
  • Create a new POST method
  • Select "Lambda function" as the integration type, select the region in which your Lambda function is deployed, and type in the name of your bot, which should auto-complete, then save it
  • Deploy your API by creating a new stage, call it whatever you want, I just use the default "prod"
  • Write down the Invoke URL for later

Testing the API endpoint

  • In the API Gateway console, navigate to the "/ - POST - Method Execution" page and click "Test"
  • In the Request Body section, paste in the same JSON blob you used to test your Lambda function, and then fire away

Once again, you should get a message from your bot as if you had sent it the "/start" command.
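
You can also hit the deployed endpoint from your terminal. Assuming you've saved the test JSON blob to a file -- test_event.json is just a name I'm using here -- a curl like this should trigger the same response

curl -X POST {your invocation url} \
-H "Content-Type: application/json" \
-d @test_event.json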

Setting the bot's webhook

The final part of getting your bot up and running is setting a webhook that points at your API Gateway endpoint. To do that, curl the Telegram setWebhook method and pass it the Invoke URL you copied down earlier. The curl command looks like this

curl -X POST https://api.telegram.org/bot{your-api-key}/setWebhook \
-d "url={your invocation url}"

Add your own invocation URL and bot API key.

Now, send your bot a command and it should respond.
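
If it doesn't, the Bot API's getWebhookInfo method is a handy first stop for debugging -- it reports whether Telegram accepted your URL and surfaces any recent delivery errors

curl https://api.telegram.org/bot{your-api-key}/getWebhookInfo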

BOTS ARE SO FUN!!!

The applications of conversational UIs are pretty much endless. I've already published one bot, the Philly Phanatic, that helps me automate the completely esoteric task of staying up to date with my favorite Philly sports teams. I've also written a few more that I haven't yet published.

If you work at a major enterprise and need some inspiration to start prototyping, there are some potential easy wins that I can think of off the top of my head:

  • User account self servicing (e.g. help users get info on their accounts, change their passwords/account info, etc...)
  • Concierge bots (as a potential replacement for premium over the phone service)
  • New feature user on-boarding inside a traditional UI
  • Business intelligence bots (e.g. how many customers opened X type of account in the last Y days?)
  • Customer listening bots (e.g. what are people saying about X on Twitter?)

If you are having trouble visualizing what these types of services could look like, there are some great examples already in the wild, with lots more coming, I suspect.

If you want to spend half a day rabbit-holing on the future of bots and conversational UIs, I recommend starting with the really great and lengthy piece from The Verge, my favorite on the topic so far. Also, Intercom wrote about it, TechCrunch wrote about it, and prominent venture capitalists are writing and tweeting about it.

Hit me up on Twitter to chat if this stuff interests you, too!

Disclaimer for all CLI junkies out there: I initially went through and did everything with the AWS CLI, but found that the time-to-deployment (TTD) was much faster when using the web console. Lambda is still fairly new and its CLI tutorial is full of typos and errors, which makes it hard to get through, and regardless, for a n00b like me, the AWS CLI is hard to navigate due to all of my different IAM roles and services deployed in different regions.
