Making the packtbot

Never miss the deal of the day!

Mon, 10 Jul 2017


Slack is an effective group communication tool that I've found myself using quite a bit recently. It simplifies communication greatly, eliminating the need for multiple apps; with Slack I'm able to interact with work, side projects, and the programming community in one application. Where Packt comes into play is that for the last couple of weeks a colleague has been posting Packt's deal of the day in one of our Slack channels. So when I found a small block of free time, I decided to apply some simple automation.

Where we start

Writing a Slack bot using their API and Python is well documented, with many examples available. In fact, I was surprised: it was probably the easiest API I've worked with so far.

I've also dabbled with web scraping in the past, so I figured it would be incredibly quick to throw together something basic. Of course, it wouldn't be any fun if I didn't hit a snag or two.

This is a recipe that has been done plenty of times: Requests to grab the page, then BeautifulSoup to parse out the information we want.

So I threw open a Python shell, pulled the page's HTML, and used BeautifulSoup's prettify() to pretty-print it, so I could form an idea of where to go with this.
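Once the page HTML is in hand, that exploratory step looks roughly like this (using a hardcoded snippet in place of the real page, just for illustration):

```python
from bs4 import BeautifulSoup

# Stand-in for the HTML pulled from the page.
html = '<div class="dotd-title"><h2>Our Title</h2></div>'
soup = BeautifulSoup(html, "html.parser")

# prettify() re-indents the markup one tag per line,
# which makes it much easier to spot the element to target.
print(soup.prettify())
```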

First Snag

403 - Forbidden

It's common for sites to deploy mechanisms to deter people from scraping their pages, which is entirely fair, since making a lot of requests programmatically could potentially bog a site down. In our case, though, we only want one request every 24 hours. Selenium has always been my go-to when scraping or testing websites, especially if I'm doing a more complicated process and want to see visually what is happening. In this case, since we are pulling just one page, going headless is much preferred. With the magic of PhantomJS and Selenium we can emulate a browser, so on the site's side the request appears to come from a 'browser' and they don't give us the 403.

Now I simply swapped out the Requests library for Selenium and reran the script. This time around it worked great! So now that we're pulling clean HTML, it's time to get to parsing.

A little right-click-and-inspect on the Packt deal of the day page revealed the title to be right here:

<div class="dotd-title"><h2>Our Title</h2></div>
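Given that structure, the extraction boils down to a find() on the div's class and then grabbing the inner heading. A minimal sketch, with a hardcoded snippet mirroring the page's whitespace-padded title:

```python
from bs4 import BeautifulSoup

# Stand-in for the real page: the title sits inside <div class="dotd-title"><h2>...
html = '<div class="dotd-title"><h2>\n\t\tMongoDB Cookbook\t\t</h2></div>'
soup = BeautifulSoup(html, "html.parser")

title_div = soup.find('div', 'dotd-title')  # match by tag name and class
h2 = title_div.findChildren()[0]            # the inner <h2> element
print(h2.contents)                          # raw contents, whitespace and all
```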

BS4 made quick work of the parsing, and soon we get the result:

['\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\tMongoDB Cookbook\t\t\t\t\t\t\t\t\t\t\t\t\t\t']

Woo! A newline and a bunch of tabs. Weirdly enough, while I had some trouble with this earlier, when I dropped string.strip('\n').strip('\t') into the code after work, it stripped away all of those tabs.
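A quick sketch of that cleanup step; worth noting that an argument-free strip() removes all leading and trailing whitespace in one call, which does the same job here:

```python
raw = '\n\t\t\tMongoDB Cookbook\t\t\t'

# strip('\n') removes the leading newline, then strip('\t') removes the tabs.
clean = raw.strip('\n').strip('\t')
print(clean)  # MongoDB Cookbook

# An argument-free strip() handles any mix of whitespace in one pass.
assert raw.strip() == clean
```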

Alright, so now that we've got the scraping and parsing figured out, let's add our Slack code.

import os

from selenium import webdriver
from bs4 import BeautifulSoup
from slackclient import SlackClient

# Keep the token out of the source; read it from the environment instead.
SLACK_BOT_TOKEN = os.environ['SLACK_BOT_TOKEN']

def get_book_title(url):
	driver = webdriver.PhantomJS()
	driver.set_window_size(1120, 550)
	driver.get(url)  # load the page before reading its source
	response = driver.page_source.encode('utf-8')
	driver.quit()
	html_str = str(response)
	soup = BeautifulSoup(html_str, "html.parser")
	title = soup.find('div', 'dotd-title')
	children = title.findChildren()
	for child in children:
		return child.contents

def clean_string(title):
	clean = str(title[0])
	clean = clean.strip('\n').strip('\t')
	return clean

def deploy_bot(url, title):
	BOT_NAME = 'packtbot'
	slack_client = SlackClient(SLACK_BOT_TOKEN)
	message = get_message(url, title)
	if slack_client.rtm_connect():
		slack_client.api_call("chat.postMessage", channel='random',
			text=message, as_user=True)
	else:
		print('Connection failed\n')

def get_message(url, title):
	message = "Today's free book is {0} \n".format(title)
	message += 'Browse over to {0} to claim it!\n'.format(url)
	return message

def main():
	url = 'https://www.packtpub.com/packt/offers/free-learning'
	title = get_book_title(url)
	title = clean_string(title)
	deploy_bot(url, title)

if __name__ == "__main__":
	main()

We have achieved automation!

Our Bot!


Making a Slack bot to let me and my coworkers know what the deal of the day is was easy, thanks to Python and Slack's API!

As well as the litany of other tools we put to use.

There's just one thing, though: do I really care about every free book Packt has?

I barely have time to read things that interest me.

Fortunately my colleague was sharing with us things he found interesting and relevant.

Sadly my bot can't quite do that, but with the addition of a wordlist I think we can get pretty close. We can tackle that later.
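A rough sketch of what that wordlist idea might look like; the keywords here are placeholders of my own choosing, not an actual implementation:

```python
# Hypothetical interest filter: only post when the title matches a keyword.
KEYWORDS = ['python', 'security', 'linux']  # placeholder wordlist

def is_interesting(title, keywords=KEYWORDS):
    """Return True if any keyword appears in the book title."""
    lowered = title.lower()
    return any(word in lowered for word in keywords)

print(is_interesting('MongoDB Cookbook'))         # False
print(is_interesting('Mastering Python Design'))  # True
```

The bot's main() could then skip the deploy_bot() call whenever the day's title fails the check.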

Anthony Laiuppa

If you have any feedback, feel free to @ me on Twitter; I'm always looking to learn. -AL