Sending data reports and alerts to Slack (step-by-step guide)

TL;DR: To send scheduled or conditional alerts to Slack, create a custom Slack app for your workspace and generate a bot OAuth token. Query your data, check for the condition you want to alert on, and call the Slack Python SDK to send a message when that condition is met. Finally, schedule your Python script to run at a regular interval. Setting this up takes approximately 1 hour.

If you build dashboards or reports, you know how frustrating it can be to work on an analysis, share it with your team, and then watch it never get used again. Of course, there are ways to validate the usefulness of a report before you start working on it, but a common issue is not usefulness, it’s accessibility. Your stakeholders are bombarded with their own requests, meetings, emails, and hundreds of dashboards and reports. Even if the one you delivered is useful, they may not remember how to get back to it.

One of the best ways to ensure that your analysis and data get used by the business is to meet your stakeholders where they are: email and Slack (or Microsoft Teams). In this tutorial, we’re going to show you in detail how to deliver automated data insights with a Slack bot that can either send regular reports to a Slack channel or send proactive alerts when certain conditions are met. Working in Slack is also one of the most powerful ways to do collaborative data analysis: it’s where conversations are happening in the business, and the closer the data is to the conversation, the more likely it is to get used.

Building this should take you about 1 hour and requires basic SQL and Python knowledge plus a Slack bot token (you will either need admin access to your workspace or will need to request a token from your admin). We’re going to do this in Fabi.ai simply because it’s the easiest and quickest way to get this up and running in production without having to spin up your own scheduler, but we’ll also explain how to do this without Fabi.ai.

Here’s a video tutorial for visual learners:

What are automated data insights and why push them to Slack?

Before we dive into the details, let’s take a minute to talk about automated data insights and why you may want to leverage Slack.

Let’s start by talking about the BJ Fogg behavior model. BJ Fogg is a Stanford researcher who described the idea of an “action line”: for you or your stakeholders to act on something you see or that comes to mind, you need to be both sufficiently motivated and sufficiently able to take that action.

Let’s take a concrete example: if you’re a Head of Customer Success and your CEO asks, “How many new customers have completed onboarding this week, and what’s our success rate?”, you may be highly motivated to answer (you want to have an intelligent response for your CEO), but if you have no idea where to get that data, your ability to answer may be too low, so you don’t do anything. This is what we labelled “A” in the chart below.

Let’s say that instead you get the question, “How many customers received their free holiday Starbucks gift card?” You may have a dashboard that’s fairly easy to access, but that likely isn’t your most pressing question, so you never get around to looking it up. This is what we labelled “B” in the chart.

So as a data team, it’s important to focus on questions your stakeholders are sufficiently motivated to answer, and to make sure they can easily find those answers. This is where Slack comes into the picture. If your organization uses Slack, that’s where your stakeholders spend the majority of their time, and delivering insights right in their workspace is the best way to ensure that they are “able” to get to the data. Better yet, if you deliver the data and insights at the right time, you increase the odds of your data being used even more. Think of it like a social media notification on your phone: if you receive a notification in the evening around the time you get home from work, you’re much more likely to check it than if you receive it at 10AM and it gets buried with everything else.

Scheduled reports vs trigger-based alerts

In this tutorial we’re going to show you how to build both regularly scheduled reports in Slack and alerts that only get sent when a specific condition is met. As we touched on above, if you can send the data to the right place at the right time, the likelihood of it being consumed increases significantly. Conversely, a report that goes out every day at the same time, or too often at irrelevant times, is much more likely to get ignored over time.

An important technical note: for both the scheduled reports and the trigger-based alerts, we’re going to be using a regularly scheduled job. The key difference is that for trigger-based alerts, the bot only sends an update if the condition is met, but it still needs to run in the background to check for that condition. Depending on the type of report you’re building, this may have implications for your data warehouse compute usage and costs. If you want to build a bot that listens for events in real time, consider a solution like Zapier that is designed specifically for that use case.

Creating a data Slack bot step-by-step

Step 1: Create a Slack app

To send alerts to Slack, we’re going to create an analytics Slack bot that’s managed by you and only available in your Slack workspace.

Go to the Slack apps page: https://api.slack.com/apps

Click on “Create New App” and select “From scratch”. Name your bot whatever you would like, select the workspace you want it to belong to, then click “Create App”.

That’s it, you’ve created your Slack app. Now we need to generate a token.

Step 2: Generate a bot token

In this step, you need to make sure that you’re generating a “bot” token and that you give your app the proper permissions, so follow these steps carefully.

First, go to “OAuth & Permissions” in the left-hand sidebar.

Then give your app the proper OAuth scopes. Scroll down to “Scopes” and, under “Bot Token Scopes”, click “Add an OAuth Scope”. From the options, select “chat:write.public” (along with “chat:write”, which it depends on). Here we’re assuming that you want your bot to be able to send messages to public channels it hasn’t been invited to. If you want to set up your bot differently, you may need to adjust the scopes accordingly.

Once you’ve configured the scope, scroll back up and click “Install to {your_workspace}” under “OAuth Tokens”.

Once you’ve done this, you should see a token that starts with “xoxb-”. If your token doesn’t start with that prefix (as of 2024), you don’t have a bot token, and you should double-check that you followed the steps above.
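As an optional sanity check, you can verify the token from Python before wiring it into the full script. Here’s a minimal sketch, assuming you’ve installed the slack_sdk package (pip install slack_sdk) and pasted in your own token:

from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError

# Replace with your actual bot token (starts with "xoxb-")
client = WebClient(token="xoxb-...")

try:
    # auth_test confirms the token is valid and shows which bot and workspace it belongs to
    response = client.auth_test()
    print(f"Token OK: connected as {response['user']} in workspace {response['team']}")
except SlackApiError as e:
    print(f"Token check failed: {e.response['error']}")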

Step 3: Prep your data

In this step, you will query your data from wherever it lives. In this demo we’re generating some fake data to keep the example very simple, but if you’re doing this in Fabi.ai, you can simply query your data directly from your data source. Fabi.ai stores the results of SQL queries as a Python DataFrame, making it easy to reuse them in your Slack bot script.

If you want to follow along with our simple example, use the script below to generate a DataFrame:

import pandas as pd
from datetime import datetime, timedelta

# Generate dates for past 5 weeks
end_date = datetime.now()
dates = [(end_date - timedelta(weeks=x)).strftime('%Y-%m-%d') for x in range(4, -1, -1)]

# Create dataframe with sales data
sales_data = pd.DataFrame({
    'date': dates,
    'sales': [10, 20, 30, 40, 30]
})

# Display the dataframe
sales_data

As you can see, we now have 5 weeks of sales data.
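If you’re pulling real data instead, the same DataFrame can come straight out of your warehouse. Here’s a rough sketch using pandas and SQLAlchemy; the connection string and the weekly_sales table are hypothetical placeholders, so swap in your own:

import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection string -- replace with your own warehouse credentials
engine = create_engine("postgresql://user:password@host:5432/analytics")

# Hypothetical table with one row per week of total sales
query = """
    SELECT date, sales
    FROM weekly_sales
    ORDER BY date
"""

sales_data = pd.read_sql(query, engine)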

Step 4: Generate the Python script for the bot

Now that you have your token and your data in hand, it’s time to create the Slack app script. If you’re using Fabi.ai, you can simply ask the AI “Create a Slack app script that looks at the last week-over-week change of @sales_data and only sends a message if the week-over-week change is negative” and it should generate the code for you. If you’re not using Fabi.ai, ChatGPT or Claude should be able to do this as well; you will just need to provide some sample data.

The Python app script should look like this:

import pandas as pd
from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError

# Set your Slack API token and target channel
SLACK_TOKEN = "xoxb-..."      # Replace with your actual bot token
CHANNEL_ID = "slackbot_demo"  # Replace with your channel name or ID

First we need to import the right packages and set the Slack token and the target channel. In this case I’m going to send my alert to the “slackbot_demo” channel. You may want to start by sending this to a public channel before you try private channels, just to reduce the odds of complications with scope permissions.
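Hardcoding the token is fine for a quick test, but if this script is going to live in a repo or a scheduler, it’s safer to read it from an environment variable. Here’s a small sketch, assuming you’ve exported a SLACK_TOKEN variable in whatever environment the script runs in (the variable name is just a convention we picked for this example):

import os

# Read the bot token from the environment instead of hardcoding it;
# this raises a KeyError if the variable is missing
SLACK_TOKEN = os.environ["SLACK_TOKEN"]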

Then we compute the latest week-over-week change in sales data:

# Calculate week-over-week change
sales_data['wow_change'] = sales_data['sales'].diff()

# Check if the most recent change is negative
latest_change = sales_data['wow_change'].iloc[-1]

And finally, if the change is negative, we send an alert to Slack:

if latest_change < 0:
    # Initialize Slack client
    client = WebClient(token=SLACK_TOKEN)
    
    # Prepare message
    message = f"⚠️ Alert: Sales decreased by ${abs(latest_change)} compared to last week"
    
    try:
        # Send message to Slack
        response = client.chat_postMessage(
            channel=CHANNEL_ID,
            text=message
        )
    except SlackApiError as e:
        print(f"Error sending message: {e.response['error']}")

That’s it! Run your script (or, in Fabi.ai, hit the “Run Code” button on your Python cell) and you should see the alert appear in Slack. Try changing the last sales data point to be greater or less than the previous week and re-run your script to see how it only triggers when the condition is met.
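If alerting on any decrease feels too noisy, one simple variation is to only trigger when the drop crosses a threshold. Here’s a sketch using pandas’ pct_change; the 10% threshold is an arbitrary example, not something prescribed above:

# Week-over-week percentage change for the most recent week
latest_pct_change = sales_data['sales'].pct_change().iloc[-1]

# Only alert on drops larger than 10% (arbitrary example threshold)
if latest_pct_change < -0.10:
    client = WebClient(token=SLACK_TOKEN)
    message = f"⚠️ Alert: Sales dropped {abs(latest_pct_change):.0%} week-over-week"
    try:
        client.chat_postMessage(channel=CHANNEL_ID, text=message)
    except SlackApiError as e:
        print(f"Error sending message: {e.response['error']}")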

Step 5: Schedule your report

If you’re not using Fabi.ai, you will need to use a cron job (or another scheduler) to run your script at a regular interval to check for the condition. For this scheduled job to run even when your laptop is off, you will need to deploy it to the cloud, following a process similar to deploying a Streamlit app to EC2.
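For reference, a crontab entry for this might look something like the line below; the schedule, Python path, and file paths are all hypothetical placeholders, so adjust them to wherever your script actually lives:

# Run the Slack alert check every Monday at 8:00 AM (paths are placeholders)
0 8 * * 1 /usr/bin/python3 /home/ubuntu/slack_sales_alert.py >> /var/log/slack_sales_alert.log 2>&1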

If you’re doing this in Fabi.ai, once your Python data app is ready, simply click on “Report” on the top navigation bar.

Converting your Python script to a scheduled report in Fabi.ai

On the right, in the configuration panel, select a schedule, then “Publish”.

Fabi.ai configuration panel to schedule and publish reports

Now your Python script will pull your data and check for the condition on whatever schedule you’ve configured, and send an alert to Slack when the condition is met.

Slack apps are a powerful way to deliver the right insights at the right place at the right time

Sending scheduled reports to Slack, or alerts when certain conditions are met, is one of the best ways to ensure that your analysis and data get used across your organization. Delivering insights where your stakeholders already are is the best way to engage in collaborative data analysis.

Setting up these reports and alerts is very simple. You simply need to generate a Slack app OAuth token with the proper scope permissions and write a short Python script that sends the data to a chosen channel. Then, to run the report on a schedule or scan for a condition, you can use a cron job and deploy your script to a virtual machine in the cloud (for example, on EC2), or you can simply schedule it to run in Fabi.ai.

Fabi.ai is the simplest way to build and deploy Python data apps and Python dashboards in a secure and scalable production environment. You can get started for free in less than 2 minutes.
