It's annoying that you cannot access old messages in Slack's free version.
So I decided to write a small utility to back up messages periodically.
After a quick search, I found a script. It's written in Python and uses the slacker package. You can check the repository by following the link.
To install the slacker package, run:
pip install slacker
Next, you need a token. You can get it from Slack's Legacy tokens page. As the name says, these tokens are already legacy and will be deprecated in the future. I don't know the exact date, but I will update this post accordingly.
Note: the Slack workspace must allow generating tokens for API access, so the workspace administrator has to enable it.
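Before running the full backup, you can sanity-check the token against Slack's auth.test API method (the token value below is just a placeholder):

curl -s -H "Authorization: Bearer xoxp-your-token-here" https://slack.com/api/auth.test

A reply containing "ok": true means the token is valid and the workspace allows API access.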
So we have the required script, package, and token. To create a backup, we can run:
./backup_slack.py --token=<token> --outdir=/home/username/slack-backup
If everything is fine, you will see output like this:
Saving username list to /home/username/slack-backup/users.json
Saving public channels to /home/username/slack-backup/channels
Saving private channels to /home/username/slack-backup/private_channels
Saving direct messages to /home/username/slack-backup/direct_messages
All data is exported in JSON format.
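If you want a quick look at what was exported, jq works well (assuming you have jq installed; the path matches the run above):

jq '.' /home/username/slack-backup/users.json | head -n 20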
The exported data is quite small in my case, only a few megabytes, so I decided to run this script with cron every day.
Move the script to the bin directory:
sudo cp backup_slack.py /usr/local/bin
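You can verify that the script is now on your PATH:

command -v backup_slack.py

It should print /usr/local/bin/backup_slack.py.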
Create a backup directory:
mkdir -p ~/Backup/Slack
I decided to create a separate directory for every run, named by date in YYYY-MM-DD format. The shell command for that is:

mkdir -p ~/Backup/Slack/$(date +%Y-%m-%d)
But there is a small detail: cron treats the % sign specially, so every % in a crontab line has to be escaped as \%. The simplest solution was to write a small bash script, slack_backup.sh, and move it to /usr/local/bin. Its core is just:
backup_slack.py --token=$token --outdir=$dir
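A minimal version of the whole script could look like this sketch, assuming the token is hard-coded and the dated directory scheme from above:

#!/bin/bash
# slack_backup.sh: back up Slack data into a dated directory
token="xoxp-your-token-here"                 # your legacy token (placeholder)
dir="$HOME/Backup/Slack/$(date +%Y-%m-%d)"   # one directory per run
mkdir -p "$dir"
backup_slack.py --token="$token" --outdir="$dir"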
Make the script executable and move it to the bin directory:

chmod +x slack_backup.sh
sudo mv slack_backup.sh /usr/local/bin
Next, we need to add a cron task. I chose to run it every day at 18:00. Add the following line to /etc/crontab:
0 18 * * * username slack_backup.sh
Replace username with your own user name and restart cron:
sudo systemctl restart cron
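To confirm the job actually fires, you can check cron's log shortly after 18:00 (on Debian/Ubuntu cron logs to syslog; adjust for your distribution):

grep CRON /var/log/syslog | tail -n 5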
Now your Slack data will be exported at 18:00 every day.
For now, I don't need to access or view this data. I just wanted to own my data and not lose access to it without paying. In the future, I plan to write a merging script and integrate the data into some kind of viewer application.
I hope this was helpful to you.