<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:atom="http://www.w3.org/2005/Atom">
  <channel>
    <title>AI Automation Newsletter</title>
    <description>Get the best AI agents, automation workflows, blueprints, and tools in your inbox</description>
    
    <link>https://aiautomation.news/</link>
    <atom:link href="https://rss.beehiiv.com/feeds/I42PxucRmo.xml" rel="self"/>
    
    <lastBuildDate>Tue, 03 Mar 2026 11:55:13 +0000</lastBuildDate>
    <pubDate>Tue, 27 Jan 2026 11:00:33 +0000</pubDate>
    <atom:published>2026-01-27T11:00:33Z</atom:published>
    <atom:updated>2026-03-03T11:55:13Z</atom:updated>
    
      <category>Productivity</category>
      <category>Artificial Intelligence</category>
      <category>Technology</category>
    <copyright>Copyright 2026, AI Automation Newsletter</copyright>
    
    <image>
      <url>https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/publication/logo/e7424221-1a41-464d-993e-6bbfdfb80c5e/AI_Automation_Newsletter_Logo.png</url>
      <title>AI Automation Newsletter</title>
      <link>https://aiautomation.news/</link>
    </image>
    
    <docs>https://www.rssboard.org/rss-specification</docs>
    <generator>beehiiv</generator>
    <language>en-us</language>
    <webMaster>support@beehiiv.com (Beehiiv Support)</webMaster>

      <item>
  <title>Everyone&#39;s running Clawdbot</title>
  <description>Almost no one is backing it up. Here&#39;s the fix.</description>
      <enclosure url="https://media.beehiiv.com/cdn-cgi/image/fit=scale-down,format=auto,onerror=redirect,quality=80/uploads/asset/file/7814f927-51bc-4913-baec-417b8e85fb72/The_3-Layer_Backup_Strategy_for_Self-Hosted_Clawdbot.png" length="2017615" type="image/png"/>
  <link>https://aiautomation.news/p/everyone-running-clawdbot</link>
  <guid isPermaLink="true">https://aiautomation.news/p/everyone-running-clawdbot</guid>
  <pubDate>Tue, 27 Jan 2026 11:00:33 +0000</pubDate>
  <atom:published>2026-01-27T11:00:33Z</atom:published>
    <dc:creator>Cagri Sarigoz</dc:creator>
  <content:encoded><![CDATA[
    <div class='beehiiv'><style>
  .bh__table, .bh__table_header, .bh__table_cell { border: 1px solid #F3F3F3; }
  .bh__table_cell { padding: 5px; background-color: #FFFFFFFF; }
  .bh__table_cell p { color: #2D2D2D; font-family: 'Helvetica',Arial,sans-serif !important; overflow-wrap: break-word; }
  .bh__table_header { padding: 5px; background-color:#E5E9EFFF; }
  .bh__table_header p { color: #000000; font-family:'Work Sans','Lucida Grande',Verdana,sans-serif !important; overflow-wrap: break-word; }
</style><div class='beehiiv__body'><p class="paragraph" style="text-align:left;"><i>Hey everyone, welcome to the very first (and very delayed) email from AI Automation News (formerly Workflow Automation Tools Newsletter)!</i></p><p class="paragraph" style="text-align:left;"><i>Since Clawdbot is very popular these days, I wanted to kickstart the newsletter with a post about it.</i></p><p class="paragraph" style="text-align:left;"><i>Happy reading!</i></p><hr class="content_break"><p class="paragraph" style="text-align:left;">Running a self-hosted AI agent means you&#39;re responsible for everything: code, configs, credentials, and the entire server. Lose any of these, and you&#39;re starting from scratch.</p><p class="paragraph" style="text-align:left;">After setting up <a class="link" href="https://github.com/clawdbot/clawdbot?utm_source=aiautomation.news&utm_medium=newsletter&utm_campaign=everyone-s-running-clawdbot" target="_blank" rel="noopener noreferrer nofollow">Clawdbot</a> on a <a class="link" href="https://my.racknerd.com/aff.php?aff=11910&pid=907&utm_source=aiautomation.news&utm_medium=newsletter&utm_campaign=everyone-s-running-clawdbot" target="_blank" rel="noopener noreferrer nofollow">$5/mo VPS on Racknerd</a>, I built a 3-layer backup system that costs under $1/month and can restore a complete server in 30 minutes.</p><h2 class="heading" style="text-align:left;" id="the-problem">The Problem</h2><p class="paragraph" style="text-align:left;">A typical self-hosted setup has three types of data:</p><div style="padding:16px 16px 16px;"><table class="bh__table" width="100%" style="border-collapse:collapse;"><tr class="bh__table_row"><th class="bh__table_header" width="33%"><p class="paragraph" style="text-align:left;">Type</p></th><th class="bh__table_header" width="33%"><p class="paragraph" style="text-align:left;">Examples</p></th><th class="bh__table_header" width="33%"><p class="paragraph" style="text-align:left;">Risk if Lost</p></th></tr><tr 
class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;"><b>Code & Config</b></p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Scripts, agent configs, memory files</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Hours of work</p></td></tr><tr class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;"><b>Secrets</b></p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">API tokens, OAuth credentials, keystores</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Auth headaches</p></td></tr><tr class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;"><b>System State</b></p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Docker volumes, OS configs, packages</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Full rebuild</p></td></tr></table></div><p class="paragraph" style="text-align:left;">Each needs different handling. You can&#39;t push secrets to GitHub. You shouldn&#39;t pay for full-system backups of recoverable OS files.</p><h2 class="heading" style="text-align:left;" id="the-solution-3-layers">The Solution: 3 Layers</h2><div class="codeblock"><pre><code>┌─────────────────────────────────────────────────────┐
│                   BACKUP LAYERS                     │
├──────────────┬──────────────┬───────────────────────┤
│   GitHub     │ Cloudflare   │    Backblaze B2       │
│   (Free)     │ R2 (Free)    │    (~$0.25/mo)        │
├──────────────┼──────────────┼───────────────────────┤
│ • Scripts    │ • API tokens │ • Docker volumes      │
│ • Configs    │ • OAuth keys │ • /etc configs        │
│ • Memory     │ • Keystores  │ • Home directories    │
│ • Skills     │ • .env files │ • Package lists       │
├──────────────┼──────────────┼───────────────────────┤
│ Versioned    │ Encrypted    │ GFS Retention         │
│ Daily push   │ Daily sync   │ 7 daily/4 wk/12 mo    │
└──────────────┴──────────────┴───────────────────────┘
</code></pre></div><h3 class="heading" style="text-align:left;" id="layer-1-git-hub-code-config">Layer 1: GitHub (Code & Config)</h3><p class="paragraph" style="text-align:left;"><b>What:</b> Version-controlled workspace with scripts, agent configs, and memory files.</p><p class="paragraph" style="text-align:left;"><b>Why GitHub:</b> Free, versioned, searchable. You can diff changes and roll back mistakes.</p><p class="paragraph" style="text-align:left;"><b>Setup:</b></p><div class="codeblock"><pre><code># Initialize repo
cd ~/workspace
git init
git remote add origin git@github.com:username/my-agent-workspace.git

# Create .gitignore for secrets
cat &gt; .gitignore &lt;&lt; &#39;EOF&#39;
.env
*.env
**/credentials.json
**/token*.json
**/*.pem
**/*.key
EOF

# Initial commit
git add -A
git commit -m &quot;Initial backup&quot;
git push -u origin main
</code></pre></div><p class="paragraph" style="text-align:left;"><b>Automate daily backups</b> (cron or your agent&#39;s scheduler):</p><div class="codeblock"><pre><code>#!/bin/bash
cd ~/workspace
git add -A
if ! git diff --cached --quiet; then
  git commit -m &quot;Daily backup $(date +%Y-%m-%d)&quot;
  git push
fi
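
# Example crontab entry (illustrative path; 06:00 UTC per the schedule below):
#   0 6 * * * /home/user/workspace/backup.sh &gt;&gt; /var/log/workspace-backup.log 2&gt;&amp;1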
</code></pre></div><h3 class="heading" style="text-align:left;" id="layer-2-cloudflare-r-2-secrets">Layer 2: Cloudflare R2 (Secrets)</h3><p class="paragraph" style="text-align:left;"><b>What:</b> Encrypted backup of credentials, tokens, and sensitive configs.</p><p class="paragraph" style="text-align:left;"><b>Why R2:</b> S3-compatible, 10GB free tier, <b>zero egress fees</b> (critical for restores).</p><p class="paragraph" style="text-align:left;"><b>Setup:</b></p><ol start="1"><li><p class="paragraph" style="text-align:left;">Create an R2 bucket at the <a class="link" href="https://dash.cloudflare.com/?to=%2F%3Aaccount%2Fr2%2Fnew&utm_source=aiautomation.news&utm_medium=newsletter&utm_campaign=everyone-s-running-clawdbot" target="_blank" rel="noopener noreferrer nofollow">Cloudflare Dashboard</a></p></li><li><p class="paragraph" style="text-align:left;">Generate S3-compatible API token</p></li><li><p class="paragraph" style="text-align:left;">Configure rclone:</p></li></ol><div class="codeblock"><pre><code>cat &gt; ~/.config/rclone/rclone.conf &lt;&lt; &#39;EOF&#39;
[r2]
type = s3
provider = Cloudflare
access_key_id = YOUR_ACCESS_KEY
secret_access_key = YOUR_SECRET_KEY
endpoint = https://ACCOUNT_ID.r2.cloudflarestorage.com
acl = private
no_check_bucket = true
EOF
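
# Optional sanity check: list buckets to confirm the remote authenticates
# rclone lsd r2: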
</code></pre></div><p class="paragraph" style="text-align:left;"><b>Backup script:</b></p><div class="codeblock"><pre><code>#!/bin/bash
# backup-secrets.sh

BACKUP_DIR=&quot;/tmp/secrets-backup-$$&quot;
mkdir -p &quot;$BACKUP_DIR&quot;

# Collect secrets
cp -r ~/.config/myapp &quot;$BACKUP_DIR/&quot; 2&gt;/dev/null
cp -r ~/.clawdbot &quot;$BACKUP_DIR/&quot; 2&gt;/dev/null
find ~/workspace -name &quot;.env&quot; -exec cp --parents &#123;&#125; &quot;$BACKUP_DIR/&quot; \;

# Create a single dated archive (a glob here could also match stale archives
# left over from failed runs, which would break rclone copy)
ARCHIVE=&quot;/tmp/secrets-$(date +%Y-%m-%d).tar.gz&quot;
tar -czf &quot;$ARCHIVE&quot; -C &quot;$BACKUP_DIR&quot; .

# Upload to R2
rclone copy &quot;$ARCHIVE&quot; r2:my-bucket/secrets/
rclone copyto &quot;$ARCHIVE&quot; r2:my-bucket/secrets/latest.tar.gz

# Cleanup
rm -rf &quot;$BACKUP_DIR&quot; &quot;$ARCHIVE&quot;
</code></pre></div><h3 class="heading" style="text-align:left;" id="layer-3-backblaze-b-2-full-system">Layer 3: Backblaze B2 (Full System)</h3><p class="paragraph" style="text-align:left;"><b>What:</b> Complete system backup including Docker volumes, OS configs, and home directories.</p><p class="paragraph" style="text-align:left;"><b>Why B2:</b> Cheapest storage at $0.005/GB/month. GFS retention keeps costs predictable.</p><p class="paragraph" style="text-align:left;"><b>Setup:</b></p><ol start="1"><li><p class="paragraph" style="text-align:left;">Create a B2 bucket at the <a class="link" href="https://www.backblaze.com/cloud-storage?utm_source=aiautomation.news&utm_medium=newsletter&utm_campaign=everyone-s-running-clawdbot" target="_blank" rel="noopener noreferrer nofollow">Backblaze Dashboard</a></p></li><li><p class="paragraph" style="text-align:left;">Create application key with: <code>listFiles</code>, <code>readFiles</code>, <code>writeFiles</code>, <code>deleteFiles</code></p></li><li><p class="paragraph" style="text-align:left;">Configure rclone:</p></li></ol><div class="codeblock"><pre><code>cat &gt;&gt; ~/.config/rclone/rclone.conf &lt;&lt; &#39;EOF&#39;
[b2]
type = b2
account = YOUR_KEY_ID
key = YOUR_APPLICATION_KEY
EOF
</code></pre></div><p class="paragraph" style="text-align:left;"><b>Full system backup script:</b></p><div class="codeblock"><pre><code>#!/bin/bash
# host-backup.sh - Run on HOST, not in Docker

set -e

B2_BUCKET=&quot;my-system-backup&quot;
DATE=$(date +%Y-%m-%d)
DAY_OF_WEEK=$(date +%u)
DAY_OF_MONTH=$(date +%d)
BACKUP_DIR=&quot;/tmp/backup-$$&quot;

# Retention policy
KEEP_DAILY=7
KEEP_WEEKLY=4
KEEP_MONTHLY=12

mkdir -p &quot;$BACKUP_DIR&quot;

# 1. Docker volumes
mkdir -p &quot;$BACKUP_DIR/docker/volumes&quot;
for vol in $(docker volume ls -q); do
  docker run --rm -v &quot;$vol:/data:ro&quot; -v &quot;$BACKUP_DIR/docker/volumes:/backup&quot; \
    alpine tar -czf &quot;/backup/$&#123;vol&#125;.tar.gz&quot; -C /data .
done

# 2. Docker compose files
find /home \( -name &quot;docker-compose*.yml&quot; -o -name &quot;docker-compose*.yaml&quot; \) -exec cp --parents &#123;&#125; &quot;$BACKUP_DIR/docker/&quot; \;

# 3. System configs
tar -czf &quot;$BACKUP_DIR/etc.tar.gz&quot; --exclude=&#39;etc/ssl/certs&#39; -C / etc  # no leading slash: member names are etc/...
dpkg --get-selections &gt; &quot;$BACKUP_DIR/packages.txt&quot;
crontab -l &gt; &quot;$BACKUP_DIR/crontab.txt&quot; 2&gt;/dev/null || true

# 4. Home directories (excluding cache)
for user in /home/*; do
  username=$(basename &quot;$user&quot;)
  tar -czf &quot;$BACKUP_DIR/$&#123;username&#125;.tar.gz&quot; \
    --exclude=&#39;node_modules&#39; --exclude=&#39;.cache&#39; --exclude=&#39;.npm&#39; \
    -C /home &quot;$username&quot;
done

# Create final archive
ARCHIVE=&quot;/tmp/backup-$&#123;DATE&#125;.tar.gz&quot;
tar -czf &quot;$ARCHIVE&quot; -C &quot;$BACKUP_DIR&quot; .

# Upload with GFS rotation
rclone copy &quot;$ARCHIVE&quot; &quot;b2:$&#123;B2_BUCKET&#125;/daily/&quot;

[ &quot;$DAY_OF_WEEK&quot; = &quot;7&quot; ] &amp;&amp; rclone copy &quot;$ARCHIVE&quot; &quot;b2:$&#123;B2_BUCKET&#125;/weekly/&quot;
[ &quot;$DAY_OF_MONTH&quot; = &quot;01&quot; ] &amp;&amp; rclone copy &quot;$ARCHIVE&quot; &quot;b2:$&#123;B2_BUCKET&#125;/monthly/&quot;

rclone copyto &quot;$ARCHIVE&quot; &quot;b2:$&#123;B2_BUCKET&#125;/latest.tar.gz&quot;

# Apply retention
rclone delete &quot;b2:$&#123;B2_BUCKET&#125;/daily/&quot; --min-age &quot;$&#123;KEEP_DAILY&#125;d&quot;
rclone delete &quot;b2:$&#123;B2_BUCKET&#125;/weekly/&quot; --min-age &quot;$((KEEP_WEEKLY * 7))d&quot;
rclone delete &quot;b2:$&#123;B2_BUCKET&#125;/monthly/&quot; --min-age &quot;$((KEEP_MONTHLY * 31))d&quot;

# Cleanup
rm -rf &quot;$BACKUP_DIR&quot; &quot;$ARCHIVE&quot;
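
# Example crontab entry (illustrative path; 04:00 UTC per the schedule below):
#   0 4 * * * /root/bin/host-backup.sh &gt;&gt; /var/log/host-backup.log 2&gt;&amp;1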
</code></pre></div><h2 class="heading" style="text-align:left;" id="the-schedule">The Schedule</h2><p class="paragraph" style="text-align:left;">All backups run automatically:</p><div style="padding:16px 16px 16px;"><table class="bh__table" width="100%" style="border-collapse:collapse;"><tr class="bh__table_row"><th class="bh__table_header" width="33%"><p class="paragraph" style="text-align:left;">Time (UTC)</p></th><th class="bh__table_header" width="33%"><p class="paragraph" style="text-align:left;">What</p></th><th class="bh__table_header" width="33%"><p class="paragraph" style="text-align:left;">Where</p></th></tr><tr class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">04:00</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Full system</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Backblaze B2</p></td></tr><tr class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">06:00</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Workspace code</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">GitHub</p></td></tr><tr class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">06:30</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Secrets</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Cloudflare R2</p></td></tr></table></div><h2 class="heading" style="text-align:left;" id="disaster-recovery">Disaster Recovery</h2><p class="paragraph" style="text-align:left;">When things go wrong, here&#39;s the restore order:</p><p class="paragraph" style="text-align:left;"><b>Scenario 1: Lost a file</b></p><div class="codeblock"><pre><code># From GitHub
git checkout HEAD~1 -- path/to/file
</code></pre></div><p class="paragraph" style="text-align:left;"><b>Scenario 2: Corrupted secrets</b></p><div class="codeblock"><pre><code># From R2
rclone copy r2:my-bucket/secrets/latest.tar.gz /tmp/
mkdir -p /tmp/secrets-restore
tar -xzf /tmp/latest.tar.gz -C /tmp/secrets-restore  # archive paths are relative to the backup dir, not /
# then move each item back into place, e.g. cp -r /tmp/secrets-restore/.clawdbot ~/
</code></pre></div><p class="paragraph" style="text-align:left;"><b>Scenario 3: Server died, starting fresh</b></p><div class="codeblock"><pre><code># On new VPS
curl https://rclone.org/install.sh | sudo bash

# Configure rclone with B2 credentials, then:
rclone copy b2:my-system-backup/latest.tar.gz /tmp/
cd /tmp &amp;&amp; tar -xzf latest.tar.gz

# Restore packages
sudo dpkg --set-selections &lt; packages.txt
sudo apt-get dselect-upgrade -y

# Restore Docker volumes
for vol in docker/volumes/*.tar.gz; do
  name=$(basename &quot;$vol&quot; .tar.gz)
  docker volume create &quot;$name&quot;
  docker run --rm -v &quot;$name:/data&quot; -v &quot;$(pwd)/docker/volumes:/backup&quot; \
    alpine tar -xzf &quot;/backup/$&#123;name&#125;.tar.gz&quot; -C /data
done

# Restore home dirs
sudo tar -xzf username.tar.gz -C /home/

# Start containers
docker compose up -d
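
# Reinstall the scheduled jobs saved by the backup (crontab.txt was extracted
# into /tmp above), then confirm the containers came back up
crontab crontab.txt
docker ps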
</code></pre></div><h2 class="heading" style="text-align:left;" id="cost-breakdown">Cost Breakdown</h2><div style="padding:16px 16px 16px;"><table class="bh__table" width="100%" style="border-collapse:collapse;"><tr class="bh__table_row"><th class="bh__table_header" width="33%"><p class="paragraph" style="text-align:left;">Service</p></th><th class="bh__table_header" width="33%"><p class="paragraph" style="text-align:left;">Free Tier</p></th><th class="bh__table_header" width="33%"><p class="paragraph" style="text-align:left;">Monthly Cost</p></th></tr><tr class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">GitHub</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Unlimited private repos</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">$0</p></td></tr><tr class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Cloudflare R2</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">10GB storage</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">$0</p></td></tr><tr class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">Backblaze B2</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">10GB storage</p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;">~$0.25 (50GB)</p></td></tr><tr class="bh__table_row"><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;"><b>Total</b></p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;"></p></td><td class="bh__table_cell" width="33%"><p class="paragraph" style="text-align:left;"><b>~$0.25/month</b></p></td></tr></table></div><h2 class="heading" style="text-align:left;" 
id="key-takeaways">Key Takeaways</h2><ol start="1"><li><p class="paragraph" style="text-align:left;"><b>Separate concerns</b>: Code → GitHub, Secrets → R2, System → B2</p></li><li><p class="paragraph" style="text-align:left;"><b>Zero egress fees matter</b>: R2 for secrets means free restores</p></li><li><p class="paragraph" style="text-align:left;"><b>GFS retention</b>: Keep 7 daily + 4 weekly + 12 monthly without runaway costs</p></li><li><p class="paragraph" style="text-align:left;"><b>Automate everything</b>: If it&#39;s not automated, it won&#39;t happen</p></li><li><p class="paragraph" style="text-align:left;"><b>Test your restores</b>: A backup you&#39;ve never restored is not a backup</p></li></ol><p class="paragraph" style="text-align:left;">The best time to set up backups is before you need them. The second best time is now.</p><hr class="content_break"><p class="paragraph" style="text-align:left;"><i>This setup protects a self-hosted Clawdbot instance, but the pattern works for any Docker-based deployment.</i></p><hr class="content_break"><p class="paragraph" style="text-align:left;">Thanks for reading,</p><p class="paragraph" style="text-align:left;"><b>Cagri Sarigoz</b></p><p class="paragraph" style="text-align:left;">AI Automation News Founder</p></div></div>
  ]]></content:encoded>
</item>

  </channel>
</rss>
