# Cloud Storage Integration

Rsync doesn't natively support cloud storage, but combined with rclone you can push backups to AWS S3, Google Cloud Storage, Backblaze B2, DigitalOcean Spaces, and 40+ other providers — completing your offsite backup strategy.
## Rsync + Rclone Architecture

```mermaid
flowchart LR
    SRC["Production Server"] -->|"rsync"| LOCAL["Local Backup"]
    LOCAL -->|"rsync"| REMOTE["Remote Server"]
    LOCAL -->|"rclone"| CLOUD["Cloud Storage<br/>(S3, GCS, B2)"]
```

- Rsync handles server-to-server sync (fast, delta transfer)
- Rclone handles server-to-cloud sync (speaks cloud provider APIs)
## Setting Up Rclone

### Install

```bash
# One-line install
curl https://rclone.org/install.sh | sudo bash

# Or via package manager
sudo apt install rclone    # Debian/Ubuntu
sudo dnf install rclone    # Fedora/RHEL/CentOS
```
### Configure a Remote

```bash
rclone config
```

Follow the interactive wizard to set up your cloud provider. Common examples:
| Provider | Type | Notes |
|---|---|---|
| AWS S3 | s3 | Access key + secret key |
| Google Cloud Storage | google cloud storage | Service account JSON |
| Backblaze B2 | b2 | Application key |
| DigitalOcean Spaces | s3 | S3-compatible endpoint |
| Cloudflare R2 | s3 | S3-compatible, no egress fees |
| Google Drive | drive | OAuth token |
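The wizard writes your choices to `~/.config/rclone/rclone.conf`. A hand-written equivalent looks roughly like this — remote names, keys, and region are placeholders, not working credentials:

```ini
# ~/.config/rclone/rclone.conf (sketch; all values are placeholders)

[s3-backup]
type = s3
provider = AWS
access_key_id = <access-key-id>
secret_access_key = <secret-access-key>
region = us-east-1

[b2-backup]
type = b2
account = <application-key-id>
key = <application-key>
```

Prefer `rclone config` over hand-editing; it validates the provider-specific fields for you.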
### Verify Configuration

```bash
# List configured remotes
rclone listremotes

# Test connectivity to a remote
rclone lsd myremote:

# List buckets/containers
rclone lsd s3-backup:
```
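Backup scripts can guard on this check before uploading. A small sketch — the remote name `s3-backup` is an example, and the function reads `rclone listremotes` output on stdin so the logic itself can be exercised without rclone installed:

```bash
#!/bin/bash
# has_remote NAME — succeed if "NAME:" appears in `rclone listremotes`
# output, which is read from stdin
has_remote() {
    grep -qx "$1:"
}

# typical usage in a backup script:
#   rclone listremotes | has_remote s3-backup || { echo "remote missing" >&2; exit 1; }
printf 's3-backup:\ncrypt-remote:\n' | has_remote s3-backup && echo "configured"   # prints "configured"
```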
## Backup Workflows

### Local Rsync + Cloud Upload

```bash
#!/bin/bash
# backup-to-cloud.sh
TODAY=$(date +%F)   # capture once so the date can't change mid-run

# Step 1: Incremental local backup with rsync (hardlinks unchanged files)
rsync -av --link-dest=/backup/latest \
    /var/www/html/ /backup/$TODAY/
ln -sfn /backup/$TODAY /backup/latest

# Step 2: Upload to cloud with rclone
rclone sync /backup/$TODAY/ \
    s3-backup:my-bucket/backups/$TODAY/ \
    --transfers=8
```
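The `latest` symlink is what makes `--link-dest` work across runs. The rotation can be exercised on its own with throwaway directories — the paths below come from `mktemp`, not real backup paths:

```bash
#!/bin/bash
# demonstrate the dated-directory + `latest` symlink rotation
set -euo pipefail
ROOT=$(mktemp -d)

for day in 2024-01-14 2024-01-15; do
    mkdir -p "$ROOT/$day"
    # -n: replace the old symlink itself instead of creating a
    # new link *inside* the directory it points at
    ln -sfn "$ROOT/$day" "$ROOT/latest"
done

readlink "$ROOT/latest"    # path ending in .../2024-01-15
```

Without `-n`, the second `ln` would put a stray symlink inside the previous day's directory instead of repointing `latest`.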
### Compressed + Encrypted Upload

```bash
#!/bin/bash
# secure-cloud-backup.sh
TIMESTAMP=$(date +%F)

# Create compressed archive
tar -czf /tmp/backup-$TIMESTAMP.tar.gz /backup/latest/

# Encrypt with GPG (symmetric: you'll be prompted for a passphrase,
# which you need again at restore time)
gpg --symmetric --cipher-algo AES256 \
    --output /tmp/backup-$TIMESTAMP.tar.gz.gpg \
    /tmp/backup-$TIMESTAMP.tar.gz

# Upload encrypted file to cloud
rclone copy /tmp/backup-$TIMESTAMP.tar.gz.gpg \
    s3-backup:my-bucket/encrypted/

# Clean up temp files
rm /tmp/backup-$TIMESTAMP.tar.gz /tmp/backup-$TIMESTAMP.tar.gz.gpg
```
### Using Rclone's Built-In Encryption

```bash
# Set up an encrypted remote (during rclone config):
#   Type:   crypt
#   Remote: s3-backup:my-bucket/encrypted/
# Then use it like any other remote:
rclone sync /backup/latest/ crypt-remote:backups/$(date +%F)/
```
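Under the hood, the wizard adds a section like this to `rclone.conf` — the names are examples, and the password values must be generated by `rclone config` (or `rclone obscure`), never pasted in plain text:

```ini
[crypt-remote]
type = crypt
remote = s3-backup:my-bucket/encrypted/
filename_encryption = standard
password = <obscured-password>
password2 = <obscured-salt>
```

Keep a copy of these two password values somewhere safe: without them, the encrypted data in the bucket is unrecoverable.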
> **Tip:** Rclone's crypt remote transparently encrypts file names and contents. You don't need to GPG-encrypt anything manually.
## Provider-Specific Examples

### AWS S3

```bash
# Sync to S3 with a cheaper storage class
rclone sync /backup/latest/ s3:my-bucket/backups/ \
    --s3-storage-class STANDARD_IA

# Use Glacier for long-term archives
rclone copy /backup/monthly/ s3:my-bucket/archives/ \
    --s3-storage-class GLACIER
```

### Backblaze B2

```bash
# B2 is cost-effective for large backups
rclone sync /backup/latest/ b2:my-backup-bucket/$(date +%F)/
```

### Cloudflare R2

```bash
# R2: S3-compatible, zero egress fees
rclone sync /backup/latest/ r2:my-bucket/backups/$(date +%F)/
```
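R2 is configured as an `s3`-type remote pointing at Cloudflare's endpoint. A sketch of the config section — the account ID and keys are placeholders:

```ini
[r2]
type = s3
provider = Cloudflare
access_key_id = <r2-access-key-id>
secret_access_key = <r2-secret-access-key>
endpoint = https://<account-id>.r2.cloudflarestorage.com
```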
## Cloud Retention Policies

### Using Rclone to Clean Up

```bash
# Delete cloud backups older than 30 days
rclone delete s3:my-bucket/backups/ --min-age 30d

# Remove directories left empty by the deletion
rclone rmdirs s3:my-bucket/backups/
```
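The same 30-day policy is worth applying to the local dated directories from the workflow above. A sketch using only `find` — it runs against a throwaway tree here; in practice you would point `BACKUP_ROOT` at `/backup` (note it uses directory mtime as a proxy for backup age):

```bash
#!/bin/bash
# prune dated local backup directories older than 30 days
set -euo pipefail
BACKUP_ROOT=$(mktemp -d)          # stand-in for /backup

# fake one stale and one fresh dated directory
mkdir -p "$BACKUP_ROOT/2024-01-01" "$BACKUP_ROOT/$(date +%F)"
touch -d '45 days ago' "$BACKUP_ROOT/2024-01-01"

# delete top-level dated dirs not modified in the last 30 days
find "$BACKUP_ROOT" -mindepth 1 -maxdepth 1 -type d -mtime +30 \
    -exec rm -rf {} +

ls "$BACKUP_ROOT"    # only today's directory remains
```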
### Using Cloud Provider Lifecycle Rules

Most providers support automatic lifecycle policies:

- S3: Transition to Glacier after 30 days, delete after 365 days
- B2: Lifecycle rules to auto-delete old file versions
- GCS: Object lifecycle management
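For S3, the first bullet corresponds to a lifecycle configuration along these lines — the prefix is an example, and you would apply it with `aws s3api put-bucket-lifecycle-configuration`:

```json
{
  "Rules": [
    {
      "ID": "backup-retention",
      "Status": "Enabled",
      "Filter": { "Prefix": "backups/" },
      "Transitions": [
        { "Days": 30, "StorageClass": "GLACIER" }
      ],
      "Expiration": { "Days": 365 }
    }
  ]
}
```

Lifecycle rules run server-side, so they keep working even if your backup host is down — a useful backstop to rclone-based cleanup.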
## Rclone vs Rsync Comparison
| Feature | Rsync | Rclone |
|---|---|---|
| Protocol | SSH/rsync daemon | Cloud provider APIs |
| Destinations | Servers with SSH access | 40+ cloud providers |
| Delta transfer | Block-level | File-level only |
| Encryption | Via SSH (in transit) | Built-in crypt remote |
| Speed | Very fast (delta) | Depends on API/bandwidth |
| Best for | Server-to-server | Server-to-cloud |
## Common Pitfalls
| Pitfall | Consequence | Prevention |
|---|---|---|
| No local copy, cloud only | Slow restore (download GB+ first) | Keep at least one local backup |
| Unencrypted cloud backups | Exposed credentials, data breach | Use rclone crypt or GPG |
| No cloud retention policy | Costs spiral as old backups accumulate | Set lifecycle rules or automated cleanup |
| API credentials in plain text | Compromised cloud account | Use rclone config (encrypted) or env vars |
| Heavy sync during business hours | Bandwidth competition | Schedule cloud uploads overnight |
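The last pitfall is easiest to avoid with scheduling. A sketch crontab entry — the script path refers to the `backup-to-cloud.sh` example above and is a placeholder:

```
# m h dom mon dow  command
# run the combined rsync + rclone backup at 02:00, outside business hours
0 2 * * * /usr/local/bin/backup-to-cloud.sh >> /var/log/backup.log 2>&1
```

If overnight-only scheduling isn't possible, rclone's `--bwlimit` flag (e.g. `--bwlimit 10M`) caps upload bandwidth instead.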
## Quick Reference

```bash
# Configure rclone
rclone config

# Upload backup to cloud
rclone sync /backup/latest/ s3:my-bucket/backups/

# Encrypted upload (via a crypt remote)
rclone sync /backup/latest/ crypt-remote:backups/

# Download from cloud (restore)
rclone sync s3:my-bucket/backups/2024-01-15/ /restore/

# List cloud backups
rclone ls s3:my-bucket/backups/

# Clean old cloud backups
rclone delete s3:my-bucket/backups/ --min-age 30d
```
## What's Next
- Backup Strategies — Design the backup architecture
- Disaster Recovery — Restore from cloud backups
- Secure Transfer — Protect backup data