S3 is more than just file storage. Used correctly, it can significantly reduce costs and improve performance. Here are practical tips from production experience.
## Use Presigned URLs for Uploads
Let clients upload directly to S3 instead of routing through your server:
```javascript
const AWS = require('aws-sdk')
const s3 = new AWS.S3()

async function getUploadUrl(filename, contentType) {
  const params = {
    Bucket: 'my-bucket',
    Key: `uploads/${Date.now()}-${filename}`,
    ContentType: contentType,
    Expires: 300 // URL valid for 5 minutes
  }
  return s3.getSignedUrlPromise('putObject', params)
}
```
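On the client side, the returned URL accepts a plain HTTP `PUT`. A minimal browser-side sketch, where `/api/upload-url` is a hypothetical route that would call `getUploadUrl` on the server:

```javascript
// Sketch: ask the server for a presigned URL, then PUT the file bytes
// straight to S3. The /api/upload-url route is hypothetical.
async function uploadFile(file) {
  const res = await fetch(
    `/api/upload-url?filename=${encodeURIComponent(file.name)}&type=${encodeURIComponent(file.type)}`
  )
  const { url } = await res.json()

  // The Content-Type header must match the ContentType the URL was signed
  // with, otherwise S3 rejects the request with a 403.
  const put = await fetch(url, {
    method: 'PUT',
    headers: { 'Content-Type': file.type },
    body: file
  })
  if (!put.ok) throw new Error(`Upload failed: ${put.status}`)
  return url.split('?')[0] // object URL without the signature query string
}
```

A mismatched `Content-Type` is the most common failure mode here: S3 returns `403 SignatureDoesNotMatch` when the header sent differs from the one the URL was signed with.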
The client can now upload directly to the returned URL, so the file bytes never pass through your server.

## Set Lifecycle Rules
Lifecycle rules automatically transition or delete old objects to cut storage costs. They can be set in the AWS Console or via the CLI.
```json
{
  "Rules": [
    {
      "ID": "MoveToGlacier",
      "Status": "Enabled",
      "Filter": { "Prefix": "archives/" },
      "Transitions": [
        {
          "Days": 90,
          "StorageClass": "GLACIER"
        }
      ]
    },
    {
      "ID": "DeleteOldLogs",
      "Status": "Enabled",
      "Filter": { "Prefix": "logs/" },
      "Expiration": { "Days": 30 }
    }
  ]
}
```

## Quick Tips
- Use CloudFront in front of S3 for faster global access
- Enable versioning for important buckets
- Set proper CORS configuration for web uploads
- Use server-side encryption (SSE-S3) for sensitive data
- Consider S3 Transfer Acceleration for large file uploads from distant locations
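The CORS point matters especially for the presigned-upload pattern above: the browser's `PUT` to S3 is a cross-origin request, so the bucket must allow it. A minimal bucket CORS configuration, with a placeholder origin:

```json
{
  "CORSRules": [
    {
      "AllowedOrigins": ["https://example.com"],
      "AllowedMethods": ["PUT", "GET"],
      "AllowedHeaders": ["Content-Type"],
      "MaxAgeSeconds": 3000
    }
  ]
}
```

This can be applied with `aws s3api put-bucket-cors --bucket my-bucket --cors-configuration file://cors.json`.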