How I migrated my S3 buckets to another AWS account

For many, many years, I've had multiple AWS accounts (personal and business). I first signed up for AWS with my own personal Amazon account, and I've long worried about what that might mean for me if anything happened to lock me out. We've all heard the stories of people getting their accounts suspended or closed (e.g., someone losing their entire Google account because of a Play Store incident) and how that upends everything for them.
Losing my Amazon account isn't a big deal (since I don't really care what Amazon does after I've received my deliveries), but losing my AWS account out of the blue would be incredibly disruptive. I don't want to end up on Hacker News looking for support.
Fearing such a situation, over the years I've moved most of my stuff across. Domains have been shifted to Namecheap, DNS has been shifted to CloudFlare, and hosting has moved to bare metal providers. One provider per function contains the blast radius if any of them decides to pull any tricks. But the one thing I'd always left alone was S3. S3 is incredibly cheap and effortless to keep, so moving everything out of this account was never a priority. One of the buckets was also the origin of a CloudFront distribution, and I knew it would be a bit of a hassle to transfer it all.
I decided to look at it today, though, and the process turned out to be remarkably simple. I thought about using bucket ACLs to properly transfer the bucket, which might have been a less manual option, but given that I hadn't looked at these in years, I decided to do some housekeeping and go through each bucket manually to determine what I did and didn't want to keep. Some of the buckets were for services I didn't use anymore.
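For the record, the less manual route nowadays would be a bucket policy rather than ACLs (AWS has been steering people away from ACLs for years): grant the destination account read access to the source bucket, then sync bucket-to-bucket from the destination account, with no local copy involved. A rough sketch, where 222222222222 is a placeholder for the destination account's ID, not a real account:

{
  "Version": "2012-10-17",
  "Statement": [{
    "Effect": "Allow",
    "Principal": { "AWS": "arn:aws:iam::222222222222:root" },
    "Action": ["s3:GetObject", "s3:ListBucket"],
    "Resource": ["arn:aws:s3:::bucket-1", "arn:aws:s3:::bucket-1/*"]
  }]
}

Save that as policy.json, attach it, and sync (profiles are explained in step 1 below):

# Attach the policy to the source bucket (account 1), then copy
# everything server-side from the destination account (account 2).
aws s3api put-bucket-policy --profile account-1 --bucket bucket-1 --policy file://policy.json
aws s3 sync --profile account-2 s3://bucket-1 s3://bucket-2

A nice side effect of running the sync from the destination account is that the copied objects end up owned by it, so there's no ownership cleanup afterwards. But I wanted the housekeeping pass, so manual it was.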
1. Set up credentials for the aws CLI tool for both accounts.
First, in ~/.aws/credentials, you can set access keys for all your accounts, making the entire process incredibly easy. You can use --profile account-1 with any of the CLI commands, and it'll use the correct access keys.
[account-1]
aws_access_key_id = ...
aws_secret_access_key = ...
[account-2]
aws_access_key_id = ...
aws_secret_access_key = ...
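Before doing anything destructive, it's worth a quick check that both profiles actually work:

# Each command should list the buckets in its respective account.
aws s3 ls --profile account-1
aws s3 ls --profile account-2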
2. Download & delete buckets you don't need anymore.

There are some buckets I am happy to delete, but being a digital hoarder, I still want to make sure I have a backup of them. I have a local home lab with a chunky mirrored ZFS array replicated off-site, so I am happy to store the S3 bucket backups there. My goal is to do housekeeping with my AWS accounts, not to get rid of the data.
In my case, I am not worried about the size of the buckets because they are small (these aren't powering production services, just personal projects and services).
Run the commands below, and that's it; the CLI takes care of the rest.
My ZFS array is mirrored, meaning that data isn't lost even if a hard drive dies. And even if all drives were to die at the same time, there's an off-site replica with everything in it, so this stuff is adequately backed up. That whole setup is a story for another post, but it works great for my needs here.
# This will download the entire S3 bucket onto your computer.
aws s3 sync --profile account-1 s3://bucket-1 /zfs/backups/bucket-1
# This will delete every file in the S3 bucket and then delete the bucket.
aws s3 rb --profile account-1 s3://bucket-1 --force
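One caveat worth flagging: rb --force is unrecoverable, so between the two commands I'd sanity-check that the local copy is complete. Comparing object counts is a crude but quick way to do that:

# Number of objects in the bucket vs. files in the local backup.
# The two counts should match before you run the rb command.
aws s3 ls --profile account-1 s3://bucket-1 --recursive | wc -l
find /zfs/backups/bucket-1 -type f | wc -l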
3. Create new buckets and transfer everything.

I don't need to worry about transferring the data from one bucket straight to another. That would definitely be a fun exercise (and a lengthier post), but since I can hold everything I had in S3 locally, I can afford to do this the manual way.
Create the new S3 bucket on your new account in the AWS Management Console.
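Or, if you'd rather not leave the terminal, the CLI can do it too:

# mb = make bucket. Bucket names are globally unique, which is why
# I'm using a new name here rather than reusing the old one.
aws s3 mb s3://bucket-2 --profile account-2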
Once that's done, syncing back is a single command:
# This will upload the entire folder from your computer onto S3.
# Note the different bucket name and profile parameter.
aws s3 sync /zfs/backups/bucket-1 s3://bucket-2 --profile account-2

And that's that!
4. Set up a new CloudFront distribution (if needed)
One of my buckets serves as my personal screenshot upload service. I have a script on my computers that automatically uploads screenshots whenever I take them (this is especially useful on macOS, combined with Cmd-Shift-4, which lets you take cropped screenshots).
I like to make those screenshots available via a CDN because I often link to them directly on websites and want them to load fast from anywhere in the world. Overkill? Certainly. But it was a fun learning experience when I set it up over 10 years ago (using a CDN for the first time), and it's virtually free, so... why not.
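I won't reproduce my exact script here, but the core of such a thing is tiny. A sketch, with screenshots-bucket and cdn.example.com as stand-ins for the real names:

#!/bin/sh
# Upload a screenshot to S3 and put its CDN URL on the clipboard.
file="$1"
name="$(date +%s)-$(basename "$file")"
aws s3 cp "$file" "s3://screenshots-bucket/$name" --profile account-2
printf 'https://cdn.example.com/%s' "$name" | pbcopy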
Things to remember/note:
- Don't forget to set a default root object (in my case, it's index.html).
- For a CloudFront distribution with an S3 origin, you can create a custom error response for 403 errors. I did it with error.html and even got it to respond with 404, so when you go to any random page, you get a neat 404 page rather than a bare S3 XML response (there's a config fragment after this list).
- Set up an SSL certificate with AWS Certificate Manager.
- Set the CloudFront distribution to redirect HTTP to HTTPS.
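If you're doing this through the CLI rather than the console (aws cloudfront create-distribution takes a JSON distribution config), the custom error response lives in a fragment shaped roughly like this; the TTL value here is just an example:

"CustomErrorResponses": {
  "Quantity": 1,
  "Items": [{
    "ErrorCode": 403,
    "ResponsePagePath": "/error.html",
    "ResponseCode": "404",
    "ErrorCachingMinTTL": 10
  }]
}

The 403 mapping is the one that matters because an S3 origin returns 403, not 404, for missing objects when the bucket doesn't allow public listing.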
That's it! This might not be anything special, but as I went through the process, I figured it wouldn't hurt to write about it. I want to get into the habit of writing about stuff as I'm doing it rather than keeping it to myself. Fun fact, though: I wrote this in December 2024 but only published it in December 2025. So I'm still not doing great on the whole "not keeping it to myself" thing, but I'm trying.
If you have any suggestions or thoughts, I'd love to hear from you; I'm always excited to talk about this stuff, so feel free to reach out.