Super Slurper
Super Slurper allows you to quickly and easily copy objects from other cloud providers to an R2 bucket of your choice.
Migration jobs:
- Preserve custom object metadata from the source bucket by copying it onto the migrated objects in R2.
- Do not delete any objects from the source bucket.
- Use TLS encryption over HTTPS connections for safe and private object transfers.
When to use Super Slurper
Using Super Slurper as part of your strategy can be a good choice if the cloud storage bucket you are migrating consists primarily of objects less than 1 TB. Objects greater than 1 TB will be skipped and need to be copied separately.
For migration use cases that do not meet the above criteria, we recommend using tools such as rclone.
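For reference, copying an oversized object yourself amounts to streaming it out of the source bucket and into R2 through R2's S3-compatible API, which is what tools like rclone automate for you. The following is a minimal sketch, assuming boto3 and placeholder bucket names, endpoint, and credentials; it is not part of Super Slurper itself.

```python
import boto3
from boto3.s3.transfer import TransferConfig

# Placeholders: source S3 bucket, destination R2 bucket, and the object to copy.
SOURCE_BUCKET = "my-source-bucket"
DEST_BUCKET = "my-r2-bucket"
OBJECT_KEY = "large/object.bin"

source = boto3.client("s3")  # AWS credentials taken from the environment
dest = boto3.client(
    "s3",
    endpoint_url="https://<ACCOUNT_ID>.r2.cloudflarestorage.com",  # R2 S3-compatible endpoint
    aws_access_key_id="<R2_ACCESS_KEY_ID>",
    aws_secret_access_key="<R2_SECRET_ACCESS_KEY>",
)

# Stream the object from the source and multipart-upload it to R2
# without buffering the whole object in memory.
body = source.get_object(Bucket=SOURCE_BUCKET, Key=OBJECT_KEY)["Body"]
dest.upload_fileobj(
    body,
    DEST_BUCKET,
    OBJECT_KEY,
    Config=TransferConfig(multipart_chunksize=100 * 1024 * 1024),
)
```

Tools like rclone add retries, parallelism, and checksum verification on top of this; the sketch only shows the shape of the transfer.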
Use Super Slurper to migrate data to R2
- From the Cloudflare dashboard, select R2 > Data Migration.
- Select Migrate files.
- Select the source cloud storage provider that you will be migrating data from.
- Enter your source bucket name and associated credentials and select Next.
- Enter your R2 bucket name and associated credentials and select Next.
- After you finish reviewing the details of your migration, select Migrate files.
You can view the status of your migration job at any time by selecting it from the Data Migration page.
Source bucket options
Bucket sub path (optional)
This setting specifies the prefix within the source bucket that objects will be copied from. For example, a sub path of images/ limits the migration to objects whose keys begin with images/.
Destination R2 bucket options
Overwrite files?
This setting determines what happens when an object being copied from the source storage bucket matches the path of an existing object in the destination R2 bucket. There are two options: overwrite (default) and skip.
Supported cloud storage providers
Cloudflare currently supports copying data from the following cloud object storage providers to R2:
- Amazon S3
- Cloudflare R2
- Google Cloud Storage (GCS)
- All S3-compatible storage providers
Tested S3-compatible storage providers
The following S3-compatible storage providers have been tested and verified to work with Super Slurper:
- Backblaze B2
- DigitalOcean Spaces
- Scaleway Object Storage
- Wasabi Cloud Object Storage
Super Slurper should support transfers from all S3-compatible storage providers, but the ones listed have been explicitly tested.
Create credentials for storage providers
Amazon S3
To copy objects from Amazon S3, Super Slurper requires access permissions to your S3 bucket. While you can use any AWS Identity and Access Management (IAM) user credentials with the correct permissions, Cloudflare recommends you create a user with a narrow set of permissions.
To create credentials with the correct permissions:
- Log in to your AWS IAM account.
- Create a policy with the following format and replace <BUCKET_NAME> with the bucket you want to grant access to:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:Get*", "s3:List*"],
      "Resource": [
        "arn:aws:s3:::<BUCKET_NAME>",
        "arn:aws:s3:::<BUCKET_NAME>/*"
      ]
    }
  ]
}
```
- Create a new user and attach the created policy to that user.
You can now use both the Access Key ID and Secret Access Key when defining your source bucket.
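If you prefer to script this setup instead of using the AWS console, the same policy, user, and access key can be created with the AWS SDK. The following is a minimal sketch using boto3; the policy and user names are illustrative, not required by Super Slurper.

```python
import json
import boto3

iam = boto3.client("iam")

BUCKET_NAME = "my-source-bucket"  # illustrative bucket name

# Read-only policy scoped to the source bucket (same shape as the JSON above).
policy_document = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": ["s3:Get*", "s3:List*"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET_NAME}",
                f"arn:aws:s3:::{BUCKET_NAME}/*",
            ],
        }
    ],
}

policy = iam.create_policy(
    PolicyName="super-slurper-read-only",  # illustrative name
    PolicyDocument=json.dumps(policy_document),
)

# Dedicated user that holds only this policy.
iam.create_user(UserName="super-slurper-migration")
iam.attach_user_policy(
    UserName="super-slurper-migration",
    PolicyArn=policy["Policy"]["Arn"],
)

# Access key pair to enter as the source bucket credentials.
key = iam.create_access_key(UserName="super-slurper-migration")
print(key["AccessKey"]["AccessKeyId"], key["AccessKey"]["SecretAccessKey"])
```

Super Slurper only needs to read and list objects in the source bucket, which is why the policy stops at s3:Get* and s3:List*.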
Google Cloud Storage
To copy objects from Google Cloud Storage (GCS), Super Slurper requires access permissions to your GCS bucket. You can use the Google Cloud predefined Storage Admin role, but Cloudflare recommends creating a custom role with a narrower set of permissions.
To create a custom role with the necessary permissions:
- Log in to your Google Cloud console.
- Go to IAM & Admin > Roles.
- Find the Storage Object Viewer role and select Create role from this role.
- Give your new role a name.
- Select Add permissions and add the storage.buckets.get permission.
- Select Create.
To create credentials with your custom role:
- Log in to your Google Cloud console.
- Go to IAM & Admin > Service Accounts.
- Create a service account with your custom role.
- Go to the Keys tab of the service account you created.
- Select Add Key > Create a new key and download the JSON key file.
You can now use this JSON key file when enabling Super Slurper.
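As a quick sanity check before starting the migration, you can confirm that the key file grants the permissions Super Slurper needs (reading bucket metadata and listing objects). The following is a minimal sketch using the google-cloud-storage Python client; the key file path and bucket name are placeholders.

```python
from google.cloud import storage  # pip install google-cloud-storage

# Placeholders: path to the downloaded JSON key and the bucket to migrate.
KEY_FILE = "service-account-key.json"
BUCKET_NAME = "my-source-bucket"

client = storage.Client.from_service_account_json(KEY_FILE)

# storage.buckets.get: read bucket metadata.
bucket = client.get_bucket(BUCKET_NAME)
print("Bucket:", bucket.name, "location:", bucket.location)

# storage.objects.list / storage.objects.get: enumerate a few objects.
for blob in client.list_blobs(BUCKET_NAME, max_results=5):
    print(blob.name, blob.size)
```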
Caveats
While R2's ETag generation is compatible with S3's during the regular course of operations, ETags are not guaranteed to be equal when an object is migrated using Super Slurper. Super Slurper makes autonomous decisions about the operations it uses when migrating objects to optimize for performance and network usage. It may choose to migrate an object in multiple parts, which affects ETag calculation.
For example, a 320 MiB object originally uploaded to S3 using a single PutObject operation might be migrated to R2 via multipart operations. In this case, its ETag on R2 will not be the same as its ETag on S3.
Similarly, an object originally uploaded to S3 using multipart operations might also have a different ETag on R2 if the part sizes Super Slurper chooses for its migration differ from the part sizes this object was originally uploaded with.
Relying on matching ETags before and after the migration is therefore discouraged.
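To see why the values diverge: a single-part upload's ETag is the MD5 of the whole object, while a multipart upload's ETag is conventionally the MD5 of the concatenated per-part MD5 digests followed by -<part count>. The sketch below illustrates the difference with an assumed 100 MiB part size; the part size Super Slurper actually picks may differ.

```python
import hashlib

PART_SIZE = 100 * 1024 * 1024  # assumed part size; the real value is chosen by the uploader

def single_part_etag(data: bytes) -> str:
    # ETag for a simple PutObject: the MD5 of the whole object.
    return hashlib.md5(data).hexdigest()

def multipart_etag(data: bytes, part_size: int = PART_SIZE) -> str:
    # ETag for a multipart upload: MD5 of the concatenated part MD5s, plus "-<parts>".
    part_digests = [
        hashlib.md5(data[i:i + part_size]).digest()
        for i in range(0, len(data), part_size)
    ]
    combined = hashlib.md5(b"".join(part_digests)).hexdigest()
    return f"{combined}-{len(part_digests)}"

obj = b"x" * (320 * 1024 * 1024)  # stand-in for a 320 MiB object
print(single_part_etag(obj))   # ETag if uploaded with a single PutObject
print(multipart_etag(obj))     # ETag if migrated in 100 MiB parts, e.g. "...-4"
```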
Archive storage classes
Objects stored using AWS S3 archival storage classes will be skipped and need to be copied separately. Specifically:
- Files stored using S3 Glacier tiers (not including Glacier Instant Retrieval) will be skipped and logged in the migration log.
- Files stored using S3 Intelligent Tiering and placed in Deep Archive tier will be skipped and logged in the migration log.
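If you want to know up front which objects will be skipped, you can scan the source bucket for archival storage classes before starting the migration. The following is a minimal boto3 sketch with a placeholder bucket name; Glacier Instant Retrieval objects are migrated normally, so they are not flagged here.

```python
import boto3

BUCKET_NAME = "my-source-bucket"  # placeholder

# Storage classes that will be skipped (Glacier Instant Retrieval is not skipped).
SKIPPED_CLASSES = {"GLACIER", "DEEP_ARCHIVE"}

s3 = boto3.client("s3")
paginator = s3.get_paginator("list_objects_v2")

for page in paginator.paginate(Bucket=BUCKET_NAME):
    for obj in page.get("Contents", []):
        if obj.get("StorageClass") in SKIPPED_CLASSES:
            print(obj["Key"], obj["StorageClass"])
```

Note that Intelligent-Tiering objects that have moved into an archive tier still report INTELLIGENT_TIERING in this listing; detecting those requires a HeadObject call per object and checking its ArchiveStatus field.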