Using Rclone to migrate data from IBM COS to locally mounted SAN storage

Hello everyone. I have been tasked with migrating about 200 TB of data off an on-premises IBM Cloud Object Storage (COS) system. The data was uploaded by IBM Spectrum Protect (previously known as TSM) backup software. The idea is to temporarily move this data off the system to buy us some time until we can expand it in the near future. We will be using either a Red Hat Linux or Windows 10 host attached to SAN block storage via Fibre Channel. I suspect that preserving metadata on the way to and from both storage systems will be essential for the software to read the data properly after it is returned to object storage. I would appreciate any tips and caveats on rclone's usage, specifically around metadata preservation.

Thank you!

What backend does the IBM Cloud Object Storage system use? Is it Swift, S3, or something else?

I understand you want to copy the data to local storage - is that correct?

By metadata, do you mean the metadata on the objects? There isn't a way of preserving that on local storage yet, so it would be best to double-check whether you actually need the metadata, I think.
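If you want to check, something like this will show whether the objects carry any user-defined metadata. A minimal sketch: the bucket and object names are placeholders, and it assumes s3cmd is already configured (via `s3cmd --configure`) against your COS endpoint.

```
# Inspect one object's headers; bucket and key are placeholders.
s3cmd info s3://my-bucket/some/object
# Any "x-amz-meta-*" lines in the output are user metadata that a plain
# copy to local disk would not preserve.
```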

One thing you could do is set up MinIO, which acts as an S3 server - rclone can preserve the metadata on that sort of copy (and if it can't, I can make it do so!).
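Roughly what that copy would look like, as a sketch - the remote names "ibmcos" and "minio" are placeholders, and both would need to be set up as S3-type remotes first:

```
# Copy between two S3-type remotes; having S3 on both ends is what lets
# rclone carry the object metadata across. Names are placeholders, and
# transfer/checker counts are just starting points to tune.
rclone copy ibmcos:source-bucket minio:holding-bucket --progress --transfers 16 --checkers 32

# Then verify that sizes and hashes match on both sides.
rclone check ibmcos:source-bucket minio:holding-bucket
```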

Hello Nick,

The IBM COS back end uses S3 for its storage. Yes, I am looking to copy it to "local" SAN drives. Thanks for the MinIO suggestion, but at first glance it looks as though it only works with JBODs rather than SAN-attached drives.
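For reference, here is roughly how the source remote is set up - rclone's S3 backend has an IBMCOS provider choice; the endpoint and credentials below are placeholders for our on-prem Accesser:

```
# rclone.conf entry for the on-prem COS source (values are placeholders):
#   [ibmcos]
#   type = s3
#   provider = IBMCOS
#   access_key_id = YOUR_ACCESS_KEY
#   secret_access_key = YOUR_SECRET_KEY
#   endpoint = https://cos-accesser.example.internal
```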

Do you definitely need the metadata?

How many objects are there?

MinIO can just run off a local disk - that is how I use it.
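For example, something like this, assuming the SAN LUN is mounted as an ordinary filesystem - the path and credentials are placeholders, and note that newer MinIO releases renamed the credential env vars:

```
# Run MinIO with its data directory on the SAN-backed filesystem.
export MINIO_ACCESS_KEY=localadmin        # MINIO_ROOT_USER on newer releases
export MINIO_SECRET_KEY=change-me-please  # MINIO_ROOT_PASSWORD on newer releases
minio server /mnt/san/minio-data --address :9000

# Matching rclone remote to point at it (add to rclone.conf):
#   [minio]
#   type = s3
#   provider = Minio
#   access_key_id = localadmin
#   secret_access_key = change-me-please
#   endpoint = http://127.0.0.1:9000
```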

We still need to determine whether the metadata is actually required. There are a few thousand objects.

If you only have a few thousand objects, you could potentially back the metadata up separately, maybe with s3cmd, and stick it back on afterwards.
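A sketch of that approach with s3cmd - the bucket name is a placeholder, it assumes s3cmd is configured against the COS endpoint, and the simple awk split assumes no spaces in the object keys:

```
# Dump every object's headers/metadata to a file before the migration.
s3cmd ls -r s3://source-bucket | awk '{print $4}' | while read -r obj; do
    echo "== $obj"
    s3cmd info "$obj"
done > metadata-backup.txt

# Afterwards, metadata can be re-applied per object ("modify" rewrites
# the object's headers with a server-side copy), e.g.:
#   s3cmd modify --add-header="x-amz-meta-example:value" s3://source-bucket/some/object
```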
