Hi Andrej,
Your observations share some similarities with this thread (depending on the characteristics of your data/folders/remotes):
https://forum.rclone.org/t/huge-memory-usage-when-copying-between-s3-like-services/33687
I therefore suggest you try:
rclone sync --dry-run --checkers=1 COS_US_SOUTH:ibm-docs-dev/"$PRODUCT_KEY" COS_US_SOUTH:ibm-docs-stage/"$PRODUCT_KEY" -v
If it succeeds, you can add back your flags one by one and then gradually increase the number of checkers to locate the breaking point.
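For example, a possible second step (before reintroducing any of your other flags) could be something like:

rclone sync --dry-run --checkers=2 COS_US_SOUTH:ibm-docs-dev/"$PRODUCT_KEY" COS_US_SOUTH:ibm-docs-stage/"$PRODUCT_KEY" -v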
If it fails, then I suggest you try to find out which remote is causing the issue by (dry-run) checking each of the remotes against a local folder using the above command.
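Something along these lines, where /tmp/rclone-test is just a hypothetical empty local folder, first for the source and then for the destination:

rclone sync --dry-run --checkers=1 COS_US_SOUTH:ibm-docs-dev/"$PRODUCT_KEY" /tmp/rclone-test -v
rclone sync --dry-run --checkers=1 COS_US_SOUTH:ibm-docs-stage/"$PRODUCT_KEY" /tmp/rclone-test -v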
What is the highest number of objects (files and folders) you have in a single folder (excluding objects in subfolders of the folder)?