What is your cloud service provider? Recommendations may depend on that.
Also, for 1.5 million files: seriously consider archiving some of that. Ask yourself whether you really need to be able to download each file individually, or whether you can tolerate bundling some of them while in storage. When it comes to optimizing a set of many small files, this is both the most powerful thing you can do and about the only thing. Uploads of big files can be tuned with rclone settings, but performance with very small files depends almost entirely on the limits of the cloud provider - so combining them into single, larger files has massive performance benefits.
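As a minimal sketch of the bundling idea (the directory name and the `gdrive:` remote below are placeholder examples, not from your setup):

```shell
# Create a demo directory of small files (stand-in for your real data).
mkdir -p photos
for i in 1 2 3; do echo "img $i" > "photos/img$i.txt"; done

# Bundle the many small files into one large archive...
tar -czf photos.tar.gz photos/

# ...then upload the single archive instead of thousands of tiny files.
# (Remote name "gdrive:" is a placeholder - point this at your own remote.)
#   rclone copy photos.tar.gz gdrive:archive/ --progress

# The archive still contains every individual file:
tar -tzf photos.tar.gz
```

The trade-off is that you have to download and unpack the whole archive to get at any single file, which is why this only makes sense for data you rarely need to access individually.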
Clouds usually have massive bandwidth, but loads of tiny files are their bane. Even the highest-performance services handle them "not so great", and most handle them terribly. In practice this means you make poor use of the bandwidth you have - and thus transfers take a long time. It tends to be especially bad on upload, with somewhat more permissive limits on download.
Finally, not all clouds can even store 1.5 million files. Many of them have some sort of hard maximum - or at least a "recommended maximum for good performance".
To give an idea from Gdrive, which is really common: you can start a little over 2 new transfers per second, so 1.5 million files comes out to about 9 days of 24/7 transferring.
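The arithmetic behind that estimate, assuming the ~2 transfers/second figure holds around the clock:

```shell
# 1,500,000 files at ~2 new transfers started per second:
awk 'BEGIN {
    files = 1500000
    rate  = 2                        # transfers started per second
    secs  = files / rate
    printf "%d seconds = %.1f days\n", secs, secs / 86400
}'
# -> 750000 seconds = 8.7 days
```

Any per-file overhead at the provider dominates here; the size of the files barely matters, which is why bundling wins so big.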
The maximum file limit on a Team Drive is 400,000 according to Google, with a recommendation of 100,000 or fewer for optimal performance.
These specs can vary pretty wildly between different providers.