Hello again folks.
I was on vacation and ended up not following up on the solution for my case.
It took me a long time but I found a way, and I'll try to explain it without much ado.
At least in my case, I found that OneDrive was not limiting my connection as a whole, but rather capping each uploaded file at roughly 2.5-3 MB/s.
No matter the system, method, configuration, or client ID, it stayed that way.
Remember that my problem was that my files were so big that a single upload took much longer than 24 hours, and that ran into OneDrive's own time limit (which also cannot be removed), where it drops the connection when the permission token needs to be refreshed.
For example, a single 15 GB file would upload at an average of 2.5-3 MB/s, but a 15 GB file split into six 2.5 GB parts uploads each part at that same speed. With the --transfers=6 parameter I was able to upload the parts simultaneously, so the total upload ran at 15-18 MB/s.
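Just to illustrate (the remote name and folder below are placeholders, not my real setup), the parallel upload boils down to a single rclone call over the split parts:

```
rclone copy /backup/parts onedrive:backup --transfers=6 --progress
```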
The only problem was that I also didn't have enough disk space to simply split this 214 GB file with WinRAR: even though the compressed parts are smaller, it would still need at least 50% of the original file's size in additional space, space that I don't have.
So, with a lot of effort (my setup has the data split between Windows and Linux systems, which caused several code and compatibility errors), I developed a script using Python and Unix shell that does the following:
Runs the wbadmin command to generate a bare-metal image of my entire system;
Splits the image in place by combining the tail command with truncate: each time a part is written out, that same part is cut off the end of the original file, so I end up with 6 parts of the original file without using extra storage space and without affecting the file's integrity (see the sketch after this list);
Uploads the parts to OneDrive with rclone using the --transfers=6 parameter;
After the upload is complete, reverses the tail/truncate step with the cat command, appending the parts back into a single file without using more disk space than necessary.
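Here is a minimal sketch of the split/upload/merge idea, assuming GNU coreutils on Linux; the file name, remote name, and part count are assumptions, and my real script handles the Windows/Linux split and error checking on top of this:

```
#!/usr/bin/env bash
set -euo pipefail

FILE="system-image.vhdx"   # hypothetical name of the large backup file
PARTS=6
SIZE=$(stat -c%s "$FILE")
PART=$(( (SIZE + PARTS - 1) / PARTS ))   # bytes per part, rounded up

# Split in place: copy the tail of the file into a part, then truncate
# the original so those bytes are released. Peak extra disk usage is
# roughly one part, never a full second copy of the file.
for (( i=PARTS; i>=1; i-- )); do
    REMAIN=$(stat -c%s "$FILE")
    CHUNK=$(( REMAIN - (i - 1) * PART ))     # bytes belonging to part $i
    tail -c "$CHUNK" "$FILE" > "$FILE.part$i"
    truncate -s $(( (i - 1) * PART )) "$FILE"
done

# Upload all parts in parallel streams (the remote name is an assumption).
rclone copy . onedrive:backup --include "$FILE.part*" --transfers=6 --progress

# Merge back: append each part to the now-empty original and delete it,
# so disk usage never grows beyond the original size plus one part.
for (( i=1; i<=PARTS; i++ )); do
    cat "$FILE.part$i" >> "$FILE"
    rm "$FILE.part$i"
done
```

The reason for cutting parts from the end of the file is that truncate can only shrink a file from its tail, so working backwards keeps the temporary extra usage down to a single part at any moment.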
Obviously I've summarized what was done and the methods I used, but the result is that an upload which ran at a measly 2.5 MB/s and would have taken more than 36 hours to complete was cut down to about 3 hours.
I appreciate everyone's help; it was a long road, but it was a great learning experience, especially with the support of wise people like you.
@ncw @Ole @asdffdsa
Note: if someone needs help in the future with a script for a case similar to mine, you can send me a DM and I can share the scripts I developed. It would be complicated to post them here because they contain many tokens and secrets, and I would have to spend a lot of time redacting them without affecting their functionality.