Recommendations for ACD->GSuite/Other transfers

Agreed, I wouldn't do this on my home network; I've got a 200 Mbps connection and 35 Mbps upload, which is nowhere near the speed needed to transfer this much data. If anything I'll rent another box if this isn't done before all the trials are up, but so far I think it should be fine. It was off to a rough start, but I'm pulling about 300 GB every half hour now at 20 processes. I think my issue was more folder structure; my folders went pretty deep. Now that that's out of the way it's a constant download.

These are your speeds using odrive? What are you using to monitor speed, and what OS is that?

Debian, and I am using nload to monitor the speeds. This is through the Google server with odrive, and I am running the following command:

exec 6>&1;num_procs=20;output="go"; while [ "$output" ]; do output=$(find "/mnt/data-disk/odrive/Amazon Cloud Drive" -name "*.cloudf" -print0 | xargs -0 -n 1 -P $num_procs "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done && exec 6>&1;num_procs=20;output="go"; while [ "$output" ]; do output=$(find "/mnt/data-disk/odrive/Amazon Cloud Drive" -name "*.cloud" -print0 | xargs -0 -n 1 -P $num_procs "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done
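Unrolled for readability, the same loop looks like this (functionally identical; exec 6>&1 duplicates stdout as fd 6 so tee can both feed the $output test and still print live progress, and each loop repeats until a pass produces no output, i.e. nothing is left to expand):

# Pass 1 expands folder placeholders (*.cloudf); pass 2 downloads file placeholders (*.cloud).
exec 6>&1          # fd 6 mirrors the terminal so tee can print while we capture
num_procs=20
output="go"
while [ "$output" ]; do
  output=$(find "/mnt/data-disk/odrive/Amazon Cloud Drive" -name "*.cloudf" -print0 \
    | xargs -0 -n 1 -P "$num_procs" "$HOME/.odrive-agent/bin/odrive.py" sync \
    | tee /dev/fd/6)
done
output="go"
while [ "$output" ]; do
  output=$(find "/mnt/data-disk/odrive/Amazon Cloud Drive" -name "*.cloud" -print0 \
    | xargs -0 -n 1 -P "$num_procs" "$HOME/.odrive-agent/bin/odrive.py" sync \
    | tee /dev/fd/6)
done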

I see in your script you're syncing your entire Amazon. I have an Ubuntu system and want to see the speed difference, although my Amazon has 20 TB whereas I only have 16 available locally. Any ideas how I can get a folder at a time, as I have 4 inside… or a copy where I could pause it while I upload, then resume without it going back over the files it has transferred already?

Just pick a single folder; you can drill down the tree. Just change "/mnt/data-disk/odrive/Amazon Cloud Drive" to drill down to, say, "/mnt/data-disk/odrive/Amazon Cloud Drive/tv" or whatever you want.
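For example, with the loop above only the find path changes; the first stage would look something like this (the tv subfolder is just an illustration):

find "/mnt/data-disk/odrive/Amazon Cloud Drive/tv" -name "*.cloudf" -print0 | xargs -0 -n 1 -P 20 "$HOME/.odrive-agent/bin/odrive.py" sync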

described here:

Trying to use either of your scripts to sync seems to be causing me errors. I am on Ubuntu, not Debian… although my scripting isn't at a level to figure out why this would work for both of you and not for me.

The folder I want to sync is located at ~/odrive-agent-mount/Amazon Cloud Drive/ZZeus (2017-05-21T06_35_46.359)/tl9ieuis27sium0r04gvip96qs.cloudf

Try to use … -name "*.cloudf" …
This helped here also on Ubuntu.

BTW, the command from Joseph_Douglas works best/fastest for me. The other ones were MUCH slower.

Try this:

exec 6>&1;num_procs=20;output="go"; while [ "$output" ]; do output=$(find "/home/terry/odrive-agent-mount/Amazon Cloud Drive/ZZeus (2017-05-21T06_35_46.359)" -name "*.cloudf" -print0 | xargs -0 -n 1 -P $num_procs "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done && exec 6>&1;num_procs=20;output="go"; while [ "$output" ]; do output=$(find "/home/terry/odrive-agent-mount/Amazon Cloud Drive/ZZeus (2017-05-21T06_35_46.359)" -name "*.cloud" -print0 | xargs -0 -n 1 -P $num_procs "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done

That does not work; however, I did get this to work…

~/odrive-agent-mount/Amazon Cloud Drive/ZZeus (2017-05-21T06_35_46.359)$ exec 6>&1;num_procs=10;output="go"; while [ "$output" ]; do output=$(find "$HOME/odrive-agent-mount/Amazon Cloud Drive/" -name "*.cloud*" -print0 | xargs -0 -n 1 -P $num_procs "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done

I cannot get it to work when specifying the ZZeus folder or the folder inside ZZeus, although that doesn't matter a whole lot as there isn't much else on this Amazon account.

Is this only syncing the files? It is not downloading them, correct?

EDIT: This is downloading files, and now it works when specifying the path inside ZZeus. I must have needed to sync the initial .cloudf files first; a folder's .cloudf placeholder has to be expanded before the .cloud file placeholders inside it exist.

I also tried editing what you gave me to

exec 6>&1;num_procs=10;output="go"; while [ "$output" ]; do output=$(find "$HOME/odrive-agent-mount/Amazon Cloud Drive/ZZeus (2017-05-21T06_35_46.359)/skhr4jjs6pqc2l603qd5jtgaqg/" -name ".cloud*" -print0 | xargs -0 -n 1 -P $num_procs "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done && exec 6>&1;num_procs=20;output="go"; while [ "$output" ]; do output=$(find "/home/terry/odrive-agent-mount/Amazon Cloud Drive/ZZeus (2017-05-21T06_35_46.359)/skhr4jjs6pqc2l603qd5jtgaqg" -name ".cloud*" -print0 | xargs -0 -n 1 -P $num_procs "$HOME/.odrive-agent/bin/odrive.py" sync | tee /dev/fd/6); done

But I receive the errors:
usage: odrive.py sync [-h] placeholderPath
odrive.py sync: error: too few arguments
usage: odrive.py sync [-h] placeholderPath
odrive.py sync: error: too few arguments

I am currently getting about 12 MB/s with the script I have that is working, although I'm interested in why yours has more code and what else it does. The only main difference I see is that it looks like it transfers 20 files at once, which might help make this faster if I can get it working. I was able to upload to ACD using rclone at 40 MB/s, so I should be able to get better speeds than this.
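If it helps anyone hitting the same errors, my reading is that the pattern is the culprit: ".cloud*" only matches names that start with .cloud, but the placeholders are named like Movie.mkv.cloud, so find matched nothing, and GNU xargs still invokes the command once with no arguments when its input is empty, once per pipeline, which is exactly the two pairs of "too few arguments" above. A sketch of the fixed first stage, assuming GNU xargs (-r suppresses the run on empty input):

# "*.cloud*" matches both folder (.cloudf) and file (.cloud) placeholders
find "$HOME/odrive-agent-mount/Amazon Cloud Drive/ZZeus (2017-05-21T06_35_46.359)/skhr4jjs6pqc2l603qd5jtgaqg" -name "*.cloud*" -print0 | xargs -0 -r -n 1 -P 20 "$HOME/.odrive-agent/bin/odrive.py" sync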


@Joseph_Douglas I used rclone to directly copy from ACD to GD, using my own client ID for ACD. rclone will give you these speeds: high-cpu-4 instance with 3.6 GB RAM in the EU, ACD EU.
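For reference, a minimal sketch of that kind of copy, assuming you have already run rclone config, put your own ACD client ID/secret into the acd remote, and set up a gdrive remote (the remote names and destination path here are illustrative):

rclone copy acd: gdrive:acd-backup --transfers 20 --checkers 20 -v

Note the data still flows through the box (rclone downloads from ACD and re-uploads to GD), which is why the instance's bandwidth matters.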

Hey there,
Can you please explain how to do this to a noob?
If you don't mind sharing your own edited script, could you walk me through it?

I want to use rclone to move my data from Amazon to GD, using rclone only if possible.

Many thanks

How are you connecting rclone to ACD? That doesn't work at all, since rclone is banned from ACD… You mention a client ID; is that what is allowing you to bypass the ban?

But why are you using ExpanDrive?
You can just install the official Amazon Cloud .exe inside your VM (as you've installed a Windows server), sync it with a local folder (since you have plenty of TB), and then re-upload everything with rclone, or am I missing something?

(Because I tried ExpanDrive on a Windows VM without enough disk space to download everything, and performance was pretty bad in my case.)
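The re-upload step would then just be a local-to-remote copy, e.g. (the local path and remote name here are placeholders):

rclone copy "C:\AmazonDriveSync" gdrive:backup --transfers 8 -v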

He is doing it with a personal ACD client ID which he registered a while back.

Nothing for newcomers :smiley:


Can someone explain how exactly to do that with odrive?

Is there a way to mount Amazon Drive without syncing, via odrive? Just like we do with rclone mount?

ExpanDrive needs no free space; it caches in RAM.

Any ideas on how to get Amazon Drive files onto my FTP server without having to download/sync to a desktop first?

ExpanDrive is giving me nonstop "Can't read from the source file or disk" errors with 0 bytes/s speeds. What am I doing wrong? I can see the files in my ACD and Google Drive. It worked last night for my first batch, but it's failing now.