ACD oauth proxy broken!

they answered -> https://forums.developer.amazon.com/questions/106405/rclone-and-acd-cli-not-working-anymore.html

The Answer:

Hi SebiTNT,

Please see my answer about 429 error here: https://forums.developer.amazon.com/questions/54132/getting-429-too-many-request-errors.html

and about rclone and acd_cli here: https://forums.developer.amazon.com/questions/71941/rclone-and-acdcli.html - thanks!

While the first thread says:

Answer by Amazonian Levon@Amazon · Jan 31, 2017 at 02:50 PM

Hi John,

Thanks for posting! Have you actively submitted your application for approval and gotten a direct response saying that > your app is at production level now? If not, then the app is still at developer level, which is designed to be used by a single digit number of users who are actively developing or testing an application. Because of that, if your app is at developer level and you have indeed many thousands of customers using it, then you will be getting error 429s.

The second only points to the old, well-known thread saying they don’t have any integration with rclone anymore and are not planning on reopening it.

Looks like the keys were revoked based on Amazon’s reply to your post @SebiTNT :frowning:

Also, I tried a different client_id/secret which did work, so it definitely seems like the proxy that rclone was using has been blocked.

I’ve made a very very hacky “fix” for this. It involves using the ACD client itself and “stealing” its authentication headers via Fiddler (and FiddlerScript), and also a GUI automation program to simulate clicks on ACD.

It obviously has a smaller chance of getting disabled, but it doesn’t really work all that well: it needs the PC’s GUI to stay occupied, and it does tend to break from time to time (so it needs supervision). But it’s still better than the alternative.

I’m in the process of downloading all 2.5 TB of data off from ACD anyway, so it doesn’t really have to be very robust.

The main thing is to customize Fiddler. The GUI automation is the easy part; I’m using Mouse Recorder - just because it was the first one I found (and I was too bored to make a program of my own).

Let’s start with Fiddler.

  1. Download / install Fiddler
  2. Go to Fiddler options and select the option to decrypt HTTPS
  3. Make ACD client use Fiddler as its proxy (from options)
  4. Set the HTTP_PROXY and HTTPS_PROXY variables so that rclone will use Fiddler as a proxy (e.g. SET HTTP_PROXY=http://localhost:8888 and SET HTTPS_PROXY=http://localhost:8888)
  5. The hard part now: Editing FiddlerScript.
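If you prefer to drive rclone from a script, step 4 can be sketched in Python; a minimal sketch, assuming Fiddler is listening on its default port 8888 (the remote name and paths in the example call are made up):

```python
import os
import subprocess

FIDDLER_PROXY = "http://localhost:8888"  # Fiddler's default listening port

def proxied_env(proxy=FIDDLER_PROXY):
    """Return a copy of the current environment with both proxy variables set."""
    env = dict(os.environ)
    env["HTTP_PROXY"] = proxy
    env["HTTPS_PROXY"] = proxy
    return env

def run_rclone(args):
    """Launch rclone through the Fiddler proxy (arguments are hypothetical)."""
    return subprocess.run(["rclone", *args], env=proxied_env())

# Example (not run here): run_rclone(["copy", "acd:backup", "D:/restore"])
```

Setting both variables matters because rclone talks to Amazon over HTTPS, so HTTP_PROXY alone would not route the traffic through Fiddler.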

Add the following to just under the class Handlers declaration:

public static RulesOption("Impersonate A&mazon Cloud")
BindPref("fiddlerscript.rules.AmazonCloud")
var m_ImpersonateAmazonCloud: boolean = false;
public static var m_AmazonAccessToken: String = "xxx";
public static var m_AmazonCloudDriveSource: String = "xxx";
public static var m_AmazonCloudDriveAppId: String = "xxx";

And the following code to the OnBeforeRequest function:

if (m_ImpersonateAmazonCloud && oSession.oRequest.headers.Exists('User-Agent') && 
    (oSession.oRequest.host.Contains('amazonaws.com') || oSession.oRequest.host.Contains('cloudfront.net'))) {
    // FiddlerObject.alert(oSession.oRequest.host);
    var userAgent = oSession.oRequest.headers['User-Agent'];
    if (userAgent == 'rclone/v1.39' && oSession.oRequest.host.Contains('amazonaws.com')) {
        oSession.oRequest.headers["User-Agent"] = "CloudDriveWin/5.0.11.ab4ed4be";
        oSession.oRequest.headers.Remove("Authorization");
        oSession.oRequest.headers["x-amzn-clouddrive-source"] = m_AmazonCloudDriveSource;
        oSession.oRequest.headers["x-amz-access-token"] = m_AmazonAccessToken;
        oSession.oRequest.headers["x-amz-clouddrive-appid"] = m_AmazonCloudDriveAppId;
        oSession.oRequest.headers["x-amzn-RequestId"] = Guid.NewGuid().ToString();
        // FiddlerObject.log(oSession["x-ProcessInfo"]);
    } else if (userAgent.StartsWith("CloudDriveWin/")) {
        var amzAT = oSession.oRequest.headers["x-amz-access-token"];
        FiddlerObject.log("Amazon CloudDrive connected");
        if (amzAT != "" && amzAT != m_AmazonAccessToken) {
            m_AmazonAccessToken = amzAT;
            m_AmazonCloudDriveSource = oSession.oRequest.headers["x-amzn-clouddrive-source"];
            m_AmazonCloudDriveAppId = oSession.oRequest.headers["x-amz-clouddrive-appid"];
            FiddlerObject.log("NEW AMZ ACCESS TOKEN:" + m_AmazonAccessToken);
        }
    }
}
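For readers who don’t speak FiddlerScript, here is the same header-rewriting idea sketched in plain Python over a headers dict. The token/source/app-id values are placeholders that the FiddlerScript above captures from a live CloudDriveWin session; this is an illustration of the logic, not a drop-in replacement:

```python
import uuid

# Placeholders; in the real setup these are scraped from the official client.
state = {
    "access_token": "xxx",
    "source": "xxx",
    "app_id": "xxx",
}

def rewrite_headers(headers, state=state):
    """Make an rclone request look like the official Amazon Cloud Drive client,
    and harvest fresh credentials whenever the real client sends them."""
    ua = headers.get("User-Agent", "")
    if ua.startswith("rclone/"):
        headers["User-Agent"] = "CloudDriveWin/5.0.11.ab4ed4be"
        headers.pop("Authorization", None)  # drop rclone's own OAuth header
        headers["x-amzn-clouddrive-source"] = state["source"]
        headers["x-amz-access-token"] = state["access_token"]
        headers["x-amz-clouddrive-appid"] = state["app_id"]
        headers["x-amzn-RequestId"] = str(uuid.uuid4())  # fresh id per request
    elif ua.startswith("CloudDriveWin/"):
        token = headers.get("x-amz-access-token", "")
        if token and token != state["access_token"]:  # client refreshed its token
            state["access_token"] = token
            state["source"] = headers.get("x-amzn-clouddrive-source", "")
            state["app_id"] = headers.get("x-amz-clouddrive-appid", "")
    return headers
```

The two branches mirror the FiddlerScript: rclone requests get the official client’s identity stamped onto them, while requests from the real client are watched so that a newly refreshed access token is picked up automatically.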

Here’s the “easy” part, the GUI automation:

  1. Open Amazon Cloud Drive and go to the downloads tab.
  2. Open Mouse Recorder (or whatever) and record a macro that:
    1. Activates the window (and waits a bit)
    2. Clicks on the “All” folder item (wait a bit)
    3. Clicks on any first level folder (and wait a lot more)

In my case this forces a refresh.

You need to run Amazon Cloud Drive and force it to refresh at least once before you run rclone. And you also need to run the GUI automation part at least once every 30 minutes (preferably more frequently - I run the macro above every 15 seconds, but YMMV)
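The refresh schedule itself can be scripted rather than babysat. A minimal sketch, assuming your macro tool can be launched from the command line (the executable name and macro file below are made up):

```python
import subprocess
import time

INTERVAL_SECONDS = 15  # well under the ~30-minute token lifetime

def refresh_loop(run_macro, sleep=time.sleep, iterations=None):
    """Run the GUI macro forever (or `iterations` times) at a fixed interval."""
    count = 0
    while iterations is None or count < iterations:
        run_macro()
        sleep(INTERVAL_SECONDS)
        count += 1

def run_macro():
    # Hypothetical command line; substitute whatever your recorder supports.
    subprocess.run(["MouseRecorder.exe", "acd-refresh.mrf"], check=False)

# Example (not run here): refresh_loop(run_macro)
```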

I’m sure I’m missing something important, but that’s about most of my setup.

How can we get the proxy to work with rclone again? I really have tons of TBs to download. Can someone talk to whoever had it running?

I used “Air Explorer” to transfer my ACD data to another storage provider.

The only problem is that it only runs on Mac or Windows.


What it looks like is that the keys have had their rating for production use removed, so they’ve been downgraded to developer keys. Or Amazon has changed the way they do that sort of thing.

Thanks for confirming that.


Thank you s20p17a1m14 for this hint! I’m now using it on a Google Cloud Compute instance with Windows Server on it. It’s a pity that I can’t use rclone any more! I hope that there will be a solution for this problem with Amazon soon because I want to use rclone again.
Again: Thank you Nick for your great work!

is the oauth proxy fixed already? it’s still broken on my side

In case this ever gets fixed, what’s the best guide on how to set this up? I’m reading over at
https://rclone.org/amazonclouddrive/
and the instructions say rclone config will “guide” me through things, but well, I don’t even know what a client_id and client_secret are.
I guess I’m supposed to read and comprehend this first?


Is that correct? Meaning I have to run my own proxy? Or is there some default proxy someone else had set up? Or am I misusing the word “proxy” in the first place?

Current plan, of course, is to use something like odrive or Air Explorer if I can’t figure out rclone. I’ve still got 40 days or so to move 20 TB; fingers crossed I can make it.

@left1000 Please see this for the answer to most of your questions.

Thanks, @boldn and @ncw. Does anyone have plans to resolve this issue?

Ah, just seen this after posting my other thread. :frowning:

Hope we can come up with a solution…

Any advice for someone with encrypted files on ACD? I don’t think Air Explorer or odrive allow you to decrypt. I was about a third of the way through migrating the data off ACD before it stopped working. Honestly, if I could just get a list of all the files I had, I could probably redownload what’s missing from elsewhere.

You’re probably going to have to download using a different client…

I’m finally biting the bullet and switching to Synology Cloud Sync. It doesn’t work anywhere near as elegantly or fast, but if it’s all that works, I don’t have much choice.

If you are willing to spend some time, you could try to find the client_id and client_secret from another program.
There are other ACD desktop clients out there which may have them stored unencrypted somewhere.

Problem with that is that as soon as you find them, everyone will jump on them and then they’ll end up being killed too. So it’s not a long-term solution.

I think most people are not interested in long term solutions, as ACD Unlimited is dead and they are just hoping to get their files back.

Didn’t everyone already get their files back using the Rclone Auth proxy, after the last time ACD was ‘dead’? Seems careless to not have a local backup when this is the second time it’s happened. :wink:


Yes: definitely migrate everything elsewhere. I have less than 2 TB, so it’s easier for me.

Thanks, that answered all my questions perfectly, and now I’ve got things configured correctly to experience the same error as the rest of you :slight_smile: