“file name too long” when copying files with very long path names from Google Drive to a local SSD

What is the problem you are having with rclone?

This command works properly…

rclone copy "New_Rclone_Google_Drive_to_local_SSD,root_folder_id=1UCuGuJ7U9M5O4sbEzZE09lDCBXfVh9de:" "/media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone" -vv

in all cases I have tried, except when a file or files have very long path names.

It fails to copy files that have very long path names.

By the way, I searched this forum for “file name too long” because I presumed this forum must have topics on this subject. After perusing the search results, I guessed that the topic “Rclone vfs cache says file name too long” was closest to my problem, yet I don't think it actually solves it.

Run the command 'rclone version' and share the full output of the command.

rclone v1.65.0

  • os/version: linuxmint 21.2 (64 bit)
  • os/kernel: 5.15.0-89-generic (x86_64)
  • os/type: linux
  • os/arch: amd64
  • go/version: go1.21.4
  • go/linking: static
  • go/tags: none

Which cloud storage system are you using? (eg Google Drive)

Google Drive

The command you were trying to run (eg rclone copy /tmp remote:tmp)

I included that above.

Please run 'rclone config redacted' and share the full output. If you get command not found, please make sure to update rclone.

[New_Rclone_Google_Drive_to_local_SSD]
type = drive
scope = drive.readonly
token = XXX
team_drive = 

A log from the command that you were trying to run with the -vv flag

Please note that the ridiculously long file name below is not indicative of the way I normally name files on Google Drive. I was simply trying to quickly replicate a long path for testing purposes.

~$ rclone copy "New_Rclone_Google_Drive_to_local_SSD,root_folder_id=1UCuGuJ7U9M5O4sbEzZE09lDCBXfVh9de:" "/media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone" -vv
2023/12/13 06:54:28 DEBUG : rclone: Version "v1.65.0" starting with parameters ["rclone" "copy" "New_Rclone_Google_Drive_to_local_SSD,root_folder_id=1UCuGuJ7U9M5O4sbEzZE09lDCBXfVh9de:" "/media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone" "-vv"]
2023/12/13 06:54:28 DEBUG : Creating backend with remote "New_Rclone_Google_Drive_to_local_SSD,root_folder_id=1UCuGuJ7U9M5O4sbEzZE09lDCBXfVh9de:"
2023/12/13 06:54:28 DEBUG : Using config file from "/home/y/.config/rclone/rclone.conf"
2023/12/13 06:54:28 DEBUG : New_Rclone_Google_Drive_to_local_SSD: detected overridden config - adding "{nFDSF}" suffix to name
2023/12/13 06:54:28 DEBUG : fs cache: renaming cache item "New_Rclone_Google_Drive_to_local_SSD,root_folder_id=1UCuGuJ7U9M5O4sbEzZE09lDCBXfVh9de:" to be canonical "New_Rclone_Google_Drive_to_local_SSD{nFDSF}:"
2023/12/13 06:54:28 DEBUG : Creating backend with remote "/media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone"
2023/12/13 06:54:28 DEBUG : New_Rclone_Google_Drive_to_local_SSD{nFDSF}: Loaded invalid token from config file - ignoring
2023/12/13 06:54:28 DEBUG : Saving config "token" in section "New_Rclone_Google_Drive_to_local_SSD" of the config file
2023/12/13 06:54:28 DEBUG : New_Rclone_Google_Drive_to_local_SSD{nFDSF}: Saved new token in config file
2023/12/13 06:54:28 DEBUG :    d—1—Trash this. .docx: Need to transfer - File not found at Destination
2023/12/13 06:54:28 DEBUG :   d—1—I can use the same shortcut to launch different scripts... TESTING SCRIPTS1/Same shortcut ... Firefox or Google Doc.py.docx: Need to transfer - File not found at Destination
2023/12/13 06:54:28 DEBUG :   d—1—This is test document 1..docx: Need to transfer - File not found at Destination
2023/12/13 06:54:28 DEBUG : d—1—Have I received a response to this topic I posted on our Rclone?.docx: Need to transfer - File not found at Destination
2023/12/13 06:54:28 DEBUG : Local file system at /media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone: Waiting for checks to finish
2023/12/13 06:54:28 DEBUG : Local file system at /media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone: Waiting for transfers to finish
2023/12/13 06:54:29 DEBUG : Local file system at /media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone: File to upload is small (6631 bytes), uploading instead of streaming
2023/12/13 06:54:29 DEBUG :   d—1—This is test document 1..docx.tumosex0.partial: md5 = 42a06c01d1019d559be9a2eab76c5b79 OK
2023/12/13 06:54:29 DEBUG :   d—1—This is test document 1..docx.tumosex0.partial.joqexun6.partial: renamed to:   d—1—This is test document 1..docx.tumosex0.partial
2023/12/13 06:54:29 INFO  :   d—1—This is test document 1..docx.tumosex0.partial: Copied (new)
2023/12/13 06:54:29 DEBUG :   d—1—This is test document 1..docx: Updating size of doc after download to 6631
2023/12/13 06:54:29 DEBUG :   d—1—This is test document 1..docx: Src hash empty - aborting Dst hash check
2023/12/13 06:54:29 DEBUG :   d—1—This is test document 1..docx.tumosex0.partial: renamed to:   d—1—This is test document 1..docx
2023/12/13 06:54:29 INFO  :   d—1—This is test document 1..docx: Copied (Rcat, new)
2023/12/13 06:54:29 DEBUG : Local file system at /media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone: File to upload is small (9401 bytes), uploading instead of streaming
2023/12/13 06:54:29 DEBUG : d—1—Have I received a response to this topic I posted on our Rclone?.docx.nolitax3.partial: md5 = 3bf1fe2b748da6b84355ef0cc4027d3d OK
2023/12/13 06:54:29 DEBUG : d—1—Have I received a response to this topic I posted on our Rclone?.docx.nolitax3.partial.zesojis1.partial: renamed to: d—1—Have I received a response to this topic I posted on our Rclone?.docx.nolitax3.partial
2023/12/13 06:54:29 INFO  : d—1—Have I received a response to this topic I posted on our Rclone?.docx.nolitax3.partial: Copied (new)
2023/12/13 06:54:29 DEBUG : d—1—Have I received a response to this topic I posted on our Rclone?.docx: Updating size of doc after download to 9401
2023/12/13 06:54:29 DEBUG : d—1—Have I received a response to this topic I posted on our Rclone?.docx: Src hash empty - aborting Dst hash check
2023/12/13 06:54:29 DEBUG : d—1—Have I received a response to this topic I posted on our Rclone?.docx.nolitax3.partial: renamed to: d—1—Have I received a response to this topic I posted on our Rclone?.docx
2023/12/13 06:54:29 INFO  : d—1—Have I received a response to this topic I posted on our Rclone?.docx: Copied (Rcat, new)
2023/12/13 06:54:30 DEBUG :    d—1—Trash this. I gave this an extremely long file name to test Rclone. .docx.hefuzoj6.partial: md5 = 3b5c5e0a091ee588449ea91dff857197 OK
2023/12/13 06:54:30 DEBUG :    d—1—Trash this. I gave this an extremely long file name to test Rclone. .docx.hefuzoj6.partial: Size and md5 of src and dst objects identical
2023/12/13 06:54:30 DEBUG :    d—1—Trash this. I gave this an extremely long file name to test Rclone. .docx: Updating size of doc after download to 218687
2023/12/13 06:54:30 DEBUG :    d—1—Trash this. I gave this an extremely long file name to test Rclone. .docx: Src hash empty - aborting Dst hash check
2023/12/13 06:54:30 DEBUG :    d—1—Trash this. I gave this an extremely long file name to test Rclone. .docx.hefuzoj6.partial: renamed to:    d—1—Trash this. I gave this an extremely long file name to test Rclone. .docx
2023/12/13 06:54:30 INFO  :    d—1—Trash this. I gave this an extremely long file name to test Rclone. .docx: Copied (Rcat, new)
2023/12/13 06:54:30 DEBUG :   d—1—I can use the same shortcut to launch different scripts... TESTING SCRIPTS1/Same shortcut ... Firefox or Google Doc.py.docx.fibuyij1.partial: md5 = 3b5c5e0a091ee588449ea91dff857197 OK
2023/12/13 06:54:30 DEBUG :   d—1—I can use the same shortcut to launch different scripts... TESTING SCRIPTS1/Same shortcut ... Firefox or Google Doc.py.docx.fibuyij1.partial: Size and md5 of src and dst objects identical
2023/12/13 06:54:30 DEBUG :   d—1—I can use the same shortcut to launch different scripts... TESTING SCRIPTS1/Same shortcut ... Firefox or Google Doc.py.docx: Updating size of doc after download to 218687
2023/12/13 06:54:30 DEBUG :   d—1—I can use the same shortcut to launch different scripts... TESTING SCRIPTS1/Same shortcut ... Firefox or Google Doc.py.docx: Src hash empty - aborting Dst hash check
2023/12/13 06:54:30 DEBUG :   d—1—I can use the same shortcut to launch different scripts... TESTING SCRIPTS1/Same shortcut ... Firefox or Google Doc.py.docx.fibuyij1.partial: renamed to:   d—1—I can use the same shortcut to launch different scripts... TESTING SCRIPTS1/Same shortcut ... Firefox or Google Doc.py.docx
2023/12/13 06:54:30 INFO  :   d—1—I can use the same shortcut to launch different scripts... TESTING SCRIPTS1/Same shortcut ... Firefox or Google Doc.py.docx: Copied (Rcat, new)
2023/12/13 06:54:30 INFO  : 
Transferred:   	  458.436 KiB / 458.436 KiB, 100%, 0 B/s, ETA -
Transferred:            8 / 8, 100%
Elapsed time:         2.2s

2023/12/13 06:54:30 DEBUG : 27 go routines active

You are correct. I failed to send the proper information. I am sorry.

SECOND ATTEMPT... A log from the command that you were trying to run with the -vv flag

Please note that the ridiculously long file name in the text I posted at controlc.com is not indicative of the way I normally name files on Google Drive. I was simply trying to quickly replicate a long path for testing purposes.

In case you were unaware, https://controlc.com/ is similar to https://pastebin.com/.

Here is a snippet of what I pasted at https://controlc.com/4510223c...

2023/12/13 07:51:18 ERROR : Attempt 3/3 failed with 2 errors and: open /media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone/   d—1—Trash this. I gave this an extremely long file name to test Rclone.  Lily Parniani loves her job, but like so many Southern Californians the 24-year-old investment analyst has to put up with a soul-sucking commute. It can take more than two hours to get from her office in Orange County to her home 44 miles away in Riverside County. “In total, I lose about 4 hours of my life every day to commuting,” she said. To make matters worse, she spends around $450 per month on automated tolls to try to make her commute more bearable..docx.yoyusaj3.partial: file name too long
2023/12/13 07:51:18 INFO  : 
Transferred:   	      300 KiB / 300 KiB, 100%, 49.972 KiB/s, ETA 0s
Errors:                 2 (retrying may help)
Checks:                 9 / 9, 100%
Elapsed time:         3.3s

2023/12/13 07:51:18 DEBUG : 12 go routines active
2023/12/13 07:51:18 Failed to copy with 2 errors: last error was: open /media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone/   d—1—Trash this. I gave this an extremely long file name to test Rclone.  Lily Parniani loves her job, but like so many Southern Californians the 24-year-old investment analyst has to put up with a soul-sucking commute. It can take more than two hours to get from her office in Orange County to her home 44 miles away in Riverside County. “In total, I lose about 4 hours of my life every day to commuting,” she said. To make matters worse, she spends around $450 per month on automated tolls to try to make her commute more bearable..docx.yoyusaj3.partial: file name too long

Am I correct in understanding that I cannot download files with very long file names from Google Drive using Rclone?

Google Drive is a genuine oddball when it comes to maximum file name length: it supports names up to 32,767 characters long. In comparison, most popular file systems support up to 255 characters per name (and up to 4096 characters for the full path).
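
As a quick check of what your own destination reports, you can ask the filesystem directly with getconf (the path below is the destination from your command; adjust as needed):

getconf NAME_MAX /media/y/Backup
getconf PATH_MAX /media/y/Backup

On ext4 these typically print 255 and 4096 respectively, and the 255 applies to each path component (each file or directory name), not to the whole path.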

It is, unfortunately, a common problem for people transferring data out of Google Drive. If it was not taken care of beforehand, the data is simply not transferable without renaming everything to fit the new destination's limits.

rclone itself has no problems handling long names, as long as both the source and the destination support them.

No, my local filesystem cannot handle filenames of such length. (I'm running Linux Mint 21.2 Cinnamon, by the way).

However, my local filesystem can handle filenames of this length...

/home/y/Documents/_ d—1—Trash this. I gave this an extremely long file name to test Rclone. Lily Parniani loves her job, but like so many Southern Californians the 24-year-old investment analyst has to put up with a soul-sucking commute. It can take more than.docx

I used the extremely long name d—1—Trash this. I gave this an extremely long file name to test Rclone. Lily Parniani loves her job, but like so many Southern Californians the 24-year-old investment analyst has to put up with a soul-sucking commute. It can take more than two hours to get from her office in Orange County to her home 44 miles away in Riverside County. “In total, I lose about 4 hours of my life every day to commuting,” she said. To make matters worse, she spends around $450 per month on automated tolls to try to make her commute more bearable..docx because I wanted to be sure I got the "file name too long" error message. Be that as it may, let me explain what I have been doing up to this point.

Normally I have been downloading data from Google Drive using Google Takeout. Google Takeout creates compressed archives of my files, either .zip or .tgz (I can choose the format). When I uncompress those .zip or .tgz files, I see all of the files which I chose to download... even those with very long file names.

In other words, when I use Google Takeout, I don't have to worry about the length of the file names I created on Google Drive.

I actually glanced at Google Drive | Download & export files because I guessed that, perhaps, Rclone uses the Google Drive API. Nevertheless, I didn't glean any information that seemed like it might help solve my problem.

Essentially, Google Takeout downloads 100% of the data I want, but it's a crude tool. By contrast, Rclone is a much more elegant tool; however, it doesn't download all of my data because some of the file names I created are too long.

I like giving long file names to some of my files on Google Drive. I find it very helpful, when I am searching through files, to be able to read a file name that contains a lot of information. Therefore, given the choice as framed in the previous paragraph, I would very probably choose not to use Rclone.

At the end of the day, my hunch is that Rclone might be using the Google Drive API, which might not allow very long file names to be downloaded. However, I was hoping there would be a way that I could give long file names to some of my files on Google Drive, yet still download them with Rclone.

I read about Rclone mount previously. I was a little unclear, but I guessed that using Rclone mount would require me to give Rclone the ability to write to my Google Drive. I don't want Rclone to be able to do that because I am concerned I might make a mistake which would result in me losing data on Google Drive. Sure, it's a small possibility, but I don't want to worry about it.

Does Rclone mount require the ability to write to Google Drive?

seems to me, then you will have the same problem: you cannot unzip files with such names from the takeout compressed file onto your local filesystem?

Superficially, your supposition seems plausible; however, you seem to have made a mistake by, understandably yet implicitly, creating a false dichotomy. See, there aren't two types of files (long files and short files); there are three types of files (long files, medium files, and short files).

I don't create super, ridiculously verbose file names. That is, I don't create "long files." The one you saw previously in this topic was one I created because I didn't want to bother figuring out the size of a "short file." I simply wanted to recreate the error message I had seen before.

In other words, "medium files" are those files I can download as .tgz or .zip files and successfully uncompress on my file system, yet are too long for Rclone to copy to my local SSD.

One possible answer in this case is that rclone uses temporary file names during download (longer than the original name by the ".yoyusaj3.partial" part). Maybe that is what trips your SSD's limit. You can disable this behaviour by providing the --inplace flag.
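
For example, this is simply your earlier command with the flag added (a sketch; the remote string and destination are copied from your post):

rclone copy "New_Rclone_Google_Drive_to_local_SSD,root_folder_id=1UCuGuJ7U9M5O4sbEzZE09lDCBXfVh9de:" "/media/y/Backup/data_I_backed_up_from_Google_Drive_using_rclone" --inplace -vv

With --inplace, rclone writes straight to the final file name instead of to a temporary name with a ".xxxxxxxx.partial" suffix, so the suffix no longer adds to the name length.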

In general, whatever the details, Google Drive allows names much longer than your SSD's filesystem does. If you want to operate "trouble free", make sure that you stick to the lowest-common-denominator maximum length in your setup.

then do not write files to the mount, or use --read-only
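
for example, a read-only mount could look something like this (the mount point ~/gdrive is only an illustration):

rclone mount "New_Rclone_Google_Drive_to_local_SSD:" ~/gdrive --read-only -vv

with --read-only the mount rejects all writes, so nothing can be changed on the Google Drive side through it. your config above also uses scope = drive.readonly, which only grants rclone read access to Drive in the first place.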

Thank you very much for taking the time to explain that to me. It seems like Rclone might work for me. I will give it a try.

sorry, at this point, not sure what you need help with.
the local filesystem has hard limits on path/filename length.

If you don't mind I'd like to demur.

please, do you have a real-world issue that you can replicate and post about in detail, with exact commands and debug logs?

In the real world, for over a decade, I have used Google Takeout to download my data from Google Drive as .tgz and/or .zip files. In the real world, I have always been able to successfully uncompress 100% of the files in whatever flavor of Linux I was using. That is, I have never once received an error message when I uncompressed one of those .tgz or .zip files. In the real world, to reiterate what I've indicated previously, I don't create super, verbose file names.

However, recently as I have been testing Rclone (to see if it will meet my needs), many files I have tried to copy to my local SSD using Rclone have failed. The error message I have seen in every instance has been "file name too long."

I am sorry, but I don't intend to provide you with the data you requested, because I am convinced that it's not necessary. Put simply: I have ample information to conclude that the Google Drive API seems to restrict file names to a significantly shorter length than Google Takeout does.

One possible answer in this case is that rclone uses temporary file names during download (longer than the original name by the ".yoyusaj3.partial" part). Maybe that is what trips your SSD's limit. You can disable this behaviour by providing the --inplace flag.

Your supposition seems extremely unlikely to me.

In general, whatever the details, Google Drive allows names much longer than your SSD's filesystem does. If you want to operate "trouble free", make sure that you stick to the lowest-common-denominator maximum length in your setup.

It seems to me that, in a theoretical vacuum, that is excellent advice; however, in my particular case, that is terrible advice. Terrible advice? Yes.

I benefit significantly by using long file names on Google Drive because it helps me more quickly find the files I search for on Google Drive. Furthermore, I can download and uncompress copies of 100% of my files on Google Drive using Google Takeout.

let's say, inside the takeout, there is a filename of length 300.
the local file system can handle max length 255.

just how do you copy that 300 into 255?
maybe using --magic :wink:

that is why i asked for a real-world example?
we need to add that functionality to rclone.
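
for what it's worth, one way to sketch a pre-flight check is to list everything on the remote and flag any name longer than the local limit (assuming a 255-character per-name limit; strictly, ext4 counts bytes, so multi-byte characters count for more):

rclone lsf -R --files-only "New_Rclone_Google_Drive_to_local_SSD:" | awk -F/ 'length($NF) > 255'

anything it prints would have to be renamed or shortened before rclone copy (or any extraction tool) could create it on the local filesystem.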

I appreciate your help but, frankly, your ignorance is astounding. Please stop ignoring what I indicated. In other words, please carefully read what I indicated.

My local file system has been able to uncompress (extract) 100% of the .zip and .tgz files that were created with Google Takeout which I downloaded... for over a decade.

As I presume you know, my local file system cannot uncompress a file whose name is longer than the maximum length my file system can handle. Because you have ignored me, I am going to repeat and emphasize that point: as I presume you know, my local file system cannot uncompress a file whose name is longer than the maximum length my file system can handle.

Therefore, it is reasonable to conclude that every .zip and .tgz file I have ever downloaded with Google Takeout, and then uncompressed (extracted), has contained only file names whose lengths are less than or equal to the maximum my file system can handle. No magic necessary... :wink:

I've worked with myriad bad engineers who have repeatedly clamored for more information despite my having provided them with sufficient information. You remind me of them. You should not expect people to jump through hoops to provide you with information you don't need.

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.