File manager not archiving properly

I’m facing an issue with one of my domains. In the file manager, I selected all files (across multiple pages) and archived them into a tar.gz file. The file was around 200MB, but when I downloaded it, it was corrupt and the website did not load properly on my local computer.

I then created an archive via my FTP manager, and that one came out to around 300MB. This file extracts properly and the website loads too. It seems there’s some issue with the utility that creates archives in the file manager.
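For anyone who wants to reproduce the check: a quick way to test whether a .tar.gz is intact without fully extracting it (sketched here on a tiny sample archive; substitute the real downloaded file for `site.tar.gz`):

```shell
# Build a small test archive first, purely for illustration.
tmp=$(mktemp -d)
mkdir -p "$tmp/public_html"
echo "hello" > "$tmp/public_html/index.html"
tar -czf "$tmp/site.tar.gz" -C "$tmp" public_html

# gzip -t checks the compression layer; tar -tzf lists the members
# and fails on a truncated or corrupt archive.
gzip -t "$tmp/site.tar.gz" && echo "gzip layer OK"
tar -tzf "$tmp/site.tar.gz" > /dev/null && echo "tar structure OK"
rm -rf "$tmp"
```

On the broken download, one of those two commands should report an error instead.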

Can someone help me troubleshoot the issue? Or is it a bug?

Unless you have Webmin running behind a proxy and the connection timed out, this shouldn’t be an issue.

Do you run Webmin in this example, as master administrator or as server-owner? Are you trying to do it on the root directory (in case of a server-owner) or on some sub-directories?

When you select all files across multiple pages, do any of those files/directories contain symlinks that are not followed, maybe?

The simple workaround would be to go one directory up and download that single directory as an archive - would that work for you?

This is not a solution, but do try this… SSH into the server and create your tar.gz (or perhaps zip) via the terminal, then do the same via the file manager and compare… I only use the terminal, so you would have to do the comparison on your own… be my guest and let us know :slight_smile: I bet the Virtualmin file manager does the same job as the terminal - those folks do not create joke code / products… I believe in them. Anyway, good luck.
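Roughly, the comparison I mean would look like this (just a sketch - both archives are created locally here for illustration; in reality `fm.tar.gz` would be the one downloaded from the file manager):

```shell
# Build a sample tree and archive it from the terminal.
tmp=$(mktemp -d)
mkdir -p "$tmp/public_html"
echo "<html>" > "$tmp/public_html/index.html"
tar -czf "$tmp/terminal.tar.gz" -C "$tmp" public_html

# Stand-in for the file-manager download; replace with the real file.
cp "$tmp/terminal.tar.gz" "$tmp/fm.tar.gz"

# Compare sizes and member counts; a healthy pair should be close.
ls -l "$tmp"/terminal.tar.gz "$tmp"/fm.tar.gz
for f in "$tmp"/terminal.tar.gz "$tmp"/fm.tar.gz; do
  echo "$f: $(tar -tzf "$f" | wc -l) entries"
done
rm -rf "$tmp"
```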

I’m not using proxy.

As server owner, I log in with username ‘root’. I’m trying to archive all files inside a public_html folder.

I’m not sure what that means?

It would have worked if the result were valid. The file came out to be 160MB (it should have been over 340MB). Also, selecting all files and creating an archive inside the public_html folder gives me a 0-byte archive. It also doesn’t take very long (2-3 seconds).

I tried that using the command `tar -czf test2.tar.gz public_html/` and the file came out to be 340MB, the same as what I get with FTP. I tried creating archives on other domains that have fewer files and fit on one page, and those archives are fine.

So there’s something going on with the file manager. Any logs you’d like to see?


It doesn’t have to be the same size. If this is the only criteria you’re relying on, then it’s wrong. Check actual files.
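If you want to go beyond sizes, one way to check the actual files is to diff the archive’s member list against the directory on disk (sketched here on a small sample tree, not your real site):

```shell
# Build a sample tree and archive it.
tmp=$(mktemp -d)
mkdir -p "$tmp/public_html/css"
echo "body{}" > "$tmp/public_html/css/style.css"
echo "<html>" > "$tmp/public_html/index.html"
tar -czf "$tmp/site.tar.gz" -C "$tmp" public_html

# List archive members (files only) and on-disk files in the same
# relative form, then diff; no differences means the sets match.
tar -tzf "$tmp/site.tar.gz" | grep -v '/$' | sort > "$tmp/in_archive.txt"
( cd "$tmp" && find public_html -type f | sort > in_dir.txt )
diff "$tmp/in_archive.txt" "$tmp/in_dir.txt" && echo "contents match"
rm -rf "$tmp"
```

Run the same comparison against the file-manager archive, and any missing files will show up in the diff.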

As server owner, I login with username ‘root’. I’m trying to archive all files inside a public_html folder.

As root you would have logged in as Master Admin.

Please, show me what you’re doing with a screenshot.

Sure it is. Whatever software/utility you use to create an archive, all of them should come out at a similar size. They might differ by ±2MB, but no tool is so efficient that it makes an archive half the size from the same set of files. Besides, as I already said, the archive cannot be extracted; it’s corrupt.

I created the archive again and made a recording of it. Sometimes it makes a 160MB file and sometimes it’s 0 bytes.

Ohh. :smiley: That’s how you’re using it!

I will take a look, but I never expected it to be used this way! Just after selecting files across 4 pages, right-click on the table row and click Download from the context menu. Does it work then?


Your method worked well. The archive downloaded and extracted properly with all the files. I’ll be downloading files this way now.

Do take a look at that archive function.


Okay, how many files and directories were there in the directory? Did you have server pagination on or off? You can check that by looking at the string below the path breadcrumb - did it say Total: or Paginated: for you?
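If you want an exact count from the shell, something like this would do (sketched on a throwaway directory here; point it at your real public_html path):

```shell
# Count entries directly inside a directory, non-recursively.
tmp=$(mktemp -d)
mkdir -p "$tmp/public_html"
touch "$tmp/public_html"/file{1..5}.txt

# find lists the directory itself plus its entries, so subtract 1.
count=$(( $(find "$tmp/public_html" -maxdepth 1 | wc -l) - 1 ))
echo "entries: $count"
rm -rf "$tmp"
```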


It just works for me in my test cases.

@Vipul.K Duh. You already shared video-screencast. Sorry, too tired. I will try to reproduce it over again tomorrow.

Do you have it happening when choosing .zip as an option?

When I select the zip option, it says something like “zip can’t be used when selecting files across multiple pages”. However, when I go up a folder and zip the entire public_html folder, it works fine.