CPU load averages: 0.65 (1 min), 0.65 (5 min), 0.73 (15 min). Real memory: 18.91 GiB used / 17.09 GiB cached / 62.66 GiB total.
Upgraded from 2.021 to 2.102. Immediately, virtual memory (swap, 4 GiB) jumped to maximum (real RAM usage stayed the same), and the download from the server via the upload/download module failed: it halted, opened a browser tab as normal (slowly), and then the browser gave a network error. Tried different themes; no help. “Show in browser if possible?” was set to NO.
Uninstalled, and then downgraded to 2.021: all works just fine as before.
This makes me very wary of acting when the next upgrade appears. These things should be tested better before release.
All updates applied, so it is current.
Also: “CentOS Linux 7 will continue to receive community security patches and bug fix updates until June 2024.”
Running a live server with thousands of people online 24/7 means that when it is running fine, you do not change things unless absolutely necessary. I see no reason Webmin should not be backwards compatible in simple modules like upload/download.
Where do the changelogs say anything about distro version requirements?
OS version almost certainly has nothing to do with this. I don’t know why @Stegan yells about CentOS 7 every time it comes up, no matter the context; it’s not a helpful comment and almost always just distracts from whatever the problem is. CentOS 7 is a maintained distro, and it is supported by all of our software. Unless the problem is “the packages are old”, the simplest solution is not “upgrade your OS”. Granted, if someone were installing a new system with CentOS 7, that would be crazy, and worth offering a gentle nudge in the direction of Rocky 9.
I encourage you to do whatever level of testing you believe is appropriate for your deployments, if you find our level of testing is insufficient. We are a tiny mostly volunteer team working on a project that’s over a million lines of code and with effectively no resources. We’re doing the best we can.
The files are two zip archives of about 6.5 GB each (no extension), downloaded from /root/subfolder.
Sorry, I can’t provide the files for testing; they are private backups. I’m sure a random file will work for testing.
I have downloaded them with Webmin daily for a very long time without problems (apart from an odd speed-limiting/slowdown that happens just before 1 GB is done, which may be caused by Java).
I did not notice other issues, but then again I mostly use Webmin for monitoring. The odd issue was swap instantly at 100%; it did not affect anything adversely, and after I reinstalled the old Webmin I cleared it out without problems.
With 2.102 the download process started slowly: it opened the tab, and after about 10 seconds Chrome moved it to the download page (filename and size were correct), where after a moment it failed due to a “network error”. My network was fine and Webmin refreshed normally (1-gigabit fibre at home).
You’re right! This is a bug indeed! I’m not sure how I could miss that!
Please do the following:
Upgrade Webmin to the latest 2.102 version.
Apply the latest patch by running:
cd /usr/libexec/webmin && curl https://github.com/webmin/webmin/commit/99b89595596261ceb847d94dae14e01da4d843c8.patch | git apply --reject --whitespace=fix
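As a hedged follow-up to the command above: `git apply --reject` leaves `*.rej` files for any hunks it could not apply, so after patching it is worth checking for rejects. This sketch uses a scratch directory so it is safe to run anywhere; on the server the directory to check would be /usr/libexec/webmin.

```shell
# Check for rejected patch hunks. DIR is a stand-in; use /usr/libexec/webmin
# on the real host.
DIR=./patch-check-demo
mkdir -p "$DIR"
REJECTS=$(find "$DIR" -name '*.rej')
if [ -n "$REJECTS" ]; then
  echo "Some hunks failed to apply:"
  echo "$REJECTS"
else
  echo "Patch applied cleanly"
fi
```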
Also, since you have a lot of RAM, you may consider increasing the default buffer size for downloads from its current default of 6553600 bytes to a larger value, using the bufsize_binary option in /etc/webmin/miniserv.conf.
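A minimal sketch of that change, assuming a standard miniserv.conf layout of `key=value` lines. A stand-in copy of the file is used here so the snippet is safe to run anywhere; on the real host the path is /etc/webmin/miniserv.conf, and the 16777216 (16 MiB) value is only an example.

```shell
# Stand-in config file; replace with /etc/webmin/miniserv.conf on the server.
CONF=./miniserv.conf.demo
printf 'port=10000\nbufsize_binary=6553600\n' > "$CONF"

# Drop any existing bufsize_binary line, then append the larger value.
sed -i '/^bufsize_binary=/d' "$CONF"
echo 'bufsize_binary=16777216' >> "$CONF"

grep '^bufsize_binary=' "$CONF"
# On the real host, restart Webmin afterwards (e.g. systemctl restart webmin).
```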
@Jamie we will need a new Webmin 2.103 release to address this and a few other small issues.
In the corporate world, people use “older” versions of software that are still supported and updated with all security patches (so they are up to date), because those versions have stood the test of time; if the system works as expected and is stable, that is fine.
Migrating again and again is not feasible, downtime is not desirable, and an upgrade could break things, leaving clients who depend on the offered services out of luck.
Mainstream “new” releases are in general beta testing, just like Fedora was for Red Hat: Red Hat took only stable code to put into their enterprise edition (plus their own corporate code, of course).
The “best” buffer size can vary. What is certain is that you do not want a buffer size that causes your system to swap, as accessing swap space (disk) is much slower than accessing RAM.
In practice, a common buffer size is 4 KB (4096 bytes), which corresponds to the block size of many file systems. However, that is far too small a chunk of data to read at a time for a multi-gigabyte file. So the buffer for downloads should be larger, and in practice will depend on system resources, in particular the amount of available RAM and the network bandwidth.
That does not mean the buffer should be set to gigabytes. A reasonable download buffer in your case would be something like 8192000 bytes. If you feel that your system (including the network) can handle more, in particular if the download seems slower than your network can sustain, then you can increase this value.
Play with different values and let us know what worked the best for you!
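To see why a 4 KB buffer is far too small at these file sizes, here is a quick back-of-the-envelope count of how many reads each buffer size implies; the 6500000000-byte figure is just an approximation of the ~6.5 GB files mentioned earlier.

```shell
# Rough read counts for a ~6.5 GB file at the buffer sizes discussed above
# (4096 = common FS block size, 6553600 = Webmin default, 8192000 = suggested).
FILE_BYTES=6500000000
for BUF in 4096 6553600 8192000; do
  READS=$(( (FILE_BYTES + BUF - 1) / BUF ))
  echo "buffer $BUF bytes -> $READS reads"
done
```

The ratio is what matters: the jump from 4096 bytes to a multi-megabyte buffer cuts the number of reads by roughly three orders of magnitude, after which further increases buy very little.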