How to use Bacula

Ubuntu 18.04

Hello everyone… I’ve installed Bacula and it appears to be working (at least I’m getting no errors when I open it in Virtualmin). But as much as I’ve tried to configure it to back up files on my server, I’ve had no success at all! Can someone show me a basic setup, or point me in the direction of a basic tutorial that would get me headed in the right direction? If someone has successfully set the system up, I’m sure once I’ve seen it I will be able to use it!

This is probably not the best place to ask about Bacula. We didn’t make it (and though there is a Webmin module for it, I don’t use it). The Bacula website seems to have lots of documentation, though I can’t vouch for its quality. https://www.bacula.org/

Thank you… I tried reading through their site and it’s really confusing. I’ll try again when I have a couple hours to dedicate!

Dan Lewis

Maybe it’s just not the right tool for your needs? It’s a pretty complicated creature. There are many ways to back up your system(s), and not all of them are that complicated. It has reasons for its complexity, but maybe they just don’t matter to your deployment(s).

I want to back up the data in certain directories, and I’d like to automate it so that happens every night. Ideally it would create a new directory each night and delete the oldest one after six days or so. What would you suggest as the best software to do that?

Dan

Have you tried the Webmin Filesystem Backup module? It uses simple tarballs and has basic scheduling functionality. Works for me.
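
For what it’s worth, here is roughly what that kind of tarball-plus-schedule setup boils down to outside the module: a minimal sketch of a nightly tar backup with a six-day rotation. The paths (/home/example, /backup) and the script itself are placeholders I’m assuming, not anything the module creates for you.

```bash
#!/bin/bash
# Minimal sketch: nightly tar backup with a six-day rotation.
# /home/example and /backup are placeholder paths -- adjust to your setup.

set -euo pipefail

SRC="/home/example"        # directory to back up
DEST="/backup"             # where the dated backup directories live
STAMP="$(date +%Y-%m-%d)"  # e.g. 2019-05-01

mkdir -p "$DEST/$STAMP"

# -c create, -z gzip, -p preserve permissions, -f archive file name
tar -czpf "$DEST/$STAMP/files.tar.gz" "$SRC"

# Remove dated backup directories more than six days old.
find "$DEST" -mindepth 1 -maxdepth 1 -type d -mtime +6 -exec rm -rf {} +
```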

The files I’d like to back up are fairly large, and this system fails saying the files are too big. Am I doing something wrong? It seems as though this is primarily for small files that might fit on a tape backup. Also… is there a way to back the files up to a different machine on the internet? It seems as though this only works with files local to the server.

Dan

How big? What is the exact error?

I don’t think tar has a limit that you’d be likely to run into, but maybe I’m wrong. There used to be some 2GB limits in various Linux file-related stuff, but I’d be shocked if they’re still there.
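
If you want to rule that out yourself, a couple of quick checks on the destination side might help. These are standard shell commands, nothing Webmin-specific, and /backup is just a placeholder path:

```bash
# Per-process file size limit; "unlimited" is the usual answer on a modern system.
ulimit -f

# Filesystem type and free space on the destination.
# A vfat (FAT32) destination, for example, caps individual files at 4 GB,
# which a 95 GB tarball would hit almost immediately.
df -hT /backup
```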

I’m not sure where the log would be to check the error. When I do a backup to a local drive, things seem to work fine. The backup file is 95 gigs. When I try to do it over the internet, the file gets to around 5 or 6 gigs and quits. I’m not sure if this is a connectivity problem or a problem with the program itself. Previously I was only able to get to one or two gigs before it stopped, but that problem seems to have resolved itself.

Dan

Oh, you definitely shouldn’t try to do a huge backup interactively; browser timeouts, intermittent connectivity, and the like make it pretty likely to fail. Just schedule it, and then see how it turns out.
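
When it runs from a schedule, it also helps to capture the output somewhere you can read afterwards, since it isn’t obvious where the module logs failures. Assuming a hypothetical wrapper script at /usr/local/bin/nightly-backup.sh (like the sketch above), a crontab entry along these lines keeps its own log:

```bash
# Added via `crontab -e`: run at 02:00 every night and append
# both stdout and stderr to a log file for later inspection.
0 2 * * * /usr/local/bin/nightly-backup.sh >> /var/log/nightly-backup.log 2>&1
```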

I scheduled it using the file backup program. It did fail twice. Maybe it would be better to back it up locally and then try to transfer the single file. What do you think?

Dan
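
Backing up locally and then transferring the finished archive is a reasonable approach, if only because the transfer step can then be retried and resumed. Here is a sketch using rsync over SSH; the host name, user, and paths are placeholders:

```bash
# Copy the finished local archive to a remote machine over SSH.
# --partial keeps a partially transferred file so a rerun resumes
# instead of starting over; --progress shows how far it gets.
rsync -av --partial --progress \
    /backup/2019-05-01/files.tar.gz \
    backupuser@remote.example.com:/backups/
```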

Here is the image of the error, and this is what I’ve been running into. It’s asking for a tape change. On the local server it made it all the way to 95 gigs, but over the internet it variously got to 5, 6.5, and 10.3 gigs before showing this same error.

Dan

I’m pretty sure that means it ran out of space on the destination device (or hit a size limit on the destination device). tar was originally written for tapes, so it uses outmoded language, but it works fine on disks.
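
Two things worth checking when that tape-change prompt shows up: whether the destination actually has room for the whole archive, and whether a single 95 GB file is itself the problem. If it is, splitting the archive into fixed-size pieces is one workaround; the 5G chunk size and file names below are just examples:

```bash
# How much space is free on the destination?
df -h /backups

# If one 95 GB file is the problem, split the archive into 5 GB pieces...
split -b 5G files.tar.gz files.tar.gz.part-

# ...and reassemble it on the other end before restoring.
cat files.tar.gz.part-* > files.tar.gz
```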

This topic was automatically closed 60 days after the last reply. New replies are no longer allowed.