Hi! A friend just recommended the backup tool that comes with Ubuntu. I took a look at it and was wondering what you all include in and exclude from your backups. I just installed the WireGuard VPN and put the config file in /etc/wireguard, where it belongs. I'd have to include that folder as well if I want to keep my configs, and I guess many programs do the same. So how do you decide what to include, so that you can just restore the last backup if something breaks or you get a new machine? Maybe that's a stupid question, but it's been going through my head for some time now. Thanks a lot!

  • atzanteol@sh.itjust.works · 8 months ago

    My philosophy is “anything I can’t reproduce easily”. This will vary depending on the machine and data. But it’s been a good guide so far.

  • Nia [She/Her]@beehaw.org · 8 months ago

    I back up the entirety of my /home directory except for caches, temp files, trash (stored at /home/$user/.local/share/Trash), my Downloads folder, and a folder I named “NOBACKUP”.

    It backs up a lot of stuff I probably don’t need, but I’d rather back up more than I need than get caught not having backed up something I did need.

    edit: oh, I have a btrfs snapshot of /root too, but I don’t think the backup tool in Ubuntu can do that, since Ubuntu defaults to ext4

    • lemmyvore@feddit.nl · 8 months ago

      It’s best not to overuse native filesystem snapshots. Someone else here said they delete them daily; that’s the right spirit.

      Filesystem snapshots can’t be separated from the filesystem they live on, and they’re strictly incremental, to the point of being all-or-nothing, which is quite inconvenient.

      They’re good for those “oh fuck” moments when you’ve just deleted the wrong dir, but that’s about it.

      • Nia [She/Her]@beehaw.org · 8 months ago

        That’s a good point. My /home backup goes through borgbackup, which I keep for a bit longer (7 daily plus the last 2 weekly archives before it prunes them), and my /root btrfs snapshots were set to be kept for 7 days just out of habit. I’ll probably dial that back to 2-3 days instead. I do intend them as mere rollbacks rather than actual backups, but I tend to be overly cautious for my own good sometimes.
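
        For reference, that retention boils down to a borg prune policy roughly like this (the repo path is a placeholder):

            borg prune --list --keep-daily 7 --keep-weekly 2 /path/to/repo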

        I like to keep more than just the last day of snapshots as a minimum, in case something was silently breaking my system that I didn’t notice for a few days and that’s too advanced for me to fix by hand.

  • lurch (he/him)@sh.itjust.works · 8 months ago

    the most important thing is your user files. everything else just speeds up recovery.

    you should keep a bootable recovery medium around, like an installer USB, so you don’t have to bother your neighbours for one at 2 in the morning…

    to restore faster, you either make disk images (you can restore everything quickly in one go) or save partition layouts, configs and package selections, as well as everything you installed without package management. if you skip this second part, you have to sit through a reinstall and figure everything out again, and that sucks if you don’t have time. like when you really need to open that document, but you forgot the name of the program you use to edit it, etc…
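
    saving the partition layout, for example, is a single command (a sketch; /dev/sda is just an example disk):

        sudo sfdisk -d /dev/sda > sda-layout.txt    # dump the partition table
        # put it back later with: sudo sfdisk /dev/sda < sda-layout.txt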

    if you use just one large file system, you can tar everything up using --one-file-system, so it skips stuff like the insides of mounted snap packages, which are also present in another place. on restore you then have to format, untar and install a boot loader. beware that EFI boot can be difficult to set up and lives on another partition, so this is just for pros. however, this lets you use tar features like differential backups.
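
    a rough sketch of the tar approach (paths and the destination are examples, not a recipe to copy blindly):

        # archive the root filesystem, staying on one filesystem
        # and keeping permissions (-p); run as root
        tar --one-file-system -cpzf /mnt/backup/root-$(date +%F).tar.gz \
            --exclude=/mnt/backup /

        # restore sketch: boot the installer USB, partition and format
        # the target, mount it at /mnt, then:
        tar -xpzf /mnt/backup/root-YYYY-MM-DD.tar.gz -C /mnt
        # ...and reinstall the boot loader before rebooting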

      • lemmyvore@feddit.nl · 8 months ago

        Yes, but be careful with that option, because it depends on how each tool implements it. Some of them will not cross btrfs subvolumes, for example.

        • Avid Amoeba@lemmy.ca · 8 months ago

          Yeah, I can see some other funny cases too if there are multiple partitions with separate filesystems on them. Just doing a regular tar/rsync with exclusions is likely safer, as it works in either case.

  • ChojinDSL@discuss.tchncs.de · 8 months ago

    If you don’t know, or aren’t sure, back up everything if you have the space. Once you’ve hit a couple of disaster scenarios, it will become apparent what stuff is really important.

    Obviously, the stuff you can’t recreate otherwise is the most important. But even the stuff you can recreate from other sources might be worth backing up for the time savings, i.e. it’s faster to restore from a backup than to recreate it.

    • kevincox@lemmy.ml · 8 months ago

      Yup. Step 1 is backing up everything. Step 2 is maybe improving your reproducibility and then removing the things that can be reproduced from the backups.

    • Brewchin@lemmy.world · 8 months ago

      Great advice. For me, it’s the irreplaceable data first, and then stuff like configs and credentials/keys.

      My borg-backup config (to my NAS) covers “My Documents”-type files, /etc stuff I’m likely to customise, and home stuff, excluding things like “*Cache”, “*Storage”, assets/icons/history/recent/blah. It’s tedious to fine-tune, but I figure too much is infinitely better than too little.
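
      In borg terms it boils down to something like this (the repo path and patterns are illustrative, not my exact config):

          borg create --stats \
              /path/to/repo::{hostname}-{now} \
              ~/Documents ~/.config /etc \
              --exclude '*/.cache' \
              --exclude '*Cache*'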

      If I want to be able to do an image-based restore, then I’d use a different tool. But life’s too short for that.

    • pbjamm@beehaw.org · 8 months ago

      Also, while it may be fairly easy to recreate the OS/application install from scratch, that is generally small potatoes storage-wise compared to your music/movies/photos etc., which you definitely want to back up.

  • limelight79@lemm.ee · 8 months ago

    Data and configurations.

    If you have the space, software is nice because it makes it easier to get the system going again, but the data (your files: music, documents, pictures) and system configuration files (/etc, for example) are the most critical. If you have databases set up, learn about their dump commands and add those to your backup.
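
    For the common databases the dump commands look something like this (paths are examples; credentials/options vary):

        # PostgreSQL: dump all databases to one file
        sudo -u postgres pg_dumpall > /backup/postgres-$(date +%F).sql

        # MySQL/MariaDB: all databases
        sudo mysqldump --all-databases > /backup/mysql-$(date +%F).sql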

    You don’t have to use the same method for everything. My pictures are backed up to a drive in a second computer and to Amazon Glacier for $2/month (I’ll have to pay to download them if I ever need them, but I’ll gladly pay if I’m in that situation; they should only be needed after a major house fire or something like that). My weekly backups cover my /home directories, /etc, /root, a database dump, and maybe one or two other important things.

    • kevincox@lemmy.ml · 8 months ago

      Really, configuration is best not backed up but created from some source of truth like a Git repo. But a backup can serve as a poor man’s version control.

  • cmnybo@discuss.tchncs.de · 8 months ago

    I take a btrfs snapshot of my root partition daily so I can easily revert to an older version if I break something or get a bad update. There’s nothing on my desktop or laptop root partition that can’t be easily replaced, so I don’t bother with any backups apart from the snapshots.
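
    The daily snapshot amounts to a one-liner, roughly (assuming the root subvolume is mounted at / and a /.snapshots directory exists):

        # create a read-only snapshot named after today's date
        sudo btrfs subvolume snapshot -r / /.snapshots/root-$(date +%F)
        # drop old ones with: sudo btrfs subvolume delete /.snapshots/root-...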

    On my server, I keep multiple backups of /etc/, since there is a lot of stuff in there that I set up manually.

    If you just want to back up the configuration, you can back up the entire /etc/ directory; it will only take a few MB when compressed.
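
    For example (the destination is up to you):

        sudo tar -czf /backup/etc-$(date +%F).tar.gz /etc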

    • anarchoilluminati [comrade/them]@hexbear.net · 8 months ago

      Do you know what you’d have to back up to keep desktop settings, fonts, themes, app design customization, and so on? Would that be /etc/? I basically want to back up the desktop environment itself, if that makes sense. I already back up files (pictures, videos, and so on) easily, but I’m not sure how to back up the desktop itself, and it’d take so much more time to set up again, so that’s actually my priority. Sounds like it would be /etc/, but just clarifying.

      I’m using PopOS.

      • lemmyvore@feddit.nl · 8 months ago

        You want to back up your home dir (/home/username) for that, specifically the .config and .local dirs in there. But be aware that a lot of apps circumvent that convention; Firefox uses .mozilla, for example. If you set your file manager to show hidden files you’ll see a lot of stuff in there, unfortunately.
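
        A minimal sketch of that, assuming an external drive mounted at /mnt/backup (add app-specific dirs like .mozilla as you find them):

            rsync -a ~/.config ~/.local/share ~/.mozilla /mnt/backup/dotfiles/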

  • Avid Amoeba@lemmy.ca · 8 months ago

    If you want to be able to restore the machine completely, with everything installed and configured, then yes, you have to back up everything. There are generally two ways: file-level backup, where you’d use something like rsync or tar, and block-level, where you back up the whole partition/disk using something like dd or Clonezilla. The latter is the easiest to restore, but it’s a bit of a pain to create because the system generally has to be offline, booted from an alternative OS. The former is a bit more difficult to restore, but not by much, and it’s far easier to create: you can do it while the system is live. I’d probably try that first. Find documentation on backing up a complete root filesystem with rsync/tar and you’re good to go; it’s typically a single command which can be run on a schedule, like the sketch below.
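
    The usual rsync variant looks something like this (bash; the destination and exclude list are examples along the lines of the common guides):

        sudo rsync -aAXHv --one-file-system \
            --exclude={"/dev/*","/proc/*","/sys/*","/tmp/*","/run/*","/mnt/*","/media/*","/lost+found"} \
            / /mnt/backup/rootfs/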

    The built-in GUI backup tool is generally intended for your own user data. In order to back up other things it has to run as root or be given capabilities, and that might get more complicated than using straight rsync/tar.

    • lemmyvore@feddit.nl · 8 months ago

      You can use Borg for both things you mentioned. It stores deduplicated chunks, so it doesn’t care whether you back up files or a block device.
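
      For the block-device case borg just needs to be told to read special files, roughly (the device path is an example; snapshot it first if it’s mounted):

          borg create --read-special /path/to/repo::root-img /dev/vg0/rootsnap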

      Not sure why you’d have to be offline to do that though.

      • Avid Amoeba@lemmy.ca · 8 months ago

        Because if you’re not offline, something is writing to the filesystem and changing blocks while you’re copying. If you’re lucky, what you copied is merely outdated. If you’re less lucky, it causes fs inconsistency that has to be cleaned up by fsck. If you’re even less lucky, you end up with silently corrupted files, e.g. a text file with old parts mixed with new. And if you’re unluckier still, you hit a vital metadata part of the fs and it can no longer be mounted.

        To clarify, the filesystem being block-copied has to be offline or mounted RO, not the whole OS. However if that’s the root/home filesystem, then you can’t unmount it while the OS is online.

        If you don’t want to deal with that you need a filesystem or volume manager that supports snapshots, then you can copy the snapshot. E.g. snapshot your root LVM vol, then block-copy the snapshot.
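
        A sketch of that, assuming the root LV is vg0/root and the VG has free space for the snapshot:

            sudo lvcreate -s -L 5G -n rootsnap vg0/root          # freeze a point in time
            sudo dd if=/dev/vg0/rootsnap of=/mnt/backup/root.img bs=4M status=progress
            sudo lvremove -y vg0/rootsnap                        # drop the snapshot when done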

    • WbrJr@lemmy.ml (OP) · 8 months ago

      Something I’ve always wondered: setting up Linux until everything runs without problems takes me quite some time. I’ve been using Linux regularly for about a year and have had to set it up 4-5 times. It’s almost always a pain, and I have to search online for quite a while until everything works. Does it get easier the more often you do it? Or do you create a setup script that does everything when you reinstall the system?

      • Avid Amoeba@lemmy.ca · 8 months ago

        I use config-as-code for some stuff but in reality there are many manual steps that aren’t covered. This is why I run an LVM mirror (RAID1) with two SSDs and I keep a full backup. The system hasn’t been reinstalled in 10 years.

        If you feel the way you do, you should probably just do a full-disk backup with Clonezilla or dd every X days and be done with it. If X is large, e.g. months, you should also run a home dir backup more often; the Ubuntu built-in tool is great for that. Then when something dies, restore the whole OS from the Clonezilla/dd backup, boot, restore the most recent home dir backup, reboot, and you’re back. Minimal effort.

        • WbrJr@lemmy.ml (OP) · 8 months ago

          I started my journey with Fedora, but got annoyed by things like videos not working. Ubuntu works pretty well for me, and I’ve had very few issues with it compared to Fedora. And that’s what I look for in an OS.

  • PenguinCoder@beehaw.org · 8 months ago

    An OS can be restored. Back up your data: /home for sure, and maybe any custom configs in /etc, like your WireGuard configs. So anything you specifically edited or added in the /etc directory.

    • everett@lemmy.ml · 8 months ago

      Skipping the OS backup is reasonable, but you probably want to at least save a package list. Add something like dpkg -l > ~/packages.txt to your backup script.
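
      If you want a list you can feed back in later, the classic pair is --get-selections/--set-selections:

          dpkg --get-selections > ~/packages.txt          # save
          sudo dpkg --set-selections < ~/packages.txt     # mark for install on the new machine
          sudo apt-get dselect-upgrade                    # actually install them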

  • UnfortunateShort@lemmy.world · 7 months ago

    I auto-backup my entire /home, except for stuff I explicitly exclude, and hidden files. Of the hidden files I only include some explicitly, because I don’t want to back up all the stuff programs put there without my knowledge.

    Config files outside of /home I copy semi-manually to and from a dedicated dir in which I replicate exactly where they go in my actual FS. I’ve written shell functions that make it easy to back up and restore stuff from there, and it’s synced to my cloud storage.
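
    The functions are basically thin wrappers like this (a simplified sketch; the names are made up):

        BACKUP_DIR="$HOME/sysconfig"   # mirrors the real filesystem layout

        cfg_save() {    # e.g. cfg_save /etc/wireguard/wg0.conf
            mkdir -p "$BACKUP_DIR$(dirname "$1")"
            sudo cp -a "$1" "$BACKUP_DIR$1"
        }

        cfg_restore() { # copies the file back to its real location
            sudo cp -a "$BACKUP_DIR$1" "$1"
        }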

  • GadgeteerZA@fedia.io · 8 months ago

    @WbrJr@lemmy.ml I’m on Manjaro Linux, but the principles are the same. I have an SSD boot drive and a 4TB hard drive for /home data etc. I also have a second 4TB drive for backups:

    1. Timeshift app - does snapshots of the OS to the backup drive. I keep 4 hourly snapshots, 2 daily ones, and one weekly one. This allows easy rollback from any update or upgrade that goes wrong.
    2. luckyBackup app - does a full daily rsync backup of /home data and configs. There are other rsync apps too, and you can opt for versioning if you have the space, but usually I’ve been fine recovering anything I deleted or overwrote by mistake; I do this more for hard drive failure. I also have an additional 1TB drive I keep in a safe and connect once a month or so for an offline backup.

  • Tick Dracy@lemm.ee · 8 months ago

    Hijacking this topic: I use software on Windows that does incremental backups of the system (the OS alongside documents, downloads, etc.). It can also be restored easily by booting a custom image from a USB stick and restoring the image it created.

    Is there anything like this for Linux?