Basically title. I’m in the process of setting up a proper backup for my configured containers on Unraid, and I’m wondering how often I should run my backup script. Right now I have a cron job set to run on Monday and Friday nights; is this too frequent? What’s your schedule, and do you strictly back up your appdata (container configs), or is there other data you include in your backups?
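
For reference, the cron entry in question looks roughly like this (the script path and exact time are placeholders, not the real setup):

    # Run the backup script at 23:00 on Mondays and Fridays
    0 23 * * 1,5 /boot/config/scripts/backup_appdata.sh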

  • notfromhere@lemmy.ml · 5 months ago

    Not as often as I should

  • atzanteol@sh.itjust.works · 5 months ago

    I have a cron job set to run on Monday and Friday nights, is this too frequent?

    Only you can answer that - what is your risk tolerance for data loss?

  • Andres@social.ridetrans.it · 5 months ago

    @Sunny Backups are done weekly, using Restic (and with ‘--read-data-subset=9%’ to verify that the backup data is still valid).

    But that’s in addition to doing nightly Snapraid syncs for larger media, and Syncthing for photos & documents (which means I have copies on 2+ machines).
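
    A sketch of the kind of verification run described above, assuming an already-initialized restic repository (the repository location and password handling are placeholders):

        # Check repository integrity and re-read roughly 9% of the pack data
        # (repository path is an assumption, not the actual setup)
        export RESTIC_REPOSITORY=/mnt/backup/restic-repo
        restic check --read-data-subset=9%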

  • Lem453@lemmy.ca · 5 months ago

    Local zfs snap every 5 mins.

    Borg backs everything up every hour to 3 different locations.

    I’ve blown away docker folders of config files a few times by accident. So far I’ve only had to dip into the zfs snaps to bring them back.
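
    A minimal sketch of that layered schedule as cron entries (the dataset name, repository path, and passphrase handling are all placeholders):

        # Local ZFS snapshot of the appdata dataset every 5 minutes
        */5 * * * * zfs snapshot tank/appdata@auto-$(date +\%Y\%m\%d-\%H\%M)
        # Hourly Borg archive pushed to one of the repositories
        0 * * * * borg create /mnt/backup/borg-repo::'{hostname}-{now}' /mnt/tank/appdata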

    • Avid Amoeba@lemmy.ca · 5 months ago

      Try ZFS send if you have ZFS on the other side. It’s insane. No per-file I/O, just the snapshot and the time for the network transfer of the delta.
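
      A sketch of what that looks like in practice, with hypothetical dataset, snapshot, and host names:

          # Initial full send of a snapshot to the remote pool
          zfs send tank/appdata@snap1 | ssh backuphost zfs receive backuppool/appdata
          # Later runs only send the delta between the last two snapshots
          zfs send -i tank/appdata@snap1 tank/appdata@snap2 | ssh backuphost zfs receive backuppool/appdata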

      • Lem453@lemmy.ca · 5 months ago

        I would, but the other side isn’t ZFS, so I went with Borg instead.

  • Lucy :3@feddit.org · 5 months ago

    Every hour, automatically

    Never on my laptop, because I’m too lazy to create a mechanism that detects when a backup is possible.
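
    One possible shape for such a mechanism, purely as a sketch (the target host, power-supply path, and backup command are all illustrative assumptions):

        #!/bin/sh
        # Skip the run unless the backup target is reachable and the laptop is on AC power
        ping -c1 -W2 backup.example.lan >/dev/null 2>&1 || exit 0
        grep -q 1 /sys/class/power_supply/AC/online 2>/dev/null || exit 0
        exec /usr/local/bin/run-backup.sh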

    • thejml@lemm.ee · 5 months ago

      I just tell it to back up my laptops every hour anyway. If it’s not on, it just doesn’t happen, but it’s generally on enough to capture what I need.

  • hendrik@palaver.p3x.de · 5 months ago

    Most backup software allows you to configure retention. I think I went with a pretty standard scheme: one backup per day for a week. After that they get deleted, and it keeps just one per week of the older ones, for one or two months. After that it’s down to monthly snapshots. I think that aligns well with what I need. Sometimes I find out something broke the day before yesterday, but I don’t think I’ve ever needed a backup from exactly the 12th of December or some other specific date. So I’m fine with them getting sparser over time. And I don’t need full backups more often than necessary; an incremental backup will do unless there’s some technical reason to do full ones.

    But it entirely depends on the use case. Maybe for a server, or for stuff you actively work on, you don’t want to lose more than a day, while it can be perfectly alright to back up a laptop once a week, especially if you save your documents in the cloud anyway. Or you’re busy during the week and only mess with your server configuration on weekends; in that case you might be alright with taking a snapshot on Fridays. Idk.

    (And there are incremental backups, full backups, and filesystem snapshots. On a desktop you could just use something like Time Machine… You can back up different filesystems at different intervals…)
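
    As a concrete example of that tiered retention, expressed with restic’s forget command (restic is just one possibility; the comment doesn’t name a tool):

        # Keep 7 dailies, 8 weeklies and 12 monthlies; prune everything else
        restic forget --keep-daily 7 --keep-weekly 8 --keep-monthly 12 --prune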

  • itsame@lemmy.world · 5 months ago

    Using Kopia, backups are made multiple times per day to Google Drive. Only changes are transferred.

    Configurations are backed up once per week and also manually, and stored for 4 weeks. Websites and NextCloud data are backed up every hour and stored for a year (although I’ve only been doing this for 7 months so far).

    Kopia is magic, recommended!
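
    For reference, per-path schedules and retention in Kopia are handled through snapshot policies; a minimal sketch with placeholder paths (not the actual setup described above):

        # Take a snapshot of a directory into the configured repository
        kopia snapshot create /srv/nextcloud/data
        # Keep roughly a year of points for that path
        kopia policy set /srv/nextcloud/data --keep-hourly 24 --keep-daily 30 --keep-weekly 52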

  • AnExerciseInFalling@programming.dev · 5 months ago

    I use Duplicati for my backups, and have backup retention set up like this:

    Save one backup each day for the past week, then save one each week for the past month, then save one each month for the past year.

    That way I have granular backups for anything recent, and the further back you go, the less frequent the backups are, to save space.
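
    If I’m reading Duplicati’s custom retention syntax right (treat the exact option value as an assumption), that scheme corresponds to something like:

        --retention-policy="1W:1D,1M:1W,1Y:1M"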

  • mosjek@lemmy.world · 5 months ago

    I classify the data according to its importance (gold, silver, bronze, ephemeral). The frequency of the zfs snapshots (15 minutes to several hours) and their retention time (days to years) on the server depend on this classification. Once a day I then send the more important data that I cannot restore, or could only restore with great effort (gold and silver), to another server. For bronze, the zfs snapshots and a few days of retention on the server are enough for me, as it is usually data that I can regenerate (build artifacts or similar) or that is simply not that important. Ephemeral is for unimportant data such as caches or pipelines.
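
    As an illustration of that tiering, the differing cadences might be driven by per-dataset cron entries like these (the dataset names and helper scripts are made up):

        # gold: snapshot every 15 minutes, retained for years
        */15 * * * * zfs snapshot tank/gold@auto-$(date +\%Y\%m\%d-\%H\%M)
        # bronze: snapshot every 6 hours, pruned after a few days
        0 */6 * * * zfs snapshot tank/bronze@auto-$(date +\%Y\%m\%d-\%H\%M)
        # gold and silver only: ship the daily delta to the second server
        0 4 * * * /usr/local/bin/send-tier-delta.sh gold silver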

  • AMillionMonkeys@lemmy.world · 5 months ago

    I tried Kopia, but it was unstable and janky, so now it’s whenever I remember to manually run a bunch of rsync commands. I back up my desktop to cold storage on the first of the month, so I should get in the habit of backing up my server to the NAS then as well.
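
    A typical manual run of that sort might look like this (the paths and host are placeholders):

        # Mirror appdata to the NAS, preserving attributes and deleting files removed at the source
        rsync -aHAX --delete /mnt/user/appdata/ nas:/volume1/backups/appdata/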

  • ikidd@lemmy.world · 5 months ago

    Proxmox servers are mirrored zpools, not that RAID is a backup. Replication between Proxmox servers every 15 minutes for HA guests, hourly for less critical guests. Full backups with PBS at 5 AM and 7 PM, 2 sets apiece, with one set that goes off-site and is rotated weekly. Differential replication every day to zfs.rent. I keep 30 dailies, 12 weeklies, 24 monthlies and infinite annuals.

    Periodic test restores of all backups at various granularities at least monthly or whenever I’m bored or fuck something up.

    Yes, former sysadmin.
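
    For reference, that retention roughly maps onto PBS prune options like these (the backup group is a placeholder, the repository is assumed to be set via PBS_REPOSITORY, and this is a sketch rather than the actual job config):

        # 30 dailies, 12 weeklies, 24 monthlies; a large yearly count stands in for "infinite" annuals
        proxmox-backup-client prune host/pve1 \
          --keep-daily 30 --keep-weekly 12 --keep-monthly 24 --keep-yearly 100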

    • scarecrow365@reddthat.com · 5 months ago

      This is very similar to how I run mine, except that I use Ceph instead of ZFS. Nightly backups of the CephFS data with Duplicati, followed by staggered nightly backups of all VMs and containers to a PBS VM on the NAS. File backups from Unraid get sent up to CrashPlan.

      Slightly fewer retention points to cut down on overall storage, and a similar test pattern.

      Yes, current sysadmin.

  • infinitevalence@discuss.online · 5 months ago

    Depends on the application. I run a nightly backup of a few VMs because realistically they don’t change much. On the other hand, I have containers that run systems critical to me, like my photo backup, and those are backed up twice a day.