Hi, fellow self-hosters.

Almost a year ago I experimented with Immich and found, at the time, that it was not up to par with what I expected from it. Basically, my use case was slightly different from the Immich user experience.

After all this time I decided to give it another go, and I am amazed! It has grown a lot: it now has all the features I need that were lacking back then.

So, in just a few hours I set it up and configured my external libraries, backup, storage template and OIDC authentication with Authelia. Everything works.

Great kudos to the devs, who are doing amazing work.

I have documented all the steps of the process at the link at the top of this post; I hope it can be useful for someone.
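
For the external libraries part, the step that is easy to miss is making the photo folders visible inside the container before adding them as external libraries in the admin UI. A minimal sketch of the relevant compose excerpt, assuming the stock Immich docker-compose layout and an example host path of /mnt/photos (names and paths here are placeholders, adjust to your own setup):

services:
  immich-server:
    # ...rest of the stock immich-server definition...
    volumes:
      - ${UPLOAD_LOCATION}:/usr/src/app/upload   # stock upload/library volume
      - /mnt/photos:/mnt/photos:ro               # example external library, mounted read-only

The same host path you mount is the one you then register as the external library's import path in the admin UI.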

  • happydoors@lemm.ee · 21 hours ago

    My only issue with it is that on my iPhone, the app constantly freezes and says I have 3 photos left to upload. It almost always freezes for a few minutes and the upload stalls as well. This behavior made it take a long time to back up my library, and it makes it a pain in the ass to share photos quickly with people. Popping into the web UI has none of these issues (it just doesn't upload my photos). I still quite love the app.

  • nucleative@lemmy.world · 24 hours ago

    Haven’t checked in a while but is there any hope for cloud storage of the image library yet? I’m kind of holding out for S3 support because I don’t want to manage multiple terabytes locally.

    • sandwichsaregood@lemmy.world · 11 hours ago

      I don't think Immich supports this natively, but you could mount an S3 store with s3fs-fuse and put the library on there without much trouble. There are many other options too, like WebDAV.
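
      For example, a rough sketch with s3fs-fuse; the bucket name, mount point and endpoint below are placeholders, and you would then point Immich's upload/library location at the mount:

      # credentials file in the format s3fs expects
      echo "ACCESS_KEY_ID:SECRET_ACCESS_KEY" > ~/.passwd-s3fs
      chmod 600 ~/.passwd-s3fs

      # mount the bucket where the Immich library will live
      s3fs my-immich-bucket /mnt/immich-library \
          -o passwd_file=~/.passwd-s3fs \
          -o url=https://s3.example.com \
          -o use_path_request_style

      Expect some FUSE overhead, though; thumbnailing and metadata scans against an object store mounted this way can be noticeably slower than local disk.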

  • Sibbo@sopuli.xyz · 2 days ago

    I've been using Immich for half a year or so now. The only problem is that it didn't do chunked uploads, so one large video just never uploaded and I had to use Nextcloud to upload it instead. Otherwise, it's great.

    • Shimitar@downonthestreet.eu (OP) · 22 hours ago

      Yes, I encountered this issue as well. Tweaking the NGINX settings seems to have helped. It's still silly that one large upload will stall all the others.
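
      For anyone hitting the same thing, the NGINX directives that usually matter for big uploads are the body-size limit and the proxy timeouts; something along these lines in the server/location block that proxies Immich (values are only examples):

      client_max_body_size    50000M;   # or 0 to disable the limit entirely
      proxy_read_timeout      600s;
      proxy_send_timeout      600s;
      send_timeout            600s;
      proxy_request_buffering off;      # stream uploads instead of buffering them to disk first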

    • retro@infosec.pub · 2 days ago

      If you're self-hosting Immich on your local network, I've gotten around this by setting the Immich app to use my local IP address while on my home Wi-Fi network.

  • Darkassassin07@lemmy.ca · 2 days ago

    I'm curious:

    Which ML CLIP model did you go with, and how accurate are you finding the search results?

    I found the default kinda sub-par, particularly when it came to text in images.

    Switched to "immich-app/XLM-Roberta-Large-Vit-B-16Plus" and it's improved a bit, but I still find the search somewhat lacking.

    • waitmarks@lemmy.world · 2 days ago

      The best one I have found was one of the newer ones added a few months ago: ViT-B-16-SigLIP__webli.

      Really impressed with the accuracy, even with multi-word searches like "espresso machine".

      • Darkassassin07@lemmy.ca · 2 days ago

        How well does it do with text in images?

        I often find searching for things like ‘horse’ will do a decent job bringing up images of horses, but will often miss images containing the word ‘horse’.

  • corsicanguppy@lemmy.ca · 2 days ago

    • backup - noun
    • back up - verb

    I quit as soon as I saw it still has a Docker crutch. It fails security requirements due to the validation issue.

    Thanks, though. Sounds like it’s gonna work well for you.

    • Lem453@lemmy.ca · 2 days ago

      I used to use a Docker container that makes DB dumps of the database and drops them into the same persistent storage folder the main application uses. I used this for everything in Docker that had a DB.

      Immich has recently integrated this into the app itself, so it's no longer needed.

      All my Docker persistent data is in a top-level folder called dockerdata.

      In that I have subfolders like immich, which get mounted as volumes in the Docker apps.

      So now I have only one folder to back up for everything. I use ZFS snapshots to back up locally (zfs-auto-snapshot) and borgmatic for remote backups (BorgBase).

      All my Docker stacks have compose files that are in git.

      I can restore the entire server by restoring one data folder and one compose file per stack.
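
      As an illustration of that layout (dataset and folder names here are just examples, not taken from the post):

      # /dockerdata/immich/      <- persistent data for the Immich stack
      # /dockerdata/<other-app>/ <- same pattern for every other stack
      zfs list -t snapshot -r tank/dockerdata   # local snapshots created by zfs-auto-snapshot
      borgmatic --verbosity 1                   # remote backup of /dockerdata to the Borg repo (e.g. BorgBase)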

      • Ulrich@feddit.org · 2 days ago

        I don’t understand how that’s helpful. If something is corrupted or my house burns down, a local backup is going to go with it. That’s why I asked for external backups.

        • Shimitar@downonthestreet.eu (OP) · 22 hours ago

          I have three tiers of backup. Never heard of the 3-2-1 rule?

          3 backups, 2 locations, 1 offsite.

          I back up once to an external disk connected to the server, a second time to another disk connected to an OpenWrt router out on the patio, and a third copy is uploaded to my VPS in the cloud.

          Not all three are symmetrical due to disk sizes, but critical data is always backed up to all three. Daily backups.

          Restic does deduplication and encryption too, so actual disk usage is really minimal and everything is kept safe.
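
          For anyone wanting to copy the idea, a minimal restic sketch with three repositories; the repository locations and source path are placeholders:

          # each repository is created once with `restic -r <repo> init`;
          # the passphrase comes from RESTIC_PASSWORD or --password-file
          restic -r /mnt/usb-disk/restic              backup /srv/data   # tier 1: disk on the server
          restic -r sftp:backup@openwrt:/mnt/backup   backup /srv/data   # tier 2: disk on the router
          restic -r sftp:user@vps.example.com:/restic backup /srv/data   # tier 3: offsite VPS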

    • Shimitar@downonthestreet.eu (OP) · 2 days ago

      I back up with restic the database dumps made by Immich (not the live database itself) and the Library/library folder, which contains the actual images and videos.

        • bdonvr@thelemmy.club · 2 days ago

          If anyone's interested, here's my Immich backup script. You set up rclone to use an S3 storage service like Backblaze, which is quite cheap. I also use a crypt remote, which means rclone will encrypt and decrypt all files to/from the server. S3 configuration and crypt setup.

          Then set this up as a cron job. With the "BACKUP_DIR" option, when you delete a photo it gets moved to the "deleted" folder. You can go into your S3 provider's lifecycle settings and have these deleted after a number of days (I do 10 days), or you can skip that and they'll be gone forever.

          #!/bin/bash
          # Sync the Immich library to an encrypted rclone remote; files deleted locally are
          # moved into a dated "deleted" folder on the remote instead of being removed outright.
          SRC_PATH="/path/to/immich/library"
          DEST_REMOTE="b2crypt:immich-photos/backup"
          BACKUP_DIR="b2crypt:immich-photos/deleted"
          RCLONE_OPTIONS="--copy-links --update --delete-during --backup-dir=$BACKUP_DIR --suffix $(TZ='America/New_York' date +%Y-%m-%d).bak --verbose"
          rclone sync "$SRC_PATH" "$DEST_REMOTE" $RCLONE_OPTIONS   # options deliberately unquoted so they split into arguments
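
          For the cron part, a line like this (script path and schedule are just examples) runs it nightly:

          # crontab -e
          0 3 * * * /home/user/bin/immich-rclone-backup.sh >> /var/log/immich-backup.log 2>&1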
          
          
          • Ulrich@feddit.org · 2 days ago

            Yeah, I don’t know what any of these words mean. I just want to click “export” and back all the data up to a flash drive. Is that too much to ask?

            • Shimitar@downonthestreet.eu (OP) · 22 hours ago

              I think it is. It doesn't take much to understand which folders need to be backed up, and the Immich website is pretty clear on how to back up the database itself. No, a plain "export" wouldn't be good enough, since the files themselves do not include the metadata.

              • Ulrich@feddit.org · 2 days ago

                Reading the comment I replied to, it appears to be much much more complicated. And I don’t understand how anyone can claim otherwise.

                • catloaf@lemm.ee · 2 days ago

                  Key word is “appears”. Choose your source and destination, run rclone. That’s it. No harder than going to the page, clicking export, picking a folder, save. It’s really not hard at all, give it a try.

            • bdonvr@thelemmy.club · 2 days ago

              Well yeah you could go on the site and select whatever photos and hit download I suppose.

              • Ulrich@feddit.org · 2 days ago

                There’s no way to do that for your entire library. Also I assume that would not retain the Immich-specific metadata like the ML object tags and the “people” tagged in the photos.

                • bdonvr@thelemmy.club · 2 days ago

                  You should have a backup solution for your server that covers this; without one, you should probably stick with managed photo backup services.

  • TrickDacy@lemmy.world · 2 days ago

    I've been trialing some similar apps and none of them really fully satisfy me, including Immich. Mostly because they all make it clunky to exclude some photos from showing up, or the indexing is slow as hell and not particularly good at removing photos I recently ignored, deleted or moved. Immich in particular is bad with the ignore part. I wish I could edit a text block that defined ignore rules, like a gitignore, but instead you have to add each rule separately in the UI. Then it feels very slow to add thumbnails for raw files, and slow to index, period. So many of these apps seem to me like they fumbled the ball just short of a touchdown, because otherwise the feature sets seem nice.

    I have tried Damselfly, Immich, LibrePhotos and PhotoPrism, and I tried to configure Nextcloud Memories but I could not even get it running. It seemed pretty complicated and picky about its setup.

    • Shimitar@downonthestreet.eu (OP) · 2 days ago

      I went through them all, and probably a few more (Photoview), and Immich is by far the best. At this pace of development it will be perfect soon.

      It's by far the fastest for thumbnails and indexing, at least on my hardware.

  • jqubed@lemmy.world · 2 days ago

    Your website has a banner that says it uses cookies and that by using it I acknowledge having read the privacy policy, but if I click More Information it takes me to a page the wiki says hasn't been created yet.

    • Shimitar@downonthestreet.eu (OP) · 23 hours ago

      I have double-checked, but I do not have any banner on my wiki at all… Where did you see one? The only cookie is a technical cookie used solely for your preferences, with no tracking.

    • Shimitar@downonthestreet.eu (OP) · 2 days ago

      Never noticed. I don't do anything with the cookies anyway; it's just a self-hosted DokuWiki: no ads, no data collection, nothing. I don't even store logs.

      I might need to write the privacy policy… Will do tomorrow.

      • Atemu@lemmy.ml · 1 day ago

        If you don’t process any user data beyond what is technologically required to make the website work, you don’t need to inform the user about it.

      • teawrecks@sopuli.xyz · 2 days ago

        AFAIK the cookie policy on your site is not GDPR compliant, at least as it is currently worded. If all cookies are "technically necessary" for the function of the site, then I think all you need to do is say that. (I think for a wiki it's acceptable to require clients to allow caching of image data, so your server doesn't have to pay for more bandwidth.)

      • starshipwinepineapple@programming.dev · 2 days ago

        I'm not familiar with DokuWiki, but here are a few thoughts:

        • A privacy policy is good to have regardless of what you do with the rest of my comments.
        • Your site is creating a cookie "dokuwiki" for user tracking.
        • The cookie is created regardless of user agreement, rather than waiting for acceptance (implied or explicit). As in, I visit the page, I click nothing, and I already have the dokuwiki cookie.
        • I like Umami analytics as a cookieless Google Analytics alternative. They have a generous free cloud option for hobby users, and Umami is also self-hostable. Then you can get rid of any banner.

  • kr0n@piefed.social · 2 days ago

    I have a problem generating thumbnails for photos taken from summer 2023 until now (using my iPhone 12 Pro). It's like a format problem or something. I don't know ¯\_(ツ)_/¯

    • ra1d3n@lemm.ee · 2 days ago

      You might want to submit a bug report. Their pace of development is insane for OSS.

  • ReallyActuallyFrankenstein@lemmynsfw.com · 2 days ago

    Thank you for this. I plan to look at the authentication part more closely; that's the part I can't quite figure out (being an amateur at this stuff, but still trying), since I'm nervous about having just a password protecting remote and phone access.

    Authelia, NGINX, there is so much that’s confusing to me, but this might help.

    • enumerator4829@sh.itjust.works · 2 days ago

      I’d recommend setting up a VPN, like tailscale. The internet is an evil place where everyone hates you and a single tiny mistake will mess you up. Remove risk and enjoy the hobby more.

      Some people will argue that serving stuff on open ports to the public internet is fine. They are not wrong, but don't do it until you know, understand and accept the risks.

      Remember, risk is ’probability’ times ’shitshow’, and other people can, in general, only help you determine the probability.
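
      If anyone wants to go that route, the basic Tailscale setup is only a couple of commands on the server; the address below is just an example of what a tailnet IP looks like:

      curl -fsSL https://tailscale.com/install.sh | sh   # official install script
      sudo tailscale up                                  # authenticate this machine to your tailnet
      # install the Tailscale app on your phone, then point the Immich app at the
      # server's tailnet address, e.g. http://100.x.y.z:2283 (Immich's default port)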

      • gray@pawb.social · 2 days ago

        Good general advice, until you have to try to explain to your SO that the VPN is required on their smart TV to access Jellyfin.

        • AtariDump@lemmy.world · 2 days ago

          It's one thing to expose a single port that's designed to be exposed to the Internet, allowing external access to items you don't care if the entire internet sees (Jellyfin).

          It's another thing when you expose a single port to allow access to items you absolutely do care if the entire internet sees (Immich).

        • enumerator4829@sh.itjust.works · 2 days ago

          Then you expose your service on your local network as well. You can even do fancy stuff to get DNS and certs working if you want to bother. If the SO lives elsewhere, you get to deploy a raspberry to project services into their local network.

          • pirat@lemmy.world · 2 days ago

            deploy a raspberry to project services into their local network

            This piqued my interest!

            What’s a good way of doing it? What services, besides the VPN, would run on that RPi (or some other SBC or other tiny device…) to make Jellyfin accessible on the local network?

  • non_burglar@lemmy.world · 2 days ago

    I love Immich. I just wish for two things:

    • synchronised deletes between client and server
    • the edit tools on mobile to actually work on the photo at hand instead of creating a new photo with new metadata. May as well not have the tools, tbh.