Reminds me of the group limit attribute in Nextcloud. You could try looking at the 'Custom profile scope' section of https://docs.goauthentik.io/integrations/services/nextcloud/ to see if it helps you work out what to do.
After some research on here and Reddit about 6 months ago, I settled on Borgbase and it's been pretty good. I also manually save occasionally to Proton Drive, but you're right to give up on that as a solution!
The hardest part was choosing the backup method and properly setting up Borg or restic on my machine, especially with docker and databases. I have ended up adding DB backup images to each container stack with an important DB, saving dumps to a specific folder. Then that and all the files are backed up by restic to an attached external drive as well as Borgbase. This happens at a specific time in the morning; I found a restic action to stop all docker containers first, back them up, then spin them back up. I can find the guides that I used if it's helpful to you.
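As a rough illustration, the stop / back up / restart sequence can be wrapped in a small script like this. All the paths, the stack location and the repository setup here are placeholder assumptions, not my actual config:

```shell
#!/bin/sh
# Sketch of a nightly backup wrapper: stop the compose stack so files and
# DB dumps are consistent, run restic, then bring everything back up.
# Paths and the repository location are placeholders.
set -eu

backup_stack() {
    stack_dir="$1"                            # e.g. /opt/stacks/nextcloud
    ( cd "$stack_dir" && docker compose stop )
    # restic reads RESTIC_REPOSITORY / RESTIC_PASSWORD_FILE from the
    # environment; a Borgbase repo over SSH works the same way
    restic backup "$stack_dir"
    ( cd "$stack_dir" && docker compose start )
}

# Typically scheduled from cron in the early morning, e.g.:
#   30 3 * * * /usr/local/bin/backup-stacks.sh
```

The downside is the services are briefly down during the backup, which is why running it in the small hours matters.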
I also checked my backups a few times and found a few small problems I had to fix. I got the message from other users several times that your backups are useless unless you regularly test them.
I'm using a free Mailgun account for all my self-hosted services. Happy with it so far.
Proton only has SMTP on its business accounts, and you have to apply for it with a good reason (i.e. not for spam).
Is that true though? I use an always-on VPN (one of the 3 mentioned) and have never had issues. I have more problems with Firefox/Fennec and ad-blocking add-ons. I am in Europe, so is this more a US issue? I get blocked sometimes because of GDPR.
Hopefully this works and you can see the compose file. I've put a few things in [square brackets] to hide some stuff, probably overly cautiously. I have an external network linked to NPM, and in that I use nextcloud-server for the IP address and 80 for the port (it's the inside container port, not 8080 on the system - that took me a long time to figure out!). Add a .env file with everything referenced in the compose file, then (hopefully!) away you go.
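For anyone who can't see the file, a minimal sketch of the shape described above (the image tag, network name and volume paths here are illustrative assumptions, not the real file):

```yaml
# Minimal sketch, not the actual file: the service joins an external
# network that NPM is also on, and NPM proxies to "nextcloud-server"
# on the container's internal port 80 (no host port mapping needed).
services:
  nextcloud-server:
    image: nextcloud:28-apache        # pin a version rather than :latest
    container_name: nextcloud-server
    restart: unless-stopped
    env_file: .env                    # secrets referenced from here
    volumes:
      - ./data:/var/www/html
    networks:
      - proxy

networks:
  proxy:
    external: true                    # created beforehand, shared with NPM
```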
Not sure if it makes a difference, and not quite your question, but I've just switched away from nextcloud-aio to my own docker compose file, so I have better control and know more about what's going on. I'd always been curious, and when installing on a new VPS I decided to try. It was surprisingly straightforward and I've been able to install everything I need.
Let me know if my docker compose would help. I still need to add the backup solution but it’s going to be straightforward as well.
My experience has taught me not to 'apt autoremove' unless I'm really sure what the packages are!
Take it one piece of software at a time. See that it's running fine, then move on to another. You'll often realise something down the line would be helpful, so you'll go back to make changes.
Keep a running list of software and the ports used.
With docker, do not automatically use :latest on important software (nginx proxy manager, SSO software, password database, anything you use regularly, etc). I did that and was burned a few times.
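Pinning just means writing an explicit tag in the compose file instead of :latest. The version number here is only an example, check the image's tags for the current one:

```yaml
services:
  npm:
    # pinned tag: upgrades only happen when you change this line
    image: jc21/nginx-proxy-manager:2.11.1
    # versus the risky form that can silently pull breaking changes
    # on the next "docker compose pull":
    # image: jc21/nginx-proxy-manager:latest
```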
Also, at some point you'll either mess up, or realise it would just be easier to start again with a fresh OS install. Keep copies of data (docker compose files and persistent storage) of working software before starting a new service, before installing anything directly onto the OS, or before major updates.
brewery@lemmy.world to Selfhosted@lemmy.world • How do y'all backup docker databases with backup programs like Borg/Restic? (English)
4 · 2 years ago

I just started using some docker containers I found on Docker Hub designed for DB backups (e.g. prodrigestivill/postgres-backup-local) to automatically dump from the databases into a set folder, which is included in the restic backup. I know you could come up with scripts, but this way I could easily copy the compose code to other containers with different databases (and different passwords etc).
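The sidecar approach looks roughly like this in a compose file. Service names, credentials and schedule are placeholders, and the environment variable names are as I remember them from the image's README, so check it for the current ones:

```yaml
# Sketch of adding the backup sidecar to an existing stack.
services:
  db:
    image: postgres:16
    environment:
      POSTGRES_DB: app
      POSTGRES_USER: app
      POSTGRES_PASSWORD: ${DB_PASSWORD}

  db-backup:
    image: prodrigestivill/postgres-backup-local
    restart: unless-stopped
    depends_on:
      - db
    environment:
      POSTGRES_HOST: db
      POSTGRES_DB: app
      POSTGRES_USER: app
      POSTGRES_PASSWORD: ${DB_PASSWORD}
      SCHEDULE: "@daily"            # dump before the restic run
      BACKUP_KEEP_DAYS: 7
    volumes:
      - ./db-dumps:/backups         # this folder is what restic picks up
```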
I would recommend it as it is fairly easy to understand, and most FOSS services give you an example to use. You can also convert docker run examples to compose (search for "composerize"), although it doesn't always work.
I found compose files easier when learning, to digest what is going on (ports, networks, depends_on etc), and you can compare with other services to see what is missing (container name, restart policy etc). I can then easily back up the compose files, env files and data directories to be able to very quickly get a service up again (DBs are trickier, but I found a docker image I can stick in the compose files which backs up DB dumps regularly).
brewery@lemmy.world to Selfhosted@lemmy.world • Authelia + Bitwarden + other selfhosted stuff (English)
3 · 2 years ago

I use authentik but believe it's similar. You can create accounts for people and give them passwords, or send a welcome email asking them to register to create one. I would warn you though, not every service has the ability to use it, and it does take quite some effort to get it working! It's interesting to learn about though.
I tried the Readarr and other options. They work sometimes, but not enough to rely on them. As others mention, there's no standard naming, and also lots of people use their library card for Libby access. I also think there's a bit more of a direct link to authors, so I'd prefer to buy the book unless they're super well off anyway. To be honest, I can't see the *arrs working with LibGen; having looked at the open issues on integrating it, it just doesn't allow for scraping in the same way.
For me, I self-host openbooks (it uses IRC) and select a download straight away, which, to be fair, takes about the same time as searching for and finding a TV show if you are only after one book. I have it exposed behind SSO so I can access it on my phone and download a book straight away when someone gives me a recommendation. Most of the time I just add to a running note on my phone and go through it every few months when I need more books.
It's fairly quick for multiple books, but not Sonarr levels of ease. The downloads go into a Calibre-monitored folder, which then does the automation (naming, conversion if needed etc). I bulk email the new books to my Kindle with one click. Calibre-web is read-only, for a nice browsing experience and to read on other devices if I need to (although there's no page sync). It's a bit of manual work, but I find it's not too bad, and in 10 minutes I can load up enough books for months.
Occasionally IRC does not have the book, so I try manually searching on Prowlarr and downloading on SABnzbd or Transmission. The downloads are almost instant, so I then just wait and copy them to my downloads folder (I could probably automate this step too with tags, but it's so infrequent).
brewery@lemmy.world to Selfhosted@lemmy.world • NUC, Proxmox and HA (a noob seeking for help) Update (4/8/24) (English)
2 · 2 years ago

I have a dynamic IP and there are several ways around it. I use Cloudflared (it updates DNS records regularly) and a script I found to update DuckDNS as a backup. Both are very simple.
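The DuckDNS updater is a one-liner around their update URL. This is a sketch, not the script I actually use; DOMAIN and TOKEN are placeholders you would fill from your DuckDNS account:

```shell
#!/bin/sh
# Hypothetical DuckDNS updater, typically run from cron every few minutes.
DOMAIN="myhome"                                  # your-subdomain.duckdns.org
TOKEN="00000000-0000-0000-0000-000000000000"     # from the DuckDNS dashboard

build_update_url() {
    # DuckDNS infers your public IP when ip= is left empty
    printf 'https://www.duckdns.org/update?domains=%s&token=%s&ip=' "$1" "$2"
}

update_duckdns() {
    # DuckDNS replies with the literal text OK or KO
    curl -fsS "$(build_update_url "$DOMAIN" "$TOKEN")"
}
```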
Accessing the services is not the problem; the problem is keeping them safe. I've tried lots of different ways (although not Tailscale yet) and have a few services exposed directly to the internet behind authentik / NPM / Cloudflare / fail2ban / ufw. Others I access through my router's OpenVPN server, with keys for my laptop and phone as clients. There are so many guides online for all VPN types. It's just finding the right balance between ease of use and safety.
brewery@lemmy.world (OP) to Selfhosted@lemmy.world • Appreciation / shock at workplace IT systems (English)
4 · 2 years ago

Lots of little things really. Obviously I couldn't say for certain, but they seemed to be on top of it without causing us too much difficulty in doing our jobs.
Sometimes things were blocked outright (like emails from new addresses), or questioned afterwards to check they were expected and followed policy. Policies were clear, and there were helpful prompts and warnings.
We were involved in something where we had to copy a sh*t load of files from a shared folder to a hard disk. There were like three automatic blocks that kicked in at different times, which was a pain to figure out at first, but because we had a good reason, someone in IT just kept at it to get it done. Looking back, that should have raised flags given the size of it all.
They changed from forcing password changes every 6 months to no forced changes, but passwords had to be longer and 2FA became mandatory. We were told to use KeePass for all passwords for things that weren't SSO, for various reasons.
brewery@lemmy.world to Selfhosted@lemmy.world • What advice can you give to a beginner? (English)
46 · 2 years ago

Don't provide services to others, including your own family (actually, especially your own family), until you are quite comfortable with what is going on and what might be causing issues. Focus on helping yourself, or keep whatever other services you were using before, just in case.
Trying to fix something at night, with a fuming partner who's already put up with a difficult-to-use service because of your want for privacy (even though they don't care), whilst saying "it should work, I don't know what's wrong", is not a great place to be 😁.
Overall though, I found it so interesting that I am doing a part-time degree in computer science in my 30s, purely to learn more (whilst being forced to do it to timelines and having paid for it).
I have a very comfortable, 'forget about it' setup my family are now using. Every now and then I add new services for myself, and if one works out, I'll give others access to use it; otherwise I keep it just for me, or just delete it and move on.
brewery@lemmy.world to Selfhosted@lemmy.world • Should I use authentik with or without nginx proxy manager? (English)
6 · 2 years ago

They serve two different purposes. You can have one, both or neither. Sorry if you already know all this, but I thought it might be good to explain in detail.
NPM is a proxy provider, so it passes subdomains to the right service (e.g. service1.url.com passes to service 1 at IP x.x.x.x on port 5050). This allows you to open only one port, to NPM, but access other services through subdomains. I have NPM in front of my external apps so I can access each through a subdomain (e.g. service1.url.com). You could also use it for accessing internally, if you set up your internal DNS to pass e.g. service1.internal to the IP address and port of your service, and set NPM to only allow access from internal IPs.
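Under the hood, each NPM proxy host boils down to an nginx server block NPM generates for you. Roughly, for the example above (a sketch of the idea, not NPM's exact generated config):

```nginx
# What a proxy host entry amounts to: requests for the subdomain are
# forwarded to the backing service's internal address and port.
server {
    listen 443 ssl;
    server_name service1.url.com;

    location / {
        proxy_pass http://x.x.x.x:5050;       # service 1
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}
```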
Authentik provides single sign on so instead of having different usernames and passwords for every user on every service, you have one set of users and it manages the passwords.
At a high level, there are two ways of using it.
Some services have proper SSO integration, so you set up Authentik to replace their own login system. For instance, with Nextcloud you go to the Nextcloud homepage, but it then goes out to Authentik to do the login process; once passed, Authentik tells Nextcloud "user B has successfully logged in, I vouch for them, and here are their details". You can do this for internal and external access. With Nextcloud you need to log in either through its own login system or via SSO, so even if I go directly to the internal IP and port (and therefore don't need NPM to access it), I still need Authentik to log in so it knows it's me and not my partner trying to access her account.
Some services don't have SSO integration, or have no login at all. For instance, I have Stirling PDF, which doesn't need user details or a login. However, you don't want to just allow anyone in, so I have set up NPM to pass everything through Authentik first. If I go to stirlingpdf.url.com, it sends me to Authentik to log in; you can only ever get to the Stirling app if you successfully log in. You can also set Authentik so that only certain users or groups of users can access certain apps, but that's more than I need.
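The snippet you paste into NPM's Advanced tab for this forward-auth setup has roughly this shape. The Authentik address is a placeholder, and the exact directives change between versions, so copy the current snippet from the authentik docs rather than this sketch:

```nginx
# Sketch of nginx forward-auth to an authentik outpost (NPM Advanced tab).
location / {
    # every request must pass authentik's auth endpoint first
    auth_request /outpost.goauthentik.io/auth/nginx;
    error_page 401 = @goauthentik_proxy_signin;
}

location /outpost.goauthentik.io {
    # placeholder address: your authentik server or embedded outpost
    proxy_pass http://192.168.1.1:9000/outpost.goauthentik.io;
    proxy_set_header Host $host;
}

location @goauthentik_proxy_signin {
    internal;
    # unauthenticated users get bounced to the authentik login flow
    return 302 /outpost.goauthentik.io/start?rd=$request_uri;
}
```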
It does take some effort to get SSO working correctly for each service and it’s only really worth it if you do have multiple users or services that need logins.
You don’t want just NPM unless you trust the service to have a secure login.
Others will probably say, you shouldn’t have anything facing externally. You can setup Tailscale or Wireguard tunnels so you always appear to be on the local network. That way, you don’t need NPM to be open externally. However you might still want it so you can type the address service1.internal instead of 192.168.1.1:8063 each time. You probably also want Authentik to make the login shared.
In terms of network access to get them working, NPM needs to be able to reach Authentik internally on your network. You could put them on the same shared Docker network, or in my case, they are both on the same server so share an internal IP. I have opened the individual ports on Docker so they can access each other internally, just like I can access both from my laptop. If I'm accessing away from home, I have my domain pointing at my home external IP address, and port 443 open on my router pointing to my home server with NPM. NPM then "talks" to Authentik through the home network, so I log in through that, but I don't have to open the Authentik port externally.
In my case, in the NPM settings, instead of using the docker-created network address for Authentik (like 172.3.1.1 or something that might change), I use the internal IP of the machine (like 192.168.1.1:4443, where 4443 is the Authentik port). I also have an NPM entry, auth.url.com, that points to Authentik, which some apps need instead of the internal address. It took some playing around to get it right, but once you do, it's essentially copy and paste for new services.

Purely anecdotal, but I accidentally fell into this at university while studying for final exams: 3 hours of sleep at night, 3 hours in the early afternoon. It was great and I never felt better, so I tried to keep it after exams. Once I got back to real life, though, such a rigid and inflexible system was impossible to maintain. I didn't do it long enough to see any long-term effects, so I naturally reverted back to 7-8 hours overnight.