  • AI coding tools can do common, simple functions reasonably well, because there are lots of examples of those to steal from real programmers on the Internet. There is a large corpus of data to train with.

    AI coding tools can’t do sophisticated, specific-case solutions very well, because there aren’t many examples of those for any given use case to steal from real programmers on the Internet. There is a small corpus of data to train with.

    AI coding tools can’t solve new problems at all, because there are no examples of those to steal from real programmers on the Internet. There is no corpus of data to train with.

    AI coding tools have already ingested all of the code available on the Internet to train with. There is no more new data to feed in. AI coding tools will not get substantially better than they are now. All of the theft that could be committed has been committed, which is why the AI development companies are attempting to feed generated training material into their models. Every review of this shows that it makes the output from generative models worse rather than better.

    Programming is not about writing code. That is what a manager thinks it is.
    Programming is about solving problems. Generative AI doesn’t think, so it cannot solve problems. All it can do is regurgitate material it has previously ingested that is hopefully close-ish to the problem you’re trying to solve at the moment - material which was written by a real thinking human who solved that problem (or a similar one) at some point in the past.

    If you patronize a generative AI system like Claude Code, you are paying into, participating in, and complicit in the largest example of labor theft in history.




  • You SHOULD NOT do software RAID with hard drives in separate external USB enclosures.

    There will be absolutely no practical benefit to this setup, and it will just create a risk of the mirrored drives getting out of sync or corrupted whenever there is any kind of problem with the USB connections, plus constant traffic overhead as the drives update their mirroring. You will kill your USB controller and/or the I/O boards in the enclosures. It will be needlessly slow and not very fault-tolerant.

    If this hardware setup is really your best option, what you should do is use one of the drives as the active primary for the server and push backups to the other drive (with a properly configured backup application, not RAID mirroring). That way each drive is fully independent of the other, and the backup drive is not dependent on anything else. This will give you the best possible redundancy with this hardware.
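
    Something like this minimal sketch captures the idea, assuming the primary drive is mounted at /mnt/primary and the backup drive at /mnt/backup (both hypothetical paths). A dedicated backup application (borg, restic, etc.) is the better choice in practice; this only illustrates keeping the two drives independent.

    ```python
    #!/usr/bin/env python3
    """Sketch: push a dated backup from the primary drive to the backup drive.

    The mount points and directory layout are made-up examples; a real backup
    tool (borg, restic, ...) would also handle retention and verification.
    """
    import subprocess
    from datetime import date
    from pathlib import Path

    SOURCE = Path("/mnt/primary/data")         # hypothetical primary data directory
    DEST_ROOT = Path("/mnt/backup/snapshots")  # hypothetical backup drive mount

    def run_backup() -> None:
        dest = DEST_ROOT / date.today().isoformat()
        dest.mkdir(parents=True, exist_ok=True)
        # rsync -a preserves permissions and timestamps; the trailing slash on
        # the source copies its contents rather than the directory itself.
        subprocess.run(["rsync", "-a", f"{SOURCE}/", str(dest)], check=True)

    if __name__ == "__main__":
        run_backup()
    ```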


  • You can just use openssl to generate x509 certificates locally. If you only need to do this for a few local connections, the simplest thing is to create the certs by hand and then place them in the certificate stores for the services that need them. You might get warnings about self-signed certificates/an unrecognized CA, but obviously you know why that’s the case.
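
    As a rough sketch of what that looks like (the hostname, file names, and validity period here are made up, and -addext needs OpenSSL 1.1.1 or newer), one openssl invocation is enough; wrapping it in Python just keeps it next to whatever provisioning script you already use:

    ```python
    #!/usr/bin/env python3
    """Sketch: generate a self-signed x509 cert/key pair with the openssl CLI."""
    import subprocess

    HOSTNAME = "myservice.home.lan"  # hypothetical internal hostname

    subprocess.run(
        [
            "openssl", "req", "-x509",
            "-newkey", "rsa:4096",
            "-keyout", f"{HOSTNAME}.key",
            "-out", f"{HOSTNAME}.crt",
            "-days", "365",
            "-nodes",                                     # no passphrase on the key
            "-subj", f"/CN={HOSTNAME}",
            "-addext", f"subjectAltName=DNS:{HOSTNAME}",  # modern clients check the SAN, not the CN
        ],
        check=True,
    )
    ```

    The resulting .crt/.key pair is what you then copy into the certificate store of each service that needs it.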

    This method becomes a problem when:

    1. You need to scale - manually transferring certs is fine maybe half a dozen times; after that it gets really tedious and you start to lose track of where they are and why.
    2. You need other people to access your encrypted services - self-signed certs won’t work for public access to an HTTPS website, because every visitor will get a warning that you’re signing your own encryption certs, and most will avoid it. You might be able to convince friends and family that your personal cert is safe, but you’ll have to have that conversation every time.
    3. You need to implement expiration - the purpose of cert expiration is to mitigate the damage if the cert private key leaks, which happens a lot with big companies that have public-facing infrastructure and bad internal security practices (looking at you, Microsoft). As an individual, it is still worthwhile to update your certs every so often (e.g. every year) if for no other reason than to remind yourself how your SSL infrastructure is connected. It’s up to you whether or not it’s worth the effort to automate the cert distribution.
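
    If you want a lightweight reminder rather than full automation, a short expiry check is usually enough. Here is a minimal sketch using the third-party cryptography package, with a made-up cert path and an arbitrary 30-day warning window (not_valid_after_utc needs cryptography 42 or newer):

    ```python
    #!/usr/bin/env python3
    """Sketch: warn when a local certificate is getting close to expiry."""
    from datetime import datetime, timedelta, timezone
    from pathlib import Path

    from cryptography import x509

    CERT_PATH = Path("/etc/ssl/local/myservice.crt")  # hypothetical location
    WARN_WINDOW = timedelta(days=30)                  # arbitrary threshold

    cert = x509.load_pem_x509_certificate(CERT_PATH.read_bytes())
    remaining = cert.not_valid_after_utc - datetime.now(timezone.utc)

    if remaining < WARN_WINDOW:
        print(f"Renew soon: {CERT_PATH} expires in {remaining.days} day(s)")
    else:
        print(f"OK: {CERT_PATH} is valid for another {remaining.days} day(s)")
    ```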

    “I’ve used Letsencrypt to get certs for the proxy, but the traffic between the proxy and the backend is plain HTTP still. Do I need to worry about securing that traffic considering it’s behind a VPN?”

    In spite of what you may have read, and in spite of the marketing of VPN services, a VPN is NOT a security tool. It is a privacy tool, and only for as long as its encryption key stays private.

    I’m not clear on what you mean by “between the proxy and the backend”. Is this referring to the VPS side, or your local network side, or both?

    Ultimately the question is, do you trust the other devices/services that might have access to the data before it enters the VPN tunnel? Are you certain that nothing else on the server might be able to read your traffic before it goes into the VPN?

    If you’re talking about a rented VPS from a public web host, the answer should be no. You have no idea what else might be running on that server, nor do you have control over the hypervisor or the host system.