Just don’t use Ubuntu. They do too much invisible fuckery with the system that hinders use on a server.
Would that warning also apply to Mint, since it’s based on Ubuntu, as well as other Ubuntu-based distros?
Your comment makes no sense.
The article you posted is from 2023, and PERA was basically dropped. This article, however, talks about PREVAIL, which would prevent patents from being challenged by anyone except the people the patent holder sued, and that's still relevant.
Yes, but have you seen some of the decisions the Supreme Court has come up with?
Do you only experience the 5-10 second buffering issue on mobile? If not, then you might be able to fix the issue by tuning your Nextcloud instance - upping the memory limit, disabling debug mode and dropping the log level back to warn if you ever changed it, enabling memory caching, etc…
Check out https://docs.nextcloud.com/server/latest/admin_manual/installation/server_tuning.html and https://docs.nextcloud.com/server/latest/admin_manual/installation/php_configuration.html#ini-values for docs on the above.
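For reference, here's roughly what those settings look like - the values are illustrative, and the memory cache line assumes you have the APCu PHP extension installed:

```php
<?php
// config/config.php - illustrative values; see the tuning docs linked above
$CONFIG = array (
  'debug' => false,                        // make sure debug mode is off
  'loglevel' => 2,                         // 2 = warn (0 = debug, 4 = fatal)
  'memcache.local' => '\OC\Memcache\APCu', // local memory cache; requires APCu
);
```

The memory limit itself is a PHP setting (memory_limit in php.ini); the Nextcloud docs recommend at least 512M.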
https://www.apple.com/airpods-pro/hearing-health/ says it has received FDA authorization, but doesn’t mention receiving approval from any other country’s regulatory body. It doesn’t say it’s US exclusive, though:
The Hearing Test and Hearing Aid features are expected to be available fall 2024. The Hearing Aid feature has received FDA authorization. Both features will be supported on AirPods Pro 2 with the latest firmware paired with a compatible iPhone or iPad with iOS 18 or iPadOS 18 and later and are intended for people 18 years old or older. The Hearing Aid feature will also be supported on a compatible Mac with macOS Sequoia and later. It is intended for people with perceived mild to moderate hearing loss.
The Hearing Protection feature, on the other hand, is explicitly listed as being exclusive to the US and Canada.
Do you memorize all of your passwords? If so, I take that to mean that you don’t use a password manager. Password managers - really, any app with 2FA - have this problem, too. But if you use a password manager and store your 2FA methods in it, then you only need to be able to regain access to your password manager.
If you use a cross-platform password manager with Passkey support, like Bitwarden, you can use it on any of your devices. In the event that you lose all of your devices, if you don’t have an Emergency Contact set up, you will need your password and one of the following to gain access to your account:
If you use security keys for 2FA, then you should have at least two - one that you keep with you and a backup that you keep in a safe place, like at home in a lockbox.
If you use a TOTP app to log in, or if you use security keys and want another backup, then making sure you’ll have access to the Recovery Code should be your priority. You can write it down and keep it in a few different places - at home, in your car, in your locker at work, etc… You can share it with someone you trust in person or over an encrypted channel (like Signal). You can store it on a flash drive, encrypted by a second password (which can be much easier than your primary password) or even unencrypted, if you generally keep the drive somewhere safe, disconnected from your computer. As long as you remember your password and can access your recovery code, you’ll also be able to regain access to your account, including all of your passkeys.
Emergency Access requires someone else to have access to their Bitwarden account, but assuming you don't both lose access, it's a pretty solid solution. When they request access, Bitwarden will send you an email allowing you to accept or reject their request. If you accept or don't respond within the allotted "Wait Time" (which you configure: 1 day minimum, 90 days maximum), then they'll be granted access. You also get a choice (when setting this up) to let them take over the account (resetting your master password) or to just get read-only access.
Maybe you don't like Bitwarden and want to use some other app, like 1Password, Dashlane, RoboForm, etc… Whatever your choice, familiarize yourself with how to restore access to your account in an emergency. Then you only need to worry about that and not about how to get access to your passkeys that are on your Windows laptop or only synced to your Apple devices.
But that's exactly what he recommends - using a password manager, with one-time email authentication for the first login as an extra step, right?
Nope.
Using a cross-platform password manager with synced passkeys is different from, and much more secure than, using a password manager with emailed TOTPs or sign-in links when those emails aren't end-to-end encrypted.
And password manager adoption is much higher than PGP keyserver adoption. If you can't discover someone's public key, you can't use it to encrypt a message to them, so sending end-to-end encrypted emails with TOTPs/sign-in links isn't a practical option.
According to Statista, 34% of Americans used password managers in 2023 (a huge increase from 21% in 2022), so it’s not even like the best case scenario is rare.
The author mentions it: the QR code approach for cross-device sign-in. I don't think it's cumbersome; I think it's actually a great and foolproof way to sign in. I have yet to find a website that implements it, though.
The site doesn’t need to implement this; the browser handles that part.
I confirmed this works: earlier today I logged into GitHub using Google Chrome on my work computer with a passkey stored in Bitwarden. I had to enable Bluetooth for Chrome, since I'd had it disabled, but then everything else was seamless.
You could've scrolled down to the bottom, clicked on "Links," then clicked on the repo link.
The repo has instructions to install a Snap or build from source. If you build from source, it looks like you should download an archive from the releases page rather than just pulling from master.
Open WebUI publishes a Docker image with a bundled Ollama that you can use, too: ghcr.io/open-webui/open-webui:cuda. More info at https://docs.openwebui.com/getting-started/#installing-open-webui-with-bundled-ollama-support
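As a sketch based on the GPU example in those docs (the host port and volume names are up to you, and double-check the image tag against the docs before pulling):

```sh
# Open WebUI with bundled Ollama and Nvidia GPU support - flags per the linked docs
docker run -d -p 3000:8080 --gpus=all \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --name open-webui --restart always \
  ghcr.io/open-webui/open-webui:cuda
```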
Trademarks have to be enforced or they can be lost, so it makes sense to be overbroad in enforcing them. You say you could have fought it, but that doesn't mean you were legally in the right.
In this case, everything on their site is legal and above board.
Admittedly, Nintendo doesn’t care if what you’re doing is legal if it could cut into sales of current systems, games, or merchandise - they’ll issue takedowns regardless. That’s why videos of people demoing the MIG Switch got taken down for copyright infringement, for example. But given that every system this can extract games from already has its entire library available online in the form of pirated ROMs, getting it taken down won’t do anything for their bottom line.
In fact, Nintendo taking legal action against products like this would encourage piracy of their games. If a consumer wants a backup of their physical game cartridge library and the tools to create such backups are made unavailable or harder to access due to Nintendo's actions, that consumer is likely to simply download the ROMs instead. That's already piracy, and it's only a few clicks more for the user to download ROMs for games they don't own (and if you're already legally a pirate, that line in the sand is awfully faint). And sites that host ROMs for the Game Boy Advance probably host ROMs for newer systems, too - including the ones Nintendo actually cares about - so it's in Nintendo's best interest not to push those consumers in their direction.
They don’t have support for any recent Nintendo systems (not even the DS) so they’ll probably be fine.
Why? Open source doesn’t mean “cheap” or “at cost.”
I assume this was supposed to say “more noticeable,” not “less”:
but of course for example the difference between 21 and 30 FPS is less noticeable than the one between 231 and 240 FPS
I made a typo in my original question: I was afraid of taking the services offline, not online.
Gotcha, that makes more sense.
If you try to run the reverse proxy on the same server and port that an existing service is using (e.g., port 80), then you'll run into issues. You could also run into conflicts with the ports the services themselves use, and likewise if you forward the same external port on your router to more than one thing. But IME those issues will mostly just stop the new service from starting - the existing services would have to be stopped (or your machine restarted) for the new service to have a chance to grab the ports while they're unused. Otherwise I can't think of any issues.
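If you want to check up front whether a port is already taken, on Linux you can list the listening sockets before starting the proxy:

```sh
# Show which processes are listening on which TCP ports (Linux)
sudo ss -tlnp
# Or narrow it to the ports the proxy wants, e.g. 80/443:
sudo ss -tlnp | grep -E ':(80|443) '
```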
I’m afraid that when I install a reverse proxy, it’ll take my other stuff online and causes me various headaches that I’m not really in the headspace for at the moment.
If you don't configure your other services in the reverse proxy, then you have nothing to worry about. I don't know of any proxy that auto-discovers services and routes to them by default. (Traefik does something like this with Docker services, but they need Docker labels and to be on the same Docker network as Traefik, and you're the one configuring both of those things.)
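For reference, here's a minimal sketch of what that opt-in looks like with Traefik - the whoami service and the domain are placeholders:

```yaml
# docker-compose.yml sketch - service name and domain are placeholders
services:
  traefik:
    image: traefik:v3.1
    command:
      - --providers.docker=true
      - --providers.docker.exposedbydefault=false  # don't route anything without labels
    ports:
      - "80:80"
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock:ro

  whoami:
    image: traefik/whoami
    labels:
      - traefik.enable=true  # explicit opt-in per container
      - traefik.http.routers.whoami.rule=Host(`whoami.example.com`)
```

Both containers sit on the compose file's default network, which satisfies the "same Docker network as Traefik" requirement.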
Are you running this on your local network? If so, then unless you forward a port on your router to the port your reverse proxy is serving on, it'll only be accessible from the local network. That means you can either keep it that way (and VPN in to access it) or test it by connecting directly to your server on that port and confirming it works as expected before forwarding the port.
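To test from inside the LAN before forwarding anything, you can point curl straight at the server - the IP and hostname here are placeholders:

```sh
# Pretend DNS resolves to the server, without touching the router (HTTPS proxy)
curl -v --resolve myservice.example.com:443:192.168.1.50 https://myservice.example.com/
# For a plain-HTTP proxy, setting the Host header is enough
curl -v -H "Host: myservice.example.com" http://192.168.1.50/
```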
I don’t know that a newer drive cloner will necessarily be faster. Personally, if I’d successfully used the one I already have and wasn’t concerned about it having been damaged (mainly due to heat or moisture) then I would use it instead. If it might be damaged or had given me issues, I’d get a new one.
After replacing all of the drives, there's one more thing you'll need to do to tell it to use their full capacity. From reading an answer to this post, it looks like you'll need to select "Change RAID Mode," keep RAID 1 selected, keep the same disks, and then on the next screen move the slider to use the drives' full capacities.
Giphy has a documented API that you could use. There have been bulk downloaders, but I didn't see any with recent activity. You still might be able to use one as a model for your own script, though, like https://github.com/jcpsimmons/giphy-stacks
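A minimal sketch against Giphy's documented search endpoint might look like this - the API key comes from their developer portal, and the query and output directory are placeholders:

```python
# Minimal sketch of a Giphy bulk downloader using the documented search API.
# Assumes your own API key from developers.giphy.com; "cats" and "gifs/" are placeholders.
import os
import requests

API_KEY = os.environ["GIPHY_API_KEY"]
SEARCH_URL = "https://api.giphy.com/v1/gifs/search"

def download_gifs(query: str, out_dir: str = "gifs", limit: int = 25) -> None:
    os.makedirs(out_dir, exist_ok=True)
    resp = requests.get(
        SEARCH_URL,
        params={"api_key": API_KEY, "q": query, "limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    for gif in resp.json()["data"]:
        url = gif["images"]["original"]["url"]  # full-size GIF rendition
        path = os.path.join(out_dir, f"{gif['id']}.gif")
        with open(path, "wb") as f:
            f.write(requests.get(url, timeout=60).content)

if __name__ == "__main__":
    download_gifs("cats")
```

You'd page through results with the API's offset parameter if you wanted more than one batch.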
There were downloaders for Gfycat - gallery-dl supported it at one point - but the site is down now. You might still be able to find collections that other people downloaded and are now hosting, though. You could also use the Internet Archive - they have documented tools and APIs.
There’s a Tenor mass downloader that uses the Tenor API and an API key that you provide.
Imgur hosts GIFs and is supported by gallery-dl, so that's an option.
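gallery-dl is a command-line tool, and basic usage is just the URL - the album link below is a placeholder:

```sh
pip install gallery-dl
gallery-dl "https://imgur.com/a/abc123"   # downloads every image/GIF in the album
```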
Also, read over https://github.com/simon987/awesome-datahoarding - there may be something useful for you there.
In terms of hosting, it would depend on my user base and if I want users to be able to upload GIFs, too. If it was just my close friends, then Immich would probably be fine, but if we had people I didn’t know directly using it, I’d want a more refined solution.
There's Gifable, which is focused on exactly this use case but looks like it has a pretty small following. I haven't used it myself to see how suitable it is. If you self-host it (or something else that uses S3), note that you can use MinIO or LocalStack for the S3 layer rather than using AWS directly. I'm using MinIO as part of my stack now, though for a completely different app.
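If you go the MinIO route, a single container gets you a local S3-compatible endpoint - the credentials and volume name here are placeholders:

```sh
# Single-node MinIO: S3 API on 9000, web console on 9001; credentials are placeholders
docker run -d --name minio \
  -p 9000:9000 -p 9001:9001 \
  -e MINIO_ROOT_USER=admin \
  -e MINIO_ROOT_PASSWORD=change-me-now \
  -v minio-data:/data \
  minio/minio server /data --console-address ":9001"
```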
MediaCMS is another option. Less focused on GIFs but more actively developed, and intended to be used for this sort of purpose.