Link without adblock harassment:
Mac has always felt more like mine than Windows. Nothing has changed there.
And neither holds a candle to the pure, blinding, white light that is Linux. GNOME, KDE, the world is your oyster and the desktop is your choice.
They’re the same picture.
So it DOES have a passive-aggressive mode!!!
I was going to say most of this, too. I’m a big adherent of BDD, which works well with agile. It clarifies what everyone is working on without getting bogged down in unnecessary minutiae or “documentation for paperwork’s sake”… it lives and evolves with the project, and at the end becomes both the testing criteria and the measurement of success.
I’ve been using Sunshine for Linux with Moonlight on my AVP and that works great. The native Moonlight port for AVP is still very much a buggy, crashy WIP, but the iPad version is a decent enough standby.
Honestly, using virtual Mac Display on AVP is so, so, so good, that I want that functionality everywhere… from any and all of my devices. Sunshine + Moonlight is currently the most promising path forward, IMO.
…SpaceOS, which is built on top of Google’s ChromiumOS…
I’m out.
Linux is… right there. It’s right there.
Holy cow. Canadian unions are my newest heroes! ❤️
Hey Teamsters et al, let’s get this done in the US now, yeah?
I’m surprised it scores that well.
Well, ok… that seems about right for languages like JavaScript or Python, but try it on languages with a reputation for being widely used to write terrible code, like Java or PHP (hence having been trained on terrible code), and it’s actively detrimental to even experienced developers.
A project not containing anything owned by Nintendo has never stopped Nintendo from destroying that project (and people’s lives) in the past.
Bad-faith misdirection question is bad-faith misdirection.
E for effort, though.
To grossly oversimplify things, there are two kinds of vegans…
Type 1 are “healthy living” and “sustainability” vegans. These types are generally benign, polite, helpful, positive, and keep to themselves unless asked. They also tend not to be super militant about their veganism… like the occasional egg from someone’s beloved home-raised chickens is fine.
Type 2 are ideological vegans. These types believe that “exploiting” “living creatures” in any way is fundamentally immoral, and because it’s a morality issue (i.e. basically religion) the vast majority are very preachy, demanding, and in-your-face about it. They don’t consider type 1 to be “real vegans”.
Type 2, being the loudest and most abrasive, gives veganism a bad name and ruins it for everyone.
extremists on both sides
Right Extremists: Murder our political opponents! Kill the queers and the coloreds! Install our dictator! Anyone that isn’t a white christian nationalist doesn’t deserve to live!
Left Extremists: Everyone deserves access to health care, housing, and basic necessities! Limit the hoarding of wealth and power! Protect democracy! End mass murder and genocide!
“bOTh sIdeS Are THe sAME!”
Clearly, the algorithm thinks you need Jesus.
If you use Kagi you can block the entire site and let better results float up.
Ouch. Don’t get me started on iPad “Pro”. There is nothing “Pro” about a device that is so heavily sandboxed and restricted to only installing things from a single, heavily curated app store. 😫
Such great hardware and UX, ruined because their app distribution monopoly is more profitable than moving devices.
Yeah. You train it to recognize an object/image, and then you can respond to that however you want, programmatically. It’s really cool for things like tabletop games.
For the how: Apple has APIs for including lightweight ML models for that sort of thing. You have to train a model, but for something like a card game it’s super easy. For physical objects it requires more prep just because you have to take photos instead of using existing artwork… but it’s still relatively straightforward.
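Roughly, the recognition side is just Vision running a Core ML classifier over a frame. This is only a minimal sketch, not my actual project code… “CardClassifier” stands in for whatever class Xcode generates from a model you’d train in Create ML on images of each card face:

```swift
import UIKit
import Vision
import CoreML

// Minimal sketch: classify a photographed card with a Core ML image classifier.
// "CardClassifier" is a hypothetical class generated from a Create ML model
// trained on scans of each card face.
func identifyCard(in image: UIImage, completion: @escaping (String?) -> Void) {
    guard
        let cgImage = image.cgImage,
        let coreMLModel = try? CardClassifier(configuration: MLModelConfiguration()).model,
        let visionModel = try? VNCoreMLModel(for: coreMLModel)
    else {
        completion(nil)
        return
    }

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        // Report the top label only if it clears a confidence threshold.
        let best = (request.results as? [VNClassificationObservation])?.first
        completion((best?.confidence ?? 0) > 0.8 ? best?.identifier : nil)
    }

    // Vision requests run synchronously, so keep them off the main thread.
    DispatchQueue.global(qos: .userInitiated).async {
        let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
        try? handler.perform([request])
    }
}
```

Once it recognizes a card, you can respond however you want… overlay the animated version, trigger game logic, whatever.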
I have a card game (a physical one, not virtual) and I want to “replace” the real cards with digital, animated, “living” ones. Ideally, I could apply this technique to other types of tabletop game, later… but cards are the current project.
This works fine on iOS and even in the Vision Pro simulator, but on hardware, the image recognition is slow and unreliable, and it doesn’t track items through space in real-time. It’s laggy and “floaty”. Image recognition for unique, flat cards should be one of the simplest possible use-cases… and given how much more powerful the hardware is than a phone, and the fact that it works on the simulator, it sure seems like a software issue… but you can’t ship Apps with such severe problems, either.
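For context, the iOS path that works fine is only a handful of lines of ARKit image tracking. A rough sketch, assuming a hypothetical “CardImages” AR resource group in the asset catalog (not my exact project code):

```swift
import ARKit

// Minimal sketch of the iOS path: track known card artwork as ARKit
// reference images. "CardImages" is a hypothetical asset-catalog
// AR resource group containing a scan of each card face.
final class CardTracker: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "CardImages", bundle: .main) else { return }

        let configuration = ARImageTrackingConfiguration()
        configuration.trackingImages = referenceImages
        configuration.maximumNumberOfTrackedImages = 4  // e.g. a small hand of cards

        session.delegate = self
        session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for case let imageAnchor as ARImageAnchor in anchors {
            // imageAnchor.transform gives the card's pose in space; this is
            // where the animated, "living" replacement would get attached.
            print("Recognized card:", imageAnchor.referenceImage.name ?? "unknown")
        }
    }
}
```

That’s the level of “simple” I mean… flat, unique, high-contrast artwork, already the best case for image tracking.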
As an “infinity monitor” you can use anywhere, it is really great.
But it is not - as it was advertised - a “spatial computer”. I can’t even think of it as an AR device, because it is terrible at image recognition and tracking… something even an iPhone can do. I have no idea why that is the case, because the hardware is, theoretically, ridiculously powerful… but something is seriously, seriously wrong with the software right now… and it cripples the headset for what is supposed to be its one major use-case: spatial computing.
P.S. In case it wasn’t obvious, I bought one to develop for, and as a developer I’m pretty angry at how poorly it performs at basic AR tasks.
“At least…”
I feel like the 15% number is very, very low.