Yay!
Does NZ count as Australia too? Or are we stuck with parallel importers, or picking one up on holiday?
Imo, the term “buy” for any good should have to pass some sort of litmus test. E.g.:
does the product being sold have the same properties as a brick?
- can the product be resold privately?
- can the product be lent to another user temporarily?
- would the product still perform its function when the manufacturer stops supporting it?
- would the product still perform its function if the manufacturer ceased to exist?
If the product does not pass all these tests, the customer is not buying. Consider using terms such as ‘rent’, ‘lease’, or ‘subscription’ instead.
It’s America, so the answer is probably “No”.
Do you not have consumer protection laws?
We’ve had digital price tags for decades, but you couldn’t do this in NZ. Stores are obligated to sell you a product at the price they advertise it for AND to have a reasonable quantity of units at that price… you couldn’t sell 1 TV for $1.
So these systems would need to track what price you saw it at.
(Caveat: Our stores are still cunts and have been found to overcharge people)
They are also IR controlled. A lot of them have a little window on the front of the unit, and an array of transmitters in the ceiling.
Is “The Algorithm” just “we stuffed all our GPT responses into a Lucene index and look for 80% matches”?
Because that’s what I’d do.
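Something like this, maybe — a minimal sketch of the joke, with difflib standing in for the Lucene index, and the cache, the call_gpt stub, and the 0.8 threshold all made up for illustration:

```python
# Hypothetical sketch: cache prior responses, and if a new prompt is ~80%
# similar to one we've already answered, serve the cached answer instead
# of calling the model again. difflib stands in for a real Lucene index.
import difflib

cache = {}  # prompt -> previous response

def call_gpt(prompt):
    raise NotImplementedError("stand-in for the real model call")

def answer(prompt, threshold=0.8):
    for old_prompt, old_response in cache.items():
        if difflib.SequenceMatcher(None, prompt, old_prompt).ratio() >= threshold:
            return old_response  # "80% match" -> reuse the old answer
    response = call_gpt(prompt)
    cache[prompt] = response
    return response
```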
I used to love ‘the cloud’. Rather, a specific slice of it.
I worked almost exclusively on AppEngine; it was simple. You uploaded a zip of your code to AppEngine and it ran it at near infinite scale. They gave you a queue, a database, a volatile cache, and some other gizmos. It was so simple you’d struggle to fuck it up, really.
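To give a sense of how little there was to it, an app back then was roughly one file like this plus an app.yaml — a from-memory sketch, so the module paths and every name here are best-effort recollection, not checked against current docs:

```python
# Rough sketch of an early-era AppEngine app: the queue, the database,
# and the volatile cache were all just imports away. Names are made up.
from google.appengine.api import memcache, taskqueue   # volatile cache + queue
from google.appengine.ext import db, webapp            # datastore + web framework
from google.appengine.ext.webapp.util import run_wsgi_app

class Greeting(db.Model):                 # the "database": define a model, call put()
    message = db.StringProperty()
    created = db.DateTimeProperty(auto_now_add=True)

class MainPage(webapp.RequestHandler):
    def get(self):
        msg = memcache.get('latest')      # check the volatile cache first
        if msg is None:
            latest = Greeting.all().order('-created').get()
            msg = latest.message if latest else 'hello'
            memcache.set('latest', msg, time=60)
        taskqueue.add(url='/log', params={'seen': msg})  # fire-and-forget work
        self.response.out.write(msg)

application = webapp.WSGIApplication([('/', MainPage)])

def main():
    run_wsgi_app(application)
```

Upload that, and the platform handled the rest.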
It was easy, it was simple, and it worked for my clients who had 10 DAU, and my clients who had 5 million DAU. Costs scaled nearly linearly, and for my hobby projects that had 0 DAU, the costs scaled down to match.
Then something happened and it slowly became complicated. The rest of the GCP cloud crept in, and after spending a term with a client who didn’t use “the cloud”, I came back to it and had to relearn nearly everything.
Pretty much every company I’ve worked for could have run on early AppEngine. Nobody needed anything more, and I’m confident the only reason they had more is that tech is like water: you need to put it in a bucket or it goes everywhere.
Give me my AppEngine back. It allowed me to focus on my (or my clients’) problems, not the ones that come with the platform.
They were first to market with a decent GPGPU toolkit (CUDA), which built them a pretty sizeable userbase.
Then when competitors caught up, they made it as hard as possible to transition away from their ecosystem.
Like Apple, but worse.
I guess they learned from their gaming heyday that not controlling the abstraction layer (e.g. OpenGL, DirectX) means they can’t do lock-in.
While suing everyone else that makes shovel handles that work with your shovel heads.
Problem then is, you still gotta buy a truck to buy and haul your 2nd motorcycle, your 3rd motorcycle, your dirt bike, and your track bike.
Alternate headline:
Companies accept money for a thing that will happen anyway, and that they’d be unable to prove either way if they said no.
GenAI is unfortunately here, and the technocracy wants you to want it so they can farm you for more and more intimate data to leverage and enforce their technocracy. And the only way they’re going to do that is by keeping the press positive and feeding it more and more data in the hope it fixes things.
I was expecting some sort of “AI discovers new bug in 30-year-old software”… cool, I’m excited.
Then they were talking about how the bug was persistent, and I’m more intrigued: “is the bug some weird emergent behaviour corrupting state somewhere?”
Nope, just another example of a shit-in, shit-out data model.
I’m old, I have other shit to do, and I don’t have the time. If I’m writing code, I’m doing it because there is a problem that needs a solution. Either solving someone else’s ‘problems’ for $$$, or an actual problem at home.
If it’s a short-term problem like “reorganising some folders”, I’m not going to (re)learn another language. I’m going to smash it out in 30 mins with whatever will get the job done the quickest, then get back to doing something more important.
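The kind of 30-minute throwaway I mean — the folder layout and paths here are entirely made up, just to illustrate:

```python
# Hypothetical one-off: shove every file in a folder into a subfolder
# named after its extension, then never think about it again.
import shutil
from pathlib import Path

root = Path.home() / "Downloads"   # made-up target folder

for f in root.iterdir():
    if f.is_file():
        dest = root / (f.suffix.lstrip(".").lower() or "misc")
        dest.mkdir(exist_ok=True)
        shutil.move(str(f), str(dest / f.name))
```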
If it’s an ongoing problem, I’m going to solve it in the most sustainable way possible. I might fix the problem now, but 100% someone’s going to drop support or change an API in 2 years’ time and it’ll break. Sure, doing it in Chicken would be fun, but the odds are I won’t remember half the shit I learned 2 years later. It’ll be unmaintainable. A forever grind of learning, fixing, forgetting.
So without a commercial driver to actively invest in Lisps, there’s no point. It’s not profitable and it doesn’t solve any problems other tools can’t. Without the freedom youth brings, I don’t have the time to do it “for fun”.
I love Lisp. Well, Scheme, and less so Clojure. I don’t know why. Is it the macros? Is it the simplicity? Or is it just nostalgia from learning it during a particular time in my life?
But I just can’t find a place for it in my life.
It’s not job material; effectively nobody uses it. It doesn’t solve basic problems with the ease Python does.
And because of this, anything I do in it is nothing more than a toy. As soon as I put it down, I have no hope of picking it up or maintaining it 6, 12, or 24 months later.
A toy I spend 2 weeks on in absolute joy, but as soon as life gets in the way, it’s dead.
Move to NZ. It’s nearly all C# here.
My Brother printer behaves weirdly with Linux (Fedora 39 Silverblue).
When doing multiple copies of double-sided printing, it’ll print [1|2] [1|1] [2|2] [1|1] [2|2] and then repeat, until you realise you now have one copy of what you want, 10 pages of one side, and 10 pages of the other side.
It’ll also randomly refuse jobs, and then print them 30 minutes later (lmao if you printed multiple copies, gave up and went for a walk)
The Panasonic I replaced it with was better, but you had to download binary blobs to make it work.
But Linux has gotten more and more complicated in the last 20 years, and I really can’t be fucked working out if it’s the printer, CUPS, Flatpaks, the app that’s printing, or all of the above.
Now I just email myself a PDF and print from my phone. Fucking stupid but it works.
The host was stable. And I was compiling the kernel for hardware and VFIO reasons anyway, so why not compile everything? It’s not like there was a lot to compile.
Tips: don’t
Performance was ok. Lots of fiddling required on both host and guest to get performance close to native.
I used to have Gentoo running a libvirt hypervisor, which would then run multiple short-lived, isolated Windows and Linux machines with GPU passthrough for all the different companies and projects I was working on.
Spent far too much time keeping the guest machine images up to date, and all the configs and stuff managed and synchronised.
Then the laptop I was using to manage everything died, so I gave up.
Hips that make a Pixar mom jealous.
It’s obviously enough of a thing to warrant Google cracking down on it in both Chrome and YouTube.
If it’s such a small problem, why spend the effort?