

We’ve been without a lot of things for millennia
Not sure if Google Lens counts as AI, but Circle to Search is a cool feature.
Not to the point where it’s worth having a button for it permanently taking up space at the bottom of the screen.
On a lot of phones you can hide the navigation pill, but Samsung started forcibly showing it when they added Circle to Search. Fortunately I don’t have a Samsung phone.
They’re also generally lower quality
A big blocker that the article surprisingly doesn’t talk about is the huge amount of IoT equipment that uses 2G and 3G: alarm systems, emergency phones, street light control, cars, etc. Here in Sweden there was recently a report that thousands of elevators have emergency phones using 2G or 3G, and if the network is shut down you would no longer be allowed to use those elevators. And since 2018 all new cars in the EU have to have eCall, which alerts emergency services in a crash. Many of these use 2G and 3G, and if it stops working the car won’t pass inspection, so you’ll no longer be allowed to drive it.
It’s not even that; there are multiple languages spoken in the same region. Webpages should just use the language the browser tells them to use.
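The browser communicates this via the Accept-Language request header. A minimal sketch of server-side negotiation (the helper name and supported-language list are my own, purely illustrative):

```python
def negotiate_language(accept_language: str, supported: list[str]) -> str:
    """Parse an Accept-Language header like 'sv,en;q=0.8' and return the
    highest-quality language the site supports, falling back to the first
    supported language."""
    prefs = []
    for part in accept_language.split(","):
        part = part.strip()
        if not part:
            continue
        if ";q=" in part:
            lang, q = part.split(";q=", 1)
            try:
                quality = float(q)
            except ValueError:
                quality = 0.0
        else:
            lang, quality = part, 1.0
        prefs.append((quality, lang.strip().lower()))
    # Walk preferences from most- to least-preferred, matching on the
    # full tag first ("sv-SE") and then the base language ("sv").
    for _, lang in sorted(prefs, reverse=True):
        base = lang.split("-")[0]
        for s in supported:
            if s.lower() == lang or s.lower().split("-")[0] == base:
                return s
    return supported[0]

print(negotiate_language("sv,en;q=0.8", ["en", "sv"]))  # -> sv
```

A Swedish-speaking visitor in a bilingual region gets Swedish, an English-preferring one gets English, all from the same URL and with no geo-IP guessing.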
A robot doesn’t need to be anthropomorphic; an assembly line robot is still a robot. It does, however, need to be able to perform some actions autonomously, for which a vibrator hardly qualifies.
Still doesn’t allow background playback though, so it’s useless to me.
I don’t think there has ever been a dedicated PPU on the GPU. PhysX did originally run on PPU cards by Ageia, but AFAIK PhysX on GPUs used CUDA GPGPU right from the start.
Mirror’s Edge actually had a place with tons of broken glass falling down, where the framerate would drop into the single digits if it used CPU PhysX. I remember that because it shipped with an outdated PhysX library that would run on the CPU even though I had an Nvidia GPU, so I had to delete the game’s PhysX library to force it to use the version from the graphics driver to get playable performance. If you didn’t have an Nvidia GPU you would need to disable PhysX for that segment to be playable.
Though that’s not where you would use HDMI. I would argue that for TVs, 4K is generally enough, and HDMI 2.1 already has enough bandwidth for 4K 120 Hz 12-bit-per-color uncompressed.
But DisplayPort, yeah, that could use a bit more.
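The 4K bandwidth claim above can be sanity-checked with some quick arithmetic (the numbers here are my own back-of-the-envelope figures, not from the comment):

```python
# Active pixel data rate for 4K 120 Hz at 12 bits per color, compared
# against HDMI 2.1 FRL: 48 Gbit/s raw, ~42.67 Gbit/s effective payload
# after 16b/18b encoding.

def pixel_data_rate_gbps(width: int, height: int,
                         refresh_hz: int, bits_per_color: int) -> float:
    bits_per_pixel = 3 * bits_per_color  # uncompressed RGB
    return width * height * refresh_hz * bits_per_pixel / 1e9

rate = pixel_data_rate_gbps(3840, 2160, 120, 12)
print(f"{rate:.1f} Gbit/s")  # -> 35.8 Gbit/s of active pixel data
# Blanking intervals add some overhead on top of this, but the total
# still fits within HDMI 2.1's ~42.67 Gbit/s effective rate.
```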
Where’s the part where he suffers?
It’s 401 Unauthorized or 403 Forbidden, not 403 Unauthorized.
Harvesting IP addresses shouldn’t be a problem, since the firewall shouldn’t allow packets from a peer you haven’t talked to first. But true, if you can be attacked in response by a server you’re connecting to, that would be bad.
This would presumably mainly be an issue for computers open to the internet. So not so much for home PCs, unless the router’s firewall is opened up.
How would that bypass the firewall?
This TV Streamer costs significantly more than a CCwGTV combined with an adapter.
Apparently so it does, and it says “HDMI FreeSync” rather than “HDMI [2.1] VRR”. FreeSync over HDMI is a completely different protocol and is supposed to work under Linux. Found a thread here; can you try cat /sys/kernel/debug/dri/0/HDMI-A-1/vrr_range and edid-decode < /sys/class/drm/card0-HDMI-A-1/edid ? Though there is no solution there.
I thought that there was VRR support over HDMI even for versions below 2.1 spec.
Yes, there is FreeSync over HDMI, which is supposed to be supported on Linux, and which is unrelated to HDMI 2.1 VRR. I don’t see anything about the monitor supporting that though (LG 24GS60F, based on your previous post), nor anything about HDMI 2.1 VRR; it probably only supports VRR via DisplayPort Adaptive-Sync.
Until services stop supporting it.
This article really needs some illustration of what the new UI looks like, with the old one shown for comparison.