

Unfortunately, I’m not the right kind of software engineer to answer in more detail than that.
I think for something like this, you’d rent cloud servers, as you’d expect the number of concurrent users to change over time, and ideally you’d spin up more capacity when you need it without having to keep those machines available all the time. You still need some kind of system that decides when to order more capacity with enough warning that it’s actually ready (you can tell AWS you want a VM immediately, but it still takes a couple of minutes to transfer your data onto it and boot it up, which is longer than people want to sit in a loading screen), and that decides which servers to assign to which users.
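As a rough illustration of the kind of decision loop that system would run (everything here is hypothetical: the numbers, the forecast function, and the server counts are all made up for the sketch):

```python
# Hypothetical capacity controller; all names and numbers are illustrative.
# The point is that you order servers BEFORE demand arrives, because a new
# VM takes minutes to boot and load data, not seconds.

BOOT_LAG_MINUTES = 5      # time from "order VM" to "VM can take players"
PLAYERS_PER_SERVER = 100  # capacity of one game server
HEADROOM = 1.2            # over-provision so a forecast miss doesn't hurt

def servers_needed(players: int) -> int:
    # Ceiling division: a partial server's worth of players still needs one.
    return -(-players // PLAYERS_PER_SERVER)

def extra_servers_to_order(current_servers: int, predict_players) -> int:
    # Plan for the demand expected once a freshly ordered VM would be ready.
    predicted = predict_players(BOOT_LAG_MINUTES)
    target = servers_needed(int(predicted * HEADROOM))
    return max(0, target - current_servers)

# e.g. 40 servers up, and the forecast says 5000 players in five minutes:
print(extra_servers_to_order(40, lambda minutes_ahead: 5000))  # -> 20
```

The forecasting function is the genuinely hard part in practice (time of day, launch spikes, marketing events), which is why launches are where this goes wrong.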
There’s a strong argument that the server architecture needed to be better at launch, but then the game sold more than an order of magnitude better than it was expected to, so no one would have noticed that it scaled badly had the player count been in line with their design and testing.
As someone else said, installing things outside of Program Files is generally only necessary if they were made for XP or older, and the developers didn’t test on Vista or newer, or read the bit of the Windows documentation (there since the early nineties) that said not to write to an application’s installation directory because it might not work on future versions. Regular Oblivion works fine in Program Files (although it makes it more of a pain to mod) and the Remaster was obviously made post-Vista.
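For what it’s worth, the fix those old games needed is tiny. A minimal sketch in Python (the directory and file names are made up for illustration):

```python
# Save user data under the user's profile, not next to the executable.
import os
from pathlib import Path

# Pre-Vista habit: write into the install directory, e.g. inside Program
# Files. Under Vista and later this either fails without elevation or gets
# silently redirected into the per-user VirtualStore.
# bad = Path(__file__).parent / "saves" / "slot1.sav"

# The documented approach: per-user application data, always writable.
save_dir = Path(os.environ["APPDATA"]) / "MyGame" / "saves"
save_dir.mkdir(parents=True, exist_ok=True)
(save_dir / "slot1.sav").write_bytes(b"save data goes here")
```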
All that said, none of this is relevant here, because you’ve got the Windows App version, which uses a completely different system and runs in a partial sandbox, so it doesn’t interact with the rest of the computer like a traditional program would.
It’s not guaranteed that it’s interpreted as a platitude by the person it’s directed at, and when the mismatch between the task and the work done is big enough to make it obviously a platitude, it’s just patronising, and risks being more insulting than not saying it at all.
The feedback in the article was obviously far from perfect, but from the sound of it, “good attempt” could be an actively harmful thing to say. Lots of effort had gone into making the wrong thing and making it fragile, which isn’t good at all; it’s bad. If you’d asked an employee to make a waterproof diving watch and they came back with a mechanical clock made from sugar, then even though it’s impressive that they managed to make a clock from sugar, it’s completely inappropriate, as it’d stop working the instant it got wet. You wouldn’t want to encourage that kind of thing happening again by calling it good, and it’s incompatible enough with the brief that acknowledging it as an attempt to fit the brief is giving too much credit: someone who can do that kind of sugar work must know it’s sensitive to moisture.
The manager can apologise for not checking in before so much time had been spent on something unsuitable, and for failing to communicate the priorities properly. They can acknowledge the effort, and its potential merit in another situation, without implying it was good to sink time into something unfit for purpose rather than double-checking that something so complicated was genuinely necessary.
I meant that the blade count and detachability were the difference in definition between a turboprop and a propfan/open turbofan, not that they were necessarily what makes the engine more efficient.
I’ve seen turboprops in museums and on the internet with around six or eight blades. When I looked at the Wikipedia page for propfan engines, which seems to be another name for an open turbofan, the distinction seemed to be mainly how the blades are shaped (like propeller blades or turbine blades) and how tightly integrated everything is (you can swap the propeller out on a turboprop).
How many blades do you have to add to a turboprop before it’s promoted to an open turbofan and touted as a major new innovation?
Arch is at least more likely to update to a fixed version sooner, and someone getting something with pacman is going to be used to the idea of it breaking because of using bleeding edge dependencies. The difference with the Flatpak is that most users believe that they’re getting something straight from the developers, so they’re not going to report problems to the right people if Fedora puts a different source of Flatpaks in the lists and overrides working packages with ones so broken as to be useless.
People fall off rooftops fitting solar panels, burn to death repairing wind turbines that they can’t climb down fast enough to escape, and dams burst and wash away towns. Renewable energy is much less killy than fossil fuels, but per megawatt hour, it’s comparable to nuclear, despite a few large incidents killing quite a lot of people each. At the moment, over their history, hydro is four times deadlier than nuclear, wind’s a little worse than nuclear, and solar’s a little better. Fission power is actually really safe.
The article’s talking about fusion power, though. Fission reactions are dangerous because if you’ve got enough fuel to get a reaction at all, you’ve got enough fuel to get a bigger reaction than you want, so you have to control it carefully to avoid making it too hot, which would cause the steam in the reactor to burst out and carry chunks of partially-used fuel with it, and those are very deadly. That problem doesn’t exist with fusion: it’s so hard to make the reaction happen in the first place that any problem just makes the reaction stop immediately. If you somehow blew a hole in the side of the reactor, you’d just get some very hot hydrogen and very hot helium, which would be harmless within a few minutes once they’d cooled down. Once it’s working, fusion power can’t help but be the safest way to generate energy in history, because it inherently avoids the big problems with what is already one of the safest ways.
That’s misleading in the other direction, though, as PhysX is really two things, a regular boring CPU-side physics library (just like Havok, Jolt and Bullet), and the GPU-accelerated physics library which only does a few things, but does them faster. Most things that use PhysX just use the CPU-side part and won’t notice or care if the GPU changes. A few things use the GPU-accelerated part, but the overwhelming majority of those use it for optional extra features that only work on Nvidia cards, and instead of running the same effects on the CPU if there’s no Nvidia card available, they just skip them, so it’s not the end of the world to leave them disabled on the 5000-series.
You can jam the Windows UI by spawning loads of processes with equivalent or higher priority to explorer.exe, which runs the desktop, as they’ll compete for CPU time. The same will happen if you do the equivalent under Linux. However, if you have one process that does lots of small allocations, then under Windows, once the memory and page file are exhausted, an allocation will eventually fail, and if the application’s not set up to handle that, it’ll die and you’ll have free memory again. Doing the same under every desktop Linux distro I’ve tried (which have mostly been Ubuntu-based, so others may handle it better) will just freeze the whole machine. I don’t know the details, but I’d guess the process gets suspended until its request can be fulfilled, so as long as there’s memory, it gets it eventually, but it never gets told to stop or murdered, so there’s no memory left for things like the desktop environment to use.
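If you want to see the difference for yourself, here’s a minimal repro in Python (it deliberately exhausts memory, so only try it on a machine you don’t mind freezing):

```python
# Allocate-and-hold loop: lots of small allocations that are never freed.
chunks = []
try:
    while True:
        chunks.append(bytearray(1024 * 1024))  # hold on to 1 MiB per loop
except MemoryError:
    # Windows behaviour: once RAM and the page file are exhausted, an
    # allocation fails outright. A program that doesn't catch this just
    # dies, and the memory is freed.
    print(f"allocation failed after roughly {len(chunks)} MiB")
# On the Linux desktops described above, you typically never get here:
# the machine grinds into swap and freezes long before the process is
# refused memory, until the kernel's OOM killer eventually steps in.
```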
Nintendo used to have a page about emulation on their website incorrectly claiming that it was always illegal and that all emulators had been created solely to enable piracy. This new claim is hard to square with their ever having hosted that page.
Putting "false" in a YAML file gives you a string, and just false on its own gives you a boolean, unless you tell the YAML library that it’s a string. Part of the point of YAML is that you don’t have to specify lots of stuff that’s redundant except when it would otherwise be ambiguous, and people misinterpret that as never having to specify anything ever.
Most of the problems can be totally avoided by telling the YAML loader what type you’re expecting instead of forcing it to guess (e.g. provide a schema or use typed getter functions). If it has to guess, it’s no surprise that some things don’t survive the string to inferred type to desired type journey, and this is something that isn’t seen as a dealbreaker in other contexts, e.g. the multitude of languages where the string "false" evaluates to true when converted to a boolean because it’s non-empty.
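A quick illustration with Python’s PyYAML, which implements YAML 1.1’s inference rules (the keys here are made up):

```python
import yaml  # pip install pyyaml

doc = """
quoted: "false"   # quotes force a string
bare: false       # bare token is inferred as a boolean
country: no       # YAML 1.1 reads this as a boolean too (the "Norway problem")
version: 1.10     # inferred as a float, so the trailing zero is lost
"""
data = yaml.safe_load(doc)
print(type(data["quoted"]), data["quoted"])    # <class 'str'> false
print(type(data["bare"]), data["bare"])        # <class 'bool'> False
print(type(data["country"]), data["country"])  # <class 'bool'> False
print(type(data["version"]), data["version"])  # <class 'float'> 1.1
```

Quoting the values, or converting explicitly where they’re read, avoids all of these.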
The new law allows you to have more than one charging connector, provided that either the USB-C one is the best one, or the USB-C one is as good as the spec allows. If the new connector’s genuinely better, it’ll beat a maxed-out USB-C connector, so devices will be able to provide both.
Male-to-female A-to-A cables are pretty common (they’re just basic extensions) and totally legal under the spec, provided they’re limited to a certain length or contain a powered repeater. It’s just the rare male-to-male (which my keyboard stupidly uses) and the even rarer female-to-female that aren’t legal. There’s also the exception of USB On-The-Go cables with a micro-B end and a female A end, for devices like smartphones that are capable of being a host or connecting to one, back before they switched to USB-C.
It adds the executable permission (without which things can’t be executed) to all the files in the game’s directory. You only need to be able to execute a few of those files, and there’s a dedicated permission controlling what can and can’t be executed for a reason. Windows doesn’t have a direct equivalent, so setting it on everything gives the impression that they’re trying to make Linux behave like Windows rather than working with the OS.
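For comparison, a sketch of the two approaches in Python (the game directory and file names are invented for the example):

```python
import os
import stat

def add_exec(path: str) -> None:
    """Add the execute bit for user/group/other on one file."""
    mode = os.stat(path).st_mode
    os.chmod(path, mode | stat.S_IXUSR | stat.S_IXGRP | stat.S_IXOTH)

# What the launcher effectively does: mark EVERYTHING executable.
# for root, _, files in os.walk("game_dir"):
#     for name in files:
#         add_exec(os.path.join(root, name))

# What it should do: mark only the things that actually run.
for binary in ("game_dir/game.x86_64", "game_dir/crash_handler"):
    add_exec(binary)
```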
Wikipedia management shouldn’t be under that pressure. There’s no profit motive to enshittify or replace human contributions. They’re funded by donations from users, so their top priority should be giving users what they want, not attracting bubble-chasing venture capital.