that does happen to be one of the defining characteristics of Mersenne primes.
And searching for Mersenne primes happens to be the easiest known way to find extremely large prime numbers (via the Lucas–Lehmer primality test, which only works for numbers of that special form)
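For the curious, a minimal sketch of the Lucas–Lehmer test in Python (the function name is mine; this is the textbook version, while real searches like GIMPS use heavily optimized FFT arithmetic):

```python
def lucas_lehmer(p: int) -> bool:
    """Return True iff the Mersenne number 2**p - 1 is prime (for prime p)."""
    if p == 2:
        return True
    m = (1 << p) - 1          # M_p = 2^p - 1
    s = 4                     # s_0 = 4, s_{k+1} = s_k^2 - 2 (mod M_p)
    for _ in range(p - 2):
        s = (s * s - 2) % m
    return s == 0             # M_p is prime iff s_{p-2} == 0 (mod M_p)

# sanity check against the known small Mersenne prime exponents
print([p for p in (3, 5, 7, 11, 13, 17, 19, 23, 31) if lucas_lehmer(p)])
# -> [3, 5, 7, 13, 17, 19, 31]  (M_11 and M_23 are composite)
```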
Looks like the image I found cropped out the signature; it seems to be from jeremykaye.tumblr.com
Garbage collection is still allowed, and technically JIT languages are still compiled, so it really isn’t that restrictive
every single language (except Vlang of course) is memory safe if you program it perfectly.
Very, very few humans are capable of doing that, especially with C.
There’s always a niggling feeling that maybe it’s a human who sarcastically pastes a recipe
fairly sure hezbollah has more than 2800 members
to be even more pedantic: if we follow the relevant official RFCs for HTTP (formerly 2616, now 7230–7235, which made relevant changes here), a 403 can substitute for a 401, but a 401 has specific requirements:
The server generating a 401 response MUST send a WWW-Authenticate header field (Section 4.1) containing at least one challenge applicable to the target resource.
(the old 2616 said a 403 must not respond with a request for authentication, but the new versions don’t seem to mention that)
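To make the difference concrete, here’s a minimal sketch using Python’s built-in http.server (the path and realm are made up for illustration):

```python
from http.server import BaseHTTPRequestHandler, HTTPServer

class AuthDemoHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/protected":
            # 401 MUST carry a WWW-Authenticate challenge (RFC 7235, section 3.1)
            self.send_response(401)
            self.send_header("WWW-Authenticate", 'Basic realm="demo"')
            self.end_headers()
        else:
            # 403 just refuses; no challenge is required (or expected)
            self.send_response(403)
            self.end_headers()

HTTPServer(("localhost", 8080), AuthDemoHandler).serve_forever()
```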
with another OS, Nix is not going to be “in control”, so it’s probably more limited. I’m not sure how common using Nix outside of NixOS is.
also I’ll point out that many other Linux distros, I think, recommend doing a full system backup even immediately after installation. The “grep history” approach is not very stable either: e.g. apt installing a package today will default to the newest version, which didn’t exist a year ago when you last executed that same command.
with NixOS, the states of all the config files are collected into the Nix configuration, which you can modify manually (see the sketch below). And if there’s something that can’t be handled through that, I think the common solution is to isolate the “dirty” environment into a VM or some other sort of container, tooling for which I think comes with NixOS
(and there’s always going to be “data” which isn’t part of the “configuration”, though some of it can effectively act as configuration for individual applications)
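To make “collected into the Nix configuration” concrete, a minimal configuration.nix sketch (the packages and the service are just illustrative picks):

```nix
# /etc/nixos/configuration.nix (illustrative sketch)
{ config, pkgs, ... }:
{
  # packages are declared here instead of installed imperatively
  environment.systemPackages = with pkgs; [ git vim ];

  # services are enabled declaratively rather than by hand-editing /etc
  services.openssh.enable = true;
}
```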
assuming you have never used anything except apt commands to change the state of your system (and are fine with redoing superfluous changes, e.g. apt install foo && apt remove foo)
it’s replicable and “atomic”, which for a well-designed modern package manager shouldn’t be that noticeable of a difference. But when it’s applied to an operating system a la NixOS, you can (at least in theory) copy your centralized exact configuration to another computer and get an OS that behaves exactly the same and has all the same packages. You can also back up the system state with only a few dozen kilobytes of config files instead of having to back up the entire hard drive (well, assuming the online infrastructure needed to build it in the first place continues to work as expected), and probably roll back a bad change much more easily
Actually I think he has already had an adequate amount of recognition:
“In 1999, Red Hat and VA Linux, both leading developers of Linux-based software, presented Torvalds with stock options in gratitude for his creation.[29] That year both companies went public and Torvalds’s share value briefly shot up to about US$20 million”
his autobiography is in several hundred library collections worldwide
Awards he’s received:
2 honorary doctorates
2 celestial objects named after him
Lovelace Medal
IEEE Computer Pioneer Award
EFF Pioneer Award
Vollum Award
Hall of Fellows of the Computer History Museum
C&C Prize
Millennium Technology Prize
Internet Hall of Fame
IEEE Masaru Ibuka Consumer Electronics Award
Great Immigrants Award
other techbros have praised him, citing the exact list of symptoms google gives for “high-functioning psychopath”
(disclaimer: google may give bad medical advice)
for a large project, you can probably look at the history of issues; if there are lots of issues that are 5 years old, it’s almost certainly legit
All 9k stars, 10k PRs, 400 forks & professional web site are fake?
Technically, it is entirely possible to find a real existing project, make a carbon copy of the website (there are automated tools to accomplish this), then have a massive number of bots give 9K stars and make a lot of PRs, issues and forks (bonus points if these are also copies of actual existing issues/PRs), and generate a fake commit history (entirely possible with git); a bunch of releases could be quickly generated too. Though you would probably notice pretty quickly that the timestamps don’t match, since I don’t think GitHub features like issues can have fake timestamps (unlike git)
though I don’t think this has ever actually been done, there are services that claim to sell not only stars but issues, pull requests and forks too. Though, assuming the service is not just a scam in itself, any cursory look at the contents of the issues etc. would probably give away that they are AI-generated
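On the fake-timestamp point: git itself trusts whatever dates the client supplies, which is a one-liner to abuse (the sketch below runs inside any git repo; the date is obviously made up), while GitHub sets issue/PR timestamps server-side:

```python
import os
import subprocess

# git trusts client-supplied dates, so commit history can be backdated at will
fake_date = "2015-06-01T12:00:00"
env = dict(os.environ, GIT_AUTHOR_DATE=fake_date, GIT_COMMITTER_DATE=fake_date)
subprocess.run(
    ["git", "commit", "--allow-empty", "-m", "totally written in 2015"],
    env=env,
    check=True,
)
```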
looks like work on the android client started in 2011 (or at least, that’s when it seemingly started using version control)
the app was released in 2014
so it has likely inherited decisions from ~14 years ago; I’d guess there is a several-year gap where having a native desktop app was not even a concern
Also the smartphone landscape was totally different back then: Qt’s Android support was in alpha (or totally nonexistent, if the Signal project is a bit older than the GitHub repository makes it seem), and the average smartphone had extremely weak processing power and a tiny screen resolution by today’s standards. Making the same GUI function on both desktop and mobile was probably a pretty ridiculous proposition.
what’s wrong with them? are you sure it’s not just set to use 100% of all cores, with the OS then doing some shuffling?
the “will linearly speed up anything [to the amount of parallel computation available]” claim is so stupid that I think it’s more likely they meant “only has a linear slowdown compared to a basic manual parallel implementation of the same algorithm”
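For reference, Amdahl’s law is the standard reason “linear speedup for anything” can’t hold: any serial fraction caps the total speedup no matter how many cores you add (the 5% figure below is just an example):

```python
def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    """Amdahl's law: speedup = 1 / (s + (1 - s) / n)."""
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

# with just 5% inherently serial work, speedup can never exceed 20x
for n in (2, 8, 64, 1024):
    print(f"{n:>4} cores -> {amdahl_speedup(0.05, n):.2f}x")
# -> 2 cores -> 1.90x, 8 -> 5.93x, 64 -> 15.42x, 1024 -> 19.64x
```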
also the person apparently spent 2 million dollars to find the number, and the money is probably from stock compensation from Nvidia