

Whether the teacher is a personal AI or a single human doesn’t change that. Schools provide way more than just “a teacher”.
That’s like exactly what he says, you just restated his take…
The laser didn’t generate 2 quadrillion watts like a power plant would generate electricity, but it delivered that much power in an extremely short pulse, like 20 quadrillionths of a second.
That means the energy it delivered was relatively small (a few hundred joules), but because it was delivered in such a tiny time window, the power (which is energy per unit time) was immense.
The laser did produce power, in the form of intense light and heat, over a very small time period. It converted a relatively modest amount of stored energy into a very brief pulse with a peak power of 2 quadrillion watts.
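To make the arithmetic concrete, here is a minimal sketch of the power = energy / time relationship; the pulse energies and durations below are round illustrative values, not the facility’s exact specs.

```python
# Power is energy per unit time: even a modest amount of energy delivered in a
# vanishingly short window amounts to an enormous instantaneous power.
# The numbers below are round illustrative values, not the facility's exact specs.
pulses = [
    (40.0, 1.0),      # 40 J over 1 second        -> 40 W (a dim light bulb)
    (40.0, 1e-3),     # 40 J over 1 millisecond   -> 40 kW
    (40.0, 20e-15),   # 40 J over 20 femtoseconds -> 2e15 W, i.e. 2 quadrillion watts
]

for energy_j, duration_s in pulses:
    power_w = energy_j / duration_s
    print(f"{energy_j:>5.0f} J over {duration_s:.0e} s -> {power_w:.1e} W")
```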
The original WSL doesn’t use the Linux kernel at all; it’s a compatibility layer that translates Linux system calls to the Windows kernel. WSL2 actually virtualizes a complete Linux kernel inside a lightweight VM, but the name stuck.
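As a rough illustration, here is a small sketch that guesses which flavor of WSL it is running under by inspecting the kernel release string; the “Microsoft” / “microsoft-standard-WSL2” patterns are the usual conventions, but treat them as an assumption rather than a guaranteed interface.

```python
import platform

def wsl_flavor() -> str:
    """Best-effort guess at the WSL flavor from the kernel release string."""
    release = platform.uname().release
    # WSL2 kernels typically end in "microsoft-standard-WSL2"; check this first,
    # since the WSL1 pattern ("Microsoft"/"microsoft") would also match it.
    if "microsoft-standard-WSL2" in release:
        return "WSL2: a real Linux kernel running in a lightweight VM"
    if "microsoft" in release.lower():
        return "WSL1: syscall translation layer, no actual Linux kernel"
    return "not WSL (or an unrecognized release string)"

if __name__ == "__main__":
    print(wsl_flavor())
```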
Are you suggesting an alternative motive for Microsoft that goes beyond profit?
This is already a solved problem; we’re well past single-model systems, and any competitive AI offering can augment its answers with information from the internet.
You assemble the same soulless food every day and you actually feel fulfilled by assembling croutons differently every day?
Hey, I can’t imagine the process not becoming muscle memory, with my brain somewhere else completely, but you sprinkle salt off your elbow if that gives you joy.
The first paragraph is a fantasy.
In this restaurant, where the chef was replaced by a salad machine, the “chef” was a human salad machine before. There was no time to play with garnish and plating; they weren’t serving Michelin-star food. The term “chef” is used very liberally here; you aren’t a chef if the only thing you do at a restaurant is assemble salads that a machine can make to the same standard.
They were assembling salads, it wasn’t a dream job.
No, they said they “ruled out” privacy for “obvious reasons”.
Obviously mockable statement.
Anyone who understands how these models are trained and the “safeguards” (manual filters) put in place by the entities training them, or anyone who has tried to discuss politics with an LLM chatbot, knows that its honesty is not irrelevant; these models are very clearly designed to be dishonest about certain topics until you jailbreak them.
Yes, running your own local open-source model that isn’t given to the world with the primary intention of advancing capitalism makes honesty irrelevant. Most people are telling their life stories to ChatGPT and trusting it blindly to replace Google and what they understand to be “research”.
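For anyone wondering what “running your own local open-source model” looks like in practice, here is a minimal sketch using the Hugging Face transformers pipeline; the model id is just an example, and it assumes the `accelerate` package, a sizeable first-run download, and ideally a GPU.

```python
# A minimal sketch of running an open-weight model locally with the Hugging Face
# `transformers` pipeline. The model id is only an example; pick any openly
# licensed model that fits your hardware.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",  # example open-weight model id
    device_map="auto",  # place the model on a GPU if one is available
)

prompt = "In two sentences, why do people run language models locally?"
output = generator(prompt, max_new_tokens=120, do_sample=False)
print(output[0]["generated_text"])
```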
I was hoping to find an answer to the original question in this dialog.