As a workaround, you could run OBS and use its virtual camera, so Discord streams what it thinks is a camera, and then set up whatever you want to share from your desktop inside OBS.
How old is “older?”
I run the latest Debian on a 10 year old Macbook Pro. Linux has given this laptop a second life as a lab machine - it’s still plenty fast enough and it has a really nice screen (Retina) which Debian gets right out of the box with no tweaking. The only thing I needed to do when installing Debian was to manually fetch the firmware for the WiFi hardware during the install (although Debian ships non-free firmware by default these days, it isn’t permitted to distribute all of it, and the WiFi chipset in this machine unfortunately happens to be one of the exceptions).
You can use dd on another machine to make a bitwise copy of the card before you first use it.
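If it helps, a minimal sketch of that with dd, assuming the card shows up as /dev/sdX on the other machine (that device name is just a placeholder - double-check it before running anything):

    # make a bit-for-bit image of the card (replace /dev/sdX with the real device)
    sudo dd if=/dev/sdX of=card-backup.img bs=4M status=progress
    # later, the image can be written back the same way
    sudo dd if=card-backup.img of=/dev/sdX bs=4M status=progress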
Debian (a very conservative distro) switched to Wayland by default in Debian 10, if I’m not mistaken (we’re now on 12).
I didn’t notice the change until I tried to run a niche program that really needs X11. Unless you’re doing that kind of thing, you can probably just use Wayland. At least in Debian it’s really easy to switch between Wayland and X11 by selecting the session type when you log in.
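If you’re ever unsure which one you ended up with after logging in, one quick check (assuming a systemd/logind session, which is the Debian default) is:

    echo $XDG_SESSION_TYPE   # prints "wayland" or "x11"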
https://www.youtube.com/watch?v=iYWzMvlj2RQ
“I’m also very happy to point out that nVidia has been the worst […] so nVidia, “fuck you!””
But it does help give an idea of who’s making the most reliable drives (both SSD and hard disk). No, this isn’t a guarantee, but it’s still useful information, especially when it’s not just a friend-of-a-friend anecdote but data gathered across tens of thousands of drives.
The biggest factor in making good, automatic backups for my home server wasn’t speed (it’s an older machine with a SAS array of spinning discs) but the availability of affordable cloud-based backup storage (I use Backblaze and sync my files to a storage bucket once a day). Then it becomes automatic, no one has to remember to do it, and it’s offsite.
Even when external USB discs got cheap, you still had to remember to do it regularly, and many people would forget.
Hard drives are not that unreliable, well, so long as you pick the right model.
Backblaze’s statistics are here: https://www.backblaze.com/blog/backblaze-drive-stats-for-q2-2023/ - they operate tens of thousands of inexpensive drives for their cloud backup service. Some HDDs are much better than others.
That document also links to their SSD statistics (they don’t have that many SSDs yet, so the stats aren’t as good) but while SSDs tend to have lower failure rates, there are some models of SSD that have higher failure rates than HDDs. For example, one Seagate SSD they use has an AFR (annualised failure rate) of just under 2%, but one Toshiba HDD they use has an AFR of only 0.31%. (Another thing to be aware of is that Backblaze’s drives will all be in air conditioned data centres, not in the random temperature/humidity spreads of a PC in someone’s home).
If you look at the stats as a whole, SSDs generally have about half the failure rate of HDDs across the board, but it varies a lot by make and model. So be careful which you pick, and take backups :-) For my money, all my PCs (desktop and laptops) are pure SSD setups. My server still uses spinning disks, mainly because it’s older server-class hardware with a SAS array.
What do you mean by “advanced Linux distro”?
If you mean starting from a minimal base and only installing what you need, then you may as well start off with a minimal Debian netinst, then add the stuff you want once you’ve got the minimal system installed.
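As a rough sketch of what that can look like once the netinst finishes (the package names here are only examples - pick whatever you actually want):

    # deselect all tasks (including the desktop) during the install, then:
    sudo apt update
    sudo apt install --no-install-recommends sway foot firefox-esr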
I use Backblaze B2 buckets too, with a cron job to sync stuff once a day (using it for backups). It’s not expensive and it just seems to keep on working. I also like the disc reliability reports they send out.
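For anyone curious what that looks like, a rough sketch - the sync tool isn’t named above, so this assumes rclone with a B2 remote already configured, and the path and bucket name are made up:

    # crontab entry: sync /srv/data to a B2 bucket at 03:00 every day
    0 3 * * * rclone sync /srv/data b2:my-backup-bucket --log-file /var/log/b2-backup.log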
Honestly, I go back and watch this video whenever I need a lift.
Sometimes the issue with WiFi chipsets isn’t the distro but the manufacturer. Debian, for instance, now includes non-free firmware on its installation ISO image, but some manufacturers (e.g. Broadcom) don’t allow their firmware to be redistributed, so Debian can’t legally include it. And unfortunately the manufacturers don’t make it easy to just download the firmware yourself and put it on a USB stick where the installer can find it. (Literally the only issue with putting Debian on my old 2013 Macbook Pro was the Broadcom firmware - but fortunately, having a Debian desktop to hand, I could install the firmware downloader there to get the two files the installer needed.)
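The general approach is something like the following, run on another Debian machine. (firmware-brcm80211 is only an example package - the right one depends on the exact chipset, and some Broadcom parts need a different route entirely - but the Debian installer does look for firmware packages in a /firmware directory on removable media.)

    # fetch the firmware package and copy it onto the installer USB stick
    apt download firmware-brcm80211
    mkdir -p /media/usb/firmware
    cp firmware-brcm80211_*.deb /media/usb/firmware/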
This is not a fault of the Linux distro, but a fault of the hardware manufacturer. Unfortunately, like the smell of piss in a subway, we all have to deal with Broadcom.
People are imperfect. People have left laptops full of personal and/or commercially sensitive data on trains or planes, had them stolen from cars and houses, etc. Full disc encryption is a defence against data breaches, especially for computers that are not bolted down. Or it might be as simple as a person not wanting the embarrassment of their porn stash being found.
An older friend of mine told me years back about an incident that happened on a university VAX running Unix. In those days, everyone was using vt100 terminals, and the disk drives weren’t all that quick. He was working on his own terminal when, without warning, he got this error when trying to run a common command (e.g. ls):

    $ ls -l
    sh: ls: command not found
So he went on over to the system admin’s office, where he found the sysadmin and his assistant, staring at their terminal in frozen horror. Their screen had something like:
    # rm -rf / tmp/*.log
    ^C^C^C^C^C^C^C^C^C^C
    # ls -l
    sh: ls: command not found
    # stat /bin/ls
    sh: stat: command not found
A few seconds after hitting return, with the rm command not finishing immediately, he realised there was an errant space, and madly hammered Ctrl-C to try to stop it. It turned out that the disk was slow enough that not everything was lost, and by careful use of the commands that hadn’t been deleted, they managed to copy the executables off another server without having to reinstall the OS.