

No, it has not „always been fine” - I’ve worked with people who disabled auto updates on their dev machines just to keep a specific kernel & driver version working together. Circa 2016 :)


Some countries have pretty strict laws against promoting Nazi ideology - by pushing that notification, it would probably be breaking those laws.


Stolen? It was forked, which is exactly what the MIT license allows. GPL doesn’t have a „you cannot fork” rule either - you can do exactly the same thing. The author seems to believe there is a „you have to push your changes upstream” requirement, which is not in either of those licenses.


I’ve looked at the sources of this paper, and it looks like the grams of CO2 per email are pulled out of thin air - it’s just a random value taken from „Survey on Carbon Dioxide Emissions Through Email Conversion”, SAMRIDDHI, Volume 15, Issue 1, 2023 (Print ISSN 2229-7111, Online ISSN 2454-5767).


Counter-counter point: people don’t get a Mac or Windows laptop to learn about macOS or Windows. They generally want to run software, or at least a browser, to do what they need to do.


Uh, memory metrics on Linux are a pain. About the only tool that reports most of the cached memory as available is htop; free, top and a lot of other software (like node_exporter) will report a lot of that cached memory as not available.
To OP: don’t worry, Linux is smart enough to give cached memory back when memory pressure rises.
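For the curious, here’s a minimal sketch (assuming a Linux box whose kernel exposes MemAvailable in /proc/meminfo, which any reasonably recent kernel does) showing how the raw „free” number differs from the kernel’s own „available” estimate once reclaimable cache is counted:

```python
"""Minimal sketch: MemFree vs MemAvailable, straight from /proc/meminfo."""

def read_meminfo():
    # Lines look like "MemAvailable:   12345678 kB"; keep the numeric part (in kB).
    info = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, value = line.split(":", 1)
            info[key] = int(value.strip().split()[0])
    return info

if __name__ == "__main__":
    m = read_meminfo()
    free_kb = m["MemFree"]
    cached_kb = m.get("Cached", 0)
    # MemAvailable is the kernel's own estimate of how much memory can be
    # handed out without swapping, i.e. it already counts reclaimable cache.
    available_kb = m.get("MemAvailable", free_kb)
    print(f"MemFree:      {free_kb / 1024:.0f} MiB")
    print(f"Cached:       {cached_kb / 1024:.0f} MiB")
    print(f"MemAvailable: {available_kb / 1024:.0f} MiB")
```

On a long-running box the gap between MemFree and MemAvailable is usually exactly the cache being talked about above.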


Microsoft is Big Tech, and GitHub is owned by Microsoft.


…and then you learn that packageX v1 is not maintained anymore and, through a deep chain of dependencies, relies on a seriously vulnerable package (in a version that is also no longer maintained).
Sorry, I had a pretty eventful December :)
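On the dependency point above: a minimal sketch, stdlib only, that walks whatever is installed locally so you can at least see how deep a package’s dependency chain goes. The root name is a placeholder (the „packageX” from the comment is hypothetical), and it only tells you what is installed - not whether any of it is still maintained or vulnerable:

```python
"""Minimal sketch: print the transitive dependency tree of an installed package."""
import re
from importlib import metadata

def dep_names(dist_name):
    """Distribution names that dist_name declares as requirements."""
    try:
        reqs = metadata.requires(dist_name) or []
    except metadata.PackageNotFoundError:
        return []
    # A requirement looks like "idna<4,>=2.5; python_version >= '3'";
    # keep just the leading distribution name.
    return [re.match(r"[A-Za-z0-9._-]+", r).group(0) for r in reqs]

def walk(dist_name, depth=0, seen=None):
    seen = set() if seen is None else seen
    if dist_name.lower() in seen:
        return
    seen.add(dist_name.lower())
    try:
        version = metadata.version(dist_name)
    except metadata.PackageNotFoundError:
        version = "not installed"
    print("  " * depth + f"{dist_name} ({version})")
    for child in dep_names(dist_name):
        walk(child, depth + 1, seen)

if __name__ == "__main__":
    walk("requests")  # placeholder root; put your own top-level package here
```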


My take is that it’s already a feature of your systems, rather than the admins’ responsibility. If you treat departments like customers, you’ll find a good way to spread the costs. If something is just „common infrastructure”, you will always end up with something that generates costs with no easy way to track who triggered them - because you don’t pass enough information along with it.


Not sure what’s hard about it - you need consistent tagging, and that by itself gives you a lot of mileage in Cost Explorer.
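For what it’s worth, here’s a minimal sketch of what consistent tagging buys you outside the console as well, assuming boto3 credentials are configured and that a tag key called „team” (hypothetical - use whatever key your convention defines) has been activated as a cost allocation tag in the billing console:

```python
"""Minimal sketch: monthly spend grouped by a cost-allocation tag via Cost Explorer."""
import boto3

ce = boto3.client("ce")

resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-01-01", "End": "2024-02-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "TAG", "Key": "team"}],  # "team" is an assumed tag key
)

for period in resp["ResultsByTime"]:
    for group in period["Groups"]:
        tag_value = group["Keys"][0]  # e.g. "team$payments"; an empty value means untagged
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        print(f"{tag_value}: ${amount:.2f}")
```

Whatever lands under the empty tag value is typically exactly the „common infrastructure” spend that nobody passed attribution for.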


This is not an unusual comment section on Phoronix, to put it mildly.
Walk without a rhythm, and you won’t attract the worm!


I read his book a few years ago, and he was pretty bullish on risky investments, so…


First Diablo 4 and now this… Horrible company.


Lots of hardware lies about its useful capabilities.
Can you run 4K? Of course. But can you run it at more than 4 frames per second?


Both are valid (once you add seconds) in both RFC 3339 and ISO 8601, and time-zone support is the same in either one…
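To illustrate the „both are valid” point, a minimal sketch assuming Python 3.11+ (where datetime.fromisoformat accepts both the „Z” suffix and a numeric offset); the timestamps are made-up examples:

```python
"""Minimal sketch: parsing RFC 3339-style timestamps with the stdlib (Python 3.11+)."""
from datetime import datetime

examples = [
    "2024-01-15T10:30:00Z",       # "Z" suffix for UTC
    "2024-01-15T10:30:00+00:00",  # same instant, numeric offset
    "2024-01-15T11:30:00+01:00",  # same instant, one hour east of UTC
]

for raw in examples:
    dt = datetime.fromisoformat(raw)
    print(raw, "->", dt.isoformat(), "| UTC offset:", dt.utcoffset())
```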


Yeah, and the same thing would happen if e.g. PII or HIPAA-covered data ended up in a trained model. The fact that some PII or health data ended up being publicly available doesn’t automatically mean you can process, store, or train on that data.


If you do stuff, earn from it, and ignore other parties and their rights, you can be forced to compensate them. I guess it will be peanuts, though.


When I first booted a Windows Server with the tiles interface, I was tempted to yeet it out of the window [sic!].