There is a huge difference, though: one is making hardware, and the other is copying books into your training pipeline. The copy occurs during dataset preparation.
Privacy-preserving federated learning is a thing: essentially, you train a local model on-device and send only the weight updates back to Google rather than the data itself. But it's early days, so who knows what vulnerabilities may exist.
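A minimal sketch of the idea described above (federated averaging): each simulated client takes a gradient step on its private data and sends only the weight *delta* to the server, which averages the deltas. The function names (`local_step`, `fed_avg`) and the toy linear-regression setup are illustrative assumptions, not any particular library's API.

```python
import numpy as np

def local_step(weights, X, y, lr=0.1):
    """One gradient step of linear regression on a client's private data.
    Only the weight delta leaves the device, never X or y."""
    grad = 2 * X.T @ (X @ weights - y) / len(y)
    return -lr * grad  # the update, not the data

def fed_avg(weights, deltas):
    """Server aggregates client deltas without ever seeing raw data."""
    return weights + np.mean(deltas, axis=0)

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])  # the function the clients jointly learn
w = np.zeros(2)
for _round in range(200):
    deltas = []
    for _client in range(3):
        X = rng.normal(size=(32, 2))  # stays private to this client
        y = X @ true_w
        deltas.append(local_step(w, X, y))
    w = fed_avg(w, deltas)
# w now approximates true_w, learned without pooling any client data
```

Real deployments add secure aggregation and differential-privacy noise on top of this, since raw weight updates can still leak information about the training data.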
You could try Dexed, a Yamaha DX7 clone: https://github.com/asb2m10/dexed/releases
They were invented *by* 9000 BC :)