• 0 Posts
  • 104 Comments
Joined 2 years ago
Cake day: July 7th, 2023

  • I’ve been thinking about a dreaming-like algorithm for neural networks (NNs) that I’ve wanted to try.

    When training an NN, you have a large set of inputs and corresponding desired outputs. You make random subsets of this, and for each subset you adjust the NN so its outputs correspond more closely to the desired ones. You do this over and over, and eventually your NN is close to the desired outputs (hopefully). This training takes a long time and is only done this one initial time. (This is a very simplified picture of the training.)
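    The training loop described above can be sketched with a toy linear "network" — a minimal sketch, assuming a mean-squared-error objective; the dataset, dimensions, and learning rate are all made up for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Made-up toy data: 200 inputs with corresponding desired outputs,
# generated from a hidden linear map.
X = rng.normal(size=(200, 4))
W_true = rng.normal(size=(4, 2))
Y = X @ W_true

# A single linear layer stands in for the NN; train it by repeatedly
# drawing random subsets and adjusting the weights toward the outputs.
W = np.zeros((4, 2))
lr = 0.05
for step in range(500):
    idx = rng.choice(len(X), size=32, replace=False)  # random subset
    xb, yb = X[idx], Y[idx]
    grad = xb.T @ (xb @ W - yb) / len(xb)  # gradient of mean squared error
    W -= lr * grad                         # nudge toward desired outputs

mse = float(np.mean((X @ W - Y) ** 2))
print(mse)  # should end up close to zero on this toy problem
```

    A real NN replaces the single linear layer with a deep nonlinear model, but the subset-and-adjust loop is the same shape.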

    Now for the dreaming. When the NN is “awake” it accumulates new input/output entries. We want to adjust the NN to also incorporate these entries. But if we train only on these, we will lose some of the information the NN learned in the initial training. We might want to train on the original data plus the new data, but that is a lot, so no. Let’s assume we no longer even have the original data. We want to train on what the NN already knows plus what we have accumulated during the waking time. Here comes the dreaming:

    1. Generate an “orthogonal” set of input/output pairs representing what the NN already knows (e.g. if the network outputs vectors, take some random input and save its output vector; then use a global optimization algorithm to find an input whose output is orthogonal to the first. Repeat until you have a spanning set).
    2. Repeat point 1 until you have maybe one such set per newly accumulated input/output entry, or however many you can add without moving too far from the optimization extremum your NN sits in – this set should still be much smaller than the original training set.
    3. Fine-tune the NN on the accumulated data plus this generated data. The generated data acts as an anchor, not allowing the NN to deviate too much from its optimization extremum, while the new data also gets incorporated.
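    Point 1 might look something like the following sketch, with plain random search standing in for the global optimization algorithm and a fixed random linear map standing in for the already-trained NN (all dimensions and the number of search tries are made up):

```python
import numpy as np

rng = np.random.default_rng(2)

# A fixed random linear map stands in for the trained NN
# (4-d inputs, 3-d output vectors).
W = rng.normal(size=(4, 3))

def net(x):
    return x @ W

def orthogonal_probe_set(net, in_dim, out_dim, tries=3000):
    """Collect inputs whose *outputs* are mutually near-orthogonal.

    Random search stands in for the global optimizer: each round keeps
    the candidate input whose output has the smallest worst-case
    |cosine| against the outputs found so far.
    """
    x0 = rng.normal(size=in_dim)
    xs, ys = [x0], [net(x0)]
    while len(ys) < out_dim:  # stop once the outputs span the output space
        best_x, best_score = None, np.inf
        for _ in range(tries):
            cand = rng.normal(size=in_dim)
            y = net(cand)
            score = max(abs(y @ v) / (np.linalg.norm(y) * np.linalg.norm(v))
                        for v in ys)
            if score < best_score:
                best_score, best_x = score, cand
        xs.append(best_x)
        ys.append(net(best_x))
    return np.array(xs), np.array(ys)

X_anchor, Y_anchor = orthogonal_probe_set(net, in_dim=4, out_dim=3)

# Off-diagonal cosines between the recorded outputs should be small.
Yn = Y_anchor / np.linalg.norm(Y_anchor, axis=1, keepdims=True)
off_diag = np.abs(Yn @ Yn.T - np.eye(3))
print(off_diag.max())
```

    For a real NN, `net` would be the frozen forward pass and a proper optimizer (e.g. CMA-ES or simulated annealing) would replace the random search.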

    I see this as a form of dreaming because we have a wake portion and a sleep portion. During waking we accumulate new experiences. During sleeping we incorporate these experiences into what we already know by “dreaming”, that is, by running small training sessions on our NN.
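    The whole wake/sleep cycle could be sketched like this. For simplicity the anchor set here is built from random self-labeled probes rather than the orthogonal construction (in the continual-learning literature this simplification is known as pseudo-rehearsal); the linear "network", data sizes, and learning rates are all made up:

```python
import numpy as np

rng = np.random.default_rng(1)

def fit(W, X, Y, lr, steps):
    # Plain full-batch gradient descent on mean squared error.
    for _ in range(steps):
        W = W - lr * X.T @ (X @ W - Y) / len(X)
    return W

# Initial training on the original data (assumed discarded afterwards).
X_old = rng.normal(size=(200, 4))
W_true = rng.normal(size=(4, 2))
Y_old = X_old @ W_true
W = fit(np.zeros((4, 2)), X_old, Y_old, lr=0.05, steps=300)

# "Awake": a small batch of new experiences from a slightly shifted task.
X_new = rng.normal(size=(20, 4))
Y_new = X_new @ W_true + 0.1

# "Dreaming": probe the frozen network with random inputs and record its
# own outputs as anchor pairs (self-labeled, not ground truth).
X_anchor = rng.normal(size=(60, 4))
Y_anchor = X_anchor @ W

# Sleep phase: fine-tune on new data + anchors, vs. on new data alone.
X_ft = np.vstack([X_new, X_anchor])
Y_ft = np.vstack([Y_new, Y_anchor])
W_dream = fit(W, X_ft, Y_ft, lr=0.02, steps=500)
W_naive = fit(W, X_new, Y_new, lr=0.02, steps=500)

# How much of the original task was forgotten in each case?
old_err_dream = float(np.mean((X_old @ W_dream - Y_old) ** 2))
old_err_naive = float(np.mean((X_old @ W_naive - Y_old) ** 2))
print(old_err_dream, old_err_naive)  # anchored run should forget less
```

    The anchors pull the weights back toward what the network already knew, so the anchored fine-tune should retain the original task better than fine-tuning on the new data alone.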

  • I guess there are different reasons for different people. But for me, I started using Ubuntu in 2005. When I was learning Linux, it was just not complete enough. You’d install another DE/WM to try it out, and stuff started to break. So I switched pretty quickly. I tried to return every now and then, because it had newer packages which I wanted/needed. But it was never worth it; this or that always broke when you tried to do something peculiar. I still use Ubuntu every now and then, but it is mostly no good. The issue is really just snap. Snap Firefox on the Raspberry Pi, which is the default, is just trash and unusable. It is crazy that they made it the default. I have also had servers where snap services just eat too much CPU, and the first thing I have to do is purge them. So, in summary, I don’t really trust them to provide a reliable system, and I am sceptical of their direction.

  • Not the interview itself, but… I had a personality test before the interview and it felt so fucked up. There were always two completely different statements of, at least to me, questionable morals. Like “I enjoy people’s envy of me having better things” and “In social situations, the conversation should only be about me”. Stuff like that, but not only egoistic statements. Then you had a single scale under the two statements that went from “describes me” to “describes me very well” – covering both statements, with no neutral option. The stated time was like 10 minutes; it took me like an hour. An hour of having to think through whether I should say that “not having sympathy for an abandoned dog describes me”, because the other option was more horrible. Felt fucking traumatized after that.

    It got me the interview, but not the job.