• Gladaed@feddit.org · 30 upvotes · 13 hours ago

      The simplest neural network (simplified). You input a set of properties (first column). Then you take a number of different weighted sums of them, each with DIFFERENT weights (first set of lines). Then you apply a non-linearity to each result, e.g. set it to 0 if negative, keep it the same otherwise (not shown).

      You repeat this any number of times, potentially with a different number of outputs each time.

      Then do this one last time, so that the number of outputs matches the dimension of your desired output, e.g. 2 if you want the sum of the inputs and their product computed (which is a fun exercise!). You may want to skip the non-linearity on this last layer, or do something special™.
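
      For anyone who prefers code: here is a minimal sketch of those steps in plain numpy. The layer sizes and the random weights are made-up placeholders (a real net would learn the weights); only the structure follows the description above: weighted sums, a non-linearity, repeat, then a plain final layer with 2 outputs.

```python
import numpy as np

def relu(x):
    # non-linearity: 0 if negative, keep the same otherwise
    return np.maximum(0, x)

# made-up random weights purely for illustration; a trained net learns these
rng = np.random.default_rng(0)
W1 = rng.normal(size=(4, 8))   # 4 input properties -> 8 weighted sums (first set of lines)
W2 = rng.normal(size=(8, 8))   # repeat with different weights
W3 = rng.normal(size=(8, 2))   # last layer: 2 outputs (e.g. sum and product)

def forward(x):
    h1 = relu(x @ W1)          # weighted sums, then non-linearity
    h2 = relu(h1 @ W2)         # repeat any number of times
    return h2 @ W3             # non-linearity skipped on the final layer

print(forward(np.array([1.0, 2.0, 3.0, 4.0])))
```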

    • Zwiebel@feddit.org · 6 upvotes · edited · 13 hours ago

      To elaborate: the dots are the simulated neurons, and the lines are the links between them. The pictured neural net has four inputs (on the left) leading to the first layer, where each neuron makes a decision based on the input it receives and a predefined threshold, then passes its answer on to the second layer, which connects to the two outputs on the right.
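
      A rough sketch of what a single one of those dots does, with invented weights and threshold (real nets usually use a smoother non-linearity than a hard yes/no, but the idea is the same): it takes a weighted sum of the values arriving over its incoming links and compares it to its threshold.

```python
import numpy as np

def neuron(inputs, weights, threshold):
    # one "dot": weighted sum over the incoming links, then a decision
    # against the threshold (1.0 = fires, 0.0 = stays quiet)
    return 1.0 if np.dot(inputs, weights) > threshold else 0.0

# four inputs, as on the left of the picture; weights and threshold are made up
print(neuron([0.5, 1.0, 0.0, 2.0], [0.2, -0.4, 0.7, 0.1], threshold=0.0))
```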