Commit Graph

19 Commits (main)

| Author | SHA1 | Message | Date |
| --- | --- | --- | --- |
| mandlm | 5e32724b1a | Layers average their output sums | 2015-11-15 16:08:49 +01:00 |
| mandlm | 8d01edb7a1 | Removed naked double * calls | 2015-10-31 14:59:10 +01:00 |
| mandlm | 650b4be9fc | Added forward-feeding of double-ptrs | 2015-10-27 15:33:10 +01:00 |
| mandlm | 81d1f54c98 | Started loader-class for 8x8 pixel digit training data | 2015-10-26 22:05:50 +01:00 |
| mandlm | 37b5153d6a | Cleaned up code to work with MinGW/GCC Kit | 2015-10-26 20:34:26 +01:00 |
| mandlm | b899c6f55e | Replaced a few int with size_t to fix 64 bit warnings. | 2015-10-26 07:33:45 +01:00 |
| mandlm | 99ef63e019 | Added simple (de-)serialization of (trained) nets | 2015-10-25 17:40:22 +01:00 |
| mandlm | 6943fc0116 | Threaded learning and signaling in the Qt UI | 2015-10-24 18:03:07 +02:00 |
| mandlm | 4eb232b1e9 | Working net, calculates the mean value of two inputs | 2015-10-23 22:16:12 +02:00 |
| mandlm | 6ed30e56c4 | Finished the max-value net (2/3/1 neurons) with 10k learning iterations. No good. | 2015-10-22 22:09:35 +02:00 |
| mandlm | f22e4221a1 | Fixed feed-forward algo | 2015-10-22 16:02:27 +02:00 |
| mandlm | 3d30346f2d | Implemented dynamic learning | 2015-10-18 22:05:18 +02:00 |
| mandlm | 6ef1f9657c | Backprop seems to be working, yay | 2015-10-18 21:20:37 +02:00 |
| mandlm | a79abb5db1 | First implementation of weight updates. Very slow rate of change in the output value. | 2015-10-17 22:05:27 +02:00 |
| mandlm | 370451c2e6 | Calculation of hidden neuron gradients (partial) | 2015-10-16 22:59:04 +02:00 |
| mandlm | ea454b58c6 | Added 64-bit configuration and support | 2015-10-15 22:37:13 +02:00 |
| mandlm | 7ba16e9e9d | Renamed a few things, started working on back-propagation | 2015-10-15 19:18:26 +02:00 |
| mandlm | 2f556d1b92 | Added a (hacky) bias neuron | 2015-03-24 13:45:38 +01:00 |
| mandlm | e3a804242c | Split up to different source files, entry-point for back propagation | 2015-03-23 21:58:30 +01:00 |