Author Archives: Nicholas Komsa

How std::span changes C++ fundamentally

In C++, we have all sorts of std containers, such as std::vector. A vector owns its data contiguously, and normally, when a vector is passed to a function, we pass it by reference. In C++20, there is a new alternative: std::span…
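To make the contrast concrete, here is a minimal sketch (my illustration, not code from the post) of a function that takes a std::span instead of a const std::vector&, so it accepts any contiguous sequence of doubles:

```cpp
#include <numeric>
#include <span>
#include <vector>

// Accepts any contiguous range of doubles -- a std::vector, a C array,
// a std::array, or a slice of one -- without copying and without templates.
double sum(std::span<const double> values) {
    return std::accumulate(values.begin(), values.end(), 0.0);
}

int main() {
    std::vector<double> v{1.0, 2.0, 3.0};
    double total = sum(v);             // the whole vector
    double head  = sum({v.data(), 2}); // just the first two elements
    return total > head ? 0 : 1;
}
```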
Software development and sneaky algorithms
So I have been engineering my own neural network technology for some time, running tests and experiments along the way. What I have discovered about backpropagation is that errors in the backpropagation algorithm may not break it completely. It can work in partial completeness, or with some amount of error, to…
The smallest XOR network and lottery ticket
I am creating networks that have two input nodes, x*y hidden nodes, and one output node. Whether such networks can be created comes down to odds, which depend on the initial state of the starting network pool. I have a pool of 2000 randomly generated networks, and I try to…
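As a rough sketch of the pool idea (the hidden count, weight range, and pass/fail test are my assumptions, not the post's code): generate 2000 random two-input, H-hidden, one-output networks and count how many already compute XOR before any training.

```cpp
#include <array>
#include <cmath>
#include <random>

// A tiny 2-input, H-hidden, 1-output network with random weights.
struct Net {
    static constexpr int H = 2;                   // hidden nodes (x*y in the post)
    std::array<std::array<double, 3>, H> hidden;  // 2 weights + bias per hidden node
    std::array<double, H + 1> out;                // H weights + output bias

    double run(double a, double b) const {
        double acc = out[H]; // output bias
        for (int i = 0; i < H; ++i) {
            double h = std::tanh(hidden[i][0] * a + hidden[i][1] * b + hidden[i][2]);
            acc += out[i] * h;
        }
        return std::tanh(acc);
    }

    // Call it a win if every XOR case lands on the right side of zero.
    bool solvesXor() const {
        return run(0, 0) < 0 && run(1, 1) < 0 && run(0, 1) > 0 && run(1, 0) > 0;
    }
};

int main() {
    std::mt19937 rng{42};
    std::uniform_real_distribution<double> dist{-2.0, 2.0};

    // Pool of 2000 randomly generated networks, as in the post.
    int winners = 0;
    for (int n = 0; n < 2000; ++n) {
        Net net;
        for (auto& node : net.hidden)
            for (auto& w : node) w = dist(rng);
        for (auto& w : net.out) w = dist(rng);
        winners += net.solvesXor();
    }
    return winners > 0 ? 0 : 1; // some fraction of the pool typically solves XOR already
}
```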
C++ comma operator and parallelization
I have never had a reason to use the comma operator; however, writing some modern code, it seems to be required. Say you have some series of variables and you want to perform a common operation on the group. New C++ is always a fun thing, I know. We could change this to be a generic function…
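The excerpt cuts off before the post's code, but the pattern that usually makes the comma operator show up in modern generic code is a fold expression over a parameter pack. A hypothetical sketch:

```cpp
#include <iostream>
#include <utility>

// Apply one operation to a whole group of arguments. The fold expression
// (f(args), ...) chains the calls together with the comma operator.
template <typename F, typename... Ts>
void forEach(F f, Ts&&... args) {
    (f(std::forward<Ts>(args)), ...);
}

int main() {
    int a = 1; double b = 2.5; long c = 3;
    forEach([](auto& v) { v *= 2; }, a, b, c); // doubles a, b, and c
    std::cout << a << ' ' << b << ' ' << c << '\n';
}
```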
Errors and Progress in MNIST recognition
So, in my earlier post, I said I had a “convergent MNIST network”. At the time I was excited and wrote that in haste. What that network had actually been doing: it had been trained on null and on the digits, but only one image of each digit was ever trained into the network.
Non-Convolutional Image Recognition
I have had some difficulty finding numbers for MNIST training times, so I am going to post some of mine and also discuss what my network is doing. So far in my work on MNIST, I have generated a convergent network. Using CPU only, a Ryzen 7 1700, I train a convergent network in under…
Code Simplicity of binary files and C++ wonder
There is a lot of code online for reading the MNIST dataset, and I have produced my own version, which uses the Binary/Reader classes discussed in a previous post. So, you could be thinking: binary image file stuff with C++, oh no! I have seen some of the parsers easily obtainable online, written in C++,…
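The post's Binary/Reader classes aren't shown in this excerpt; as a point of comparison, here is a plain standard-library sketch of the same job. The MNIST image file's header is four big-endian 32-bit integers (magic, image count, rows, cols), followed by raw pixel bytes:

```cpp
#include <cstdint>
#include <fstream>
#include <vector>

// MNIST stores its integers big-endian, so assemble each one byte by byte.
static std::uint32_t readBigEndian32(std::istream& in) {
    unsigned char b[4];
    in.read(reinterpret_cast<char*>(b), 4);
    return (std::uint32_t(b[0]) << 24) | (std::uint32_t(b[1]) << 16) |
           (std::uint32_t(b[2]) << 8)  |  std::uint32_t(b[3]);
}

std::vector<std::vector<std::uint8_t>> readMnistImages(const char* path) {
    std::ifstream in(path, std::ios::binary);
    if (readBigEndian32(in) != 0x00000803) return {}; // wrong magic number
    std::uint32_t count = readBigEndian32(in);
    std::uint32_t rows  = readBigEndian32(in);
    std::uint32_t cols  = readBigEndian32(in);

    std::vector<std::vector<std::uint8_t>> images(
        count, std::vector<std::uint8_t>(rows * cols));
    for (auto& img : images)
        in.read(reinterpret_cast<char*>(img.data()), img.size());
    return images;
}

int main() {
    auto images = readMnistImages("train-images-idx3-ubyte"); // standard MNIST filename
    return images.empty() ? 1 : 0;
}
```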
Movement to OpenCL++
In my project, I have been thinking about data-oriented design (DOD) quite a lot during the creation of new code. I'm going to present some of these concepts in the current implementation and then describe moving to a specifically DOD language, OpenCL++. Moving to OpenCL++ seems like a natural extension of the DOD-style C++ I already use. So, in…
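As a sketch of what DOD-style C++ usually means (my illustration, not the post's code): store each field of a "neuron" in its own contiguous array, instead of interleaving fields in a struct. A layout like this also maps naturally onto the flat buffers an OpenCL kernel consumes.

```cpp
#include <cstddef>
#include <vector>

// Array-of-structs: each neuron's fields interleave in memory.
struct NeuronAoS { float activation, bias, error; };

// Data-oriented structure-of-arrays: one contiguous array per field,
// so a pass that only touches activations streams straight through memory.
struct LayerSoA {
    std::vector<float> activation, bias, error;

    explicit LayerSoA(std::size_t n)
        : activation(n), bias(n), error(n) {}

    void addBias() {
        for (std::size_t i = 0; i < activation.size(); ++i)
            activation[i] += bias[i]; // trivially vectorizable loop
    }
};

int main() {
    LayerSoA layer(1024);
    layer.addBias();
}
```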
Backpropagation is a class of random search algorithms
This is evident when watching networks learn using my visualization program. The backpropagation algorithm creates spontaneous patterns which are sculpted into the network, briefly or for some period; then there is a random shift to a new concept, with a new pattern that is just as short-lived. This process repeats…
Visualizing network training
Here is a video of where my project is at. Basically, it provides a graphical visualization of the network weights. I am going to experiment with generating and examining this sort of image throughout training. In this video, the network is training on six problems, and it slowly learns for about 45 seconds. After that,…
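The visualization program itself isn't shown here; one simple way to produce this kind of image (my sketch, with hypothetical names, not the author's tool) is to map each weight to a grayscale pixel and write a PGM file that any image viewer can open:

```cpp
#include <algorithm>
#include <cmath>
#include <fstream>
#include <vector>

// Map a weight matrix to grayscale: -max..+max becomes black..white,
// with zero-weight pixels sitting at mid-gray.
void writeWeightImage(const std::vector<std::vector<float>>& weights,
                      const char* path) {
    float maxAbs = 1e-6f;
    for (const auto& row : weights)
        for (float w : row) maxAbs = std::max(maxAbs, std::fabs(w));

    std::ofstream out(path, std::ios::binary);
    out << "P5\n" << weights[0].size() << ' ' << weights.size() << "\n255\n";
    for (const auto& row : weights)
        for (float w : row) {
            int pixel = static_cast<int>(127.5f * (w / maxAbs + 1.0f));
            out.put(static_cast<char>(std::clamp(pixel, 0, 255)));
        }
}

int main() {
    // A dummy 28x28 weight matrix stands in for a real trained layer.
    std::vector<std::vector<float>> weights(28, std::vector<float>(28, 0.5f));
    writeWeightImage(weights, "weights.pgm");
}
```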