ChatGPT2 Animator Adventure

I followed the RaffK project to its completion and did much more feature development, such as a complete chat feature. I created a lot of Modern and even screaming-future C++; it is sort of crazy. This ChatGPT2 animation experiment involved training ChatGPT2 on scrolling Shakespeare 1,000 times using SGD with a learning rate of 0.0002. After SGD… Continue reading “ChatGPT2 Animator Adventure”
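As a rough illustration of the update rule described in that excerpt (plain SGD at a learning rate of 0.0002), here is a minimal C++ sketch. The `Param` struct and its gradient field are assumptions for the example, not the project's actual types.

```cpp
#include <cstdio>
#include <vector>

// Hypothetical parameter record: a weight paired with its gradient.
struct Param {
    float value;
    float grad;
};

// One plain SGD step: w <- w - lr * dL/dw, then clear the gradient.
void sgd_step(std::vector<Param>& params, float lr) {
    for (auto& p : params) {
        p.value -= lr * p.grad;
        p.grad = 0.0f;
    }
}

int main() {
    std::vector<Param> params{{0.5f, 1.0f}, {-0.3f, -2.0f}};
    sgd_step(params, 0.0002f);   // learning rate quoted in the post
    std::printf("%f %f\n", params[0].value, params[1].value);
}
```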

Golden Ticket Exploration

So, if we have made some network in an initial state, not like this one: and we applied a typical backpropagation training process using SGD at 0.002 on MNIST, we could get an image somewhat like this: This is an image of a trained Golden Ticket that I created, which is not Dyson Hatching; I will describe that later. What… Continue reading “Golden Ticket Exploration”

Dyson Hatching Golden Ticket

Nick: So now it is time to make use of the Voronoi diagram. To begin, I want to calculate the average slope of the edges of each cell. ChatGPT: The slope of a line is typically calculated as the difference in the y-coordinates of two points divided by the difference in the x-coordinates of those… Continue reading “Dyson Hatching Golden Ticket”
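The slope formula ChatGPT quotes (Δy / Δx) can be applied per edge of a cell and averaged. A minimal sketch follows; the `Point` type and the vertex-list representation of a cell are assumptions here, and near-vertical edges are skipped to avoid division by zero, which the original conversation does not specify.

```cpp
#include <cmath>
#include <cstdio>
#include <vector>

struct Point { double x, y; };

// Average slope (dy/dx) of the edges of one Voronoi cell, given its
// vertices in order. Near-vertical edges are skipped so we do not divide
// by zero; how the original project handles them is not stated.
double average_edge_slope(const std::vector<Point>& cell) {
    double sum = 0.0;
    int count = 0;
    for (std::size_t i = 0; i < cell.size(); ++i) {
        const Point& a = cell[i];
        const Point& b = cell[(i + 1) % cell.size()];  // wrap to close the cell
        double dx = b.x - a.x;
        if (std::fabs(dx) < 1e-9) continue;            // skip vertical edges
        sum += (b.y - a.y) / dx;
        ++count;
    }
    return count ? sum / count : 0.0;
}

int main() {
    std::vector<Point> cell{{0, 0}, {2, 1}, {1, 3}};
    std::printf("average slope: %f\n", average_edge_slope(cell));
}
```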

Non-Convolutional Image Recognition

I have had some difficulty finding numbers for MNIST training times, so I am going to post some of mine and also discuss what my network is doing. So far in my work on MNIST, I have generated a convergent network. Using CPU only, a Ryzen 7 1700, I train a convergent network in under… Continue reading “Non-Convolutional Image Recognition”
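For context on what "non-convolutional" means here, a fully connected layer is the basic building block instead of a convolution. The sketch below shows one such dense layer over a flattened 28×28 MNIST image; the 10-output size is only an example, since the excerpt does not give the actual architecture.

```cpp
#include <cstdio>
#include <vector>

// One fully connected layer: out = W * in + b, with no convolutions.
// 784 inputs is the flattened 28x28 MNIST image; the output size here is
// illustrative, not the dimensions used in the original network.
std::vector<float> dense(const std::vector<float>& in,
                         const std::vector<float>& W,  // rows * cols, row-major
                         const std::vector<float>& b) {
    const std::size_t rows = b.size();
    const std::size_t cols = in.size();
    std::vector<float> out(rows, 0.0f);
    for (std::size_t r = 0; r < rows; ++r) {
        float acc = b[r];
        for (std::size_t c = 0; c < cols; ++c)
            acc += W[r * cols + c] * in[c];
        out[r] = acc;
    }
    return out;
}

int main() {
    std::vector<float> image(784, 0.0f);             // flattened 28x28 input
    std::vector<float> W(10 * 784, 0.01f), b(10, 0.0f);
    auto logits = dense(image, W, b);
    std::printf("first logit: %f\n", logits[0]);
}
```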