Golden Ticket Exploration

So, suppose we have made some network in its initial state, not unlike this one. If we then apply a typical backpropagation training process using SGD with a learning rate of 0.002 on MNIST, we could get an image something like this. This is an image of a trained Golden Ticket that I created, which is not Dyson Hatching; I will describe it later. … Continue reading “Golden Ticket Exploration”
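As a rough illustration of the training step mentioned above, here is a minimal sketch of a single SGD weight update using the 0.002 learning rate from the post; the weight and gradient vectors are placeholders, not the actual Golden Ticket network code.

#include <vector>

// Minimal SGD step: w <- w - lr * grad, with the 0.002 learning rate
// mentioned in the post. The vectors stand in for one layer's flattened
// weights and their gradients (illustrative only, not the original code).
void sgd_step(std::vector<double>& weights,
              const std::vector<double>& grads,
              double lr = 0.002)
{
    for (std::size_t i = 0; i < weights.size(); ++i)
        weights[i] -= lr * grads[i];
}

In a full training loop this update would be applied after each backpropagation pass over a batch of MNIST examples.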

Dyson Hatching Golden Ticket

Nick: So now it is time to make use of the Voronoi diagram. To begin, I want to calculate the average slope of the edges of each cell. ChatGPT: The slope of a line is typically calculated as the difference in the y-coordinates of two points divided by the difference in the x-coordinates of those… Continue reading “Dyson Hatching Golden Ticket”
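A small sketch of the edge-slope averaging discussed in that exchange, assuming each Voronoi cell arrives as an ordered list of vertices; the types and the handling of vertical edges are my own assumptions, not the original code from the conversation.

#include <cmath>
#include <vector>

struct Point { double x, y; };

// Average slope of a cell's edges, where slope = (y2 - y1) / (x2 - x1).
// Near-vertical edges are skipped to avoid dividing by zero (an assumption;
// the original exchange does not say how such edges were handled).
double averageEdgeSlope(const std::vector<Point>& cell)
{
    double sum = 0.0;
    int count = 0;
    for (std::size_t i = 0; i < cell.size(); ++i) {
        const Point& a = cell[i];
        const Point& b = cell[(i + 1) % cell.size()];  // wrap around to close the polygon
        double dx = b.x - a.x;
        if (std::fabs(dx) < 1e-12) continue;           // vertical edge: slope undefined
        sum += (b.y - a.y) / dx;
        ++count;
    }
    return count > 0 ? sum / count : 0.0;
}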

ChatGPT and Golden Ticket

So, some time ago, when ChatGPT first came out, the first really cool thing I made was my Golden Ticket algorithm. I’m going to discuss the Golden Ticket code and the production method I used with the original ChatGPT. This is an easy task because my chat history is saved by OpenAI. … Continue reading “ChatGPT and Golden Ticket”

Configurations of random variables

In my neural network program, I refactored some code and produced an error which I did not notice for some time. When I would run the program, eventually only a single network, or a few, out of 100 would learn the pathing problem. There is no difference in how they are trained; they are all taught exactly… Continue reading “Configurations of random variables”

My AI study so far

I have been studying neural networks for some time, and recently, during a YSU hackathon, I managed to make interesting progress. After about a year-long break, I returned to this code, made a large amount of progress, and a number of topics have presented themselves in my C++ software. I’m going to describe some of my… Continue reading “My AI study so far”