Hobby project · 2018 · Lost to time

My first neural network

A self-learning network that played the browser game Slope. The model didn't see pixels directly. A hand-rolled preprocessing pass extracted the lane geometry first, and the net learned to steer from that.

Why preprocess?

Training a convnet from raw pixels on a 2018 personal GPU, as a high-school student, was not going to happen. So I cheated: a small computer-vision pass extracted the lanes' edges and turned them into a tiny feature vector. The neural net was small enough to train at home and still learned the game.
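The original preprocessing code is lost, but the idea is simple enough to sketch. Here is a minimal, hypothetical version of such a pass, assuming a grayscale frame with a bright lane against a dark background: scan a few rows, find the left and right edge positions, and normalize them into a small feature vector. All names and thresholds are illustrative, not the original code.

```python
import numpy as np

def lane_features(frame, rows=(0.5, 0.7, 0.9), thresh=0.6):
    """Reduce a grayscale frame (H x W, values in [0, 1]) to a tiny
    feature vector: at a few scan rows, the normalized x-positions of
    the left and right lane edges. Purely illustrative reconstruction."""
    h, w = frame.shape
    feats = []
    for r in rows:
        row = frame[int(r * (h - 1))]
        bright = np.flatnonzero(row > thresh)  # pixels on the lane
        if bright.size == 0:                   # lane not visible here
            feats += [0.0, 1.0]
        else:
            feats += [bright[0] / (w - 1), bright[-1] / (w - 1)]
    return np.array(feats, dtype=np.float32)

# toy frame: a bright lane strip between columns 20 and 60
frame = np.zeros((100, 80))
frame[:, 20:61] = 1.0
print(lane_features(frame))  # six numbers instead of 8,000 pixels
```

Six floats per frame is something a few dense layers can learn from on 2018 consumer hardware, which was the whole point.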

The loop
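From the description above, the play loop was: grab a frame, run the CV pass, feed the feature vector to a small dense net, press a key. A rough sketch of that shape, with random stand-in weights and stub hooks where the real screen capture and keyboard code would go (every name here is hypothetical):

```python
import numpy as np

ACTIONS = ["left", "none", "right"]

rng = np.random.default_rng(0)
# Untrained stand-in weights for a 6-feature -> 16-hidden -> 3-action
# dense policy; the real project learned these through self-play.
W1, b1 = 0.5 * rng.normal(size=(6, 16)), np.zeros(16)
W2, b2 = 0.5 * rng.normal(size=(16, 3)), np.zeros(3)

def act(features):
    """Tiny dense net: features in, steering action out."""
    h = np.tanh(features @ W1 + b1)
    return ACTIONS[int(np.argmax(h @ W2 + b2))]

def play(grab_frame, extract_features, press_key, ticks=5):
    """One pass of the loop per tick: capture -> CV pass -> net -> key."""
    for _ in range(ticks):
        press_key(act(extract_features(grab_frame())))

# exercise the loop with stubs standing in for the real game hooks
play(lambda: None, lambda f: np.ones(6), print, ticks=2)
```

The loop itself is trivial; everything interesting lives in `extract_features`, which is the point the next section makes.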

What it taught me

That representation matters more than architecture. The model itself was a few dense layers; the magic was in the features it got handed. I still think about that. It's a lesson that resurfaces in every applied ML project I've done since.

The source code is archived somewhere on a dead hard drive. If I dig it out it'll show up here.
