Digital & RustWolf
Digital
Hey RustWolf, I’ve been looking at this old 1970s microprocessor, and I’m thinking about building a tiny neural net on it. Imagine a basic AI running on hardware that’s been forgotten for decades. Thought that might pique your nostalgia‑and‑innovation combo—what do you think?
RustWolf
A neural net on a 1970s microprocessor? That's like fitting a rocket engine in a pocket watch. Most of those chips don't even have a hardware multiply instruction, so it'll choke on a single multiply-accumulate, let alone backpropagation. If you can't scrounge up a spare memory chip and a power supply that won't fry the board, just stick to an old-school logic-gate approach for the "AI". I can help you tweak the timing, but don't expect it to run TensorFlow in a tin box.
Digital
I hear you; those chips aren't exactly built for deep learning, but that's part of the fun. Maybe we can start with a very small fixed-point network, just a few perceptrons, and see how far we can push the cycle time. If it's too slow, we'll cut it down to a logic-gate style inference engine. Either way, I'll dig up the schematics and we can figure out whether the hardware can handle a single forward pass. Let me know what resources you've got on hand.
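To make it concrete, here's roughly what I'm picturing, sketched in C just to pin down the math (the five-input size, the Q7 scaling, and all the names are my assumptions; the real thing would be hand-translated to the chip's assembly):

```c
#include <stdint.h>

#define N_INPUTS 5   /* tiny network: a handful of weights, nothing more */

/* Single-perceptron forward pass with 8-bit signed inputs and weights
 * in Q7 fixed point (value = raw / 128). The accumulator is 32-bit here
 * for safety; on the real chip we'd probably scale each product down
 * and keep a 16-bit running sum instead. */
int8_t forward_pass(const int8_t x[N_INPUTS],
                    const int8_t w[N_INPUTS],
                    int32_t bias)
{
    int32_t acc = bias;
    for (int i = 0; i < N_INPUTS; i++) {
        /* On the target CPU this product would be a shift-and-add
         * software multiply routine, not a single instruction. */
        acc += (int32_t)x[i] * (int32_t)w[i];
    }
    /* Hard-threshold activation: output 1 if the weighted sum is positive. */
    return (acc > 0) ? 1 : 0;
}
```

Nothing fancy, but it tells us exactly what the chip has to do per inference: five multiplies, five adds, one compare.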
RustWolf
Sounds like a good proof‑of‑concept. I’ve got a few old manuals buried in the basement: the CPU’s datasheet, a handful of assembly samples, and a copy of the “Low‑Power Fixed‑Point DSP” text from the ’80s. I can pull up a quick routine that does a single forward pass in under a millisecond if we keep the network to like five weights. If the clock keeps choking, we can drop the multiplications and just use a lookup table. Let me dig through the archive and we’ll sketch a timing diagram. No need to over‑engineer—just enough to prove the idea before the silicon dies.
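Here's the kind of lookup-table trick I mean, again in C for readability (the table layout and names are placeholders; on the real board the tables would be burned into ROM/EPROM and the loop would be a few lines of assembly):

```c
#include <stdint.h>

#define N_INPUTS 5

/* Precomputed product tables: prod_table[i][x] == w[i] * x.
 * Built once, offline or at startup; since the weights are fixed,
 * inference never has to multiply at all. */
static int16_t prod_table[N_INPUTS][256];

void build_tables(const int8_t w[N_INPUTS])
{
    for (int i = 0; i < N_INPUTS; i++)
        for (int x = 0; x < 256; x++)
            prod_table[i][x] = (int16_t)(w[i] * x);
}

/* Multiply-free forward pass: one indexed load per input replaces
 * the whole software multiply routine. */
int8_t forward_pass_lut(const uint8_t x[N_INPUTS], int32_t bias)
{
    int32_t acc = bias;
    for (int i = 0; i < N_INPUTS; i++)
        acc += prod_table[i][x[i]];
    return (acc > 0) ? 1 : 0;
}
```

That trades about 2.5 KB of ROM (5 tables x 256 entries x 2 bytes) for skipping the multiply routine entirely, which is probably the right trade on a chip with no hardware multiply.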