From Blueprint to Behemoth: The Evolution of AI Neural Networks

Patrick Targun :: Mar 5, 2024

Back in the '50s, when folks were just starting to mess around with what we now call AI, they kicked things off with this idea of "neural networks." These neural networks weren't much to write home about by today's standards, but they laid the groundwork. We've had the blueprint for this AI beast in our hands for ages. We just needed enough horsepower and data to get it running.

In a human brain, each neuron is hooked up to roughly a thousand others, which works out to about 100 trillion connections in total. In an AI model, the rough analog of a connection is a parameter, and we've been trying to mimic the brain's scale with computers for decades. So far, bigger has generally meant better: as parameter counts climb, models keep improving. That's why the latest AI models are starting to feel like they've got some real power under the hood.
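If you want to sanity-check that 100 trillion figure, the arithmetic is simple. Here's a quick back-of-envelope sketch in Python; both inputs are rough order-of-magnitude estimates, not precise measurements.

```python
# Back-of-envelope check on the "100 trillion connections" figure.
# Both inputs are rough order-of-magnitude estimates.

neurons = 86e9                 # ~86 billion neurons in a human brain
connections_per_neuron = 1e3   # ~1,000 synapses per neuron

total_connections = neurons * connections_per_neuron
print(f"~{total_connections:.1e} connections")  # ~8.6e13, i.e. ~100 trillion
```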

GPT-4 reportedly has around 1.76 trillion parameters (a widely repeated rumor; OpenAI has never confirmed a number), so when are we gonna see a 100 trillion parameter AI? Anthropic just released Claude 3 without disclosing its parameter count, but their technical report says it was trained in part on synthetic data. If synthetic data holds up, the supply of training data stops being a hard limit. That would mean both parameter count and data could keep scaling, leaving compute as the final bottleneck.
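To see why compute looms so large, here's a rough sketch using the common ~6 × parameters × tokens FLOPs rule of thumb for training dense transformers. The token budget and cluster throughput below are illustrative assumptions on my part, not figures from any lab.

```python
# Rough training-compute estimate for a hypothetical 100T-parameter
# model, using the ~6 * N * D FLOPs rule of thumb for dense models.

params = 100e12              # hypothetical 100-trillion-parameter model
tokens = 20 * params         # assumed ~20 tokens per parameter (Chinchilla-style)
total_flops = 6 * params * tokens  # ~1.2e30 FLOPs

# Assume a cluster sustaining 1e19 FLOP/s -- an optimistic, made-up
# figure, roughly tens of thousands of modern accelerators.
cluster_flops_per_sec = 1e19
years = total_flops / cluster_flops_per_sec / (365 * 24 * 3600)
print(f"Total compute: ~{total_flops:.1e} FLOPs")
print(f"Training time: ~{years:,.0f} years at 1e19 FLOP/s")
```

Even with generous assumptions, that works out to thousands of years on a single big cluster at today's throughput, which is the sense in which compute is the last bottleneck standing.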

But if there's one thing I know, it's that with the right combination of parts, fuel, and a little bit of elbow grease, there's no limit to what we can build.
