I have a book from the early 90s which goes over some emerging technologies at the time. One of them was bubble memory. It was supposed to have the cost per MB of a hard drive and the speed of RAM.
Of course, that didn’t materialize. Flash memory outpaced its development, and even flash is still not quite as cheap as hard drives or as fast as RAM. Bubble memory had a few niche uses, but it never hit the point of being a mass market product.
Point is that you can’t assume any singular technology will advance. Things do hit dead ends. There’s a kind of survivorship bias in thinking otherwise.
AI is not a technology; it’s just a name for things that were hard to do. It used to be that playing chess better than a human was considered AI, but once it turned out you could brute-force it, it wasn’t considered AI anymore.
A lot of people don’t consider AlphaGo to be AI, even though neural networks are exactly the kind of technique that’s considered AI.
AI is a moving target, so when we get better at something, we stop considering it true AI.
I’m quite aware of the history of the field, thanks. It’s had a lot of cycles of fast movement followed by a brick wall. You can’t assume it’ll have a nice, smooth upward trajectory.
I think the problem is that you believe you’re talking like a time traveler heralding the wonders of sliced bread, when really it’s more like telling a small Victorian child about the wonders of Applebee’s: in the unlikely event they live long enough to see it, they find everything is a lukewarm, microwaved, pale imitation of just buying the real thing at Aldi and cooking it themselves in less time, with far better results, at a fraction of the cost.
Stockfish can’t play Go. The resources you spent making the chess program didn’t port over.
In the same way you can use a processor to run a completely different program, you can use a GPU to run a completely different model.
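To make that concrete, here’s a minimal sketch (assuming PyTorch; the two toy networks are illustrative stand-ins, not real chess or Go engines). The same device runs both unrelated models, the way the same CPU runs unrelated programs:

    import torch
    import torch.nn as nn

    device = "cuda" if torch.cuda.is_available() else "cpu"

    # Toy "chess-style" evaluator: flat board features -> one scalar score.
    chess_eval = nn.Sequential(
        nn.Linear(768, 256), nn.ReLU(), nn.Linear(256, 1)
    ).to(device)

    # Toy "Go-style" policy net: 19x19 board planes -> per-move logits.
    go_policy = nn.Sequential(
        nn.Conv2d(17, 64, kernel_size=3, padding=1),
        nn.ReLU(),
        nn.Flatten(),
        nn.Linear(64 * 19 * 19, 19 * 19),
    ).to(device)

    # Same hardware, two completely different models.
    score = chess_eval(torch.randn(1, 768, device=device))
    logits = go_policy(torch.randn(1, 17, 19, 19, device=device))
    print(score.shape, logits.shape)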
So if current models can’t do something, you’d be foolish to bet that models twenty years from now still won’t be able to do it.
Buy any bubble memory lately?