I wouldn’t say a normal CPU is inefficient at graphics or cryptography, rather that a specialized GPU is particularly efficient at those tasks.
We only call a CPU slow at these tasks because a GPU is so much faster at them; we never see how much worse a GPU is at general computation tasks, because it's so stupendously bad at them that nobody seriously tries.
As soon as operations need to share info, the GPU speed advantage is gone. Branching paths bog a GPU down with redundant execution. Latency is quite poor too. And exceptions & interrupts are basically impossible at the system level. Trying to run normal programs on the GPU would be a disaster.
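To make the "redundant execution" point concrete: GPU lanes in a warp execute in lockstep, so an if/else that splits the lanes forces the hardware to run both paths, masking off the lanes that didn't take each one. Here's a toy Python model of that cost (the function and names are mine, purely illustrative, not real GPU code):

```python
# Toy model of SIMT branch divergence: a "warp" of lanes runs in
# lockstep, so an if/else makes the whole warp pay for every path
# that at least one lane takes. Illustrative sketch only.

def simt_branch_cost(predicates, then_cost, else_cost):
    """Cycles a lockstep warp spends on an if/else.

    predicates: per-lane branch condition results.
    If any lane takes a path, the whole warp executes that path.
    """
    cost = 0
    if any(predicates):       # at least one lane takes the 'then' path
        cost += then_cost
    if not all(predicates):   # at least one lane takes the 'else' path
        cost += else_cost
    return cost

# A CPU thread pays for exactly one path; a divergent warp pays for both.
uniform   = simt_branch_cost([True] * 32, then_cost=10, else_cost=10)
divergent = simt_branch_cost([True] * 16 + [False] * 16, 10, 10)
print(uniform, divergent)  # 10 20
```

When every lane agrees, the warp is as cheap as a single scalar branch; the moment the lanes split, the costs of both sides add up, which is exactly why branchy general-purpose code runs so poorly on a GPU.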