MarcellusDrum@lemmy.ml to Privacy@lemmy.ml · English · 16 hours ago
This is getting laughably ridiculous
kkj@lemmy.dbzer0.com · 8 hours ago
Fair enough. I wish them well in the effort. It would be nice if Nvidia threw them a bone, though, what with all the AI money and their GPUs being used in so many Linux supercomputers and servers.
FauxLiving@lemmy.world · edited 5 hours ago
There's a project working on making CUDA work on all (read: AMD) graphics cards. It's alpha-level, but the progress makes it look promising.
https://www.tomshardware.com/software/a-project-to-bring-cuda-to-non-nvidia-gpus-is-making-major-progress-zluda-update-now-has-two-full-time-developers-working-on-32-bit-physx-support-and-llms-amongst-other-things
e: Tom's Hardware links are half the size of the article 😂