Tux@lemmy.world to Technology Memes@lemmy.world · 4 days ago
Software: Then vs Now (image, lemmy.world; cross-posted to: memes@lemmy.ml)
Mako_Bunny@lemmy.blahaj.zone · 4 days ago: What Python code runs on a graphics card?
apfelwoiSchoppen@lemmy.world · 4 days ago: Phyton, not Python. 🙃
BougieBirdie@lemmy.blahaj.zone · 4 days ago: Python has a ton of machine learning libraries; I'd maybe even go so far as to say it's the de facto standard for AI development. There are also some CUDA libraries which, by definition, do things directly on the card.
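For illustration, here's a minimal sketch using CuPy (one such CUDA-backed library; the thread doesn't name a specific one, so this is just an assumed example). The script itself is plain Python, but the array math executes as CUDA kernels on the graphics card:

```python
import cupy as cp  # assumed example library: a NumPy-like API backed by CUDA

# These arrays are allocated in GPU memory, not host RAM.
a = cp.random.rand(1024, 1024)
b = cp.random.rand(1024, 1024)

# The matrix multiply runs on the graphics card as a CUDA kernel.
c = a @ b

# float() copies the single scalar result back to the host for printing.
print(float(c.sum()))
```

PyTorch and similar ML frameworks work the same way under the hood, which is the sense in which "Python code" ends up running on the GPU.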
Tarogar@feddit.org · 4 days ago: Yes, it's possible to have that, even though it doesn't do that by default. The CPU can be, and in a fair few cases still is, the bottleneck, and you bet you can run shitty code on there.