NinjaZ@infosec.pub to Technology@lemmy.world · English · 1 day ago
China scientists develop flash memory 10,000× faster than current tech (interestingengineering.com)
gravitas_deficiency@sh.itjust.works · 21 hours ago
You’re willing to pay $none to have hardware ML support for local training and inference? Well, I’ll just say that you’re gonna get what you pay for.
bassomitron@lemmy.world · 20 hours ago
No, I think they’re saying they’re not interested in ML/AI. They want this super-fast memory available for regular servers for other use cases.
caseyweederman@lemmy.ca · 1 hour ago
I have a hard time believing anybody wants AI. I mean, AI as it is being sold to them right now.
Precisely.