Posted by blululu on Hacker News (25 points, 15 comments).
I'm trying to set up a server to run ML inference. I need to provision a somewhat beefy GPU with a decent amount of memory (8–16 GB). Does anyone here have personal experience with, or recommendations about, the various companies operating in this space?
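Not a provider recommendation, but once an instance is provisioned, a quick sanity check like the sketch below (assuming PyTorch on an NVIDIA GPU; the 8 GB threshold is just the lower end of the figure above) confirms the box actually exposes the VRAM you're paying for:

```python
# Minimal sketch: verify the provisioned instance exposes enough GPU memory.
# Assumes PyTorch with CUDA support; the threshold is a placeholder.
import torch

def check_gpu_memory(min_gb: float = 8.0) -> bool:
    """Return True if the first visible CUDA device has at least min_gb of VRAM."""
    if not torch.cuda.is_available():
        print("No CUDA device visible; check drivers or instance type.")
        return False
    props = torch.cuda.get_device_properties(0)
    total_gb = props.total_memory / 1024 ** 3
    print(f"{props.name}: {total_gb:.1f} GB total VRAM")
    return total_gb >= min_gb

if __name__ == "__main__":
    check_gpu_memory()
```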