floppybiscuits

joined 2 years ago
[–] floppybiscuits@lemmy.world 10 points 2 days ago

I would pick up an inference GPU to run models locally. I can see a benefit in that, especially if they're going cheap

[–] floppybiscuits@lemmy.world 10 points 2 days ago

Read his multipart series on arguing with AI boosters. He covers silly arguments like this one.

Also, to paraphrase Cory Doctorow: you're not going to keep breeding these mares to run faster until one day they birth a locomotive...

[–] floppybiscuits@lemmy.world 8 points 2 weeks ago

I'm so unbelievably angry because I love this trilogy, but I also see your point 🥲