Ask HN: Will local models on normal hardware ever compete?

Summary

I have a MacBook Air M3 with 24 GB of RAM. The other day, I wanted to try running an LLM locally for the first time ever. I ran Gemma 3n E4B and threw some chats at ...
