Hacker News

The original pithy comment I was replying to was arguing that we’ll become dependent on a service run by another company. I don’t see that being true, for two reasons:

1. You are not forced to use the AI in the first place.

2. If you want to use one, you can self-host one of the open models.

The fact that, at any given moment, the open models aren’t equivalent in capability to the SOTA paid models is beside the point.




Ok. I don’t think hosting a capable open model is a realistic option for the vast majority of consumers.

A full LLM? No. Not yet.

But there are new tools like sweep [0] that you can now run locally.

And 2-3 years ago, capable open models weren’t even a thing. Now we’ve made progress on that front, and I believe they’ll keep improving (both in accessibility and competency).

[0]: https://news.ycombinator.com/item?id=46713106




