
Notice that all these buzzwords you list actually correspond to real advances in the field. All of them were improvements on something existing: not a big revolution, for sure, but definitely measurable improvements.




Those are not "real advances in the field", which is why they are constantly abandoned for the next new buzzword.

Edit:

This just in:

https://news.ycombinator.com/item?id=46870514#46929215

The Next Big Thing™ is going to be "context learning", at least if Tencent have their way. And why do we need that?

>> Current language models do not handle context this way. They rely primarily on parametric knowledge—information compressed into their weights during massive pre-training runs. At inference time, they function largely by recalling this static, internal memory, rather than actively learning from new information provided in the moment.

>> This creates a structural mismatch. We have optimized models to excel at reasoning over what they already know, yet users need them to solve tasks that depend on messy, constantly evolving context. We built models that rely on what they know from the past, but we need context learners that rely on what they can absorb from the environment in the moment.

Yep. Reasoning is so 2025.
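
(In concrete terms, the "mismatch" in that excerpt boils down to the difference between the two call patterns below. This is only a sketch; generate() stands in for whatever LLM API wrapper you happen to use and is not anything from the linked paper.)

    # Hypothetical helper: generate(prompt) -> str wraps any LLM API.

    def answer_from_weights(generate, question):
        # Parametric knowledge: the model answers purely from what was
        # compressed into its weights during pre-training.
        return generate(question)

    def answer_from_context(generate, question, fresh_context):
        # In-context ("context") learning: the messy, up-to-date information
        # is handed to the model at inference time and it is told to use it.
        prompt = (
            "Use only the context below to answer.\n\n"
            "Context:\n" + fresh_context + "\n\n"
            "Question: " + question
        )
        return generate(prompt)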


I think you might be salty because the words have become overused and overhyped, and often 90% of the people jumping on the bandwagon are indeed just parroting the new hot buzzword without really understanding what they're talking about. But the terms you mentioned are all obviously very real and very important in applications using LLMs today. Are you arguing that reasoning was vaporware? None of these things were meant to be the final stop of the journey, just the next step.

Excuse me? I'm "salty"? What the hell are you talking about?

Why doesn't this site have a block user button?



