
If this were currently possible, wouldn't it rapidly lead to sentient/superhuman AI?

>tell AI to make itself more efficient by finding performance improvements in human written code

>that newly available processing power can now be used to find more ways to improve itself

>flywheel effect of AI improving itself as it gets smarter and smarter

Eventually you'd turn it loose on improving the actual hardware it runs on. I think the real question now is how far transformers can be taken and whether they are really the path to "real" AI.



Within a couple of years, improvement processes like the one you suggest will actually be really dangerous and stupid.

Also, don't confuse intelligence with other human/animal characteristics like sentience. They are different things. Things like sentience, a subjective stream of experience, or other aspects of being alive don't just accidentally fall out of larger training datasets.

And we should be glad. The models are going to be orders of magnitude faster (and perhaps X times higher IQ) than humans within a few years. It is incredibly foolish to try to make something like that into a living creature (or emulation of living).


Intelligence is about action, and sentience is about qualia, which I equate to perceptions coloured by values. Action is visible and qualia are hidden, but they are closely interconnected: we choose our actions in accordance with our values and situation at hand.


> Things like sentience, subjective stream of experience, or other aspects of being alive don't just accidentally fall out of larger training datasets.

I disagree, language is all we need. Agency? Encode your “internal needs” as prompts, periodically generate prefixes from these, append them to incoming prompts. Self-awareness? Summarize this internal dialogue, reflect on it with a few iterations, add the results to the common prefix. Sentience? Attach some sensors, summarize their observations with the language model, prepend to prefix. Actions? Make the model output commands that some servos or other interfaces understand. Etc, etc.
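The loop described above can be sketched in a few dozen lines. This is a minimal illustration of the commenter's recipe, not a working agent: `llm` is a stub standing in for any text-completion call, and all class and method names are invented for the example.

```python
# Sketch of the agent loop described above. `llm` is a placeholder for
# any text-completion API; it is stubbed here so the control flow runs
# standalone.
def llm(prompt: str) -> str:
    return f"[model output for: {prompt[:40]}]"  # stub


class Agent:
    def __init__(self, needs):
        self.needs = needs   # "internal needs" encoded as prompts
        self.prefix = ""     # common prefix carrying internal state

    def update_needs(self):
        # Agency: periodically generate a prefix from the internal needs,
        # to be appended to incoming prompts.
        self.prefix = llm("Express these needs as goals: " + "; ".join(self.needs))

    def reflect(self, iterations: int = 2):
        # Self-awareness: summarize the internal dialogue, reflect on it
        # for a few iterations, add the result to the common prefix.
        summary = self.prefix
        for _ in range(iterations):
            summary = llm("Reflect on this internal state: " + summary)
        self.prefix += "\n" + summary

    def sense(self, readings: str):
        # Sentience (per the comment): summarize sensor observations with
        # the language model and prepend them to the prefix.
        self.prefix = llm("Summarize observations: " + readings) + "\n" + self.prefix

    def act(self, task: str) -> str:
        # Actions: in a real system the output would be parsed as commands
        # that servos or other interfaces understand.
        return llm(self.prefix + "\nTask: " + task)
```

Whether looping a language model through such scaffolding yields anything like genuine agency or awareness is, of course, exactly the point in dispute.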

And, of course, it would be extremely _cool_ to make something like that into a living creature, and lots of labs are already doing it. Fear and luddism should not stand in the way of curiosity.

If we humans cannot improve our own intelligence, making something smarter than us is an evolutionary imperative.


That's not what I meant. What you describe is deliberate engineering, not something accidentally falling out of training on larger and larger datasets, which some people think will produce digital consciousness through "emergence".

It is almost certain that the next stage of intelligence will be digital. But it is very foolish and unnecessary to try to speed that along.

It is likely that we have a century or two max left in control of the planet, regardless of what we do. On some level I agree that totally suppressing it indefinitely would be a shame.

When I said "living" I meant digital life. Such as those things you describe and others including control-seeking, self-preservation, and reproduction which are all central to living beings.

The problem is that AI will soon think 100 or more times faster than humans. This is anticipated based on the history of increases in computing efficiency and the fact that we are now optimizing a very specific system (LLMs). Humans will not in any way be able to keep up.

This is not luddism. I have a service that connects GPT-4 to Linux VMs on the internet to install or write software. I think this technology is great and has a lot of positive potential.

But when you deliberately try to emulate animals (like humans) and combine that with hyperspeed and other superintelligent characteristics, you are essentially approaching suicide, or at least abdicating all responsibility for your environment. There is no way to prevent such a thing from making all of your decisions for you.

The speed difference will be incredible. Imagine a bullet time scene where everyone seems to be moving in extreme slow motion. Now multiply by 10 so they are so slow they seem completely frozen.

This level of performance is coming in five years or less.

While I don't want to suppress the evolution of intelligent life in our corner of the universe, I also am not ready to join a death cult. Especially not accidentally.



