(1) Computing is growing as a whole. What was good enough in 1980 isn't good enough now. Computing is used for more problems, and by more people. Programs are bigger, so there is pressure toward compressing common code and idioms. Hardware is more capable and more complex: it's more heterogeneous, dynamic, has more sensors, etc.
(2) Different problems have different use cases. Your "wart and bloat" is another person's feature, and vice versa.
Multi-core hardware, multithreading, and GPUs (especially GPUs) are major reasons that features have been added to C++, and those were much less of a concern around 1980 when C++ was conceived.
The "cycle" you lay out isn't accurate because languages in fact do make progress. We're not going around in a circle. Rust fixes a lot of things wrong with C and C++; Go and Swift are also improvements, etc. (I don't even use those languages, but I can recognize the improvements as someone who's been using C and C++ for a long time).
I recommend writing some C from scratch to get a feel for this. I do it on occasion, because there are advantages. But you will also be extremely hard-pressed to do anything that's interesting for a user. The gap to bridge is very large.
For example, try writing a web app in C (not for production). You can save a lot of code by using CGI, but even then it's not fun. (It IS still done; look at the source code to the modern cgit UI if interested.)
Also try modifying and understanding source code from the '90s, like GNU bash or CPython. (Interestingly, there is a pretty big difference between those two codebases, despite their similar vintage.) Nonetheless, it should be clear from that experience that we've made progress.
The progress isn't perfect, but freezing languages in time isn't a reasonable option, given the massive change in the problems being solved, and the environment.