A large percentage of the code I've written over the last 10 years is Go. I think it does somewhat better than the others in some areas, such as relative simplicity and a robust stdlib, but a lot of this is false security. The simplicity is surface level: the runtime and GC are very complex. And the stdlib being robust means that if you ever have to implement a compiler from scratch, you have to implement all of std too.

All in all I think the end result will be the same. I don't think any of my Go code will survive long term.





I’ve got 8-year-old Go code that still compiles fine on the latest Go compiler.

Go has its warts but backwards compatibility isn’t one of them. The language is almost as durable as Perl.


8 years is not that long. If it can still compile in, say, 20 years, then sure, but 8 years in this industry isn't that long at all (unless you're into self-flagellation by working on the web).

Except 8 years is impressive by modern standards. These days, most popular ecosystems have breaking changes that would cause even just 2-year-old code bases to fail to compile. It's shit and I hate it. But that's one of the reasons I favour Go and Perl -- I know my code will continue to compile with very little maintenance years later.

Plus 8 years was just an example, not the furthest back Go will support. I've just pulled up a project I'd written against Go 1.0 (the literal first stable release of Golang). It's well over a decade old now, uses C interop too (so it's not a trivial Go program), and I've not touched the code in the years since. It compiled without any issues.
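For illustration, the kind of cgo program that still builds unchanged on a current toolchain looks roughly like this (a minimal sketch; the C function is hypothetical filler, not from the project above):

    package main

    /*
    // A trivial C function, compiled by cgo alongside the Go code.
    static int add(int a, int b) { return a + b; }
    */
    import "C"

    import "fmt"

    func main() {
            // Call into C through the cgo bridge -- this pattern has
            // existed since Go 1.0.
            fmt.Println("2 + 3 =", C.add(2, 3))
    }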

Go is one of the very few programming languages with an official backwards compatibility guarantee. This does lead to some issues of its own (e.g. some new features have ended up less elegant because the Go team favoured approaches that didn't change the existing syntax).
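For what it's worth, that guarantee is written down as the "Go 1 compatibility promise", and since modules were introduced each project also records the language version it targets, so newer toolchains know what semantics old code expects. A minimal sketch of a go.mod (the module path is a placeholder):

    // go.mod -- the `go` directive records the language version this
    // module was written against.
    module example.com/oldproject

    go 1.13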


8 years is only "not that long" because we have gotten better at compatibility.

How many similar programs written in 1999 compiled without issue in 2007? The dependency and tooling environment is as robust as it's ever been.


> because we have gotten better at compatibility.

Have we though? I feel the opposite is true. These days, developers expect the users of their modules and frameworks to be regularly updating those dependencies, pulling them dynamically from the web.

While this is true for active code bases, you'll quickly find that stable but unmaintained code rots as its dependencies are deprecated.

There aren't many languages whose wider ecosystems think about API stability in terms of years.


If they change the syntax, sure, but you can always keep using today's compiler if necessary. I generally find Go binaries have even fewer external dependencies than a C/C++ project.
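As a rough illustration (the binary name is a placeholder; the ldd output is what you'd typically see on glibc Linux), building with cgo disabled usually produces a fully static binary with no runtime linkage at all:

    $ CGO_ENABLED=0 go build -o app .
    $ ldd ./app
            not a dynamic executable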

On the scale of decades, that's an incorrect assumption, unless you mean running the compiler within an emulated system.

It depends on your threat model. Mine includes the compiler vendors abandoning the project and me needing to make my own implementation. Obviously unlikely, and someone would probably step in for all the major languages, but I'm not convinced Go adds enough over C to give away that control.

As long as I have a stack of ESP32s and a working C compiler, no one can take away my ability to make useful programs, including maintaining the compiler itself.


For embedded that probably works. For large C programs you're going to be just as stuck as you are with Go.

I think relatively few programs need to be large. Most complexity in software today comes from scale, which usually results in an inferior UX. Take Google Drive, for example: it's very complicated to build a system like that, but most people would be better served by a WebDAV server hosted by a local company. You'd get way better latency and file transfer speeds, and the company could use off-the-shelf OSS, or write their own.
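To make the "off the shelf" point concrete, a minimal WebDAV server is roughly this much Go, using the golang.org/x/net/webdav package (the served directory and port are placeholders):

    package main

    import (
            "log"
            "net/http"

            "golang.org/x/net/webdav"
    )

    func main() {
            // Serve a local directory over WebDAV, with in-memory locking.
            h := &webdav.Handler{
                    FileSystem: webdav.Dir("/srv/files"),
                    LockSystem: webdav.NewMemLS(),
            }
            log.Fatal(http.ListenAndServe(":8080", h))
    }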


