wpollock's comments

Or more simply, use

   su -c 'echo 3 > /proc/sys/vm/drop_caches'

    echo 3 | sudo tee /proc/sys/vm/drop_caches
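Either way, the trick is the same: a plain redirection is performed by your (unprivileged) shell before sudo or su ever runs, so the write has to happen inside the privileged process:

    sudo echo 3 > /proc/sys/vm/drop_caches       # fails: the redirect runs as you
    echo 3 | sudo tee /proc/sys/vm/drop_caches   # works: tee itself runs as root

And if I'm remembering the kernel's vm sysctl docs right: 1 drops the page cache, 2 drops dentries and inodes, and 3 drops both.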

Interestingly, the IETF has published several RFCs for text protocols, all of which require \r\n line endings.

<https://www.rfc-editor.org/old/EOLstory.txt>

Note this does not apply to file formats (except for RFCs).
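You can see the mandated CRLFs with a bare-hands request; something like this, assuming the usual netcat:

    # HTTP/1.1, like SMTP and friends, ends every protocol line with \r\n
    printf 'HEAD / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n' \
        | nc example.com 80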


> In general, I find it unacceptable for programs to use (anywhere in) my file system, besides /tmp, as a dumping ground for their caches and downloads, without cleaning it up.

/tmp must be world-writable and for multi-user or multi-tenant systems it becomes a security hole. Storing temporary files in the current user's home directory (or a subdirectory thereof) makes sense.

What doesn't make sense is this blog post about TMP and TEMP ending with "I don't know why" (in different words).

The reason is simple: different programmers thought the other name was bad. They were under no obligation to come to a consensus.

Don't forget about TEMPDIR and TMPDIR! Also Windows has its own environment variables for this. But generally, Linux software ported to Windows still uses TMP or TEMP.
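Most Unix tools check $TMPDIR first and fall back to /tmp; GNU mktemp(1) makes this easy to see:

    TMPDIR=/var/tmp mktemp   # creates something under /var/tmp
    env -u TMPDIR mktemp     # falls back to /tmp

(On Windows, if memory serves, GetTempPath does its own dance: TMP first, then TEMP, then USERPROFILE.)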


> /tmp must be world-writable and for multi-user or multi-tenant systems it becomes a security hole. Storing temporary files in the current user's home directory (or a subdirectory thereof) makes sense.

It makes sense when it's a user option. If /tmp isn't an option due to security concerns, then use $CWD by default. I can always alter the config to some other location if I do not like it. With the number of programs that litter $HOME, especially with caches, you have to whitelist directories when backing it up. With a naive rsync, you'll find half your transfer is junk.
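Something like this is what I end up doing; the exclude paths here are just examples of the usual offenders, and backup-host is whatever you back up to:

    rsync -a \
        --exclude='.cache/' \
        --exclude='.npm/' \
        --exclude='.cargo/registry/' \
        "$HOME/" backup-host:backups/home/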


Thomas Jefferson famously said that "A coreutils rewrite every now and again is a good thing". Or something like that.

When I was a beta tester for UNIX System V Release 2, I collected as many bug reports as possible from Usenet (posting under the name "the shell answer man"; looking back, I conclude that arrogance is generally inversely proportional to age) and sent a patch for each one I could verify. Something like 100 patches.

So if this Rust rewrite cleans up some issues, it's a good thing.


> ... Unicode says that 0xFF is an invalid character.

Not so. You may be thinking of UTF-8 encoding. 0xff is DEL in Unicode.


DEL is Unicode codepoint U+007F, which is the byte 0x7F in UTF-8, not 0xFF. Perhaps you were thinking of ÿ, codepoint U+00FF, which encodes to the bytes 0xC3 0xBF in UTF-8.

I was thinking of DEL, but was obviously mistaken. Thanks for catching that!

The "char" type in D represents a UTF-8 code unit, the byte 0xFF is not a valid character code and is strictly forbidden.

My mental model of columnar storage is the old notion of parallel arrays, which I used in the 1970s with FORTRAN. Whatever you learned first sticks with you and you end up translating everything to it, or at least I do. I believe this is known as baby duck syndrome.
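The analogy holds up: a column store keeps one array per field, and record i is whatever sits at index i across all of them. In shell terms (made-up data, obviously):

    # one "column" per field; index i across the arrays is one record
    names=(ada grace ken)
    years=(1815 1906 1943)
    for i in "${!names[@]}"; do
        echo "${names[$i]}, born ${years[$i]}"
    done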


The source and destination addresses don't change. If a bomb takes out a router in between (the military scenario DARPA had in mind), it is NOT IP (L3) or TCP (L4) that handles it. Rather, it is a dynamic routing protocol that informs all affected routers of the changed route. Since the early days of the Internet, that's been the job of routing protocols.

For smaller internets, protocols such as RIP (limited to 15 hops; a metric of 16 means unreachable) broadcast routing information from each still-working router to the others. Each router builds a picture of the internet (simplifying a bit here: RIP and similar protocols use "distance vector" routing, while more advanced link-state protocols such as OSPF do give each router a full map). So when a packet arrives at a router, that router can forward the packet toward its destination. Such protocols are "interior" routing protocols, used within an ISP's network.

The Internet is too big for such automatic routing and uses an "exterior" routing protocol called BGP. This protocol routes packets from one ISP to the next, using route and connectivity information input by humans. (Again I'm simplifying a bit.)
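On Linux you can watch both layers make their decisions, assuming the usual iproute2 and traceroute tools:

    ip route get 8.8.8.8   # the next hop your local routing table picks
    traceroute 8.8.8.8     # the actual router-by-router path, hop by hop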

Wifi uses entirely different protocols to route packets between cells.

Fun fact: wifi is not an acronym for anything; the inventors simply liked how it sounded.


> Fun fact: wifi is not an acronym for anything; the inventors simply liked how it sounded.

Most certainly it's a reference to "Sci-Fi" or "Hi-Fi".


I always thought Wi-Fi meant wireless fidelity? (Or wireless fiction, since in the end everything is wired.)


It doesn't, but the phrase was used in the early days.

https://boingboing.net/2005/11/08/wifi-isnt-short-for.html


It was made to sound like "Hi-Fi" (which stands for "high fidelity") and "wireless", but "wireless fidelity" is a meaningless phrase and not what it was intended to directly mean.


> I notoriously write bad English...

You mean that you write in English badly. :-)


This. Except worse: during busy days you had to stand in line for an hour or more for a turn on the machines. I believe the skill of debugging by mentally stepping through a program's execution came from such long turnaround times, a useful skill many younger programmers lack.


> a useful skill many younger programmers lack.

Because it’s unnecessary.

It’s not a difficult skill.

When folks are in that situation, they tend to adapt quickly to their reality. But that’s not the reality for the vast majority of developers today.

Thankfully.


Yep, I really hate the characterisation that tries to imply people are weaker or worse because they lack a skill that's only relevant in certain contexts.

I spent about 6 months teaching myself how to tie a set of useful knots, and the reality is that by now I can't do most of them anymore, because day to day it turns out I just never need to tie a midshipman's knot (it's super useful when the situation arises... which is rarely, for an IT worker).


The computer can single-step through the program far more accurately than you can. You can inspect the full state of the CPU and memory at any moment of execution. The debugger can tell you the real, exact value of a variable at runtime.

There is simply no reason to try doing this in your head. You're worse at it than the debugger is. And I say this as someone who does have the skill. It's just not necessary.
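With gdb, for instance, on anything built with -g (prog and count being whatever your program and variable are called):

    gdb ./prog
    (gdb) break main    # stop wherever you like
    (gdb) run
    (gdb) next          # step one source line at a time
    (gdb) print count   # the real, exact value, right now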


I want to learn that.

It’s just silicone. Who hard could it be?


It's probably not wise to say so on HN, but one possible strategic goal of this war was to distract from the Epstein files.


Now they need Epstein files to distract from the war.


Why do you think Melania started talking about Epstein again when she did?

