SyzygyRhythm's comments (Hacker News)

The article mentions that some flights produce a net cooling effect. I wonder if it could be cost-effective to divert flights toward contrail formation when it's predicted that they'll produce cooling (I also wonder what the actual circumstances are when they produce cooling--low surface temperatures, maybe?).


Perhaps also early morning flights where most of the contrail's lifespan will be in sun.


I went to Space Camp in Huntsville in '89 or so. One of the perks was a daily showing at their Omnimax theater. It felt absolutely incredible at the time. The most memorable moment was a scene filmed from the Space Shuttle tower escape system--basically a basket on a zipline that goes into a sand pit. Everyone in the theater instinctively leapt forward when the basket hit the sand. The difference, I suppose, is that the screen filled your peripheral vision as well. I didn't experience the same level of immersion until VR, much later on.


You can prove it easily by induction. If Wm(n) is a mustard watch containing n micrograms of mustard, then it suffices to show that Wm(0) exists, and that if Wm(n) exists then Wm(n+1) must exist. Obviously a single additional microgram of mustard could not overload the structure of a watch. Therefore Wm(10^100), or a watch of any other size, must exist.


It's the original definition of "pushing the envelope." You start with a known flight envelope, then push a little past that. Sometimes you learn that the real envelope is bigger than you thought. Other times you find that the envelope is exactly where you thought it was.


I notice I mistyped and should have said you can't overstate the importance of Starlink to the development program. And I agree with you about pushing the envelope. It was really interesting to hear that they were planning to test an engine-out scenario with this booster by lighting just one of the middle ring of ten engines and two of the center engines. If they can pull that off, they'll have a measurable increase in fault tolerance over Falcon 9.


is igniting the engine pushing the envelope though?


We don't know all the things they did to the Booster, but among them were deliberately not igniting some engines as well as taking a more aggressive angle-of-attack on descent (the rocket is a fairly effective lifting body, as it turns out!).

There may be more, but of those two I think the latter was the bigger problem. The booster would have gotten hotter and more physically stressed, and then weakened to the point where re-igniting the engines caused it to fail.

They also used a new hot-staging maneuver, where the gases were directed out one side so that it flipped more rapidly in the other direction. It was a really fast flip! A rocket the size of a small skyscraper turning 90 degrees in just a few seconds. That could have jarred something loose, too.

Hopefully we find out in the post-mortem. SpaceX doesn't typically give the public as much detail as we'd like, but they're pretty good at sharing the high-level reasons why something failed.


>SpaceX doesn't typically give the public as much detail as we'd like

It gives orders of magnitude more detail than anyone else.


ITAR unfortunately limits what can be publicly released.


I disagree with the article's claim that the geometric mean lacks physical meaning.

Say you have two benchmarks with different units, frames/second and instructions/second. You can't take the arithmetic mean of these unless you divide by some baseline first (or a conversion factor).

But the geometric mean has well-defined units of (frames * instructions)^0.5 / second. And the reason you can divide one geometric mean by another is that these units are always the same.

Having coherent units isn't exactly the same as "physical meaning", but it's a prerequisite at the least.
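The units argument above is easy to check numerically. Here's a small sketch (the benchmark names and numbers are made up for illustration) showing that the ratio of geometric means is unchanged when one benchmark's unit is rescaled, which is exactly why dividing one geometric mean by another is legitimate:

```python
import math

# Two machines scored on two benchmarks with different units:
# frames/second and instructions/second (numbers are made up).
machine_a = {"render_fps": 120.0, "compute_ips": 3.0e9}
machine_b = {"render_fps": 100.0, "compute_ips": 4.0e9}

def geomean(values):
    return math.prod(values) ** (1 / len(values))

# Ratio of geometric means: any unit scale factor appears in both
# numerator and denominator, so it cancels.
ratio = geomean(list(machine_a.values())) / geomean(list(machine_b.values()))

# Rescale one benchmark's unit (report frames/minute instead of
# frames/second); the ratio is unaffected.
a2 = {"render_fpm": 120.0 * 60, "compute_ips": 3.0e9}
b2 = {"render_fpm": 100.0 * 60, "compute_ips": 4.0e9}
ratio2 = geomean(list(a2.values())) / geomean(list(b2.values()))

assert abs(ratio - ratio2) < 1e-9
```

An arithmetic mean of the raw scores, by contrast, would change with the unit choice, which is the incoherence the parent comment points at.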


The geometric average is just the arithmetic mean taken in logarithmic units. Units of measurement are arbitrary, and the justification for using logarithmic units for statistical purposes is easily made on distributional grounds. E.g., durations are bounded below by zero, so there will be at least some skew in the distribution (although this can be negligible in practice).
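The identity behind this claim can be verified in a couple of lines; a minimal sketch with made-up durations:

```python
import math

# Made-up benchmark durations (seconds); any positive numbers work.
durations = [1.2, 3.4, 0.9, 2.0]

# Arithmetic mean in log units, mapped back: exp(mean(log(x)))
log_space_mean = math.exp(sum(math.log(x) for x in durations) / len(durations))

# Direct geometric mean: (x1 * x2 * ... * xn)^(1/n)
geo_mean = math.prod(durations) ** (1 / len(durations))

assert abs(log_space_mean - geo_mean) < 1e-12
```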


The article says that the crystals don't affect the taste or scent. The crystals are a signal that you have a good cheese, but not the cause of a good cheese. Adding them to a bad cheese won't make it a good cheese, so in that sense I'd call it a fake.

There is some gray area in that they affect the texture, which is a part of the whole experience. But that's again mostly signaling--we like the crunch because we associate it with good cheeses, not because there's anything inherently better about it.

There are some interesting philosophical questions here. If you put a fake label on some wine, and people perceive it as higher quality than it is, is it really fake? On one hand, obviously yes. And yet there was a real effect on the perceived quality.


> The article says that the crystals don't affect the taste or scent.

That seems hard to believe, frankly.


Drilling the well is the expensive part. The well delivers heat at a relatively constant rate that depends on its geometry. You could size the plant to match the well output exactly--but the generators and such are relatively cheap. So instead you make that part of the plant slightly oversized, so you can run above the well's capacity when electricity is expensive and below it when electricity is cheap. The thermal mass of the rocks lets you average this out over time.


So there are two capacities: that of the well and that of generation (oversized relative to the well). On average this varying scheme utilizes the well at 100% of its capacity (the expensive part to increase) and the turbines at less than 100% of theirs (the cheap part to oversize).


I would say instead that math is a game. A universal game with no predefined rules at all and only one guideline: if the rules you make up lead to a contradiction, then the rules are probably boring. If your rules say that 1+1=3, then you can prove anything and the whole thing becomes uninteresting.

Mathematicians have come up with various rules (axioms) that seem to work pretty well. And they spend a great deal of time figuring out their consequences. But it may still happen that the rules have a contradiction and they need to come up with a different set.

Sometimes mathematicians add extra rules when they run into a roadblock. And part of the meta-game is coming up with the minimum set of extra rules needed to keep going. Sometimes they spend time figuring out whether some of the existing rules are needed at all.


>>I would say instead that math is a game.

Yup, and as you keep going the levels go up!

But the core ideas are simple:

1. Start somewhere you understand things well enough to make sense of them.

2. Make the smallest possible, atomic change to some aspect of the thing you know from step 1.

3. Test whether the change sticks. If yes, repeat steps 1-3.

4. If the change doesn't stick, go back to step 1. Now either make a different change to the same thing or a new change to a different thing. Repeat steps 1-3.

As you can see, you write a lot. Like, really a lot. Math is largely a writing skill.


Infinitesimal calculus is the old-fashioned calculus! It's what Newton and Leibniz invented. Limits only came into play later, when mathematicians wanted a more robust foundation. But then Robinson showed that infinitesimals could be made perfectly rigorous. IMO, non-standard analysis is more intuitive than limit-based calculus.


Ok, it might be more intuitive. But in terms of applications, is there any example where infinitesimal calculus or non-standard analysis has an advantage?


Yes, any time you have to reduce something to a point for analysis in any geometric problem.

You can also vary infinitesimals and use them not just in nonstandard analysis but in fractional calculus, for example in modeling stock market motion.

They have helpful applications in physics, especially field theory.

*

I can imagine, a long time from now, many elegant mathematical constructs being simplified by the use of, e.g., infinitesimals, Clifford algebras, category theory, etc. There are a lot of complicated ideas that are nicely simplified this way, and that even become more intuitive and easier to teach the fundamentals of than via the standard approach.

I think it's important to understand that the canonical calculus approach came from rather mechanical questions in analysis and proofs, and the math is layered with that history, as well as with the notational conveniences of the forms of calculus commonly used for electromagnetism, classical mechanics, etc. There's a lot of legacy syntax there, and we just live with it, but it's not optimal. Infinitesimals are a way to get back to applications and to better syntax.
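One small computational echo of this--not nonstandard analysis proper, just an illustration of infinitesimal-style reasoning--is dual-number arithmetic, where a formal eps with eps^2 = 0 yields exact derivatives with no limits taken. A minimal sketch (the class and function names here are my own, purely illustrative):

```python
from dataclasses import dataclass

# Dual numbers a + b*eps with eps^2 = 0: the eps coefficient of
# f(x + eps) is exactly f'(x), computed without any limits.
@dataclass
class Dual:
    real: float
    eps: float = 0.0

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.real + other.real, self.eps + other.eps)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # (a + b*eps)(c + d*eps) = ac + (ad + bc)*eps, since eps^2 = 0
        return Dual(self.real * other.real,
                    self.real * other.eps + self.eps * other.real)

    __rmul__ = __mul__

def f(x):
    return x * x * x + 2 * x  # f(x) = x^3 + 2x, so f'(x) = 3x^2 + 2

y = f(Dual(2.0, 1.0))  # evaluate at x = 2 + eps
assert y.real == 12.0  # f(2)  = 8 + 4
assert y.eps == 14.0   # f'(2) = 12 + 2
```

This is the mechanism behind forward-mode automatic differentiation, which is arguably the most widely deployed "infinitesimal-flavored" technique in practice.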


Tim Walz claimed there is "no guarantee to free speech on misinformation or hate speech, and especially around our democracy." That's false--the First Amendment has no such carveouts for those things. So it's concerning that Walz would think otherwise.

Hillary Clinton has made similar comments, saying "But I also think there are Americans who are engaged in this kind of propaganda, and whether they should be civilly, or even in some cases criminally, charged is something that would be a better deterrence, because the Russians are unlikely, except in a very few cases, to ever stand trial in the United States." But again, there is no First Amendment carveout for propaganda, Russian or otherwise.

There are some limits to protected speech, but they're narrow and mostly confined to direct incitement to crime or true threats.

