Originally posted by cbfull
Cold air tends to literally suck the energy out of your combustion reaction. If both the fuel vapor and air are very hot when the reaction is
triggered, you will be much closer to getting the maximum amount of energy possible for that reaction.
No, that's not true. You still get the same delta T if the mass and fuel energy are the same. BUT the potential efficiency is higher if the whole process
runs colder. See: Engine efficiency on Wikipedia
If you look at the efficiency equation for an ideal (Carnot) heat engine, eta = 1 - Tc/Th, you can see that eta gets higher as Tc/Th gets smaller. That ratio can be
rewritten as Tc/(Tc + deltaT). If deltaT is fixed by the energy input and the mass of the system, the only way to increase efficiency is to make Tc smaller
(i.e. colder!)
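To make the point concrete, here's a small sketch of that relation (the deltaT value of 1500 K is just an illustrative assumption, not from any particular engine):

```python
# Sketch: Carnot efficiency eta = 1 - Tc/Th, written with Th = Tc + deltaT.
def carnot_efficiency(t_cold_k, delta_t_k):
    """Ideal efficiency for a fixed temperature rise delta_t_k above t_cold_k."""
    t_hot_k = t_cold_k + delta_t_k
    return 1.0 - t_cold_k / t_hot_k  # equivalently delta_t_k / (t_cold_k + delta_t_k)

# Same energy input (same deltaT), different cold-side temperatures:
delta_t = 1500.0  # K, arbitrary illustrative value
for t_cold in (250.0, 300.0, 350.0):
    print(t_cold, round(carnot_efficiency(t_cold, delta_t), 3))
```

Running it shows efficiency falling as Tc rises, even though deltaT never changes.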
In fact, I think it was NASA that developed an engine running on liquid nitrogen (as a cold store) and atmospheric heat (as the heat source), and they achieved an
efficiency of around 77%!
I think the issue with temperature in an internal combustion engine generally revolves around pre-detonation. "High performance" engines run on premium
fuel, and premium fuel is less prone to detonate spontaneously, not more. That resistance is what allows more ignition-timing advance and higher compression.
I believe hydrogen used in an internal combustion engine would have to be run at cooler temperatures (not cryogenic temperatures) in order to produce the best
power output at the best efficiency. The more readily a fuel detonates (more power), the greater the issue of pre-detonation.