Engine Intake Temperature Dyno Difference

FastDriver · SN Certified Technician · Sep 5, 2001
Most gearheads know that letting the motor cool can have a positive effect on performance at the track, but how much of a difference does it really make? I thought some of you might find this interesting.

I got a chance a couple of weeks ago to dyno "Black Jack" (see sig). I was concerned that since my fuel system is only running around 30 psi without vacuum, the car might be running too lean. So we hooked up the wideband and went to work.

I drove the car in off the street and put it right up on the Dynojet chassis dyno (I'll have to ask about the model number). I honestly believed the car would make in the neighborhood of 270 rwhp, because I've had a nearly identical combo with shorty headers that made 270 rwhp in the past, and this combo is running BBK 1 5/8" long-tube headers. So, I expected a bit more.

While hot, the first pass laid down a disappointing 260 rwhp. I honestly thought something was wrong. Maybe the motor is old and there's some blow-by... no idea really, but you wouldn't know it from driving it. The shop owner and I are friends, and I was helping him with a project, so we let the car sit for around an hour to cool. By then the car was so cool it didn't even register on the stock temp gauge, and on pass number 2 the power came up to 265 rwhp, just over a 5 rwhp improvement.

We immediately followed up with another pass. This time the temp gauge was at around the lowest white line, and it again showed about another 5 rwhp improvement, to a total of 270.6 rwhp.

We immediately ran it again, and on the final pass picked up another rwhp but lost about 4 rwtq at peak. Unfortunately, I didn't peek at the temp gauge this time. Now that performance was starting to turn back around, we called it a day.

The thing that stands out to me is the air/fuel ratio during each run. The first and hottest run, at full operating temperature, shows a vast departure to the rich side from the other runs. The leanest run also made the most low-end torque. At peak power, each of the last three runs was within .02 AFR of the others, a perfect place to really see the effect of the temperature difference, assuming timing was also close.

Just a little food for thought! Hope it helps one of you in your tuning and racing endeavors!

[Attachment: Hot vs. cold dyno.JPG]
 


That is why we used to run cool cans and ice down the intakes between passes. Colder air = more HP. Ever notice your car has more pickup on cooler days? It's the same reason intercoolers and cold air intakes are used.
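The "colder air = more HP" point comes straight from the ideal gas law: at a fixed pressure, air density scales with 1/T, so cooler intake air packs more oxygen into each cylinder fill. A quick sketch of the arithmetic (the function name is just illustrative):

```python
# Why colder air = more HP: at fixed pressure, air density goes as 1/T,
# so cooler intake air carries more oxygen per cylinder fill.
def air_density_kg_m3(temp_c, pressure_kpa=101.325):
    R_AIR = 0.287058  # specific gas constant for dry air, kJ/(kg*K)
    return pressure_kpa / (R_AIR * (temp_c + 273.15))

hot = air_density_kg_m3(35.0)   # a hot summer day
cold = air_density_kg_m3(15.0)  # a cool morning
print(f"{(cold / hot - 1) * 100:.1f}% denser at 15 C vs 35 C")  # ~6.9%
```

That few-percent density swing is the same order of magnitude as the power differences seen on the dyno.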
 
After surfing Google, I came up with this approximation:
A 10° change in engine inlet air temperature equals roughly a 1% change in horsepower.

Example:
300 HP at 77° inlet temp
A 30° drop is 3 × 10°, so a 3% gain: 300 HP × 3% = +9 HP, or 309 HP.
A 30° rise is a 3% loss: 300 HP × 3% = -9 HP, or 291 HP.
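The rule of thumb above is easy to put in a few lines (the helper name is hypothetical, just a sketch of the arithmetic):

```python
# Rule of thumb from above: ~1% power change per 10 degF of inlet air temp.
def hp_at_temp(rated_hp, rated_temp_f, actual_temp_f):
    """Estimate HP with the 1%-per-10-degF rule of thumb.
    Air cooler than the rating temp -> more power."""
    delta_f = rated_temp_f - actual_temp_f          # positive when air is cooler
    return rated_hp * (1.0 + 0.01 * (delta_f / 10.0))

print(hp_at_temp(300, 77, 47))    # 30 deg cooler -> ~309 HP
print(hp_at_temp(300, 77, 107))   # 30 deg hotter -> ~291 HP
```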

The current SAE "standard day" is 77°F (25°C) with 0% humidity and a barometric pressure of 29.234 in-Hg.
That's a set of figures seldom seen in the real world. For that reason, auto manufacturers use the SAE correction factor standard J1349. It makes it possible to compare engines from different manufacturers in different parts of the world on an equal basis.

Older cars (the '60s-'70s muscle car era) used the J607 standard, which assumes the engine was run on a 60°F day with 0% humidity and a barometric pressure of 29.92 in-Hg. That makes it difficult to compare older engines with newer ones without some math to equalize the correction figures. The higher barometric pressure and lower inlet air temp give older engines HP ratings that are inflated compared to the engines used in today's cars.

For those interested in how SAE correction works, see Corrected Power
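For reference, here's a sketch of the two correction-factor formulas as they are commonly published. Treat the exact constants as assumptions: the standards have been revised over the years, so check the actual SAE documents before relying on them.

```python
import math

def cf_j1349(dry_pressure_kpa, inlet_temp_c):
    """SAE J1349 correction factor (newer standard, 25 C / 99 kPa dry air),
    in its commonly published form."""
    return 1.18 * (99.0 / dry_pressure_kpa) * \
        math.sqrt((inlet_temp_c + 273.0) / 298.0) - 0.18

def cf_j607(pressure_inhg, inlet_temp_f):
    """SAE J607 correction factor (older standard, 60 F / 29.92 in-Hg)."""
    return (29.92 / pressure_inhg) * math.sqrt((inlet_temp_f + 460.0) / 520.0)

# Each factor comes out to 1.0 at its own standard-day conditions:
print(cf_j1349(99.0, 25.0))   # ~1.0
print(cf_j607(29.92, 60.0))   # ~1.0
```

Multiplying observed dyno power by the factor gives the corrected number, which is why the same pull "reads" differently under the two standards.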
 
Perhaps the most interesting part of the experiment to me was that there's a happy medium. When the car was at its coldest, it was 6 rwhp under peak. This is obviously not due to the density of the air, but to the adjustments the computer made to timing and fuel. At the vertical line, the AFR was identical, and we still see that the coldest (red) run made less power. I think the computer must be pulling timing for whatever reason.
 
I also noticed a significant change in ET. With my car (no AFR gauge), with the motor at about 130-150 degrees (on the Autometer gauge, after a quick burnout), it will run .1-.2 quicker than at full operating temps (180-190). When I also kept the nitrous bottle pressure correct (900 psi) and the engine temps the same, I'd pick up even more over a hot engine. When I ran the TNT "brackets" (these were just trophy racing and a small purse), I always ran the engine at full operating temps because it was dead consistent. That car would run 13.80-13.82 every pass (as long as I didn't do something stupid, like miss a gear). But in "run a personal best" mode, I always iced the intake and kept the engine cold.
 
I also wonder how much the ACT sensor comes into play. I don't recall at what temp it kicks in, but a heat-soaked ACT sensor will tell the ECU to pull a little timing out of the curve. A lot of guys relocate the sensor out of the manifold and into the intake tract closer to the MAF meter, where the air temp is closer to ambient. There must be some merit to it, as all of the later Mustangs were designed just this way.

Will it help? I can't say for sure, but it certainly couldn't hurt.
 
I swear I've forgotten more about these cars than I remember. What is ACT again? Is this ambient temperature?

No, that is the BAP sensor. The ACT (air charge temperature) sensor is normally screwed into the #5 intake runner boss and measures the temp of the incoming air. In the SN95 cars, they relocated it into the intake tube. It is there so the EEC can advance the timing when the engine is cold.