Discussion in 'Fox 5.0 Mustang Tech' started by jason89gt, Sep 6, 2007.
I see I see...guess i never looked hard enough
and very niiice
could be my fuel ethanol burns a little cooler
Wallzy is entirely correct: the “water going through the radiator too fast” theory has been disproven multiple times. In fact, Stant has even published an article on the origins of that myth.
The reality is that the more efficiently heat is transferred to the radiator, the more efficiently the radiator can dissipate it. The reason is fairly simple: the faster the coolant flows through the engine, the lower its heat rise on each pass, and the higher the average radiator temperature relative to the engine outlet temperature. The greater the difference between the average radiator temperature and the temperature of the air flowing through it, the greater the heat dissipation.
As a real world example, the typical heat rise with a standard water pump at idle is about 12 degrees F, so with an engine outlet temperature of 180 degrees, the engine inlet temperature will be about 168 degrees. The temperature drop from the inlet to the outlet of the radiator will, of course, match that heat rise through the engine. This results in an average radiator temperature of 174 degrees F. With an outside temperature of 74F, there is then a 100 degree delta between the average radiator coolant temperature and the ambient air.
If the flow is then doubled, the engine inlet-to-outlet temperature rise, and likewise the temperature drop through the radiator, fall from twelve to six degrees. The heat generated of course remains constant, and so does the necessary difference between the average radiator coolant temperature and the ambient air temperature, since the heat dissipated by the radiator is directly proportional to that difference. The radiator inlet (engine outlet) temperature is now only three degrees above the unchanged 174 degree average, so the engine outlet temperature drops from 180 to 177F.
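If you want to check the arithmetic, here's a quick sketch of the numbers above (the figures are the ones from this example, not measured data, and the function name is just for illustration):

```python
# Heat-rise arithmetic from the example above.

def radiator_temps(engine_outlet_f, heat_rise_f, ambient_f):
    """Return engine inlet temp, average radiator temp, and delta to ambient (deg F)."""
    engine_inlet = engine_outlet_f - heat_rise_f          # also the radiator outlet
    avg_radiator = (engine_outlet_f + engine_inlet) / 2   # radiator inlet = engine outlet
    return engine_inlet, avg_radiator, avg_radiator - ambient_f

# Baseline: 12 F rise at idle, 180 F engine outlet, 74 F ambient
inlet, avg, delta = radiator_temps(180, 12, 74)
print(inlet, avg, delta)   # 168 174.0 100.0

# Doubled flow: the rise halves to 6 F, but the same heat needs the same
# 100 F delta, so the 174 F average holds and the outlet must come down:
outlet_2x = avg + 6 / 2
print(outlet_2x)           # 177.0
```

Same heat out, same average radiator temp, lower peak temp, which is the whole point.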
With regard to the e-pump, most motor designers design for a core temperature between 130 and 155 degrees Celsius, which in this case equates to about a 60 degree C rise over the engine temperature at full power. For every 10 degrees C cooler the motor runs, its life span doubles. Now, if you can run the motor at 25% power most of the time, that heat rise drops by about 45 degrees, so the life span of the motor (and, for that matter, the rest of the pump) increases by 2^4.5, or about 22 times. That's one of the main advantages of varying the speed; the other is that the pump doesn't need to be cycled on and off while the thermostat is closed. With the Meziere rating of 3,000 hours at full speed, the pump would last far longer than a mechanical pump, since the vast majority of the time it could run at a lower power level.
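The "doubles every 10 C" rule works out like this (the 2^(drop/10) form and the 45 degree figure are taken from the reasoning above, so treat it as a rough rule of thumb, not a spec):

```python
# Rough motor-life multiplier: life doubles for every 10 C the core runs cooler.

def life_multiplier(temp_drop_c):
    """How many times longer the motor lasts if it runs temp_drop_c cooler."""
    return 2 ** (temp_drop_c / 10)

# Running at ~25% power knocks roughly 45 C off the ~60 C full-power rise:
print(round(life_multiplier(45), 1))   # 22.6, i.e. about 22x the life
```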
That's because your water temp gauge is basically an idiot light. It won't show a difference of 20-30 degrees at all. It pretty much just tells you if the car is cold, somewhat OK, or overheating, which means it will point to the middle pretty much all the time.
If you want to know what your true temp is, you need an aftermarket water temp gauge.
I would also put the t-stat back in. Running a modern EFI engine too cold can cause more problems than it's worth. They are designed to run hot, and are actually more efficient that way.
If you do it, I suggest you wire it up to be hot in run, so anytime the key is on, the pump is on. This way you can't forget to turn it on. I'd also wire in an override switch that powers the pump with the ignition off (i.e. in the staging lanes).