ECT/EOT/TFT while towing
Now that's funny. It's just a personal thing: I had a heavy background in physics and engineering, and hearing/seeing "Delta" used in such a way is like nails on a chalkboard.
Josh
My current Mishimoto seems to be about the same as before.
Speaking of Mishimoto, I believe there are a couple examples in that thread where the temp difference lessened with the 200 t-stat
Josh
Not doubting it happened for some folks; I'd just like to know the mechanism, because if a temperature differential is reduced, it pretty much seems to me that the flow through the cooler had to change (increase), since the heat-exchange area of the oil cooler is constant. Not sure how that would happen, but I can accept that it does in some situations.
The ECT is the hot water temp and the EOT is the cold oil temp. So, if you raise the overall water temp (and keep the coolant flow the same), then I just can't see anything happening other than the oil temp going up also - by about the same amount.
Q = U * A * ΔT (for the engineering background). You simply can't remove more heat (actually enthalpy) from the oil, i.e. reduce its outlet temp or even keep it the same, with warmer inlet water unless the flow changes.
If there are flaws in that thinking, I certainly would not mind hearing what they are and how it would support a reduction in the ECT/EOT differential.
Thoughts?
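A quick way to sanity-check that argument is an effectiveness-NTU model of a counter-flow oil cooler. Every number below (UA, capacity rates, temps) is made up purely for illustration, not a Ford figure; the point is only the linearity: with the flows fixed, shift both inlet temps up by the same amount and the heat duty stays identical while the oil outlet temp rises by exactly that amount.

```python
import math

def counterflow_outlets(UA, C_oil, C_cool, T_oil_in, T_cool_in):
    """Effectiveness-NTU model of a counter-flow oil cooler.
    C_oil / C_cool are capacity rates (m_dot * cp) for each stream.
    Returns (heat duty, oil outlet temp, coolant outlet temp)."""
    C_min, C_max = min(C_oil, C_cool), max(C_oil, C_cool)
    Cr = C_min / C_max
    NTU = UA / C_min
    if abs(Cr - 1.0) < 1e-12:
        eff = NTU / (1.0 + NTU)
    else:
        e = math.exp(-NTU * (1.0 - Cr))
        eff = (1.0 - e) / (1.0 - Cr * e)
    q = eff * C_min * (T_oil_in - T_cool_in)  # heat moved from oil to coolant
    return q, T_oil_in - q / C_oil, T_cool_in + q / C_cool

# Made-up illustrative numbers, same flows in both runs.
UA, C_oil, C_cool = 50.0, 30.0, 120.0
q1, oil1, _ = counterflow_outlets(UA, C_oil, C_cool, 230.0, 192.0)  # 192 stat
q2, oil2, _ = counterflow_outlets(UA, C_oil, C_cool, 238.0, 200.0)  # 8 F hotter
print(q2 - q1, oil2 - oil1)  # duty unchanged, oil outlet up by the same 8 F
```

With flows and UA fixed, the whole temperature field just translates upward, which is the "oil temp goes up by about the same amount" claim above.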
See a similar concept with the old Ford FE engines (360, 390, 427, 428, etc.). Many think they can run 160-degree t-stats versus the stock 180 and the engine will run cooler.
Nope. What actually happens is the engine ends up running hotter, since the t-stat is open further, causing the coolant to flow faster through the radiator. Not as much heat transfer, and the end result is coolant temps topping 210 or more.
So, what does a person do? Install a 195 t-stat and it stays at 195. The t-stat remains barely open, reducing flow and allowing the radiator to do its job.
That is one example.
Not much different than the Sinister oil cooler with the bigger internal passages: too much flow and not enough saturation time, which equals a hotter EOT.
Josh

I'm leaning towards revising my theory on how a hotter, or at least a proper, t-stat could close the temp gap on ECT/EOT.
Several PM exchanges with Bismic, followed by links to others having the exact same discussion, have led me to believe the hotter coolant provides more heat "capacity," allowing the EOT to cool off more. That's an extremely basic statement of my understanding.
I previously thought the hotter t-stat slowed the coolant down, but upon more reading, that wouldn't really work in an engine cooling system. Looks like, if anything, the slower coolant would increase the temp spread, not reduce it.
Links as follows:
https://www.physicsforums.com/thread...ansfer.198975/
I clued in on Post #19 in this link:
coolant flow question - The Garage Journal Board
Josh
2006 F350 4X4
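That last point, that slower coolant should widen the ECT/EOT spread rather than shrink it, can be sketched with a counter-flow effectiveness-NTU model. All numbers here are illustrative assumptions, not measurements from any truck:

```python
import math

def oil_outlet(UA, C_oil, C_cool, T_oil_in, T_cool_in):
    """Counter-flow effectiveness-NTU sketch; returns the oil outlet temp.
    C_oil / C_cool are capacity rates (m_dot * cp) for each stream."""
    C_min, C_max = min(C_oil, C_cool), max(C_oil, C_cool)
    Cr, NTU = C_min / C_max, UA / C_min
    if abs(Cr - 1.0) < 1e-12:
        eff = NTU / (1.0 + NTU)
    else:
        e = math.exp(-NTU * (1.0 - Cr))
        eff = (1.0 - e) / (1.0 - Cr * e)
    q = eff * C_min * (T_oil_in - T_cool_in)  # heat moved from oil to coolant
    return T_oil_in - q / C_oil

# Halve the coolant-side capacity rate (slower coolant), all else equal.
fast = oil_outlet(UA=50.0, C_oil=30.0, C_cool=120.0, T_oil_in=230.0, T_cool_in=192.0)
slow = oil_outlet(UA=50.0, C_oil=30.0, C_cool=60.0, T_oil_in=230.0, T_cool_in=192.0)
print(fast, slow)  # slower coolant leaves the oil hotter, widening the spread
```

The slower coolant heats up more as it passes through the cooler, shrinking the mean temperature difference and pulling less heat out of the oil, so the oil exits hotter.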

Even if a hotter-running t-stat were able to lower the temp differential between ECT and EOT, it isn't really solving the initial problem. It's only masking the fact that the cooler may be working less efficiently than intended. In my mind, raising the coolant temp to get a lower differential just hides the issue at hand.
Since the Ford spec is a 15-degree difference (unofficially maybe 7 or 8) based on a 192-degree t-stat, then maybe we need a new set of numbers for a 200-degree t-stat. If the cooler's efficiency changes with the ECT, then we can't apply the 192-degree differential at 200; the yardstick for how efficient it is has to change as well.
My logic may be completely off base but it seems logical to me, LOL.
There is also a difference in energy-transfer rates as the temperature changes between the mediums (individually, and maybe more significantly relative to each other) and with the overall temp of the whole system. I tried to research this a bit to understand the significance of the idea, but I got bogged down by the Gibbs equation and some other concepts I don't know, and now my head hurts.

I would also add that the coolant isn't removing more heat from the oil. The oil is returning to the cooler at the same temp (or maybe higher, due to the increase in engine temp) and is exiting at a higher temp; the oil temp increase is less than the coolant temp increase, which would mean it's somehow more efficient, right?
If the 220° stat install was the only change (no fresh coolant or flush, and no oil change), the only variables that are different are the temps and the flow rate.
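One way to frame that bookkeeping: at steady state, the heat the oil sheds across the cooler must equal the heat the coolant picks up, Q = m_dot * cp * ΔT on each side. The flow rates, cp values, and temps below are rough assumptions chosen only to show the balance, not measured values from a 6.0L:

```python
# Steady-state energy balance across the oil cooler.
# All numbers are illustrative guesses, not Ford or measured data.

cp_oil, m_oil = 0.48, 25.0    # BTU/(lb*F), lb/min -- assumed oil properties/flow
cp_cool, m_cool = 0.85, 90.0  # assumed 50/50 glycol mix properties/flow

oil_in, oil_out = 230.0, 205.0
q_oil = m_oil * cp_oil * (oil_in - oil_out)  # heat shed by the oil, BTU/min

# Coolant temperature rise implied by absorbing that same duty:
cool_rise = q_oil / (m_cool * cp_cool)
print(round(q_oil, 1), round(cool_rise, 2))
```

Because the coolant side carries a much larger capacity rate (m_dot * cp), the same duty shows up as a large drop on the oil side but only a few degrees of rise on the coolant side, which is why the two sensors can move by different amounts even with nothing "extra" being removed.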