#41  
Old 10-29-2004, 09:57 PM
Torque1st
Posting Legend
Join Date: Sep 2001
Posts: 30,255
All right, here it comes:

The SAE ran baseline engine wear and fuel consumption tests. They are published in SAE manuals that you can buy for $$$. I was lucky enough to have access to them at a place I worked. There is one manual I would dearly like to have if anyone has a spare $400 they want to give me... Anyway, the baseline for engine wear in the SAE tests was 180°F. For every 10°F under that temperature, engine wear doubled, so at 160°F the engine is wearing at 4 times the rate it would at 180°F. The paper did not go into higher-temperature applications, but it follows that engine wear is probably halved at 190°F.

This wear is caused by a chemical reaction at the cylinder wall that erodes the cast iron surface. The particles of cast iron that are released also abrade the cylinder walls, and the rings, bearings, and other engine components wear more rapidly from those particles being carried in the oil. An oil filter only removes a certain percentage of the particles on each pass (expressed as a beta ratio by filtration engineers), so the more particles produced, the more circulate in the oil. Each particle abrades every surface it contacts and produces still more particles. This wear mechanism has nothing to do with the oil film. Fuel consumption per brake HP also increased at temperatures under 180°F, but I cannot remember those figures.

You can see the effect of coolant-induced wear on many inline-6 engines, where the #1 cylinder, which gets the coldest coolant straight from the water pump, wears much faster than the rest. Check a few of those engines in the junkyard and feel the ridge on #1 compared to #4, 5, or 6... Higher operating temperatures are just one of the reasons we see longer engine life from newer engines. Even higher temps get into other problems with material breakdown and degradation.
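If you want to put rough numbers on that doubling rule and on the filter beta ratio, here is a quick sketch. The extrapolation above 180°F is my guess, not from the SAE paper, and the beta ratios are generic illustrative values, not any particular filter:

```python
# Back-of-the-envelope sketch of the two relationships described above.
# Assumption: wear rate doubles for every 10°F below the 180°F baseline,
# extrapolated as a smooth exponential in both directions.

def relative_wear_rate(coolant_temp_f, baseline_f=180.0):
    """Wear rate relative to the 180°F baseline (1.0 = baseline wear)."""
    return 2.0 ** ((baseline_f - coolant_temp_f) / 10.0)

def filter_single_pass_efficiency(beta_ratio):
    """Fraction of particles (at the rated size) removed per pass.

    Beta ratio = particles upstream / particles downstream, so a beta of 20
    means 1 particle in 20 gets through, i.e. 95% single-pass efficiency.
    """
    return 1.0 - 1.0 / beta_ratio

if __name__ == "__main__":
    for temp in (160, 170, 180, 190):
        print(f"{temp}°F -> {relative_wear_rate(temp):.1f}x baseline wear")
    # Illustrative beta ratios only -- check the spec sheet for a real filter.
    for beta in (2, 20, 75):
        eff = filter_single_pass_efficiency(beta)
        print(f"beta {beta}: {eff:.1%} of rated-size particles removed per pass")
```

The 160°F line reproduces the 4x figure above; the 190°F line is the extrapolated "probably halved" case.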

RapidRuss and Purely Ford are correct that the thermostat controls the minimum operating temperature, and that an engine under load reaches an equilibrium temperature that can be higher than the thermostat opening temperature. Unfortunately, our engines do not always run under a load, and they also run in cooler seasons where the thermostat does control operating temperature. The thermostat also controls how fast an engine reaches that "equilibrium state" when it is under load.
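Here is a toy model of what I mean, with completely made-up constants, just to show the two regimes: at light load the thermostat sets the floor, and at heavy load the equilibrium lands well above the stat rating no matter which stat you run:

```python
# Toy lumped-thermal sketch: the thermostat sets the MINIMUM operating
# temperature, while heat input from the load sets the equilibrium.
# All constants are invented for illustration only, not real engine data.

def coolant_temp_after(heat_in, minutes=60.0, stat_opens_f=180.0, ambient_f=70.0):
    """Crude explicit-Euler march of a single lumped coolant temperature (°F)."""
    temp = ambient_f          # cold start
    thermal_mass = 20.0       # BTU per °F of coolant + metal (made up)
    rejection = 2.0           # BTU/min per °F over ambient at full flow (made up)
    dt = 0.1                  # time step, minutes

    for _ in range(int(minutes / dt)):
        # Thermostat ramps from fully closed to fully open over a 10°F band.
        opening = min(max((temp - stat_opens_f) / 10.0, 0.0), 1.0)
        heat_out = rejection * opening * (temp - ambient_f)
        temp += (heat_in - heat_out) * dt / thermal_mass
    return temp

if __name__ == "__main__":
    print(f"light load, 180°F stat: {coolant_temp_after(70):.0f}°F")   # ~183°F, stat sets the floor
    print(f"heavy load, 180°F stat: {coolant_temp_after(280):.0f}°F")  # ~210°F, load sets the equilibrium
    print(f"heavy load, 160°F stat: {coolant_temp_after(280, stat_opens_f=160.0):.0f}°F")
```

Note the last line: dropping to a 160°F stat does not lower the under-load equilibrium at all, it only lowers the floor.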

Computer modeling has helped engine designers eliminate hot spots and improve cooling in new engines. Fortunately or unfortunately, however you want to look at it, no computer was used in the design of the FE engine; the cooling system was designed by trial and error, with drawings made on paper by a draftsman. The newer engines will operate at higher "average" temps than the old FE. Aluminum heads used on newer engines also help dissipate heat and allow the use of higher compression and more spark advance without detonation. Aluminum heads also help on the old FE.

Personally, I like to run a cold intake manifold with a valley pan to keep hot oil off the bottom of the manifold, and I also block off the exhaust passages in the intake. This keeps the intake charge cooler and is a real help on dry-manifold engines like the 335 series; engines with a wet manifold cannot run as cool. I also run a thermostatically controlled heated-air intake that is just a modified OEM-type system: dual intake snorkels with cold-air inlets and a modified thermal switch to control the heat dampers in the snorkels. This avoids carb icing and fuel-puddling problems.
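For anyone not familiar with the OEM-style heated-air intake, the logic is roughly this. I have simplified it to an on/off switch with a dead band; the real hardware modulates the damper gradually, and the temperatures below are only examples, not the OEM calibration or my modified switch settings:

```python
# Sketch of the heated-air-intake logic described above. A thermal switch in
# the air cleaner decides whether each snorkel damper draws warm air from the
# exhaust-manifold stove or cold outside air. Thresholds are illustrative only.

HOT_AIR_BELOW_F = 85.0    # below this inlet temp, pull stove-heated air
COLD_AIR_ABOVE_F = 105.0  # above this, pull cold air; in between, hold state

def damper_position(inlet_air_temp_f, currently_heating):
    """Return True for 'heated air', False for 'cold air', with hysteresis."""
    if inlet_air_temp_f < HOT_AIR_BELOW_F:
        return True
    if inlet_air_temp_f > COLD_AIR_ABOVE_F:
        return False
    return currently_heating  # dead band: keep doing what we were doing

if __name__ == "__main__":
    state = True  # cold start, dampers on heated air
    for temp in (40, 70, 95, 110, 95, 80):
        state = damper_position(temp, state)
        print(f"inlet air {temp:>3}°F -> {'heated (stove) air' if state else 'cold air'}")
```

The dead band keeps the dampers from hunting back and forth as underhood temperatures swing.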

I hope this answers some of the questions here. Right now I have to get back to work.