I'm not an engineer or physicist, but from the little bit of thermodynamics that I learned, I was under the impression that a hotter running engine is generally more efficient. Of course wear is a very big reason to limit temperatures. And I think *** formation is the other big one (particularly in the US).
How does that improve volumetric efficiency? If anything it reduces VE since it is essentially equivalent to opening the throttle a little bit less than you otherwise would.
However, since there apparently isn't much EGR at full load, I guess I don't really care.
The thermal efficiency is better because if your combustion is cooler, less heat is lost through the cylinder walls into the water jacket. An engine will run more efficiently by either raising the coolant temp (that's the temp you are thinking of), lowering the combustion temperatures, or both. It's all about reducing the temperature gradient across the cylinder walls. However, there is a point at which too much EGR will reduce thermal efficiency. One paper I just read puts the sweet spot at about 5% EGR for reducing *** while still gaining thermal efficiency, but another suggests up to 30% is beneficial. It's gonna take a while to sort through these papers. They are quite interesting though.
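The gradient argument above can be sketched with a toy steady-state conduction model, q proportional to (T_gas - T_coolant). All the numbers and the thermal resistance here are made up for illustration, not real engine data:

```python
# Toy model: heat rejected through the cylinder wall scales with the
# gas-to-coolant temperature gradient (Fourier conduction, lumped into
# one arbitrary thermal resistance). Numbers are purely illustrative.

def wall_heat_flux(t_gas_c, t_coolant_c, r_thermal=1.0):
    """Heat flux across the wall in arbitrary units: gradient / resistance."""
    return (t_gas_c - t_coolant_c) / r_thermal

baseline    = wall_heat_flux(2000, 90)   # hot combustion, normal coolant
with_egr    = wall_heat_flux(1700, 90)   # EGR-diluted, cooler combustion
hot_coolant = wall_heat_flux(2000, 110)  # same combustion, hotter coolant

# Either change shrinks the gradient, so less heat goes to the jacket:
assert with_egr < baseline
assert hot_coolant < baseline
```

Both levers (cooler combustion via EGR, or hotter coolant) reduce the same gradient, which is why the two approaches mentioned above both help.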
You are right about the VE going down. I don't know what I was thinking. I can't justify my prior reasoning. I'm finding a lot of conflicting information though, so I'm still reading about it.
There is actually a great deal of research going into this. I'm finding some conflicting info, but the one thing that is definitely true is that EGR SIGNIFICANTLY reduces *** formation, and it does improve fuel economy (measured as brake specific fuel consumption, or BSFC, in most papers).