It is generally understood that lubrication must, by default, affect equipment efficiency through friction. In efficiency standards such as IEEE Standard 112 Method B [1], these losses are lumped into the friction and windage category, which is one of the five types of electric machine losses [2]: core losses (15-25%) and friction and windage losses (5-15%), which are fixed losses that vary only with speed; and stator I²R losses (25-40%), rotor I²R losses (15-25%), and stray load losses (10-20%), which vary with load.

When a motor is operated uncoupled at no load, the combined losses equal the difference between the motor's total input and output. An increase in bearing-related losses normally appears as a slight rise in watts lost, showing up as heat, sound, and vibration, until friction begins to affect machine speed. A dramatic increase in temperature and a change in speed are generally observed once the impact on efficiency approaches 2-3% of the overall losses. In a 100 kW motor, this means a failing bearing will draw at most 2-3 kilowatts, but generally well under 1 kilowatt when lubrication has degraded or the bearing is even slightly over-lubricated. In general, losses measured at the cage and/or ball or roller frequencies usually indicate a bearing lubrication problem, while the addition of inner- or outer-race losses indicates surface defects and damage.
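The arithmetic above can be sketched as a short calculation. This is a minimal illustration, not part of any standard: the function name and the 2-3% band are taken from the article's rule of thumb, applied to rated power as in the 100 kW example.

```python
def bearing_loss_band_kw(rated_kw, low_pct=2.0, high_pct=3.0):
    """Hypothetical helper: estimate the bearing-related loss band (kW)
    for a failing bearing, using the article's 2-3% rule of thumb."""
    return (rated_kw * low_pct / 100.0, rated_kw * high_pct / 100.0)

# A 100 kW motor with a failing bearing:
low, high = bearing_loss_band_kw(100.0)
print(f"Failing bearing on a 100 kW motor: {low:.1f}-{high:.1f} kW")
# By contrast, degraded or slightly excessive lubrication typically
# draws well under 1 kW on the same machine.
```

The point of the sketch is simply that bearing losses are a small slice of total machine losses until failure is imminent, which is why they first appear as subtle heat, sound, and vibration rather than as a measurable efficiency drop.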
Howard