
Financial Terms Glossary
The most important financial terms - with simple and concise explanations.
Sharpe Ratio
The Sharpe Ratio is a metric commonly used to compare investments with different levels of risk. It is based on risk-adjusted returns and measures the average excess return (i.e., the difference between the investment return and the risk-free rate) achieved per unit of risk.
The Sharpe Ratio is calculated by dividing the excess return by the standard deviation of the investment return:
Sharpe Ratio = (r − r_f) / σ
where
r = investment return
r_f = risk-free interest rate
σ = standard deviation of the investment return
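As an illustration, here is a minimal Python sketch of this calculation. The per-period returns and the constant risk-free rate are invented purely for the example:

import numpy as np

# Hypothetical monthly returns of a fund (r) and an assumed per-period
# risk-free rate (r_f); both values are made up for illustration.
returns = np.array([0.02, -0.01, 0.03, 0.01, -0.02, 0.04])
risk_free_rate = 0.001

# Sharpe Ratio = mean excess return / standard deviation of the investment return
excess_returns = returns - risk_free_rate
sharpe_ratio = excess_returns.mean() / returns.std(ddof=1)

print(f"Sharpe Ratio: {sharpe_ratio:.2f}")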
The Sharpe Ratio provides investors with a better understanding of whether a high historical return (for example, that of an actively managed equity fund) was achieved through an “intelligent” investment strategy or simply by taking on a very high level of risk.
A low Sharpe Ratio (for example, below 0.3) indicates that only a small return was achieved relative to the risk taken. Negative values are harder to interpret: a strongly negative value essentially means that losses were incurred with "high reliability," i.e., the investment lost money fairly consistently. In any case, historical performance comparisons should only be made over longer periods, as the Sharpe Ratio can fluctuate strongly in the short term.
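To make the interpretation concrete, consider a small sketch with two hypothetical funds that both returned 8% while the risk-free rate was 1% (all figures invented for illustration):

rf = 0.01  # assumed annual risk-free rate

# Two hypothetical funds with the same average return but different risk levels
fund_a = {"mean_return": 0.08, "volatility": 0.10}  # calmer fund
fund_b = {"mean_return": 0.08, "volatility": 0.30}  # much riskier fund

for name, fund in [("Fund A", fund_a), ("Fund B", fund_b)]:
    sharpe = (fund["mean_return"] - rf) / fund["volatility"]
    print(f"{name}: Sharpe Ratio = {sharpe:.2f}")
# Fund A: 0.70, Fund B: 0.23 -> the same return was "bought"
# with far more risk in Fund B.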
The Sharpe Ratio has become the standard metric for comparing the performance of investments with different levels of risk. However, it has the disadvantage that it uses standard deviation (volatility) as a risk measure. Since the standard deviation averages both positive and negative fluctuations, this risk measure is only meaningful if investment returns are symmetrically distributed—as is assumed, for example, in a normal distribution. In reality, however, we often observe that returns behave asymmetrically, since extreme negative price jumps occur much more frequently than extreme positive ones.
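One way to see this limitation is a small sketch with two made-up return series that have identical mean and volatility, and therefore identical Sharpe Ratios, even though one of them concentrates its risk in a single larger loss:

import numpy as np

rf = 0.0  # assumed risk-free rate, set to zero for simplicity

# Two hypothetical return series constructed to have the same mean
# and the same (population) standard deviation:
symmetric  = np.array([0.03, -0.01, 0.03, -0.01])        # gains and losses balance out
asymmetric = np.array([0.02, 0.02, 0.02, 0.02, -0.03])   # small gains, one larger loss

for name, r in [("symmetric", symmetric), ("asymmetric", asymmetric)]:
    sharpe = (r.mean() - rf) / r.std()  # population standard deviation (ddof=0)
    print(f"{name}: mean={r.mean():.3f}, std={r.std():.3f}, Sharpe={sharpe:.2f}")
# Both series show mean=0.010, std=0.020 and a Sharpe Ratio of 0.50,
# even though the second series hides its risk in one larger loss.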