According to current models, gamma-ray bursts (GRBs) are produced when the energy carried by a relativistic outflow is dissipated and converted into radiation. The efficiency of this process, εγ, is one of the critical factors in any GRB model. The X-ray afterglow light curves of Swift GRBs show an early stage of flattish decay. This has been interpreted as reflecting energy injection. When combined with previous estimates, which have concluded that the kinetic energy of the late (≳10 h) afterglow is comparable to the energy emitted in γ-rays, this interpretation implies very high values of εγ, corresponding to ≳90 per cent of the initial energy being converted into γ-rays. Such a high efficiency is hard to reconcile with most models, including in particular the popular internal-shocks model. We re-analyse the derivation of the kinetic energy from the afterglow X-ray flux and re-examine the resulting estimates of the efficiency. We confirm that, if the flattish decay arises from energy injection and the pre-Swift broad-band estimates of the kinetic energy are correct, then εγ ≳ 0.9. We discuss various issues related to this result, including an alternative interpretation of the light curve in terms of a two-component outflow model, which we apply to the X-ray observations of GRB 050315. We point out, however, that another interpretation of the flattish decay - a variable X-ray afterglow efficiency (e.g. due to a time dependence of afterglow shock microphysical parameters) - is possible. We also show that direct estimates of the kinetic energy from the late X-ray afterglow flux are sensitive to the assumed values of the shock microphysical parameters and suggest that broad-band afterglow fits might have underestimated the kinetic energy (e.g. by overestimating the fraction of electrons that are accelerated to relativistic energies).
Either one of these possibilities implies a lower γ-ray efficiency, and their joint effect could conceivably reduce the estimate of the typical εγ to a value in the range ∼0.1-0.5.
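The efficiency argument above can be illustrated with a minimal numerical sketch. It assumes the standard definition εγ = Eγ/(Eγ + Ek,0), where Ek,0 is the kinetic energy of the outflow before the energy injection indicated by the flattish decay; the injection factor f = 10 used below is a hypothetical round value chosen for illustration, not a number from this work.

```python
# Illustrative estimate of the gamma-ray efficiency eps_gamma, assuming
# (as discussed above) that the late-time afterglow kinetic energy E_k_late
# is comparable to the gamma-ray energy E_gamma, and that energy injection
# during the flattish decay raised the kinetic energy by an assumed factor
# f above its initial value E_k0. All energies are in arbitrary units.

def gamma_ray_efficiency(E_gamma, E_k0):
    """Fraction of the initial total energy radiated in gamma-rays."""
    return E_gamma / (E_gamma + E_k0)

f = 10.0              # hypothetical energy-injection factor during the shallow decay
E_k_late = 1.0        # late-time (>~10 h) kinetic energy, arbitrary units
E_gamma = E_k_late    # pre-Swift broad-band fits: E_gamma comparable to E_k_late
E_k0 = E_k_late / f   # initial kinetic energy, before injection

print(gamma_ray_efficiency(E_gamma, E_k0))  # f/(f+1) = 10/11 ~ 0.91
```

With no injection (f = 1), the same numbers give εγ = 0.5, which shows how the energy-injection interpretation is what pushes the inferred efficiency to the problematic ≳0.9 regime.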