# Angel "Java" Lopez en Blog

### Published November 30, 2016, 15:01

This is a topic that deserves more consideration and technical detail. For now, I continue quoting and briefly commenting on Dirac:

> Let us see what can be done with putting the present quantum electrodynamics on a logical footing. We must keep to the standard practice of neglecting only quantities which one can believe to be small, even though the grounds for this belief may be rather shaky. In order to handle infinities, we must refer to a process of cut-off. We must do this in mathematics whenever we have a series or an integral which is not absolutely convergent. When we have introduced a cut-off, we may proceed to make it more and more remote and go to a limit, which then depends on the method of cut-off. Alternatively, we may keep the cut-off finite. In the latter case, we must find quantities that are insensitive to the cut-off. The divergencies in quantum electrodynamics come from the high-energy terms in the energy of interaction between the particles and the field. The cut-off thus involves introducing an energy, g say, beyond which the interaction energy terms are omitted. It is found that we cannot make g tend to infinity without destroying the possibility of solving the equations logically. We have to keep a finite cut-off.

Dirac prefers to lose relativistic invariance rather than to remain with a problem at the foundations:

> The relativistic invariance of the theory is then destroyed. This is a pity, but it is a lesser evil than a departure from logic would be. It results in a theory which cannot be valid for high-energy processes, processes involving energies comparable with g, but we may still hope that it will be a good approximation for low-energy processes. On physical grounds we should expect to have to take g to be of the order of a few hundred Mev, as this is the region where quantum electrodynamics ceases to be a self-contained subject and the other particles of physics begin to play a role. This value for g is satisfactory for the theory.

Until next time!
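Dirac's two options, letting the cut-off go to infinity or keeping it finite and looking for quantities insensitive to it, can be illustrated with a toy calculation. This is only a minimal numeric sketch, not Dirac's actual equations: the integrand `1/E` and the function name `self_energy_like` are hypothetical, chosen just to produce a logarithmically divergent integral with a cutoff `g`.

```python
import math

def self_energy_like(m, g):
    """Toy log-divergent integral: the integral of dE/E from m to g,
    which equals ln(g/m). It grows without bound as the cutoff g
    tends to infinity, mimicking a divergence regulated by a cut-off."""
    return math.log(g / m)

for g in (1e3, 1e6, 1e9):
    # Each individual integral keeps growing as the cutoff g increases...
    a = self_energy_like(1.0, g)
    b = self_energy_like(2.0, g)
    # ...but the difference of the two is insensitive to the cutoff:
    # ln(g/1) - ln(g/2) = ln(2), independent of g.
    print(f"g={g:.0e}  a={a:.3f}  b={b:.3f}  a-b={a - b:.6f}")
```

The difference `a - b` comes out as ln 2 ≈ 0.693147 for every cutoff, which is the sense in which a quantity can be "insensitive to the cut-off" even while the separate pieces diverge.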
Angel "Java" Lopez
http://www.ajlopez.com
http://twitter.com/ajlopez
By ajlopez, in: Ciencia