Quantum first-order phase transitions
The scaling theory of critical phenomena has been successfully extended to classical first-order transitions, even though the correlation length does not diverge in these transitions. In this paper, we apply these scaling ideas to quantum first-order transitions. The usefulness of this approach is illustrated by treating the problems of a superconductor coupled to a gauge field and of a biquadratic Heisenberg chain, at zero temperature. In both cases there is a latent heat associated with their discontinuous quantum transitions. We discuss the effects of disorder and give a general criterion for its relevance in these transitions.
Year of publication: 2004
Authors: Continentino, Mucio A.; Ferreira, André S.
Published in: Physica A: Statistical Mechanics and its Applications. - Elsevier, ISSN 0378-4371. - Vol. 339.2004, 3, p. 461-468
Publisher: Elsevier
Subject: Quantum phase transitions | First order transitions | Superconductivity
Similar items by subject
- Quantum phase transitions of correlated electrons in two dimensions / Sachdev, Subir (2002)
- Enhancement of superconductivity of Pb ultra-thin films by the interface effect / Li, Wen-Juan (2010)
- Some general remarks on superconductivity / Zee, A. (2000)