6 Dec 2004, 1:36 p.m.
Hi,

I've never quite understood why theoretical phases and amplitudes depend on the interatomic distance between the absorber and scatterer atoms. Shouldn't they depend only on the nature of the atoms involved? In any case, when building theoretical references I am always careful to choose an appropriate distance; otherwise the difference in phase shifts has to be compensated at the expense of large E0 values. Could someone please shed more light on this subject?

Thanks,
Hugo Carabineiro
PhD Student
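[A rough numerical sketch of the trade-off described above. In the standard EXAFS picture a path-length error dR produces a phase error of about 2*k*dR, while an E0 shift changes the wavenumber via k' = sqrt(k^2 - dE0/3.81) (with k in 1/Angstrom and E0 in eV), shifting the 2*k*R term. The code below least-squares matches the two effects over an arbitrary fit window to estimate how large an E0 shift it takes to absorb a given distance error; the constants are standard, but the window, grid, and function names are purely illustrative and not from any EXAFS package.]

```python
import numpy as np

# hbar^2 / (2 * m_e) in eV * Angstrom^2 (standard EXAFS constant)
HB2_2M = 3.81

def k_shifted(k, dE0):
    """Photoelectron wavenumber after shifting E0 by dE0 (eV).
    Valid where k**2 > dE0 / HB2_2M."""
    return np.sqrt(k**2 - dE0 / HB2_2M)

def best_dE0_for_dR(R, dR, kmin=3.0, kmax=12.0, n=200):
    """Estimate the E0 shift that best mimics a path-length error dR.

    The phase error from using the wrong reference distance is 2*k*dR;
    an E0 shift changes the 2*k*R oscillation by 2*R*(k - k').  We scan
    candidate shifts and keep the one that minimizes the squared phase
    mismatch over the (hypothetical) fit window [kmin, kmax].
    """
    k = np.linspace(kmin, kmax, n)
    target = 2.0 * k * dR                          # phase error to absorb
    best, best_cost = 0.0, np.inf
    for dE0 in np.linspace(0.0, 30.0, 3001):       # scan in 0.01 eV steps
        model = 2.0 * R * (k - k_shifted(k, dE0))  # phase gained from E0 shift
        cost = np.sum((model - target) ** 2)
        if cost < best_cost:
            best, best_cost = dE0, cost
    return best

# Example: a reference built 0.05 Angstrom short of a true 2.0 Angstrom bond
# already demands an E0 shift of several eV, and because the two phase
# terms have different k-dependence the compensation is never exact.
print(best_dE0_for_dR(R=2.0, dR=0.05))
```

Note that 2*k*dR grows linearly with k while the E0 term falls off roughly as 1/k, which is why a distance mismatch can only be partially hidden in E0, and why the required shift gets uncomfortably large as dR grows.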