TURBOMOLE Users Forum
TURBOMOLE Modules => Treatment of Solvation Effects with COSMO => Topic started by: beschmid on January 31, 2012, 08:08:31 PM
-
Dear all,
I'm trying to do a geometry optimization with COSMO at the DFT level without RI, but the calculation always crashes in the dscf step.
The cause seems to be the dscf program, because with ridft (in another calculation; I didn't switch on RI when I ran jobex without -ri) the malfunction doesn't occur.
The output (taken from job.last) always ends with the COSMO results list.
==============================================================================
COSMO RESULTS
==============================================================================
PARAMETER:
nppa: 1082
nspa: 92
nsph: 32
nps: 2410
npspher: 890
disex: 10.0000
disex2: 4206.66
rsolv [A]: 1.3000
routf: 0.8500
phsran: 0.0
ampran: 0.10E-04
cavity: closed
epsilon: infinity
refind: 1.300
fepsi: 1.0000000
CAVITY VOLUME/AREA [a.u.]:
surface: V1.0, A matrix: V1.0
area: 1958.17
volume: 5044.27
SCREENING CHARGE:
cosmo : -1.052798
correction : 0.052570
total : -1.000228
ENERGIES [a.u.]:
Total energy = -2468.7533479366
Total energy + OC corr. = -2468.7494544279
Dielectric energy = -0.0825801950
Diel. energy + OC corr. = -0.0786866863
ELEMENT RADIUS [A]: ATOM LIST
n 1.83: 1
c 2.00: 2-6,8,10-12,14,30-41,43-54
o 1.72: 7,9,78,79
h 1.30: 13,15-17,19-29,55-76,80-83
p 2.11: 18,42
rh 2.22: 77
==============================================================================
dscf step ended abnormally
next step = dscf
Some general facts about the calculation:
molecule has charge +1
basis set: def2-TZVP
closed shell
one atom with an ECP
I used the default parameters for the cavity, and the only things I changed in the control file were denconv 7 and a start value for damping of 1.
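For reference, those two control-file changes would look roughly like this (a sketch using the usual TURBOMOLE keyword syntax; the step and min damping values shown are typical defaults, not values stated above):

```
$denconv 1.d-7
$scfdamp   start=1.000  step=0.050  min=0.100
```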
Does anyone else have this problem, or a good idea?
Thanks
Bernhard
-
Hello,
was that a parallel run? There was a bug in 6.3 which has been fixed in 6.3.1, see :
http://www.turbo-forum.com/index.php?topic=583.0
The results are correct, but the dscf_mpi binary produces an I/O error at the end, so that the parallel job does not finish as expected.
Regards,
Uwe
-
Yes, it was a parallel run!
I'll see if I can get 6.3.1 to try.
Thanks a lot!
Bernhard
-
Dear Uwe,
I tried 6.3.1, with the same result as in 6.3.
Any further ideas?
Regards,
Bernhard