AMBER Archive (2006)

Subject: AMBER: AMBER mdcrd file bigger than 2GB

From: snoze pa (snoze.pa_at_gmail.com)
Date: Wed May 31 2006 - 16:57:22 CDT


Hi,
 I was reading Ross Walker's comments on the net. My file size is also bigger
than 2 GB; in fact it is 2.9 GB. The problem now is that neither VMD nor the
ptraj command of AMBER can read it.
How do I find the line in my mdcrd file that tells me the step size?
Is there any solution to this problem, or do I need to re-run my simulation?
Thanks in advance,
snoze

> Try editing the config.h file produced by your configure script and add
> the following to the AMBERBUILDFLAGS line:
>
> -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE
>
> Then do a make clean followed by make.
>
> This 'may', depending on your system, compiler etc., produce an executable
> that can write trajectory files larger than 2GB. That said, writing such
> large trajectory files is NOT recommended. The reason for this is that the
> larger the file becomes, the more likely it is to be corrupted. If a single
> frame in the trajectory gets corrupted then you will have a great deal of
> trouble extracting information beyond the corrupted frame. If you break
> your simulation up into much smaller parts and one trajectory file gets
> corrupted, then you can simply re-run this small portion of the trajectory
> from the restart file for that portion to recreate the mdcrd file.
> Improvements to the trajectory file format in future releases of AMBER may
> address this problem, but for the moment there is no easy fix for corrupted
> trajectories.
>
> Splitting things up is much, much, much safer... When it comes to analysing
> things it also makes things easier. For starters, you can analyse several
> trajectory files seamlessly in ptraj, and VMD will allow you to load
> several mdcrd files into a single prmtop (molecule). Having things split up
> also has the advantage that you can easily analyse just a small fraction of
> the trajectory without having to load the whole thing.
>
> Add to this the fact that if you copy files bigger than 2GB between 32 bit
> and 64 bit machines over NFS shares, or try to edit a file bigger than 2GB
> in an editor (say vi) that does not have support for large files, you can
> end up truncating or corrupting the file fairly easily.
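
For reference, the config.h edit described above might look roughly like the
following on a typical Linux install. This is only a sketch: the location of
config.h, the exact form of the AMBERBUILDFLAGS line, and the availability of
GNU sed's -i option all depend on your AMBER version and platform, so adapt
it to your own tree (or simply make the edit by hand in an editor).

    # Sketch: append the large-file flags to the AMBERBUILDFLAGS line,
    # then rebuild.  Assumes config.h sits where ./configure wrote it
    # (e.g. under $AMBERHOME/src) and that GNU sed is available.
    cd $AMBERHOME/src
    sed -i 's/^\(AMBERBUILDFLAGS=.*\)$/\1 -D_FILE_OFFSET_BITS=64 -D_LARGEFILE_SOURCE/' config.h
    make clean
    make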
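
Likewise, analysing several smaller mdcrd pieces in one pass is
straightforward in ptraj. The prmtop name, trajectory file names and the
atom mask below are placeholders, so substitute your own files and analysis
commands:

    # Sketch: ptraj reads the pieces in the order given, so the analysis
    # runs seamlessly across the split trajectory (names are placeholders).
    ptraj prot.prmtop <<EOF
    trajin prod_part1.mdcrd
    trajin prod_part2.mdcrd
    trajin prod_part3.mdcrd
    rms first out rmsd_ca.dat @CA
    EOF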

-----------------------------------------------------------------------
The AMBER Mail Reflector
To post, send mail to amber_at_scripps.edu
To unsubscribe, send "unsubscribe amber" to majordomo_at_scripps.edu