11-03-2021, 07:38 PM (This post was last modified: 11-03-2021, 08:10 PM by Mahmoud.)
Dear all,
I am working on a big TIMES model and running parametric runs. After I got a "disk out of memory" error, I tried to go through the possible solutions. I found that the folder veda2.0/pgsql/data gets bigger after every single run. I don't know whether I mis-installed veda2.0_v219 or whether I am doing something wrong, but it seems strange that this folder's size is now ~200 GB.
Does anyone have any insights into this situation?
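For reference, here is a small Python sketch one could use to see which subfolders of the data directory are taking the space; the path is just an assumption about the install location, so adjust it as needed:

import os

DATA_DIR = r"C:\veda2.0\pgsql\data"  # hypothetical install location; adjust

def folder_size(path):
    """Total size in bytes of all files under path."""
    total = 0
    for root, _dirs, files in os.walk(path):
        for name in files:
            try:
                total += os.path.getsize(os.path.join(root, name))
            except OSError:
                pass  # a file may disappear while the server is running
    return total

# Report the size of each subfolder, largest contributors included
for entry in sorted(os.listdir(DATA_DIR)):
    sub = os.path.join(DATA_DIR, entry)
    if os.path.isdir(sub):
        print(f"{entry:20s} {folder_size(sub) / 1024**2:10.1f} MB")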
11-03-2021, 09:15 PM (This post was last modified: 11-03-2021, 09:15 PM by Antti-L.)
This is a very interesting point, and I think an important one.
You may also consider the following simpler case: assume that you have a model database which you only re-import and run, re-import and run, over and over again (without importing any GDX files or results data).
In the old VEDA, the database sizes did not change at all during that cycle: re-import + run, re-import + run, .... To see what happens under VEDA2, I decided to test this cycle a bit under the new VEDA. I used the VEDA DEMO model, which is very small: the DD files, including all the model data, take only 0.2 MB of disk space.
0) In the beginning (right after install), the size of the data folder was about 73 MB.
1) I first imported the small VEDA DEMO model and ran it once. The size of the data folder was then 107 MB.
2) I then re-imported and ran this same model a few times. The size of the data folder was then 234 MB.
3) I quit VEDA2 for a while, to see whether garbage collection would need a shutdown to kick in.
4) I restarted VEDA2, and the size of the data folder was about 230 MB, so hardly any more compact.
5) I then re-imported and ran this same model a few more times. The size of the data folder increased to 290 MB.
From step 1) versus step 0) you can see that the initial database size overhead caused by this model is about 34 MB (107 − 73). However, after a number of re-imports and runs, the total database size had grown to 290 MB, so the size overhead for this model increased gradually from 34 MB to 217 MB (290 − 73), for holding exactly the same data. In the old VEDA, the total size of all the database files for the DEMO model was about 11 MB, and re-imports caused no increase in size.
These small tests seem to indicate that VEDA2 (or the database service) is not doing its garbage collection very well. The size of the data folder may increase a lot merely by re-importing the model input templates, and the size overhead of storing the data for a single model may grow many times larger over a cycle of re-imports and solves, even though the model data remains exactly the same.
Moreover, if we also consider the space needed for results and data GDX files, I can understand that the waste in disk space may become overwhelming. For the model data alone, my test required about 1000 times the space of the DD files (0.2 MB → 217 MB), while in the old VEDA this factor was ~55 (0.2 MB → 11 MB). I hope that this aspect can be improved.
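As a side note, the data folder appears to be a standard PostgreSQL cluster, so the garbage collection in question is PostgreSQL's VACUUM. As far as I know, plain (auto)vacuum only marks dead rows as reusable and does not return the space to the operating system; VACUUM FULL does, by rewriting the tables. A minimal sketch, assuming one can connect to the embedded server (the host, port, user and database below are guesses, since I don't know how VEDA configures it):

# Sketch: check per-database sizes and compact with VACUUM FULL.
# Connection parameters are assumptions; VEDA2's embedded PostgreSQL
# may use a different port, user or password.
import psycopg2

conn = psycopg2.connect(host="localhost", port=5432,
                        user="postgres", dbname="postgres")
conn.autocommit = True  # VACUUM cannot run inside a transaction block
cur = conn.cursor()

# List databases with their on-disk sizes
cur.execute("""
    SELECT datname, pg_size_pretty(pg_database_size(datname))
    FROM pg_database WHERE NOT datistemplate ORDER BY datname;
""")
for name, size in cur.fetchall():
    print(f"{name:30s} {size}")

# Rewrite every table in the *current* database, returning freed pages
# to the OS. Run only while VEDA is idle; it takes exclusive locks.
cur.execute("VACUUM FULL;")
conn.close()

Of course, VEDA presumably manages this server itself, so this is only for experimentation, not a supported workflow.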
(11-03-2021, 07:38 PM)Mahmoud Wrote: [...] the folder veda2.0/pgsql/data gets bigger after every single run [...] it seems strange that the underlying folder's size is now ~200 GB.
Dear Mahmoud,
If you want to reduce the database size immediately, kindly follow the steps below:
1) Go to the Model menu and select the Database option under Manage disc space.
2) Check GdxSize in the Summary tab of the Manage model window.
3) Delete all the GDX entries under the GDX tab to release the space.
You will need to repeat these steps whenever you find the database size growing out of hand again.
In the latest versions (above 225) we have discontinued the automatic import of data GDX files while solving the model, so you no longer need to perform the above steps manually.
Thank you, Antti, for the demonstration case. It illustrates the problem more precisely.
Thank you, Ravinder, for the tip; I didn't know about it. I reinstalled VEDA to regain some disk space, and I will follow the steps mentioned above if the situation repeats.
I am on v231 of VEDA. After about 40 runs using the JRC-EU-TIMES model, the database size has increased (again) to about 45 GB. At the same time, VEDA says that there are no GDX files to be deleted.
Hence, may I ask what could cause such a situation?
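In case it helps the diagnosis, I could also inspect the database directly. Below is a minimal sketch for listing the largest tables; the connection parameters are guesses, as in the earlier sketch:

# Sketch: list the largest tables, to see what occupies the 45 GB when
# no GDX entries are reported. Connection parameters are assumptions.
import psycopg2

conn = psycopg2.connect(host="localhost", port=5432,
                        user="postgres", dbname="postgres")
cur = conn.cursor()
cur.execute("""
    SELECT relname, pg_size_pretty(pg_total_relation_size(oid))
    FROM pg_class
    WHERE relkind = 'r'
    ORDER BY pg_total_relation_size(oid) DESC
    LIMIT 15;
""")
for name, size in cur.fetchall():
    print(f"{name:40s} {size}")
conn.close()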
Best regards,
Mahmoud
Glad that you have been able to do so many runs with this monster model.
The VD file of a single normal run is around 500 MB, and for runs without crossover it is almost 1 GB. The case GDX files (not the data-only ones) should be large too, but VEDA leaves them in the Wrk folder as the user's responsibility. The Summary tab of Manage disk space will show you where this space is occupied; most probably it will be in cases.
To summarize, if you want to do so many runs with this model, especially without crossover, then you will need a larger hard drive.
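In the meantime, a small sketch like the one below could help prune the biggest case files from the Wrk folder by hand; the path and the extensions it matches are assumptions, so review the list before deleting anything:

# Sketch: find the largest GDX/VD files under the working folder so old
# cases can be pruned manually. The path is an assumption; adjust it.
from pathlib import Path

WRK = Path(r"C:\VedaWrk")  # hypothetical working-folder location

files = sorted(
    (f for ext in ("*.gdx", "*.vd") for f in WRK.rglob(ext)),
    key=lambda f: f.stat().st_size,
    reverse=True,
)
for f in files[:20]:
    print(f"{f.stat().st_size / 1024**2:10.1f} MB  {f}")

# To actually delete, uncomment after reviewing the printed list:
# for f in files[:20]:
#     f.unlink()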