Problems syncing a small model with 8736 time-slices
#1
We are building a model in which we wish to use a flexible time-slice definition, where the time-slices used depend on need. We have built a small model that works fine with 48 time-slices (both syncing and running), but we are not able to sync the same model with 8736 time-slices.

We get the error message shown in the attached screenshot. What is causing this, and how can it be solved? The time-slices are named 1, 2, ..., 8736.

I highly appreciate all suggestions and input!


[Attachment: screenshot of the error message]
#2
I have also seen this problem several times in the past. As I understand it, this error can be caused either by the temporary JET file(s) exceeding 2 GB, or by the number of file locks being too high. The temporary file size problem would be difficult to address, but the file lock limit can be increased.

You could try editing the MaxLocksPerFile setting under the following Registry key:

HKEY_LOCAL_MACHINE\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Jet 4.0

Change the MaxLocksPerFile value to something greater than its current setting (it should be 9500 by default). Years ago, I changed it myself to 20000, and since then my problems with this error have indeed been very rare, so I think changing it made a difference. You can always change it back to what it was, if it does not help you.
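For example, from an elevated command prompt, a one-liner along these lines should set the value (a sketch; adjust the number to taste, and note that this path is for the 32-bit JET engine on 64-bit Windows, as in the key above):

reg add "HKLM\SOFTWARE\Wow6432Node\Microsoft\Jet\4.0\Engines\Jet 4.0" /v MaxLocksPerFile /t REG_DWORD /d 20000 /f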
#3
Thanks for this input Antti.

Pernille, is this error persistent? I have seen this a few times, but normally it goes away if I resume the SYNC operation after relaunching VFE.
#4
On a side note, could you also share a bit of your approach to using a "flexible time-slice definition, where the used time-slices depend on need"? It seems like something that may be of interest to many other users. You mention that the time-slices are named 1, 2, ..., 8736; are they all SEASONs, or perhaps a DAYNITE cycle?
#5
Thank you for sharing your experience on this issue.

First, after changing MaxLocksPerFile, the model actually synced. However, in this case some of the COM_FR values for the end-use demands were not imported, and I did not really understand why.

Nevertheless, today (with MaxLocksPerFile = 20 000), I get the same error twice in a row when syncing the model.

Do you think renaming the time-slices would solve this problem? For now, I have only defined the time-slices 1, 2, 3, ..., 8736 (just to test), but I will test whether it works better with 52 weekly and 168 daily time-slices.
#6
We are in the starting phase of developing a model that enables the use of a flexible time-slice definition. This is because we wish to run the model with a coarse time resolution during development, and to run the final model analysis with a finer temporal resolution. We also wish to use the flexible time-slice structure to test what time-slice resolution is necessary for different types of analyses.

The naming of the time-slices is not final yet, but the basis is the production and demand profiles on an hourly level. If we use 8736 hours of a year, these hours will further be grouped into weeks, and the weeks can be grouped into seasons (a rough sketch of the grouping arithmetic is shown below). To implement this, we plan to gather all the production and demand profiles into one Excel sheet.
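A minimal, illustrative Python sketch of the grouping we have in mind (not our final implementation; it assumes 52 weeks of 168 hours and four 13-week seasons):

# Map each of the 8736 hourly time-slices to a week and a season.
# Assumes 52 weeks x 168 hours = 8736 and 13-week seasons (placeholders).
rows = []
for hour in range(1, 8737):
    week = (hour - 1) // 168 + 1     # weeks 1..52
    season = (week - 1) // 13 + 1    # seasons 1..4
    rows.append((hour, week, season))
# rows[0] == (1, 1, 1); rows[-1] == (8736, 52, 4)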
#7
Sorry to hear that the problem persists.
Of course you could try still higher values for MaxLocksPerFile (the suggestions I have seen typically cite 200,000 as a value to try), but otherwise I think you could consider giving the test model to KanOrs to see if they can do anything about it, e.g. by checking that using a high number of timeslices does not lead to wasteful query executions.

Concerning the timeslice definition: to avoid any possible mix-up with the TIMES system sets that use such integer labels, I would suggest not using pure integer labels 1, 2, ..., 8736 for the timeslices, but adding some prefix, e.g. S1, ..., S8736.

In addition, bear in mind that TIMES assumes by default that there are 365 days in a year, and that each DAYNITE cycle forms a representative day, i.e. 24 hours. Thus, if you have 24 equal-length timeslices as a DAYNITE cycle under any parent timeslice, then TIMES assumes that they are all exactly one hour in length. Correspondingly, if you had 8736 equal-length timeslices as a DAYNITE cycle under ANNUAL, then TIMES would assume that they are each 8760/(365×8736) hours ≈ 9.9 seconds in length. So, if necessary, make sure that you override the default G_CYCLE(tslvl) appropriately.
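For example, if all 8736 hourly timeslices form a single DAYNITE cycle directly under ANNUAL, the cycle should span the whole year rather than a single day, which at the GAMS level would amount to something like the following (a sketch; how you inject it into a VEDA run may vary):

* Override the number of DAYNITE cycles per year (default is 365),
* so that the single 8736-slice cycle spans the full year:
G_CYCLE('DAYNITE') = 1;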
#8
Pernille, can you send us a file where you get this error consistently?
#9
Dear Pernille, Amit, Antti,

Have you found out anything new on this topic, i.e. what exactly causes the error and how it can be solved?

Unfortunately, I experience the same issue, but with VEDA-BE in this case:
(The error message is in German, but it says the same thing as the English one posted by Pernille; the long integer given is also the same.)
[Attachment: screenshot of the German error message]

This error occurs when I attempt to export flow variables (both VAR_FIn and VAR_FOut) from model runs into a CSV file with more than ~1.1 million records.
Since the model consists of 192 timeslices, 8 periods, 30 transformation processes and 29 regions, such a number of records can easily occur (192 × 8 × 30 × 29 ≈ 1.34e+6), especially when two or more attributes and several scenarios are being exported...

As Antti suggested, I set MaxLocksPerFile to 200,000, but the error persists. I tried it on two different machines, both of which have plenty of free working memory.

So far, my only workaround is to split the records across several files and append those manually via Notepad++, which can be quite cumbersome...
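(A small Python sketch like the following, with hypothetical file names, can at least automate the appending, keeping only the first file's header line:)

import glob

# Append exported CSV parts into one file, writing the header only once.
# The file-name pattern below is hypothetical; adjust to your exports.
parts = sorted(glob.glob("export_part*.csv"))
with open("export_all.csv", "w", encoding="utf-8") as out:
    for i, part in enumerate(parts):
        with open(part, encoding="utf-8") as f:
            header = f.readline()
            if i == 0:
                out.write(header)
            out.writelines(f)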

Any help on this issue is highly appreciated!

Best,
Fabian
#10
Depending on what you are doing with these CSV files (and how), the Aggregation options of VBE might make your life easier. See page 44 of TIMES doc Part V.

Basically, you can split with ENUM1/2/3 and you can aggregate the dimensions you don't need with COLLAPSE.
#11
Thank you for this hint, Amit.
I tried this option for SCENARIO with ENUM1, and the tables are indeed generated for each scenario in the VEDA TABLES window.
However, when I click the "Export to CSV (as flat file)" button, the above-mentioned error arises again.

Since I use Python scripts for automated post-processing, generation of indicators, and a broad variety of plotting exercises (e.g. results on geographical maps, ternary plots, etc.), COLLAPSE is not an option for me. I really need the data in raw, disaggregated form, just as the CSV export provides it.

Any further ideas?
Could you maybe provide an option to export the results directly from the underlying MS-Access databases without the need to process them through VBE?
#12
For single scenarios, you can easily produce CSV files with the GDX2VEDA utility.  Just make a new flows.vdd file containing the following:
* TIMES GDX2VEDA Set Directives

[DataBaseName]
TIMES

[Dimensions]
Attribute        attr
Commodity        c
Process          p
Period           t
Region           r
Vintage          v
TimeSlice        s
UserConstraint  uc_n

[Options]
not-0 var_fin var_fout

[DataEntries]
* VEDA Attr     GAMS             - indexes -
 VAR_FIn       f_in             r v t p c s
 VAR_FOut      f_out            r v t p c s

Then you can run gdx2veda as follows:
gdx2veda <filename>.gdx flows.vdd <filename>.csv
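If you need to convert many runs, a command-prompt loop along these lines should also work (a sketch, assuming the GDX files sit in the current folder):

for %f in (*.gdx) do gdx2veda "%f" flows.vdd "%~nf.csv"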
#13
Great idea Antti, thanks!

In general, if you are working with large volumes of granular results, it is better to bypass VBE. You could also parse the VD file directly, for example along the lines sketched below.
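(A sketch with pandas, assuming the usual VD layout: '*'-prefixed metadata lines, comma-separated data rows, and the column order following the [Dimensions] list of the .vdd above; it also assumes '*' does not occur inside data values:)

import pandas as pd

# Read a VEDA .vd results file into a data frame.
cols = ["Attribute", "Commodity", "Process", "Period", "Region",
        "Vintage", "TimeSlice", "UserConstraint", "PV"]
df = pd.read_csv("model_run.vd", comment="*", header=None,
                 names=cols, quotechar='"')
# e.g. keep only the flow variables:
flows = df[df["Attribute"].str.upper().isin(["VAR_FIN", "VAR_FOUT"])]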
#14
@Antti
OK, I tried the conversion with the GDX2VEDA utility; however, it tells me that my file version is not supported:
[Attachment: screenshot of the GDX2VEDA version error]

Maybe my gdx2veda.exe is outdated? It's from 07/10/2005, as you can see on the screenshot.

@Amit:
OK, that sounds like a viable way of doing it, even though it would force me to rewrite some code for it.

So, for now I'll wait to see whether Antti can help me with the direct way...
#15
GDX2VEDA.exe is located in the GAMS system folder; that is the GDX2VEDA.exe you have been using all along for generating the .vd files for VEDA-BE. Get rid of the old version from 2005, which you for some reason have now taken into use. As you can see, you cannot have been using it before (it does not work with your version of GAMS).

