Reproducing Model Run Errors
#1
Hello,

I am using UKTM and am trying to reproduce model run results from July this year. The results do not match.

As far as I can tell:
- input (scenario) files are the same and their contents are identical
- CPLEX options are identical
- VEDA_FE options are the same (control panel)
- GAMS_SRCTIMES is the same in both cases

I did, however, update VEDA_FE, VEDA_BE, and GAMS/CPLEX between runs - I will roll this back to test whether it is the source of the differences.

I do not have the .DD files to compare.
Looking through the run summary files, the only differences are the following:
- [RunforIRE]: COM_BPRICE -> reproduction runs
- [RunforIRE]: cmbExistRuns -> reproduction runs
(note that I do not use any elastic demands in either of the runs)

Looking through the ".lst" files, I have noticed that the model statistics differ: the number of single equations and of non-zero elements are different, while the number of single variables is the same. The objective function values are also - clearly - quite different.

Does anyone know:
1) what the [RunforIRE] section of the run summaries points to/means?
2) what I could look at to help solve this? (please let me know what files you might need to help answer this - happy to share them)

Thank you
Oliver
#2
(07-12-2018, 06:09 PM)O.Broad Wrote: [see #1 above]


Hi Oliver,
do you still have the GDX save files from the older runs?
They are usually in GamsSave in your TIMES working folder.
You could use the GDXDIFF GAMS utility to see the differences.

James
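For reference, a minimal sketch of this from a command prompt with GAMS on the path; the GDX file names here are placeholders for whatever saved files are in your GamsSave folder:

```shell
# File names are assumptions; use the saved GDXs from the two runs.
cd GamsSave
gdxdiff run_july.gdx run_reproduction.gdx diffile.gdx
# GDXDIFF writes only the differing symbols to diffile.gdx; an (almost)
# empty diff file means the two runs saw the same data.
```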
#3
(07-12-2018, 06:15 PM)JGlynn Wrote: [see #2 above]


Hi James,


Thanks for the fast reply!
Yes, I do. I've had a look at that - albeit quickly - and will spend more time on it in parallel with rolling back the updates. As far as I can tell, there are no significant differences in the input data. Do you know if there is a way of filtering the differences to see only significant ones, e.g. differences of more than x%?

Oliver
#4
(07-12-2018, 06:18 PM)O.Broad Wrote: [see #3 above]


Hi Oliver,
Sorry, I don't know if there is an automated way of filtering significant differences.
You'd need to define what you mean by numerically significant for each difference, and I don't know a simple way of doing that given the potential range of variable, unit, and scale differences. Maybe others do?

Perhaps the relative-difference option is what you need, as opposed to the default absolute-difference option:
https://www.gams.com/latest/docs/T_GDXDI...Difference

James
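For what it's worth, the relative tolerance is the RelEps option of GDXDIFF (Eps being the absolute one); a sketch with assumed file names, reporting only records that differ by more than 1% relatively:

```shell
# RelEps sets the relative-difference tolerance; Eps would be the
# absolute alternative. File names are assumptions.
gdxdiff old_Data_Only.gdx new_Data_Only.gdx diff.gdx RelEps=0.01
```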
#5
(07-12-2018, 06:09 PM)O.Broad Wrote: input (scenario) files are the same and their contents is identical

If the input files are supposed to be identical, then you should not see any (relevant) differences, and the diff file produced by GDXDIFF should be nearly empty.  So, if you do see differences, they should give you the explanation, e.g. for the different number of equations, non-zeros, etc., assuming that the RUN file options are identical.  However, the RUN file switches/options might also be at play if they have been changed, but I guess that should be less likely.

[EDIT:] Please note that I am talking about the *Data_Only*.GDX files here, which you should compare by using GDXDIFF if you have those files. These files contain basically only the input data. Comparing the full GDX files produced at the end of the model runs would not be very fruitful in my opinion.
#6
(07-12-2018, 08:03 PM)Antti-L Wrote: [see #5 above]


Hi Antti, 
Thanks for the answer.
I've just checked the run files again - the only difference is that one contains $ SET LPOINT NONE whereas the other does not. Based on a quick reading of what this controls, it does not seem to me that this would be causing the issue. Would it?
#7
No, $SET LPOINT NONE shouldn't be causing the differences.

Did you manage to compare the data-only GDX files? As noted in my previous post, I mean the *Data_Only*.GDX files, which you should compare using GDXDIFF if you have them; they contain basically only the input data. Comparing the full GDX files produced at the end of the model runs would not be very fruitful in my opinion.
#8
The only .GDX files that I have are the full files kept in the GamsSave folder.
How do you get the data-only GDX files? I presume I'll not be able to get them for the runs that I am trying to replicate?
#9
The data only GDX files are indeed saved into the GamsSave folder.

Check Tools → User options → General:  See if you have checked "Create Data-Only GDX". If you have not checked that, you obviously don't have these files, but otherwise you should have those files in GamsSave. It is a useful option if you want to investigate what has been changed between some model runs.
#10
(07-12-2018, 09:09 PM)Antti-L Wrote: [see #9 above]


Will keep it ticked from now on. Thanks!
#11
Thanks, James and Antti, for your input. What Antti has suggested is the most efficient way to resolve such situations. It helps not only in reproducing runs, but also in identifying or documenting the precise changes made between any two points in time.

The [RunforIRE] section refers to the selection made in an option that (in the current version) does not even appear unless at least one region is unselected in the Case Manager. It is under the elastic demands button: there you select a GDX file and indicate whether you want to inherit prices or levels for endogenous trade links that have become exogenous due to the unselected regions in the current run. This section is not relevant for single-region models.
#12
Amit,

Even though I agree with Antti that a full-model compare is not ideal, if Oliver copies the GDXs he needs to compare and inserts ~DaTa_OnLy_# after the case name, might he then be able to pull them up in VFE Case Manager/Select/GDXDIFF to get a DIFF scenario? He'd then have to focus on only the input parameters, which isn't so easy, but it may enable him to see whether there has been a change in the data?

Gary
g2
#13
How about doing a GDXDIFF on the command line, exporting the diff GDX file to Access or Excel, and looking at Type = Par and the sets?
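A sketch of that workflow staying on the command line instead of Access/Excel: dump one differing parameter to CSV and filter it. The symbol name and the column layout (index columns, then the dif1/dif2 marker that GDXDIFF adds, then the value) are assumptions; adjust the field positions to match your own dump. A toy sample stands in for the real gdxdump output:

```shell
# Real dump command would be along the lines of:
#   gdxdump diff.gdx Symb=COM_PROJ Format=csv > com_proj.csv
# Toy sample in the assumed layout: index columns, dif tag, value.
cat > com_proj.csv <<'EOF'
"IRE","2020","dif1",100
"IRE","2020","dif2",103
"IRE","2025","dif1",50
"IRE","2025","dif2",80
EOF

# Keep only records whose relative change between the two runs exceeds
# 5%; dif1 = value in file 1, dif2 = value in file 2.
awk -F, '
  {
    val = $NF + 0
    tag = $(NF-1)
    key = $1
    for (i = 2; i <= NF-2; i++) key = key FS $i
  }
  tag ~ /dif1/ { old[key] = val; next }
  tag ~ /dif2/ && (key in old) {
    d = old[key]
    if (d != 0) {
      rel = (val - d) / d
      if (rel < 0) rel = -rel
      if (rel > 0.05) printf "%s: %s -> %s\n", key, d, val
    }
  }
' com_proj.csv | tee filtered.txt
# -> "IRE","2025": 50 -> 80   (the 2020 record changed by only 3%)
```

With a real dump, the same filter applies once the field positions match the symbol's dimensionality.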
#14
Dear all,

A quick note on this thread.
Having rolled back the versions of VEDA, TIMES, and GAMS on another machine, re-running the scenarios in question reproduced the results I was looking for.
Digging into the reasons revealed that the issue is related to number formats that were not recognised by (and therefore not read in by) the older version of VEDA but were picked up by the latest available version.

Thank you for your support with this - all suggestions regarding data-only GDX files and GDXDIFF have been taken on board going forward.
#15
Great! thanks for the update.