parse_file and compute_mc_pf fail on EPRI Ckt24 [BUG] #455
Hi! I'm not going to be able to help with finding the root cause in the short term, but I may be able to point you to workarounds. The PMD parser for OpenDSS is limited in scope, i.e., it doesn't support all of the OpenDSS spec. Furthermore, OpenDSS's data model is ambiguous, which creates parsing challenges (you don't know whether the neutral in the impedance matrices has been Kron reduced or not). On p. 18 of https://arxiv.org/pdf/2305.04405.pdf you can see which test cases we've been able to run with the native solver. What are you trying to achieve?
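(To make the ambiguity concrete: Kron reduction eliminates the explicit neutral row and column of a 4x4 series-impedance matrix, and from the resulting 3x3 matrix alone you cannot tell whether that step was already applied. A minimal sketch with purely illustrative numbers, not Ckt24 data:)

```julia
using LinearAlgebra

# 4x4 series impedance matrix with an explicit neutral (phases a, b, c + n).
# Values are illustrative placeholders only.
Z = [0.3+1.0im  0.1+0.4im  0.1+0.4im  0.1+0.4im;
     0.1+0.4im  0.3+1.0im  0.1+0.4im  0.1+0.4im;
     0.1+0.4im  0.1+0.4im  0.3+1.0im  0.1+0.4im;
     0.1+0.4im  0.1+0.4im  0.1+0.4im  0.4+1.2im]

Zpp = Z[1:3, 1:3]   # phase-phase block
Zpn = Z[1:3, 4:4]   # phase-neutral block
Znp = Z[4:4, 1:3]
Znn = Z[4:4, 4:4]

# Kron reduction: eliminate the (grounded) neutral, yielding a 3x3 matrix.
# Given only the 3x3 result, the elimination step is no longer visible.
Zkron = Zpp - Zpn * (Znn \ Znp)
```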
Thanks @frederikgeth. I am simply trying to run a power flow analysis in PMD using a circuit that colleagues have previously used within OpenDSS. I'm trying to get a feel for the capabilities of PMD, and how practical it might be (now or in the future) in an industrial setting. Your reply raises a few more questions/concerns.

First, comparing the PMD and OpenDSS runtimes in Table 4 of the reference you cited, to what do you attribute the large differences in runtime for the two large circuits (61-fold and 165-fold differences)? It sounds as if the OpenDSS and PMD power flow algorithms are similar. Also, it appears that for the two large circuits, PMD has a nearly constant total runtime, even though one circuit is about four times larger than the other. In contrast, the OpenDSS timings show an approximately four-fold difference. Would you expect that for even larger circuits, say 10-fold larger, PMD runtimes would remain similar to those in the paper (~20 seconds), while OpenDSS runtimes would continue to scale linearly? That is, is there likely to be a circuit size at which the OpenDSS and PMD runtimes are similar?

Second, are there any guidelines or rules to follow when constructing OpenDSS files for eventual parsing by PMD? If not, then the utility of first creating circuits in the OpenDSS format (which PMD recommends) might be limited. One might spend considerable time creating a large circuit, only to learn later that it can't be parsed by PMD. For this reason, perhaps one should create circuits from scratch in PMD itself, or use some backbone created in the OpenDSS format and then complete it in PMD. It sounds like there are fundamental incompatibilities if the OpenDSS format is ambiguous in certain respects. Is there a typical set of steps to go through when trying to alter DSS files for successful PMD parsing?

Similarly, it would appear that the ability to use both tools together in the same project is limited. A circuit created in the OpenDSS format might not parse into PMD, and a circuit created or altered in PMD cannot be exported into the OpenDSS format (correct?). Yet the literature suggests that PMD is not trying to replicate many of the capabilities of OpenDSS, so there might be cause to use both tools together. And there might be legacy code available in the OpenDSS format for a given circuit. Can you offer any suggestions (for now or the future) as to how one might approach the parsing issue and potential tool interoperability? Are there plans to allow circuit export into the OpenDSS format, and are there plans to develop guidance on constructing a (large) circuit in the OpenDSS format that would parse into PMD? If so, then many of the concerns I raise might be addressed. Or is there a better way to think about the OpenDSS-PMD incompatibilities?
For anyone looking at the parsing issues with Ckt24 in the future, I've made some slight progress in narrowing down the problems. To run OPF, I am adapting the code in the "Introduction to PowerModelsDistribution" tutorial. As an aside, this same approach worked out of the box for IEEE123, using script 1 of "Run_IEEE123Bus.DSS". My code to run the problem (after various import statements) is:
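(The snippet essentially follows the tutorial workflow; in the sketch below, the Ipopt solver and the OPF formulation are the tutorial defaults and may differ slightly from my exact script.)

```julia
using PowerModelsDistribution
using Ipopt

const PMD = PowerModelsDistribution

# Parse the OpenDSS master file into PMD's ENGINEERING data model
data_eng = PMD.parse_file("master_ckt24.dss", transformations=[PMD.transform_loops!])

# Solve an unbalanced AC-polar OPF, as in the introductory tutorial
result = PMD.solve_mc_opf(data_eng, ACPUPowerModel, Ipopt.Optimizer)
```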
As I mentioned in the previous post, parsing the file results in several thousand INFO messages. These appear to be due to loads on buses in the dss file where the bus is connected only to the load and to nothing else. For example, for bus1 in
there is only this one instance where bus g2101ac9800_n283938_sec_4 is mentioned in the dss files. I'm guessing that OpenDSS might just ignore these loads, but I could be wrong; maybe it connects them to the rest of the circuit in some fashion. It's not clear to me whether OpenDSS and PMD are supposed to ignore these loads. PMD does read in a large number of loads and buses without giving the INFO message. Here is an example of one that does have an INFO message:
Looking further at this example, the issue is from
Now, regarding the actual error, from running
Looking deeper at src/data_model/transformations/kron.jl:40, numerous circuit lines run without error. The first error is for:
The issue here is that the assert test fails. I'm not sure what to try from here. I don't understand why the two sets of connections have different lengths. I also tried calling the function in several different ways. If anyone has suggestions as to what to try next, or where the problem might lie, I would appreciate the help.
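In case it helps anyone reproduce this, here is a quick scan for lines whose from- and to-side connection vectors differ in length, which is what the assert at kron.jl:40 appears to check (a sketch over the parsed ENGINEERING model; field names assumed):

```julia
# List lines whose from/to connection vectors have different lengths.
# (ENGINEERING-model field names assumed.)
for (id, line) in data_eng["line"]
    fc, tc = line["f_connections"], line["t_connections"]
    if length(fc) != length(tc)
        println(id, ": f_connections=", fc, "  t_connections=", tc)
    end
end
```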
Here's some quick high-level feedback:
Thanks @frederikgeth. That helped.
@John-Boik @frederikgeth I had a closer look at the Ckt24 data and here is where I am so far. The problem occurs when transforming the engineering data model to the mathematical data model.
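(That transformation is a call along the following lines; this is a sketch, and the keyword argument shown is the PMD default anyway:)

```julia
# Convert the parsed ENGINEERING model to the MATHEMATICAL model.
# kron_reduce=true is the PMD default; it is shown explicitly here because
# the stack trace points into the Kron-reduction code.
data_math = transform_data_model(data_eng; kron_reduce=true)
```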
This is where we get the error:
I have found these two issues so far:
The linecodes with a larger matrix size are not problematic, so we can ignore them, but the other issue needs to be fixed.
The second issue, a specific branch whose from and to connection vectors don't match, is also easily fixed:
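(The actual fix isn't reproduced here; a hypothetical in-place patch would look something like the following, where `problem_branch_id` is a placeholder rather than the real Ckt24 id:)

```julia
# Hypothetical patch: make the to-side connections of the offending branch
# match the from-side ones. "problem_branch_id" is a placeholder.
branch = data_eng["line"]["problem_branch_id"]
branch["t_connections"] = copy(branch["f_connections"])
```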
With these fixes, the previous error is gone. Running the transformation again now raises a different error.
That error arises from most of the loads being parsed with a DELTA configuration while being single phase.
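A quick way to count them (a sketch; the ENGINEERING-model field names and the `DELTA` enum value are assumed):

```julia
# Count loads that ended up with a DELTA configuration but only a
# single phase-to-phase connection. (Field names assumed.)
single_phase_delta = [id for (id, load) in data_eng["load"]
                      if load["configuration"] == PMD.DELTA &&
                         length(load["connections"]) <= 2]
println(length(single_phase_delta), " single-phase delta loads")
```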
This error seems to be rooted in PMD's dss2eng.jl, where the load configuration is set up, and I can see that there is a TODO note there about better generalization. That line seems to contradict the line that throws the error.
Hi, I stumbled upon this issue through a GitHub search.
We recently added a new function, OpenDSSDirect.Circuit.Save, to allow tweaking the previous Save behavior. For a better approach, we're developing a JSON Schema to formalize alternative circuit I/O -- this is what the current ToJSON/FromJSON functions are based on.
Probably not a major issue here, but note that that repo is a personal repo and has not been updated in several years. The official repo is hosted on SourceForge, and we (DSS-Extensions) keep a copy with some changes (e.g. adjusted paths, some added tests) at https://github.com/dss-extensions/electricdss-tst A couple of test circuits did get updates in this period, including that one: https://github.com/dss-extensions/electricdss-tst/tree/master/Version8/Distrib/IEEETestCases/123Bus
Hi @PMeira. Sorry for the slow reply, but thanks very much for mentioning the ToJSON functions. The JSON output does indeed look useful for parsing OpenDSS circuits into PowerModelsDistribution. I just want to check two things with you:
Also, thanks for the info on the IEEE123 circuit.
@John-Boik The FromJSON/ToJSON functions pass roundtrip tests for well-behaved data (including many of the test circuits). I imagine you saw the tickets on AltDSS-Schema related to the default values -- I'm finishing that, coupled to the next release of the engine. Besides that, the basics for the components are done and stable (as much as OpenDSS is stable). The JSON schema spec for the command system is ongoing (tied to a refactoring of the command system).
Part of it is in the DSS C-API repo and part in the AltDSS-Schema repo. I'll try to make that clearer the next time I update the docs. And we (e.g. the team at my university) typically have some private branches and tickets that sooner or later land on the public repos.
I'm trying to parse the dss files for EPRI Ckt24 and run `compute_mc_pf`, to compare the output with OpenDSSDirect.jl. I can run the OpenDSSDirect.jl `solve` successfully with this circuit. I can also run `parse_file` successfully, although PMD spits out a long list of messages. But `compute_mc_pf` fails on the parsed result. I am new to OpenDSS and PMD, so I'm not sure how to address the problem, or even what the problem is. And I'm not sure whether it is a PMD bug or not. Any assistance would be appreciated.

In order to run `parse_file` successfully with PMD, I had to make several changes to the dss files downloaded from OpenDSS, including:

- changing `model=4` to `model=1` for almost all loads

After those changes, the result of

`data_eng = PMD.parse_file("master_ckt24.dss", transformations=[PMD.transform_loops!])`

is a few (I assume) harmless messages, like:

Then PMD spits out 3,638 INFO statements, all similar to:

These messages occur whether or not I include `transformations=[PMD.transform_loops!]` in the call.

The first example occurs in the Lines file as:

The second example occurs in the Loads file as:

I could understand if a few nodes were somehow wrongly structured in the dss files, but 3,638 nodes sounds more like a PMD bug, or at least some consistent characteristic of the dss files that is incompatible with PMD.
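If it helps pin down what these INFO statements refer to, here is a rough way to list the loads whose bus is not touched by any line, switch, or transformer (a sketch; the ENGINEERING-model field names are assumptions):

```julia
# Collect every bus referenced by a line, switch, or transformer
# (ENGINEERING-model field names assumed; adjust if they differ).
refs = Set{String}()
for (comp_type, bus_keys) in [("line", ["f_bus", "t_bus"]),
                              ("switch", ["f_bus", "t_bus"]),
                              ("transformer", ["bus"])]
    for (_, comp) in get(data_eng, comp_type, Dict())
        for key in bus_keys
            haskey(comp, key) || continue
            val = comp[key]
            val isa AbstractVector ? union!(refs, val) : push!(refs, val)
        end
    end
end

# Loads sitting on buses that nothing else touches
dangling = [id for (id, load) in get(data_eng, "load", Dict()) if !(load["bus"] in refs)]
println(length(dangling), " loads sit on buses that no other component references")
```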
When I run `compute_mc_pf(data_eng)`, I get error messages:

The same error occurs if I try different variations of `compute_mc_pf`, such as `compute_mc_pf(data_eng, explicit_neutral=true)`, or if I use the functions from the Native Power Flow Solver example (sketched at the end of this post).

I'm guessing that the errors for `compute_mc_pf` are related to all the INFO messages from `parse_file`.

Any help would be appreciated.
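For reference, the Native Power Flow Solver path I mean looks roughly like this (a sketch based on the PMD example; `add_start_vrvi!` and the keyword arguments are taken from that example and may differ across PMD versions):

```julia
using PowerModelsDistribution
const PMD = PowerModelsDistribution

# Parse with the same transformation as above
data_eng = PMD.parse_file("master_ckt24.dss", transformations=[PMD.transform_loops!])

# Keep the explicit neutral: no Kron reduction, no phase projection
data_math = PMD.transform_data_model(data_eng; kron_reduce=false, phase_project=false)

# Add flat voltage start values in rectangular coordinates
PMD.add_start_vrvi!(data_math)

result = PMD.compute_mc_pf(data_math; explicit_neutral=true)
```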