
modelica / fmi-cross-check


Results and FMUs for the FMI Cross-Check

License: Other

Languages: Shell 6.55%, Batchfile 45.85%, VBScript 43.48%, Python 4.11%
Topics: fmi

fmi-cross-check's Introduction


Modelica Standard Library

Free library from the Modelica Association to model mechanical (1D/3D), electrical (analog, digital, machines), magnetic, thermal, fluid, control systems and hierarchical state machines. Also numerical functions and functions for strings, files and streams are included.

Library description

Package Modelica is a free library that is developed together with the Modelica language from the Modelica Association. It is also called Modelica Standard Library. It provides model components and standard component interfaces from many engineering domains. Each model comes with documentation included. The generous license conditions allow usage in commercial products.

Note that using a Modelica library requires a Modelica simulation environment (see the tools page) and that such an environment usually already includes the Modelica Standard Library. Demo versions of commercial tools may not allow simulating non-trivial examples from the library.


Current release

Modelica Standard Library v4.0.0 (2020-06-04)

Older Releases

Browse the Releases page to access older releases of the Modelica Standard Library.

License

This Modelica package is free software and the use is completely at your own risk; it can be redistributed and/or modified under the terms of the 3-Clause BSD License.

Status

  • CI checks
  • CLA assistant
  • Modelica v4.1.0-dev regression test
  • ModelicaTest v4.1.0-dev regression test

Development and contribution

The development is organised by the Modelica Association Project - Libraries (MAP-LIB). See also the contribution guide and the MAP-LIB Project Rules for more information.

You may report any issues by using the Issue Tracker.

fmi-cross-check's People

Contributors

andreas-junghanns, antonio-portaluri, bastianbin, bcamus, beutlich, chrbertsch, dieggar, f-verani, ghorwin, hela-guesmi, izacharias, jthoene, karlwernersson, klausschuch, labauke, lochel, maplesoft-fmigroup, markaren, masoud-najafi, matthieujude1, mbrumbulli, nfelderer, paulfilip, ptaeuberds, robbr48, stefanoltean, t-schenk, t-sommer, weiwuli-tmw, zeppelin-lan


fmi-cross-check's Issues

Green button missing for ETAS COSYM

To my understanding, ETAS COSYM fulfills the rules for the "green button" as an importing tool, but is displayed as "blue". I think it was green before but somehow changed.

There are results for 4 exporting tools

(screenshot)

but only two of them are displayed:

(screenshot)

perhaps related to #102

Exclude 2.0 FMUs that use undeclared units

The following 2.0 FMUs have variables with units that are not defined in UnitDefinitions:

/fmus/2.0/cs/c-code/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/cs/c-code/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/cs/win64/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/cs/win64/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/cs/win64/Easy5/2017.1/PneumaticActuator/PneumaticActuator.fmu: The unit 'user-defined' of variable 'PistonPosition' is not defined.
/fmus/2.0/cs/win64/Easy5/2017.1/Linear_Pos/Linear_Pos.fmu: The unit 'user-defined' of variable 'PistonPosition' is not defined.
/fmus/2.0/cs/win64/Easy5/2017.1/VanDerPol/VanDerPol.fmu: The unit 'user-defined' of variable 'V' is not defined.
/fmus/2.0/cs/darwin64/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/cs/darwin64/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/cs/linux32/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/cs/linux32/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/cs/win32/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/cs/win32/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/cs/linux64/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/cs/linux64/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/me/win64/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/me/win64/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/me/win64/Easy5/2017.1/PneumaticActuator/PneumaticActuator.fmu: The unit 'user-defined' of variable 'PistonPosition' is not defined.
/fmus/2.0/me/win64/Easy5/2017.1/Linear_Pos/Linear_Pos.fmu: The unit 'user-defined' of variable 'PistonPosition' is not defined.
/fmus/2.0/me/win64/Easy5/2017.1/VanDerPol/VanDerPol.fmu: The unit 'user-defined' of variable 'V' is not defined.
/fmus/2.0/me/win64/SystemModeler/5.0/CoupledClutches/CoupledClutches.fmu: The unit 'rad/s2' of variable 'J1.a' is not defined.
/fmus/2.0/me/win64/SystemModeler/5.0/ControlledTemperature/ControlledTemperature.fmu: The unit 'K' of variable 'heatCapacitor.T' is not defined.
/fmus/2.0/me/darwin64/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/me/darwin64/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/me/darwin64/SystemModeler/5.0/CoupledClutches/CoupledClutches.fmu: The unit 'rad/s2' of variable 'J1.a' is not defined.
/fmus/2.0/me/darwin64/SystemModeler/5.0/ControlledTemperature/ControlledTemperature.fmu: The unit 'K' of variable 'heatCapacitor.T' is not defined.
/fmus/2.0/me/linux32/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/me/linux32/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/me/win32/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/me/win32/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/me/win32/SystemModeler/5.0/CoupledClutches/CoupledClutches.fmu: The unit 'rad/s2' of variable 'J1.a' is not defined.
/fmus/2.0/me/win32/SystemModeler/5.0/ControlledTemperature/ControlledTemperature.fmu: The unit 'K' of variable 'heatCapacitor.T' is not defined.
/fmus/2.0/me/linux64/MapleSim/2015.1/CoupledClutches/CoupledClutches.fmu: The unit '' of variable 'inputs' is not defined.
/fmus/2.0/me/linux64/MapleSim/2015.1/ControlledTemperature/ControlledTemperature.fmu: The unit '' of variable 'outputs[1]' is not defined.
/fmus/2.0/me/linux64/SystemModeler/5.0/CoupledClutches/CoupledClutches.fmu: The unit 'rad/s2' of variable 'J1.a' is not defined.
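A check like the one that produced this list can be scripted against the modelDescription.xml inside each FMU. The helper below is a minimal, hypothetical sketch (not part of the cross-check scripts) that assumes the standard FMI 2.0 layout with UnitDefinitions/Unit and ModelVariables/ScalarVariable elements:

```
import zipfile
import xml.etree.ElementTree as ET

def undeclared_units(fmu_path):
    """Return (variable, unit) pairs whose unit is missing from UnitDefinitions."""
    with zipfile.ZipFile(fmu_path) as archive:
        root = ET.fromstring(archive.read('modelDescription.xml'))

    # Units declared under <UnitDefinitions><Unit name="..."/>
    declared = {u.get('name') for u in root.findall('./UnitDefinitions/Unit')}

    problems = []
    for sv in root.findall('./ModelVariables/ScalarVariable'):
        for type_element in sv:  # Real, Integer, Boolean, String, Enumeration
            unit = type_element.get('unit')
            if unit is not None and unit not in declared:
                problems.append((sv.get('name'), unit))
    return problems

# Hypothetical usage:
# for name, unit in undeclared_units('CoupledClutches.fmu'):
#     print(f"The unit '{unit}' of variable '{name}' is not defined.")
```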

Calculate cross-check results based on latest tool versions

Currently the number of "passed" FMU imports is summed up over all versions of the tools / FMUs. This leads to an "advantage" for older tools (with more versions) and does not encourage importers to test against newly released versions / tools. For the users this makes the result tables (both tools page and cross-check) less meaningful.

In the discussion at the FMI Design Meeting in Roanne it turned out that this could be solved by introducing an additional column "latest_version" to tools.csv. The results could then be structured like this:

Importing tool | Tool A v1.1          | Tool B v2.0
               | Model 1   | Model 2  | Model 1   | Model 2
Tool A         | passed    | rejected | passed    | failed
Tool C         | passed    | passed   | passed    | rejected

The advantage of this approach would be that it scales much better (less data and CPU time) and produces a table that fits on one page and gives a good overview of the current status of the interoperability of the tools.
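A sketch of how the proposed latest_version column could be used when aggregating results is shown below. The file and column names (tools.csv with tool_id and latest_version, and the result keys) are assumptions for illustration only, not the actual data model of this repository:

```
import csv

def latest_versions(tools_csv='tools.csv'):
    """Map each tool to its latest version, assuming the proposed 'latest_version' column exists."""
    with open(tools_csv, newline='') as f:
        return {row['tool_id']: row['latest_version'] for row in csv.DictReader(f)}

def keep_latest_only(results, latest):
    """Drop cross-check results that do not use the latest version on both sides.

    `results` is an iterable of dicts with hypothetical keys:
    export_tool, export_version, import_tool, import_version.
    """
    return [r for r in results
            if latest.get(r['export_tool']) == r['export_version']
            and latest.get(r['import_tool']) == r['import_version']]
```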

Reference results of fmi-cross-check/fmus/2.0/me/win64/Test-FMUs/0.0.2/BouncingBall/

The reference results of the Co-Simulation FMU BouncingBall contain time values that are not monotonically increasing. This violates the FMI Cross-Check Rules (see Appendix B).

Could you please correct this? It is misleading for users.

I just noticed the same for the corresponding model exchange FMU and for the FMUs Stair and Feedthrough.
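A monotonicity check on the reference CSV is easy to automate. The snippet below is a minimal sketch; the name of the time column is assumed to be 'time', as in the CSV headers quoted elsewhere in this repository:

```
import csv

def time_is_monotonic(reference_csv, time_column='time'):
    """Return True if the time column of a reference result file never decreases."""
    with open(reference_csv, newline='') as f:
        times = [float(row[time_column]) for row in csv.DictReader(f)]
    return all(a <= b for a, b in zip(times, times[1:]))
```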

Add consistency check: there should not be two variables with same value reference but different types

See fmus_2.0_cs_win64_AMESim_15_ClassEAmplifier, which defines two variables with the same value reference but different types:

    <ScalarVariable name="maxTimeStep" valueReference="536870912"  ... >
         <Real unit="s" ... />

     <ScalarVariable name="errCtrl" valueReference="536870912" ... >
          <Integer min="0" max="2" start="0"/>

This looks like an error. The standard does not seem to forbid this, though.

Still, it looks strange and could be flagged with a warning in the compliance checker (in case it is an error).
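Such a consistency check could collect the FMI type of every variable per value reference and flag references that map to more than one type. A minimal, hypothetical sketch (not part of the compliance checker):

```
import zipfile
import xml.etree.ElementTree as ET
from collections import defaultdict

def conflicting_value_references(fmu_path):
    """Return value references that are shared by variables of different FMI types."""
    with zipfile.ZipFile(fmu_path) as archive:
        root = ET.fromstring(archive.read('modelDescription.xml'))

    types_by_vr = defaultdict(set)
    for sv in root.findall('./ModelVariables/ScalarVariable'):
        for type_element in sv:  # Real, Integer, Boolean, String, Enumeration
            if type_element.tag in ('Real', 'Integer', 'Boolean', 'String', 'Enumeration'):
                types_by_vr[sv.get('valueReference')].add(type_element.tag)

    return {vr: sorted(tags) for vr, tags in types_by_vr.items() if len(tags) > 1}
```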

Missing libgfortran.so.3 for JModelica FMU on Linux

When simulating fmus/2.0/me/linux64/JModelica.org/1.15/ControlledTemperature/ControlledTemperature.fmu with fmuChecker, a gfortran shared library cannot be found. See the following error log:

```
$ fmuCheck -k me ControlledTemperature.fmu
[INFO][FMUCHK] FMI compliance checker Test [FMILibrary: Test] build date: Dec 13 2018
[INFO][FMUCHK] Called with following options:
[INFO][FMUCHK] fmuCheck -k me ControlledTemperature.fmu
[INFO][FMUCHK] Will process FMU ControlledTemperature.fmu
[INFO][FMILIB] XML specifies FMI standard version 2.0
[INFO][FMUCHK] Model name: ControlledTemperature
[INFO][FMUCHK] Model GUID: 14ac906d8d18e8da4c3d119d0c9a8a4
[INFO][FMUCHK] Model version:
[INFO][FMUCHK] FMU kind: ModelExchange
[INFO][FMUCHK] The FMU contains:
7 constants
42 parameters
4 discrete variables
38 continuous variables
0 inputs
1 outputs
48 local variables
0 independent variables
11 calculated parameters
80 real variables
6 integer variables
0 enumeration variables
16 boolean variables
0 string variables

[INFO][FMUCHK] Printing output file header
"time","TRes"
[INFO][FMUCHK] Model identifier for ModelExchange: ControlledTemperature
[INFO][FMILIB] Loading 'linux64' binary with 'default' platform types
[FATAL][FMICAPI] Could not load the DLL: libgfortran.so.3: cannot open shared object file: No such file or directory
[FATAL][FMUCHK] Could not create the DLL loading mechanism(C-API) for ME.
FMU check summary:
FMU reported:
        0 warning(s) and error(s)
Checker reported:
        0 Warning(s)
        2 Error(s)
        2 Fatal error(s) occurred during processing
```
What am I missing here? Do I need some additional libraries?
Are the provided test FMUs run with CircleCI? I could not find them there.

gfortran is installed on my Linux machine (Ubuntu, 64-bit). The problem also occurs with OMSimulator (which also uses FMI-Library for FMU loading), but it works in a Travis job for my Julia FMISimulator, which does not use FMI-Library.
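On Linux, the missing dependency can be detected without running the FMU by extracting the linux64 binary and asking ldd for unresolved shared objects. The helper below is a hypothetical sketch; it assumes a Linux host with ldd available and the usual binaries/linux64 layout inside the FMU:

```
import subprocess
import tempfile
import zipfile
from pathlib import Path

def missing_linux64_dependencies(fmu_path):
    """Return shared objects that ldd cannot resolve for the FMU's linux64 binary."""
    missing = []
    with tempfile.TemporaryDirectory() as tmp, zipfile.ZipFile(fmu_path) as archive:
        archive.extractall(tmp)
        for so in Path(tmp, 'binaries', 'linux64').glob('*.so'):
            output = subprocess.run(['ldd', str(so)], capture_output=True, text=True).stdout
            missing += [line.split()[0] for line in output.splitlines() if 'not found' in line]
    return missing

# For the JModelica.org ControlledTemperature FMU this would be expected to
# report 'libgfortran.so.3' on a system without that library.
```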

Cross-check results on fmi-standard.org

I really hope that I am getting this one completely wrong:

I fetched a simple overview of the number of available me tests for win64 from the repository, and these are the numbers that I got:

Exporting tool                # provided tests   # not compliant tests
CATIA                         11                 2
DS_FMU_Export_from_Simulink   19                 10
Dymola                        34                 2
FMIToolbox_MATLAB             4                  0
FMUSDK                        8                  0
MapleSim                      22                 14
Test-FMUs                     12                 4
...                           ...                ...

Please compare those numbers to the numbers presented on https://fmi-standard.org/cross-check/fmi2-me-win64/. In many cases there are apparently more verified tests than tests in the repository. What did I get wrong here?

(screenshot of the cross-check table)
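For reference, the counts above can be reproduced from the repository layout alone. The sketch below assumes the fmus/2.0/me/win64/<Tool>/<Version>/<Model> directory structure and the notCompliantWithLatestRules marker file used elsewhere in the cross-check:

```
from collections import Counter
from pathlib import Path

def count_me_win64_tests(repo_root='.'):
    """Count provided / not-compliant test FMU directories per exporting tool (FMI 2.0, ME, win64)."""
    provided, not_compliant = Counter(), Counter()
    for model_dir in Path(repo_root, 'fmus', '2.0', 'me', 'win64').glob('*/*/*'):
        if not model_dir.is_dir():
            continue
        tool = model_dir.parts[-3]
        provided[tool] += 1
        if (model_dir / 'notCompliantWithLatestRules').exists():
            not_compliant[tool] += 1
    return provided, not_compliant
```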

AVL Cruise-M FMUs

Why does AVL-Cruise have the "green button" for FMU export (e.g. FMU 2.0 CS)? I cannot find any uploaded FMUs or reference results for any platform. (CC: @klausschuch) Thanks in advance for clarifying.

Cross-check should verify derivative attribute

On page 54 of the FMI 2.0 spec it says:

The state derivatives of an FMU are listed under element <Derivatives>. All ScalarVariables listed in this element must have attribute derivative (in order that the continuous-time states are uniquely defined).

However, there are FMUs in the repository that fail to provide this information. For example:
https://github.com/modelica/fmi-cross-check/blob/master/fmus/2.0/cs/win64/MWorks/2016/MixtureGases/MixtureGases.fmu

has four Derivatives declared in the ModelStructure, but there are no ScalarVariables with the derivative attribute. This makes the FMU invalid, and thus it should not pass the cross-check.
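A check for this could walk the ModelStructure and verify the derivative attribute on each referenced variable. The sketch below is hypothetical and assumes the FMI 2.0 convention that ModelStructure/Derivatives/Unknown refers to ModelVariables by 1-based index:

```
import zipfile
import xml.etree.ElementTree as ET

def derivatives_without_attribute(fmu_path):
    """Return names of declared state derivatives whose Real element lacks the 'derivative' attribute."""
    with zipfile.ZipFile(fmu_path) as archive:
        root = ET.fromstring(archive.read('modelDescription.xml'))

    variables = root.findall('./ModelVariables/ScalarVariable')
    missing = []
    for unknown in root.findall('./ModelStructure/Derivatives/Unknown'):
        sv = variables[int(unknown.get('index')) - 1]
        real = sv.find('Real')
        if real is None or real.get('derivative') is None:
            missing.append(sv.get('name'))
    return missing
```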

No working CI / master failing

The CircleCI integration was disabled with commit c06c9b4.

Will there be a new continuous integration to validate the master branch and PRs?

When running python -m fmpy.cross_check.validate_vendor_repo on the master branch I get a large number of error messages. It would therefore be important to have at least a rudimentary CI for the moment, until a proper one can be set up, to keep the master branch error-free.

Clarification proposal to FMI-CROSS-CHECK-RULES.md rules

Before actually changing the FMI-CROSS-CHECK-RULES.md and creating a pull request I just wanted to ask your opinion about a few clarifications to the document:

Section 9.1.3:

  • "All FMUs submitted to the repository must run without license checks..."
    Should FMUs that fail to run (due to license restrictions) be removed or marked as "notCompliantWithLatestRules"?

Generally, should we add a sentence about what happens with FMUs that were added under a previous version of the cross-check rules and no longer comply only because of a rule change? Shall these be kept for historical reasons, but excluded from the summary page generation?

Section 9.1.4:

"If the FMU cannot be provided (e.g. because it contains critical intellectual property), submit a file {Model_Name}.nofmu. The README file shall contain information about how to get access to that FMU directly from the exporting tool vendor."
-> could this be applied also for license-restricted FMUs? Asking the tool vendor for a license is similar to asking for a license-free "secret" FMU.

"Reference solution as computed by the exporting tool."
-> Extend to:
"Reference solution as computed by the exporting tool. Should be obtained by calculation with the exported FMU itself."

"The variables in this file must match the input variables defined in the modelDescription.xml."
-> Extend to:
"The variables in this file must match the input variables and their types defined in the modelDescription.xml."

(Background: prevent situations where a Real variable from the CSV with value "12.3" is fed into a Boolean input variable.)

Just a few ideas from my side.
-Andreas

dSPACE FMUs with backslashes in path

Some TargetLink FMUs (e.g. Fmucontroller.fmu) have backslashes (\) in their file paths which causes problems when extracting the archive on Linux and macOS.

When extracting the FMU I get a flat list of files with backslashes instead of subfolders:

$ ls
binaries\win32\Fmucontroller.dll	sources\controller_fri.h
binaries\win64\Fmucontroller.dll	sources\controller_tlaf.c
documentation\_main.html		sources\controller_tlaf.h
modelDescription.xml			sources\fmuTemplate.h
sources\Fmucontroller.c			sources\tl_basetypes.h
sources\controller.c			sources\tl_defines_a.h
sources\controller.h			sources\tl_types.h
sources\controller_fri.c
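Until the FMUs are repackaged, an importer can work around this by treating backslashes in archive entry names as path separators while extracting. A minimal sketch of such a workaround (not the official fix):

```
import zipfile
from pathlib import Path, PureWindowsPath

def extract_fmu(fmu_path, target_dir):
    """Extract an FMU, turning backslash-separated entry names into real subdirectories."""
    with zipfile.ZipFile(fmu_path) as archive:
        for info in archive.infolist():
            # Interpret the stored name as a Windows path so '\' acts as a separator.
            destination = Path(target_dir).joinpath(*PureWindowsPath(info.filename).parts)
            if info.is_dir():
                destination.mkdir(parents=True, exist_ok=True)
                continue
            destination.parent.mkdir(parents=True, exist_ok=True)
            destination.write_bytes(archive.read(info))
```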

List of non-conforming FMUs regarding 'overstepping' end time

According to recent discussion/clarification on standard interpretation, CS-FMUs must be able to handle "doStep()" calls gracefully, even if t+h > tEnd. This may happen due to rounding errors for both fixed and variable step FMUs. See also discussion in ticket modelica/fmi-standard#575

Currently, there are a few FMUs in the cross-check repo that fail the last step when the end time is exceeded. Importing tools cannot comply with those test cases unless they ignore the "constant step size" property and adjust the last step manually. Therefore it may be meaningful to update these FMUs and in the meantime mark them 'notCompliantWithLatestRules'.

Here's a list of tools that export FMUs with strict end-time checking:

  • 20Sim
  • MapleSim
  • jModelica (fmus/2.0/cs/linux64/JModelica.org/1.15/CoupledClutches)

(not complete, yet)
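For illustration, the importer-side workaround mentioned above (adjusting the last step instead of overstepping tEnd) looks roughly like the loop below; do_step is a stand-in for the tool's fmi2DoStep call, and this is a sketch, not a prescription of the rules:

```
def run_fixed_step(do_step, t_start, t_stop, h):
    """Drive a CS FMU with nominal step size h without ever stepping past t_stop."""
    t = t_start
    while t < t_stop:
        # Shrink the final step so rounding errors cannot push t + h beyond t_stop.
        step = min(h, t_stop - t)
        do_step(t, step)
        t += step
```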

FMI Standard Tool listing shows "passed" despite "notCompliantWithLatestTestRules" file

For example:

results/2.0/cs/linux64/SIMPACK/9.10.1/JModelica.org/1.15/ControlledTemperature/

The directory contains both passed and notCompliantWithLatestRules files. The FMI tools listing still shows the corresponding case as validated (by the way, the results are way off, so 'failed' would have been the appropriate rating).

Currently, the webpage does not allow distinguishing between "has been validated with a previous set of cross-check rules" and "complies with the current cross-check rules".

Suggestion: add special handling in the validate_vendor_repo script.

The problem appears to be in lines 277-278 of validate_vendor_repo.py, where

        if new_problems and clean_up:
            not_compliant_file = os.path.join(subdir, 'notCompliantWithLatestRules')

The notCompliantWithLatestRules file is only tested for when running with clean-up and when problems were found.
Maybe the check for 'notCompliantWithLatestRules' could be moved ahead, and the test case skipped altogether if the file is present?
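A sketch of the suggested reordering (hypothetical code, not a patch against the actual script; `validate` stands in for the per-directory checks of validate_vendor_repo.py):

```
import os

def check_result_dir(subdir, validate, clean_up=False):
    """Skip result directories already marked notCompliantWithLatestRules before validating them."""
    not_compliant_file = os.path.join(subdir, 'notCompliantWithLatestRules')

    # Test for the marker first, independent of --clean-up and of new problems being found.
    if os.path.isfile(not_compliant_file):
        return 'skipped'

    new_problems = validate(subdir)
    if new_problems and clean_up:
        open(not_compliant_file, 'w').close()
    return 'failed' if new_problems else 'passed'
```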

(Note to self: I guess I have to read up on the discussion on that issue in the meeting notes... :-) )

Wrong cross-check status shown for Model.CONNECT

Model.CONNECT has at least 3 FMUs from 3 different tools imported successfully per submitted platform (win32, win64) and type (FMI1.0/2.0-ModelExchange, FMI1/2-CoSim).
However, only FMI1.0-ModelExchange is green, FMI1/2-CoSim and FMI2.0-ModelExchange are blue.
According to my interpretation of the rules all should be green.
Am I missing something?

Locals in FMU export reference files

Reported by maplesoft on 30 Jan 2017 14:16 UTC
A number of exported FMUs contain data for local variables in the reference files. Since these local variables are internal to the FMU and are listed under the 'protected' section during import, the importing tools may not allow access to such variables, including for plotting purposes. We propose that local variables should not be included in the reference CSV files; only outputs and states.


Migrated-From: https://trac.fmi-standard.org/ticket/408

Allow only output variables as reference results

The current cross-check rules document does not specify which variables can be part of the reference results. Currently, all variables can be used as reference results.

If the exporting tool adds parameters (even fixed or constant ones) to the reference results and the importing tool cannot record parameters during simulation, such an exporting tool cannot publish valid cross-check results for such an FMU.

Example:
The fmusdk/bouncingBall FMU writes the parameters g and e as reference results (CSV header: "time","h","der(h)","v","der(v)","g","e").
All cross-check results that contain only the other variables (CSV header: "Time","h","der(h)","v","der(v)") are dropped, even though the actual simulation results are perfectly fine.
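If the rules were tightened this way, importers and validators could restrict the compared columns to the outputs declared in the modelDescription.xml. A minimal, hypothetical sketch (the time column name is an assumption):

```
import csv
import zipfile
import xml.etree.ElementTree as ET

def output_columns(fmu_path, reference_csv, time_column='time'):
    """Return the reference CSV columns that correspond to declared output variables."""
    with zipfile.ZipFile(fmu_path) as archive:
        root = ET.fromstring(archive.read('modelDescription.xml'))
    outputs = {sv.get('name') for sv in root.findall('./ModelVariables/ScalarVariable')
               if sv.get('causality') == 'output'}

    with open(reference_csv, newline='') as f:
        header = next(csv.reader(f))
    return [column for column in header if column == time_column or column in outputs]
```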

Include reference FMUs in XC

I would suggest including the Reference FMUs in the XC.

Before we make reporting on them mandatory (as the rules say), I propose to start with a transition phase and ask all tool vendors to report results for them, so that we get feedback and can possibly fix bugs.

After some time has passed, we could release a version 1.0 of the reference FMUs and make reporting on them mandatory.

Add version tags

Problem

When running tests with my FMI tool against the master branch of this repository and something fails, I can't be 100% sure whether the error is on my side or a new commit broke something.
E.g. there are problems like #69.
But of course I don't want to define one commit as "correct" and only test against that version.

Also, the constantly changing number of tests makes it hard to follow why my results are getting better or worse.

Suggestion

Add version tags to specific commits where (at least mostly) everything is correct.
Finding a precise criterion for this is probably hard, but it is still better than trying to keep up with an ever-changing master.

Or is there a different approach I could be using for this?
Maybe reject every commit that fails your tests on CircleCI?
Not that it matters that much, since my tool has too many basic problems, but I hope I'm not the only one using this repo.

What "system" libraries can be assumed for the Cross Check?

OMSimulator runs the FMI Cross-Check (see libraries.openmodelica.org) inside a Docker image with only a few libraries installed.
In these Docker images (anheuermann/ompython on DockerHub: Ubuntu Bionic, Ubuntu Focal, Ubuntu Bionic with Wine) some libraries used by a lot of FMUs are not installed.

For example, FMUs that depend on libgfortran.so.3 cannot be loaded.

Which of these libraries can be assumed to be installed locally by every user of an FMU? If I understand the FMI standard correctly, every FMU should contain all libraries needed to run it, but that is not really realistic.
Adding the correct Fortran library for every linux64 system could be very hard.
But on the other hand, it is not possible (or very hard) to install libgfortran.so.3 on Ubuntu Focal.

So my question:
Are the example FMUs not compliant with the latest rules of the FMI Cross-Check, or should I add the missing libraries to my Docker images?

uploads to fmi-standard.org failed

The last two CI uploads failed with:

```
[master 35b9440] Adding cross-check results from CI (build 245)
2 files changed, 11 insertions(+), 9 deletions(-)
rewrite _data/cross-check/result.csv (75%)
To github.com:modelica/fmi-standard.org.git
! [rejected] master -> master (fetch first)
error: failed to push some refs to 'git@github.com:modelica/fmi-standard.org.git'
hint: Updates were rejected because the remote contains work that you do
hint: not have locally. This is usually caused by another repository pushing
hint: to the same ref. You may want to first integrate the remote changes
hint: (e.g., 'git pull ...') before pushing again.
hint: See the 'Note about fast-forwards' in 'git push --help' for details.

Exited with code exit status 1
```

It seems that this happened before too.

FMI Import cross-check rules - what are suitable "reference values"?

Hi all,

since we just had a discussion about "validating a model implementation with measurement data", the following analogy to FMI co-simulation testing came up.

When the purpose is to test the correct implementation (i.e. standard compliance) of a model, differences between simulated results and 'reference' results must be clearly attributable to the implementation aspect.

Consider measurement data vs. simulation data comparison:

real data -> measurement errors/uncertainties -> reference values 
model parameter + algorithm -> implementation -> simulation results

Abstract:

  • refValues = operatorM(real data, uncertainties, measurement errors)
  • simValues = operatorS(model input/parameters, algorithms, implementation errors)

Obviously, deviations between refValues and simValues may be caused by different influencing factors in the operators M and S.

The same applies to co-simulation (specifically, importing and running a co-simulation).

coSimResults = operatorCS(algorithm, numerical parameters, implementation errors)

assuming model inputs/parameters are given in a uniquely interpretable way (not yet fully specified; is linear interpolation of tabulated input values actually required?).

With respect to the differences in results, one of the most influential numerical parameters is (limiting this discussion to CS 1.0) the constant communication step size.

It is also assumed that test FMU implementations behave 'well' with respect to numerical stability -> the generated results should converge with smaller communication step sizes. Alas, that appears not always to be the case.

However, suppose a test FMU is well-behaved, the input data is fully given, and the algorithm (e.g. fixed forward time stepping with a constant given step size) is specified -> then each FMU importing tool should generate pretty much the same results, up to rounding-error issues (as for example discussed in modelica/fmi-standard#575).

So, if testing for correct implementation of a master scheme is also a purpose of FMI cross-checking (which I assume it is), I suggest providing reference results that have actually been generated with the FMU itself.

Currently it appears that quite a few CS 1.0 test cases have reference results generated from an implicit (fully Modelica) simulation, which cannot be obtained even with a fully conforming FMU co-simulation master.

One example of such a case is ControlBuild/Drill (see the attached published "passed" results).

(attached plot: ControlBuild_linux64_passed_results)

When running the case with ever smaller steps (MasterSim uses 0.0001 s steps with 0.01 s output frequency in this example), the results converge to a step-like function. The provided reference solution cannot be obtained -> in the analogy above, a different operator has been used to generate the results.

With respect to the test criteria, I would suggest formulating the following rules for publishing cross-check reference results that aim at testing correct FMI import functionality:

  • The reference solution has to be generated with the provided FMU itself.
  • The algorithm selected for generating the reference results shall be a standard FMI co-simulation algorithm (i.e. "explicit Euler"-type stepping with a constant step size) without custom value post-processing (gliding averages, smoothing techniques, etc.).
  • The step size shall be selected such that a further refinement of the communication step size does not significantly alter the results (-> time grid refinement study; a crude version of such a check is sketched after this list).
  • The output result frequency shall be selected such that significant effects remain observable when the final results are plotted with linear line segments between points.
  • The time interval shall be selected such that re-sampling at 1000 equidistant points (-> current cross-check validation algorithm) preserves the relevant information (steep gradients). Otherwise the simulation time range has to be shortened accordingly.
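The refinement study from the third bullet can be automated in a crude way by comparing results obtained with step sizes h and h/2. The sketch below assumes a `simulate(h)` stand-in that runs the FMU with communication step size h and returns a NumPy structured array with a 'time' column plus one column per output; both the function and the result layout are assumptions for illustration:

```
import numpy as np

def refinement_converged(simulate, h, rtol=1e-3):
    """Return True if results with step h and h/2 agree within rtol after resampling."""
    coarse, fine = simulate(h), simulate(h / 2)
    for name in coarse.dtype.names:
        if name == 'time':
            continue
        # Resample the fine solution onto the coarse time grid before comparing.
        resampled = np.interp(coarse['time'], fine['time'], fine[name])
        if not np.allclose(coarse[name], resampled, rtol=rtol, atol=1e-9):
            return False
    return True
```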

Sorry for the long ticket texts... this should become an article at some point ;-)
-Andreas

Where to find Reference FMUs?

Reported by Mihal Brumbulli on 14 Dec 2017 10:19 UTC
Greetings,

In document FMI_Cross_Check_Rules_v3.1_2015_07, page 7 (No. 10 in the table), it is stated:

"FMU importing tools must report on importing for all Reference FMUs available for the supported FMI Variant and supported platforms provided on the SVN server"

Searching the public branch, I couldn't find an entry marked as "Reference FMUs". Can someone point me to it?


Migrated-From: https://trac.fmi-standard.org/ticket/427

Create an FMI XC sandbox

At the Regensburg design meeting it was proposed to create a (private) FMI 3.0 Cross-Check Sandbox for members of the FMI project.

Problems running Test-FMUs

Hello,

I've faced some issues running some Test-FMUs in QTronic Silver. If this is not the correct place to raise these issues, please let me know whom I should contact.

  1. 2.0\cs\win64\Test-FMUs\0.0.1\Feedthrough - it seems this one expects some input signals (bool_in, int_in, real_continous_in, etc.), but these are not provided as a .csv file.

  2. 1.0\cs\win64\Test-FMUs\0.0.1\Resource - when running this FMU in Silver, it produces the following error message. The 2.0 version of this FMU works as expected. Could there be an issue with the 1.0 FMU?
    Resource: Resource (logError): Failed to open resource file C:\work\projects\12_FMI_XC\repos\fmi-cross-check-SO-branch\fmus\1.0\cs\win64\Test-FMUs\0.0.1\Resource\9L7YK_cs0_10232.

  3. 1.0\me\win64\Test-FMUs\0.0.1\Stair - the output of this FMU is constant throughout the simulation and no error messages are thrown by the FMU. The 2.0 version of this FMU works as expected. Could you please check whether it behaves as it should? Have other tools reported issues with it?

Regards,
Stefan
QTronic GmbH

CircleCI fails - SSH passphrase missing

The CI of the cross-check currently fails.
This is the error shown in the CI output:
Warning: Permanently added the RSA host key for IP address '140.82.112.3' to the list of known hosts. Enter passphrase for key '/home/circleci/.ssh/id_rsa': Too long with no output (exceeded 10m0s) CircleCI received exit code 1

Allow FMUs to require start values

Often simulations fail because start values are not set correctly (e.g. the model is in the wrong state). It would be good to optionally allow test FMUs to specify start values that have to be set in order to reproduce the reference results, e.g. as an (optional) file StartValues.csv:

OutsideTempC,18.5
InitialPosition[1],4
InitialPosition[2],1
InitialPosition[3],0
StringParam,"FMI is awesome!"

Tunable parameters could be provided via the existing *_in.csv. These changes would not require any updates of the existing FMUs and results.
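A sketch of how an importer could consume such a file is shown below; the format follows the proposal above, and the conversion of the string values to the proper FMI types (based on the modelDescription.xml) is left out:

```
import csv

def read_start_values(path='StartValues.csv'):
    """Parse the proposed StartValues.csv into a {variable name: value string} mapping."""
    with open(path, newline='') as f:
        return {name: value for name, value in csv.reader(f)}

# The values could then be applied before initialization, e.g. with fmpy
# (illustrative only):
#   simulate_fmu('Model.fmu', start_values=read_start_values())
```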

How shall we treat tools/libraries that support FMU import but are not simulators?

E.g., FMIL is used to implement pyfmi and the FMU Compliance Checker.
Could FMIL then inherit the green XC status from one of these tools?
See also the discussion in modelon-community/fmi-library#5 (comment).

Currently I would read the tables as saying that the FMIL developers do not want to participate in the XC, which is not the case.

Or should we introduce a new formatting for "importing" tools that are not intended for simulation of FMUs but do other things with FMUs (libraries like FMIL, ...)?

no LICENSE.txt for this repository

This repository doesn't seem to have a license file. This makes it difficult and unclear for others to freely use, change, or distribute the project.

Would it be possible to add a LICENSE.txt? Something permissive like the MIT License would be great!

Add missing FMUSDK FMUs

The FMUSDK is extremely valuable for testing different aspects of the FMI.
It would be great to have a complete set of pre-compiled FMUs for all platforms (at least for the latest release).

  • missing FMUs: values.fmu (the only FMU to test String I/O)
  • missing platforms: darwin64, linux32, linux64

Retroactively flagging FMUs as non-compliant

Should this be allowed? If an FMU passed validation using the FMPy version available at the time of check-in, it should be listed as compliant. It does not seem logical to expect past FMUs to comply with future releases of verification tools.
MapleSim 2018 and 2019 FMUs were recently flagged as not compliant. While the problem is valid (the modelDescription contains an error but the FMU still runs as expected), it was not detected by the FMPy version available at the time of submission.

Test FMUs with external dependencies

The following examples depend on shared objects which are not available or cannot be loaded on a standard target machine:

win64 (Windows 10):

  • fmus/2.0/cs/win64/YAKINDU_Statechart_Tools/4.0.4/*
  • fmus/2.0/cs/win64/Silver/3.3/*

linux64 (Ubuntu 20.04 LTS)

  • fmus/2.0/cs/linux64/JModelica.org/1.15/*
  • fmus/2.0/me/linux64/JModelica.org/1.15/*

The fmi-cross-check repository should strictly follow the FMI standard and ensure that the provided FMUs can be simulated by any importing tool that supports the FMI standard, without the burden of chasing export-specific libraries.

The quality of the cross-check depends highly on the quality of the provided examples. I am very concerned that it is technically not possible to import the provided examples. That would make the test results, and with them the entire cross-check effort, useless.
