Comments (12)
Based on the past discussion about how to present version-compatibility information to the user, I would like to make the following proposal:
You start from the currently available table (e.g. at https://fmi-standard.org/cross-check/fmi2-me-win32/):
| importing \ exporting tool | Tool 1 | Tool 2 | Tool 3 |
|---|---|---|---|
| Tool 1 | number of tests 1->1 | number of tests 1->2 | number of tests 1->3 |
| Tool 2 | number of tests 2->1 | number of tests 2->2 | number of tests 2->3 |
| Tool 3 | number of tests 3->1 | number of tests 3->2 | number of tests 3->3 |
If a user clicks on one of the test counts (e.g. number of tests 1->2), the following table is displayed:
| importing \ exporting version | Tool 2 version a | Tool 2 version b | Tool 2 version c |
|---|---|---|---|
| Tool 1 version 2017 | number of tests 2017->a | number of tests 2017->b | number of tests 2017->c |
| Tool 1 version 2018 | number of tests 2018->a | number of tests 2018->b | number of tests 2018->c |
| Tool 1 version 2019 | number of tests 2019->a | number of tests 2019->b | number of tests 2019->c |
The user can go back to the former table view via the browser's back button.
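As a sketch of how such a drill-down table could be built from version-level results (the field names and data below are invented placeholders, not the actual fmi-cross-check data model):

```python
# Sketch only: field names and records are hypothetical placeholders,
# not the actual fmi-cross-check file layout.
import pandas as pd

# One record per passed cross-check result, at tool-version level.
results = pd.DataFrame([
    {"importing": "Tool 1", "imp_version": "2017", "exporting": "Tool 2", "exp_version": "a"},
    {"importing": "Tool 1", "imp_version": "2017", "exporting": "Tool 2", "exp_version": "b"},
    {"importing": "Tool 1", "imp_version": "2018", "exporting": "Tool 2", "exp_version": "a"},
])

# Drill-down view: rows are versions of the importing tool,
# columns are versions of the exporting tool, cells are test counts.
drill_down = results.pivot_table(
    index="imp_version", columns="exp_version",
    aggfunc="size", fill_value=0,
)
print(drill_down)
```

Summing all cells of `drill_down` would reproduce the single number shown in the top-level table.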
from fmi-cross-check.
Many customers are stuck with old versions of tools for all kinds of reasons. Putting such a singular focus on the newest version looks like we are starting a race and punishing those who try to stay stable.
It also looks like this only considers versions of the exporting tools. Is that intentional?
The results would still be summed up over all versions of the importing tool. The intention is to give the users a clear picture where the numbers come from.
In the current tables Tool B gets a "5" for five imported FMUs of Tool A, which could be the same model from five different versions of Tool A, or imported by five different versions of Tool B.
IMHO this would be a big improvement even if we sum up over all versions, because it is unlikely that newer versions can't simulate models that worked in older versions.
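To illustrate the ambiguity with invented tool and model names: two very different result sets collapse to the same tool-level "5" in the current table.

```python
# Hypothetical result tuples: (importing tool, importing version,
# exporting tool, exporting version, model). All names are invented.
same_model_five_exporter_versions = [
    ("ToolB", "3.0", "ToolA", v, "BouncingBall") for v in ("v1", "v2", "v3", "v4", "v5")
]
five_importer_versions = [
    ("ToolB", v, "ToolA", "v1", "BouncingBall") for v in ("1.0", "1.1", "2.0", "2.1", "3.0")
]

def tool_level_count(results):
    """Sum results over all versions, as in the current top-level table."""
    return sum(1 for imp, _, exp, _, _ in results if imp == "ToolB" and exp == "ToolA")

print(tool_level_count(same_model_five_exporter_versions))  # 5
print(tool_level_count(five_importer_versions))             # 5
```

The proposed drill-down table would make the two cases distinguishable.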
For me the requirements for displaying the results are the following:
- In a top-level view it should be simple (e.g. not displaying all tool versions separately).
- Users expect up-to-date results (from up-to-date versions), and tool vendors who provide such up-to-date information should be rewarded. On the other hand, for exported FMUs it will take some time until the importers have reported new XC results. So I would propose to use only the two latest versions of a tool, both for exporters and importers, to calculate the sum of successful FMUs for a tool combination.
- In a second view it should be possible to access more detailed information, such as results for older tool versions (e.g. displayed in a table where each tool version is treated like a separate tool). Here users could see how well FMI is supported in a specific (older) tool version.
- In an even more detailed (third) view it should be possible to inspect XC results for individual FMUs. This will be especially important once we have reference FMUs as part of the XC.

In the above sketch I do not understand what "Model 1" and "Model 2" mean.
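The "two latest versions" rule above could be sketched like this (field names and counts are invented; the version strings here happen to sort lexicographically, while real release names would need a dedicated ordering):

```python
# Sketch: keep only results from each tool's two most recent versions
# before summing. Data and version ordering are assumptions.
from collections import defaultdict

# (tool, version) -> number of successful cross-check results (invented)
passed = {
    ("Tool 1", "2017"): 10,
    ("Tool 1", "2018"): 12,
    ("Tool 1", "2019"): 15,
    ("Tool 2", "a"): 4,
    ("Tool 2", "b"): 6,
}

def latest_two(passed_counts):
    """Drop all but the two most recent versions of every tool."""
    versions_by_tool = defaultdict(list)
    for tool, version in passed_counts:
        versions_by_tool[tool].append(version)
    keep = {
        (tool, v)
        for tool, versions in versions_by_tool.items()
        for v in sorted(versions)[-2:]  # two most recent versions
    }
    return {k: n for k, n in passed_counts.items() if k in keep}

filtered = latest_two(passed)
print(sum(n for (tool, _), n in filtered.items() if tool == "Tool 1"))  # 27 (2018 + 2019)
```

The older results dropped here would still be reachable in the proposed second view.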
"Model 1" and "Model 2" are placeholders for the models exported by the tool.
Christian: I am happy with what we have, and Torsten spent a lot of his time to get it to where it is now. Your requirements sound quite detailed: maybe you can implement them as a proposal?
@andreas-junghanns, @chrbertsch would you be okay if we add the models as columns? It would be possible without changes to files / tools.csv and would give a more detailed picture. The total number of models (currently ~50) is quite stable and would fit on one page.
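A per-model column layout could be derived from the existing results with a simple pivot; the field names and records below are placeholders, not the actual file layout:

```python
# Sketch only: hypothetical flat results, invented field names.
import pandas as pd

results = pd.DataFrame([
    {"importing": "Tool 1", "exporting": "Tool 2", "model": "BouncingBall", "passed": True},
    {"importing": "Tool 1", "exporting": "Tool 2", "model": "Dahlquist", "passed": True},
    {"importing": "Tool 3", "exporting": "Tool 2", "model": "BouncingBall", "passed": True},
    {"importing": "Tool 3", "exporting": "Tool 2", "model": "Dahlquist", "passed": False},
])

# One column per model; cells count passed results, so tools that
# import only the simple models become visible at a glance.
per_model = results[results["passed"]].pivot_table(
    index="importing", columns="model", aggfunc="size", fill_value=0,
)
print(per_model)
```

With ~50 models the resulting table stays one page wide, as noted above.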
Could we have a switch to enable and disable the model level?
From my experience, the number of models will increase over time, and we will need a switch for the level of detail anyway. Filtering will be needed in the platform tests at some point for sure; this could be a first step.
The number of different models per platform is actually quite stable because vendors tend to export the same models for each version (which is good). Even with 50 models for the "worst case" (2.0/cs/win64) we would have some space to grow.
Aha, OK, then let's leave more complicated filtering for later.
If we use a smart layout, the table will look the same from 1 m distance...
...but one could easily detect the tools that import only the simple models.