darcymason / egsnrc

Port of EGSnrc to Python

License: GNU Affero General Public License v3.0

Languages: Python 30.45%, Makefile 1.11%, Fortran 67.49%, Shell 0.48%, Jupyter Notebook 0.48%

egsnrc's People

Contributors

darcymason, dependabot[bot], pre-commit-ci[bot]

Forkers

lc52520

egsnrc's Issues

Initial electr and next steps

So, good news: the Python electr loop is now working for tutor4. It still calls out through f2py to many functions (e.g. annih, bhaba, brems, moller), and photon histories are tracked entirely in the Fortran photon. However, the basic logic now seems to be correct, at least for the settings of tutor4. Those nasty gotos are gone, and unit tests are in place. (A rough sketch of the f2py bridge follows below.)
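
For context, the f2py bridge works roughly like this sketch; the build command and the call are illustrative, and the real wrapping in this repo differs in detail.

    # Build step (shell, run once) to wrap the Fortran sources into a module:
    #   python -m numpy.f2py -c egsnrc_sources.f -m egsfortran
    import egsfortran  # the f2py-generated extension module

    # Wrapped subroutines appear as module-level callables. EGS-style routines
    # communicate mostly through COMMON blocks (exposed by f2py as module
    # attributes), so many of the calls take no arguments.
    egsfortran.photon()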

However, it's running ~50 times slower than pure Fortran. Perhaps that's actually not too bad as a starting point, given that there has been no attempt yet to optimize, and given the inherent inefficiencies of global variables and f2py interfacing.

I ran a quick profile: about 60% of the time is in the electr loops themselves (not including the Python functions called from there); the many ustep loops per history really add up. An additional 25% of the time is in the 'calc functions' I added in Python, like compute_eloss_g. Since those are basically pure math, they can be jitted. In fact, a simple @numba.jit decorator on two of them gave a speed-up of roughly 8-10x (a minimal sketch follows below).
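
For illustration, the jitting amounts to something like the following; the function body is a hypothetical stand-in, not the actual compute_eloss_g (njit is numba's nopython shortcut for jit).

    import numba

    # Hypothetical pure-math "calc function"; the real compute_eloss_g has
    # different arguments and a different formula.
    @numba.njit(cache=True)
    def calc_example(tuss, eke, elke):
        # Plain arithmetic like this compiles to machine code on first call.
        return tuss * eke / (elke * elke)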

I'm interested in thoughts on next steps. @SimonBiggs, any thoughts from the GPU perspective? Or on trying PyPy or other 'compiling' approaches?

I'm still contemplating, but based on the initial profiling noted above, I think the way forward may be to move loops out of Python and into arrays as much as possible, i.e. replace the current one-history-at-a-time approach with numpy arrays of many particles tracked at once (a rough sketch follows below). That also gets us closer to GPU-compatible code. It requires some significant refactoring to get there, but it seems to me it is mostly the kind of refactoring required to port to any language: clear inputs and outputs, and minimal globals (except perhaps the interaction data).
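
As a rough sketch of that array-based idea (the names and "physics" here are purely illustrative, not code from this port):

    import numpy as np

    rng = np.random.default_rng(42)

    # State for many particles at once: one array per attribute, instead of
    # one Python-level loop iteration per history.
    n = 100_000
    energy = np.full(n, 1.0)            # hypothetical starting energy, MeV
    alive = np.ones(n, dtype=bool)

    # One transport "step" applied to every live particle in a single
    # vectorized pass -- the pattern that also maps naturally onto a GPU.
    while alive.any():
        eloss = rng.uniform(0.0, 0.05, size=n)   # fake per-step energy loss
        energy[alive] -= eloss[alive]
        alive &= energy > 0.01                   # hypothetical cutoff energy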

Codon compiler?

There's a Python-compatible compiler out called Codon which looks interesting. It can statically compile all or parts of a Python program, and it supports GPU execution and general parallelism (a minimal usage sketch follows below).
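
As a minimal sketch of how Codon might be tried on a standalone script (the commands are from the Codon docs as I understand them; I have not verified the GPU features):

    # fib.py -- ordinary Python syntax; Codon compiles it ahead of time.
    def fib(n: int) -> int:
        a, b = 0, 1
        for _ in range(n):
            a, b = b, a + b
        return a

    print(fib(40))

    # Compile and run natively:
    #   codon run -release fib.py
    #   codon build -release -exe fib.py   # produces a native executable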

@ftessier, @SimonBiggs - I was wondering whether it might be usable for something like this code. I might try it for fun... I have a strange idea of fun, but I suspect many programmers share in that strangeness.

Simon, I'm curious whether you've seen Codon and can comment on its GPU use compared with JAX - it looks simpler to me at first glance.

Codon is in very early releases, is Linux-only at the moment, and is under the Business Source License - but from the FAQ: "you can use Codon freely for personal, academic, or other non-commercial applications."

Has anything else happened in the EGS alternative programming language space since we last talked?

Single vs double precision

I made a comment somewhere earlier about seeing minor differences in the last few digits of Python results vs Fortran results, and said I would leave it for a while. Well... it kept niggling at me, so I investigated, and it is solved in the next PR; I'm just fixing up some of the tests.

It turns out that Fortran float literals are assumed to be single precision, even when assigned to a double-precision variable, unless marked as double with e.g. "D0" on the end. Python is all double precision. So, for example, in compute_eloss the formula has a constant 0.333333, which in Fortran yields a double-precision number more like 0.333332985..., vs 0.33333299999999... in Python. (Aside: is the 0.333333 really meant to be 1/3?) There are other constants in these formulas too, but they all happen to be exactly representable in binary IEEE 754: e.g. 0.25, 0.5, 2.0, 4.0.
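
The effect is easy to reproduce in Python with numpy (a quick check, not code from the port):

    import numpy as np

    as_single = np.float32(0.333333)    # what Fortran makes of a bare literal
    as_double = 0.333333                # what Python gives you

    print(f"{float(as_single):.12f}")   # ~0.333332985640
    print(f"{as_double:.12f}")          # ~0.333333000000
    print(f"{1 / 3:.12f}")              # ~0.333333333333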

A similar thing happened with some of the global common constants: rm, prm, prm2. They are initialized with single-precision values, so I copied the values as seen in Fortran to make Python match (one way to do that is sketched below).
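
One way to reproduce that on the Python side is to round each literal through single precision before widening it back to double; the rest-mass value below is only illustrative, so check the EGSnrc source for the exact constants.

    import numpy as np

    RM_LITERAL = 0.5110034               # illustrative electron rest mass, MeV
    rm = float(np.float32(RM_LITERAL))   # round through single precision,
    prm = rm                             # then widen back to double, so the
    prm2 = 2.0 * prm                     # arithmetic matches Fortran's bits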

But I wonder if constants such as these should be updated in the original EGSnrc? It seems the default now is all double precision. These are small differences, to be sure, but they can add up over many histories. It feels strange to put "wrong" values in Python, but it is very useful to be able to prove exact agreement with the original Fortran. I suppose these could be put in variables and made "wrong" only for unit tests.

Getting GitHub Actions set up

I have GitHub Actions mostly working, but the pegs4 data is not being found.

@ftessier, I'm wondering if you have any ideas?
When running configure.expect, I did not compile any codes, to save time. Is that relevant? I thought the pegs4 data stood alone outside that step. And as far as I can tell, HEN_HOUSE is set correctly, unless somehow it is not picked up inside the f2py-generated egsfortran shared library (a small diagnostic sketch follows below).
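
A quick diagnostic along these lines could be added to the workflow; the pegs4 path here is my assumption about the standard HEN_HOUSE layout.

    import os
    from pathlib import Path

    # Check the environment as the Python process actually sees it, before
    # egsfortran is imported.
    hen_house = os.environ.get("HEN_HOUSE", "")
    print("HEN_HOUSE =", repr(hen_house))

    # Assumed standard location of pegs4 datasets under HEN_HOUSE.
    pegs_data = Path(hen_house) / "pegs4" / "data"
    print("pegs4 data dir exists:", pegs_data.is_dir())
    for path in sorted(pegs_data.glob("*.pegs4dat")):
        print("  found:", path.name)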

Here is a link to the failing workflow:
https://github.com/darcymason/egsnrc/runs/1840777335?check_suite_focus=true#step:11:70
Note that I've ls'd the directory and it shows the file there.

I feel it is somewhat important to get a workflow like this going, at least as a sort of template for how to get the Python egsnrc set up and running. I'd also like to add code-coverage stats to ensure all branches of the code are tested.

I suppose I could copy the tutor_data file to my user code directory and see if that works, but I will likely run into other problems unless we can understand why this isn't working.
