fusion-energy / neutronics-workshop

A workshop covering a range of fusion relevant analysis and simulations with OpenMC, DAGMC, Paramak and other open source fusion neutronics tools

License: MIT License

neutronics simulations openmc dagmc neutrons photons containerised training course learning

neutronics-workshop's Introduction


Fusion Neutronics workshop

A selection of resources for learning fusion neutronics simulations with a particular focus on OpenMC, DAGMC and Paramak

There is a slide deck that introduces the workshop and each task.

There is also a Gather Town space, which is great for working through the workshop with colleagues.

These examples require specific versions of software packages and nuclear data to run correctly. Therefore I highly recommend making use of the containerized environment to run these example notebooks.

The repository has benefited greatly from user feedback. Please feel free to raise GitHub issues or reach out in the discussions section if you spot anything that needs fixing or think of an improvement. Pull requests are very welcome.

The video below gives a brief explainer of what to expect in the workshop and some motivation for learning neutronics.

Tasks Keywords Video(s) CI test status
Task 1 - Cross sections Nuclear data, cross-sections, MT numbers, Doppler link1 link2 link3 link4 Task 1
Task 2 - Materials Materials, Neutronics Material Maker, Mixed materials link Task 2
Task 3 - CSG geometry CSG geometry, Geometry visualisation link Task 3
Task 4 - Sources Neutron point sources, Gamma sources, Plasma sources, Neutron track visualization link Task 4
Task 5 - TBR Tritium Breeding Ratio, Cell tallies, Simulations link Task 5
Task 6 - DPA Displacements Per Atom, Cell tallies, Simulations, Volume calculations link Task 6
Task 7 - Neutron and photon spectra Neutron Spectra, Photon Spectra, Cell tallies, Energy group structures, Flux, Current link Task 7
Task 8 - Mesh tallies Mesh tallies, Structured meshes link Task 8
Task 9 - Instantaneous Dose Instantaneous Dose, Cell tallies, Dose coefficients Task 9
Task 10 - Activation transmutation depletion Isotope build up and tally variation as a function of time Task 10
Task 11 - CSG shut down dose Shut down dose, Cell tallies, Dose coefficients Task 11
Task 12 - Detector examples Time filters, Detector response, Time of flight Task 12
Task 13 - Stochastic volume calculation Stochastic volume, Material atoms in cell Task 13
Task 14 - Variance reduction Variance reduction, Weight windows Task 14
Task 15 - Making CAD geometry Parametric CAD geometry, Paramak, Geometry visualisation link Task 15
Task 16 - CAD Cell tallies CAD-based neutronics, Cell tallies, DAGMC, Heating Task 16
Task 17 - CAD Mesh tallies CAD-based neutronics, Mesh tallies, Paramak, DAGMC, Fast flux Task 17

Local Installation

There are video tutorials for this section which accompany the step by step instructions below.

  • Ubuntu installation video

  • Windows installation video

  • Mac installation video

  1. Install Docker CE for Ubuntu, Mac OS, or Windows, including the part where you enable docker use as a non-root user.

  2. Pull the docker image from the registry by typing the following command in a terminal window (Windows users might prefer PowerShell).

    docker pull ghcr.io/fusion-energy/neutronics-workshop

    Having permission denied errors?
    
         If you are running the command from a Linux or Ubuntu terminal and getting permission denied messages back,
         try running the same command with elevated user permissions by adding sudo at the front:
         sudo docker pull ghcr.io/fusion-energy/neutronics-workshop
         Then enter your password when prompted.
         
  3. Now that you have the docker image you can enable graphics linking between your OS and docker, and then run the docker container by typing the following commands in a terminal window.

    docker run -p 8888:8888 ghcr.io/fusion-energy/neutronics-workshop

    Having permission denied errors?
    
         If you are running the command from a Linux or Ubuntu terminal and getting permission denied messages back,
         try running the same command with elevated user permissions by adding sudo at the front:
         sudo docker run -p 8888:8888 ghcr.io/fusion-energy/neutronics-workshop
         Then enter your password when prompted.
         
  4. A URL should be displayed in the terminal and can now be opened in the internet browser of your choice. Select and open the URL shown at the end of the terminal printout.

To check that the tasks run, try opening the first task in the half-day workshop folder and running the Jupyter Lab code (either click the triangular run button, or click the first code cell and press Shift and Enter to execute that cell).

Optional Packages for Local Installation

This is not required for the half-day workshop, but some of the more advanced tasks do require Paraview and/or FreeCAD.

  1. Some tasks require the use of Paraview to view the 3D meshes produced. Paraview can be downloaded from here.

    Ubuntu terminal commands for Paraview install
    
         sudo apt update && sudo apt-get install paraview
         
  2. Some tasks require the use of CAD software to view the 3D geometry produced. FreeCAD is one option for this and can be downloaded here.

    Ubuntu terminal commands for FreeCAD install
    
             sudo apt update && sudo apt-get install freecad
             

Run in the cloud

The repository is also ready for deployment on GitHub Codespaces which allows users to launch the containerized environment on more powerful cloud computers without installing anything locally.

  • To get started, sign up to GitHub Codespaces.

  • Then follow this link to configure a compute instance.

  • VS Code will then launch in the browser; once loaded, select the conda Python interpreter to enable the correct Python environment.

neutronics-workshop's People

Contributors

ai-pranto, billingsley-john, davidepettinari, johnnonweiler, katie-taylor, kingyue737, oliverator, pshriwise, remdelaportemathurin, rlbarker, rworrall-ukaea, shimwell


neutronics-workshop's Issues

Finish task 12

Complete parts 2 and 3 of task 12.

Also add some text to the start of part 1.

fission vs fusion neutron source strength and energy

Adding a plot of fission vs fusion neutron sources could be informative.


import openmc
import openmc_source_plotter  # extends openmc.Source with plotting functions
import pint

# total thermal energy to normalise against, and the energy released per reaction
total_energy = pint.Quantity(1, 'GJ')
energy_per_dd = pint.Quantity(12.5, 'MeV/particle')
energy_per_dt = pint.Quantity(17.6, 'MeV/particle')
energy_per_fission = pint.Quantity(200., 'MeV/particle')

# DT fusion source with a Muir energy distribution centred on 14.08 MeV
my_dt_source = openmc.Source()
my_dt_source.energy = openmc.stats.Muir(e0=14080000.0, m_rat=5.0, kt=20000.0)
my_dt_source.strength = (total_energy / energy_per_dt).to('particle').magnitude
print(my_dt_source.strength)


# DD fusion source with a Muir energy distribution centred on 2.08 MeV
my_dd_source = openmc.Source()
my_dd_source.energy = openmc.stats.Muir(e0=2080000.0, m_rat=2.0, kt=20000.0)
my_dd_source.strength = (total_energy / energy_per_dd).to('particle').magnitude
print(my_dd_source.strength)

# fission source with a Watt energy spectrum
my_fission_source = openmc.Source()
my_fission_source.energy = openmc.stats.Watt(a=988000.0, b=2.249e-06)
my_fission_source.strength = (total_energy / energy_per_fission).to('particle').magnitude * 2.5  # 2.5 neutrons per fission on average
print(my_fission_source.strength)

figure1 = my_dd_source.plot_source_energy(n_samples=1000000, name=f'DD {my_dd_source.strength:.2e} neutrons per second')
figure2 = my_dt_source.plot_source_energy(figure=figure1, n_samples=1000000, name=f'DT {my_dt_source.strength:.2e} neutrons per second')
figure3 = my_fission_source.plot_source_energy(figure=figure2, n_samples=1000000, name=f'Fission {my_fission_source.strength:.2e} neutrons per second')

figure3.layout.yaxis['title']['text'] = 'neutrons per second'
figure3.layout.title = 'Energy distribution of neutrons from a 1GW thermal energy fission reactor, DD fusion reactor or DT fusion reactor'

figure3.write_html('neutrons_from_different_sources.html')
figure3.show()
# hand calc for the number of neutrons per second from each source
((1e9 * 6.242e+18) / 200e6) * 2.5  # Fission
((1e9 * 6.242e+18) / 17.6e6)       # DT fusion
((1e9 * 6.242e+18) / 12.5e6)       # DD fusion
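The hand calculation at the end can also be written as a small self-contained script (a sketch, assuming 1 GJ of thermal energy per second, i.e. 1 GW, and the same per-reaction energies as above; the helper name is hypothetical):

```python
# Hand calculation of neutron source strength for a 1 GW (1 GJ/s) thermal source.
# Per-reaction energies (eV) and neutrons per reaction match the script above.
EV_PER_JOULE = 6.242e18  # electron volts per joule

def neutrons_per_second(power_watts, energy_per_reaction_ev, neutrons_per_reaction=1.0):
    """Reactions per second multiplied by neutrons emitted per reaction."""
    return (power_watts * EV_PER_JOULE / energy_per_reaction_ev) * neutrons_per_reaction

fission = neutrons_per_second(1e9, 200e6, neutrons_per_reaction=2.5)
dt_fusion = neutrons_per_second(1e9, 17.6e6)
dd_fusion = neutrons_per_second(1e9, 12.5e6)

print(f"Fission:   {fission:.2e} n/s")
print(f"DT fusion: {dt_fusion:.2e} n/s")
print(f"DD fusion: {dd_fusion:.2e} n/s")
```

This makes the point of the plot explicit: per unit of thermal power, fusion sources emit several times more neutrons than fission, with DD the highest per-neutron count because each reaction releases the least energy.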

Missing videos for tasks 11-14

These tasks are a bit underdeveloped compared to the others, as the workflow is still maturing.

However, it would be helpful to make some video tutorials for these tasks and cover the use of Paraview.

Filepath in Task 1 Part 4

Hello! I ran into an issue in part 4 of task 1, where I was getting errors saying that there was no such file or directory for the h5 file, but when I changed it to 'h5_file = f"/WMP_Library/074186.h5"', it worked.

suggestion: task 8 2d mesh tally plotting

In the 2D mesh plotting example, the source is at (0, 0, 0) in coordinate space, but in the 2D plots it is located at (50, 50) because the plot axes use the mesh index. If the source is moved up in the +Z direction it will move down in the plot, which may confuse some students as they start to use their own models.
It would possibly be useful to expand the example to demonstrate how to make the plot axes match the geometry coordinates.
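One way to do this (a sketch, not tied to the notebook's actual variable names) is to convert mesh bin indices back to geometry coordinates using the mesh bounds; with matplotlib the same bounds can also be passed as `extent=[lower, upper, lower, upper]` to `imshow` so the axes read in centimetres rather than indices:

```python
def mesh_index_to_coord(index, lower, upper, n_bins):
    """Return the geometry coordinate of the centre of mesh bin `index`,
    for a structured mesh axis spanning [lower, upper] with n_bins bins."""
    width = (upper - lower) / n_bins
    return lower + (index + 0.5) * width

# Example: a 100-bin mesh axis spanning -200 cm to +200 cm.
# A source at z = 0 falls in bin 50, whose bin centre maps back to z = 2 cm.
print(mesh_index_to_coord(50, -200.0, 200.0, 100))
```

This also makes the index-vs-coordinate flip explicit: increasing Z increases the coordinate but, with image-style plotting, moves the feature down the rendered array.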

task 12 part 2 fails

If run straight after part 1, it has trouble overwriting the summary.h5 and statepoint files.

It needs a !rm *.h5 command, and perhaps parts 1 and 3 should also have this.
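A pure-Python equivalent of the suggested `!rm *.h5` cell (a sketch; the glob pattern is an assumption about which output files need clearing) would be:

```python
from pathlib import Path

# Delete any summary.h5 / statepoint .h5 files left over from an earlier run,
# so a fresh simulation does not trip over existing output files.
for h5_file in Path(".").glob("*.h5"):
    h5_file.unlink()
```

Unlike the shell command, this works the same way regardless of the notebook's shell, and could be placed at the top of parts 1, 2 and 3.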

current bugs

With the latest docker image there are a few API changes in openmc to accommodate.

root โžœ / $ pytest tests
======================================== test session starts ========================================
platform linux -- Python 3.11.2, pytest-7.4.0, pluggy-1.2.0
rootdir: /tests
plugins: anyio-3.7.1
collected 18 items                                                                                  

tests/test_task_1.py .                                                                        [  5%]
tests/test_task_10.py .                                                                       [ 11%]
tests/test_task_11.py .                                                                       [ 16%]
tests/test_task_12.py F                                                                       [ 22%]
tests/test_task_13.py F                                                                       [ 27%]
tests/test_task_14.py .                                                                       [ 33%]
tests/test_task_15.py F                                                                       [ 38%]
tests/test_task_16.py F                                                                       [ 44%]
tests/test_task_17.py .                                                                       [ 50%]
tests/test_task_18.py .                                                                       [ 55%]
tests/test_task_2.py .                                                                        [ 61%]
tests/test_task_3.py F                                                                        [ 66%]
tests/test_task_4.py .                                                                        [ 72%]
tests/test_task_5.py .                                                                        [ 77%]
tests/test_task_6.py .                                                                        [ 83%]
tests/test_task_7.py .                                                                        [ 88%]
tests/test_task_8.py F                                                                        [ 94%]
tests/test_task_9.py .                                                                        [100%]

============================================= FAILURES ==============================================
___________________________________________ test_task_12 ____________________________________________

    def test_task_12():
        for notebook in Path().rglob("tasks/task_16_*/*.ipynb"):
            print(notebook)
>           nb, errors = _notebook_run(notebook)

tests/test_task_12.py:45: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_task_12.py:37: in _notebook_run
    raise e
tests/test_task_12.py:31: in _notebook_run
    ep.preprocess(nb, {'metadata': {'path': this_file_directory}})
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:89: in preprocess
    self.preprocess_cell(cell, resources, index)
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:110: in preprocess_cell
    cell = self.execute_cell(cell, index, store_history=True)
opt/venv/lib/python3.11/site-packages/nbclient/util.py:84: in wrapped
    return just_run(coro(*args, **kwargs))
opt/venv/lib/python3.11/site-packages/nbclient/util.py:62: in just_run
    return loop.run_until_complete(coro)
usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
opt/venv/lib/python3.11/site-packages/nbclient/client.py:965: in async_execute_cell
    await self._check_raise_for_error(cell, cell_index, exec_reply)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f3c201a6fd0>
cell = {'cell_type': 'code', 'execution_count': 1, 'metadata': {'execution': {'iopub.status.busy': '2023-08-06T19:52:58.47606...ue, exist_ok=True)\n    with open(filename, mode="w", encoding="utf-8") as f:\n        json.dump(result, f, indent=4)'}
cell_index = 1
exec_reply = {'buffers': [], 'content': {'ename': 'ModuleNotFoundError', 'engine_info': {'engine_id': -1, 'engine_uuid': 'f7ab179a-...e, 'engine': 'f7ab179a-839c-4a6f-ac8c-fb8f5cbaee2d', 'started': '2023-08-06T19:52:58.476783Z', 'status': 'error'}, ...}

    async def _check_raise_for_error(
        self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
    ) -> None:
    
        if exec_reply is None:
            return None
    
        exec_reply_content = exec_reply['content']
        if exec_reply_content['status'] != 'error':
            return None
    
        cell_allows_errors = (not self.force_raise_errors) and (
            self.allow_errors
            or exec_reply_content.get('ename') in self.allow_error_names
            or "raises-exception" in cell.metadata.get("tags", [])
        )
        await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
        if not cell_allows_errors:
>           raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
E           nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E           ------------------
E           # Install dependencies
E           
E           import json
E           import numpy as np
E           import pandas as pd
E           import adaptive
E           import holoviews
E           import ipywidgets
E           import nest_asyncio
E           import plotly.graph_objects as go
E           
E           from tqdm import tqdm
E           from pathlib import Path
E           from skopt import gp_minimize
E           from skopt.utils import dump, load
E           from scipy.interpolate import griddata
E           
E           from openmc_model import objective
E           
E           adaptive.notebook_extension()
E           nest_asyncio.apply()
E           
E           # method for saving results in json file
E           def output_result(filepath, result):
E               filename = filepath
E               Path(filename).parent.mkdir(parents=True, exist_ok=True)
E               with open(filename, mode="w", encoding="utf-8") as f:
E                   json.dump(result, f, indent=4)
E           ------------------
E           
E           ---------------------------------------------------------------------------
E           ModuleNotFoundError                       Traceback (most recent call last)
E           Cell In[1], line 18
E                15 from skopt.utils import dump, load
E                16 from scipy.interpolate import griddata
E           ---> 18 from openmc_model import objective
E                20 adaptive.notebook_extension()
E                21 nest_asyncio.apply()
E           
E           ModuleNotFoundError: No module named 'openmc_model'
E           ModuleNotFoundError: No module named 'openmc_model'

opt/venv/lib/python3.11/site-packages/nbclient/client.py:862: CellExecutionError
--------------------------------------- Captured stdout call ----------------------------------------
tasks/task_16_parameter_study_optimisation/parameter_study_optimisation.ipynb
--------------------------------------- Captured stderr call ----------------------------------------
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
___________________________________________ test_task_13 ____________________________________________

    def test_task_13():
        for notebook in Path().rglob("tasks/task_13_*/*.ipynb"):
            print(f'tests {notebook}')
>           nb, errors = _notebook_run(notebook)

tests/test_task_13.py:41: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_task_13.py:34: in _notebook_run
    raise e
tests/test_task_13.py:29: in _notebook_run
    ep.preprocess(nb, {'metadata': {'path': this_file_directory}})
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:89: in preprocess
    self.preprocess_cell(cell, resources, index)
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:110: in preprocess_cell
    cell = self.execute_cell(cell, index, store_history=True)
opt/venv/lib/python3.11/site-packages/nbclient/util.py:84: in wrapped
    return just_run(coro(*args, **kwargs))
opt/venv/lib/python3.11/site-packages/nbclient/util.py:62: in just_run
    return loop.run_until_complete(coro)
usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
opt/venv/lib/python3.11/site-packages/nbclient/client.py:965: in async_execute_cell
    await self._check_raise_for_error(cell, cell_index, exec_reply)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f3c1af20190>
cell = {'cell_type': 'code', 'execution_count': 6, 'id': 'a51cd616-b0ff-4322-82db-4608f3082611', 'metadata': {'execution': {'... [surface_filter, energy_filter]\nouter_surface_spectra_tally.id = 12\nmy_tallies.append(outer_surface_spectra_tally)'}
cell_index = 11
exec_reply = {'buffers': [], 'content': {'ename': 'TypeError', 'engine_info': {'engine_id': -1, 'engine_uuid': 'dc1961d4-35e6-4d26-...e, 'engine': 'dc1961d4-35e6-4d26-9a15-e48b55596a68', 'started': '2023-08-06T19:53:16.734360Z', 'status': 'error'}, ...}

    async def _check_raise_for_error(
        self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
    ) -> None:
    
        if exec_reply is None:
            return None
    
        exec_reply_content = exec_reply['content']
        if exec_reply_content['status'] != 'error':
            return None
    
        cell_allows_errors = (not self.force_raise_errors) and (
            self.allow_errors
            or exec_reply_content.get('ename') in self.allow_error_names
            or "raises-exception" in cell.metadata.get("tags", [])
        )
        await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
        if not cell_allows_errors:
>           raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
E           nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E           ------------------
E           my_tallies = openmc.Tallies()
E           
E           # This spherical mesh tally is used for generating the weight windows.
E           mesh = openmc.SphericalMesh()
E           mesh.r_grid = np.linspace(0, outer_surface.r, 5000)
E           mesh_filter = openmc.MeshFilter(mesh)
E           flux_tally_for_ww = openmc.Tally(name="flux tally")
E           flux_tally_for_ww.filters = [mesh_filter]
E           flux_tally_for_ww.scores = ["flux"]
E           flux_tally_for_ww.id = 42
E           my_tallies.append(flux_tally_for_ww)
E           
E           # This spectrum tally is on the outer shell and shows then energy distribution
E           # of neutrons present in the cell.
E           energy_filter = openmc.EnergyFilter.from_group_structure('CCFE-709')
E           surface_filter = openmc.CellFilter(cell_2)
E           outer_surface_spectra_tally = openmc.Tally(name='outer_surface_spectra_tally')
E           outer_surface_spectra_tally.scores = ['current']
E           outer_surface_spectra_tally.filters = [surface_filter, energy_filter]
E           outer_surface_spectra_tally.id = 12
E           my_tallies.append(outer_surface_spectra_tally)
E           ------------------
E           
E           ---------------------------------------------------------------------------
E           TypeError                                 Traceback (most recent call last)
E           Cell In[6], line 4
E                 1 my_tallies = openmc.Tallies()
E                 3 # This spherical mesh tally is used for generating the weight windows.
E           ----> 4 mesh = openmc.SphericalMesh()
E                 5 mesh.r_grid = np.linspace(0, outer_surface.r, 5000)
E                 6 mesh_filter = openmc.MeshFilter(mesh)
E           
E           TypeError: SphericalMesh.__init__() missing 1 required positional argument: 'r_grid'
E           TypeError: SphericalMesh.__init__() missing 1 required positional argument: 'r_grid'

opt/venv/lib/python3.11/site-packages/nbclient/client.py:862: CellExecutionError
--------------------------------------- Captured stdout call ----------------------------------------
tests tasks/task_13_variance_reduction/1_shielded_room_survival_biasing.ipynb
tests tasks/task_13_variance_reduction/4_sphere_iterative_per_batch_ww.ipynb
--------------------------------------- Captured stderr call ----------------------------------------
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
___________________________________________ test_task_14 ____________________________________________

    def test_task_14():
        for notebook in Path().rglob("tasks/task_15_*/*.ipynb"):
            print(notebook)
>           nb, errors = _notebook_run(notebook)

tests/test_task_15.py:45: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_task_15.py:37: in _notebook_run
    raise e
tests/test_task_15.py:31: in _notebook_run
    ep.preprocess(nb, {'metadata': {'path': this_file_directory}})
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:89: in preprocess
    self.preprocess_cell(cell, resources, index)
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:110: in preprocess_cell
    cell = self.execute_cell(cell, index, store_history=True)
opt/venv/lib/python3.11/site-packages/nbclient/util.py:84: in wrapped
    return just_run(coro(*args, **kwargs))
opt/venv/lib/python3.11/site-packages/nbclient/util.py:62: in just_run
    return loop.run_until_complete(coro)
usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
opt/venv/lib/python3.11/site-packages/nbclient/client.py:965: in async_execute_cell
    await self._check_raise_for_error(cell, cell_index, exec_reply)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f3c21ad0ad0>
cell = {'cell_type': 'code', 'execution_count': 1, 'metadata': {'execution': {'iopub.status.busy': '2023-08-06T19:58:03.01214... plot_simulation_results(filtered_results_df)\n    fig = go.Figure()\n    fig.add_trace(sample_trace)\n    return fig'}
cell_index = 1
exec_reply = {'buffers': [], 'content': {'ename': 'ModuleNotFoundError', 'engine_info': {'engine_id': -1, 'engine_uuid': '9cbe83ce-...e, 'engine': '9cbe83ce-6661-448f-ae90-24c6d1b76239', 'started': '2023-08-06T19:58:03.012816Z', 'status': 'error'}, ...}

    async def _check_raise_for_error(
        self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
    ) -> None:
    
        if exec_reply is None:
            return None
    
        exec_reply_content = exec_reply['content']
        if exec_reply_content['status'] != 'error':
            return None
    
        cell_allows_errors = (not self.force_raise_errors) and (
            self.allow_errors
            or exec_reply_content.get('ename') in self.allow_error_names
            or "raises-exception" in cell.metadata.get("tags", [])
        )
        await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
        if not cell_allows_errors:
>           raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
E           nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E           ------------------
E           # import dependencies for sampling and plotting
E           
E           import uuid
E           from pathlib import Path
E           import json
E           import pandas as pd
E           import adaptive
E           import numpy as np
E           import plotly.graph_objects as go
E           from tqdm import tqdm
E           from plotly.subplots import make_subplots
E           from skopt.sampler import Halton
E           from skopt.space import Space
E           from skopt.sampler import Grid
E           
E           import nest_asyncio
E           adaptive.notebook_extension()
E           nest_asyncio.apply()
E           
E           # imports pre-defined neutronics model and plotting tools
E           from openmc_model import * # this is where find_tbr_hcpb is
E           from plotting_tools import read_in_data, plot_simulation_results, plot_interpolated_results
E           
E           # method for saving results in json file
E           def output_result(result):
E               filename = "outputs/" + str(uuid.uuid4()) + ".json"
E               Path(filename).parent.mkdir(parents=True, exist_ok=True)
E               with open(filename, mode="w", encoding="utf-8") as f:
E                   json.dump(result, f, indent=4)
E                   
E           # method for showing results
E           def show_results(filtered_results_df):
E               sample_trace = plot_simulation_results(filtered_results_df)
E               fig = go.Figure()
E               fig.add_trace(sample_trace)
E               return fig
E           ------------------
E           
E           ---------------------------------------------------------------------------
E           ModuleNotFoundError                       Traceback (most recent call last)
E           Cell In[1], line 21
E                18 nest_asyncio.apply()
E                20 # imports pre-defined neutronics model and plotting tools
E           ---> 21 from openmc_model import * # this is where find_tbr_hcpb is
E                22 from plotting_tools import read_in_data, plot_simulation_results, plot_interpolated_results
E                24 # method for saving results in json file
E           
E           ModuleNotFoundError: No module named 'openmc_model'
E           ModuleNotFoundError: No module named 'openmc_model'

opt/venv/lib/python3.11/site-packages/nbclient/client.py:862: CellExecutionError
--------------------------------------- Captured stdout call ----------------------------------------
tasks/task_15_parameter_study_sampling/1_techniques_for_sampling_design_space.ipynb
--------------------------------------- Captured stderr call ----------------------------------------
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
___________________________________________ test_task_14 ____________________________________________

    def test_task_14():
        for notebook in Path().rglob("tasks/task_16_*/*.ipynb"):
            print(notebook)
>           nb, errors = _notebook_run(notebook)

tests/test_task_16.py:45: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_task_16.py:37: in _notebook_run
    raise e
tests/test_task_16.py:31: in _notebook_run
    ep.preprocess(nb, {'metadata': {'path': this_file_directory}})
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:89: in preprocess
    self.preprocess_cell(cell, resources, index)
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:110: in preprocess_cell
    cell = self.execute_cell(cell, index, store_history=True)
opt/venv/lib/python3.11/site-packages/nbclient/util.py:84: in wrapped
    return just_run(coro(*args, **kwargs))
opt/venv/lib/python3.11/site-packages/nbclient/util.py:62: in just_run
    return loop.run_until_complete(coro)
usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
opt/venv/lib/python3.11/site-packages/nbclient/client.py:965: in async_execute_cell
    await self._check_raise_for_error(cell, cell_index, exec_reply)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f3c21b023d0>
cell = {'cell_type': 'code', 'execution_count': 1, 'metadata': {'execution': {'iopub.status.busy': '2023-08-06T19:58:08.08778...ue, exist_ok=True)\n    with open(filename, mode="w", encoding="utf-8") as f:\n        json.dump(result, f, indent=4)'}
cell_index = 1
exec_reply = {'buffers': [], 'content': {'ename': 'ModuleNotFoundError', 'engine_info': {'engine_id': -1, 'engine_uuid': '1114a438-...e, 'engine': '1114a438-7d1d-4ec4-a425-50a06418d737', 'started': '2023-08-06T19:58:08.088569Z', 'status': 'error'}, ...}

    async def _check_raise_for_error(
        self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
    ) -> None:
    
        if exec_reply is None:
            return None
    
        exec_reply_content = exec_reply['content']
        if exec_reply_content['status'] != 'error':
            return None
    
        cell_allows_errors = (not self.force_raise_errors) and (
            self.allow_errors
            or exec_reply_content.get('ename') in self.allow_error_names
            or "raises-exception" in cell.metadata.get("tags", [])
        )
        await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
        if not cell_allows_errors:
>           raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
E           nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E           ------------------
E           # Install dependencies
E           
E           import json
E           import numpy as np
E           import pandas as pd
E           import adaptive
E           import holoviews
E           import ipywidgets
E           import nest_asyncio
E           import plotly.graph_objects as go
E           
E           from tqdm import tqdm
E           from pathlib import Path
E           from skopt import gp_minimize
E           from skopt.utils import dump, load
E           from scipy.interpolate import griddata
E           
E           from openmc_model import objective
E           
E           adaptive.notebook_extension()
E           nest_asyncio.apply()
E           
E           # method for saving results in json file
E           def output_result(filepath, result):
E               filename = filepath
E               Path(filename).parent.mkdir(parents=True, exist_ok=True)
E               with open(filename, mode="w", encoding="utf-8") as f:
E                   json.dump(result, f, indent=4)
E           ------------------
E           
E           ---------------------------------------------------------------------------
E           ModuleNotFoundError                       Traceback (most recent call last)
E           Cell In[1], line 18
E                15 from skopt.utils import dump, load
E                16 from scipy.interpolate import griddata
E           ---> 18 from openmc_model import objective
E                20 adaptive.notebook_extension()
E                21 nest_asyncio.apply()
E           
E           ModuleNotFoundError: No module named 'openmc_model'
E           ModuleNotFoundError: No module named 'openmc_model'

opt/venv/lib/python3.11/site-packages/nbclient/client.py:862: CellExecutionError
--------------------------------------- Captured stdout call ----------------------------------------
tasks/task_16_parameter_study_optimisation/parameter_study_optimisation.ipynb
--------------------------------------- Captured stderr call ----------------------------------------
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
____________________________________________ test_task_3 ____________________________________________

    def test_task_3():
        for notebook in Path().rglob("tasks/task_03_*/*.ipynb"):
            print(notebook)
>           nb, errors = _notebook_run(notebook)

tests/test_task_3.py:46: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_task_3.py:38: in _notebook_run
    raise e
tests/test_task_3.py:32: in _notebook_run
    ep.preprocess(nb, {'metadata': {'path': this_file_directory}})
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:89: in preprocess
    self.preprocess_cell(cell, resources, index)
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:110: in preprocess_cell
    cell = self.execute_cell(cell, index, store_history=True)
opt/venv/lib/python3.11/site-packages/nbclient/util.py:84: in wrapped
    return just_run(coro(*args, **kwargs))
opt/venv/lib/python3.11/site-packages/nbclient/util.py:62: in just_run
    return loop.run_until_complete(coro)
usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
opt/venv/lib/python3.11/site-packages/nbclient/client.py:965: in async_execute_cell
    await self._check_raise_for_error(cell, cell_index, exec_reply)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f3c1af13110>
cell = {'cell_type': 'code', 'execution_count': 2, 'metadata': {'execution': {'iopub.status.busy': '2023-08-06T19:58:38.88470...and the available openmc colors\n# vox_plot.colors = {copper: \'blue\'}\n\nvox_plot.to_vtk(output=\'voxel_plot.vti\')'}
cell_index = 4
exec_reply = {'buffers': [], 'content': {'ename': 'ImportError', 'engine_info': {'engine_id': -1, 'engine_uuid': '9161dac6-f154-405...e, 'engine': '9161dac6-f154-405b-9d6f-956f48cba0a3', 'started': '2023-08-06T19:58:38.885004Z', 'status': 'error'}, ...}

    async def _check_raise_for_error(
        self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
    ) -> None:
    
        if exec_reply is None:
            return None
    
        exec_reply_content = exec_reply['content']
        if exec_reply_content['status'] != 'error':
            return None
    
        cell_allows_errors = (not self.force_raise_errors) and (
            self.allow_errors
            or exec_reply_content.get('ename') in self.allow_error_names
            or "raises-exception" in cell.metadata.get("tags", [])
        )
        await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
        if not cell_allows_errors:
>           raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
E           nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E           ------------------
E           # makes the 3d "cube" style geometry
E           vox_plot = openmc.Plot()
E           vox_plot.type = 'voxel'
E           
E           # makes sure the bounds of the plot include the whole geometry
E           vox_plot.width = my_geometry.bounding_box.width
E           
E           # makes sure the voxel plot is centered at the center of the geometry
E           vox_plot.origin = my_geometry.bounding_box.center
E           
E           # sets the pixels in each direction to be proportional to the size of the geometry in that direction
E           # Your computer RAM will limit the number of pixels you can set in each direction.
E           # The * 0.1 part of this line reduces the number of pixels in each direction to a reasonable amount but this could be increased if you want more resolution.
E           vox_plot.pixels = [int(w* 0.1) for w in my_geometry.bounding_box.width]
E           
E           vox_plot.color_by = 'material'
E           # materials can be coloured using this command and the available openmc colors
E           # vox_plot.colors = {copper: 'blue'}
E           
E           vox_plot.to_vtk(output='voxel_plot.vti')
E           ------------------
E           
E           ---------------------------------------------------------------------------
E           ImportError                               Traceback (most recent call last)
E           Cell In[2], line 20
E                16 vox_plot.color_by = 'material'
E                17 # materials can be coloured using this command and the available openmc colors
E                18 # vox_plot.colors = {copper: 'blue'}
E           ---> 20 vox_plot.to_vtk(output='voxel_plot.vti')
E           
E           File /opt/venv/lib/python3.11/site-packages/openmc/plots.py:977, in Plot.to_vtk(self, output, openmc_exec, cwd)
E               974 if output is None:
E               975     output = h5_voxel_file.with_suffix('.vti')
E           --> 977 return voxel_to_vtk(h5_voxel_file, output)
E           
E           File /opt/venv/lib/python3.11/site-packages/openmc/plots.py:203, in voxel_to_vtk(voxel_file, output)
E               185 """Converts a voxel HDF5 file to a VTK file
E               186 
E               187 .. versionadded:: 0.13.4
E              (...)
E               199     Path of the .vti file produced
E               200 """
E               202 # imported vtk only if used as vtk is an option dependency
E           --> 203 import vtk
E               205 _min_version = (2, 0)
E               207 # Read data from voxel file
E           
E           File /opt/venv/lib/python3.11/site-packages/vtk.py:5
E                 3 # this module has the same contents as vtkmodules.all
E                 4 from vtkmodules.vtkCommonCore import *
E           ----> 5 from vtkmodules.vtkWebCore import *
E                 6 from vtkmodules.vtkCommonMath import *
E                 7 from vtkmodules.vtkCommonTransforms import *
E           
E           ImportError: Initialization failed for vtkWebCore, not compatible with vtkmodules.vtkCommonCore
E           ImportError: Initialization failed for vtkWebCore, not compatible with vtkmodules.vtkCommonCore

opt/venv/lib/python3.11/site-packages/nbclient/client.py:862: CellExecutionError
--------------------------------------- Captured stdout call ----------------------------------------
tasks/task_03_making_CSG_geometry/3_viewing_the_geometry_as_vtk.ipynb
--------------------------------------- Captured stderr call ----------------------------------------
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
____________________________________________ test_task_8 ____________________________________________

    def test_task_8():
        for notebook in Path().rglob("tasks/task_08_*/*.ipynb"):
            print(notebook)
>           nb, errors = _notebook_run(notebook)

tests/test_task_8.py:45: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
tests/test_task_8.py:37: in _notebook_run
    raise e
tests/test_task_8.py:31: in _notebook_run
    ep.preprocess(nb, {'metadata': {'path': this_file_directory}})
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:89: in preprocess
    self.preprocess_cell(cell, resources, index)
opt/venv/lib/python3.11/site-packages/nbconvert/preprocessors/execute.py:110: in preprocess_cell
    cell = self.execute_cell(cell, index, store_history=True)
opt/venv/lib/python3.11/site-packages/nbclient/util.py:84: in wrapped
    return just_run(coro(*args, **kwargs))
opt/venv/lib/python3.11/site-packages/nbclient/util.py:62: in just_run
    return loop.run_until_complete(coro)
usr/lib/python3.11/asyncio/base_events.py:653: in run_until_complete
    return future.result()
opt/venv/lib/python3.11/site-packages/nbclient/client.py:965: in async_execute_cell
    await self._check_raise_for_error(cell, cell_index, exec_reply)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <nbconvert.preprocessors.execute.ExecutePreprocessor object at 0x7f3c201abad0>
cell = {'cell_type': 'code', 'execution_count': 7, 'metadata': {'execution': {'iopub.status.busy': '2023-08-06T20:02:08.01030...y=my_geometry,\n    outline_by='material',\n    outline_kwargs={'colour':['black']}\n)\nplot.figure.savefig('yz.png')"}
cell_index = 14
exec_reply = {'buffers': [], 'content': {'ename': 'AttributeError', 'engine_info': {'engine_id': -1, 'engine_uuid': '1c43a002-b080-...e, 'engine': '1c43a002-b080-4099-b16a-6cfd4351f37b', 'started': '2023-08-06T20:02:08.010804Z', 'status': 'error'}, ...}

    async def _check_raise_for_error(
        self, cell: NotebookNode, cell_index: int, exec_reply: t.Optional[t.Dict]
    ) -> None:
    
        if exec_reply is None:
            return None
    
        exec_reply_content = exec_reply['content']
        if exec_reply_content['status'] != 'error':
            return None
    
        cell_allows_errors = (not self.force_raise_errors) and (
            self.allow_errors
            or exec_reply_content.get('ename') in self.allow_error_names
            or "raises-exception" in cell.metadata.get("tags", [])
        )
        await run_hook(self.on_cell_error, cell=cell, cell_index=cell_index)
        if not cell_allows_errors:
>           raise CellExecutionError.from_cell_and_msg(cell, exec_reply_content)
E           nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
E           ------------------
E           plot = openmc.plot_mesh_tally(
E               tally=my_tbr_tally,
E               basis='xz',
E               axis_units='m',
E               # slice_index=9,  # max value of slice selected
E               value= 'std_dev',
E               outline=True,
E               geometry=my_geometry,
E               outline_by='material',
E               outline_kwargs={'colour':['black']}
E           )
E           plot.figure.savefig('yz.png')
E           ------------------
E           
E           ---------------------------------------------------------------------------
E           AttributeError                            Traceback (most recent call last)
E           Cell In[7], line 1
E           ----> 1 plot = openmc.plot_mesh_tally(
E                 2     tally=my_tbr_tally,
E                 3     basis='xz',
E                 4     axis_units='m',
E                 5     # slice_index=9,  # max value of slice selected
E                 6     value= 'std_dev',
E                 7     outline=True,
E                 8     geometry=my_geometry,
E                 9     outline_by='material',
E                10     outline_kwargs={'colour':['black']}
E                11 )
E                12 plot.figure.savefig('yz.png')
E           
E           AttributeError: module 'openmc' has no attribute 'plot_mesh_tally'
E           AttributeError: module 'openmc' has no attribute 'plot_mesh_tally'

opt/venv/lib/python3.11/site-packages/nbclient/client.py:862: CellExecutionError
--------------------------------------- Captured stdout call ----------------------------------------
tasks/task_08_CSG_mesh_tally/1_example_2d_mesh_tallies.ipynb
tasks/task_08_CSG_mesh_tally/2_example_3d_mesh_tallies.ipynb
--------------------------------------- Captured stderr call ----------------------------------------
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
0.00s - Debugger warning: It seems that frozen modules are being used, which may
0.00s - make the debugger miss breakpoints. Please pass -Xfrozen_modules=off
0.00s - to python to disable frozen modules.
0.00s - Note: Debugging will proceed. Set PYDEVD_DISABLE_FILE_VALIDATION=1 to disable this validation.
========================================= warnings summary ==========================================
opt/venv/lib/python3.11/site-packages/jupyter_client/connect.py:28
  /opt/venv/lib/python3.11/site-packages/jupyter_client/connect.py:28: DeprecationWarning: Jupyter is migrating its paths to use standard platformdirs
  given by the platformdirs library.  To remove this warning and
  see the appropriate new directories, set the environment variable
  `JUPYTER_PLATFORM_DIRS=1` and then run `jupyter --paths`.
  The use of platformdirs will be the default in `jupyter_core` v6
    from jupyter_core.paths import jupyter_data_dir, jupyter_runtime_dir, secure_write

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
====================================== short test summary info ======================================
FAILED tests/test_task_12.py::test_task_12 - nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
FAILED tests/test_task_13.py::test_task_13 - nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
FAILED tests/test_task_15.py::test_task_14 - nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
FAILED tests/test_task_16.py::test_task_14 - nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
FAILED tests/test_task_3.py::test_task_3 - nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
FAILED tests/test_task_8.py::test_task_8 - nbclient.exceptions.CellExecutionError: An error occurred while executing the following cell:
======================== 6 failed, 12 passed, 1 warning in 833.32s (0:13:53) ========================
root ➜ / $ 

polygon example

import openmc
import matplotlib.pyplot as plt

height = 20
width = 30
thickness = 2
center_x=100
center_z=0

surface_1 = openmc.model.Polygon(
    points= [
        (10,10),
        (7,13),
        (5,10),
        (7,5),
    ],
    basis='rz'
)

# offsetting is very useful when the geometry is not on the central axis
surface_2 = surface_1.offset(1)

region_1 = -surface_1
region_2 = +surface_1 & -surface_2

cell_1 = openmc.Cell(region=region_1)
cell_2 = openmc.Cell(region=region_2)

universe = openmc.Universe(cells=[cell_1, cell_2])
print(universe.bounding_box)
# set the width and origin manually as the bounding box can't be automatically found for Polygon surfaces
universe.plot(width=(25,20),basis='xz', origin=(0,0,10))

plt.show()

Scaling in task 8 Part 2

# converts the tally result into a VTK file
# this time the standard deviation error on the tally is added to the VTK file as another data set
# the tally is also scaled from eV per source particle to joules per source particle (1 eV = 1.60218e-19 J)
# Try adding another scaling term: multiplying by the number of neutrons emitted per second would convert the units to watts
mesh.write_data_to_vtk(
    filename="heating_tally_on_mesh.vtk",
    datasets={"mean": my_heating_tally.mean * 1.60218e-19, "std_dev": my_heating_tally.std_dev}
)

In task 8 part 2, shouldn't the standard deviation be scaled too?
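
For reference, a standard deviation converts with the same linear factor as the mean, so scaling both leaves the relative error unchanged. A minimal numpy sketch with made-up values (not real tally results):

```python
import numpy as np

EV_TO_J = 1.60218e-19  # joules per eV

# illustrative tally values in eV per source particle
mean_ev = np.array([1.0e6, 2.0e6])
std_ev = np.array([1.0e4, 3.0e4])

# converting units multiplies both the mean and the standard deviation
mean_j = mean_ev * EV_TO_J
std_j = std_ev * EV_TO_J

# the relative error is unchanged by a linear unit conversion
print(np.allclose(std_j / mean_j, std_ev / mean_ev))
```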

porting slides to repo

We have some slides which are not under version control and it would be great to move them into the repo

After reading this blog post I think there is a nice option for markdown-based slides

So I'm going to port the slides over when I get some time

task 10 part 2 is slow

The construction of ToroidalFieldCoilPrincetonD is quite slow; perhaps another shape should be chosen for this example

plotting neutron angle energy

we could add an energy angle plot for the nuclear data section

# This example plots the probability of scattering angle for different incident neutron energies.

import openmc
import matplotlib.pyplot as plt
from pprint import pprint
import math

nuc_path = "/home/jshimwell/ENDF-B-VIII.0-NNDC/h5_files/neutron/C12.h5"
# /nuclear_data/ENDFB-8.0-NNDC_Au197.h5

nuclide = openmc.data.IncidentNeutron.from_hdf5(nuc_path)

print(list(nuclide.reactions.values()))

# MT number 2 is elastic scattering
elastic = nuclide[2]
print(elastic.products)

neutron_out = elastic.products[0]

dist = neutron_out.distribution[0]

fig, ax = plt.subplots()

# This loops through the incident neutron energies and the angular distributions
# This loop also uses some Python list slicing syntax
# As the nuclear data goes to high energies (150 MeV) we skip the last 15 entries of the list with the [:-15] part
# As there are a lot of data points for lots of energies we plot every 50th neutron energy with the [::50] part
for energy, angle in zip(dist.angle.energy[:-15][::50], dist.angle.mu[:-15][::50]):
    ax.plot(angle.x, angle.p, label=f"neutron energy={energy/1e6}MeV")
import numpy as np

# secondary x axis showing the scattering angle corresponding to each cosine
# the functions must map data coordinates to secondary coordinates and back
def forward(x):
    return np.degrees(np.arccos(np.clip(x, -1, 1)))

def inverse(x):
    return np.cos(np.radians(x))

secax = ax.secondary_xaxis('top', functions=(forward, inverse))
secax.set_xlabel('angle [degrees]')

plt.xlabel("scattering cosine")
plt.ylabel("probability")
plt.title("Neutron energy angle distribution of C12")
plt.legend(bbox_to_anchor=(1.01, 0.9))
plt.tight_layout()
plt.show()
# plt.savefig("angle_energy_c12.png", bbox_inches="tight", dpi=400)

tidy up

task 10 has some download links that are not needed at the end
task 11 makes use of otuc which might not be needed
task 12 part 3 fails to find all the cad volumes with brep-part-finder
task 13 rename scripts to start with 1 and 2

Adding axis scale to the plots

For some plots in task 1, the user has to switch from the default lin lin to log log.

Perhaps this (log log) argument could be added to the plotting utils so the graph can be plotted in log-log directly

That would make things easier for users
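
As a sketch of what such a utility could look like (the `plot_xs` name, its arguments, and the data values are hypothetical, not the workshop's real API):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend so this runs headless
import matplotlib.pyplot as plt

def plot_xs(energies, xs_values, xscale="log", yscale="log"):
    """Plot a cross section, defaulting both axes to log scale."""
    fig, ax = plt.subplots()
    ax.plot(energies, xs_values)
    ax.set_xscale(xscale)
    ax.set_yscale(yscale)
    ax.set_xlabel("Energy [eV]")
    ax.set_ylabel("Cross section [barns]")
    return ax

# illustrative values only
ax = plot_xs([1e-2, 1e2, 1e6], [1e2, 1.0, 1e-2])
print(ax.get_xscale(), ax.get_yscale())
```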

task 12 is failing

Reported in this discussion

"Task 12 of the neutronics workshop wont run through this method throwing a Value error : "ValueError: 37 volumes found in Brep file is not equal to the number of material_tags 17 provided." This uses the BallReactor Class though other reactors will not convert also for different reasons."

task 13 fails

with the error message: cannot import name 'MultiMaterial' from 'neutronics_material_maker'

simpler EnergyFilter

A recent PR simplified creating an EnergyFilter from an energy group structure.

Currently the workshop (in particular task 7) does the following

energy_bins = openmc.mgxs.GROUP_STRUCTURES['CCFE-709']
energy_filter = openmc.EnergyFilter(energy_bins)

However this could be reduced to a single line

energy_filter = openmc.EnergyFilter.from_group_structure('CCFE-709')

adding a more minimal conda install

could make a conda install for the whole package now that cadquery is on pip and the environment can be resolved.

curl -L -O https://github.com/conda-forge/miniforge/releases/latest/download/Miniforge-pypy3-Linux-x86_64.sh
bash Miniforge-pypy3-Linux-x86_64.sh

conda create --name openmc_0.13.3 python=3.10
conda activate openmc_0.13.3

conda install -c conda-forge openmc=0.13.3

pip install openmc_data
download_nndc -r b8.0
download_nndc_chain -r b8.0

echo 'export OPENMC_CHAIN_FILE=/home/jshimwell/chain-nndc-b8.0.xml' >> ~/.bashrc 
echo 'export OPENMC_CROSS_SECTIONS=/home/jshimwell/nndc-b8.0-hdf5/endfb-viii.0-hdf5/cross_sections.xml' >> ~/.bashrc 

Problem with task 10

I ran into this issue with task 10, I tried running the command that the error message mentions inside the docker container but it seemed to be installed already. I also tried to install it in the openmc_0_11_0 environment within the container but this didn't work either.

Problem with the ring source plotting

Hello,
I am completing the workshop and I've run into a problem with the 2nd part of the 4th task.
When I try to plot the source defined in this part, I have the following error :
"ERROR: Invalid spatial distribution for external source: cylindrical"
I have not changed the default settings.
Do you know where it might come from?
Thank you


feedback from 2023 york workshop

we don't have any SurveyMonkey responses yet 😢

But I do remember a few tricky parts that could be fixed if time permits

  • mesh tally visualization in paraview didn't work for some users
  • the spectra simulation should show the thermal neutron bump without requiring lots of particles, as there is no time for large simulations
  • the DPA task should have the volume calculation done on a more complex cell

suggestion: increase clarity on cross sections in task 1

In task 1 it would perhaps be useful to make clear in the text when a microscopic cross section or a macroscopic cross section is being plotted. Depending on the target audience it might be worth adding the definitions of, and relationship between, the two to the text.
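
For reference, the relationship is Sigma = N * sigma, where N is the atom number density. A small self-contained sketch (the iron values below are illustrative, not authoritative nuclear data):

```python
AVOGADRO = 6.02214076e23  # atoms per mol
BARN_TO_CM2 = 1e-24       # cm2 per barn

def macroscopic(sigma_barns, density_g_cm3, molar_mass_g_mol):
    """Macroscopic cross section Sigma [1/cm] from microscopic sigma [barns]."""
    # atom number density N in atoms per cm3
    n = density_g_cm3 * AVOGADRO / molar_mass_g_mol
    return n * sigma_barns * BARN_TO_CM2

# e.g. iron at 7.87 g/cm3 with an assumed 2.5 barn cross section
print(macroscopic(2.5, 7.87, 55.845))  # roughly 0.21 per cm
```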

The doppler broadening plotting function looks to be doing things differently, i.e. reading a data file directly rather than calling openmc.calculate_cexs(), and unlike the other functions it will not take a list of isotopes or elements. calculate_cexs() does quite a few checks and processing steps - does this mean it is not sensible to compare standard temperature data from the plot isotope function with the same temperature in the doppler broadening function?

task 04 part 2 and part 3 are failing

this is because writing the initial source file does not work on versions 0.12 or 0.13 of openmc, and the ring source used in the task is not available in version 0.11

This will be fixed when openmc.lib has the ability to query the source

should we have an example for photonuclear neutron production from photons

the code is easy enough; if this is useful we can add an example

# Coupled photon transport can be enabled for a neutron simulation with
# settings.photon_transport = True. That is fairly standard: photon production
# and transport are needed for tallies like heating and dose, and it is
# important to transport the photons as they do not deposit their energy
# locally. Charged particles deposit their energy locally, so they do not
# need to be transported.

# However, this example is somewhat different.
# It shows that a 4 MeV photon source creates neutrons.
# We do this by creating a photon source and tallying the neutron flux.

# You could try changing the material to a lower Z atom like Be or Li and the
# flux should go up, partly because these shield photons to a lesser extent.

import openmc

# MATERIALS
material = openmc.Material()
material.add_element('Fe', 1)
material.set_density('g/cm3', 7)
my_materials = openmc.Materials([material])

# GEOMETRY
# surfaces
vessel_inner = openmc.Sphere(r=500, boundary_type='vacuum')
# cells
inner_vessel_region = -vessel_inner
inner_vessel_cell = openmc.Cell(region=inner_vessel_region)
inner_vessel_cell.fill = material
my_geometry = openmc.Geometry([inner_vessel_cell])

# SIMULATION SETTINGS
# Instantiate a Settings object
my_settings = openmc.Settings()
my_settings.batches = 10
my_settings.inactive = 0
my_settings.particles = 500
my_settings.run_mode = 'fixed source'
# photon transport must be enabled so that the photon source particles are transported
my_settings.photon_transport = True
# Create a photon point source
my_source = openmc.Source()
my_source.space = openmc.stats.Point((0, 0, 0))
my_source.angle = openmc.stats.Isotropic()
my_source.particle = 'photon'  # default would otherwise be neutron
my_source.energy = openmc.stats.Discrete([4e6], [1])
my_settings.source = my_source
# added a cell tally for neutron flux production
cell_filter = openmc.CellFilter(inner_vessel_cell)
flux_tally = openmc.Tally(name='flux')
flux_tally.filters = [cell_filter]
flux_tally.scores = ['flux']
my_tallies = openmc.Tallies([flux_tally])

# Run OpenMC!
model = openmc.model.Model(my_geometry, my_materials, my_settings, my_tallies)
sp_filename = model.run()

# open the results file
sp = openmc.StatePoint(sp_filename)

# access the tally using pandas dataframes
flux_tally = sp.get_tally(name='flux')

# printing the tally shows that we get a neutron flux as the tally is not 0
print(flux_tally.mean, flux_tally.std_dev)
# printed results are ...
# >>> [[[14.42136056]]] [[[0.14143424]]]

Docker not starting on Windows

A student reported that docker was not starting and solved it by typing the following command in the command prompt (cmd)

netsh winsock reset

[Feature suggestion] Inclusion of a simple introduction to Monte Carlo methods

Just wondering if it would be beneficial to add a task_0 or something similar, which would be an introduction to Monte Carlo methods more generally. I have often found this useful when teaching undergraduate students. The calculation of pi using random numbers is always a good start. There are additional extensions of this as well if the concept is of interest.

I'd be happy to implement
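
The pi estimate mentioned above could be sketched in a few lines of Python (one possible version, not a committed design for the task):

```python
import random

def estimate_pi(n_samples: int, seed: int = 1) -> float:
    """Monte Carlo estimate of pi: sample points uniformly in the unit
    square and count the fraction landing inside the quarter circle."""
    rng = random.Random(seed)
    inside = 0
    for _ in range(n_samples):
        x, y = rng.random(), rng.random()
        if x * x + y * y <= 1.0:
            inside += 1
    # area of quarter circle / area of square = pi/4
    return 4 * inside / n_samples

print(estimate_pi(100_000))
```

The estimate converges slowly (the error falls as 1/sqrt(n)), which is itself a nice teaching point for Monte Carlo particle transport.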

Removing download FileLinks

We currently have tasks where we download files via a FileLink

from IPython.display import FileLink

However these are no longer needed now that we have JupyterLab working.

JupyterLab provides easy access to the files via the file explorer on the left hand side

tasks to refactor include
task 3 part 3
task 4 part 4
task 8 part 2
task 10 part 1
task 10 part 2
task 10 part 3
task 12 part 1
task 12 part 2
task 12 part 3

nuclear data threshold reaction energy plot

to help explain activation from DT neutrons at 14 MeV vs activation from DD neutrons at 2.5 MeV I could add a plotting script like this

import matplotlib.pyplot as plt
import openmc
from matplotlib.lines import Line2D


def plot_thresholds(nuclide="Fe56"):
    plt.clf()
    threshold_below_14 = 0
    threshold_below_2 = 0
    x = {}
    nuc = openmc.data.IncidentNeutron.from_hdf5(
        # this dir would need changing to read the local nuclear data
        f"/home/jshimwell/ENDF-B-VIII.0-NNDC/h5_files/neutron/{nuclide}.h5"
    )
    for key, value in nuc.reactions.items():
        reaction = nuc[key]
        xs = reaction.xs["294K"]
        threshold = xs.x[0]
        name = reaction.__str__().split()[-1][:-1]
        if name not in ["heating", "damage-energy", "heating-local"]:
            x[name] = threshold
            # color each reaction by its threshold energy and plot it only once
            if threshold < 2.5e6:
                threshold_below_2 += 1
                threshold_below_14 += 1
                plt.plot(xs.x / 1e6, xs.y, label=name, color="red", linewidth=0.5)
            elif threshold < 14.1e6:
                threshold_below_14 += 1
                plt.plot(xs.x / 1e6, xs.y, label=name, color="blue", linewidth=0.5)
            print(key, name, threshold / 1e6)

    custom_lines = [
        Line2D([0], [0], color="red", lw=4),
        Line2D([0], [0], color="blue", lw=4),
    ]
    plt.legend(
        custom_lines,
        ["Reaction threshold < 2.5MeV", "Reaction threshold > 2.5 MeV and < 14 MeV"],
    )
    plt.title(
        f"Neutron reaction cross sections for {nuclide} colored by threshold energy.\n {threshold_below_2} reactions below 2.5MeV {threshold_below_14} reactions below 14.1MeV"
    )
    plt.yscale("log")
    plt.xlabel("Energy [MeV]")
    plt.ylabel("Cross sections [barns]")

    plt.xlim((0, 15))
    plt.ylim((1e-16, 1e10))
    plt.savefig(f"threshold_{nuclide}.png", dpi=200)

    # return all the reactions sorted by threshold energy
    return dict(sorted(x.items(), key=lambda item: item[1]))


plot_thresholds(nuclide="Al27")
plot_thresholds(nuclide="Fe56")

(attached output images: threshold_Al27.png and threshold_Fe56.png)

remove unnecessary particles setting

I noticed this code in some examples and it is not necessary; openmc.Settings() doesn't actually have a .particle attribute so this line doesn't do anything.

sett.particle = "neutron"

It could instead go on openmc.Source(), which does have a .particle attribute
