Meent

Meent is an Electromagnetic (EM) simulation package written in Python, composed of three main parts:

  • Modeling
  • EM simulation
  • Optimization

Backends

Meent provides three libraries as backends:

  • NumPy
    • The fundamental package for scientific computing with Python
    • Easy and lean to use
  • JAX
    • Autograd and XLA, brought together for high-performance machine learning research.
  • PyTorch
    • A Python package that provides two high-level features: Tensor computation with strong GPU acceleration and Deep neural networks built on a tape-based autograd system

When to use

|                 | NumPy | JAX | PyTorch | Description |
|-----------------|:-----:|:---:|:-------:|-------------|
| 64-bit support  | O     | O   | O       | Default for scientific computing |
| 32-bit support  | O     | O   | O       | 32-bit (float32 and complex64) operations* |
| GPU support     | X     | O   | O       | Except eigendecomposition** |
| TPU support     | X     | X   | X       | Currently there is no workaround for 32-bit eigendecomposition on TPU |
| AD support      | X     | O   | O       | Automatic differentiation (backpropagation) |
| Parallelization | X     | O   | X       | JAX pmap function |

*In 32-bit mode, operations on numbers that differ by 8 or more orders of magnitude fail silently, with no warning or error (see the sketch below). Use it only if you understand the implications.
**As of now (2023.03.19), GPU-native eigendecomposition is not implemented in JAX or PyTorch; it is forced to run on the CPU and the result is sent back to the GPU.
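
A minimal NumPy sketch (not meent-specific) of the failure mode in footnote *: float32 carries roughly 7 significant decimal digits, so adding values that differ by 8 or more orders of magnitude silently drops the smaller one.

import numpy as np

a = np.float32(1e8)   # large value
b = np.float32(1.0)   # smaller by 8 orders of magnitude

print(a + b == a)     # True: the addition had no effect, with no warning or error
print(np.float64(1e8) + np.float64(1.0) == np.float64(1e8))  # False: 64-bit keeps the difference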

NumPy is simple and light to use; it is suggested as a baseline for small to medium scale optics problems.
JAX and PyTorch are recommended for large-scale problems or when an optimization step is involved.
If you want parallelized computing across multiple devices (e.g., GPUs), JAX is ready for that.
Note, however, that JAX performs JIT compilation, so the first run takes considerably longer (see the sketch below).
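
A small illustration of the JIT warm-up cost mentioned above (plain JAX, independent of meent): the first call compiles the function, later calls reuse the compiled code.

import time
import jax
import jax.numpy as jnp

@jax.jit
def f(x):
    return (jnp.sin(x) ** 2).sum()

x = jnp.ones((1000, 1000))

t0 = time.time(); f(x).block_until_ready(); print('first call :', time.time() - t0)   # includes compilation
t0 = time.time(); f(x).block_until_ready(); print('second call:', time.time() - t0)   # compiled code is reused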

How to install

pip install meent

JAX and PyTorch are needed for advanced utilization.
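
For example, both backends are available from PyPI (CPU builds shown here; GPU or TPU builds of JAX and PyTorch follow their respective installation guides):

pip install meent
pip install torch        # PyTorch backend
pip install "jax[cpu]"   # JAX backend (CPU build)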

How to use

import meent

# backend 0 = Numpy
# backend 1 = JAX
# backend 2 = PyTorch

backend = 1
mee = meent.call_mee(backend=backend, ...)
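
As an illustrative convenience only (BACKENDS and call_mee_by_name below are not part of meent), the integer codes can be hidden behind backend names; the remaining keyword arguments of call_mee are problem-specific and are covered in the tutorials.

import meent

# hypothetical helper: map backend names to meent's integer codes
BACKENDS = {'numpy': 0, 'jax': 1, 'torch': 2}

def call_mee_by_name(name, **kwargs):
    return meent.call_mee(backend=BACKENDS[name], **kwargs)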

Tutorials

Jupyter notebooks are prepared in tutorials to give a brief introduction.

Citation

To cite this repository:

@software{Kim_Meent_Electromagnetic_simulation,
  author = {Kim, Yongha and Kim, Sanmun and Lee, Jinmyoung and Jung, Anthony W. and Kim, Seolho},
  license = {MIT},
  title = {{Meent: Electromagnetic simulation & optimization package in Python}},
  url = {https://github.com/kc-ml2/meent}
}

Reference

Will be updated.

[1] https://opg.optica.org/josa/abstract.cfm?uri=josa-71-7-811, Rigorous coupled-wave analysis of planar-grating diffraction
[2] https://opg.optica.org/josaa/abstract.cfm?uri=josaa-12-5-1068
[3] https://opg.optica.org/josaa/abstract.cfm?uri=josaa-12-5-1077
[4] https://opg.optica.org/josaa/abstract.cfm?uri=josaa-13-5-1019
[5] https://opg.optica.org/josaa/abstract.cfm?uri=josaa-13-4-779
[6] https://opg.optica.org/josaa/abstract.cfm?uri=josaa-13-9-1870
[7] https://empossible.net/emp5337/
[8] https://github.com/zhaonat/Rigorous-Coupled-Wave-Analysis (see also our fork: https://github.com/yonghakim/zhaonat-rcwa)
[9] https://arxiv.org/abs/2101.00901

meent's Issues

optimizer structure

import torch

class Optimizer:
    def __init__(self, optimizer, solver, target, lr=0.001):
        self.solver = solver
        self.target = target
        if not isinstance(target, torch.Tensor):
            self.target = torch.Tensor(self.target)
        self.target.requires_grad = True
        if isinstance(optimizer, str):
            # e.g. 'Adam' -> torch.optim.Adam
            optimizer = getattr(torch.optim, optimizer)
        self.optimizer = optimizer([self.target], lr=lr)

    def optimize(self, iterations=1000):
        # to_conv_mat and fourier_order are assumed to be provided by the surrounding meent code, as in the original issue
        for iteration in range(iterations):
            E_conv_all = to_conv_mat(self.solver.ucell, fourier_order)
            o_E_conv_all = to_conv_mat(1 / self.solver.ucell, fourier_order)
            de_ri, de_ti, _, _, _ = self.solver.solve(self.solver.wavelength, E_conv_all, o_E_conv_all)
            self.optimizer.zero_grad()
            loss = self.loss(de_ti)
            loss.backward()
            self.optimizer.step()
            print(loss)

    def loss(self, de_ti):
        # maximize the transmitted diffraction efficiency of order (3, 2)
        return -de_ti[3, 2]
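
A hypothetical usage sketch of the class above, under the issue's own assumptions (a solver object exposing ucell, wavelength, and solve; none of these names come from a documented meent interface):

# optimize the unit cell itself, so the gradient of the loss reaches the optimized tensor
opt = Optimizer('Adam', solver, target=solver.ucell, lr=1e-3)
opt.optimize(iterations=100)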

@yonghakim
