outs1der / concord

Tools for analysing thermonuclear bursts, and comparing with numerical models

License: GNU General Public License v3.0

Languages: Jupyter Notebook 27.56%, Python 72.44%

concord's Introduction

README

This repository contains code intended to simplify the analysis of astronomical X-ray data of thermonuclear (type-I) bursts.

Full documentation can be found at https://burst.sci.monash.edu/concord/

This code is under active development, but the v1.0.0 release is associated with a companion paper accepted by the Astrophysical Journal Supplement Series (see Galloway et al. 2022, also available at arXiv:2210.03598). A preprint of the paper is available in the doc subdirectory of this repository.

To get started, look at the Inferring burster properties Jupyter notebook.

What is this repository for?

  • Analysis of thermonuclear X-ray bursts
  • Reading in and plotting thermonuclear burst data and models
  • Performing model-observation comparisons

How do I get set up?

Use the included environment.yml file to set up a conda environment with the required dependencies:

conda env create -f environment.yml

This will create an environment called concord, which you can activate with:

conda activate concord

Then add concord to the local environment with:

python3 -m pip install .

You can then import the package and use its functions. Here's a very simple example, finding the peak luminosity of a burst from 4U 0513+40 measured by RXTE, as part of the MINBAR sample. The first part calculates the isotropic luminosity, neglecting the uncertainties in both the peak flux and the distance:

>>> import concord as cd
>>> import astropy.units as u
>>> F_pk, e_F_pk = 21.72, 0.6 # 1E-9 erg/cm^2/s bolometric; MINBAR #3443
>>> d = (10.32, 0.24, 0.20) # asymmetric errors from Watkins et al. 2015
>>> l_iso = cd.luminosity( F_pk, dist=d[0], isotropic=True )
WARNING:homogenize_params:no bolometric correction applied
>>> print (l_iso)
2.767771097997098e+38 erg / s

The second part takes into account the uncertainties in both the peak flux and the distance (including the asymmetric errors), and also includes the model-predicted effect of the high system inclination (>80 degrees):

>>> l_asym = cd.luminosity( (F_pk, e_F_pk), dist=d, burst=True, imin=80, imax=90, fulldist=True)
WARNING:homogenize_params:no bolometric correction applied
>>> lc = l_asym['lum'].pdf_percentiles([50, 50 - cd.CONF / 2, 50 + cd.CONF / 2]) 
>>> l_unit = 1e38*u.erg/u.s
>>> print ('''\nIsotropic luminosity is {:.2f}e38 erg/s
...   Taking into account anisotropy, ({:.2f}-{:.2f}+{:.2f})e38 erg/s'''.format(l_iso/l_unit, 
...                                                                             lc[0]/l_unit, 
...                                                 (lc[0]-lc[1])/l_unit, (lc[2]-lc[0])/l_unit))
Isotropic luminosity is 2.77e38 erg/s
  Taking into account anisotropy, (4.85-0.43+0.51)e38 erg/s

With the fulldist=True option, the function returns a dictionary with the Monte Carlo-generated distribution of the result (key lum) and all the intermediate quantities, as sketched below. Check the Inferring burster properties notebook for additional demonstrations of usage.
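For example (a sketch only; the exact set of dictionary keys may vary between versions), the returned distribution can be summarised with the standard astropy.uncertainty methods:

>>> sorted(l_asym.keys())       # list the returned quantities; 'lum' is the luminosity distribution
>>> l_asym['lum'].pdf_mean()    # mean of the Monte-Carlo samples
>>> l_asym['lum'].pdf_std()     # ...and their standard deviation
>>> l_asym['lum'].pdf_median()  # median, as obtained above via pdf_percentiles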

There are a number of sources for bursts to analyse; see the full documentation for details. Alternatively, you can define routines to read in your own model predictions, for example based on the KeplerBurst class.

Who do I talk to?

Why concord?

Because we're trying to achieve a "concordance" fit to a wide range of burst data. Plus, concord was cool.

concord's People

Contributors

outs1der, zacjohnston, hworpel


concord's Issues

No module named 'linfit'

On attempting import concord on a fresh install, I get the error:

ModuleNotFoundError: No module named 'linfit'

I found a comment in burstclass.py mentioning it's from https://github.com/djpine/linfit.git, but this should be stated as a dependency in the README or setup instructions.

It appears to simply be a least-squares linear regression, though; surely this could be done with numpy or some other common package?
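For reference, a weighted straight-line fit of the kind linfit appears to provide can be done with numpy alone; the snippet below is only a sketch of a possible replacement (with made-up data), not the call signature burstclass.py currently uses:

>>> import numpy as np
>>> x = np.array([1., 2., 3., 4.])         # illustrative data: y ~ slope*x + intercept
>>> y = np.array([2.1, 3.9, 6.2, 7.8])
>>> yerr = np.array([0.2, 0.2, 0.3, 0.3])  # 1-sigma uncertainties on y
>>> # weighted least-squares linear fit; cov=True also returns the covariance matrix
>>> (slope, intercept), cov = np.polyfit(x, y, 1, w=1/yerr, cov=True)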

Other missing dependencies

In addition to linfit, there are other dependencies not mentioned in the README or notebooks:

  • astroquery
  • chainconsumer

Another possibility is to provide an environment.yml file which conda can read to automatically set up an environment with the necessary dependencies (e.g. see my package flashbang), although I'm not yet sure how this works with packages only available via pip (astroquery).
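For what it's worth, conda environment files can include a pip: subsection for packages that are only on PyPI; a hypothetical sketch (the package list is illustrative only, not concord's actual requirements):

name: concord
channels:
  - conda-forge
dependencies:
  - python=3
  - numpy
  - astropy
  - pip
  - pip:
      - astroquery    # pip installs these inside the conda environment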

cannot save results easily using pickle

Trying to save the distributions doesn't work:

PicklingError: Can't pickle <class 'astropy.uncertainty.core.QuantityDistribution'>: attribute lookup QuantityDistribution on astropy.uncertainty.core failed

Need to find a workaround/alternative.
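One possible workaround (a sketch, not part of concord itself) is to extract the raw Monte-Carlo samples from the distribution and save those instead, rebuilding the Distribution on load:

>>> import numpy as np
>>> import astropy.units as u
>>> from astropy import uncertainty as unc
>>> # save the underlying samples as a plain array (here stripped to erg/s)
>>> np.save('lum_samples.npy', l_asym['lum'].distribution.to_value(u.erg / u.s))
>>> # ...and rebuild the distribution later
>>> lum = unc.Distribution(np.load('lum_samples.npy') * u.erg / u.s)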

Check Vizier table query for Lampe et al. 2016 bursts

The KeplerBurst class in burstclass.py offers a way to define bursts based on the Lampe et al. 2016 tables. However, I recall there was an issue with the opacity or something with these models; are they still valid to use? And are the parameters correct?
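For checking, a minimal astroquery sketch of the kind of Vizier query involved; the catalogue identifier below is an assumption (taken to be the Lampe et al. 2016 table) and should be verified against Vizier:

>>> from astroquery.vizier import Vizier
>>> v = Vizier(row_limit=-1)                 # remove the default 50-row limit
>>> tables = v.get_catalogs('J/ApJ/819/46')  # assumed catalogue ID for Lampe et al. (2016)
>>> tables[0].colnames                       # inspect the returned parameter columns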
