
fritz-marshal / fritz


Astronomical data platform for the Zwicky Transient Facility.

Home Page: https://fritz.science

License: Other

Languages: JavaScript 56.32%, Python 43.68%
Topics: astronomy, broker, marshal, time-domain, transient-astronomy, transients, variable-stars, ztf, ztf-ii

fritz's People

Contributors

acrellin, bfhealy, dannygoldstein, dmitryduev, guynir42, hombit, jadalilleboe, kmshin1397, mcoughlin, nabeelre, profjsb, scizen9, stefanv, tahumada, theodlz, thomasculino, virajkaram


fritz's Issues

Group sources page sorting

We need the ability to sort by, e.g., saved_date on the group sources page, e.g., https://fritz.science/group_sources/41.
Some further columns (optional in the menu) are also needed for display at the top level, the most critical being peak magnitude, current magnitude and when these magnitudes occurred, whether a spectrum exists, and the TNS name. (This is the basic info needed to determine what should be assigned for spectroscopy, classified, or sent to TNS.)

Properly handle exceptions in ZTFAlertHandler.post

From a conversation with @stefanv:

ZTFAlertHandler.post instantiates external handlers and calls them directly. We need to either check self.error or set a flag such as self._error_occurred when something goes wrong.

Also see here.

The better option is to factor certain parts of the logic out into utility functions outside of the handler class, and then import those. (They can still live in the handler class's .py file.)
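
Below is a minimal sketch of that refactor; the function and helper names are hypothetical (not the actual fritz/SkyPortal code), and the handler is reduced to its error-reporting flow:

```python
# Hypothetical sketch: keep the heavy lifting in a plain, testable function
# and have the handler catch exceptions and report them explicitly.

def save_alert_as_source(alert, group_ids):
    """Hypothetical utility living outside the handler class."""
    if "objectId" not in alert:
        raise ValueError("alert packet is missing objectId")
    # ... persistence logic would go here ...
    return {"obj_id": alert["objectId"], "group_ids": group_ids}


class ZTFAlertHandler:
    """Stand-in for the real tornado handler, reduced to the error flow."""

    def post(self, alert):
        try:
            result = save_alert_as_source(alert, group_ids=[1])
        except Exception as e:
            # surface the failure directly instead of checking self.error later
            return {"status": "error", "message": str(e)}
        return {"status": "success", "data": result}


if __name__ == "__main__":
    handler = ZTFAlertHandler()
    print(handler.post({"objectId": "ZTF20abcdef"}))
    print(handler.post({}))  # missing objectId -> error response
```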

Continuous integration

Get continuous integration up and running for fritz, using GitHub Actions.

The workflow would be (see the sketch after this list):

  1. Build images
  2. Launch containers
  3. Run unit tests
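
A rough sketch of what the CI job would run, written here as a Python driver for clarity (the actual setup would be a GitHub Actions workflow); ./fritz run --init appears elsewhere in this tracker, while the test subcommand name is an assumption:

```python
# Hypothetical CI driver: build images + launch containers, then run tests.
import subprocess
import sys

STEPS = [
    ["./fritz", "run", "--init"],  # steps 1-2: build images and launch containers
    ["./fritz", "test"],           # step 3: run unit tests (subcommand name assumed)
]

for step in STEPS:
    print("Running:", " ".join(step))
    result = subprocess.run(step)
    if result.returncode != 0:
        sys.exit(result.returncode)
```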

Fritz doesn't show up on localhost:9000

Hi,

I installed Fritz following all the instructions. The unit tests ran successfully without any errors. Launching Fritz also works, but I can't see the website on localhost:9000. Is there a different port on macOS?

Cheers,

Steve

`make`-based build system?

I wanted to start a discussion about transitioning from a docker-based build system to make as the default, following SP. The current build system takes quite a while to bring the repo up from scratch (~20 mins), and I think it can be done in roughly a tenth of that time. @dmitryduev, what are the blockers for this?

cc @stefanv @profjsb @acrellin

Stage-by-stage view of alert filter output

This would be very useful for debugging Kowalski filters.
The user would provide a filter_id and candid; Fritz would then take care of permissioning, construct the aggregation pipeline, and execute it stage by stage, displaying the intermediate output, similar to what MongoDB Compass does (see the sketch below).
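
A minimal sketch of the stage-by-stage execution, assuming direct pymongo access to the Kowalski alert collection (connection string, collection name, candid, and pipeline are placeholders):

```python
# Re-run the saved filter pipeline one prefix at a time so the intermediate
# output of each stage can be inspected, similar to MongoDB Compass.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # placeholder connection
collection = client["kowalski"]["ZTF_alerts"]      # placeholder collection

candid = 1234567890123456789                       # placeholder candid
pipeline = [                                       # the filter's saved pipeline
    {"$match": {"candid": candid}},
    {"$project": {"objectId": 1, "candidate.magpsf": 1}},
]

for n_stages in range(1, len(pipeline) + 1):
    docs = list(collection.aggregate(pipeline[:n_stages]))
    print(f"after stage {n_stages}: {len(docs)} document(s)")
    for doc in docs:
        print(doc)
```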

./fritz run --init fails

on 770f9d6

(skyportal) Dannys-MacBook-Pro:fritz danny$ ./fritz run --init

Welcome to SkyPortal v0.9.dev0+git20200605.129f1b3 (https://skyportal.io)

cd baselayer && git submodule update --init --remote
docker build -t skyportal/web .
Sending build context to Docker daemon  2.462MB
Step 1/12 : FROM ubuntu:18.04
 ---> ea4c82dcd15a
Step 2/12 : RUN apt-get update &&     apt-get install -y curl build-essential software-properties-common &&     curl -sL https://deb.nodesource.com/setup_12.x | bash - &&     apt-get update &&     apt-get -y upgrade &&     apt-get install -y python3 python3-venv python3-dev                        libpq-dev supervisor                        git nginx nodejs postgresql-client &&     apt-get clean &&     rm -rf /var/lib/apt/lists/* /tmp/* /var/tmp/* &&     useradd --create-home --shell /bin/bash skyportal
 ---> Using cache
 ---> 81b6381c419f
Step 3/12 : RUN python3.6 -m venv /skyportal_env &&         bash -c "source /skyportal_env/bin/activate &&     pip install --upgrade pip"
 ---> Using cache
 ---> fb6f9d31ad57
Step 4/12 : ENV LC_ALL=C.UTF-8
 ---> Using cache
 ---> 633edc1c4820
Step 5/12 : ENV LANG=C.UTF-8
 ---> Using cache
 ---> 2f27ae8c9eeb
Step 6/12 : ADD . /skyportal
 ---> Using cache
 ---> c472611f7a8a
Step 7/12 : WORKDIR /skyportal
 ---> Using cache
 ---> cc2c88329178
Step 8/12 : RUN bash -c "    source /skyportal_env/bin/activate &&         make -C baselayer paths &&     (make -f baselayer/Makefile baselayer dependencies || make -C baselayer dependencies) &&     (make -f baselayer/Makefile baselayer fill_conf_values || make -C baselayer fill_conf_values)"
 ---> Using cache
 ---> 3c45aa466836
Step 9/12 : RUN bash -c "        (make -f baselayer/Makefile bundle || make -c baselayer bundle) &&     rm -rf node_modules &&         chown -R skyportal.skyportal /skyportal_env &&     chown -R skyportal.skyportal /skyportal &&         cp docker.yaml config.yaml"
 ---> Using cache
 ---> 4328d9ce10e7
Step 10/12 : USER skyportal
 ---> Using cache
 ---> 166296661610
Step 11/12 : EXPOSE 5000
 ---> Using cache
 ---> 8968075a56d8
Step 12/12 : CMD bash -c "source /skyportal_env/bin/activate &&              (make log &) &&              make run_production"
 ---> Using cache
 ---> 8e2f64731309
Successfully built 8e2f64731309
Successfully tagged skyportal/web:latest
Creating network "skyportal_default" with the default driver
Creating skyportal_web_1 ... done
Creating skyportal_db_1  ... done

Welcome to SkyPortal v0.9.dev0 (https://skyportal.io)

Checking system dependencies:
[✓] nginx >= 1.7                  [  1.14.0]
[✓] psql >= 9.6                   [   10.12]
[✓] node (npm) >= 5.8.0           [  6.14.4]

[✓] Baselayer installed inside of app
--------------------
[✓] Verifying Python package dependencies
[✓] Silently executing: baselayer/tools/check_js_deps.sh
  Config files: baselayer/config.yaml.defaults
                config.yaml.defaults
                config.yaml
[✓] Connecting to database skyportal
[·] Dropping all tablesDropping tables on database skyportal
[✓] Dropping all tables
[·] Creating tablesCreating tables on database skyportal
Refreshed tables:
 - acls
 - roles
 - role_acls
 - users
 - tokens
 - token_acls
 - user_roles
 - usersocialauths
 - nonces
 - associations
 - codes
 - partials
 - groups
 - group_users
 - streams
 - stream_groups
 - objs
 - filters
 - candidates
 - sources
 - sourceviews
 - telescopes
 - group_telescopes
 - instruments
 - comments
 - photometry
 - spectra
 - thumbnails
 - followuprequests
[✓] Creating tables
    - acls
    - roles
    - role_acls
    - users
    - tokens
    - token_acls
    - user_roles
    - usersocialauths
    - nonces
    - associations
    - codes
    - partials
    - groups
    - group_users
    - streams
    - stream_groups
    - objs
    - filters
    - candidates
    - sources
    - sourceviews
    - telescopes
    - group_telescopes
    - instruments
    - comments
    - photometry
    - spectra
    - thumbnails
    - followuprequests
[✓] Creating permissions
[✓] Creating dummy users
[✓] Creating token
[·] Launching web app & executing API calls
Waiting for server to appear at http://localhost:5000...
App running - continuing with API calls
[✓] Creating dummy groups & filter & adding users
[✓] Creating dummy telescopes & instruments
[✓] Creating dummy candidates & sources
[✓] Launching web app & executing API calls
Traceback (most recent call last):
  File "./fritz", line 318, in <module>
    getattr(sys.modules[__name__], args.command)(args)
  File "./fritz", line 150, in run
    token = generate_skyportal_token()
  File "./fritz", line 64, in generate_skyportal_token
    universal_newlines=True,
  File "/anaconda2/envs/skyportal/lib/python3.6/subprocess.py", line 403, in run
    with Popen(*popenargs, **kwargs) as process:
TypeError: __init__() got an unexpected keyword argument 'capture_output'
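
The TypeError suggests generate_skyportal_token calls subprocess.run with capture_output=True, which was only added in Python 3.7, while the environment above runs Python 3.6. A minimal sketch of a 3.6-compatible call (the command is a placeholder, not the actual token-generation command):

```python
# capture_output= requires Python 3.7+; on 3.6 pass stdout/stderr explicitly.
import subprocess

cmd = ["echo", "hello"]  # placeholder command

result = subprocess.run(
    cmd,
    stdout=subprocess.PIPE,
    stderr=subprocess.PIPE,
    universal_newlines=True,
)
print(result.stdout)
```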

PS1 on scanning page

I know there have been some now-closed issues on this, but we still need PS1 cutouts, or links to PS1 cutouts, on the scanning page specifically.

Additional column options for group sources

Per @cfremling:
Some further columns (optional in the menu) are also needed for display at the top level, the most critical being peak magnitude, current magnitude and when these magnitudes occurred, whether a spectrum exists, and the TNS name. (This is the basic info needed to determine what should be assigned for spectroscopy, classified, or sent to TNS.)

Pulled out from #100, as that one was mainly about sorting on the group sources page.

Favorites page

To keep track of which sources will need, e.g., spectroscopic follow-up in the future, an essential piece of functionality on the GROWTH marshal was the ability to add and remove sources from a private, per-user favorites page. This was a great help when, for example, a scanner was not sure whether a source was a CV: one could add the source to favorites, then come back in a few days to check how the light curve evolved before triggering SEDM. If it was indeed a CV, it could be removed from favorites with one click.
This page could have the same basic look as the group_sources page.

API Documentation

Currently the fritz-specific endpoints are not documented. We should auto-generate API docs for these as we do with the normal skyportal endpoints. Users will make heavy use of many of these endpoints, especially /api/alerts.

This is related to the issue of overriding (rather than introducing totally new) skyportal features that is described in #63.

Increase axis label font size for ZTF Vega Plot

I think the y-axis label specifically (the mag) is a little hard to read with the small, bold font. It looks squished, to the point where the 'm' and 'a' are almost overlapping (at least on my screen).

[screenshot of the squished y-axis label]

Should just be a matter of adding a "titleFontSize" to the axis specs. Docs here
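
A minimal sketch of what that change could look like, written as a Python dict mirroring the Vega/Vega-Lite axis spec (the encoding fields are placeholders; the titleFontSize addition is the only point):

```python
# Hypothetical spec fragment as a Python dict: bump the y-axis title font
# size so the "mag" label no longer looks squished.
spec = {
    "mark": "point",
    "encoding": {
        "x": {"field": "mjd", "type": "quantitative"},
        "y": {
            "field": "mag",
            "type": "quantitative",
            "scale": {"reverse": True},  # magnitudes: brighter is up
            "axis": {"title": "mag", "titleFontSize": 14},  # larger title font
        },
    },
}
```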

enter specific times for candidates when scanning

@acrellin At the moment it's only possible to enter "integer" date ranges when scanning for candidates.

  1. It is not clear whether those times correspond to observation or ingestion (and whether they are UT, PDT, PST, etc.).
  2. More importantly, more granularity is needed so that candidates can be scanned in roughly hour-long chunks; specifying hh:mm:ss in addition to the date would be useful.

Need module jwt for ./fritz run --init

I am working on a Mac. In order to work with Fritz, we had to set up a conda environment with Python 3.7, create an SSH key, and download Docker. We had a problem with YAML, where we had to pip install pyYAML rather than YAML. Now, after pip install jwt (which was missing), we are running into the following error when running ./fritz run --init:

(fritz) Saturn:fritz r.saturn$ ./fritz run --init
Checking system dependencies:
[✓] python >= 3.7 [ 3.7.9]

Traceback (most recent call last):
  File "./fritz", line 410, in <module>
    getattr(sys.modules[__name__], args.command)(args)
  File "./fritz", line 141, in run
    check_config(cfg='fritz.defaults.yaml', yes=args.yes)
  File "./fritz", line 84, in check_config
    jwt_secret=config['kowalski']['server']['JWT_SECRET_KEY'],
  File "./fritz", line 33, in generate_kowalski_token
    jwt_token = jwt.encode(payload, jwt_config['JWT_SECRET'], jwt_config['JWT_ALGORITHM'])
AttributeError: module 'jwt' has no attribute 'encode'
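
The module 'jwt' has no attribute 'encode' error typically means the PyPI package named jwt was installed rather than PyJWT; it is PyJWT (pip install pyjwt) that provides the module-level jwt.encode the fritz CLI calls. A minimal check, with a placeholder payload and secret:

```python
# With PyJWT installed (pip install pyjwt), the import and call below work;
# the unrelated PyPI package named "jwt" does not expose jwt.encode.
import jwt  # provided by PyJWT

token = jwt.encode(
    {"user_id": "admin"},   # placeholder payload
    "placeholder-secret",   # placeholder JWT_SECRET
    algorithm="HS256",
)
print(token)
```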

Access to Spectra information: bug or (bad) feature.

There is a data access issue with the spectra that is most likely related to data rights, but maybe not entirely.

There are three cases:

  1. There are sources where I can see the source and see the spectrum on the source page, but cannot access the data.
    Example: https://fritz.science/source/ZTF20achvlbs, but https://fritz.science/api/sources/ZTF20achvlbs/spectra is empty (success but no data; see the API sketch at the end of this issue).
  2. There are sources where I know there is an SEDM spectrum; I can see the source, but I cannot see the spectrum and cannot access the data.
    Example: https://fritz.science/source/ZTF20acpevli
  3. A user (Jesper) triggered an SEDM spectrum. He can see the source and the spectrum but cannot access the data (https://fritz.science/api/sources/ZTF20acpevli/spectra returns status: "success", data: []). He cannot share the spectrum with me, even though he requested the SEDM observation and is the SEDM czar.

Note that there are cases where I can see and access everything (e.g. ZTF20acmzoxo).

This is a pretty frustrating problem that many ZTF members have come back to me about (because I'm the SEDM pipeline lead and because of ztfquery.fritz).

Is this:

  • a fritz issue?
  • a data access policy issue?

All partnership SEDM data should be visible to all ZTF partners.
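
For reference, a minimal sketch of how the empty-spectra responses above can be reproduced through the API, assuming the standard SkyPortal token header (the token value is a placeholder):

```python
# Query the spectra endpoint for a source; case 1 above corresponds to
# status == "success" with an empty data list.
import requests

token = "PLACEHOLDER-API-TOKEN"
obj_id = "ZTF20achvlbs"

response = requests.get(
    f"https://fritz.science/api/sources/{obj_id}/spectra",
    headers={"Authorization": f"token {token}"},
)
payload = response.json()
print(payload["status"], payload.get("data"))
```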

docs for fritz-specific api endpoints

  • Generate docs for fritz-specific endpoints in CI (requires new fritz-extension infrastructure)
  • Automatically deploy docs to docs.fritz.science (requires fritz CI)

git clone only clones part of the repo

Hi,

I am trying to install Fritz. I pulled the repo with git clone https://github.com/fritz-marshal/fritz.git && cd fritz. When I do pip install -r requirements.txt, I get the following error:

ERROR: Could not open requirements file: [Errno 2] No such file or directory: 'kowalski/requirements.txt'

It turns out that the cloning did not download everything. How can I pull the entire package?

Cheers,

Steve

Headers of spectra in Fritz database

We need the .fits headers included as comments when downloading spectra from Fritz.
This info is critical for various analyses (e.g., plots as a function of exposure time), for getting the exact JD of the middle of the exposure, for exact instrument setups (gratings, grisms), etc.
While it would be nice to have some of this info properly ingested, it is just as important to be able to leave a .fits or .ascii header alone and retain it through uploading and later downloading. If I download 1000 spectra next year, I want the untouched output files from the SEDM, P200, or whatever pipeline, not modified in any way from their original state.
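
A minimal sketch of the idea, assuming astropy is available (file names are placeholders): read the original FITS header verbatim and write each card out as a comment line ahead of the spectrum, so the pipeline output survives upload and download untouched.

```python
# Preserve the original FITS header as comment lines in the downloaded file.
from astropy.io import fits

with fits.open("spectrum.fits") as hdul:      # placeholder input file
    header = hdul[0].header

with open("spectrum_with_header.ascii", "w") as out:
    for card in header.cards:
        out.write(f"# {card.image}\n")        # each 80-char header card, unmodified
    # ... wavelength/flux rows would follow here ...
```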

add link to ZTF alerts on Source and Candidate pages

@dmitryduev

I was about to open a pull request that overrides the source and candidate pages as Fritz extensions, adding a button that links to the ZTF alerts page. However, I realized that this would cause the Source and Candidate pages in fritz to fall out of sync with what is in SkyPortal. Any ideas on how to override components that are already in SkyPortal with Fritz-specific features without having the two go out of sync?

Cc @stefanv

websockets unresponsive on deployed fritz?

Sometimes on fritz I lose all connection with the websocket, but don't see any warning notification on the page. Examples (reproducible):

  1. Logged in as provisioned_admin on private.caltech.edu, on https://private.caltech.edu/share_data/ZTF20abonvte: when I click on points 2862, 2864, 2866, 2880, and 2881 and share with Redshift Completeness Factor and Infant SNe, I immediately see a notification that the sharing was successful, but I have to reload the page to see the groups updated in the MUI datatable (rather than seeing them update immediately via WS).

  2. Logged in as provisioned_admin on https://private.caltech.edu/source/ZTF20abonvte: when I classify as DM annihilation under sitewide taxonomy 0.1.1 with probability 1.0, I immediately see a notification that sharing was successful, but I have to reload the page to see the classification update on the page (rather than seeing it update immediately via WS).
