alercebroker / alerce_client

🐍 ALeRCE Python Client

Home Page: https://alerce.readthedocs.io/en/latest/index.html

License: MIT License

Python 100.00%
alerce astronomy python ztf lsst

alerce_client's Introduction


Welcome to ALeRCE Python Client.

ALeRCE client is a Python library to interact with ALeRCE services and databases.

For full documentation, please visit the official documentation at https://alerce.readthedocs.io/en/latest/index.html.

Installing ALeRCE Client

pip install alerce

Or clone the repository and install from source:

git clone https://github.com/alercebroker/alerce_client.git
cd alerce_client
python setup.py install

Usage

from alerce.core import Alerce
alerce = Alerce()

dataframe = alerce.query_objects(
    classifier="lc_classifier", 
    class_name="LPV", 
    format="pandas"
)

detections = alerce.query_detections("ZTF20aaelulu", format="pandas", sort="mjd")

magstats = alerce.query_magstats("ZTF20aaelulu")

query='''
SELECT
    oid, sgmag1, srmag1, simag1, szmag1, sgscore1
FROM
    ps1_ztf
WHERE
    oid = 'ZTF20aaelulu'
'''
detections_direct = alerce.send_query(query, format="pandas")

Configuration

By default, the Alerce object is ready to use without any external configuration. If you need to adjust any parameters, you can configure the Alerce object in different ways.

At the client object initialization

You can pass parameters to the Alerce class constructor to set the parameters for API connection.

For example, to use the ZTF API on localhost:5000 and the DB API on localhost:5050:

alerce = Alerce(ZTF_API_URL="http://localhost:5000", ZTF_DB_API_URL="http://localhost:5050")

From a dictionary object

You can pass parameters to the Alerce class from a dictionary object.

my_config = {
    "ZTF_API_URL": "http://localhost:5000",
    "ZTF_DB_API_URL": "http://localhost:5050"
}
alerce = Alerce()
alerce.load_config_from_object(my_config)

Contributing

Each pull request must have at least one commit following the Angular commit guidelines so that semantic versioning works.
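For example, a commit message in the Angular style uses the form type(scope): subject; a hypothetical message might look like:

fix(client): handle empty responses in query_objects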

alerce_client's People

Contributors

ashuenchuleo, cvalenzu, dirodriguezm, fforster, ignacioreyes, javierarredondo, pcastelln


Forkers

kdesoto-astro

alerce_client's Issues

update library dependencies

I cannot install alerce at present because it requires old libraries that conflict with various other programs I have. This absolutely needs to be fixed before public release. Ideally, this should be more flexible to versions. Or alternately provide complete docker-like environment.

Update client to use API with mongo functionalities

The client must be able to use the new mongo support on the API. An interface should be drafted and proposed to the team.

Interface proposals

Proposal 1:

from alerce.core import Alerce
alerce = Alerce()

query = Alerce.mongoquery(collection='detections')
query =  query.find({'oid' : 'ZTF1'})
query = query.sort('mjd')
query = query.limit(10)

detections = query.execute()

for detection in detections:
    print(detection)

[Bug]I can't import Alerce

Describe the bug
I can't import the alerce client

To Reproduce
from alerce.core import Alerce

Expected behavior

Screenshots
(screenshot attached)

Environment (please complete the following information):
  • Python version: 3.8.5

Additional context
pip show alerce displays the following:

Name: alerce
Version: 1.0.1
Summary: ALeRCE Client
Home-page: UNKNOWN
Author: ALeRCE Team
Author-email: [email protected]
License: UNKNOWN
Location: /home/javier/anaconda3/lib/python3.8/site-packages
Requires: requests, pandas, astropy
Required-by:

Install Problem with P4J

(screenshot attached)

Describe the bug
Attempting to run the Python notebook ALeRCE_Other_Periodograms.ipynb. The pip install and import of P4J fails with this error: "note: This error originates from a subprocess, and is likely not a problem with pip. error: metadata-generation-failed".

To Reproduce
Steps to reproduce the behavior:

  1. Open the notebook in Anaconda
  2. Run the first 5 cells in sequence
  3. See error

Expected behavior
The install and import of P4J should have happened normally.


Environment (please complete the following information):

  • OS: MacOS Sonoma 14.4.1
  • Browser chrome
  • Version Version 124.0.6367.93 (Official Build) (arm64)
  • Python version 3.11
  • If possible the output of pip freeze to check the python environment.

Additional context
I have searched for this error on Google and GitHub and there is no solution. There is a similar error report on your GitHub, but it has not been assigned since it was created on Feb 22.

[Feature] Change the client version moving to production

Is your feature request related to a problem? Please describe.
The current version in setup.py is 0.0.1, so we need to publish it to PyPI with a proper version.

Describe the solution you'd like
Change it to 1.0.x, and maybe also have a 1.0.x-dev.

[Bug] query_objects returning fields class and classifier

Describe the bug
query_objects returns two fields that shouldn't exist.

To Reproduce
alerce.query_objects(oid="ZTF19abahvdh", format='pandas')

Expected behavior
These fields shouldn't be there; they confuse users.

Screenshots
(screenshot attached)

Environment (please complete the following information):

  • OS: Ubuntu

[programming] How do I get the Object ID given RA and Dec (for example, with catshtm_conesearch(ra, dec, radius))?

I'm trying to use the API to input the RA and Dec of an object and return the OID.

For example, when using,

alerce.catshtm_conesearch(ra, dec, radius)

it returns a lot of information that includes the flux, distance, catalog name, etc., but I can't find a way for it to return the OID. Is there a way to do that with this method? Is there another method I can use to get this?

Thanks!
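A possible direction, sketched here rather than as an official answer: query_objects also accepts ra, dec and radius, and its results include the object identifier, so a cone search through it may return OIDs directly. The radius is assumed here to be in arcseconds, following the radius*3600 conversion seen in other issues on this page, and the "oid" column name is an assumption.

from alerce.core import Alerce

alerce = Alerce()

# ra/dec in degrees; radius assumed to be in arcseconds
nearby = alerce.query_objects(ra=220.0, dec=0.0, radius=30, format="pandas")
print(nearby["oid"].tolist())  # assumes the result table has an "oid" column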

Changes for production

[Bug] alerce.query_detections running very slow

Describe the bug
I am using the API to download light curves for a sample of sources. Each source light curve is taking several minutes to download. I am downloading in json format. I've tried the query_non_detections, query_detections, and query_lightcurve, with json and pandas format. The problem is the same. Please could you help?

To Reproduce
Steps to reproduce the behavior:

Using vscode with the latest alerce client installed.

import json
import os

from alerce.core import Alerce

alerce = Alerce()
try:
    # folderpath and oid are defined elsewhere in my script
    with open(f'{folderpath}/{oid}.json', 'w') as outfile:
        json.dump(alerce.query_detections(oid, format="json"), outfile)
except Exception as e:
    os.remove(f'{folderpath}/{oid}.json')
    print(e)

Expected behavior
Each light curve should download within a second or two.

Environment (please complete the following information):

  • OS: macOS Ventura
  • Browser: N/A
  • Version: 13.2.1
  • Python version: 3.9.16
  • If possible the output of pip freeze to check the python environment.
    alerce==1.2.0
    anyio @ file:///opt/concourse/worker/volumes/live/485b0f52-1188-482a-6285-65a36c8fa8a6/volume/anyio_1644481714856/work/dist
    appnope @ file:///home/conda/feedstock_root/build_artifacts/appnope_1649077682618/work
    argon2-cffi @ file:///opt/conda/conda-bld/argon2-cffi_1645000214183/work
    argon2-cffi-bindings @ file:///opt/concourse/worker/volumes/live/42cf1b28-e71f-45ed-47b2-50f828088636/volume/argon2-cffi-bindings_1644569709119/work
    astropy @ file:///Users/runner/miniforge3/conda-bld/astropy_1673376509203/work
    astroquery==0.4.6
    asttokens @ file:///home/conda/feedstock_root/build_artifacts/asttokens_1670263926556/work
    attrs @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_33k1uces4n/croot/attrs_1668696162258/work
    backcall @ file:///home/conda/feedstock_root/build_artifacts/backcall_1592338393461/work
    backports.functools-lru-cache @ file:///home/conda/feedstock_root/build_artifacts/backports.functools_lru_cache_1618230623929/work
    beautifulsoup4 @ file:///home/conda/feedstock_root/build_artifacts/beautifulsoup4_1675252249248/work
    bleach @ file:///opt/conda/conda-bld/bleach_1641577558959/work
    Bottleneck @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_29949159-f86f-474b-bc1f-aaa1e0e222b4ofusifik/croots/recipe/bottleneck_1657175564045/work
    brotlipy @ file:///Users/runner/miniforge3/conda-bld/brotlipy_1666764741656/work
    certifi @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_477u68wvzm/croot/certifi_1671487773341/work/certifi
    cffi @ file:///Users/runner/miniforge3/conda-bld/cffi_1671179477998/work
    charset-normalizer @ file:///home/conda/feedstock_root/build_artifacts/charset-normalizer_1661170624537/work
    comm @ file:///home/conda/feedstock_root/build_artifacts/comm_1670575068857/work
    cryptography @ file:///Users/runner/miniforge3/conda-bld/cryptography_1637687153321/work
    cycler @ file:///tmp/build/80754af9/cycler_1637851556182/work
    debugpy @ file:///Users/runner/miniforge3/conda-bld/debugpy_1674522483834/work
    decorator @ file:///home/conda/feedstock_root/build_artifacts/decorator_1641555617451/work
    defusedxml @ file:///tmp/build/80754af9/defusedxml_1615228127516/work
    entrypoints @ file:///home/conda/feedstock_root/build_artifacts/entrypoints_1643888246732/work
    exceptiongroup==1.1.0
    executing @ file:///home/conda/feedstock_root/build_artifacts/executing_1667317341051/work
    fastjsonschema @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_b5c1gee32t/croots/recipe/python-fastjsonschema_1661368622875/work
    feets==0.4
    fonttools==4.25.0
    html5lib @ file:///home/conda/feedstock_root/build_artifacts/html5lib_1592930327044/work
    idna @ file:///home/conda/feedstock_root/build_artifacts/idna_1663625384323/work
    imbalanced-learn @ file:///home/conda/feedstock_root/build_artifacts/imbalanced-learn_1672235740107/work
    importlib-metadata @ file:///home/conda/feedstock_root/build_artifacts/importlib-metadata_1672612343532/work
    iniconfig==2.0.0
    ipykernel @ file:///Users/runner/miniforge3/conda-bld/ipykernel_1673894854828/work
    ipython @ file:///Users/runner/miniforge3/conda-bld/ipython_1672758654845/work
    ipython-genutils @ file:///tmp/build/80754af9/ipython_genutils_1606773439826/work
    jaraco.classes @ file:///home/conda/feedstock_root/build_artifacts/jaraco.classes_1667024629799/work
    jedi @ file:///home/conda/feedstock_root/build_artifacts/jedi_1669134318875/work
    Jinja2 @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_6adj7x0ejx/croot/jinja2_1666908137966/work
    joblib @ file:///home/conda/feedstock_root/build_artifacts/joblib_1663332044897/work
    jsonschema @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_d832da7jx3/croots/recipe/jsonschema_1663375475386/work
    jupyter-server @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_031akrjssy/croot/jupyter_server_1671707631142/work
    jupyter_client @ file:///home/conda/feedstock_root/build_artifacts/jupyter_client_1673615989977/work
    jupyter_core @ file:///Users/runner/miniforge3/conda-bld/jupyter_core_1674530039210/work
    jupyterlab-pygments @ file:///tmp/build/80754af9/jupyterlab_pygments_1601490720602/work
    kaleido==0.2.1
    keyring @ file:///Users/runner/miniforge3/conda-bld/keyring_1671728304453/work
    kiwisolver @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_ca2945cc-8f2f-407b-8e24-f05d29ee2f4fvto5sdlk/croots/recipe/kiwisolver_1653292053344/work
    lasair==0.0.5
    lxml @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_1902c961-4bd2-4871-a3c5-70b7317a6521kpj7nz2o/croots/recipe/lxml_1657545138937/work
    MarkupSafe @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_d4a9444f-bd4c-4043-b47d-cede33979b0fve7bm42r/croots/recipe/markupsafe_1654597878200/work
    matplotlib @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_croot-hkd3cnd2/matplotlib-suite_1647506470899/work
    matplotlib-inline @ file:///home/conda/feedstock_root/build_artifacts/matplotlib-inline_1660814786464/work
    mistune @ file:///opt/concourse/worker/volumes/live/4217afd5-dad1-438d-6f79-e4992ccda0e5/volume/mistune_1607364880245/work
    mkl-fft==1.3.1
    mkl-random @ file:///opt/concourse/worker/volumes/live/0cda23d8-7460-44b2-7e5d-3c76a8a0ca7e/volume/mkl_random_1626186083266/work
    mkl-service==2.4.0
    mock==5.0.1
    more-itertools @ file:///home/conda/feedstock_root/build_artifacts/more-itertools_1677514956219/work
    munkres==1.1.4
    nbclassic @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_7c3czojxw1/croot/nbclassic_1676902906096/work
    nbclient @ file:///opt/concourse/worker/volumes/live/fcea0efc-2a08-48fd-5c55-85ef78e0ea28/volume/nbclient_1650308406463/work
    nbconvert @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_8fyzuglni_/croot/nbconvert_1668450649428/work
    nbformat @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_2daun1fill/croot/nbformat_1670352339504/work
    nest-asyncio @ file:///home/conda/feedstock_root/build_artifacts/nest-asyncio_1664684991461/work
    notebook @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_0cdyriuhi_/croot/notebook_1668179888986/work
    notebook_shim @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_e9s6zsmlb7/croot/notebook-shim_1668160584892/work
    numexpr @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_cef3ah6r8w/croot/numexpr_1668713880672/work
    numpy @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_e3f85f54-0572-4c0e-b190-4bc4766fc3fenewxarhq/croots/recipe/numpy_and_numpy_base_1652801682879/work
    packaging @ file:///home/conda/feedstock_root/build_artifacts/packaging_1673482170163/work
    pandas==1.5.2
    pandocfilters @ file:///opt/conda/conda-bld/pandocfilters_1643405455980/work
    parso @ file:///home/conda/feedstock_root/build_artifacts/parso_1638334955874/work
    patsy @ file:///home/conda/feedstock_root/build_artifacts/patsy_1665356157073/work
    pexpect @ file:///home/conda/feedstock_root/build_artifacts/pexpect_1667297516076/work
    pickleshare @ file:///home/conda/feedstock_root/build_artifacts/pickleshare_1602536217715/work
    Pillow==9.0.1
    platformdirs @ file:///home/conda/feedstock_root/build_artifacts/platformdirs_1672264874562/work
    plotly @ file:///opt/conda/envs/env/conda-bld/plotly_1674482700049/work/packages/python/plotly/dist/plotly-5.13.0.tar.gz
    pluggy==1.0.0
    prometheus-client @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_19kjbndib7/croots/recipe/prometheus_client_1659455105394/work
    prompt-toolkit @ file:///home/conda/feedstock_root/build_artifacts/prompt-toolkit_1670414775770/work
    psutil @ file:///Users/runner/miniforge3/conda-bld/psutil_1667886143121/work
    ptyprocess @ file:///home/conda/feedstock_root/build_artifacts/ptyprocess_1609419310487/work/dist/ptyprocess-0.7.0-py2.py3-none-any.whl
    pure-eval @ file:///home/conda/feedstock_root/build_artifacts/pure_eval_1642875951954/work
    pycparser @ file:///home/conda/feedstock_root/build_artifacts/pycparser_1636257122734/work
    pyerfa @ file:///Users/runner/miniforge3/conda-bld/pyerfa_1666820916637/work
    Pygments @ file:///home/conda/feedstock_root/build_artifacts/pygments_1672682006896/work
    pyOpenSSL @ file:///home/conda/feedstock_root/build_artifacts/pyopenssl_1608055815057/work
    pyparsing @ file:///tmp/build/80754af9/pyparsing_1635766073266/work
    pyrsistent @ file:///opt/concourse/worker/volumes/live/76cffa60-bd33-4155-4e83-ea03c38b1294/volume/pyrsistent_1636111020441/work
    PySocks @ file:///home/conda/feedstock_root/build_artifacts/pysocks_1661604839144/work
    pytest==7.2.2
    python-dateutil @ file:///home/conda/feedstock_root/build_artifacts/python-dateutil_1626286286081/work
    pytz @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_ddzpsmm2_f/croot/pytz_1671697430473/work
    pyvo @ file:///home/conda/feedstock_root/build_artifacts/pyvo_1664228312964/work
    PyYAML @ file:///Users/runner/miniforge3/conda-bld/pyyaml_1666772473931/work
    pyzmq @ file:///Users/runner/miniforge3/conda-bld/pyzmq_1673612734897/work
    requests @ file:///home/conda/feedstock_root/build_artifacts/requests_1673863902341/work
    scikit-learn @ file:///Users/runner/miniforge3/conda-bld/scikit-learn_1670523693646/work
    scipy @ file:///opt/concourse/worker/volumes/live/9284487f-601d-4bc3-5556-535c4949d341/volume/scipy_1641557769615/work
    seaborn @ file:///tmp/build/80754af9/seaborn_1629307859561/work
    Send2Trash @ file:///tmp/build/80754af9/send2trash_1632406701022/work
    six @ file:///home/conda/feedstock_root/build_artifacts/six_1620240208055/work
    sniffio @ file:///opt/concourse/worker/volumes/live/38ca9e9e-09d1-4d43-5a0f-b546422e7807/volume/sniffio_1614030472707/work
    soupsieve @ file:///home/conda/feedstock_root/build_artifacts/soupsieve_1658207591808/work
    stack-data @ file:///home/conda/feedstock_root/build_artifacts/stack_data_1669632077133/work
    statsmodels @ file:///Users/runner/miniforge3/conda-bld/statsmodels_1667586084161/work
    tenacity @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_15d3bbab-3059-4048-adcb-986fb2669dd5nujdx5qd/croots/recipe/tenacity_1657899116644/work
    terminado @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_18_p3gbeio/croot/terminado_1671751835656/work
    threadpoolctl @ file:///home/conda/feedstock_root/build_artifacts/threadpoolctl_1643647933166/work
    tinycss2 @ file:///private/var/folders/sy/f16zz6x50xz3113nwtb9bvq00000gp/T/abs_56dshjmms6/croot/tinycss2_1668168824483/work
    tomli==2.0.1
    tornado @ file:///Users/runner/miniforge3/conda-bld/tornado_1666788801701/work
    traitlets @ file:///home/conda/feedstock_root/build_artifacts/traitlets_1673359992537/work
    typing_extensions @ file:///home/conda/feedstock_root/build_artifacts/typing_extensions_1665144421445/work
    urllib3 @ file:///home/conda/feedstock_root/build_artifacts/urllib3_1673452138552/work
    wcwidth @ file:///home/conda/feedstock_root/build_artifacts/wcwidth_1673864653149/work
    webencodings==0.5.1
    websocket-client @ file:///opt/concourse/worker/volumes/live/5baed9cd-40fb-4fbe-6721-9568cdd0f2d7/volume/websocket-client_1614804245073/work
    xgboost==1.7.1
    zipp @ file:///home/conda/feedstock_root/build_artifacts/zipp_1677313463193/work

Additional context
Add any other context about the problem here.

[Bug] Repeated object in the object table

Describe the bug
alerce.query_objects(oid="ZTF18aahvndq", format='pandas') returns two rows, which differ by class and probability.

Expected behavior
Only one row should be returned

Additional context
Either fix the problem or remove the columns class, classifier and probability from object.
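A client-side workaround, sketched here under the assumption that the pandas result contains an "oid" column, is to collapse the duplicated rows until the table itself is fixed:

from alerce.core import Alerce

alerce = Alerce()
df = alerce.query_objects(oid="ZTF18aahvndq", format="pandas")
df = df.drop_duplicates(subset="oid")  # keep only the first of the repeated rows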

[Bug] plot_stamp not working

Describe the bug
When I run plot_stamp, it crashes.

To Reproduce
Steps to reproduce the behavior:
after loading the client, run
alerce.plot_stamp("ZTF21aaqftuq")

Expected behavior
We should see the first available stamps.

Screenshots
(screenshot attached)

[Bug] Unknown API error

Hi,

I have accessed the alerce_client with no problems for several months and have had no issues. Today I have received the following error:

APIError: {'Error code': 504, 'Message': 'Unknown API error.', 'Data': 'Unknown API error.'}

I've installed the alerce client into my environment as described, then ran the following code:

from alerce.core import Alerce
alerce = Alerce()

that part is fine, then I run this example code:

oids = ["ZTF18accqogs", "ZTF19aakyhxi", "ZTF19abyylzv", "ZTF19acyfpno"]
objects = alerce.query_objects(oid=oids, format="pandas")

This is where I get the error. I've tried several other functions and the same issue occurs. I've also tried this in several other environments and the same issue occurs.

Environment (please complete the following information):

  • OS: [MacOS 12.3.1]
  • Python version [3.8.8 and 3.10.4]

Is there a current issue with the client or is this just on my end? Please could you help.

Thanks

Dharmesh

query_lightcurve in pandas format giving weird result

The result of the following

data = alerce.query_lightcurve("ZTF18abbuksn", format="pandas")

is a DataFrame with one row and two columns (detections and non_detections):

Out[42]: 
                                          detections                                     non_detections
0  [{'mjd': 58286.42961810017, 'candid': '5324296...  [{'mjd': 58288.435289400164, 'fid': 1, 'diffma...

Maybe it should be a dictionary with one DataFrame for each result, or a DataFrame of DataFrames.
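In the meantime, a small workaround sketch, assuming format="json" returns a dict with "detections" and "non_detections" lists of records, is to expand each list into its own DataFrame:

import pandas as pd
from alerce.core import Alerce

alerce = Alerce()
data = alerce.query_lightcurve("ZTF18abbuksn", format="json")

detections = pd.DataFrame(data["detections"])
non_detections = pd.DataFrame(data["non_detections"])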

[Feature] Query for more classes

Is your feature request related to a problem? Please describe.
The ability to query more than one class

Describe the solution you'd like
The ability to do this query:
dataframe = alerce.query_objects(
    ra=220,
    dec=0,
    lastmjd=[nt.mjd - 365. / 2, nt.mjd],
    radius=search_radius * 3600,
    classifier='stamp_classifier',
    class_name=['SN', 'AGN'],
    page_size=200
)

This was requested by Alex Kim, from DESC.
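Until multi-class queries are supported, a possible workaround is to run one query per class and concatenate the results. A minimal sketch (search_radius below is a placeholder value, and the ra/dec values are taken from the request above):

import pandas as pd
from alerce.core import Alerce

alerce = Alerce()
search_radius = 0.5  # degrees; placeholder value

frames = []
for class_name in ["SN", "AGN"]:
    frames.append(
        alerce.query_objects(
            ra=220,
            dec=0,
            radius=search_radius * 3600,
            classifier="stamp_classifier",
            class_name=class_name,
            page_size=200,
            format="pandas",
        )
    )
dataframe = pd.concat(frames, ignore_index=True)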

[Enhancement] Check requirements.txt versions

Is your feature request related to a problem? Please describe.
When installing the package, there are sometimes compatibility issues with some requirement versions.

Describe the solution you'd like
Define the minimum and maximum (if any) version for each package used.
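Purely as an illustration (the actual bounds would have to be chosen and tested by the maintainers), bounded versions could be declared in setup.py along these lines:

from setuptools import setup

setup(
    name="alerce",
    # hypothetical bounds, for illustration only
    install_requires=[
        "requests>=2.20,<3",
        "pandas>=1.0,<2",
        "astropy>=4.0,<6",
    ],
)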

Client/Increase timeout for db gateways

In the client, the requests library must be able to wait until an expected timeout.
timeout=None gives 5 minutes.
Disabling keep-alive also gives 5 minutes.
I can't find where that value comes from. With curl, the request lasts the full half hour.
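For reference, a sketch at the requests level (not necessarily how the client is wired internally): an explicit timeout tuple can be passed per request, with separate connect and read values. The URL below is just a placeholder.

import requests

url = "https://example.org/db-gateway-endpoint"  # placeholder URL
# 10 s to establish the connection, up to 30 minutes to wait for the response
response = requests.get(url, timeout=(10, 1800))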

programming question

We are developing an automatic system to get all alerts from ALeRCE configuring params like these:

"classifier": "lc_classifier_stochastic",
"class_name": CV/Nova,
"probability": 0.1,
"page_size": 1000,
"order_by": "firstmjd",
"ranking": 1,
"count" : "true",
"page": page, ----> we add a new page in every iteration
"order_mode": "DESC",
"format": "pandas"

objects = alerce.query_objects(**params)

In every iteration we check the number of objects; if it is 0, we stop the loop. We have found that the .csv file where we save all objects always ends up with 515891 records and never exceeds this number. We have checked several queries with different page sizes and criteria, but the results are the same.
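As a rough sketch of the loop described above (params is the dictionary shown earlier; whether the service enforces a hard limit is exactly the open question here):

import pandas as pd
from alerce.core import Alerce

alerce = Alerce()
frames = []
page = 1
while True:
    params["page"] = page
    objects = alerce.query_objects(**params)
    if len(objects) == 0:  # no more results on this page
        break
    frames.append(objects)
    page += 1

all_objects = pd.concat(frames, ignore_index=True)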

Do you have any idea about this issue? Is there any limitation in the number of alerts to query?

Thank you very much.

[programming] Downloading multiple light curves API

Hi,

I'm doing some major data mining and I was wondering what is the best way to download many (100s to 1000s) of light curves from Alerce using the API. I have thought about looping the alerce.query_detections function, but I would rather ask before taxing your server.

Cheers

Dharmesh Mistry
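One possible approach, sketched here under the assumption that looping query_detections per object is acceptable, is to iterate over the object ids and pause briefly between requests so the service isn't hammered:

import time
from alerce.core import Alerce

alerce = Alerce()
oids = ["ZTF20aaelulu", "ZTF18abbuksn"]  # your list of object ids

light_curves = {}
for oid in oids:
    light_curves[oid] = alerce.query_detections(oid, format="pandas", sort="mjd")
    time.sleep(0.5)  # arbitrary pause; tune to whatever the service tolerates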

I am attempting to gather SNIbc transients from the ALeRCE ZTF API that occurred between certain dates, e.g. Jan 01 2017 - May 01 2017. Is this possible to achieve? I am using the alerce.query_objects method with the following arguments: (classifier="lc_classifier", class_name="SNIbc", firstmjd=start_t.mjd, lastmjd=end_t.mjd, format="pandas"), where start_t.mjd is the MJD for Jan 01 2017. It is returning results from 2017-2022, but I want a more refined search.
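A sketch of one way to narrow the date range, assuming firstmjd accepts a [min, max] range in the same way lastmjd is passed as a range in the multi-class request above:

from astropy.time import Time
from alerce.core import Alerce

alerce = Alerce()
start_mjd = Time("2017-01-01").mjd
end_mjd = Time("2017-05-01").mjd

dataframe = alerce.query_objects(
    classifier="lc_classifier",
    class_name="SNIbc",
    firstmjd=[start_mjd, end_mjd],  # only objects that first appeared in this window
    format="pandas",
)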


[Bug]

When executing a .py script in PyCharm, it always throws the same error:

ModuleNotFoundError: No module named 'alerce.core'; 'alerce' is not a package

I installed the package from PyCharm in my environment, but without any good result. I show the code below:

(screenshot attached)

I use PyCharm 2023.1.3 (Community Edition) with Python 3.8 as the interpreter on my Ubuntu 20.04 system.
Because of this problem I tried to run the same code on Windows 10 and on CentOS with PyCharm, and directly from the command line ("$ python script.py"), with the same result. So I think this package has a problem.

[Bug] Issue with AGN use case notebook

Describe the bug
I was following the AGN use case notebook located here:

https://github.com/alercebroker/usecases/blob/master/notebooks/ALeRCE_ZTF_AGNUseCase%2BAPI-xmatch.ipynb

and I could not run the crossmatch using the URL:

https://xmatch-api.alerce.online/

which returns a "502: Bad Gateway" error message. Is this URL still valid?


[Question] Which should I use, corr or corr_ext for magnitude and error columns?

Hi,

I'm plotting the light curves (with error bars) for ZTF sources. I am confused as to which column within the object light curve table I should use for plotting magnitude (magpsf_corr or magpsf_corr_ext) and which I should use for the errors (sigmapsf_corr or sigmapsf_corr_ext).

I think the corr_ext values are corrected for contamination from a nearby source, therefore I would use magpsf_corr with sigmapsf_corr, and magpsf_corr_ext with sigmapsf_corr_ext. However, some error values are unavailable (given a value of 100), ZTF17aaaenex is a good example.

Please could you provide some guidance on this?
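A plotting sketch, assuming the detections table includes the magpsf_corr/sigmapsf_corr columns mentioned above and treating the sentinel error value 100 as "unavailable":

import matplotlib.pyplot as plt
from alerce.core import Alerce

alerce = Alerce()
det = alerce.query_detections("ZTF17aaaenex", format="pandas", sort="mjd")

valid = det["sigmapsf_corr"] < 100  # drop points whose error is the 100 sentinel
plt.errorbar(det.loc[valid, "mjd"],
             det.loc[valid, "magpsf_corr"],
             yerr=det.loc[valid, "sigmapsf_corr"],
             fmt="o")
plt.gca().invert_yaxis()  # magnitudes: brighter objects plot higher
plt.xlabel("MJD")
plt.ylabel("magpsf_corr")
plt.show()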
