
imc's Introduction

Imaging mass cytometry

A package for processing and analysis of imaging mass cytometry (IMC) data.

It implements image- and channel-wise quality control, quantification of cell intensity and morphology, cell type discovery through clustering, automated cell type labeling, community and super-community finding, and differential comparisons between sample groups, in addition to many handy visualization tools. Above all, it is a tool for using IMC data at scale.

Development is still underway, so use at your own risk.

Requirements and installation

Requires Python >= 3.8. imc uses a pyproject.toml-only configuration, so you will need an up-to-date version of pip before installing. Base packages such as gcc and g++ also need to be installed on the system, e.g. with sudo apt install g++ or your distribution's equivalent. We also highly recommend installing the package in a conda environment to avoid dependency issues.
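
For example, a minimal setup in a fresh conda environment could look like the following (the environment name and Python version are only illustrative):

# Create and activate a dedicated environment (name and Python version are examples)
conda create -n imc python=3.9 -y
conda activate imc

# Make sure pip is recent enough to handle pyproject.toml-only packages
pip install --upgrade pip

# On Debian/Ubuntu, install the compilers some dependencies build against
sudo apt install g++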

To install the most updated version of the program:

git clone https://github.com/ElementoLab/imc.git
cd imc
make install

Install from PyPI with pip or with poetry:

pip install imc
# or
poetry add imc

Quick start

Install the package from PyPI with extra packages required for all steps:

pip install "imc[extra]"
# or
poetry add "imc[extra]"

Use case 1 (pipeline processing)

Example: Lung sample processing from MCD to single-cell h5ad

One-line IMC data processing:

# Run pipeline in one step with remote MCD file
MCD_URL=https://zenodo.org/record/4110560/files/data/20200612_FLU_1923/20200612_FLU_1923.mcd
imc process $MCD_URL
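
Assuming the default output locations (the same ones used in the expanded steps below), the per-sample results can then be inspected, e.g.:

# Per-sample TIFF stacks and auxiliary files end up under processed/<sample>/
ls processed/20200612_FLU_1923/tiffs/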

imc also supports TXT or TIFF files as input, either local or remote:

# Run pipeline in one step with remote TXT file
TXT_URL=https://zenodo.org/record/5018260/files/COVID19_brain_Patient03_ROI3_COVID19_olfactorybulb.txt?download=1
imc process $TXT_URL

Input can be MCD, TIFF, or TXT files. Several files can be given to imc process at once. See more with the --help option.
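
For example, several local files can be processed in one call (the file names below are illustrative):

imc process data/sample1.mcd data/sample2.mcd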

imc is nonetheless very modular and allows the user to run any of the steps separately as well.

The above is also equivalent to the following:

MCD_URL=https://zenodo.org/record/4110560/files/data/20200612_FLU_1923/20200612_FLU_1923.mcd
SAMPLE=20200612_FLU_1923

mkdir -p data  # make sure the download directory exists
wget -O data/${SAMPLE}.mcd $MCD_URL

## Output a description of the acquired data
imc inspect data/${SAMPLE}.mcd

## convert MCD to TIFFs and auxiliary files
imc prepare \
  --ilastik \
  --n-crops 0 \
  --ilastik-compartment nuclear \
  data/${SAMPLE}.mcd

## For each TIFF stack produced above, predict probabilities and segment cells
TIFFS=processed/${SAMPLE}/tiffs/${SAMPLE}*_full.tiff

## Output pixel probabilities of nucleus, membrane and background using ilastik
imc predict $TIFFS

## Segment cell instances with DeepCell
imc segment \
  --from-probabilities \
  --model deepcell \
  --compartment both $TIFFS

## Quantify channel intensity and morphology for each single cell in every image
imc quantify $TIFFS

Once all MCD files in the project have been processed, create a concatenated AnnData object containing all cells in the project.

from glob import glob
import os
import anndata

# Collect the per-sample quantification files and concatenate them into one object
files = glob('processed/*.h5ad')
adatas = [anndata.read(f) for f in files]
adata = anndata.concat(adatas)

os.makedirs('results', exist_ok=True)
adata.write('results/quant.h5ad')
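
Optionally, the origin of each cell can be recorded during concatenation; the sketch below uses anndata.concat's label/keys arguments, and the 'sample' column name is an arbitrary choice:

# Use the file names as sample keys; adds a 'sample' column to adata.obs
keys = [os.path.splitext(os.path.basename(f))[0] for f in files]
adata = anndata.concat(adatas, label='sample', keys=keys, index_unique='-')
adata.write('results/quant.h5ad')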

To perform batch correction and cell clustering:

## Phenotype cells into clusters
imc phenotype results/quant.h5ad

There are many customization options for each step. Do imc --help or imc <subcommand> --help to see all.

imc also includes a lightweight interactive image viewer:

imc view $TIFFS

There is also an interface to the more fully fledged napari image viewer:

imc view --napari data/${SAMPLE}.mcd  # view MCD file
napari $TIFFS  # view TIFF files directly with napari. Requires napari

A quick example of further downstream analysis of the single-cell data in an IPython/Jupyter notebook:

import scanpy as sc

a = sc.read('results/quant.h5ad')
sc.pp.log1p(a)       # log-transform intensities
sc.pp.pca(a)         # principal component analysis
sc.pp.neighbors(a)   # build the kNN graph
sc.tl.umap(a)        # compute the UMAP embedding
sc.pl.umap(a, color=a.var.index)  # color the embedding by every channel
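
From here, standard scanpy workflows apply; for example, graph-based clustering of the cells (a sketch; requires the leidenalg package):

sc.tl.leiden(a)                # cluster cells on the neighbors graph
sc.pl.umap(a, color='leiden')  # color the UMAP embedding by cluster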

Use case 2 (API usage)

Demo data (synthetic)

>>> from imc.demo import generate_project
>>> prj = generate_project(n_samples=2, n_rois_per_sample=3, shape=(8, 8))
>>> prj
Project 'project' with 2 samples and 6 ROIs in total.

>>> prj.samples  # type: List[IMCSample]
[Sample 'test_sample_01' with 3 ROIs,
 Sample 'test_sample_02' with 3 ROIs]

>>> prj.rois  # type: List[ROI]
[Region 1 of sample 'test_sample_01',
 Region 2 of sample 'test_sample_01',
 Region 3 of sample 'test_sample_01',
 Region 1 of sample 'test_sample_02',
 Region 2 of sample 'test_sample_02',
 Region 3 of sample 'test_sample_02']

>>> prj.samples[0].rois  # type: List[ROI]
[Region 1 of sample 'test_sample_01',
 Region 2 of sample 'test_sample_01',
 Region 3 of sample 'test_sample_01']

>>> roi = prj.rois[0]  # Let's assign one ROI to explore it
>>> roi.channel_labels  # type: pandas.Series; `channel_names`, `channel_metals` also available
0    Ch01(Ch01)
1    Ch02(Ch02)
2    Ch03(Ch03)
Name: channel, dtype: object

>>> roi.mask  # type: numpy.ndarray
array([[0, 0, 0, 0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0, 0, 1, 0],
       [0, 0, 0, 0, 0, 0, 0, 0],
       [0, 2, 0, 0, 0, 3, 0, 0],
       [0, 0, 0, 0, 0, 0, 0, 0],
       [0, 0, 4, 0, 0, 0, 0, 0],
       [0, 0, 0, 0, 0, 0, 0, 0]], dtype=int32)

>>> roi.stack.shape  # roi.stack -> type: numpy.ndarray
(3, 8, 8)
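
As a small sketch using only the attributes shown above, a single channel can be plotted and the segmented cells counted with numpy/matplotlib:

import numpy as np
import matplotlib.pyplot as plt

plt.imshow(roi.stack[0])          # first channel of the (channels, y, x) stack
plt.title(roi.channel_labels[0])
plt.show()

n_cells = len(np.unique(roi.mask)) - 1  # label 0 is the background
print(f"{n_cells} cells in this ROI")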

>>> # QC
>>> prj.channel_correlation()
>>> prj.channel_summary()

>>> # Cell type discovery
>>> prj.cluster_cells()
>>> prj.find_communities()

Demo data (real)

>>> import imc.demo
>>> imc.demo.datasets
['jackson_2019_short', 'jackson_2019_short_joint']

>>> prj = imc.demo.get_dataset('jackson_2019_short')
>>> prj  # type: Project
Project 'jackson_2019_short' with 4 samples and 4 ROIs in total.

>>> prj.samples  # type: List[IMCSample]
[Sample 'BaselTMA_SP41_15.475kx12.665ky_10000x8500_5_20170905_90_88_X11Y5_242_a0' with 1 ROI,
 Sample 'BaselTMA_SP41_25.475kx12.665ky_8000x8500_3_20170905_90_88_X11Y5_235_a0' with 1 ROI,
 Sample 'BaselTMA_SP41_33.475kx12.66ky_8500x8500_2_20170905_24_61_X3Y4_207_a0' with 1 ROI,
 Sample 'BaselTMA_SP41_33.475kx12.66ky_8500x8500_2_20170905_33_61_X4Y4_215_a0' with 1 ROI]

>>> prj.samples[0].channel_labels  # type: pandas.Series
channel
0                                  Ar80(Ar80)
1                                  Ru96(Ru96)
2                                  Ru98(Ru98)
3                                  Ru99(Ru99)
4                                Ru100(Ru100)
5                                Ru101(Ru101)
6                                Ru102(Ru102)
7                                Ru104(Ru104)
8                            HistoneH3(In113)
9                                EMPTY(Xe126)
10                                EMPTY(I127)
11                           HistoneH3(La139)
...
42                            vWF-CD31(Yb172)
43                                mTOR(Yb173)
44                        Cytokeratin7(Yb174)
45    PanCytokeratin-KeratinEpithelial(Lu175)
46         CleavedPARP-CleavedCaspase3(Yb176)
47                                DNA1(Ir191)
48                                DNA2(Ir193)
49                               EMPTY(Pb206)
50                               EMPTY(Pb207)
51                               EMPTY(Pb208)
Name: BaselTMA_SP41_15.475kx12.665ky_10000x8500_5_20170905_90_88_X11Y5_242_a0, dtype: object
>>> prj.plot_channels(['DNA2', 'Ki67', "Cytokeratin7"])
<Figure size 400x1200 with 12 Axes>
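
Since plot_channels returns a matplotlib Figure, it can be saved directly (the file name is illustrative):

fig = prj.plot_channels(['DNA2', 'Ki67', 'Cytokeratin7'])
fig.savefig('channels.png', dpi=300, bbox_inches='tight')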

Your own data

The best way is to have a CSV file with one row per sample, or one row per ROI. That will ensure additional sample/ROI metadata is passed to the objects and used later in analysis. Pass the path to the CSV file to the Project object constructor:

from imc import Project

prj = Project()  # will search current directory for Samples/ROIs

prj = Project(processed_dir="processed")  # will search `processed` for Samples/ROIs

prj = Project("path/to/sample/annotation.csv", processed_dir="processed")
# ^^ will use metadata from CSV and use the files in `processed`.
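
As a purely illustrative sketch of such an annotation table (the column names and file path below are hypothetical; use whatever metadata your analysis needs):

import pandas as pd
from imc import Project

# Hypothetical annotation: one row per sample, plus any extra metadata columns
annot = pd.DataFrame({
    "sample_name": ["20200612_FLU_1923"],  # sample used in the pipeline example above
    "disease": ["FLU"],                    # example metadata column
})
annot.to_csv("sample_annotation.csv", index=False)

prj = Project("sample_annotation.csv", processed_dir="processed")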

However, if a CSV file is not given, Project will search the current directory (or the directory given as processed_dir) for IMCSamples and ROIs.

The processed_dir directory can be structured in two ways:

  1. One directory per sample.
  • Inside each sample directory there is a directory "tiffs" which contains the stack "*_full.tiff", the channel labels "*_full.csv" and optionally a segmentation mask "*_full_mask.tiff".
  2. All samples in the same processed_dir directory.
  • Inside that one directory are the stack "*_full.tiff", channel label "*_full.csv" and optionally segmentation mask "*_full_mask.tiff" files for all samples (see the example layouts below).
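
Concretely, the two layouts could look like this (sample and ROI names are illustrative; only the file suffixes matter):

# Option 1: one directory per sample, each with a "tiffs" subdirectory
processed/sample1/tiffs/sample1-01_full.tiff
processed/sample1/tiffs/sample1-01_full.csv
processed/sample1/tiffs/sample1-01_full_mask.tiff
processed/sample2/tiffs/...

# Option 2: all samples' files directly inside processed_dir
processed/sample1-01_full.tiff
processed/sample1-01_full.csv
processed/sample1-01_full_mask.tiff
processed/sample2-01_full.tiff
...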

The default is option 1. If you choose option 2, simply pass subfolder_per_sample:

prj = Project(subfolder_per_sample=True)

The expected files are produced by common preprocessing pipelines such as imcpipeline or imcyto.

Documentation

Documentation is for now mostly a skeleton but will be expanded soon:

make docs

Testing

Tests are still very limited, but you can run tests this way:

pip install pytest  # install testing package
python -m pytest --pyargs imc

For data processing, running the example lung sample is a good way to make sure everything is working smoothly.

imc's People

Contributors

afrendeiro, jkim810


imc's Issues

Incorporate non-intensity based features in analysis

Cell shape, size, eccentricity, etc:

  • quantify
  • assess which features to include (avoid redundancy);
  • assess if worth including in cell type/state definition

Incorporate other tissue-level structures ElementoLab/hyperion-cytof#4

cannot find files

Dear
I am not sure whether this is correct, but when I set
TIFFS=processed/${SAMPLE}/tiffs/${SAMPLE}_full.tiff
the following step cannot locate the full.tiff files. Can you help me with it?
[Errno 2] No such file or directory: '/processed/20200612_FLU_1923/tiffs/_full.tiff'

Documentation

Add Sphinx docs:

  • API
  • Installation and testing
  • Configuration
  • Usage example

Add to readthedocs.io

DeepCell install

Dear developer,
This is a great package. However, I have trouble installing DeepCell when I execute pip install deepcell.

ERROR: Cannot install deepcell because these package versions have conflicting dependencies.

The conflict is caused by:
deepcell-tracking 0.4.5 depends on pathlib
deepcell-tracking 0.4.4 depends on pathlib
deepcell-tracking 0.4.3 depends on pathlib
deepcell-tracking 0.4.2 depends on pathlib
deepcell-tracking 0.4.1 depends on pathlib
deepcell-tracking 0.4.0 depends on pathlib
deepcell-tracking 0.3.1 depends on pathlib
deepcell-tracking 0.3.0 depends on pathlib
deepcell-tracking 0.2.7 depends on pathlib
deepcell-tracking 0.2.6 depends on pathlib
deepcell-tracking 0.2.5 depends on pathlib
deepcell-tracking 0.2.4 depends on pathlib

Can you help me with it? I am using an M1 Mac. Thanks a lot,
Fan

Error Message trying tutorial

Hi,

I tried to reproduce the results of the toy dataset and got the following error:

TypeError: imwrite() got an unexpected keyword argument 'compress'

both on macOS and also on Linux.

My env configuration is the following:

Name Version Build Channel

absl-py 1.3.0 pypi_0 pypi
alabaster 0.7.12 pypi_0 pypi
anndata 0.8.0 pypi_0 pypi
annoy 1.17.1 pypi_0 pypi
anyio 3.6.2 pypi_0 pypi
appdirs 1.4.4 pypi_0 pypi
appnope 0.1.3 pypi_0 pypi
argon2-cffi 21.3.0 pypi_0 pypi
argon2-cffi-bindings 21.2.0 pypi_0 pypi
argparse 1.4.0 pypi_0 pypi
asciitree 0.3.3 pypi_0 pypi
astir 0.1.4 pypi_0 pypi
asttokens 2.1.0 pypi_0 pypi
astunparse 1.6.3 pypi_0 pypi
attrs 22.1.0 pypi_0 pypi
autodocsumm 0.2.9 pypi_0 pypi
babel 2.11.0 pypi_0 pypi
backcall 0.2.0 pypi_0 pypi
bbknn 1.5.1 pypi_0 pypi
beautifulsoup4 4.11.1 pypi_0 pypi
bleach 5.0.1 pypi_0 pypi
blessings 1.7 pypi_0 pypi
build 0.9.0 pypi_0 pypi
ca-certificates 2022.9.24 h033912b_0 conda-forge
cachetools 5.2.0 pypi_0 pypi
cachey 0.2.1 pypi_0 pypi
cellpose 0.8.0 pypi_0 pypi
certifi 2022.9.24 pypi_0 pypi
cffi 1.15.1 pypi_0 pypi
charset-normalizer 2.1.1 pypi_0 pypi
click 8.1.3 pypi_0 pypi
cloudpickle 2.2.0 pypi_0 pypi
codecov 2.1.12 pypi_0 pypi
coloredlogs 15.0.1 pypi_0 pypi
colour-runner 0.1.1 pypi_0 pypi
commonmark 0.9.1 pypi_0 pypi
contourpy 1.0.6 pypi_0 pypi
coverage 6.5.0 pypi_0 pypi
cycler 0.11.0 pypi_0 pypi
cython 0.29.32 pypi_0 pypi
dask 2022.10.2 pypi_0 pypi
debugpy 1.6.3 pypi_0 pypi
decorator 5.1.1 pypi_0 pypi
deepcell 0.12.3 pypi_0 pypi
deepcell-toolbox 0.11.2 pypi_0 pypi
deepcell-tracking 0.6.3 pypi_0 pypi
deepdiff 6.2.1 pypi_0 pypi
defusedxml 0.7.1 pypi_0 pypi
distlib 0.3.6 pypi_0 pypi
docstring-parser 0.15 pypi_0 pypi
docutils 0.17.1 pypi_0 pypi
entrypoints 0.4 pypi_0 pypi
executing 1.2.0 pypi_0 pypi
fastcluster 1.2.6 pypi_0 pypi
fasteners 0.18 pypi_0 pypi
fastjsonschema 2.16.2 pypi_0 pypi
fastremap 1.13.3 pypi_0 pypi
fcsparser 0.2.4 pypi_0 pypi
filelock 3.8.0 pypi_0 pypi
flatbuffers 22.10.26 pypi_0 pypi
flowcytometrytools 0.5.1 pypi_0 pypi
fonttools 4.38.0 pypi_0 pypi
freetype-py 2.3.0 pypi_0 pypi
fsspec 2022.10.0 pypi_0 pypi
gast 0.5.3 pypi_0 pypi
google-auth 2.14.1 pypi_0 pypi
google-auth-oauthlib 0.4.6 pypi_0 pypi
google-pasta 0.2.0 pypi_0 pypi
grpcio 1.50.0 pypi_0 pypi
h5py 3.7.0 pypi_0 pypi
heapdict 1.0.1 pypi_0 pypi
hsluv 5.0.3 pypi_0 pypi
humanfriendly 10.0 pypi_0 pypi
idna 3.4 pypi_0 pypi
igraph 0.10.2 pypi_0 pypi
imagecodecs 2022.9.26 pypi_0 pypi
imageio 2.22.4 pypi_0 pypi
imagesize 1.4.1 pypi_0 pypi
imc 0.1.3 pypi_0 pypi
imctools 2.1.8 pypi_0 pypi
importlib-metadata 5.0.0 pypi_0 pypi
ipykernel 6.17.0 pypi_0 pypi
ipython 8.6.0 pypi_0 pypi
ipython-genutils 0.2.0 pypi_0 pypi
ipywidgets 8.0.2 pypi_0 pypi
jedi 0.18.1 pypi_0 pypi
jinja2 3.1.2 pypi_0 pypi
joblib 1.2.0 pypi_0 pypi
jsonschema 4.17.0 pypi_0 pypi
jupyter 1.0.0 pypi_0 pypi
jupyter-client 7.4.4 pypi_0 pypi
jupyter-console 6.4.4 pypi_0 pypi
jupyter-core 4.11.2 pypi_0 pypi
jupyter-server 1.23.0 pypi_0 pypi
jupyterlab-pygments 0.2.2 pypi_0 pypi
jupyterlab-widgets 3.0.3 pypi_0 pypi
keras 2.8.0 pypi_0 pypi
keras-preprocessing 1.1.2 pypi_0 pypi
kiwisolver 1.4.4 pypi_0 pypi
lazy-loader 0.1rc2 pypi_0 pypi
leidenalg 0.9.0 pypi_0 pypi
libclang 14.0.6 pypi_0 pypi
libcxx 14.0.6 hccf4f1f_0 conda-forge
libffi 3.3 h046ec9c_2 conda-forge
libsqlite 3.39.4 ha978bb4_0 conda-forge
libzlib 1.2.13 hfd90126_4 conda-forge
littleutils 0.2.2 pypi_0 pypi
llvmlite 0.39.1 pypi_0 pypi
locket 1.0.0 pypi_0 pypi
loompy 3.0.7 pypi_0 pypi
lxml 4.9.1 pypi_0 pypi
magicgui 0.6.0 pypi_0 pypi
markdown 3.4.1 pypi_0 pypi
markupsafe 2.1.1 pypi_0 pypi
matplotlib 3.6.2 pypi_0 pypi
matplotlib-inline 0.1.6 pypi_0 pypi
mistune 2.0.4 pypi_0 pypi
mypy 0.990 pypi_0 pypi
mypy-extensions 0.4.3 pypi_0 pypi
napari 0.4.16 pypi_0 pypi
napari-console 0.0.6 pypi_0 pypi
napari-imc 0.6.5 pypi_0 pypi
napari-plugin-engine 0.2.0 pypi_0 pypi
napari-svg 0.1.6 pypi_0 pypi
natsort 8.2.0 pypi_0 pypi
nbclassic 0.4.8 pypi_0 pypi
nbclient 0.7.0 pypi_0 pypi
nbconvert 7.2.3 pypi_0 pypi
nbformat 5.7.0 pypi_0 pypi
nbsphinx 0.8.9 pypi_0 pypi
nbsphinx-link 1.3.0 pypi_0 pypi
ncurses 6.3 h96cf925_1 conda-forge
nest-asyncio 1.5.6 pypi_0 pypi
networkx 2.8.8 pypi_0 pypi
notebook 6.5.2 pypi_0 pypi
notebook-shim 0.2.2 pypi_0 pypi
npe2 0.6.1 pypi_0 pypi
numba 0.56.4 pypi_0 pypi
numcodecs 0.10.2 pypi_0 pypi
numexpr 2.8.4 pypi_0 pypi
numpy 1.23.4 pypi_0 pypi
numpy-groupies 0.9.20 pypi_0 pypi
numpydoc 1.5.0 pypi_0 pypi
oauthlib 3.2.2 pypi_0 pypi
opencv-python-headless 4.6.0.66 pypi_0 pypi
openssl 1.1.1s hfd90126_0 conda-forge
opt-einsum 3.3.0 pypi_0 pypi
ordered-set 4.1.0 pypi_0 pypi
outdated 0.2.2 pypi_0 pypi
packaging 21.3 pypi_0 pypi
pandas 1.5.1 pypi_0 pypi
pandas-flavor 0.3.0 pypi_0 pypi
pandocfilters 1.5.0 pypi_0 pypi
parmap 1.6.0 pypi_0 pypi
parso 0.8.3 pypi_0 pypi
partd 1.3.0 pypi_0 pypi
patsy 0.5.3 pypi_0 pypi
pep517 0.13.0 pypi_0 pypi
pexpect 4.8.0 pypi_0 pypi
pickleshare 0.7.5 pypi_0 pypi
pillow 9.3.0 pypi_0 pypi
pingouin 0.5.2 pypi_0 pypi
pint 0.20.1 pypi_0 pypi
pip 22.3.1 pyhd8ed1ab_0 conda-forge
platformdirs 2.5.3 pypi_0 pypi
pluggy 1.0.0 pypi_0 pypi
prometheus-client 0.15.0 pypi_0 pypi
prompt-toolkit 3.0.32 pypi_0 pypi
protobuf 3.19.6 pypi_0 pypi
psutil 5.9.4 pypi_0 pypi
psygnal 0.6.0 pypi_0 pypi
ptyprocess 0.7.0 pypi_0 pypi
pure-eval 0.2.2 pypi_0 pypi
py 1.11.0 pypi_0 pypi
pyasn1 0.4.8 pypi_0 pypi
pyasn1-modules 0.2.8 pypi_0 pypi
pycparser 2.21 pypi_0 pypi
pydantic 1.10.2 pypi_0 pypi
pydot 1.4.2 pypi_0 pypi
pygments 2.13.0 pypi_0 pypi
pynndescent 0.5.8 pypi_0 pypi
pyopengl 3.1.6 pypi_0 pypi
pyparsing 3.0.9 pypi_0 pypi
pyrsistent 0.19.2 pypi_0 pypi
python 3.9.0 h4f09611_5_cpython conda-forge
python-dateutil 2.8.2 pypi_0 pypi
python-louvain 0.16 pypi_0 pypi
pytomlpp 1.0.11 pypi_0 pypi
pytz 2022.6 pypi_0 pypi
pywavelets 1.4.1 pypi_0 pypi
pyyaml 6.0 pypi_0 pypi
pyzmq 24.0.1 pypi_0 pypi
qtconsole 5.4.0 pypi_0 pypi
qtpy 2.3.0 pypi_0 pypi
readimc 0.6.1 pypi_0 pypi
readline 8.1.2 h3899abd_0 conda-forge
requests 2.28.1 pypi_0 pypi
requests-oauthlib 1.3.1 pypi_0 pypi
rich 12.6.0 pypi_0 pypi
rootpath 0.1.1 pypi_0 pypi
rsa 4.9 pypi_0 pypi
scanpy 1.9.1 pypi_0 pypi
scikit-image 0.19.3 pypi_0 pypi
scikit-learn 1.0.2 pypi_0 pypi
scipy 1.9.3 pypi_0 pypi
seaborn 0.12.1 pypi_0 pypi
seaborn-extensions 0.0.20 pypi_0 pypi
send2trash 1.8.0 pypi_0 pypi
session-info 1.0.0 pypi_0 pypi
setuptools 65.5.1 pyhd8ed1ab_0 conda-forge
setuptools-scm 7.0.5 pypi_0 pypi
six 1.16.0 pypi_0 pypi
sklearn 0.0.post1 pypi_0 pypi
sniffio 1.3.0 pypi_0 pypi
snowballstemmer 2.2.0 pypi_0 pypi
soupsieve 2.3.2.post1 pypi_0 pypi
spektral 1.0.6 pypi_0 pypi
sphinx 5.3.0 pypi_0 pypi
sphinx-autodoc-typehints 1.19.5 pypi_0 pypi
sphinx-rtd-theme 1.1.1 pypi_0 pypi
sphinxcontrib-applehelp 1.0.2 pypi_0 pypi
sphinxcontrib-devhelp 1.0.2 pypi_0 pypi
sphinxcontrib-htmlhelp 2.0.0 pypi_0 pypi
sphinxcontrib-jsmath 1.0.1 pypi_0 pypi
sphinxcontrib-qthelp 1.0.3 pypi_0 pypi
sphinxcontrib-serializinghtml 1.1.5 pypi_0 pypi
sqlite 3.39.4 h9ae0607_0 conda-forge
stack-data 0.6.0 pypi_0 pypi
statsmodels 0.13.5 pypi_0 pypi
stdlib-list 0.8.0 pypi_0 pypi
superqt 0.3.8 pypi_0 pypi
tables 3.7.0 pypi_0 pypi
tabulate 0.9.0 pypi_0 pypi
tensorboard 2.8.0 pypi_0 pypi
tensorboard-data-server 0.6.1 pypi_0 pypi
tensorboard-plugin-wit 1.8.1 pypi_0 pypi
tensorflow 2.8.3 pypi_0 pypi
tensorflow-addons 0.16.1 pypi_0 pypi
tensorflow-estimator 2.8.0 pypi_0 pypi
tensorflow-io-gcs-filesystem 0.27.0 pypi_0 pypi
termcolor 2.1.0 pypi_0 pypi
terminado 0.17.0 pypi_0 pypi
texttable 1.6.4 pypi_0 pypi
threadpoolctl 3.1.0 pypi_0 pypi
tifffile 2022.10.10 pypi_0 pypi
tinycss2 1.2.1 pypi_0 pypi
tk 8.6.12 h5dbffcc_0 conda-forge
tomli 2.0.1 pypi_0 pypi
toolz 0.12.0 pypi_0 pypi
torch 1.13.0 pypi_0 pypi
tornado 6.2 pypi_0 pypi
tox 3.27.0 pypi_0 pypi
tqdm 4.64.1 pypi_0 pypi
traitlets 5.5.0 pypi_0 pypi
typeguard 2.13.3 pypi_0 pypi
typer 0.7.0 pypi_0 pypi
typing-extensions 4.4.0 pypi_0 pypi
tzdata 2022f h191b570_0 conda-forge
umap-learn 0.5.3 pypi_0 pypi
urllib3 1.26.12 pypi_0 pypi
urlpath 1.2.0 pypi_0 pypi
virtualenv 20.16.6 pypi_0 pypi
vispy 0.10.0 pypi_0 pypi
wcwidth 0.2.5 pypi_0 pypi
webencodings 0.5.1 pypi_0 pypi
websocket-client 1.4.2 pypi_0 pypi
werkzeug 2.2.2 pypi_0 pypi
wheel 0.38.2 pyhd8ed1ab_0 conda-forge
widgetsnbextension 4.0.3 pypi_0 pypi
wrapt 1.14.1 pypi_0 pypi
xarray 2022.11.0 pypi_0 pypi
xmltodict 0.13.0 pypi_0 pypi
xtiff 0.7.8 pypi_0 pypi
xz 5.2.6 h775f41a_0 conda-forge
zarr 2.13.3 pypi_0 pypi
zipp 3.10.0 pypi_0 pypi
zlib 1.2.13 hfd90126_4 conda-forge

Testing

  • Boilerplate + reqs
  • Figure out test data -> create demo module
  • Write pytest tests (>= 60%)
  • Travis CI + coverage
