
FHIST: A benchmark for Few-shot Classification of Histological Images

This repo contains the code and few-shot tasks of our paper, FHIST. FHIST introduces a highly diversified public benchmark, gathered from various public datasets, for few-shot classification of histology data. We build few-shot tasks in three different scenarios: various tissue types, different levels of domain shift stemming from various cancer sites, and different class-granularity levels. We evaluate the performance of state-of-the-art few-shot learning methods, initially designed for natural images, on our benchmark.

Getting Started

Installation

This code is tested with Python 3.8. To install the required packages:

    pip install -r requirements.txt

Data

You need to download the datasets (crc-tp, nct, lc25000, breakhis) and place all dataset folders in a single root folder. Then run the conversion script scripts/convert_datasets.sh (after changing the data_root and record_root paths):

    bash scripts/convert_datasets.sh

This script converts all the datasets to a common *.tfrecords format.
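
Before converting, the data folder is expected to look roughly like the sketch below (the exact subfolder names are an assumption; match them to the dataset names used by the script):

```
data_root/
    crc-tp/
    nct/
    lc25000/
    breakhis/
```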

Make Index files

Run the following to create an index file for each tfrecord (specify the data_path in the bash script):

    bash scripts/make_index_files.sh
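
An index file stores the byte offset and span of each record so the reader can seek directly into a tfrecord. As a rough illustration, here is a minimal pure-Python indexer for the TFRecord on-disk layout (8-byte little-endian payload length, 4-byte length CRC, payload, 4-byte payload CRC); the actual script shipped with the repo may differ in details:

```python
import struct

def index_tfrecord(tfrecord_path, index_path):
    """Write one 'offset span' line per record, in the shape tfrecord2idx produces."""
    with open(tfrecord_path, "rb") as f, open(index_path, "w") as out:
        while True:
            start = f.tell()
            header = f.read(8)           # uint64: payload length
            if len(header) < 8:          # end of file
                break
            (length,) = struct.unpack("<Q", header)
            f.seek(4 + length + 4, 1)    # skip length CRC, payload, payload CRC
            out.write(f"{start} {f.tell() - start}\n")
```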

Usage

Training

You can find implementations of several state-of-the-art few-shot learning methods under src/methods. To train a model, run the following command, specifying the few-shot method and the train, validation, and test sources:

    bash scripts/train.sh <method> <train> <valid> <test>

Here's an example:

    bash scripts/train.sh tim crc-tp nct lc25000

The above script will use standard supervised cross-entropy for training and the TIM method for evaluation on the validation and test sets. Replace tim with a meta-learning method (e.g. protonet) to start episodic training.
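
For intuition, episodic evaluation with a prototypical-network-style method boils down to averaging the support embeddings of each class into a prototype and assigning each query to the nearest prototype. A toy sketch (the embeddings and class names below are made up for illustration):

```python
def prototypes(support):
    """Average the support embeddings of each class into one prototype."""
    return {c: [sum(d) / len(vecs) for d in zip(*vecs)]
            for c, vecs in support.items()}

def classify(query, protos):
    """Assign the query to the class with the nearest (squared-Euclidean) prototype."""
    def sqdist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(protos, key=lambda c: sqdist(query, protos[c]))
```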

To change the backbone network, simply edit the "arch" value in the base.yaml file in the config path, or add --arch to the opts list. base.yaml contains all the parameters used in the code. You can also change each method's hyperparameters in its corresponding yaml file in the config path.
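
For example, a backbone override in base.yaml could look like the excerpt below (the value resnet18 is illustrative; use whichever architectures the repo supports):

```yaml
# config/base.yaml (excerpt)
arch: resnet18
```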

Inference

To run few-shot inference on the trained models:

    bash scripts/test.sh <method> <shot> <train> <test> 

Results will be saved as csv files under the specified res_path for each method.
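
Since results land in CSV files, they can be aggregated with a few lines of standard-library Python. This sketch assumes a per-task accuracy column; the column name acc is hypothetical and should be matched to the actual CSV header:

```python
import csv
import statistics

def summarize(csv_path, metric="acc"):
    """Mean and sample std of one metric column across all tasks in a results CSV."""
    with open(csv_path, newline="") as f:
        vals = [float(row[metric]) for row in csv.DictReader(f)]
    std = statistics.stdev(vals) if len(vals) > 1 else 0.0
    return statistics.mean(vals), std
```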

Acknowledgments

We thank the authors of the open-source TFRecord reader for open-sourcing an awesome PyTorch-compatible TFRecordReader!

Contributors

mboudiaf, fereshteshakeri
