Comments (6)
It took us a while to get familiar with readthedocs, but we do have the API docs there at last!
https://trax-ml.readthedocs.io/en/latest/
We're now working on getting the colabs (quick intro and layer intro) there too.
It was hard to ask the community for help while we were making it work both on readthedocs and doing some internal work (e.g., so the colabs are unit-tested at each commit). But that work is coming to an end, and from here we should be able to easily accept PRs -- and we'll definitely ask for help!
Some first examples where help would be welcome:
- API docs for convolution and pooling layers
- A colab with image models (say WideResNet on CIFAR, or even an MLP on MNIST, or both) showing the results
- A Transformer colab in trax/models (explaining the model first, later we can add translation training?)
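To give a feel for what the image-model colab might open with, here is a minimal NumPy-only sketch of an MLP forward pass on MNIST-shaped inputs. This is deliberately independent of Trax's actual API; the layer sizes and function names are placeholder assumptions for illustration.

```python
import numpy as np

def init_mlp(rng, sizes=(784, 512, 10)):
    """Initialize weights/biases for a small MLP (placeholder sizes)."""
    params = []
    for n_in, n_out in zip(sizes[:-1], sizes[1:]):
        w = rng.normal(0.0, np.sqrt(2.0 / n_in), size=(n_in, n_out))
        b = np.zeros(n_out)
        params.append((w, b))
    return params

def mlp_forward(params, x):
    """Forward pass: ReLU hidden layers, log-softmax output."""
    for w, b in params[:-1]:
        x = np.maximum(x @ w + b, 0.0)  # ReLU
    w, b = params[-1]
    logits = x @ w + b
    # Log-softmax: subtract log-sum-exp so rows are normalized log-probs.
    return logits - np.log(np.sum(np.exp(logits), axis=-1, keepdims=True))

rng = np.random.default_rng(0)
params = init_mlp(rng)
batch = rng.normal(size=(8, 784))  # 8 fake flattened MNIST images
log_probs = mlp_forward(params, batch)
print(log_probs.shape)  # (8, 10)
```

The colab would of course use Trax's own layers and Trainer instead; the point is just to show the shape of the model a reader would start from.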
A meta-question: where would be a good place to put such open TODOs? Where would you expect them?
from trax.
Maybe this is something outside contributors can help with. This also brings to mind the recent discussions on Twitter. While some of those discussions are about API design and library scope, they have me thinking about ways the community can help with the docs for Trax.
One of my favorite ways to think of this is how "What nobody tells you about documentation" breaks it down into:
- Tutorials
- How-To Guides
- Explanation
- References
I suspect that, at this point, references are the key missing artifact. @lukaszkaiser indicated the team is currently working on them. If the team is inclined, it could be worth their time to tag a few issues "good first issue" or "contributions welcome" for newcomers to start helping out.
Current Trax Docs Inventory
Besides README.md and the code API, the other documentation artifacts are currently:
Trax Quick Intro Colab
Short, sweet, and to the point; I really liked this colab.
Reformer Colabs:
These are currently How-To guides for three tasks.
Edit: there is also the layers/intro notebook.
What Docs Should be Produced Next?
Curious to hear from the contributors and community what they feel is needed next.
Personally, I think it's important to know which aim of the library takes precedence:
A - The examples/models -- e.g., people coming in to use the Reformer, to continue where Tensor2Tensor left off, or for a Transformer implementation that works on CPUs/GPUs/TPUs. Or;
B - To learn a new deep learning library from the ground up.
I created #897 for starters; please take it if you wish :). We also updated the main page README and the trax intro colab, which now has a direct Transformer decoding part with a pre-trained En-De model. So there is one trained model; I'm happy to add an LM too if you think having a decoder-only model would be helpful.
Will create more issues (and tags/labels) soon, thanks for the suggestions!!
Same question. I was kind of expecting an API Reference given the statement.
@jalammar Just the other day I was trying to visualize the model layers in a better way (like Keras's model summary). I found some moderately helpful code in the layers intro.ipynb, but it's incomplete for the models part. I started with this small cleanup function; it may need a more recursive approach to pretty-print or produce a graphical tree-like layout of the model, which would also help in debugging models.
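The recursive approach could look something like the sketch below. It walks a nested (name, sublayers) structure -- a hypothetical stand-in for Trax's Serial/Parallel combinators, not the real objects -- and indents each level, Keras-summary style.

```python
def summarize(layer, indent=0):
    """Recursively pretty-print a nested layer structure.

    `layer` is either a (name, sublayers) tuple for a combinator or a
    plain string for a leaf layer. This structure is a placeholder for
    Trax's real layer objects, used here only to show the recursion.
    """
    pad = "  " * indent
    if isinstance(layer, tuple):
        name, sublayers = layer
        lines = [f"{pad}{name}["]
        for sub in sublayers:
            lines.extend(summarize(sub, indent + 1))
        lines.append(f"{pad}]")
        return lines
    return [f"{pad}{layer}"]

# A toy model description (names are made up for illustration).
model = ("Serial", ["Embedding_32000_512",
                    ("Serial", ["SelfAttention", "Dense_512"]),
                    "LogSoftmax"])
print("\n".join(summarize(model)))
```

Adapting this to real Trax models would mean reading each layer's name and sublayers from the actual layer objects instead of tuples.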
That's wonderful, @lukaszkaiser! Congrats for shipping!
Since reading your comment, I have already started working on a Transformer colab for trax/model/transformer. I'll be sharing a draft in the next couple of days. I think I have a gentle narrative nailed down for it. If there's any additional guidance on what you and the team want to see in the colab, please let me know. I'm considering a flow that starts with a trained TransformerLM, builds some confidence in the reader by establishing the model, the blocks, and the Trainer, and then moves to the Transformer model. I've always found it beneficial for readers to go over a trained model first, then start learning about the training process once they're comfortable with the model.
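For the "trained model first" opening, the colab could begin with a greedy-decoding loop like this sketch. Here `next_token_logits` is a hypothetical stand-in for a call to a trained TransformerLM, not Trax's actual decoding API; it just makes the loop runnable.

```python
import numpy as np

def next_token_logits(prefix, vocab_size=10):
    """Hypothetical stand-in for a trained LM: returns next-token logits
    given the prefix. This toy version simply favors (last_token + 1)."""
    logits = np.zeros(vocab_size)
    logits[(prefix[-1] + 1) % vocab_size] = 5.0
    return logits

def greedy_decode(start_token, max_len=5):
    """Greedy decoding: repeatedly append the argmax next token."""
    tokens = [start_token]
    for _ in range(max_len - 1):
        tokens.append(int(np.argmax(next_token_logits(tokens))))
    return tokens

print(greedy_decode(0))  # [0, 1, 2, 3, 4]
```

In the real colab, the stand-in function would be replaced by a forward pass through the pre-trained TransformerLM, and the loop could later be swapped for sampling to show the difference.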
For the other two (API docs for Conv/pooling layers and the image-models colab), maybe create an issue for each (for visibility) and tag them with "documentation", "good first issue", and "contributions welcome" (this repo has a "help wanted" tag; TensorFlow and JAX use "contributions welcome"; either can work)?