
brentyi / tyro


Zero-effort CLI interfaces & config objects, from types

Home Page: https://brentyi.github.io/tyro

License: MIT License

Language: Python 100.00%

Topics: argparse, argument-parsing, dataclasses

tyro's Introduction


tyro logo

Documentation   •   pip install tyro



tyro.cli() generates command-line interfaces via Python type introspection. We can define configurable scripts using functions:

"""A command-line interface defined using a function signature.

Usage: python script_name.py --foo INT [--bar STR]
"""

import tyro

def main(
    foo: int,
    bar: str = "default",
) -> None:
    ...  # Main body of a script.

if __name__ == "__main__":
    # Generate a CLI and call `main` with its two arguments: `foo` and `bar`.
    tyro.cli(main)

Or instantiate configuration objects defined using tools like dataclasses, pydantic, and attrs:

"""A command-line interface defined using a class signature.

Usage: python script_name.py --foo INT [--bar STR]
"""

from dataclasses import dataclass
import tyro

@dataclass
class Config:
    foo: int
    bar: str = "default"

if __name__ == "__main__":
    # Generate a CLI and instantiate `Config` with its two arguments: `foo` and `bar`.
    config = tyro.cli(Config)

    # Rest of script.
    assert isinstance(config, Config)  # Should pass.

Other features include helptext generation, nested structures, shell completion, and subcommands. For examples and the API reference, see our documentation.
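
A minimal sketch of how the nested-structure and subcommand features compose (illustrative only; the flag and subcommand names below follow the patterns shown elsewhere on this page, and the documentation is the authoritative reference):

from dataclasses import dataclass
from typing import Union

import tyro


@dataclass(frozen=True)
class AdamConfig:
    learning_rate: float = 1e-3


@dataclass(frozen=True)
class SgdConfig:
    learning_rate: float = 1e-2
    momentum: float = 0.9


@dataclass
class TrainConfig:
    # Nested structure: exposed as --optimizer.learning-rate, --seed, etc.
    optimizer: Union[AdamConfig, SgdConfig] = AdamConfig()
    seed: int = 0


if __name__ == "__main__":
    # The union field becomes a subcommand, e.g.:
    #   python train.py optimizer:sgd-config --optimizer.momentum 0.95
    config = tyro.cli(TrainConfig)
    print(config)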

In the wild

tyro is designed to be lightweight enough for throwaway scripts, while facilitating type safety and modularity for larger projects. Examples:

  • nerfstudio-project/nerfstudio: Open-source tools for neural radiance fields.
  • Sea-Snell/JAXSeq: Train very large language models in Jax.
  • kevinzakka/obj2mjcf: Interface for processing OBJ files for Mujoco.
  • blurgyy/jaxngp: CUDA-accelerated implementation of instant-ngp, in JAX.
  • NVIDIAGameWorks/kaolin-wisp: PyTorch library for neural fields.
  • autonomousvision/sdfstudio: Unified framework for surface reconstruction.
  • openrlbenchmark/openrlbenchmark: Collection of tracked experiments for reinforcement learning.
  • vwxyzjn/cleanrl: Single-file implementation of deep RL algorithms.

Alternatives

tyro is an opinionated library. If any design decisions don't make sense, feel free to file an issue!

You might also consider one of many alternative libraries. Some that we particularly like:

  • simple-parsing and jsonargparse, which provide deeper integration with configuration file formats like YAML and JSON.
  • clipstick, which focuses on generating CLIs from Pydantic models.
  • datargs, which provides a minimal API for dataclasses.
  • fire and clize, which support arguments without type annotations.

We also have some notes on tyro's design goals and other alternatives in the docs here.

tyro's People

Contributors

blurgyy, brentyi, eltociear, jessefarebro, kevin-thankyou-lin, kevinddchen, kevinzakka, kianmeng, pwais, pyetras, shenhanqian, supern1ck, vwxyzjn


tyro's Issues

Wondering if this works

Hello tyro team,

Thank you for creating this beautiful tool.

I am creating a dummy demo to see if tyro can work in this way. The idea is very straightforward.

import typing as t
from dataclasses import dataclass
from pathlib import Path


@dataclass
class DatasetJitterConfig:
    image_folder: Path


@dataclass
class SwapDatasetConfig:
    image_folder: Path


def func1(config: SwapDatasetConfig):
    print(config)


def func2(config: DatasetJitterConfig):
    print(config)


if __name__ == "__main__":
    import tyro

    tyro.cli(t.Union[func1, func2])

and when I run this program, it gives me:

python filename.py func1 --config.image-folder 1 
Traceback (most recent call last):
  File "/home/jizong/workspace/nerfstudio-github/scripts/dataset_jetter.py", line 114, in <module>
    tyro.cli(t.Union[func1, func2])
  File "/opt/miniconda3/envs/nerfstudio/lib/python3.8/site-packages/tyro/_cli.py", line 171, in cli
    output = _cli_impl(
  File "/opt/miniconda3/envs/nerfstudio/lib/python3.8/site-packages/tyro/_cli.py", line 422, in _cli_impl
    out, consumed_keywords = _calling.call_from_args(
  File "/opt/miniconda3/envs/nerfstudio/lib/python3.8/site-packages/tyro/_calling.py", line 155, in call_from_args
    value, consumed_keywords_child = call_from_args(
  File "/opt/miniconda3/envs/nerfstudio/lib/python3.8/site-packages/tyro/_calling.py", line 34, in call_from_args
    f, type_from_typevar, field_list = _fields.field_list_from_callable(
  File "/opt/miniconda3/envs/nerfstudio/lib/python3.8/site-packages/tyro/_fields.py", line 201, in field_list_from_callable
    raise _instantiators.UnsupportedTypeAnnotationError(field_list.message)
tyro._instantiators.UnsupportedTypeAnnotationError: `default_instance` is supported only for select types: dataclasses, lists, NamedTuple, TypedDict, etc.

Could you please give me some suggestions on this demo?

Best
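
(A pattern that does work with tyro.cli, for comparison, is passing a union over the config dataclasses themselves and dispatching on the parsed result afterwards. The sketch below reuses the two config classes from the snippet above; it's one possible workaround rather than an official answer.)

from dataclasses import dataclass
from pathlib import Path
from typing import Union

import tyro


@dataclass
class DatasetJitterConfig:
    image_folder: Path


@dataclass
class SwapDatasetConfig:
    image_folder: Path


def func1(config: SwapDatasetConfig) -> None:
    print(config)


def func2(config: DatasetJitterConfig) -> None:
    print(config)


if __name__ == "__main__":
    # Each dataclass in the union becomes a subcommand; dispatch on the result's type.
    config = tyro.cli(Union[SwapDatasetConfig, DatasetJitterConfig])
    if isinstance(config, SwapDatasetConfig):
        func1(config)
    else:
        func2(config)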

Pydantic validators raise a pydantic.ValidationError instead of a tyro instantiation error

Pydantic has strong property validation capabilities that go beyond what type annotations can express.
Consider the following pydantic model which has a custom validator included:

from pydantic import BaseModel, field_validator


class Model(BaseModel):
    a: int

    @field_validator("a")
    @classmethod
    def below_ten(cls, v: int) -> int:
        if v > 10:
            raise ValueError("Must be below 10")
        return v

Tyro doesn't take them into account during model validation, which is (I think) taking place somewhere here:

value = arg.lowered.instantiator(value)

Pydantic does the above validation at model instantiation, which is happening here:

return unwrapped_f(*positional_args, **kwargs), consumed_keywords # type: ignore

It would be great to leverage the validation rules inside tyro.
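
(Until then, one workaround sketch, assuming the pydantic error propagates out of tyro.cli as described above, is to catch pydantic.ValidationError around the call and surface it as a CLI error:)

import pydantic
import tyro


class Model(pydantic.BaseModel):
    a: int

    @pydantic.field_validator("a")
    @classmethod
    def below_ten(cls, v: int) -> int:
        if v > 10:
            raise ValueError("Must be below 10")
        return v


if __name__ == "__main__":
    try:
        model = tyro.cli(Model)
    except pydantic.ValidationError as e:
        # Show pydantic's validation message instead of a full traceback.
        raise SystemExit(f"error: {e}")
    print(model)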

Cannot use `tuple` and `list` in python 3.9

Hi all,

I have a problem when using the new tuple and list type annotation with tyro. It gives me the following error:

AttributeError: type object 'tuple' has no attribute 'copy_with'

The code runs fine with Tuple and List.
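
(For reference, a workaround sketch on Python 3.9 is to keep using the typing generics, which the report above confirms still parse:)

from dataclasses import dataclass, field
from typing import List, Tuple

import tyro


@dataclass
class Config:
    # typing.Tuple / typing.List behave like the builtin generics but avoid the
    # `copy_with` error described above on Python 3.9.
    points: Tuple[int, ...] = (1, 2, 3)
    names: List[str] = field(default_factory=list)


if __name__ == "__main__":
    print(tyro.cli(Config))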

Multiple class choices & initialization functions

Hi tyro team,

Thanks for a great library, tyro is a blast and really helps simplify config code so far!

There is a convoluted scenario I'm still trying to figure out how to apply tyro to:
Foo and Bar are interchangeable classes (i.e. think 2 different dataset implementations), and each can be constructed with different constructors.

class Foo(Baz):
    def __init__(self, a, b, c):
        super().__init__(a, b)
        self.c = c

    @classmethod
    def from_X(cls, arg1, arg2):
        return Foo(arg1, arg2, "X")

    @classmethod
    def from_Y(cls, arg3, arg4):
        return Foo(arg3, arg4, "Y")


class Bar(Baz):
    def __init__(self, a, b, d):
        super().__init__(a, b)
        self.d = d

    @classmethod
    def from_X(cls, arg1, arg2):
        return Bar(arg1, arg2, "another X")

    @classmethod
    def from_Z(cls, arg5, arg6):
        return Bar(arg5, arg6, "Z")

We have a function which generates a dataclass automatically from functions / class `__init__`s.
We want our config system to be able to:

  1. Specify whether a Foo or Bar config should be created, something similar to the Subcommands except we don't have explicit dataclasses (as they're constructed dynamically).

# Goal is to replace these with dynamically created configs, based on the user passing my_baz_impl:foo or my_baz_impl:bar
def main_func(my_baz_impl: Union[Foo, Bar]) -> None:

  2. Accommodate the different from_... construction methods within the same config, but still allow the CLI to explicitly show that these are separate argument groups (i.e. Foo.from_X has arg1, arg2 and Foo.from_Y has arg3 and arg4).

What's the best practice to approach this with tyro?

Thanks in advance!

Support variadic positional and keyword arguments

This shouldn't be too hard. Just a matter of changing the type when parsing the signature and having another internal marker to unpack the arguments before the call.

e.g.,

from typing import Dict, Tuple
import tyro

def main(*args: str, **kwargs: str) -> Tuple[Tuple[str, ...], Dict[str, str]]:
    return args, kwargs

tyro.cli(main)

YAML serialization helpers

This API hasn't been touched in a while, which has caused some issues like #7.

Things that probably won't work:

  • Annotating a field with a base class, then assigning an instance of a subclass. This is fixable by iterating over cls.__subclasses__().
  • Annotating a field with a protocol, then assigning a value that correctly implements the protocol.

Some questions to consider:

  • Is the serialization API actually solving an issue people have; is it useful enough to keep around? PyYAML works pretty well.
  • Is this actually in scope for dcargs? We're only scratching the surface on potential features; readable + robust serialization could be its own project.
  • dcargs.cli() previously only supported dataclasses, but scope has expanded since then. Does it still make sense for the serialization helpers to be hyper-targeted on dataclasses?

Two options for next steps:

  1. Fix or document all the bugs and caveats. Not sure we have the energy for this 🥲
  2. Deprecate the API.

Assertion won't be formatted correctly in _arguments

Sorry for the issue spamming, but here's a small bug report: this assertion needs to be replaced by an exception to properly display which fields have issues in their defaults. I would submit a PR, but I'll probably get the type of exception to use wrong :D

tyro/tyro/_arguments.py

Lines 198 to 201 in e88e690

assert False, (
"Expected a boolean as a default for {arg.field.name}, but got"
" {lowered.default}."
)

Support for variable referencing

I've been looking for an alternative to Hydra for config management, specifically one that allows for defining configs in Python, and I stumbled across Tyro which seems like a great library for my use case after some experimentation.

However, one thing that doesn't appear to be possible is referencing a single variable from multiple places in a nested config. As for why this might be needed, it is very common in an ML codebase to require the same parameter in many different places. For example, the number of classification classes might be used in the model construction, visualization, etc.

We might want this value to be dependent on a config group such as the dataset (i.e. each dataset might have a different number of classes). Instead of manually defining each combination of model + dataset, it would be a lot easier to have the model parameters simply reference the dataset parameter, or have them both reference some top-level variable. With Hydra, there is value interpolation that does this.

Since we can define Tyro configs directly in Python, it seems like this could be made much more powerful with support for arbitrary expressions allowing small pieces of logic to be defined in a configuration (e.g., for a specific top-level config we can have a model parameter be 4 * num_classes). Clearly, we could simply make the 4 into a new parameter but there are good reasons we might want it in the config instead.

From what I can tell, this type of variable referencing, even without any expressions, is not currently possible with Tyro.
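
(There's no interpolation mechanism described here, but because tyro configs are plain Python objects, one common workaround is deriving dependent values after parsing. A sketch, not a tyro feature; the sentinel convention is made up:)

from dataclasses import dataclass, field

import tyro


@dataclass
class DatasetConfig:
    num_classes: int = 10


@dataclass
class ModelConfig:
    # Sentinel: -1 means "derive this from the dataset config after parsing".
    output_dim: int = -1


@dataclass
class ExperimentConfig:
    dataset: DatasetConfig = field(default_factory=DatasetConfig)
    model: ModelConfig = field(default_factory=ModelConfig)


if __name__ == "__main__":
    config = tyro.cli(ExperimentConfig)
    # Resolve "references" in ordinary Python, e.g. output_dim = 4 * num_classes.
    if config.model.output_dim == -1:
        config.model.output_dim = 4 * config.dataset.num_classes
    print(config)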

Request to add jaxngp to tyro's README.md

Hey brentyi!

I hope you're doing great! I wanted to take a moment to say how awesome the tyro library is. I've been using tyro in my project jaxngp, where I replicated instant-ngp in JAX, and it works really nicely. I was wondering if you would be open to the idea of including a link to my project in tyro's README.md; I think it could be a nice addition.

Thanks a lot for taking the time to consider it!

Best regards,
blurgyy

Faster + lazy helptext generation

nerfstudio's ns-train function currently has ~500 arguments, which results in nearly 0.4 seconds (!!) of dcargs overhead. That's huge!

It's currently still a small part of overall startup time, but some profiling shows that most of it is spent on helptext formatting; about 0.1 seconds for rich operations and 0.2 seconds for docstring parsing.

Most of the time, the helptext isn't even used; we should find ways to run less logic and faster logic. More intelligent caching and lazy strings would likely speed things up by ~an order of magnitude.

Fallthrough args for subcommands

One thing that's really nice about CLI11 is fallthrough args.

This isn't supported by argparse natively, which means that instead of writing something like:

python x.py subcommand1 subcommand2 {--options for the root parser of x.py, subcommand1, and subcommand2}

we're forced to write:

python x.py {--options for the root parser of x.py} subcommand1 {--options for subcommand1} subcommand2 {--options for subcommand2}

Which requires much more cognitive energy, because we need to be careful about where arguments are placed.

We should be able to partially solve this: it won't be as elegant as CLI11, but adding a flag that, when the subcommand tree is built, distributes arguments applied to intermediate nodes down to the leaves of the tree would enable the syntax in the first example.

We basically have two approaches for this:

(1) Refactor ParserSpecification.apply() to support this. This would require big changes to the way "sibling" subcommands are handled.
(2) Keep the current ParserSpecification / argparse.ArgumentParser construction logic, but as a post-processing step move all argparse groups for intermediate subcommand nodes to leaves below them. This feels hackier but might be simpler.

More flexibility to the core parsing API

Hi! Just wanted to start off by saying that the library looks great 🎉. I'm trying to assess the feasibility of Tyro for my use case, and there are a couple of suggestions that maybe you'd consider:

  1. It would be great to have more flexibility to get the parsed arguments without calling the function. I was thinking potentially having a function that returns something like inspect.BoundArguments after parsing is complete. This way we can have more flexibility on how we call the function. Another possibility might be an API where you don't provide a function, but just a dataclass that'll get hydrated from the CLI. EDIT: I see now that you can parse a dataclass using tyro.cli.

  2. It would be great if there was a flag like strict that controls whether argparse attempts to parse all arguments or only known arguments. For example, you could parse known arguments with Tyro then default back to absl to parse Jax config flags. To accomplish this, you'd need some way of knowing what Tyro wasn't able to parse. This is somewhat possible already by doing:

    1. _, unknown_args = tyro.extras.get_parser(f).parse_known_args()
    2. Filter out unknown_args from sys.argv and parse these separately.
    3. tyro.cli(f, args=...) with the filtered args.

    It would be nice if there was an easier way to perform this. Perhaps this could be specifically added to (1) where if you specify strict=False it'll return a tuple BoundArguments, List[str] where the second element is the unknown arguments.

  3. A crazy feature that might be useful to others is if there was a way to go from a dataclass instance (or annotation of a dataclass) to the (minimal?) CLI arguments needed to generate that dataclass. e.g., something like tyro.extras.to_cli_args. The use-case here is to keep all your configs / sweeps in Python, this would be specifically useful for sweeps. You could have a generator over (annotated?) dataclasses and then convert those to CLI arguments when launching a job.
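
(For item 2, the three-step workaround described above might look roughly like this sketch. It leans on tyro.extras.get_parser, mentioned in the item itself; the filtering step is naive and assumes flag values don't collide with unknown flags:)

import sys

import tyro


def main(learning_rate: float = 3e-4, seed: int = 0) -> None:
    print(learning_rate, seed)


if __name__ == "__main__":
    # 1. Use the argparse parser that tyro generates to split known vs. unknown arguments.
    _, unknown_args = tyro.extras.get_parser(main).parse_known_args()

    # 2. Filter the unknown arguments out of sys.argv (naively).
    known_argv = [a for a in sys.argv[1:] if a not in unknown_args]

    # 3. Run tyro on the filtered arguments; hand `unknown_args` to another parser (e.g. absl).
    tyro.cli(main, args=known_argv)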

Visibility of parameter options across subcommands with defaults

Hi Brent,

First off, I'm really liking this framework! I have a use case that kind of combines "base configs as subcommands" with "sequenced subcommands".

Say I have a module that has two submodules, A and B. Furthermore, say each submodule has several possible "typical" configurations, e.g. A1, A2..., B1, B2,...

What I would like to do is simultaneously:

  1. Set up base configs for all combinations of the typical configs for both A and B, without having to enumerate all combinations e.g. A1B1, A1B2, etc...
  2. View, from -h, the possible options for both submodules A and B.

Is there currently a way of doing this? I've attached 2 examples. The first one sets up all base configs for both, but doesn't list all options with -h (it only lists options for the most recent subcommand). The second one will display all the possible parameter options for both A and B with -h (after one of the subcommands is specified).

I'm not even sure if what I'm trying to do is possible in a "subcommand" sense? I've also tried the AvoidSubcommands type but I can't really make that work either.

Thanks,
Mark

a.py:

from dataclasses import dataclass
from typing import Annotated, Union

import tyro
from tyro.conf import subcommand

@dataclass(frozen=True)
class SubModuleAConfig:
    param1: float
submoda_defaults = {
    'basic': SubModuleAConfig(param1=1.),
    'fancy': SubModuleAConfig(param1=2.2),
}
submoda_descriptions = {
    'basic': 'Basic config',
    'fancy': 'Fancy config'
}
SubModuleADefaultsType = tyro.extras.subcommand_type_from_defaults(
    submoda_defaults, submoda_descriptions
)

@dataclass(frozen=True)
class SubModuleBConfig:
    param2: int
submodb_defaults = {
    'basic': SubModuleBConfig(param2=0),
    'fancy': SubModuleBConfig(param2=-5),
}
submodb_descriptions = {
    'basic': 'Basic config',
    'fancy': 'Fancy config'
}
SubModuleBDefaultsType = tyro.extras.subcommand_type_from_defaults(
    submodb_defaults, submodb_descriptions
)

@dataclass
class FullModuleConfig:
    suba: SubModuleADefaultsType
    subb: SubModuleBDefaultsType

if __name__ == '__main__':
    full_module_config = tyro.cli(FullModuleConfig)
    print(full_module_config)

Output:

$ python a.py suba:basic subb:basic -h
usage: a.py suba:basic subb:basic [-h] [--subb.param2 INT]

Basic config

╭─ arguments ─────────────────────────────────────────────╮
│ -h, --help              show this help message and exit │
╰─────────────────────────────────────────────────────────╯
╭─ subb arguments ────────────────────────────────────────╮
│ --subb.param2 INT       (default: 0)                    │
╰─────────────────────────────────────────────────────────╯

b.py

from dataclasses import dataclass
from itertools import product
from typing import Annotated, Union

import tyro
from tyro.conf import subcommand

@dataclass(frozen=True)
class SubModuleAConfig:
    param1: float
submoda_defaults = {
    'basic': SubModuleAConfig(param1=1.),
    'fancy': SubModuleAConfig(param1=2.2),
}
submoda_descriptions = {
    'basic': 'Basic config',
    'fancy': 'Fancy config'
}

@dataclass(frozen=True)
class SubModuleBConfig:
    param2: int
submodb_defaults = {
    'basic': SubModuleBConfig(param2=0),
    'fancy': SubModuleBConfig(param2=-5),
}
submodb_descriptions = {
    'basic': 'Basic config',
    'fancy': 'Fancy config'
}

@dataclass
class FullModuleConfig:
    suba: SubModuleAConfig
    subb: SubModuleBConfig

all_defaults = {}
all_descriptions = {}
combos = product(submoda_defaults.items(), submodb_defaults.items())
for (suba_name, suba_config), (subb_name, subb_config) in combos:
    name = f'A{suba_name}_B{subb_name}'
    all_defaults[name] = FullModuleConfig(
        suba=suba_config,
        subb=subb_config,
    )
    all_descriptions[name] = f'A: {submoda_descriptions[suba_name]}, ' \
        + f'B: {submodb_descriptions[subb_name]}'

if __name__ == '__main__':
    full_module_config = tyro.cli(
        tyro.extras.subcommand_type_from_defaults(
            all_defaults,
            all_descriptions,
        )
    )
    print(full_module_config)

Output:

$ python b.py Abasic_Bbasic -h
usage: b.py Abasic_Bbasic [-h] [--suba.param1 FLOAT] [--subb.param2 INT]

A: Basic config, B: Basic config

╭─ arguments ─────────────────────────────────────────────╮
│ -h, --help              show this help message and exit │
╰─────────────────────────────────────────────────────────╯
╭─ suba arguments ────────────────────────────────────────╮
│ --suba.param1 FLOAT     (default: 1.0)                  │
╰─────────────────────────────────────────────────────────╯
╭─ subb arguments ────────────────────────────────────────╮
│ --subb.param2 INT       (default: 0)                    │
╰─────────────────────────────────────────────────────────╯

Custom datamanager + tyro error

Hi there!

I asked a question in discord channel, I want to just follow it up here, my question was following:

I want to implement new datamodules but don't want to include them in the directory of nerfstudio, but rather want to keep them externally. I copied the train.py code, and add my own configs to method_configs dictionary. I call tyro.cli with modified AnnotatedBaseConfigUnion, but tyro gives AssertionError. I realized I have to modify datamanagers.py and my new dataparserconfig but I want to avoid modifying any code in nerfstudio. What is the correct approach to achieve this?

Here is my code:

import pathlib
import sys

current_path = pathlib.Path(__file__).parent.resolve()
sys.path.append(str(current_path.parent / "dependencies/nerfstudio/scripts"))
from scripts.train import main

from dataclasses import dataclass, field
from pathlib import Path
from typing import Type

import tyro
# from nerfstudio.cameras.camera_optimizers import CameraOptimizerConfig
from nerfstudio.configs.base_config import Config
from nerfstudio.data.datamanagers import VanillaDataManagerConfig
from nerfstudio.data.dataparsers.blender_dataparser import BlenderDataParserConfig, Blender
# from nerfstudio.data.dataparsers.friends_dataparser import FriendsDataParserConfig
# from nerfstudio.data.dataparsers.nerfstudio_dataparser import NerfstudioDataParserConfig
from nerfstudio.engine.optimizers import AdamOptimizerConfig, RAdamOptimizerConfig
from nerfstudio.models.base_model import VanillaModelConfig
from nerfstudio.models.vanilla_nerf import NeRFModel
from nerfstudio.pipelines.base_pipeline import VanillaPipelineConfig
# from nerfstudio.pipelines.dynamic_batch import DynamicBatchPipelineConfig
from nerfstudio.configs.config_utils import convert_markup_to_ansi
from nerfstudio.configs.method_configs import method_configs, descriptions
from nerfstudio.data.dataparsers.base_dataparser import DataParserConfig

@dataclass
class TempDataParserConfig(DataParserConfig):
    _target: Type = field(default_factory=lambda: Blender)
    """target class to instantiate"""
    data: Path = Path("data/blender/lego")
    """Directory specifying location of data."""
    scale_factor: float = 1.0
    """How much to scale the camera origins by."""
    alpha_color: str = "white"
    """alpha color of background"""

def entrypoint():
    """Entrypoint for use with pyproject scripts."""
    # Choose a base configuration and override values.
    tyro.extras.set_accent_color("bright_yellow")

    # Add hyperlight models to the method configs and descriptions
    descriptions["temp-model"] = "Temp-model"
    method_configs["temp-model"] = Config(
        method_name="vanilla-nerf",
        pipeline=VanillaPipelineConfig(
            datamanager=VanillaDataManagerConfig(
                dataparser=TempDataParserConfig(),
                # dataparser=BlenderDataParserConfig(),
                train_num_images_to_sample_from = 8,
                train_num_times_to_repeat_images = 4,
                eval_num_images_to_sample_from = 2,
                eval_num_times_to_repeat_images = 1,
            ),
            model=VanillaModelConfig(_target=NeRFModel),
        ),
        optimizers={
            "fields": {
                "optimizer": RAdamOptimizerConfig(lr=5e-4, eps=1e-08),
                "scheduler": None,
            }
        },
    )

    AnnotatedBaseConfigUnion = tyro.conf.SuppressFixed[
        # Don't show unparseable (fixed) arguments in helptext.
        tyro.extras.subcommand_type_from_defaults(
            defaults=method_configs, descriptions=descriptions
        )
    ]
    main(
        tyro.cli(
            AnnotatedBaseConfigUnion,
            description=convert_markup_to_ansi(__doc__),
        )
    )


if __name__ == "__main__":
    entrypoint()

Here is the error it produces:

Traceback (most recent call last):
  File "/host/scripts/train_temp.py", line 83, in <module>
    entrypoint()
  File "/host/scripts/train_temp.py", line 75, in entrypoint
    tyro.cli(
  File "/opt/conda/lib/python3.9/site-packages/tyro/_cli.py", line 125, in cli
    _cli_impl(
  File "/opt/conda/lib/python3.9/site-packages/tyro/_cli.py", line 275, in _cli_impl
    parser_definition = _parsers.ParserSpecification.from_callable_or_type(
  File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 99, in from_callable_or_type
    subparsers_attempt = SubparsersSpecification.from_field(
  File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 383, in from_field
    subparser = ParserSpecification.from_callable_or_type(
  File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 129, in from_callable_or_type
    nested_parser = ParserSpecification.from_callable_or_type(
  File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 129, in from_callable_or_type
    nested_parser = ParserSpecification.from_callable_or_type(
  File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 99, in from_callable_or_type
    subparsers_attempt = SubparsersSpecification.from_field(
  File "/opt/conda/lib/python3.9/site-packages/tyro/_parsers.py", line 357, in from_field
    assert default_name is not None
AssertionError

I would appreciate your help achieving such functionality.

How to specify an alias for a parameter

If a parameter name is too complex and I want to specify an abbreviation for it, how should I write it?
Similar to this:
parser.add_argument("--target_img", "--ti", required=False, help="Path of reference image.")
--target_img and --ti
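
(For reference: recent tyro versions expose tyro.conf.arg, and, assuming a version where its aliases parameter is available, a short flag can be attached via Annotated. A sketch:)

from dataclasses import dataclass
from typing import Annotated, Optional

import tyro


@dataclass
class Args:
    # Roughly equivalent to `--target_img` / `--ti` in the argparse snippet above.
    target_img: Annotated[
        Optional[str], tyro.conf.arg(aliases=["--ti"], help="Path of reference image.")
    ] = None


if __name__ == "__main__":
    print(tyro.cli(Args))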

Subcommands for choosing between base configs / defaults

One super common pattern is:

  1. Define a set of base configurations, for example:

    base_configs = {
        "small": ExperimentConfig(
            dataset="mnist",
            optimizer=SgdOptimizer(),
            batch_size=2048,
            num_layers=4,
            units=64,
            train_steps=30_000,
            # The dcargs.MISSING sentinel allows us to specify that the seed should have no
            # default, and needs to be populated from the CLI.
            seed=dcargs.MISSING,
            activation=nn.ReLU,
        ),
        "big": ExperimentConfig(
            dataset="imagenet-50",
            optimizer=AdamOptimizer(),
            batch_size=32,
            num_layers=8,
            units=256,
            train_steps=100_000,
            seed=dcargs.MISSING,
            activation=nn.GELU,
        ),
    }
  2. Choose one of these base configurations, and then use the CLI to override fields in it.

This behavior can currently be achieved by either reading from an environment variable or manually inspecting sys.argv, as we do in the base config example.

However, both of these methods have drawbacks: they feel hacky, aren't compatible with tools like shtab or argcomplete, and can make it hard to introduce CLI flags that don't need to be part of the config object.

The ultimate behavior also looks identical to adding subcommands, which argparse fully supports. Note that we could already create subcommands if we just created a subclass for each base config and passed in Union[SmallExperimentConfig, BigExperimentConfig], but this seems clunky.


What would be nice: being able to define a set of subcommands that correspond to a single nested type, where each subcommand differs only in the defaults that are used.

Some API options:

  1. import dcargs
    from typing import Annotated, Optional
    import pathlib
    
    base_configs = {...}
    
    def train(
        config: Annotated[ExperimentConfig, dcargs.extras.magically_produce_subcommands(base_configs)],
        restore_checkpoint_path: Optional[pathlib.Path] = None,
    ) -> None:
        ...

    This is reasonably succinct, but can't be type-checked.

  2. import dcargs
    from typing import Optional
    import pathlib
    
    base_configs = {...}
    
    def train(
        config: ExperimentConfig = dcargs.extras.magically_produce_subcommands(base_configs),
        restore_checkpoint_path: Optional[pathlib.Path] = None,
    ) -> None:
        ...

    This is slightly more succinct, and in contrast to using Annotated[] with some hacking could be statically type-checked (similar to dataclasses.field()). But it becomes unclear how this field should be handled when train() is called outside of dcargs.cli().

  3. import pathlib
    from typing import Annotated, Mapping, Optional, Type, TypeVar, Union
    
    import dcargs
    
    
    base_configs = {...}
    
    
    SmallExperimentConfig = Annotated[
        ExperimentConfig,
        dcargs.extras.SubcommandMetadata(name="small", default_instance=base_configs["small"]),
    ]
    BigExperimentConfig = Annotated[
        ExperimentConfig,
        dcargs.extras.SubcommandMetadata(name="big", default_instance=base_configs["big"]),
    ]
    
    
    def train(
        config: Union[SmallExperimentConfig, BigExperimentConfig],
        restore_checkpoint_path: Optional[pathlib.Path] = None,
    ) -> None:
        ...

    This is... verbose, and there's no way to type-check the default value, but it's also much more general and follows the existing Union[] idiom for creating subparsers.

    To reduce verbosity, we could write a helper that dynamically returns the necessary Union[Annotated[...], Annotated[...]] type from a dictionary. Pyright understands this, but unfortunately mypy doesn't:

    T = TypeVar("T")
    
    def make_subcommands_type(instance_from_subcommand: Mapping[str, T]) -> Type[T]:
        return Union.__getitem__(  # type: ignore
            tuple(
                Annotated.__class_getitem__((type(v), SubcommandMetadata(k, v)))  # type: ignore
                for k, v in instance_from_subcommand.items()
            )
        )
    
    
    base_configs = {...}
    SelectableExperimentConfig = make_subcommands_type(base_configs)
    
    
    def train(
        config: SelectableExperimentConfig,
        restore_checkpoint_path: Optional[pathlib.Path] = None,
    ) -> None:
        reveal_type(config)  # correctly inferred from the type of `base_configs` in Pyright!
        ...

    We could make mypy happy by writing:

    if TYPE_CHECKING:
        SelectableExperimentConfig = ExperimentConfig
    else:
        SelectableExperimentConfig = make_subcommands_type(base_configs)

    Or, if we want to get really crazy:

    SelectableExperimentConfig = ExperimentConfig
    SelectableExperimentConfig = make_subcommands_type(base_configs)  # type: ignore

Some questions to think about:

  1. The syntax options above all seem pretty gross. Is there a cleaner approach, particularly one that's type-safe and/or doesn't require custom metadata? Avoiding this kind of logic is/was a core design goal.
  2. If custom metadata is necessary, is it worth it?

Why move away from poetry?

Just genuinely curious. I've never used poetry before, but I've been curious and have always wanted to try it out.

Were there any pain points in particular?

Subparsing issue with Union types

Hi Brent,

We hit another issue with some nested configs we're using -- basically something is going awry (I think) due to a type Union.

Here's a MWE:

import dataclasses
import dcargs

from typing import Tuple, Union

@dataclasses.dataclass(frozen=True)
class Subtype:
    data: int = 1
    
@dataclasses.dataclass(frozen=True)
class TypeA:
    subtype: Subtype = Subtype(1)

@dataclasses.dataclass(frozen=True)
class TypeB:
    subtype: Subtype = Subtype(2)
    
@dataclasses.dataclass(frozen=True)
class Wrapper:
    supertype: Union[TypeA, TypeB] = TypeA()
    
if __name__ == "__main__":
    wrapper = dcargs.cli(Wrapper) # errors when running with supertype:type-a
    print(wrapper)

If you put this in a module subparsers.py and run $ python subparsers.py, everything works; if you run $ python subparsers.py supertype:type-a, it throws the following error:

File "/opt/conda/lib/python3.7/site-packages/dcargs/_cli.py", line 272, in _cli_impl
    avoid_subparsers=avoid_subparsers,
  File "/opt/conda/lib/python3.7/site-packages/dcargs/_calling.py", line 169, in call_from_args
    avoid_subparsers=avoid_subparsers,
  File "/opt/conda/lib/python3.7/site-packages/dcargs/_calling.py", line 117, in call_from_args
    assert len(parser_definition.subparsers_from_name) > 0
AssertionError

Thanks again for the great package + sorry to raise obscure issues! No problem if this isn't high-priority.

Positional arguments by default

First, thanks for this library. I was looking for an alternative to Typer and ran into this. Just great!

Currently I am playing around with it a bit and have a question:

When using the dataclass approach, is it possible to treat each property which doesn't have a default value as a positional argument?

See also the below code:

from dataclasses import dataclass
from typing import Annotated

import tyro
from tyro import conf


@dataclass
class Car:
    """A car class"""

    name: Annotated[str, conf.Positional]  # <-- this seems to be the only way to indicate that a property is positional.
    """The name of the car."""

    color: str
    """The color of the car."""


if __name__ == "__main__":
    result = tyro.cli(Car)
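
(For reference, tyro.conf also has parser-wide markers; assuming a version that includes the PositionalRequiredArgs marker, fields without defaults can be made positional wholesale. A sketch:)

from dataclasses import dataclass

import tyro


@dataclass
class Car:
    """A car class."""

    name: str
    """The name of the car."""

    color: str = "red"
    """The color of the car."""


if __name__ == "__main__":
    # PositionalRequiredArgs turns every field without a default into a positional argument.
    result = tyro.cli(tyro.conf.PositionalRequiredArgs[Car])
    print(result)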

Handling unions with `os.PathLike`

I'm running into an issue currently trying to parse a type like Union[str, os.PathLike]. An instantiator can't be resolved for PathLike, as it's an abstract base class. Can an exception be made specifically for os.PathLike, as it's so commonly used? e.g., https://github.com/google/etils/blob/main/etils/epath/typing.py#L24

As of Python 3.10 PathLike is a generic type and can be parameterized, e.g., PathLike[str] so it might be possible to fold in this type on Python 3.10+. I'm not sure if there's an easy way to accomplish this in the short term besides adding another exception here: https://github.com/brentyi/tyro/blob/main/tyro/_instantiators.py#L388-L393.

Thoughts?
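
(As a user-side stopgap, a sketch that narrows the annotation to a concrete type; pathlib.Path instances still satisfy os.PathLike for downstream code:)

import os
from dataclasses import dataclass
from pathlib import Path

import tyro


@dataclass
class Args:
    # Instead of Union[str, os.PathLike]: annotate with pathlib.Path directly.
    data: Path = Path("data")


if __name__ == "__main__":
    args = tyro.cli(Args)
    assert isinstance(args.data, os.PathLike)
    print(args.data)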

How to override a non-empty tuple with an empty tuple on the command line?

In the example shown below, we have a config item x that is a non-empty tuple.

# main.py
from dataclasses import dataclass

import tyro


@dataclass
class BaseConfig:
    x: tuple[str, ...] = ("a", "b", "c")


if __name__ == "__main__":
    config = tyro.cli(BaseConfig)

We can overwrite the values in the tuple with the following command:

python main.py --x d e f

But how can we overwrite it with an empty tuple? The command below does not work.

python main.py --x

KeyError with prefix "_" for nested config

Hi there, I got an error when trying to build a config with private sub-configs indicated by a leading underscore.

To reproduce the error:

from dataclasses import dataclass, field
import tyro
@dataclass
class PrivateConfig:
    pass
@dataclass
class BaseConfig:
    _private: PrivateConfig = field(default_factory=lambda: PrivateConfig())
@dataclass
class Level1(BaseConfig):
    pass
@dataclass
class Level2(BaseConfig):
    child: Level1 = field(default_factory=lambda: Level1())
@dataclass
class Level3(BaseConfig):
    child: Level2 = field(default_factory=lambda: Level2())
if __name__ == "__main__":
    tyro.cli(Level3)

The output:

Traceback (most recent call last):
  File "/home/hli/Documents/sdfli/tmp.py", line 32, in <module>
    tyro.cli(Level3)
  File "/home/hli/miniconda3/envs/sdfstudio/lib/python3.8/site-packages/tyro/_cli.py", line 187, in cli
    output = _cli_impl(
  File "/home/hli/miniconda3/envs/sdfstudio/lib/python3.8/site-packages/tyro/_cli.py", line 449, in _cli_impl
    out, consumed_keywords = _calling.call_from_args(
  File "/home/hli/miniconda3/envs/sdfstudio/lib/python3.8/site-packages/tyro/_calling.py", line 122, in call_from_args
    value, consumed_keywords_child = call_from_args(
  File "/home/hli/miniconda3/envs/sdfstudio/lib/python3.8/site-packages/tyro/_calling.py", line 122, in call_from_args
    value, consumed_keywords_child = call_from_args(
  File "/home/hli/miniconda3/envs/sdfstudio/lib/python3.8/site-packages/tyro/_calling.py", line 132, in call_from_args
    subparser_def = parser_definition.subparsers_from_prefix[
KeyError: 'child.child._private'

The error is resolved by replacing _private with private, and it only happens when the nesting level is >= 3. I want to use a leading underscore to indicate private variables. Is there a solution for that?

Thank you!

Support unions that mix structure types and leaf types

Some unions that currently work:

  • DataclassA | DataclassB
  • DataclassA | DataclassB | Tuple[DataclassA, DataclassB]
  • int | str
  • int | List[str]
  • etc

What doesn't (will fail somewhat gracefully, but no reason why it can't just work):

  • DataclassA | int

either `param_1` or `param_2` as required on the same dataclass for different executes script

Hello, I have encountered a circumstance like this:

from dataclasses import dataclass
from pathlib import Path


@dataclass
class ModelConfig:
    source_path: Path
    """Source path where data is included. If specified, model_path shall be assigned a random path."""
    model_path: Path
    """Path for the model, required for prediction."""

Then I use the following commands:

python train.py --model.source-path <xxx> for training; only --model.source-path is required for train.py, while model_path is not required.

python inference.py --model.model-path <xxx> for inference; only --model.model-path is required for inference.py, while source_path can be left empty.

I am wondering how I can make source_path required only for train.py, and model_path required only for inference.py.
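
(One pattern that fits this today, sketched under the assumption that per-script validation is acceptable, is to make both fields Optional and enforce the requirement after parsing:)

from dataclasses import dataclass
from pathlib import Path
from typing import Optional

import tyro


@dataclass
class ModelConfig:
    source_path: Optional[Path] = None
    """Source path where data is included; required for training."""
    model_path: Optional[Path] = None
    """Path for the model; required for prediction."""


if __name__ == "__main__":
    # In train.py: require source_path. inference.py would instead require model_path.
    config = tyro.cli(ModelConfig)
    if config.source_path is None:
        raise SystemExit("error: --source-path is required for training")
    print(config)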

Support for `nargs="+", action="append"`

Hello, this is a really cool project. I have wanted to use it to replace argparse in https://github.com/openrlbenchmark/openrlbenchmark.

There is a small issue though: currently we have a snippet that looks like

import argparse
parser = argparse.ArgumentParser()
parser.add_argument("--filters", nargs="+", action="append", default=[])
args = parser.parse_args()
print(args.filters)
python tyro_test.py \
    --filters '?we=openrlbenchmark&wpn=baselines&ceik=env&cen=exp_name&metric=charts/episodic_return' 'baselines-ppo2-cnn' \
    --filters '?we=openrlbenchmark&wpn=envpool-atari&ceik=env_id&cen=exp_name&metric=charts/avg_episodic_return' 'ppo_atari_envpool_xla_jax_truncation' \
    --filters '?we=openrlbenchmark&wpn=cleanrl&ceik=env_id&cen=exp_name&metric=charts/avg_episodic_return'  'sebulba_ppo_envpool_impala_atari_wrapper?tag=v1.0.0-118-g52e2638' 'cleanba_ppo_envpool_impala_atari_wrapper?tag=v1.0.0-127-g42a800b'
[['?we=openrlbenchmark&wpn=baselines&ceik=env&cen=exp_name&metric=charts/episodic_return', 'baselines-ppo2-cnn'], ['?we=openrlbenchmark&wpn=envpool-atari&ceik=env_id&cen=exp_name&metric=charts/avg_episodic_return', 'ppo_atari_envpool_xla_jax_truncation'], ['?we=openrlbenchmark&wpn=cleanrl&ceik=env_id&cen=exp_name&metric=charts/avg_episodic_return', 'sebulba_ppo_envpool_impala_atari_wrapper?tag=v1.0.0-118-g52e2638', 'cleanba_ppo_envpool_impala_atari_wrapper?tag=v1.0.0-127-g42a800b']]

However, after multiple attempts, I wasn't sure how to do this nargs="+", action="append" with tyro. Is it even possible?

Thanks.
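
(For reference, a sketch assuming a tyro version that provides the tyro.conf.UseAppendAction marker, which is meant to enable append-style parsing for otherwise ambiguous annotations like lists of lists; exact behavior may differ by version:)

from typing import List

import tyro


def main(filters: tyro.conf.UseAppendAction[List[List[str]]] = []) -> None:
    print(filters)


if __name__ == "__main__":
    # Intended usage, each --filters occurrence appending one inner list:
    #   python script.py --filters a b --filters c d e
    tyro.cli(main)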

subcommand parsing with pydantic v1 models does not work.

When using pydantic v1 and creating a model with subcommands, the first model is always used.
See the test below. Using pydantic v1 this fails.

from pydantic import BaseModel
from tyro import cli


def test_pydantic_model_with_sub_commands_and_positionals__success():
    class SubCommandOne(BaseModel):
        name: str

    class SubCommandTwo(BaseModel):
        name: str

    class ModelWithSubcommand(BaseModel):
        sub_command: SubCommandOne | SubCommandTwo

    model = cli(
        ModelWithSubcommand,
        args=["sub-command:sub-command-two", "--sub-command.name", "myname"],
    )
    assert isinstance(model.sub_command, SubCommandTwo)
    assert model.sub_command.name == "myname"

Allow for both `FlagConversion` and direct override

Consider the following snippet.

from dataclasses import dataclass

import tyro

@dataclass
class Args:
    flag: bool = False


if __name__ == "__main__":
    tyro.cli(Args)

Would it be possible to do the following? tyro.conf.FlagConversionOff[bool] provides the ability to manually override, but it also disables the bare --flag behavior.

python main.py --flag # flag=True
python main.py --flag False # flag=False

Currently it is possible to achieve this behavior with argparse which is the default behavior in CleanRL (https://github.com/vwxyzjn/cleanrl/blob/master/cleanrl/c51.py#L34).

import argparse
from distutils.util import strtobool
parser = argparse.ArgumentParser()
parser.add_argument("--flag", type=lambda x: bool(strtobool(x)), default=False, nargs="?", const=True)
args = parser.parse_args()
print(args.flag)
(openrlbenchmark-py3.9) ➜  openrlbenchmark git:(hnsplots) ✗ python test.py --flag
True
(openrlbenchmark-py3.9) ➜  openrlbenchmark git:(hnsplots) ✗ python test.py --flag False
False
(openrlbenchmark-py3.9) ➜  openrlbenchmark git:(hnsplots) ✗ python test.py --flag True 
True
(openrlbenchmark-py3.9) ➜  openrlbenchmark git:(hnsplots) ✗ python test.py --flag 0   
False
(openrlbenchmark-py3.9) ➜  openrlbenchmark git:(hnsplots) ✗ python test.py --flag 1
True

How to get access to attribute doc string

Hi @brentyi,

Thanks for this amazing repo. We are starting to use it at https://github.com/huggingface/trl. However, we ran into some auto docstring issues.

I have the following snippet.

@dataclass
class RewardConfig:
    """
    RewardConfig collects all training arguments related to the [`RewardTrainer`] class.

    Parameters:
        max_length (`int`, *optional*, defaults to `None`):
            The maximum length of the sequences in the batch. This argument is required if you want to use the default data collator.
        gradient_checkpointing (`bool`, *optional*, defaults to `True`):
                If True, use gradient checkpointing to save memory at the expense of slower backward pass.
    """

    max_length: Optional[int] = None
    """The maximum length of the sequences in the batch. This argument is required if you want to use the default data collator."""
    gradient_checkpointing: Optional[bool] = True
    """If True, use gradient checkpointing to save memory at the expense of slower backward pass."""

Do you know how I can programmatically generate the docstring for

    Parameters:
        max_length (`int`, *optional*, defaults to `None`):
            The maximum length of the sequences in the batch. This argument is required if you want to use the default data collator.
        gradient_checkpointing (`bool`, *optional*, defaults to `True`):
                If True, use gradient checkpointing to save memory at the expense of slower backward pass.

using

    max_length: Optional[int] = None
    """The maximum length of the sequences in the batch. This argument is required if you want to use the default data collator."""
    gradient_checkpointing: Optional[bool] = True
    """If True, use gradient checkpointing to save memory at the expense of slower backward pass."""

We use the docstring to generate something like https://huggingface.co/docs/trl/v0.7.1/en/trainer#trl.PPOConfig
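
(tyro parses these attribute docstrings internally for helptext, but as far as I know it doesn't expose them publicly. A self-contained sketch that extracts them with the standard ast module:)

import ast
import inspect
import textwrap
from typing import Dict, Type


def attribute_docstrings(cls: Type) -> Dict[str, str]:
    """Map annotated attributes of `cls` to the string literal written directly below each."""
    source = textwrap.dedent(inspect.getsource(cls))
    class_def = ast.parse(source).body[0]
    assert isinstance(class_def, ast.ClassDef)

    docs: Dict[str, str] = {}
    for stmt, nxt in zip(class_def.body, class_def.body[1:]):
        if (
            isinstance(stmt, ast.AnnAssign)
            and isinstance(stmt.target, ast.Name)
            and isinstance(nxt, ast.Expr)
            and isinstance(nxt.value, ast.Constant)
            and isinstance(nxt.value.value, str)
        ):
            docs[stmt.target.id] = inspect.cleandoc(nxt.value.value)
    return docs


# Example: attribute_docstrings(RewardConfig) would map "max_length" and
# "gradient_checkpointing" to the strings written beneath them.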

Setting docstrings for dynamic dataclasses

Another weird question:

Is it possible to set docstrings programatically for dynamic dataclasses?

A quick look through the source seems to indicate that this isn't possible, since for dataclasses it relies on the source existing, and seems to default to returning None in the case it detects that the object is a dynamic dataclass.

It could be super useful if you could add optional support for docs with dynamic dataclasses, for purposes like dynamically generating schemas from class definitions / function definitions. It's not super elegant but this could be exposed through the metadata field of dataclasses.field, which according to the docs is meant for 3rd party extensions for dynamic dataclasses.
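
(For what it's worth, a sketch of an alternative for dynamically built dataclasses: attach helptext through Annotated metadata instead of docstrings, assuming a tyro version where tyro.conf.arg accepts a help parameter; this sidesteps the source-inspection problem entirely:)

import dataclasses
from typing import Annotated

import tyro

# Dynamically build a dataclass; there is no source for tyro to inspect, so the
# helptext is attached to the annotation itself.
DynamicConfig = dataclasses.make_dataclass(
    "DynamicConfig",
    [
        (
            "learning_rate",
            Annotated[float, tyro.conf.arg(help="Step size for the optimizer.")],
            dataclasses.field(default=3e-4),
        )
    ],
)

if __name__ == "__main__":
    print(tyro.cli(DynamicConfig))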

Support for union + nested hierarchies?

I have an experiment with two types of models that can run, each with its own configs. In my ExperimentConfig, I have a model_class parameter which is of type Union[ModelAConfig, ModelBConfig], and defaults to ModelAConfig. When I call dcargs.cli(ExperimentConfig), how do I set which config to use, as well as the respective parameters in it?

In general, how can Union over two types of configs be used if each config has its own set of parameters?

Edit: Closing this, I think this is addressed with using multiple subparsers

How to update type definition dynamically

Thanks for your fantastic project!

I am currently using NeRFStudio as an external dependency in my own project. However, I am facing an issue with the AnnotatedDataParserUnion type definition (located here), which contains some pre-defined data parsers. My goal is to add a custom dataparser by updating this type definition dynamically. While it is easy to achieve this by modifying the NeRFStudio codebase directly, it is difficult to do so when using NeRFStudio as an external dependency.

I have noticed that some other vision codebases utilize a registry class that allows the registration of new classes on the fly. I was wondering if it would be possible for tyro to support a similar feature in the future. Additionally, I would like to suggest exploring the possibility of using more standard data structures such as list, dict, or class for subcommands config rather than a Union type. I am curious to hear your thoughts on this matter.

Thanks again!
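
(On the registry idea: because subcommand unions are just types, a union can already be assembled at runtime from a registry dict in plain Python. A sketch; the config classes below are stand-ins, and typing.Union accepts a tuple of types at runtime even though static checkers may complain:)

from dataclasses import dataclass
from typing import Union

import tyro


@dataclass
class BlenderParserConfig:
    scale_factor: float = 1.0


@dataclass
class MyCustomParserConfig:
    scale_factor: float = 2.0


# A registry that external packages could extend before the CLI is built.
DATAPARSER_REGISTRY = {
    "blender": BlenderParserConfig,
    "custom": MyCustomParserConfig,
}

# Build the subcommand union dynamically from whatever is registered.
DataParserUnion = Union[tuple(DATAPARSER_REGISTRY.values())]

if __name__ == "__main__":
    print(tyro.cli(DataParserUnion))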

Multiple nested optional settings

I have this pretty simple example with two nested optional settings objects. But when I run and only specify the second object optimizer-settings:optimizer-settings it doesn't work.

# test.py
import tyro
import dataclasses
from typing import Optional

@dataclasses.dataclass(frozen=True)
class OutputHeadSettings:
    number_of_outputs: int = 1

@dataclasses.dataclass(frozen=True)
class OptimizerSettings:
    name: str = "adam"

@dataclasses.dataclass(frozen=True)
class ModelSettings:
    output_head_settings: Optional[OutputHeadSettings] = None

    optimizer_settings: Optional[OptimizerSettings] = None


settings = tyro.cli(ModelSettings)
print(settings)
python test.py optimizer-settings:optimizer-settings

usage: test.py [-h] [{output-head-settings:None,output-head-settings:output-head-settings}]

test.py: error: argument [{output-head-settings:None,output-head-settings:output-head-settings}]: invalid choice: 'optimizer-settings:optimizer-settings' (choose from 'output-head-settings:None', 'output-head-settings:output-head-settings')

I see a similar issue if I run

python test.py output-head-settings:None optimizer-settings:optimizer-settings

usage: test.py [-h] [{output-head-settings:None,output-head-settings:output-head-settings}]
 
test.py: error: unrecognized arguments: optimizer-settings:optimizer-settings

hydra-zen + tyro ❤️

Hello! I just came across tyro and it looks great!

I wanted to put hydra-zen on your radar. It is a library designed to make Hydra-based projects more Pythonic and lighter on boilerplate code. It mainly does this by providing users with functions like builds and just, which dynamically generate dataclasses that describe how to build, via instantiate, various objects. There are plenty of bells and whistles that I could go into (e.g. nice support for partial'd targets), but I'll keep it brief-ish.

That being said, hydra-zen's main features are quite independent of Hydra, and are more focused on generating dataclasses that can configure/build various Python interfaces. It seems like this might be the sort of thing that could be helpful for tyro users who want to generate nested, typed interfaces based on objects in their library or from third party libraries.

This is just a rough idea at this point, but I figured that there might be some potential synergy here! I'd love to get your impressions if you think there might be any value here.

Getting YAMLs without populating defaults

Hi tyro team,

First of all thanks for this super cool configuration library!! It looks awesome.

While reading the docs and playing around with the configurator I had a small question: is it possible to output a yaml file for a hierarchical config without first populating the argument defaults via the command line?

The use case is as follows:

After defining a (hierarchical) dataclass schema, I want to produce a YAML with all null entries, so that I can then populate that YAML to use as the default arguments. Ideally I can follow a flow like: (1) the code looks for a default config; (2) if none exists, it writes out an empty config; (3) users can then populate the defaults inside the YAML, and then override the YAML with CLI arguments.

multiple independent subcommands

hi, first off thanks for making this great library! I've run into the following issue: I'd like to have a setting structure with two subcommands, each with separate settings and defaults:

from dataclasses import dataclass
from typing import Union
import tyro

@dataclass(frozen=True)
class A1:
    a1: int = 1

@dataclass(frozen=True)
class A2:
    a2: int = 1

@dataclass(frozen=True)
class B1:
    b1: int = 1

@dataclass(frozen=True)
class B2:
    b2: int = 1

@dataclass
class Config:
    a: Union[A1, A2] = A1()
    b: Union[B1, B2] = B1()


if __name__ == '__main__':
    print(tyro.cli(Config))

I'd like to be able to set the value for config.b while leaving config.a as its default. However if I run python tyro_test.py b:b2 I get the following error:

tyro_test.py: error: argument [{a:a1,a:a2}]: invalid choice: 'b:b2' (choose from 'a:a1', 'a:a2')

and the only way(?) to set config.b is to set config.a first with python tyro_test.py a:a1 b:b2. Is there a way to set config.b via the command line without having to set config.a?

Overriding with YAML defaults on a dataclass config

Hi again,

Today I was trying to override a config defined by a dataclass using a YAML file. The docs (https://brentyi.github.io/tyro/examples/03_config_systems/02_overriding_yaml/) seem to show that using a simple dictionary does work for overriding, but for a dataclass-based config it yields a bunch of warnings. A look into the source suggests it's looking for attributes, hence failing on a dictionary.

Is this the intended behaviour? (Maybe it makes sense to assume attribute-based accessors considering the config itself is a dataclass, but I found this discrepancy with what's indicated in the docs a bit confusing, unless I missed something that specifies this behaviour.)

For completeness here's a small example to repro. Replacing the dict with an attrdict does work.

import yaml

import tyro
import dataclasses
import attrdict

@dataclasses.dataclass
class Config:
    exp_name : str
    batch_size : int

# YAML configuration. Note that this could also be loaded from a file! Environment
# variables are an easy way to select between different YAML files.
default_yaml = r"""
exp_name: test
batch_size: 10
""".strip()

if __name__ == "__main__":
    # Convert our YAML config into a nested dictionary.
    default_config = dict(yaml.safe_load(default_yaml))
    
    # Using attrdict here instead will work
    #default_config = attrdict.AttrDict(default_config)

    # Override fields in the dictionary.
    overridden_config = tyro.cli(Config, default=default_config)

Generic dataclass detection fails for unions

Hi Brent,

I have a very low-level bug to flag for you: when saving/loading nested dataclasses to YAML (using extras.to_yaml() and extras.from_yaml()), if a dataclass has a Union of two custom types, they don't get detected as custom types for the yaml.Loader to construct.

I wrote a MWE to replicate the issue:

import dataclasses
import dcargs

from typing import Union

@dataclasses.dataclass(frozen=True)
class TypeA:
    data: int

@dataclasses.dataclass(frozen=True)
class TypeB:
    data: int

@dataclasses.dataclass
class Wrapper:
    subclass: Union[TypeA, TypeB] = TypeA(1)
    
if __name__ == "__main__":
    wrapper1 = Wrapper() # Create Wrapper object.
    wrapper2 = dcargs.extras.from_yaml(Wrapper, dcargs.extras.to_yaml(wrapper1)) # Errors, no constructor for TypeA

No worries if this is too low-level to deal with right now; I think we can work around it by just pickling the configs, but I wanted to flag that something is going awry in the custom type detection.

Print dataclass as arguments

Hey, thanks for your work on this project -- it seems really elegant and is nice to use so far!

I think it would be pretty cool if there was a function that would print a dataclass as a list of commandline arguments.

For example:

python 01_basics/02_dataclasses.py --field1 hello produces Args(field1='hello', field2=3).

Can we implement a tyro.to_arguments(container) -> str such that tyro.to_arguments(Args(field1='hello', field2=3)) produces --field1 hello --field2 3? Would be really nice to programmatically generate containers and their respective "launch commands" in this way.

Is this already possible? Please let me know what you think! Thanks!
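
(I don't see this in tyro's public API; a naive version for flat dataclasses with scalar fields is small enough to sketch, with the caveat that nested structures, sequences, and omitting default values would all need more care. The helper name is made up:)

import dataclasses
from typing import Any, List


def to_arguments(config: Any) -> List[str]:
    """Naively turn a flat dataclass instance into CLI-style arguments."""
    args: List[str] = []
    for f in dataclasses.fields(config):
        args.append("--" + f.name.replace("_", "-"))
        args.append(str(getattr(config, f.name)))
    return args


@dataclasses.dataclass
class Args:
    field1: str
    field2: int = 3


print(" ".join(to_arguments(Args(field1="hello", field2=3))))
# --field1 hello --field2 3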

Pass in path to yaml config to specify params

Hi! First of all, thank you for the wonderful cli tool!
Currently, I am playing around with it and wondering if there is any way to specify a path to a .yaml config file (possibly with not every param specified) to populate/edit the default config defined via a dataclass? For example, in pyrallis there is --config_path.
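
(One way to get this behavior today, sketched using only tyro.cli's default= and args= parameters plus a manual pre-pass over sys.argv; the --config-path flag name is made up:)

import dataclasses
import sys

import yaml

import tyro


@dataclasses.dataclass
class Config:
    exp_name: str = "default"
    batch_size: int = 16


if __name__ == "__main__":
    # Peel a hypothetical `--config-path PATH` flag off of argv before tyro sees it.
    argv = sys.argv[1:]
    default = Config()
    if "--config-path" in argv:
        i = argv.index("--config-path")
        with open(argv[i + 1]) as f:
            overrides = yaml.safe_load(f) or {}
        argv = argv[:i] + argv[i + 2 :]
        # Fields present in the YAML (keyed by field name) override the dataclass defaults.
        default = dataclasses.replace(default, **overrides)

    # Remaining CLI flags override the YAML-populated defaults.
    config = tyro.cli(Config, default=default, args=argv)
    print(config)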

How to override configs for `Optional[Dict[xx, xx]]` on the command line?

Case 1: Dict

The following test code works:

from typing import Dict, Optional

import tyro


def test_normal_dict() -> None:
    def main(x: Dict[str, int] = {"a": 5, "b": 5}):
        return x

    assert tyro.cli(main, args=[]) == {"a": 5, "b": 5}
    assert tyro.cli(main, args="--x.a 3 --x.b 7".split(" ")) == {
        "a": 3,
        "b": 7,
    }

For example, we can override arguments with:

--x.a 3 --x.b 7

Case 2: Optional[Dict]

Now, if the given config is Optional[Dict[str, int]], how shall we override its parameters?

def test_optional_dict() -> None:
    def main(x: Optional[Dict[str, int]] = {"a": 5, "b": 5}):
        return x

    assert tyro.cli(main, args=[]) == {"a": 5, "b": 5}
    # The following assert will fail.
    assert tyro.cli(main, args="--x.a 3 --x.b 7".split(" ")) == {
        "a": 3,
        "b": 7,
    }

For example, this is used in

https://github.com/nerfstudio-project/nerfstudio/blob/081bf82bf594b6eab52410b46e6cbd510ce82eea/nerfstudio/models/base_model.py#L47-L48

Help text output underscore instead of dashes

Would it be possible to output underscores instead of dashes in the help text? E.g., --ddpo_config.exp_name instead of --ddpo-config.exp-name. I understand both options can be parsed, but I was just wondering if we could configure the help text to display underscores instead :)


Subcommands are broken

Given this file,

# tyro_test.py

from dataclasses import dataclass
from typing import Union

import tyro


@dataclass
class DataparserA:
    pass


@dataclass
class DataparserB:
    pass


Dataparser = Union[DataparserA, DataparserB]


@dataclass
class ModelA:
    pass

@dataclass
class ModelB:
    pass


Model = Union[ModelA, ModelB]


@dataclass
class Pipeline:
    dataparser: Dataparser
    model: Model


@dataclass
class Trainer:
    pipeline: Pipeline


tyro.cli(Trainer)

When I run python tyro_test.py pipeline.dataparser:dataparser-a pipeline.model:model-a, I get the following error with tyro==0.3.35 and ==0.3.36:

Traceback (most recent call last):
  File "/home/kchen/mttr_nerfstudio/tyro_test.py", line 43, in <module>
    tyro.cli(Trainer)
  File "/home/kchen/miniconda3/envs/nerfstudio/lib/python3.10/site-packages/tyro/_cli.py", line 125, in cli
    _cli_impl(
  File "/home/kchen/miniconda3/envs/nerfstudio/lib/python3.10/site-packages/tyro/_cli.py", line 326, in _cli_impl
    out, consumed_keywords = _calling.call_from_args(
  File "/home/kchen/miniconda3/envs/nerfstudio/lib/python3.10/site-packages/tyro/_calling.py", line 100, in call_from_args
    value, consumed_keywords_child = call_from_args(
  File "/home/kchen/miniconda3/envs/nerfstudio/lib/python3.10/site-packages/tyro/_calling.py", line 110, in call_from_args
    subparser_def = parser_definition.subparsers_from_prefix[
KeyError: 'pipeline.model'
  • I do not get an error if I use tyro==0.3.33
  • I do not get an error if I use tyro==0.3.35 but change the last line to tyro.cli(Pipeline) and then run python tyro_test.py dataparser:dataparser-a model:model-a

Maybe this is related to the new tyro.conf.ConsolidateSubcommandArgs functionality? But I am not sure. Sorry for just dumping the error and not looking into the source; I don't have much time right now.
