
Comments (6)

mgrange1998 commented on May 2, 2024

Hi, thank you for the question. As you've encountered, Ax is not well suited for this problem due to the large input space.

For the approach you've outlined, we'll triage to @saitcakmak who has more context on historical research cases that may be similar to this. Thanks!


saitcakmak commented on May 2, 2024

Hi @CCranney. Just so we're on the same page,

  • Your objective is to optimize some performance metric on this regression model
  • You have ~60k parameters. You are interested in doing a form of sensitivity analysis and picking a smaller subset of these features for use in your model.
  • The regression model produces ~7k separate outputs. The overall performance metric is an aggregate of these.

In Ax, we typically use Bayesian optimization with Gaussian process (GP) surrogates. We do have some sensitivity measures associated with these surrogates, which can help understand how much each input matters for each output (based on the surrogate predictions). The standard GP models would not scale to ~60k inputs & ~7k outputs, so you'd need a more specialized approach here.
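For intuition, here is a minimal, untested sketch of one such surrogate-based sensitivity signal using BoTorch (which Ax builds on): with an ARD kernel, a small fitted lengthscale suggests the output is sensitive to that input. The toy data and the inverse-lengthscale proxy below are illustrative only, not a recommended workflow.

```python
import torch
from botorch.models import SingleTaskGP
from botorch.fit import fit_gpytorch_mll
from gpytorch.mlls import ExactMarginalLogLikelihood

# Toy data: 20 points in 5 dimensions (a stand-in for a small subset of inputs).
train_X = torch.rand(20, 5, dtype=torch.double)
# Only the first two inputs matter in this synthetic objective.
train_Y = train_X[:, :1].sin() + train_X[:, 1:2] ** 2

# Fit a standard GP surrogate with ARD (one lengthscale per input).
model = SingleTaskGP(train_X, train_Y)
mll = ExactMarginalLogLikelihood(model.likelihood, model)
fit_gpytorch_mll(mll)

# A small ARD lengthscale means the output is sensitive to that input.
# Note: in BoTorch versions where the default kernel is wrapped in a
# ScaleKernel, the lengthscales live at covar_module.base_kernel.
ls = model.covar_module.base_kernel.lengthscale.detach().squeeze()
importance = 1.0 / ls  # crude sensitivity proxy: inverse lengthscale
print(importance)
```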

There's some recent research from our team on dealing with high-dimensional intermediate outputs, though I am not sure how applicable it is for this use case & whether they can deal with ~60k parameters: https://arxiv.org/abs/2311.02213
cc @ItsMrLin, @nataliemaus


ItsMrLin commented on May 2, 2024

Hi @CCranney, interesting problem!

If we are interested in performing feature selection or identifying sparse solutions to the optimization problem, Sparsity Exploration Bayesian Optimization (SEBO) might be an option for you, and it is already available in Ax: https://ax.dev/tutorials/sebo.html. However, handling ~7k outputs might not be feasible with the (noisy) expected hypervolume improvement acquisition function if there's no easy way to explicitly reduce them down to a few metrics to optimize.
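For concreteness, here is a rough, untested sketch of plugging SEBO into an Ax generation strategy, loosely following the linked tutorial; the module paths and options reflect one Ax version and may need adjusting, and the dimensions are toy values.

```python
import torch
from ax.modelbridge.generation_strategy import GenerationStep, GenerationStrategy
from ax.modelbridge.registry import Models
from ax.models.torch.botorch_modular.sebo import SEBOAcquisition
from ax.models.torch.botorch_modular.surrogate import Surrogate
from botorch.models import SingleTaskGP

D = 10  # number of tunable parameters (a 60k-dimensional space would not scale here)
target_point = torch.zeros(D, dtype=torch.double)  # the "sparse" reference value

gs = GenerationStrategy(
    name="SEBO_L0",
    steps=[
        # Quasi-random initialization.
        GenerationStep(model=Models.SOBOL, num_trials=8),
        # SEBO: trades off objective quality against an L0 sparsity
        # penalty measured relative to target_point.
        GenerationStep(
            model=Models.BOTORCH_MODULAR,
            num_trials=-1,
            model_kwargs={
                "surrogate": Surrogate(botorch_model_class=SingleTaskGP),
                "acquisition_class": SEBOAcquisition,
                "acquisition_options": {
                    "penalty": "L0_norm",
                    "target_point": target_point,
                },
            },
        ),
    ],
)
```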

Additionally, as @saitcakmak pointed out, our recent work should enable the use of a large number of intermediate outcomes to improve optimization performance, but it'd still require some scalar value quantifying how good the outcomes are. If it is hard to define such a quantity explicitly, it can also be estimated through preference learning (https://botorch.org/tutorials/bope). Mixing & matching these methods might be an interesting approach to this problem.
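To illustrate the preference-learning piece, here is a minimal sketch using BoTorch's PairwiseGP (the utility model behind the BOPE tutorial): a utility over high-dimensional outcomes is learned from pairwise comparisons instead of an explicit formula. The outcome data and comparisons below are synthetic placeholders.

```python
import torch
from botorch.fit import fit_gpytorch_mll
from botorch.models.pairwise_gp import PairwiseGP, PairwiseLaplaceMarginalLogLikelihood

# Synthetic outcome vectors (stand-ins for the model's multi-dimensional outputs).
Y = torch.rand(12, 4, dtype=torch.double)  # 12 observed outcome vectors, 4 dims each

# Pairwise comparisons: each row (i, j) means outcome i is preferred to outcome j.
# In practice these would come from a decision-maker or a known partial ordering.
comparisons = torch.tensor([[0, 1], [2, 3], [4, 5], [6, 7], [8, 9], [10, 11]])

# Fit a GP utility model from the comparisons alone.
pref_model = PairwiseGP(Y, comparisons)
mll = PairwiseLaplaceMarginalLogLikelihood(pref_model.likelihood, pref_model)
fit_gpytorch_mll(mll)

# Posterior mean utility of a new outcome vector.
test_Y = torch.rand(1, 4, dtype=torch.double)
print(pref_model.posterior(test_Y).mean)
```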

That said, the above modifications might take some work to enable in Ax, and it'd be easier to implement them directly in BoTorch if we are interested in mixing & matching these methods. If we are just looking for something already available in Ax, I'd recommend one of the following:

  1. Summarize the 7k outcomes into a few metrics that quantify the quality of the outcomes, then identify sparse solutions with SEBO (a toy sketch of such a summary follows this list).
    And/or
  2. Perform sensitivity analysis as @saitcakmak mentioned, but that's typically done on each individual metric. We could try to do it for all 7k outcomes, which wouldn't be much different from what you are already doing with SHAP. The alternative, again, is to explicitly reduce that down to a few summary metrics or to use preference learning to do so.
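As a purely synthetic illustration of option 1 (the data and the error-based summaries here are hypothetical), one might collapse the ~7k outcomes into a couple of scalars like this:

```python
import numpy as np

# Toy stand-ins: per-evaluation predictions over the ~7k outcomes.
outputs = np.random.rand(100, 7000)   # 100 evaluations x 7000 outcomes
targets = np.random.rand(7000)        # ground truth per outcome (hypothetical)

# Collapse the 7k outcomes into two summary metrics per evaluation:
# mean absolute error and worst-case (max) absolute error.
abs_err = np.abs(outputs - targets)
mean_err = abs_err.mean(axis=1)    # objective 1: average quality
worst_err = abs_err.max(axis=1)    # objective 2: robustness

# These two scalars could then serve as the objectives for SEBO.
print(mean_err.shape, worst_err.shape)
```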


CCranney commented on May 2, 2024

Thank you all for contributing!

I should clarify one point: while the original problem has ~60k inputs and ~7k outputs, for this process of eliminating unnecessary inputs I am focusing on just a single output. While I would also (one day) be interested in eliminating inputs that have little to no bearing on all of the outputs generally, my request was about doing so while narrowing the focus to a specific output of interest. So things like SEBO and BOPE might be exactly what I was looking for. I'm going to research your responses in greater depth and get back to you, thank you!


Balandat commented on May 2, 2024

Handling the 60k inputs directly will be challenging. That said, I know that @dme65 has had some success with feature selection approaches where the optimization is performed over a lower-dimensional space that describes the parameters of other, more heuristic feature selection methods. It's kind of an ensembling approach, if you will, where you optimize over parameters x1, x2, ..., where x1 might mean something like "include the top x1 features from feature ordering method A using hyperparameters x2 and x3", etc.
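A rough, untested sketch of what that parameterization could look like; the use of mutual information as "method A" and all names here are illustrative, not the actual setup @dme65 used. The point is that Ax searches over a handful of meta-parameters instead of 60k inclusion flags.

```python
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def select_features(X, y, x1, x2):
    """Map low-dimensional Ax parameters to a concrete feature subset.

    x1: how many top-ranked features to keep (searched by Ax).
    x2: a hyperparameter of the ranking heuristic (here: n_neighbors
        for mutual information; purely illustrative).
    """
    scores = mutual_info_regression(X, y, n_neighbors=x2)
    top = np.argsort(scores)[::-1][:x1]  # indices of the x1 highest-scoring features
    return X[:, top]

# Toy usage: Ax would optimize over, e.g., x1 in [10, 500] and x2 in [2, 20],
# evaluating the regression model's metric on the selected subset, rather than
# over 60k individual include/exclude decisions.
X = np.random.rand(200, 1000)
y = X[:, 0] + 0.1 * np.random.rand(200)
X_sub = select_features(X, y, x1=50, x2=3)
print(X_sub.shape)  # (200, 50)
```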


CCranney commented on May 2, 2024

I think my questions have been largely answered. I may re-open this if more ideas/questions come up in the future, but I'll close it for now. Thank you again!

