
Solving a system of linear equations: `LinearSolve.solve` uses up more memory when `A` is sparse than when `A` is dense (linearsolve.jl, 8 comments, closed)

lampretl commented on July 18, 2024

Comments (8)

DrTimothyAldenDavis commented on July 18, 2024

For use in direct factorization methods, matrices from `sprand` are not truly sparse in the sense of Wilkinson: factorizing them provably takes O(n^3) time and O(n^2) space, under modest assumptions. See https://www.ams.org/journals/mcom/1974-28-125/S0025-5718-1974-0331756-7/ .

Essentially no matrices arising in practice have this characteristic, even those that do not have a small bandwidth. There could be exceptions to this rule, like the matrices you're seeing from the chessboard simplicial complex.

So "sprand" is not a good way to test any sparse direct method.

Many of the matrices (perhaps most) in my collection come from industry or from practical problems in academia. Very few of them are fabricated (exceptions include this set: https://sparse.tamu.edu/Gset, but it is marked as such).
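
For illustration, a minimal Julia sketch of the fill-in effect described above (my own construction, not from the thread; the sizes and densities are arbitrary): LU-factorize a `sprand` matrix and a genuinely sparse banded matrix of the same size, then compare factor nonzeros.

```julia
# Sketch only: compare LU fill-in for a random sparse matrix vs. a banded one.
using LinearAlgebra, SparseArrays

n = 2_000
A_rand = sprand(n, n, 0.01) + I                  # "sparse" in nnz only; + I keeps the diagonal nonzero
A_band = spdiagm(-1 => -ones(n - 1),             # genuinely sparse: tridiagonal, diagonally dominant
                  0 => fill(4.0, n),
                  1 => -ones(n - 1))

# Ratio of factor nonzeros to the nonzeros of A (lu on a SparseMatrixCSC uses UMFPACK).
function fill_ratio(A)
    F = lu(A)
    (nnz(F.L) + nnz(F.U)) / nnz(A)
end

@show fill_ratio(A_rand)   # typically very large: the factors fill in heavily
@show fill_ratio(A_band)   # stays small: the factors remain banded
```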


j-fu commented on July 18, 2024

Well, yes, sprand generates sparse matrices which are not typical of any kind of problem I know about.

For matrices from elliptic/parabolic PDE discretizations (this is my field), things very much depend on the space dimension and the resulting levels of fill-in:

  • 1D: no alternative to direct solvers.
  • 2D: YMMV, but in the end direct solvers work more or less as a black box.
  • 3D: for any larger problem, fill-in will kill the performance of direct solvers; try preconditioned Krylov methods instead. With e.g. saddle-point problems, things are more complicated, though.

I am working on some examples; I guess I'll make a blog post out of this.
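
As a rough illustration of this dimension dependence (my own sketch, not from the thread; the Kronecker-product Laplacians and grid sizes are arbitrary choices), LU fill-in grows much faster for a 3D stencil than for a 2D one even at the same matrix size:

```julia
# Sketch: LU fill-in for 2D (5-point) vs 3D (7-point) Laplacians with the same
# number of unknowns (64^2 == 16^3 == 4096).
using LinearAlgebra, SparseArrays

lap1d(n) = spdiagm(-1 => -ones(n - 1), 0 => fill(2.0, n), 1 => -ones(n - 1))
lap2d(n) = kron(lap1d(n), sparse(I, n, n)) + kron(sparse(I, n, n), lap1d(n))
lap3d(n) = kron(lap1d(n), sparse(I, n^2, n^2)) + kron(sparse(I, n, n), lap2d(n))

for (label, A) in (("2D, 64×64 grid", lap2d(64)), ("3D, 16×16×16 grid", lap3d(16)))
    F = lu(A)                                    # UMFPACK LU of the sparse matrix
    println(label, ":  nnz(A) = ", nnz(A),
            ",  nnz(L) + nnz(U) = ", nnz(F.L) + nnz(F.U))
end
```

The 3D factors typically carry far more fill-in than the 2D ones at the same problem size, which is the effect that pushes large 3D problems toward preconditioned Krylov methods.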


rayegun commented on July 18, 2024

One thing to note in general is that sprand is pretty terrible for benchmarking any sparse linear algebra code. Do you observe similar results with real-world matrices from the SuiteSparse collection?

I suspect, but haven't verified, that sprand is generating massive fill-in for the LU factorization, something you'll likely see far less of in real matrices.
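
A sketch of how one might run such a test on a real-world matrix (my own example, not from the thread; it assumes MatrixDepot.jl as a front end to the SuiteSparse collection, and the matrix `HB/1138_bus` is just an illustrative choice):

```julia
# Sketch: time LinearSolve on a real-world SuiteSparse collection matrix.
# Assumes MatrixDepot.jl is installed; mdopen downloads the matrix on first use.
using MatrixDepot, LinearSolve, SparseArrays, LinearAlgebra

md = mdopen("HB/1138_bus")      # small SPD matrix, purely an illustrative choice
A  = sparse(md.A)
b  = rand(size(A, 1))

prob = LinearProblem(A, b)
@time sol = solve(prob)         # LinearSolve's default sparse direct factorization
@show norm(A * sol.u - b)
```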


rayegun commented on July 18, 2024

For instance, I just factored the torso3 matrix from the SuiteSparse collection: nnz(F.L) / length(F.L) == 0.005, while for a random matrix I get a ratio of something like 0.2 or even 0.5 in one run. Fill-in at that level is much too high in my experience.

If your matrices do indeed exhibit these random patterns (or they are very large), you should perhaps look at something like an iterative method, where fill-in is a non-issue.
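
A minimal sketch of switching LinearSolve.jl from a sparse direct factorization to a Krylov method, which forms no factors and hence no fill-in (the tridiagonal test system here is a hypothetical, well-conditioned stand-in):

```julia
# Sketch: the same LinearProblem solved by sparse LU vs. a Krylov method;
# the Krylov path only needs matrix-vector products, so no fill-in occurs.
using LinearSolve, SparseArrays, LinearAlgebra

n = 10_000
A = spdiagm(-1 => -ones(n - 1), 0 => fill(4.0, n), 1 => -ones(n - 1))  # stand-in system
b = rand(n)

prob = LinearProblem(A, b)
sol_direct = solve(prob, UMFPACKFactorization())   # sparse LU: builds (possibly filled-in) factors
sol_krylov = solve(prob, KrylovJL_GMRES())         # GMRES via Krylov.jl: no factors at all

@show norm(A * sol_krylov.u - b)
```

In a realistic 3D PDE setting one would also supply a preconditioner, as j-fu notes above.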


j-fu commented on July 18, 2024

See https://j-fu.github.io/marginalia/julia/scalingtest/


lampretl commented on July 18, 2024

@j-fu Thank you for your blog and the comparison of the efficiency of the different solvers! I'm a bit skeptical about the notion of real matrices. In science, sparse matrices can be close to diagonal or upper/lower triangular, and of course solving such systems causes much less fill-in. But not all of them are like that.

In my research in homological algebra and algebraic topology, when generating chain complexes of simplicial complexes for computing (co)homology, the matrices were always very sparse and often avoided fill-in. But in some particular cases, e.g. the chessboard simplicial complex, when I tried to compute the rank of a matrix there was so much fill-in that the computation didn't finish after a month.

I think what matters most is the type of matrices that industry actually needs; solving such problems would bring more funding to Julia :). @Wimmerer Does the SuiteSparse matrix collection contain sparse matrices that come directly from industry? I can't tell which ones originated outside of academia.


rayegun commented on July 18, 2024

@DrTimothyAldenDavis can give more info about the matrix collection. But there are often descriptions on each matrix.


lampretl commented on July 18, 2024

Thank you, I appreciate your insight! @DrTimothyAldenDavis

