Comments (17)
We understand the importance of sparse matrices, but there is a significant coding burden in creating them. For one, there is a large diversity of sparse formats. We are close to stabilizing mat64 and so may be in a position to think about adding them, but there are no explicit plans as far as I know.
I have spent quite some time thinking about a plan to introduce sparse matrix handling. It is not trivial, both in implementing the matrix types and in threading them into the mat64 operation model we use. I do want to do it though.
What operations are you particularly interested in?
I need basic operations:
- Multiplying matrices or elements of matrices
- Growing matrices
- Normalizing vectors
- sparse x sparse or dense x sparse
- growing in one or two dimensions? If one, which?
I am going to use only sparse matrices (sparse x sparse) and grow in both dimensions.
I'll chime in here. I would also like to have sparse matrices in gonum; I think that eventually we will have to have them, but I also agree that it is not trivial. I started to implement my own tiny sparse package, but due to lack of free time I did not get far.
To give a sample of different (much more basic) sparse operations: at first I would be happy with just being able to build a fixed sparse matrix and compute (transposed) sparse matrix times dense vector product. Then I could write simple CG and GMRES solvers with which I could then solve my toy PDE models. As a next step, I would like to be able to modify values in the sparse matrix without changing its sparsity structure. That would keep me happy for a long time.
The trick though is that there are so many kinds of sparse matrix. At least in my world of PDEs, it's almost always block-structured rather than random access. GMRES is then coded to operate knowing the block structure.
> To give a sample of different (much more basic) sparse operations: at first I would be happy with just being able to build a fixed sparse matrix and compute (transposed) sparse matrix times dense vector product.
This is trivial. You can see how this works in the sparse PageRank function in graph/network.
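For a concrete picture, here is a minimal sketch of a compressed sparse row (CSR) matrix times dense vector product. The `CSR` type and its field names are illustrative, not an existing gonum API; the transposed product can reuse the same arrays by scattering (`y[a.ColIdx[k]] += a.Val[k] * x[i]`) instead of gathering.

```go
package main

import "fmt"

// CSR is a compressed sparse row matrix (an illustrative type, not gonum's).
type CSR struct {
	Rows, Cols int
	RowPtr     []int     // RowPtr[i]:RowPtr[i+1] bound the non-zeros of row i.
	ColIdx     []int     // Column index of each stored value.
	Val        []float64 // Stored non-zero values.
}

// MulVec computes y = A*x for a dense x.
func (a CSR) MulVec(x []float64) []float64 {
	y := make([]float64, a.Rows)
	for i := 0; i < a.Rows; i++ {
		var sum float64
		for k := a.RowPtr[i]; k < a.RowPtr[i+1]; k++ {
			sum += a.Val[k] * x[a.ColIdx[k]]
		}
		y[i] = sum
	}
	return y
}

func main() {
	// A = |1 0 2|
	//     |0 3 0|
	a := CSR{
		Rows: 2, Cols: 3,
		RowPtr: []int{0, 2, 3},
		ColIdx: []int{0, 2, 1},
		Val:    []float64{1, 2, 3},
	}
	fmt.Println(a.MulVec([]float64{1, 1, 1})) // [3 3]
}
```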
> Then I could write simple CG and GMRES solvers with which I could then solve my toy PDE models. As a next step, I would like to be able to modify values in the sparse matrix without changing its sparsity structure.
This is also relatively straightforward: depending on your storage, either an iteration over a row/column to find the correct index, or a binary search on a sorted row/column.
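As a sketch of the binary-search variant, extending the illustrative `CSR` type above and assuming `ColIdx` is sorted within each row:

```go
import "sort"

// Set overwrites the value at (i, j) if that position is part of the
// existing sparsity structure, and reports whether it was found.
// It assumes ColIdx is sorted within each row.
func (a CSR) Set(i, j int, v float64) bool {
	lo, hi := a.RowPtr[i], a.RowPtr[i+1]
	// Binary search for column j among row i's stored entries.
	k := lo + sort.SearchInts(a.ColIdx[lo:hi], j)
	if k < hi && a.ColIdx[k] == j {
		a.Val[k] = v
		return true
	}
	return false
}
```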
For reference, there are CSparse and Sparse BLAS as competing models to draw from. Sparse BLAS is really where I would like to go. Neither is trivial.
Yes, both are straightforward and I have them implemented, together with a CG solver. I was trying to think of a nice interface that fits with mat64, but it was not so clear to me; that is what I mean by non-trivial. I read the Sparse BLAS document and reference code in detail and tried to use it for inspiration, but its use of matrix handles as integers would have to be replaced with something more Go-like.
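To illustrate the handle point: the Sparse BLAS reference C binding creates a matrix with something like BLAS_duscr_begin and then threads the returned integer handle through every subsequent call, whereas the natural Go shape is a concrete type with methods. A speculative sketch (the `Matrix` type and its map storage are stand-ins for brevity, not a proposed design):

```go
// Matrix is a toy sparse matrix standing in for the integer handle of the
// reference Sparse BLAS API. Map storage is for brevity only.
type Matrix struct {
	M, N int
	Elem map[[2]int]float64
}

func NewMatrix(m, n int) *Matrix {
	return &Matrix{M: m, N: n, Elem: make(map[[2]int]float64)}
}

// InsertEntry plays the role of the handle-based insert_entry call,
// accumulating duplicate entries as Sparse BLAS construction does.
func (a *Matrix) InsertEntry(v float64, i, j int) {
	a.Elem[[2]int{i, j}] += v
}

// MulVec plays the role of the handle-based usmv call: y = alpha*A*x + y.
func (a *Matrix) MulVec(alpha float64, x, y []float64) {
	for ij, v := range a.Elem {
		y[ij[0]] += alpha * v * x[ij[1]]
	}
}
```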
Any progress here, guys?
I'm a beginner gopher, but I would like to avoid adding a layer of Python to the existing Go backend just to have sparse matrices. If this is still something you would like to see implemented, and you are willing to give me a few pointers, I might give it a go (!).
This repository is deprecated and frozen, so I'll move this to #367, but the overall plan would be to:
- choose a sparse matrix implementation model - I think Sparse BLAS is probably the right choice, so reading up on its documentation would be a good place to start (the CSparse book would also be worth reading, as it is intended as a teaching model).
- implement the sparse model
- design the interaction between the sparse model and mat types - this is reasonably constrained, but should involve a fair bit of discussion with/between us.
- implement the interactions.
I transferred this issue here from the old repository in case there are useful comments or ideas in it, although there is already at least one issue about sparse matrices.
I'd like to offer my two cents on the matter. I've used MATLAB for sparse matrix operations, and it seems to work on a three-vector data structure (a triplet, as they call it):

- Rows: `[]int`
- Columns: `[]int`
- Values: `[]float64`

where `value[i]` corresponds to row `row[i]` and column `column[i]` of the sparse matrix.

The fastest way to "assign" to a sparse matrix is to compute these vectors first and then use `sparse` to create the sparse matrix from the three vectors. If you try to start out with an empty `m` by `n` sparse matrix and assign rows/columns or single values to it, your program becomes painfully slow for matrices of large dimensions (`spalloc` plus row/column assignment can be the difference between a 50 second operation and a 6 hour operation, true story).

I know MATLAB uses LAPACK behind the scenes for dense matrices, but I'm not sure what it uses for its sparse representation.
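In Go terms that triplet (coordinate, or COO) scheme might look like the sketch below; the type is illustrative. Appending triplets is amortized O(1), which is why building the triplets and compressing once beats element-wise assignment into an already-compressed matrix, where every new non-zero forces stored data to shift.

```go
// COO is a triplet ("coordinate") sparse matrix, mirroring MATLAB's
// (rows, columns, values) representation. Illustrative only.
type COO struct {
	Rows, Cols []int
	Vals       []float64
}

// Append records value v at (i, j). Duplicate (i, j) entries are
// conventionally summed when the matrix is later compressed.
func (c *COO) Append(i, j int, v float64) {
	c.Rows = append(c.Rows, i)
	c.Cols = append(c.Cols, j)
	c.Vals = append(c.Vals, v)
}
```

A one-shot conversion to a compressed format (sort by row, then bucket into row pointers) would then play the role of MATLAB's `sparse(rows, cols, vals)`.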
What would be needed for a first PR on this? I've given Sparse BLAS a shot below.
Sparse BLAS level 1 routines:

```go
package blas

// SpVector is a sparse vector representation. The number of non-zero
// entries is len(Val) == len(Idx) at any given time.
type SpVector struct {
	// N is the length of the sparse vector.
	N int
	// Val[i] corresponds to the Stride*Idx[i]th value of the vector.
	Val    []float64
	Idx    []int
	Stride int
}

type Implementation struct{}

// Dusdot is a Sparse BLAS Level 1 routine. It returns the dot product of
// sparse x and dense y:
//  w = x . y
func (Implementation) Dusdot(nz int, x []float64, indx []int, y []float64, incy int) (w float64) {
	for i := 0; i < nz; i++ {
		w += x[i] * y[indx[i]*incy]
	}
	return w
}

// Dusaxpy is a Sparse BLAS Level 1 routine. It adds scaled sparse x values
// into dense y:
//  y = alpha*x + y
func (Implementation) Dusaxpy(nz int, alpha float64, x []float64, indx []int, y []float64, incy int) {
	for i := 0; i < nz; i++ {
		y[indx[i]*incy] += alpha * x[i]
	}
}

// Dusga is a Sparse BLAS Level 1 routine. It gathers values of dense y
// into sparse x:
//  x[i] = y[indx[i]]
func (Implementation) Dusga(nz int, y []float64, incy int, x []float64, indx []int) {
	for i := 0; i < nz; i++ {
		x[i] = y[indx[i]*incy]
	}
}

// Dusgz is a Sparse BLAS Level 1 routine. It gathers values of dense y
// into sparse x and zeroes the gathered y values:
//  x[i] = y[indx[i]]
//  y[indx[i]] = 0
func (Implementation) Dusgz(nz int, y []float64, incy int, x []float64, indx []int) {
	for i := 0; i < nz; i++ {
		x[i] = y[indx[i]*incy]
		y[indx[i]*incy] = 0
	}
}

// Dussc is a Sparse BLAS Level 1 routine. It scatters sparse x values
// into dense y:
//  y[indx[i]] = x[i]
func (Implementation) Dussc(nz int, x []float64, y []float64, incy int, indx []int) {
	for i := 0; i < nz; i++ {
		y[indx[i]*incy] = x[i]
	}
}
```
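For illustration, a minimal exercise of the routines above; the helper function is hypothetical and assumed to live in the same package:

```go
package blas

import "fmt"

// exampleLevel1 demonstrates Dusdot and Dusaxpy on a small vector.
func exampleLevel1() {
	var impl Implementation

	// Sparse x holds non-zeros 1, 2 and 3 at indices 0, 2 and 5 of a
	// notional length-6 vector.
	xVal := []float64{1, 2, 3}
	xIdx := []int{0, 2, 5}

	// Dense y with unit stride.
	y := []float64{1, 1, 1, 1, 1, 1}

	w := impl.Dusdot(len(xVal), xVal, xIdx, y, 1)
	fmt.Println(w) // 1*1 + 2*1 + 3*1 = 6

	impl.Dusaxpy(len(xVal), 2, xVal, xIdx, y, 1) // y += 2*x
	fmt.Println(y)                               // [3 1 5 1 1 7]
}
```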