Lazy evaluation #78
ingmarschuster started this conversation in Ideas
Replies: 1 comment · 4 replies
-
Hi @ingmarschuster For example, constructing a product of low-rank factors:

```python
U = cola.ops.Dense(torch.randn(10000, 5))
V = cola.ops.Dense(torch.randn(10000, 5))
A = U @ V.T
```

The matrices are never explicitly multiplied. All of the iterative algorithms in CoLA are defined through matrix-vector multiplies, so in this way the matrices (here U and V) only ever need to act on vectors. Is this what you had in mind, or are you speaking towards something else?
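To make the cost saving concrete, here is a minimal sketch building on the snippet above (the assumption being that the lazy product applies its factors right to left when hit with a vector, which is the usual implementation): a matvec with the nominal 10000 × 10000 product only ever touches the two thin factors.

```python
import torch
import cola

U = cola.ops.Dense(torch.randn(10000, 5))
V = cola.ops.Dense(torch.randn(10000, 5))
A = U @ V.T           # lazy product: the 10000 x 10000 matrix is never materialized

v = torch.randn(10000)
y = A @ v             # assumed to evaluate as U @ (V.T @ v):
                      # two O(n*k) products instead of an O(n^2) dense matvec
```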
-
What are your thoughts on lazy evaluation? It can make working with extremely large matrices possible. One example of an algorithm that approximates rather large PSD matrices, their Cholesky factors, and their inverses in only a few iterations is randomly pivoted Cholesky (RPCholesky):
https://paperswithcode.com/paper/randomly-pivoted-cholesky-practical
This would probably require introducing an API for interfacing with user-defined compute backends, or at least that is how the RPCholesky code is currently implemented.
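For reference, here is a minimal sketch of randomly pivoted Cholesky in PyTorch (the function name `rp_cholesky` and the `get_col`/`diag` entry-access interface are illustrative assumptions, not CoLA's or the reference implementation's API). It only needs the diagonal and individual columns of the PSD matrix, which is exactly where a user-defined compute backend would plug in.

```python
import torch

def rp_cholesky(get_col, diag, k):
    """Rank-k randomly pivoted Cholesky sketch of a PSD matrix A,
    accessed only through diag (its diagonal) and get_col(i) (its i-th column).
    Returns F with A approximately equal to F @ F.T."""
    n = diag.shape[0]
    F = torch.zeros(n, k, dtype=diag.dtype)
    d = diag.clone()                                  # diagonal of the residual A - F F^T
    for t in range(k):
        i = torch.multinomial(d / d.sum(), 1).item()  # sample pivot ~ residual diagonal
        col = get_col(i) - F[:, :t] @ F[i, :t]        # i-th column of the residual
        F[:, t] = col / torch.sqrt(col[i])
        d = torch.clamp(d - F[:, t] ** 2, min=0.0)    # update residual diagonal
    return F

# Usage on an explicit PSD matrix (random low-rank-plus-identity example):
X = torch.randn(1000, 20)
A = X @ X.T + torch.eye(1000)
F = rp_cholesky(lambda i: A[:, i], torch.diag(A), k=30)
```

Because the algorithm only reads the diagonal and k individual columns, the backend never has to form or store the full matrix, which is what makes it attractive for lazily defined operators.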