7/8/2023

Here are two self-contained algorithmic questions that have come up in our research. We're offering a bounty of $5k for a solution to either of them: either an algorithm, or a lower bound under any hardness assumption that has appeared in the literature.

Question 1 (existence of PSD completions): given m entries of an n × n matrix, including the diagonal, can we tell in time Õ(nm) whether it has any (real, symmetric) positive semidefinite completion? Proving that this task is at least as hard as dense matrix multiplication or PSD testing would count as a resolution.

Question 2 (fast "approximate squaring"): given A ∈ R^{n×n} and a set of m = Ω(n) entries of AA^T, can we find some PSD matrix that agrees with AA^T in those m entries in time Õ(nm)?

We'll pay $5k for a solution to either problem. The offer is open for each problem for 3 months or until the problem gets solved (whichever happens first). Winners are welcome to publish solutions independently; otherwise, if the result ends up being a significant part of a paper, we'll invite them to be a coauthor. We'll also consider smaller prizes for partial progress, or for anything that we find helpful for either solving the problem or realizing we should give up on it.

To understand the motivation for these questions, you can read our paper on Formalizing the presumption of independence, in particular Appendix D.7.2. ARC is trying to find efficient heuristic estimators as a formalization of defeasible reasoning about quantities like the variance of a neural network's output. These two questions are very closely related to one of the simplest cases where we haven't yet found any reasonable linear-time heuristic estimator.

We don't expect to receive many incorrect proposals, but if we receive more than 5 we may start applying a higher standard in order to save our time. If we can't understand a solution quickly, we may ask you to provide more details, and if we still can't understand it we may reject it. We expect a correct solution to be about as clear and easy to verify as a paper published at STOC.

For both problems, it's OK if we incorrectly treat a matrix as PSD as long as all of its eigenvalues are at least −ε for a small constant ε. Õ hides polylogarithmic factors in n, ε, and the max matrix entry.

Feel free to ask for other clarifications on our question on Math Overflow, on Facebook, or by email. To submit a solution, send an email to

I think it can be done in O(n^ω), where I recall for non-experts' convenience that ω is the exponent of matrix multiplication / inversion / PSD testing / etc.

Let M_n be the space of n × n matrices and let V be the (2m − n)-dimensional vector space of matrices with zeros in all non-specified entries of the problem. The maximum-determinant completion is the (only?) one whose inverse is in V. Consider the map V → M_n, B ↦ B^{−1}, and its projection f : V → V where we zero out all of the other entries. The function f can be evaluated in time O(n^ω). This should be doable using a Picard or Newton iteration, with a number of steps that depends on the desired precision. Of course, this would not be enough to reach O(mn) in the small-m case. Would it be useful if I try to spell this out more precisely?

Side note: the drawback of having posted this question in multiple places at the same time is that the discussion is fragmented. I could move the comment to MathOverflow if you think it is better.

From what I've seen, PSD testing can be done by trying to make a Cholesky decomposition (writing the matrix as LL^* with L lower-triangular) and seeing if it fails. The Cholesky decomposition is an LU decomposition in which the lower-triangular L and upper-triangular U are simply taken to have the same diagonal entries, so PSD testing should have the same complexity as LU decomposition.
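The Cholesky-based PSD test mentioned above can be sketched as follows. This is a minimal illustration using NumPy, not part of the original post; the small diagonal shift by eps implements the relaxed acceptance criterion of tolerating eigenvalues down to −ε.

```python
import numpy as np

def is_psd(M, eps=1e-8):
    """Test positive semidefiniteness by attempting a Cholesky factorization.

    np.linalg.cholesky requires strict positive definiteness, so we shift the
    diagonal by eps; this accepts any matrix whose eigenvalues are >= -eps.
    """
    try:
        np.linalg.cholesky(M + eps * np.eye(M.shape[0]))
        return True
    except np.linalg.LinAlgError:
        return False

# PSD example (eigenvalues 1 and 3) vs. an indefinite one (eigenvalues 3 and -1).
print(is_psd(np.array([[2.0, 1.0], [1.0, 2.0]])))  # True
print(is_psd(np.array([[1.0, 2.0], [2.0, 1.0]])))  # False
```

The factorization costs O(n³) with the textbook algorithm (or matrix-multiplication time asymptotically), matching the comment's point that PSD testing reduces to an LU-style decomposition.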
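To make the maximum-determinant characterization concrete, here is a small numerical illustration. The 3×3 pattern and its entries are my own hypothetical example, not from the discussion; for this tridiagonal specification pattern the maximum-determinant fill-in has a simple closed form, which lets us check directly that the inverse of that completion vanishes in the unspecified entries.

```python
import numpy as np

# Hypothetical instance: specified entries are the diagonal (2, 2, 2) and the
# off-diagonal pairs (0,1) and (1,2), both equal to 1; (0,2)/(2,0) is free.
def completion(x):
    return np.array([[2.0, 1.0,   x],
                     [1.0, 2.0, 1.0],
                     [  x, 1.0, 2.0]])

# For this tridiagonal pattern the maximum-determinant completion satisfies
# A[0,2] = A[0,1] * A[1,2] / A[1,1].
x_star = 1.0 * 1.0 / 2.0  # = 0.5
A = completion(x_star)

# The characterization from the comment: the inverse of the
# maximum-determinant completion is zero in the unspecified entries.
assert abs(np.linalg.inv(A)[0, 2]) < 1e-12

# Sanity check that x_star is a local maximum of the determinant.
for dx in (-0.1, 0.1):
    assert np.linalg.det(completion(x_star + dx)) < np.linalg.det(A)
```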
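As a point of intuition for the Õ(nm) budget in Question 2: merely evaluating the m requested entries of AA^T entry-by-entry already takes O(nm) arithmetic, since each entry is a dot product of two rows of A. A minimal sketch (the dimensions and index set are arbitrary hypothetical choices, and this is of course not a solution to the question, which asks for a full PSD matrix agreeing on those entries):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 10
A = rng.standard_normal((n, n))

# m requested (row, col) positions of A @ A.T, arbitrarily chosen here.
idx = [(i % n, (i * 3) % n) for i in range(m)]

# Entry (i, j) of A @ A.T is the dot product of rows i and j of A,
# so all m requested entries cost O(n m) to evaluate.
vals = {(i, j): A[i] @ A[j] for (i, j) in idx}

# Check against the dense product (which would cost O(n^3) naively).
full = A @ A.T
assert all(np.isclose(vals[i, j], full[i, j]) for (i, j) in idx)
```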