Actually, this is a very strong vote for Grok. I checked, and it seems that yes, it did improve the lower bound in a serious probability result in 2025. Multi-agent with search and code execution, but why handicap yourself when you can actually use tools? DS (DeepSeek, web only) fails/gives up.
Paata Ivanisvili, 18 Feb 2026
Grok 4.20 (Beta) improves the lower bound by 9.1% on the Gaussian perimeter of convex sets in two minutes. This is something that was pointed out to me by Xinyuan Xie.

Back in 1993, Keith Ball showed that the Gaussian perimeter of a convex body in n-dimensional Euclidean space is bounded from above by 4n^{1/4}. As for the lower bound, Ball showed that for a cube (of appropriate size) the perimeter can grow as \sqrt{\log(n)}. So for a while there was a gap as to which bound is sharp, until 2003, when, in a beautiful paper, Fedor Nazarov showed that on the example of a random polyhedron (the intersection of many random half-spaces) the lower bound can grow as C n^{1/4}, with C = \exp(-5/4) = 0.286.... Besides, Nazarov also improved the constant 4 in the upper bound (replacing it with 0.64) when n is large. These bounds stayed unbeaten until 2019, when Martin Raič managed to improve the upper-bound constant factor from 0.64 to 0.59.

Grok 4.20 (Beta), by more carefully optimizing Nazarov's construction, managed to improve the lower-bound constant from 0.286 to 0.3126. I find this surprising even if it is just playing within the techniques of Nazarov's paper, because very recently Nadimpalli and Pascale (2025) posted a preprint where, with a different approach, they recovered Nazarov's lower bound with the same constant factor 0.286.... Grok was very generous in its response: it said that the improvement it provided follows the same argument of Nazarov "line-by-line," whereas when I asked other models (other than Grok) to verify Grok's claim, they agreed on everything except this part; they said the improvement is not really "line-by-line" :D.

Finally, I would not say that Nazarov missed this improvement. Knowing him for a long time, I am pretty confident it is common for him to sacrifice optimal constants for algebraic elegance. Why is all this interesting?
Having control of the Gaussian perimeter allows one to control the Fourier tails of characteristic functions of these sets, which in turn controls the time complexity of PAC learning and agnostic learning algorithms for this family (see Klivans–O'Donnell–Servedio).

References:
- Chat link with Grok 4.20 (Beta).
- Keith Ball. The Reverse Isoperimetric Problem for Gaussian Measure. Discrete and Computational Geometry, 10:411–420, 1993.
- Adam Klivans, Ryan O'Donnell, and Rocco A. Servedio. Learning geometric concepts via Gaussian surface area. In Proc. 49th IEEE Symposium on Foundations of Computer Science (FOCS), pages 541–550, 2008.
- Shivam Nadimpalli and Caleb Pascale. On the Maximal Gaussian Perimeter of Convex Sets, Revisited. Preprint, 2025.
- Fedor Nazarov. On the maximal perimeter of a convex set in R^n with respect to a Gaussian measure. In Geometric Aspects of Functional Analysis (2001–2002), pages 169–187. Lecture Notes in Mathematics, vol. 1807, Springer, 2003.
- Martin Raič. A multivariate Berry–Esseen theorem with explicit constants. Bernoulli, 25(4A):2824–2853, 2019.
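A small numerical aside may make Ball's cube example concrete. For the cube [-a, a]^n, the Gaussian perimeter is 2n·φ(a)·(2Φ(a) − 1)^{n−1}, since each of the 2n faces has Gaussian surface measure φ(a)·(2Φ(a) − 1)^{n−1}; optimizing over a gives the √(log n) growth mentioned above. A minimal Python sketch of this (standard library only; an illustration of the quantities involved, not Nazarov's random-polytope construction, and the grid search over a is just a convenience, not how any of the cited papers optimize):

```python
import math

def phi(x):
    # standard normal density
    return math.exp(-x * x / 2) / math.sqrt(2 * math.pi)

def Phi(x):
    # standard normal CDF via the error function
    return 0.5 * (1 + math.erf(x / math.sqrt(2)))

def cube_gaussian_perimeter(n, steps=2000, a_max=10.0):
    """Maximize the Gaussian surface measure of [-a, a]^n over the half-width a.

    Each of the 2n faces contributes phi(a) * (2*Phi(a) - 1)**(n - 1),
    the density at the face times the Gaussian measure of its cross-section.
    """
    best = 0.0
    for i in range(1, steps + 1):
        a = a_max * i / steps
        best = max(best, 2 * n * phi(a) * (2 * Phi(a) - 1) ** (n - 1))
    return best

for n in (10, 100, 1000, 10000):
    p = cube_gaussian_perimeter(n)
    # the cube's perimeter grows on the order of sqrt(log n),
    # far below Ball's upper bound 4 * n^{1/4}
    print(n, round(p, 3), round(math.sqrt(math.log(n)), 3), round(4 * n ** 0.25, 3))
```

The printout shows the gap the post describes: the cube stays within a constant factor of √(log n), while the upper bound 4n^{1/4} (and Nazarov's C·n^{1/4} random polytope, which this sketch does not construct) lives on an entirely different scale.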
To be clear, if I tell DS NOT to give up, it thinks much harder, 12 minutes here, and offers an idea for how the constant can be improved. But the generated code fails. Looking back at it, it gives up. Actually, qualitatively it seems to be "right", but it gets 0.3116, < Grok.
If DeepSeek's code is fixed (even by DeepSeek itself), it produces a result that converges to Grok's value. So I suppose that with even a fairly trivial REPL it would have "succeeded" just as well. Either way, more utility from Grok here.