You need much less memory than time
Just as I was complaining that we haven't seen many surprising breakthroughs in complexity recently, we get an earthquake of a result to start the year: Ryan Williams shows that every algorithm can be simulated using considerably less memory than its running time.
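In complexity-class notation, the headline theorem is the containment

\[
\mathrm{DTIME}\big(t(n)\big) \subseteq \mathrm{DSPACE}\!\left(\sqrt{t(n)\log t(n)}\right),
\]

that is, anything computable in time \(t(n)\) on a multitape Turing machine is computable in space not much more than \(\sqrt{t(n)}\).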
Using the fact that it takes \(\sqrt{t(n)}\) steps for the head to cross an entire segment, Williams, with some clever tricks, models acceptance of the Turing machine as a circuit of bounded degree and depth \(\sqrt{t(n)}\), where the wires carry the contents of the size-\(\sqrt{t(n)}\) segments at various times in the computation (a toy sketch of this segment decomposition appears below).

In 1986 my advisor Mike Sipser gave the first hardness vs. randomness result, showing roughly that if there were problems that take time \(2^n\) but cannot be solved in space \(2^{.99n}\) on multi-tape Turing machines, then RP = P. Williams' theorem kills this assumption (any time-\(2^n\) problem can now be solved in space roughly \(2^{n/2}\), well below \(2^{.99n}\)), though we've developed weaker assumptions since.

One direction for pushing the result further: maybe try to use the Cook-Mertz tree-evaluation techniques directly in the Turing machine simulation instead of going through computation trees.
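To make the segment decomposition above concrete, here is a toy Python sketch; it is my own illustration, not Williams' actual construction, and the function name `block_dependencies` and the sweeping-head example are hypothetical. It chops a length-\(t\) head trajectory into time blocks of about \(\sqrt{t}\) steps and the tape into segments of about \(\sqrt{t}\) cells, then prints which previous-block segment values each (block, segment) node depends on.

```python
import math

def block_dependencies(head_positions, t):
    """Toy illustration: divide a length-t computation into time blocks
    of ~sqrt(t) steps and the tape into segments of ~sqrt(t) cells.
    The head moves one cell per step, so within one block it touches at
    most two adjacent segments; each (block, segment) value therefore
    depends on O(1) values from the previous block -- the bounded-degree,
    depth-~sqrt(t) circuit shape described above."""
    B = max(1, math.isqrt(t))                  # block/segment size ~ sqrt(t)
    deps = {}
    for b in range(math.ceil(t / B)):          # time block index
        # Segments the head visits during block b.  Untouched segments
        # simply copy their contents forward, so we omit them here.
        touched = {p // B for p in head_positions[b * B:(b + 1) * B + 1]}
        for seg in touched:
            # Block -1 stands for the input configuration.
            deps[(b, seg)] = {(b - 1, s) for s in touched}
    return deps

# Example: a head sweeping right one cell per step for t = 16 steps.
t = 16
for node, ds in sorted(block_dependencies(list(range(t + 1)), t).items()):
    print(node, "<-", sorted(ds))
```

In this toy run every node has fan-in at most two; roughly speaking, it is this bounded-degree structure that a space-efficient evaluation procedure like Cook-Mertz can exploit.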