
Theory and Foundations News

6 papers accepted to FOCS 2024

Six papers from the Theory and Foundations Research Group and the Centre for Discrete Mathematics and Its Applications (DIMAP) have been accepted to the 65th IEEE Symposium on Foundations of Computer Science (FOCS 2024), the flagship conference in theoretical computer science, which will be held on 27-30 October 2024 in Chicago, USA:

  • "Optimal Coding Theorems for Randomized Kolmogorov Complexity and Its Applications" by Shuichi Hirahara, Zhenjian Lu and Mikito Nanashima.
  • "On the Complexity of Avoiding Heavy Elements" by Zhenjian Lu, Igor C. Oliveira, Hanlin Ren and Rahul Santhanam.
Fri 28 Jun 2024, 20:39 | Tags: Research Theory and Foundations

Breakthrough result on the power of memory in computation

A recent paper by Dr. Ian Mertz, a postdoctoral researcher in the Theory and Foundations (FoCS) research group and the Centre for Discrete Mathematics and its Applications (DIMAP), has disproved a longstanding conjecture on the limitations of space-bounded computation.

For many years it was believed that a function known as Tree Evaluation would be the key to separating two fundamental classes of problems: those computable quickly (P) and those computable in low space (L). Mertz, together with James Cook of Toronto, built on their earlier work to give a low-space algorithm for Tree Evaluation, refuting this belief. Their technique has attracted attention for shedding new light on the power of space-bounded computation, suggesting novel approaches to long-standing questions in complexity theory: they show that space can be used in surprising ways, with the same memory serving many purposes simultaneously.
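To make the object of study concrete, here is a minimal sketch of the Tree Evaluation problem itself (the naming and representation below are our own, illustrative choices, not taken from the paper): each leaf of a complete binary tree holds a value in {0, ..., k-1}, each internal node holds a function combining its children's values, and the task is to compute the root's value. The naive recursive evaluation shown here keeps one intermediate value per level of the tree; the surprise in Cook and Mertz's work is, roughly, that this memory cost can be largely avoided by reusing the same registers across subproblems. This sketch only states the problem, not their algorithm.

```python
def evaluate(node):
    """Naive recursive evaluation of a Tree Evaluation instance.

    Stores one intermediate value per recursion level, i.e. memory
    proportional to the height of the tree.
    """
    if node["leaf"]:
        return node["value"]
    left = evaluate(node["left"])
    right = evaluate(node["right"])
    return node["f"](left, right)

# Tiny example with k = 5: compute ((2 + 3) * (1 + 4)) mod 5.
k = 5
tree = {
    "leaf": False,
    "f": lambda a, b: (a * b) % k,  # root combines its children by multiplication mod k
    "left": {
        "leaf": False,
        "f": lambda a, b: (a + b) % k,  # addition mod k
        "left": {"leaf": True, "value": 2},
        "right": {"leaf": True, "value": 3},
    },
    "right": {
        "leaf": False,
        "f": lambda a, b: (a + b) % k,
        "left": {"leaf": True, "value": 1},
        "right": {"leaf": True, "value": 4},
    },
}
print(evaluate(tree))  # (2+3)%5 = 0, (1+4)%5 = 0, (0*0)%5 = 0, so prints 0
```

The hardness intuition was that the two subtree values seem to be needed simultaneously before the root's function can be applied, forcing memory to accumulate with height; the low-space algorithm shows this intuition fails.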

The paper, which Mertz will present at the 56th Annual ACM Symposium on Theory of Computing (STOC 2024), has been invited to the conference's special issue of the SIAM Journal on Computing (SICOMP). STOC is the flagship conference of the Association for Computing Machinery (ACM) and one of the two premier venues in theoretical computer science, with only the top results invited for publication in the special issue.

Mertz has also presented this work at many venues, including the Institute for Advanced Study (IAS), Columbia University, Oxford University, Warwick (Online Complexity Seminar), and McGill University.

Sun 23 Jun 2024, 22:27 | Tags: People Highlight Research Theory and Foundations

Seven papers accepted to ICML 2024

Seven papers authored by Computer Science researchers from Warwick have been accepted for publication at the 41st International Conference on Machine Learning (ICML 2024), one of the top three global venues for machine learning research, which will be held on 21-27 July 2024 in Vienna, Austria:

  • Agent-Specific Effects: A Causal Effect Propagation Analysis in Multi-Agent MDPs, by Stelios Triantafyllou, Aleksa Sukovic, Debmalya Mandal, and Goran Radanovic
  • Dynamic Facility Location in High Dimensional Euclidean Spaces, by Sayan Bhattacharya, Gramoz Goranci, Shaofeng Jiang, Yi Qian, and Yubo Zhang (Accepted as a spotlight, among the top 13 percent of all accepted papers)
  • High-Dimensional Kernel Methods under Covariate Shift: Data-Dependent Implicit Regularization, by Yihang Chen, Fanghui Liu, Taiji Suzuki, and Volkan Cevher
  • Revisiting character-level adversarial attacks, by Elias Abad Rocamora, Yongtao Wu, Fanghui Liu, Grigorios Chrysos, and Volkan Cevher
  • Reward Model Learning vs. Direct Policy Optimization: A Comparative Analysis of Learning from Human Preferences, by Andi Nika, Debmalya Mandal, Parameswaran Kamalaruban, Georgios Tzannetos, Goran Radanovic, and Adish Singla
  • To Each (Textual Sequence) Its Own: Improving Memorized-Data Unlearning in Large Language Models, by George-Octavian Bărbulescu and Peter Triantafillou
  • Towards Neural Architecture Search through Hierarchical Generative Modeling, by Lichuan Xiang, Łukasz Dudziak, Mohamed Abdelfattah, Abhinav Mehrotra, Nicholas Lane, and Hongkai Wen
