

Optimal resizable arrays

Uri Zwick from Tel Aviv University

  • 13.6.2022, 13:00 - 14:00

We invite you to join us on Monday, June 13, at 13:00 in room M103 to hear Uri Zwick from Tel Aviv University speak about resizable arrays. The talk is part of the ICE-TCS seminar series at the Department of Computer Science.

A resizable array is an array that can grow or shrink by adding or removing items from both its ends while still enabling constant-time access to each item stored in the array, given its index. As the size of an array, i.e., the number of items in it, varies over time, space-efficient maintenance of a resizable array necessitates dynamic memory management. A standard doubling technique allows the maintenance of an array of size $N$ using only $O(N)$ space, with $O(1)$ amortized time, or even $O(1)$ worst-case time, per operation. Sitarski and Brodnik et al. describe a much better solution that maintains a resizable array of size $N$ using only $N+O(\sqrt{N})$ space, still with $O(1)$ time per operation.
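To make the doubling technique concrete, here is a minimal Python sketch (my own illustration, not the speaker's code; the class and method names are hypothetical): the backing buffer is doubled when it fills up and halved when only a quarter of it is in use, which keeps the space at $O(N)$ and amortizes the copying cost to $O(1)$ per operation.

```python
class DoublingArray:
    """Resizable array at one end, using the standard doubling technique."""

    def __init__(self):
        self._buf = [None]      # backing storage; its length is a power of two
        self._n = 0             # number of items actually stored

    def __len__(self):
        return self._n

    def get(self, i):
        if not 0 <= i < self._n:
            raise IndexError(i)
        return self._buf[i]     # O(1) worst-case access by index

    def grow(self, x):          # append an item at the back
        if self._n == len(self._buf):
            self._resize(2 * len(self._buf))   # double when full
        self._buf[self._n] = x
        self._n += 1

    def shrink(self):           # remove the last item
        if self._n == 0:
            raise IndexError("shrink from empty array")
        self._n -= 1
        x = self._buf[self._n]
        self._buf[self._n] = None
        if 0 < self._n <= len(self._buf) // 4:
            self._resize(len(self._buf) // 2)  # halve when mostly empty
        return x

    def _resize(self, cap):     # copy all items into a fresh buffer
        new = [None] * cap
        new[:self._n] = self._buf[:self._n]
        self._buf = new
```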

Brodnik et al. give a simple proof that this is the best possible. We distinguish between the space needed for storing a resizable array and retrieving items from it, and the temporary space needed while growing or shrinking the array. For every $r\ge 2$, we show that $N+O(N^{1/r})$ space is sufficient for storing and accessing an array of size $N$ if $N+O(N^{1-1/r})$ space can be used briefly during grow and shrink operations. Accessing an item by index takes $O(1)$ worst-case time, while grow and shrink operations take $O(r)$ amortized time. Using an exact analysis of an "insertion game", we show that for any data structure that uses only $N+O(N^{1/r})$ space to store the array, the amortized cost of growing is $\Omega(r)$, even if only grow and access operations are allowed.
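For intuition, the $r=2$ case corresponds to the $N+O(\sqrt{N})$ bound mentioned above and can be realized with a two-level scheme: items live in blocks of size roughly $\sqrt{N}$, and a small index array maps a position to its block. The Python sketch below is my own simplified illustration of this flavor of construction, not the construction from the paper (the names are hypothetical); it rebuilds with larger blocks whenever the number of blocks grows too large, so the extra space stays $O(\sqrt{N})$ while rebuilds are rare enough to keep grow amortized $O(1)$.

```python
class BlockArray:
    """Two-level resizable array: blocks of size about sqrt(N) plus an index array."""

    def __init__(self):
        self._block_size = 1    # kept within a constant factor of sqrt(N)
        self._blocks = []       # index array of blocks; only the last block may be partial
        self._n = 0             # number of items stored

    def __len__(self):
        return self._n

    def get(self, i):
        if not 0 <= i < self._n:
            raise IndexError(i)
        b, off = divmod(i, self._block_size)
        return self._blocks[b][off]          # O(1) worst-case access

    def grow(self, x):
        b, off = divmod(self._n, self._block_size)
        if b == len(self._blocks):
            self._blocks.append([None] * self._block_size)  # allocate a new block
        self._blocks[b][off] = x
        self._n += 1
        # Rebuild with doubled block size once there are many more blocks than
        # the block size itself; this keeps the wasted space (one partial block
        # plus the index array) at O(sqrt(N)), and rebuilds are infrequent enough
        # that their cost amortizes to O(1) per grow.
        if len(self._blocks) > 2 * self._block_size:
            self._rebuild(2 * self._block_size)

    def _rebuild(self, new_block_size):
        items = [self.get(i) for i in range(self._n)]  # read out with the old layout
        self._block_size = new_block_size
        self._blocks = []
        self._n = 0
        for x in items:                                # re-insert under the new layout
            b, off = divmod(self._n, self._block_size)
            if b == len(self._blocks):
                self._blocks.append([None] * self._block_size)
            self._blocks[b][off] = x
            self._n += 1
```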

Joint work with Robert E. Tarjan

--

Uri Zwick is a professor of computer science at Tel Aviv University. He is best known for his work on graph algorithms, in particular on distances in graphs and on the color-coding technique for subgraph isomorphism. He is also the namesake of the Karloff–Zwick algorithm for approximating the MAX-3SAT problem of Boolean satisfiability. Together with Mike Paterson, Yuval Peres, Mikkel Thorup and Peter Winkler, he received the David P. Robbins Prize in 2011 for work on the block-stacking problem presented in the following papers:

- “Overhang,” American Mathematical Monthly 116, January 2009; https://www.researchgate.net/publication/1769859_Overhang

- “Maximum Overhang,” American Mathematical Monthly 116, December 2009; https://www.maa.org/sites/default/files/pdf/upload_library/22/Robbins/Patterson2.pdf

Together, the two papers give an ingenious solution to the classic problem of stacking blocks on a table so that they reach out the furthest horizontal distance beyond the edge of the table without falling.

In an article with Mike Paterson (https://doi.org/10.1016/0304-3975(93)90355-W), Uri Zwick determined the optimal strategy for playing the popular Memory Game, also known as Concentration.



Please note that photographs and videos are taken at events hosted at Reykjavik University (RU) and may be used for RU marketing purposes. Read more about this at ru.is or send an e-mail to personuvernd@ru.is.