Paper Title

An Optimal Lower Bound for Simplex Range Reporting

Authors

Peyman Afshani, Pingan Cheng

Abstract

We give a simplified and improved lower bound for the simplex range reporting problem. We show that given a set $P$ of $n$ points in $\mathbb{R}^d$, any data structure that uses $S(n)$ space to answer such queries must have $Q(n)=\Omega((n^2/S(n))^{(d-1)/d}+k)$ query time, where $k$ is the output size. For near-linear space data structures, i.e., $S(n)=O(n\log^{O(1)}n)$, this improves the previous lower bounds by Chazelle and Rosenberg [CR96] and Afshani [A12], but perhaps more importantly, it is the first ever tight lower bound for any variant of simplex range searching for $d\ge 3$ dimensions. We obtain our lower bound by making a simple connection to well-studied problems in incidence geometry, which allows us to use known constructions in the area. We observe that a small modification of an existing simple construction leads to our lower bound. We believe that our proof is accessible to a much wider audience, at least compared to the previous intricate probabilistic proofs based on measure arguments by Chazelle and Rosenberg [CR96] and Afshani [A12]. The lack of tight or almost-tight (up to polylogarithmic factors) lower bounds for near-linear space data structures is a major bottleneck in making progress on problems such as proving lower bounds for multilevel data structures. It is our hope that this new line of attack based on incidence geometry can lead to further progress in this area.
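To get a feel for the space-time trade-off in the bound, the following sketch (an illustration only; the function name and sample parameters are our own, not from the paper) evaluates the first term of $Q(n)=\Omega((n^2/S(n))^{(d-1)/d}+k)$, ignoring the output-size term $k$ and the hidden constant:

```python
def query_time_lower_bound(n: int, space: float, d: int) -> float:
    """Evaluate the (n^2 / S(n))^{(d-1)/d} term of the lower bound.

    Ignores the additive output-size term k and the constant hidden
    by the Omega notation.
    """
    return (n ** 2 / space) ** ((d - 1) / d)

n, d = 1_000_000, 3

# Linear space S(n) = n: the bound forces Q(n) = Omega(n^{(d-1)/d}),
# i.e. roughly n^{2/3} = 10^4 for d = 3 here.
print(query_time_lower_bound(n, n, d))

# Quadratic space S(n) = n^2: the first term collapses to 1,
# consistent with constant-time (plus output) answers being possible.
print(query_time_lower_bound(n, n ** 2, d))
```

For near-linear space $S(n)=O(n\log^{O(1)}n)$, the same calculation shows the query time must be $n^{(d-1)/d}$ up to polylogarithmic factors, which is the regime where the paper's bound is tight.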
