
python - Efficient way to find overlapping of N rectangles

I am trying to find an efficient way to detect overlaps among n rectangles, where the rectangles are stored in two separate lists. I am looking for all rectangles in listA that overlap with rectangles in listB (and vice versa). Comparing every element of the first list against every element of the second takes an immense amount of time, so I am looking for a more efficient solution.

I have two lists of rectangles:

rect = Rectangle(10, 12, 56, 15)
rect2 = Rectangle(0, 0, 1, 15)
rect3 = Rectangle(10, 12, 56, 15)

listA = [rect, rect2]
listB = [rect3]

where the rectangles are instances of the class:

import itertools as it


class Rectangle(object):
    # Note: arguments are ordered (left, bottom, right, top), so the example
    # rectangles above describe valid boxes with bottom <= top.
    def __init__(self, left, bottom, right, top):
        self.left = left
        self.bottom = bottom
        self.right = right
        self.top = top

    def overlap(self, other):
        # No horizontal overlap if one rectangle lies entirely to the left or
        # right of the other; analogously for vertical overlap.
        hoverlaps = not (self.left > other.right or self.right < other.left)
        voverlaps = not (self.top < other.bottom or self.bottom > other.top)
        return hoverlaps and voverlaps

I need to compare the rectangles in listA against those in listB. The code below does this pair by pair, which is highly inefficient:

for a, b in it.product(listA, listB):   # every (a, b) pair: len(listA) * len(listB) checks
    if a.overlap(b):
        pass                            # handle the overlapping pair here

Is there a more efficient way to deal with the problem?


1 Answer


First off: as with many a problem from computational geometry, specifying the parameters for order-of-growth analysis needs care: calling the lengths of the lists m and n, the worst case in just those parameters is Ω(m×n), as all areas might overlap (in this regard, the algorithm from the question is asymptotically optimal). It is usual to include the size of the output: t = f(m, n, o) (an output-sensitive algorithm).
Trivially, f ∈ Ω(m+n+o) for the problem presented.


Line Sweep is a paradigm to reduce geometrical problems by one dimension - in its original form, from 2D to 1D, plane to line.

Imagine all the rectangles in the plane, with a different colour per list.
Now sweep a line across this plane - left to right, conventionally, and infinitesimally further to the right "for low y-coordinates" (handle coordinates in increasing x-order, increasing y-order for equal x).
Throughout this sweep (or scan), keep one set per colour representing the "y-intervals" of all rectangles at the current x-coordinate, starting empty. (Use a data structure supporting insertion, deletion, and enumerating all intervals that overlap a query interval: see below.)
On meeting the left side of a rectangle, add its y-interval to the data structure for its colour, and report the overlapping intervals/rectangles of every other colour.
On meeting a right side, remove the y-interval.
Depending on the definition of "overlapping", handle left sides before right sides - or the other way round (see the sketch below).
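
To make the sweep concrete, here is a minimal sketch of the idea above (my own illustration, not the asker's or answerer's code; it assumes the Rectangle class from the question with the fixed constructor). The active sets are plain lists, so each event scans the other colour linearly; swapping in an interval-query structure, as discussed next, is what gives the better bound.

def sweep_overlaps(listA, listB):
    OPEN, CLOSE = 0, 1                     # OPEN sorts before CLOSE at equal x,
                                           # so touching edges count as overlapping
    events = []
    for side, rects in ((0, listA), (1, listB)):
        for r in rects:
            events.append((r.left,  OPEN,  side, r))
            events.append((r.right, CLOSE, side, r))
    events.sort(key=lambda e: (e[0], e[1]))

    active = ([], [])                      # open rectangles per colour (listA, listB)
    pairs = []
    for x, kind, side, r in events:
        if kind == OPEN:
            # report against all currently open rectangles of the other colour
            for other in active[1 - side]:
                if r.bottom <= other.top and other.bottom <= r.top:
                    pairs.append((r, other) if side == 0 else (other, r))
            active[side].append(r)
        else:
            active[side].remove(r)
    return pairs

On the example data, sweep_overlaps(listA, listB) reports the single pair (rect, rect3).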


There are many data structures supporting insertion and deletion of intervals, and finding all intervals that overlap a query interval. Currently, I think augmented search trees may be easiest to understand, implement, test, and analyse…
Using such a structure, enumerating all o intersecting pairs of axis-aligned rectangles (a, b) from listA and listB should be possible in O((m+n)log(m+n)+o) time and O(m+n) space. For sizeable problem instances, avoid data structures needing more than linear space (for example, (original) segment trees, in the context of interval overlap).
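
If the third-party intervaltree package is available (an assumption on my part, not something the answer prescribes; any structure with insert, delete and overlap-query will do), the active sets in the sketch above can hold y-intervals instead of plain lists, replacing the linear scan per event with a query whose cost is logarithmic plus the output size:

from intervaltree import IntervalTree

active = (IntervalTree(), IntervalTree())          # one tree of y-intervals per colour

def open_rect(side, r):
    active[side].addi(r.bottom, r.top, r)          # insert the rectangle's y-interval

def close_rect(side, r):
    active[side].removei(r.bottom, r.top, r)       # delete the rectangle's y-interval

def hits(side, r):
    # open rectangles of the other colour whose y-interval overlaps r's;
    # note intervaltree treats intervals as half-open, so rectangles that
    # merely touch need extra care if they should count as overlapping
    return [iv.data for iv in active[1 - side].overlap(r.bottom, r.top)]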


Another paradigm in algorithm design is Divide & Conquer: for a computational-geometry problem, choose one dimension in which the problem can be divided into independent parts, and a coordinate such that the sub-problems for "coordinates below" and "coordinates above" are close in expected run-time. Quite possibly, another (and different) sub-problem "including the coordinate" needs to be solved. This tends to be beneficial when a) the run-time for solving the sub-problems is "super-log-linear", and b) there is a cheap (linear) way to construct the overall solution from the solutions of the sub-problems.
This lends itself to concurrent problem solving, and can be combined with any other approach for the sub-problems, including line sweep; a rough sketch follows below.
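
The following is my own illustration of that idea (the splitting rule, the brute-force threshold and all names are assumptions, not part of the answer): split on an x-coordinate, recurse on the rectangles touching each half-plane, and collect index pairs in a set so that pairs found on both sides are reported once.

def overlapping_pairs_dc(listA, listB):
    out = set()                                    # pairs of (index in listA, index in listB)

    def brute(ia, ib):
        for i in ia:
            for j in ib:
                if listA[i].overlap(listB[j]):
                    out.add((i, j))

    def recurse(ia, ib):
        if len(ia) * len(ib) <= 64:                # small sub-problem: just compare
            brute(ia, ib)
            return
        xs = sorted([listA[i].left for i in ia] + [listB[j].left for j in ib])
        split = xs[len(xs) // 2]                   # median left edge as the split line
        lo = ([i for i in ia if listA[i].left <= split],
              [j for j in ib if listB[j].left <= split])
        hi = ([i for i in ia if listA[i].right >= split],
              [j for j in ib if listB[j].right >= split])
        # every overlapping pair shares an x-coordinate that is <= split or
        # >= split, so it survives into at least one of the two sub-problems
        for sub in (lo, hi):
            if len(sub[0]) + len(sub[1]) < len(ia) + len(ib):
                recurse(*sub)
            else:
                brute(*sub)                        # split made no progress: stop recursing

    recurse(range(len(listA)), range(len(listB)))
    return [(listA[i], listB[j]) for i, j in sorted(out)]

Because out is a set of index pairs, a pair discovered on both sides of the split is still reported only once; the guard falls back to the naive check whenever a split fails to shrink the input.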


There will be many ways to tweak each approach, starting with disregarding input items that can't possibly contribute to the output (see the sketch below). To "fairly" compare implementations of algorithms of like order of growth, don't aim for an equal "level of tweakedness"; rather, invest comparable amounts of time in tweaking each.
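
For instance (my own example of such a pre-filter, using the Rectangle class from the question), rectangles lying entirely outside the other list's bounding box cannot contribute any overlap and can be discarded up front:

def bounding_box(rects):
    # smallest axis-aligned rectangle enclosing every rectangle in the list
    return Rectangle(min(r.left for r in rects), min(r.bottom for r in rects),
                     max(r.right for r in rects), max(r.top for r in rects))

box_b = bounding_box(listB)
listA_pruned = [r for r in listA if r.overlap(box_b)]   # only these can overlap listB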

