CN113112600A - Indoor scene three-dimensional modeling method based on structure

Info

Publication number
CN113112600A
Authority
CN
China
Prior art keywords
vertex
polygon
boundary
matching
point
Prior art date
2021-04-02
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202110361587.9A
Other languages
Chinese (zh)
Other versions
CN113112600B (en)
Inventor
许威威
徐瀚彤
鲍虎军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Zhejiang University ZJU
Original Assignee
Zhejiang University ZJU
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
2021-04-02
Filing date
2021-04-02
Publication date
2021-07-13
2021-04-02 Application filed by Zhejiang University ZJU
2021-04-02 Priority to CN202110361587.9A
2021-07-13 Publication of CN113112600A
2023-03-03 Application granted
2023-03-03 Publication of CN113112600B
Status: Active
2041-04-02 Anticipated expiration

Links

  • 238000000034 method Methods 0.000 title claims abstract description 68
  • 230000011218 segmentation Effects 0.000 claims abstract description 15
  • 230000002159 abnormal effect Effects 0.000 claims abstract description 8
  • 238000013138 pruning Methods 0.000 claims description 27
  • 239000013598 vector Substances 0.000 claims description 21
  • 238000005457 optimization Methods 0.000 claims description 15
  • 230000015556 catabolic process Effects 0.000 claims description 13
  • 238000006731 degradation reaction Methods 0.000 claims description 13
  • 230000033001 locomotion Effects 0.000 claims description 11
  • 239000000463 material Substances 0.000 claims description 10
  • 230000002452 interceptive effect Effects 0.000 claims description 9
  • 230000009466 transformation Effects 0.000 claims description 6
  • 230000003044 adaptive effect Effects 0.000 claims description 5
  • 238000001514 detection method Methods 0.000 claims description 5
  • 238000006073 displacement reaction Methods 0.000 claims description 4
  • 238000009499 grossing Methods 0.000 claims description 4
  • 238000012217 deletion Methods 0.000 claims description 3
  • 230000037430 deletion Effects 0.000 claims description 3
  • 238000012545 processing Methods 0.000 claims description 3
  • 238000013519 translation Methods 0.000 claims description 3
  • 230000001627 detrimental effect Effects 0.000 claims description 2
  • 238000005192 partition Methods 0.000 claims description 2
  • 238000010200 validation analysis Methods 0.000 claims description 2
  • 230000006870 function Effects 0.000 description 9
  • 230000008569 process Effects 0.000 description 6
  • 238000010586 diagram Methods 0.000 description 3
  • 230000003993 interaction Effects 0.000 description 3
  • 230000008439 repair process Effects 0.000 description 3
  • 230000007547 defect Effects 0.000 description 2
  • 238000000605 extraction Methods 0.000 description 2
  • 230000000007 visual effect Effects 0.000 description 2
  • 230000009286 beneficial effect Effects 0.000 description 1
  • 238000013480 data collection Methods 0.000 description 1
  • 238000013461 design Methods 0.000 description 1
  • 238000005516 engineering process Methods 0.000 description 1
  • 238000002474 experimental method Methods 0.000 description 1
  • 238000012423 maintenance Methods 0.000 description 1
  • 238000007726 management method Methods 0.000 description 1
  • 238000013507 mapping Methods 0.000 description 1
  • 238000005259 measurement Methods 0.000 description 1
  • 238000012986 modification Methods 0.000 description 1
  • 230000004048 modification Effects 0.000 description 1
  • 230000003287 optical effect Effects 0.000 description 1
  • 230000008447 perception Effects 0.000 description 1
  • 238000003672 processing method Methods 0.000 description 1
  • 238000009418 renovation Methods 0.000 description 1
  • 238000005070 sampling Methods 0.000 description 1
  • 238000006467 substitution reaction Methods 0.000 description 1


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/10: Constructive solid geometry [CSG] using solid primitives, e.g. cylinders, cubes
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/213: Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F 18/2135: Feature extraction based on approximation criteria, e.g. principal component analysis
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/24: Classification techniques
    • G06F 18/241: Classification techniques relating to the classification model, e.g. parametric or non-parametric approaches
    • G06F 18/2413: Classification techniques relating to the classification model based on distances to training or reference patterns
    • G06F 18/24147: Distances to closest patterns, e.g. nearest neighbour classification
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 17/00: Three dimensional [3D] modelling, e.g. data description of 3D objects
    • G06T 17/005: Tree description, e.g. octree, quadtree
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/11: Region-based segmentation
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/13: Edge detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20112: Image segmentation details
    • G06T 2207/20164: Salient point detection; Corner detection
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00: Indexing scheme for image generation or computer graphics
    • G06T 2210/04: Architectural design, interior design
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2210/00: Indexing scheme for image generation or computer graphics
    • G06T 2210/61: Scene description

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Data Mining & Analysis (AREA)
  • Geometry (AREA)
  • Software Systems (AREA)
  • Computer Graphics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Artificial Intelligence (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention discloses a structure-based three-dimensional modeling method for indoor scenes, which uses planar and polygonal structures to solve the problems of noise, holes, and poorly modeled small objects in indoor scene point cloud data. The method comprises the following steps: performing plane segmentation on the point cloud and, according to the plane information, eliminating outliers and small objects whose modeling precision is low; then fitting polygons to the plane boundaries, automatically extracting the adjacency relations of the polygon set, and applying optimization-based automatic polygon bonding to eliminate the gaps caused by holes in the data and by the plane segmentation. The method solves the problems of noise, holes, and the like, ensures the consistency of the output result with the original point cloud, and can obtain a high-precision indoor scene model.

Description

Indoor scene three-dimensional modeling method based on structure

Technical Field

The invention relates to the technical field of three-dimensional modeling, in particular to a structure-based indoor scene three-dimensional modeling method.

Background

Three-dimensional modeling of indoor scenes is the basis for many applications, such as positioning and navigation, building maintenance and renovation planning, emergency management, and generating game scenes for virtual reality. A person without interior design experience can provide a three-dimensional digital representation of a dwelling to an expert or expert system to obtain better furniture placement recommendations. An indoor model with rich semantics and high geometric precision provides key information for indoor navigation services (such as the positions of exit doors, the opening directions of doors, and the positions, topological relations, and semantic attributes of indoor spaces). Three-dimensional modeling of indoor environments still faces particular challenges due to the complex layout of indoor structures, complex interactions between objects, clutter, and occlusion. (1) During data collection, it is difficult to obtain data on walls, floors, and other structures of interest because of insufficient view coverage, which leads to unsatisfactory reconstruction results. (2) Recovering internal structures and the topological relations between them (e.g., connectivity, containment, or adjacency) is difficult. (3) Weakly textured areas (such as featureless walls or floors) are common in indoor environments and cause photo-consistency measurement errors. (4) Sensor noise and outliers further complicate the modeling process. (5) Large appearance differences between scenes, together with lighting and viewpoint variations, also make automatic and robust generation of indoor models very challenging. As a result, the point cloud models of indoor scenes generated from sensor data exhibit numerous defects: large amounts of noise and outliers, holes caused by occlusion, materials, and the like, and low modeling precision for small objects.

Disclosure of Invention

In view of the defects of the prior art, the invention aims to provide a structure-based three-dimensional modeling method for indoor scenes.

In order to achieve the purpose, the invention adopts the following technical scheme: a structure-based indoor scene three-dimensional modeling method comprises the following steps:

s1: removing abnormal points and small objects in the point cloud data of the indoor scene through plane segmentation;

s2: performing polygon fitting on each plane, including extracting the plane boundary, performing neighborhood estimation for each boundary point, calculating the normal direction of each boundary point, smoothing the boundary by moving the boundary points along their normal directions, and extracting polygons using a corner detection algorithm;

s3: searching the adjacency relation among points, lines and surfaces of the polygon set according to the self-adaptive search radius; establishing a graph structure according to the adjacency relation, and carrying out local pruning and global pruning;

s4: constructing an objective function according to the adjacency relation, the planarity and the orthogonality of the polygon set and the fitting of the input point cloud, solving the objective function to perform coordinate transformation on the polygon, and automatically bonding the polygon set;

s5: an indoor scene model is generated based on the polygons.

Further, the S1 includes the following sub-steps:

s11: constructing a KDTree, performing neighborhood search on each point in the point cloud data by using a nearest neighbor algorithm, and calculating the normal direction and curvature of each point by using a PCA algorithm;

s12: performing plane segmentation by adopting region growth based on distance and normal vector judgment;

s13: fitting a plane equation to each plane using the PCA algorithm, and removing outliers and small objects according to the plane information (number of points, plane normal, etc.).

Further, in S2, projecting all points in the plane point set onto a plane according to a plane equation, converting the three-dimensional coordinates into two-dimensional coordinates, and extracting a two-dimensional point cloud boundary using an α -shape algorithm; a forward search method is applied to classify the locally smooth regions around each boundary point.

Further, in S2, calculating and optimizing the normal direction of each boundary point specifically includes: initializing a normal direction by using a PCA algorithm in a neighborhood, and optimizing the normal direction of the boundary point by using a least square method, wherein an optimization equation is as follows:

E = Σ_{(p,q)∈N} w_{p,q} ‖n_p − n_q‖² + λ Σ_p ‖n_p − n_p⁰‖², with w_{p,q} = exp(−θ_{p,q}² / (2σ²));

the first term is an energy term that minimizes the normal differences within each neighborhood and propagates them over the set N of all adjacent boundary-point pairs; the second term is a constraint term that prevents a boundary-point normal from deviating too far from its initial value; the weighting coefficient w_{p,q}, given by a Gaussian filter, penalizes differences in the normal directions, where θ_{p,q} is the angle between the normal vectors n_p and n_q, σ is the variance, n_p⁰ is the initial value of the normal vector n_p, and λ is the constraint term coefficient.

Further, in S2, the boundary is smoothed by moving the boundary point position in the normal direction, which can be expressed by the following equation:

p′ = p + t_p n_p,

where t_p is the distance the boundary point moves along its normal direction and p′ is the coordinate of the moved boundary point;

the new boundary point positions are obtained by minimizing the energy function:

E = Σ_{(p,q)∈N} [ ((p′ − q′) · n_p)² + ((p′ − q′) · n_q)² ] + μ Σ_p t_p²;

the first term smooths the point cloud boundary by minimizing the dot products between the line connecting q and p and the normal directions at q and p; the second term prevents the point cloud boundary points from deviating too far from their initial positions, and μ is the constraint term coefficient.

Further, in S3, searching for the adjacency relation between the points, lines, and surfaces of the polygon set according to the adaptive search radius includes:

s31: matching points with points;

the intrinsic stability requirement of a polygon limits the search radius to half the minimum distance from the vertex p to the polygon boundary, i.e., half

Figure BDA0003005824420000031

Wherein

Figure BDA0003005824420000032

Representing the polygon boundary after removing vertex p and its connected edges, d (p, e) representing the distance from vertex p to polygon boundary e,

Figure BDA0003005824420000033

represents the minimum distance of the vertex p to the polygon boundary;

will be provided with

Figure BDA0003005824420000034

Defined as the adaptive search radius of the vertex p, rmaxThe maximum distance between matching elements customized for the user; matching candidate set of vertex p contains

Figure BDA0003005824420000035

And searching for all vertices within a distance r (p),

Figure BDA0003005824420000036

representing a set of polygons, P representing a polygon to which the vertex P belongs; if the candidate set is empty, add to the candidate set

Figure BDA0003005824420000037

The vertex closest to the vertex p and satisfying the distance less than rmax

Will r ise(p)=max(r(p),min(rmax,dmin) Defined as the extended search radius of the vertex p, dminRepresents p and

Figure BDA0003005824420000038

the distance of the middle closest vertex; two vertices p and q are considered matched if they are contained within their respective extended search radius from each other, and require that the distance of a pair of matching points to the intersection line l of the planes in which they lie satisfies: r is not more than d (l, p)e(p),d(l,q)≤re(q)。

S32: matching points with edges;

for an edge e = (p_0, p_1), its search radius is defined as the minimum of the search radii of its two endpoints, r(e) = min( r(p_0), r(p_1) ); if the orthogonal projection of the vertex p onto e lies inside e and the following conditions are satisfied, then p matches e:

d(p, e) ≤ min( r(p), r(e) ),

d(l, p) ≤ r(p) and d(l, e) ≤ r(e)

s33: for vertex p and face f, if the orthogonal projection of p on f is inside the polygon and the projection distance is less than r (p), then p and f match; two edges match if there are two vertex-vertex matches, or one vertex-vertex and one vertex-edge match, or two vertex-edge matches, for the endpoints of the two edges.

Further, in S3, a graph structure is established according to the adjacency relation, and local pruning is performed, specifically:

establishing a graph structure G = (V, E_M) to represent the matching relations of the polygon set: all vertices and edges of the polygon set 𝒫 form V, and E_M contains all vertex-vertex/edge matches;

vertex-vertex matching usually produces stable results; mismatches caused by adding the closest vertex to an empty candidate set are corrected in the following pruning step: when two or more vertices q_i of a polygon Q match the same vertex p ∈ P, only the closest matching pair is kept, and all such cases are found by searching the one-ring neighborhoods in the matching graph G;

intrinsic stability likewise requires pruning vertex-edge matches in which a vertex matches multiple non-adjacent edges of a polygon, which occurs when the search radii around the edges overlap; each such case is compressed to the closest vertex-edge match using the graph G.

Further, in S3, constructing an expanded matching graph, and performing global pruning, specifically:

constructing an extended matching graph G_e = (V, E_e): V contains all polygon elements, and E_e = E_M ∪ E_C contains all matching relations E_M together with a constraint set E_C; E_C connects all element pairs in V that belong to the same polygon, except each polygon vertex and its incident edges; according to the extended matching graph, it is judged whether any vertex-vertex/edge match m = (p, q) ∈ E_M is part of a harmful cycle; the cycles c(m) = (m, e_0, …, e_n), e_0, …, e_n ∈ E_M, in which m connects directly to another match on both sides, are searched for, and if such a cycle is found, the corresponding match is deleted directly;

for a match m, a matching sequence is searched in the actual matching graph G, and it is checked whether all indirectly induced matches m_i are also in E_M; each match not in E_M undergoes geometric validation to determine whether it would cause polygon degradation: the matched vertices and/or edge endpoints are projected onto the common intersection line l of the planes in which the polygons lie, and if the normal vector direction of a polygon corrected by the vertex/edge projections is flipped, or the polygon self-intersects, m is pruned.

Further, the S4 specifically includes:

in order to maintain planarity, a Cartesian coordinate system is introduced; for each plane P_l ∈ 𝒫, a coordinate system with origin o and basis vectors v_1 and v_2 is established; any vertex p ∈ P is represented by coordinates (p_x, p_y) as p = o + p_x v_1 + p_y v_2; the displacement of each point is expressed by the velocity vector field of an instantaneous motion, thereby linearizing the spatial motion of each coordinate system:

v(x) = x + c̄ + c × x,

where x denotes the coordinates of a point, c̄ describes the translation of the point, c describes the rotation of the point, and v(x) represents the coordinates of the point after the movement; during optimization, the position of a vertex p ∈ P_i can be written as:

p′ = p + c̄_i + c_i × p.

using the adjacency relations, the data term is defined by the distances between matching elements:

E_data = Σ d(p_i, p_j)² + Σ d(p_i, e_k)² + Σ d(p_i, P_l)²,

where the sums run over the vertex-vertex, vertex-edge, and vertex-face matches, respectively;

to keep the polygon set 𝒫 fitted to the input point cloud, the constraint term is used:

E_cons = Σ_l Σ_i d( p_i^(l), P_l⁰ )²,

where p_i^(l) ranges over all vertices of the plane P_l and P_l⁰ denotes the initial state of P_l;

to satisfy the orthogonality ubiquitous in indoor scenes, the following two orthogonal terms are used:

E_orth1 = Σ_{i,j} w_{ij} ( n_i · n_j )²,

E_orth2 = Σ_l Σ_i w_i^(l) ( e_i^(l) · e_{i+1}^(l) )²;

E_orth1 acts on any pair of polygons: if the absolute value of the difference between the angle of two polygons and 90 degrees is smaller than an angle threshold, the orthogonal-term decision coefficient w_{ij} is 1, and otherwise 0; E_orth2 optimizes the orthogonality of adjacent edges within a polygon, where e_i^(l) and e_{i+1}^(l) denote adjacent edges;

the edge degradation this may cause is overcome by minimizing the sum of squared distances from the initial points p_i to the current vertex positions p′_i, with the distance term as follows:

E_cur = Σ_i ‖ p′_i − p_i ‖²;

the objective function is expressed as:

E = λ_data E_data + λ_cons E_cons + λ_orth (E_orth1 + E_orth2) + λ_cur E_cur,

where λ_data, λ_cons, λ_orth, and λ_cur are all weight coefficients.

Further, on the basis of automatically creating and bonding polygons, an interactive editing tool is adopted to handle polygons that are wrong or missing because of scene occlusion and material problems; a suitable two-dimensional modeling space is selected automatically based on a segmentation plane in the point cloud, reducing all interactive operations to approximately two-dimensional operations; the interactive operations include polygon editing, polygon drawing, aligning polygon boundaries to image boundaries, assigning polygon materials, and the like.

The invention has the following beneficial effects: the method processes the scene point cloud model based on planar and polygonal structures, successfully handles noise, holes, and similar problems in the original point cloud data, keeps the output consistent with the original point cloud, and obtains a high-precision indoor scene model. It can be applied to visual positioning, virtual roaming, and other applications.

Drawings

FIG. 1 is a flow chart of a method for three-dimensional modeling of an indoor scene based on a structure according to an embodiment of the present invention;

FIG. 2 shows an original point cloud (left) and a plane segmentation result (right) provided by an embodiment of the present invention;

FIG. 3 is a flow chart of polygon fitting provided by an embodiment of the present invention;

FIG. 4 is a schematic diagram of an error match provided by an embodiment of the present invention;

FIG. 5 shows a set V (left) and a constraint set E_C (right) according to an embodiment of the present invention;

FIG. 6 is a diagram, provided by an embodiment of the present invention, of loops containing multiple constraint edges that may (left) or may not result in polygon degradation;

FIG. 7 is a comparison of polygons before and after automatic bonding as provided by an embodiment of the present invention;

FIG. 8 is a schematic illustration of outliers and small object removal provided by an embodiment of the present invention;

FIG. 9 is a schematic diagram of hole repair provided by an embodiment of the present invention.

Detailed Description

The present invention will be described in further detail with reference to the following drawings and specific embodiments, it being understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.

The core of three-dimensional modeling is to realize a digital representation of the three-dimensional physical world by comprehensively using sensors and computing technology, capturing the highly realistic three-dimensional shape and appearance of objects and scenes so as to support three-dimensional interaction and perception in a digital space.

Fig. 1 is a flowchart of a structure-based three-dimensional modeling method for an indoor scene provided in an embodiment of the present invention, where an implementation flow of the method is specifically as follows:

firstly, removing abnormal points and small objects in the point cloud data of the indoor scene through plane segmentation.

The first step is as follows: performing neighborhood search and normal and curvature estimation on each point in the point cloud data. The method specifically comprises the following steps:

A KDTree is constructed on the input point cloud, a neighborhood is searched for each point using the k-nearest-neighbor (KNN) algorithm, and the normal direction and curvature of each point are calculated using the PCA algorithm.
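A minimal sketch of this neighborhood search and PCA-based estimation is given below; the use of NumPy/SciPy and the neighborhood size k are illustrative assumptions, not prescribed by the patent.

    import numpy as np
    from scipy.spatial import cKDTree

    def estimate_normals_curvature(points: np.ndarray, k: int = 20):
        """points: (N, 3) array; returns (N, 3) unit normals and (N,) curvatures."""
        tree = cKDTree(points)                    # KDTree over the point cloud
        _, idx = tree.query(points, k=k)          # KNN neighborhood of each point
        normals = np.empty_like(points)
        curvature = np.empty(len(points))
        for i, nbrs in enumerate(idx):
            cov = np.cov(points[nbrs].T)          # PCA via the covariance matrix
            eigval, eigvec = np.linalg.eigh(cov)  # eigenvalues in ascending order
            normals[i] = eigvec[:, 0]             # normal = smallest eigenvector
            # surface variation lambda_0 / (lambda_0 + lambda_1 + lambda_2)
            curvature[i] = eigval[0] / max(eigval.sum(), 1e-12)
        return normals, curvature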

The second step is as follows: performing plane segmentation by region growing based on distance and normal vector criteria. The method specifically comprises the following steps:

(1) Select the unclustered point with the minimum curvature and add it to the current plane point list C; if all points have been clustered, stop the traversal.

(2) Sequentially select an unvisited point in C as the seed point p_seed.

(3) Calculate the Euclidean distance ED_i from every point p_i in the neighborhood of p_seed to p_seed, store the distances in an array EDs, and record the median of EDs as median_ED.

(4) Calculate the orthogonal distance OD_i from every point p_i in the neighborhood of p_seed to the tangent plane at p_seed, store the distances in an array ODs, and record the median of ODs as median_OD. The median absolute deviation MAD is calculated by:

MAD = b · median( | OD_i − median_OD | ),

where b is a constant with the value 1.4826.

(5) For every point p_i in the neighborhood of p_seed, calculate R_{z,i}, the clustering weight score of p_i with respect to p_seed:

R_{z,i} = | OD_i − median_OD | / MAD.

If p_i is not yet clustered and simultaneously satisfies R_{z,i} less than the weight threshold (usually set to 2), ED_i < median_ED, and the angle between the normal vector of p_i and that of p_seed smaller than an angle threshold (typically set to 10-20 degrees), it is added to the current plane point list C.

(6) If the size of C has increased, repeat steps (2)-(5). Otherwise, extraction of the current plane is finished: C is stored in the segmentation result list R, and extraction of the next plane starts again from step (1). A sketch of this region-growing loop is given below.
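The following compact sketch ties steps (1)-(6) together; the robust statistics follow the formulas above, while the thresholds, the neighborhood size k, and the precomputed normals/curvature (e.g., from estimate_normals_curvature above) are illustrative assumptions.

    import numpy as np
    from scipy.spatial import cKDTree

    def region_grow_planes(points, normals, curvature, k=20,
                           rz_thresh=2.0, angle_thresh_deg=15.0, b=1.4826):
        tree = cKDTree(points)
        clustered = np.zeros(len(points), dtype=bool)
        cos_thresh = np.cos(np.radians(angle_thresh_deg))
        R = []                                          # segmentation result list
        for start in np.argsort(curvature):             # step (1): flattest first
            if clustered[start]:
                continue
            C, cursor = [start], 0
            clustered[start] = True
            while cursor < len(C):                      # step (2): next seed in C
                seed = C[cursor]; cursor += 1
                _, nbrs = tree.query(points[seed], k=k)
                ED = np.linalg.norm(points[nbrs] - points[seed], axis=1)
                OD = np.abs((points[nbrs] - points[seed]) @ normals[seed])
                med_ED, med_OD = np.median(ED), np.median(OD)
                MAD = b * np.median(np.abs(OD - med_OD)) + 1e-12    # step (4)
                Rz = np.abs(OD - med_OD) / MAD          # step (5): robust score
                for j, i in enumerate(nbrs):
                    if (not clustered[i] and Rz[j] < rz_thresh
                            and ED[j] < med_ED
                            and abs(normals[i] @ normals[seed]) > cos_thresh):
                        clustered[i] = True
                        C.append(i)
            R.append(C)                                 # step (6): store the plane
        return R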

The third step: after plane segmentation is finished, fitting a plane equation to each plane by applying a PCA method, and removing abnormal points and small objects according to plane information;

because the point cloud data of the indoor scene is generally obtained through laser scanning, and the modeling precision of small objects in the point cloud data is low, and the small objects have the characteristics of easy displacement, the method removes the small objects from the scene. Small objects and abnormal points caused by walking of pedestrians are removed through the conditions of the number of vertexes of each plane, the normal direction and the like. As shown in fig. 2, the plane segmentation method adopted by the present invention can effectively segment the original point cloud. In the figure, the problem of the point cloud after segmentation can be visually seen: (1) holes and noise due to occlusion, scanning accuracy; (2) gaps appear between the divided planes and are not communicated. Therefore, the invention adopts an optimization-based polygon bonding algorithm to obtain a closed complete polygon model without holes and noise.

Secondly, performing polygon fitting on each plane, including extracting the plane boundary, performing neighborhood estimation for each boundary point, calculating the normal direction of each boundary point, smoothing the boundary by moving the boundary points along their normal directions, and extracting polygons using a corner detection algorithm. The concrete steps are as follows:

(1) Project all points in the plane point set onto the plane according to the plane equation, converting the three-dimensional coordinates into two-dimensional coordinates, and extract the boundary of the two-dimensional point cloud using the alpha-shape algorithm of the CGAL library. As shown in fig. 3(a), the four-pointed stars indicate the boundary.
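A small sketch of this projection step is shown below; the basis construction is an assumption, and the alpha-shape extraction itself is left to CGAL as the text states.

    import numpy as np

    def project_to_plane(points, origin, normal):
        """Map (N, 3) points of a fitted plane to (N, 2) in-plane coordinates."""
        normal = normal / np.linalg.norm(normal)
        helper = np.array([1.0, 0.0, 0.0])        # any vector not parallel
        if abs(normal @ helper) > 0.9:            # to the plane normal
            helper = np.array([0.0, 1.0, 0.0])
        v1 = np.cross(normal, helper); v1 /= np.linalg.norm(v1)
        v2 = np.cross(normal, v1)                 # v1, v2 span the plane
        d = points - origin
        return np.stack([d @ v1, d @ v2], axis=1)  # (p_x, p_y) coordinates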

(2) The local smooth region around each boundary point is classified using a forward search method that preserves local features and is robust to noise and outliers (as shown in fig. 3 (b)).

(3) The normal direction of each boundary point is calculated and optimized. The normal direction is initialized by using the PCA algorithm within the neighborhood, as in fig. 3 (c). In order to make the normal directions of the points on the same side consistent, as shown in fig. 3(d), the method optimizes the normal direction of the boundary point by using the least square method, and the optimization equation is as follows:

E = Σ_{(p,q)∈N} w_{p,q} ‖n_p − n_q‖² + λ Σ_p ‖n_p − n_p⁰‖², with w_{p,q} = exp(−θ_{p,q}² / (2σ²)).

The first term is the energy term, which minimizes the normal differences within each neighborhood and propagates them over the set N of all adjacent boundary-point pairs. The second term is a constraint term used to prevent a boundary-point normal from deviating too far from its initial value n_p⁰. The weighting coefficient w_{p,q}, given by a Gaussian filter, penalizes differences in the normal directions, where θ_{p,q} is the angle between the normal vectors n_p and n_q. The constraint term coefficient λ is set to 0.1 and the variance σ to 20 degrees in the present embodiment.
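A sketch of this normal optimization is given below. Instead of a closed-form least-squares solve, it uses a few fixed-point averaging iterations that decrease the same energy approximately; lambda and sigma follow the embodiment (0.1 and 20 degrees), and the neighbor lists are assumed inputs.

    import numpy as np

    def smooth_boundary_normals(normals0, neighbors, lam=0.1, sigma_deg=20.0,
                                iters=10):
        """normals0: (N, 2) initial unit normals; neighbors: list of index lists."""
        sigma = np.radians(sigma_deg)
        n = normals0.copy()
        for _ in range(iters):
            new_n = n.copy()
            for p, nbrs in enumerate(neighbors):
                if not nbrs:
                    continue
                cosang = np.clip(n[nbrs] @ n[p], -1.0, 1.0)
                theta = np.arccos(cosang)               # angle theta_{p,q}
                w = np.exp(-theta**2 / (2 * sigma**2))  # Gaussian filter weights
                acc = (w[:, None] * n[nbrs]).sum(axis=0) + lam * normals0[p]
                new_n[p] = acc / np.linalg.norm(acc)    # renormalize the average
            n = new_n
        return n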

(4) Smoothing the boundary by moving the boundary point position in the normal direction can be represented by:

p′ = p + t_p n_p,

where t_p is the distance the boundary point moves along its normal direction and p′ is the coordinate of the moved boundary point;

the new boundary point positions are obtained by minimizing the energy function:

E = Σ_{(p,q)∈N} [ ((p′ − q′) · n_p)² + ((p′ − q′) · n_q)² ] + μ Σ_p t_p².

The first term smooths the point cloud boundary by minimizing the dot products between the line connecting q and p and the normals at q and p. The second term prevents the point cloud boundary points from deviating too far from their initial positions, thereby avoiding shrinkage of the point cloud boundary. The constraint term coefficient μ is set to 0.1 in the present embodiment. The resulting smoothed point cloud boundary is shown in fig. 3(e).
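The displacement step can be sketched as simple gradient descent on the offsets t_p; the step size and iteration count are assumptions for illustration, and the normals are assumed to be unit vectors.

    import numpy as np

    def smooth_boundary(points, normals, neighbors, mu=0.1, step=0.05, iters=50):
        """points, normals: (N, 2); neighbors: list of index lists; returns p'."""
        t = np.zeros(len(points))
        for _ in range(iters):
            p_new = points + t[:, None] * normals    # current p' = p + t_p n_p
            grad = 2 * mu * t                        # gradient of mu * t_p^2
            for p, nbrs in enumerate(neighbors):
                for q in nbrs:
                    d = p_new[p] - p_new[q]
                    # d/dt_p of ((p'-q')·n_p)^2 + ((p'-q')·n_q)^2, unit normals
                    grad[p] += 2 * (d @ normals[p])
                    grad[p] += 2 * (d @ normals[q]) * (normals[q] @ normals[p])
            t -= step * grad
        return points + t[:, None] * normals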

(5) The polygons are extracted using a corner detection algorithm, as in fig. 3(f). If the normal vector of a boundary point is almost parallel to the normal vectors of both the preceding and the succeeding boundary points, it is classified as a point on one side of the polygon. A straight-line equation is then fitted to the point set of each edge using the least squares method, and the intersection point of two consecutive edges gives a vertex of the polygon. A sketch of this step follows.
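The edge grouping, line fitting, and intersection can be sketched as below; the parallelism threshold is an assumed parameter, and the boundary points are assumed to be ordered along the boundary.

    import numpy as np

    def fit_line(pts):
        """Least-squares line through 2D pts: returns centroid c and direction u."""
        c = pts.mean(axis=0)
        _, _, vt = np.linalg.svd(pts - c)
        return c, vt[0]

    def intersect_lines(c1, u1, c2, u2):
        """Intersection of lines c1 + s*u1 and c2 + t*u2."""
        s, _ = np.linalg.solve(np.column_stack([u1, -u2]), c2 - c1)
        return c1 + s * u1

    def extract_polygon(points, normals, parallel_thresh=0.95):
        edges, current = [], [0]
        for i in range(1, len(points)):
            # a point stays on the current edge while its normal is almost
            # parallel to the previous boundary point's normal
            if abs(normals[i] @ normals[current[-1]]) > parallel_thresh:
                current.append(i)
            else:
                if len(current) >= 2:
                    edges.append(fit_line(points[current]))
                current = [i]
        if len(current) >= 2:
            edges.append(fit_line(points[current]))
        # polygon vertices are intersections of consecutive edge lines
        return [intersect_lines(*edges[i], *edges[(i + 1) % len(edges)])
                for i in range(len(edges))]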

Thirdly, searching the adjacency relation among points, lines and surfaces of the polygon set according to the self-adaptive search radius; and establishing a graph structure according to the adjacency relation, and carrying out local pruning and global pruning. Specifically, the invention provides an automatic robust adjacency detection method based on stable vertex-vertex/edge/face and edge-edge matching, which comprises the following steps:

(1) matching of points to points.

The intrinsic stability requirement of a polygon limits the search radius to half the minimum distance from the vertex p to the polygon boundary,

d(p, B_p) = min_{e ∈ B_p} d(p, e),

where B_p denotes the polygon boundary after removing the vertex p and its incident edges, d(p, e) denotes the distance from the vertex p to the boundary edge e, and d(p, B_p) is the minimum distance from the vertex p to the polygon boundary. Keeping the search radius below half of this distance prevents self-intersection, flipping, and edge and diagonal collapse of the polygon.

The method defines

r(p) = min( r_max, d(p, B_p) / 2 )

as the adaptive search radius of the vertex p, where r_max is the user-defined maximum distance between matching elements. The matching candidate set of the vertex p contains all vertices of 𝒫\{P} within the distance r(p), where 𝒫 denotes the polygon set and P the polygon to which the vertex p belongs. If the candidate set is empty, the vertex of 𝒫\{P} closest to p is added to it, provided its distance to p is less than r_max. Doing so may break the intrinsic stability of the polygon, which is resolved by the pruning algorithm.

The method defines r_e(p) = max( r(p), min(r_max, d_min) ) as the extended search radius of the vertex p, where d_min is the distance from p to the closest vertex in 𝒫\{P}. Two vertices p and q are considered to match if each is contained within the other's extended search radius:

‖p − q‖ ≤ min( r_e(p), r_e(q) ).

Finally, in the later optimization phase the two matching vertices collapse to a point on the intersection line l of the planes in which they lie. The method therefore further requires that the distances from a pair of matching points to l satisfy:

d(l, p) ≤ r_e(p) and d(l, q) ≤ r_e(q).
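The adaptive radius and the vertex-vertex test can be sketched as follows; polygons are represented as arrays of vertices, and r_max and the precomputed distances to the intersection line l are assumed inputs.

    import numpy as np

    def point_segment_dist(p, a, b):
        """Distance from point p to segment (a, b)."""
        ab = b - a
        t = np.clip((p - a) @ ab / max(ab @ ab, 1e-12), 0.0, 1.0)
        return np.linalg.norm(p - (a + t * ab))

    def adaptive_radius(poly, i, r_max):
        """r(p) = min(r_max, d(p, B_p)/2); B_p excludes vertex i's incident edges."""
        n, p = len(poly), poly[i]
        dists = [point_segment_dist(p, poly[j], poly[(j + 1) % n])
                 for j in range(n) if j != i and (j + 1) % n != i]
        return min(r_max, min(dists) / 2.0)

    def vertices_match(p, q, re_p, re_q, d_l_p, d_l_q):
        """Extended-radius test plus the intersection-line conditions."""
        return (np.linalg.norm(p - q) <= min(re_p, re_q)
                and d_l_p <= re_p and d_l_q <= re_q)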

(2) matching points with edges.

For an edge e = (p_0, p_1), its search radius is defined as the minimum of the search radii of its two endpoints, r(e) = min( r(p_0), r(p_1) ). If the orthogonal projection of the vertex p onto e lies inside e and the following two conditions are satisfied, p matches e:

d(p, e) ≤ min( r(p), r(e) ),

d(l, p) ≤ r(p) and d(l, e) ≤ r(e)

(3) Other matching.

For a vertex p and a face f, if the orthogonal projection of p onto f lies inside the polygon and the projection distance is less than r(p), then p and f match. From vertex-vertex and vertex-edge matches, edge-edge matches can be further derived: two edges are said to match if, for the endpoints of the two edges, there are two vertex-vertex matches, or one vertex-vertex and one vertex-edge match, or two vertex-edge matches; a sketch of this test follows. Edge-edge matching is only used in the global pruning phase and not for optimization, since its contribution to optimization is implicitly included in the vertex-vertex/edge matches.
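A sketch of the derived edge-edge test, under the assumption that vertex-vertex matches are stored as a set of frozensets and vertex-edge matches as a set of (vertex, edge) pairs:

    def edges_match(e1, e2, vv, ve):
        """e1, e2: edges as (vertex_id, vertex_id) tuples."""
        n_vv = sum(1 for a in e1 for b in e2 if frozenset((a, b)) in vv)
        n_ve = (sum(1 for a in e1 if (a, e2) in ve)
                + sum(1 for b in e2 if (b, e1) in ve))
        # two vertex-vertex matches, or one of each, or two vertex-edge matches
        return n_vv >= 2 or (n_vv == 1 and n_ve >= 1) or n_ve >= 2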

(4) Local pruning.

A graph structure G = (V, E_M) is established to express the matching relations of the polygon set, and all pruning steps are realized on this matching graph: all vertices and edges of the polygon set 𝒫 form V, and E_M contains all vertex-vertex/edge matches.

Vertex-vertex matching usually produces stable results; mismatches caused by adding the closest vertex to an empty candidate set are corrected in the following pruning step: when two or more vertices q_i of a polygon Q match the same vertex p ∈ P, this obviously violates the intrinsic stability requirement of the polygon Q, so only the closest matching pair is kept. All such cases are found by searching the one-ring neighborhoods in the matching graph G.

Similar to vertex-vertex matching, intrinsic stability requires pruning those vertex-edge matches in which a vertex matches multiple non-adjacent edges of a polygon. This naturally occurs when the search radii around the edges overlap. Each such case is compressed to the closest vertex-edge match using the graph G.

(5) Global pruning. As shown in fig. 4, the three polygons are close to one another, and matching p with q causes the edge e of the central polygon to collapse, which violates the intrinsic stability requirement of the polygons. This degradation occurs when certain polygon elements (vertices/edges) are connected by matches, forming a cycle.

The method introduces an extended matching graph G_e = (V, E_e). As in G, the point set V contains all polygon elements (vertices and edges). The edge set E_e = E_M ∪ E_C contains all matching relations E_M and a constraint set E_C. As shown in FIG. 5, E_C connects all element pairs in V that belong to the same polygon (except each polygon vertex and its incident edges). With the extended matching graph it can be judged whether any vertex-vertex/edge match m = (p, q) ∈ E_M is part of a harmful cycle. Since the degradation of the edges incident to p and q has already been dealt with in the local pruning stage, it is only necessary to find the cycles c(m) = (m, e_0, …, e_n), e_0, …, e_n ∈ E_M, that connect directly to another match on both sides of m. If such a cycle is found, the corresponding match can be deleted directly.

By pruning matching cycles with only one constraint edge, most of the degradation in the polygon model can be avoided. However, there are also cases involving multiple constraint edges, see FIG. 6, which may or may not result in polygon degradation. To handle them, the method uses pruning based on searching matching sequences. Matching sequences induce further matches between the elements they connect; in a sense, these elements will also be joined during the optimization phase. It is therefore necessary to verify whether the indirectly induced matches cause polygon degradation. For a match m, the matching sequence is searched in the actual matching graph G. Then it is checked whether all indirectly induced matches m_i are also in E_M. Each match not in E_M must be geometrically validated to determine whether it would cause polygon degradation: the matched vertices and/or edge endpoints are projected onto the common intersection line l of the planes in which the polygons lie. If the normal vector direction of a polygon corrected by the vertex/edge projections is flipped, or the polygon self-intersects, m is pruned.
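A simplified sketch of the cycle test on the extended matching graph follows; it only checks whether a match m = (p, q) closes a cycle in G_e (i.e., whether q stays reachable from p without using m), leaving the geometric validation of induced matches aside.

    from collections import deque

    def closes_cycle(adj, m):
        """adj: dict mapping a node of G_e to its neighbor set; m: (p, q)."""
        p, q = m
        seen, queue = {p}, deque([p])
        while queue:
            u = queue.popleft()
            for v in adj.get(u, ()):
                if (u, v) in ((p, q), (q, p)):    # skip the match m itself
                    continue
                if v == q:
                    return True                   # p, q connected: m closes a cycle
                if v not in seen:
                    seen.add(v)
                    queue.append(v)
        return False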

Fourthly, according to the adjacency relation, the planarity and the orthogonality of the polygon set and the fitting of the input point cloud, constructing an objective function, solving the objective function to perform coordinate transformation on the polygon, and automatically bonding the polygon set; the method specifically comprises the following steps:

to maintain planarity, the method introduces a cartesian coordinate system. For each plane

Figure BDA0003005824420000091

Establishing the point of origin, v, is o1And v2Is a coordinate system of basis vectors. For any vertex P ∈ P, use the coordinate (P)x,py) Is represented by p ═ o + pxv1+pyv2. In the optimization process, coordinates (p) are used to reduce the spatial gap between adjacent polygonsx,py) The cartesian coordinate system is also subject to spatial motion. The method uses the velocity vector field of instantaneous motion to represent the displacement of each point, thereby linearizing the space motion of each coordinate system,

Figure BDA0003005824420000101

where x represents the coordinates of a point or points,

Figure BDA0003005824420000102

translation of the points is described, c rotation of the points is described, and v (x) represents the coordinates of the points after movement. Therefore, one vertex P ∈ P in the optimization processiThe location of (d) can be written as:

Figure BDA0003005824420000103

Using the adjacency relations found in the previous step, the method defines the data term as the distances between matching elements:

E_data = Σ d(p_i, p_j)² + Σ d(p_i, e_k)² + Σ d(p_i, P_l)²,

where d(p_i, p_j) denotes the distance from the vertex p_i to the vertex p_j, d(p_i, e_k) the distance from the vertex p_i to the edge e_k, and d(p_i, P_l) the distance from the vertex p_i to the plane P_l, with each sum running over the corresponding matches.

To keep the polygon set 𝒫 fitted to the input point cloud, the method uses a constraint term:

E_cons = Σ_l Σ_i d( p_i^(l), P_l⁰ )²,

where p_i^(l) ranges over all vertices of the plane P_l and P_l⁰ denotes the initial state of P_l.

To satisfy the orthogonality ubiquitous in indoor scenes, the method proposes the following two orthogonal terms:

E_orth1 = Σ_{i,j} w_{ij} ( n_i · n_j )²,

E_orth2 = Σ_l Σ_i w_i^(l) ( e_i^(l) · e_{i+1}^(l) )².

These terms measure the orthogonality between adjacent polygons and the orthogonality of adjacent edges within a polygon, respectively. E_orth1 acts on any pair of polygons: if the absolute value of the difference between the angle of two polygons and 90 degrees is smaller than an angle threshold, the orthogonal-term decision coefficient w_{ij} is 1, and otherwise 0. E_orth2 optimizes the orthogonality of adjacent edges e_i^(l), e_{i+1}^(l) within a polygon (w_i^(l) is defined in the same way as w_{ij}), but this may lead to degradation of edges. The problem is particularly pronounced where geometry is missing; it is overcome by minimizing the sum of squared distances from the initial points p_i to the current vertex positions p′_i, with the distance term as follows:

E_cur = Σ_i ‖ p′_i − p_i ‖².

in summary, the objective function is expressed as:

E = λ_data E_data + λ_cons E_cons + λ_orth (E_orth1 + E_orth2) + λ_cur E_cur,

where λ_data, λ_cons, λ_orth, and λ_cur are all weight coefficients. In this embodiment they are set as: λ_data = 1, λ_cons = 0.5, λ_orth = 0.01, and λ_cur = 0.1.

This is a non-linear optimization problem, and the objective function is minimized by the L-BFGS algorithm. Transforming the Cartesian coordinate system of a plane P_i according to v(x) = x + c̄ + c × x is not a rigid-body transformation but an affine one, so the method applies a helical (screw) motion to ensure a rigid-body transformation. Fig. 7 shows a comparison before and after optimization.
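A sketch of the solve is given below; it optimizes the stacked translation/rotation parameters (c̄_i, c_i) of every plane's coordinate system with SciPy's L-BFGS, and for brevity includes only the data and distance terms of the full objective. All helper inputs are illustrative assumptions.

    import numpy as np
    from scipy.optimize import minimize

    def solve_gluing(planes, vertex_matches, lambda_cur=0.1):
        """planes: list of (N_i, 3) vertex arrays; vertex_matches: list of
        ((plane_a, vert_a), (plane_b, vert_b)) index pairs."""
        n = len(planes)

        def moved(x, l, pts):
            c_bar, c = x[6*l:6*l+3], x[6*l+3:6*l+6]
            return pts + c_bar + np.cross(c, pts)    # p' = p + c_bar + c x p

        def energy(x):
            e = 0.0
            for (la, ia), (lb, ib) in vertex_matches:          # E_data term
                pa = moved(x, la, planes[la][ia])
                pb = moved(x, lb, planes[lb][ib])
                e += np.sum((pa - pb) ** 2)
            for l, pts in enumerate(planes):                   # E_cur term
                e += lambda_cur * np.sum((moved(x, l, pts) - pts) ** 2)
            return e

        # gradients are estimated by finite differences in this sketch
        return minimize(energy, np.zeros(6 * n), method="L-BFGS-B").x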

Fifthly, generating an indoor scene model based on the polygons; specific applications include: assigning a material to each polygon to generate an indoor scene model for virtual reality; or sampling the polygon model to obtain a point cloud model and then generating a highly realistic indoor scene model through meshing and texture mapping, which can be applied to visual positioning, virtual roaming, and other applications.

In addition, on the basis of automatically creating and bonding polygons, the method provides an interactive editing tool to handle polygons that are wrong or missing because of scene occlusion, materials, and similar problems. All interactive operations are reduced to approximately two-dimensional operations by automatically selecting a suitable two-dimensional modeling space based on the segmentation planes in the point cloud. One dimension is implicitly removed, which greatly reduces the complexity of interaction and therefore the difficulty of the overall modeling work. The interactive operations include polygon editing, polygon drawing, aligning polygon boundaries to image boundaries, assigning polygon materials, and the like.

Aiming at the problems of noise, holes, and the like in point cloud models acquired by sensors, the invention provides a point cloud processing method based on planar and polygonal structures. First, the point cloud model is segmented to eliminate outliers and small objects. Next, polygon bonding based on numerical optimization is performed to repair the holes, and the gaps left by plane segmentation, that are caused by occlusion, materials, and the like. As shown in fig. 8, moving people or objects leave outliers (marked by circles) in the original point cloud data, and laser scanning captures small objects with low accuracy; the method removes both by plane segmentation, as shown in fig. 8 (right). FIG. 9 shows that the method can effectively repair holes in the point cloud data caused by viewing angle, occlusion, material, and other factors. Experiments show that the method successfully handles noise, holes, and similar problems, keeps the output consistent with the original point cloud, and obtains a high-precision indoor scene model.

In one embodiment, a computer device is provided, which includes a memory and a processor, the memory stores computer readable instructions, and the computer readable instructions, when executed by the processor, cause the processor to execute the steps of the structure-based indoor scene three-dimensional modeling method in the above embodiments.

In one embodiment, a storage medium is provided, in which computer readable instructions are stored, and when executed by one or more processors, the one or more processors perform the steps of the structure-based indoor scene three-dimensional modeling method in the embodiments. The storage medium may be a nonvolatile storage medium.

Those skilled in the art will appreciate that all or part of the steps in the methods of the above embodiments may be implemented by associated hardware instructed by a program, which may be stored in a computer-readable storage medium, and the storage medium may include: read Only Memory (ROM), Random Access Memory (RAM), magnetic or optical disks, and the like.

The above description covers only preferred embodiments of the present disclosure and is not intended to limit its scope; any modifications, equivalent substitutions, improvements, and the like made within the spirit and principles of the present disclosure shall fall within its scope of protection.

Claims (10)

1. A structure-based indoor scene three-dimensional modeling method is characterized by comprising the following steps:

s1: removing abnormal points and small objects in the point cloud data of the indoor scene through plane segmentation;

s2: performing polygon fitting on each plane, including extracting the plane boundary, performing neighborhood estimation for each boundary point, calculating the normal direction of each boundary point, smoothing the boundary by moving the boundary points along their normal directions, and extracting polygons using a corner detection algorithm;

s3: searching the adjacency relation among points, lines and surfaces of the polygon set according to the self-adaptive search radius; establishing a graph structure according to the adjacency relation, and carrying out local pruning and global pruning;

s4: constructing an objective function according to the adjacency relation, the planarity and the orthogonality of the polygon set and the fitting of the input point cloud, solving the objective function to perform coordinate transformation on the polygon, and automatically bonding the polygon set;

s5: an indoor scene model is generated based on the polygons.

2. A method as claimed in claim 1, wherein the S1 comprises the following sub-steps:

s11: constructing a KDTree, performing neighborhood search on each point in the point cloud data by using a nearest neighbor algorithm, and calculating the normal direction and curvature of each point by using a PCA algorithm;

s12: performing plane segmentation by adopting region growth based on distance and normal vector judgment;

s13: fitting a plane equation to each plane using the PCA algorithm, and removing outliers and small objects according to the plane information.

3. The method of claim 1, wherein in S2, all points in a plane point set are projected onto a plane according to a plane equation, the three-dimensional coordinates are converted into two-dimensional coordinates, and an α -shape algorithm is used to extract a two-dimensional point cloud boundary; a forward search method is applied to classify the locally smooth regions around each boundary point.

4. The structure-based three-dimensional modeling method for indoor scenes of claim 1, wherein in S2, the normal direction of each boundary point is calculated and optimized, specifically: initializing a normal direction by using a PCA algorithm in a neighborhood, and optimizing the normal direction of the boundary point by using a least square method, wherein an optimization equation is as follows:

E = Σ_{(p,q)∈N} w_{p,q} ‖n_p − n_q‖² + λ Σ_p ‖n_p − n_p⁰‖², with w_{p,q} = exp(−θ_{p,q}² / (2σ²));

the first term is an energy term that minimizes the normal differences within each neighborhood and propagates them over the set N of all adjacent boundary-point pairs; the second term is a constraint term that prevents a boundary-point normal from deviating too far from its initial value; the weighting coefficient w_{p,q}, given by a Gaussian filter, penalizes differences in the normal directions, where θ_{p,q} is the angle between the normal vectors n_p and n_q, σ is the variance, n_p⁰ is the initial value of the normal vector n_p, and λ is the constraint term coefficient.

5. The method of claim 1, wherein in step S2, the boundary is smoothed by moving the boundary point position in the normal direction, which can be expressed as follows:

p′ = p + t_p n_p,

where t_p is the distance the boundary point moves along its normal direction and p′ is the coordinate of the moved boundary point;

the new boundary point positions are obtained by minimizing the energy function:

E = Σ_{(p,q)∈N} [ ((p′ − q′) · n_p)² + ((p′ − q′) · n_q)² ] + μ Σ_p t_p²;

the first term smooths the point cloud boundary by minimizing the dot products between the line connecting q and p and the normal directions at q and p; the second term prevents the point cloud boundary points from deviating too far from their initial positions, and μ is the constraint term coefficient.

6. The method of claim 1, wherein the step of searching for the adjacency relationship between the points, lines and faces of the polygon set according to the adaptive search radius in the step S3 comprises:

s31: matching points with points;

the intrinsic stability requirement of a polygon limits the search radius to half the minimum distance from the vertex p to the polygon boundary, i.e.

d(p, B_p) = min_{e ∈ B_p} d(p, e),

where B_p denotes the polygon boundary after removing the vertex p and its incident edges, d(p, e) denotes the distance from the vertex p to the boundary edge e, and d(p, B_p) is the minimum distance from the vertex p to the polygon boundary;

r(p) = min( r_max, d(p, B_p) / 2 ) is defined as the adaptive search radius of the vertex p, where r_max is the user-defined maximum distance between matching elements; the matching candidate set of the vertex p contains all vertices of 𝒫\{P} within the distance r(p), where 𝒫 denotes the polygon set and P the polygon to which the vertex p belongs; if the candidate set is empty, the vertex of 𝒫\{P} closest to p is added to it, provided its distance to p is less than r_max;

r_e(p) = max( r(p), min(r_max, d_min) ) is defined as the extended search radius of the vertex p, where d_min is the distance from p to the closest vertex in 𝒫\{P}; two vertices p and q are considered matched if each is contained within the other's extended search radius, and the distances from the pair of matching points to the intersection line l of the planes in which they lie must satisfy: d(l, p) ≤ r_e(p), d(l, q) ≤ r_e(q).

S32: matching points with edges;

for an edge e = (p_0, p_1), its search radius is defined as the minimum of the search radii of its two endpoints, r(e) = min( r(p_0), r(p_1) ); if the orthogonal projection of the vertex p onto e lies inside e and the following conditions are satisfied, then p matches e:

d(p, e) ≤ min( r(p), r(e) ),

d(l, p) ≤ r(p) and d(l, e) ≤ r(e)

s33: for vertex p and face f, if the orthogonal projection of p on f is inside the polygon and the projection distance is less than r (p), then p and f match; two edges match if there are two vertex-vertex matches, or one vertex-vertex and one vertex-edge match, or two vertex-edge matches, for the endpoints of the two edges.

7. The structure-based indoor scene three-dimensional modeling method according to claim 1, wherein in S3, a graph structure is established according to an adjacency relation, and local pruning is performed, specifically:

establishing a graph structure G = (V, E_M) to represent the matching relations of the polygon set 𝒫: all vertices and edges of 𝒫 form V, and E_M contains all vertex-vertex/edge matches;

vertex-vertex matching usually produces stable results; mismatches caused by adding the closest vertex to an empty candidate set are corrected in the following pruning step: when two or more vertices q_i of a polygon Q match the same vertex p ∈ P, only the closest matching pair is kept, and all such cases are found by searching the one-ring neighborhoods in the matching graph G;

intrinsic stability likewise requires pruning vertex-edge matches in which a vertex matches multiple non-adjacent edges of a polygon, which occurs when the search radii around the edges overlap; each such case is compressed to the closest vertex-edge match using the graph G.

8. The structure-based indoor scene three-dimensional modeling method of claim 7, wherein in S3, an expanded matching graph is constructed for global pruning, specifically:

constructing an extended matching graph G_e = (V, E_e): V contains all polygon elements, and E_e = E_M ∪ E_C contains all matching relations E_M together with a constraint set E_C; E_C connects all element pairs in V that belong to the same polygon, except each polygon vertex and its incident edges; according to the extended matching graph, it is judged whether any vertex-vertex/edge match m = (p, q) ∈ E_M is part of a harmful cycle; the cycles c(m) = (m, e_0, …, e_n), e_0, …, e_n ∈ E_M, in which m connects directly to another match on both sides, are searched for, and if such a cycle is found, the corresponding match is deleted directly;

for a match m, a matching sequence is searched in the actual matching graph G, and it is checked whether all indirectly induced matches m_i are also in E_M; each match not in E_M undergoes geometric validation to determine whether it would cause polygon degradation: the matched vertices and/or edge endpoints are projected onto the common intersection line l of the planes in which the polygons lie, and if the normal vector direction of a polygon corrected by the vertex/edge projections is flipped, or the polygon self-intersects, m is pruned.

9. The structure-based indoor scene three-dimensional modeling method according to claim 1, wherein the S4 specifically is:

In order to maintain planarity, a Cartesian coordinate system is introduced: for each plane P_l in the polygon set, a coordinate system with origin o and basis vectors v_1 and v_2 is established; any vertex p ∈ P is represented by coordinates (p_x, p_y) as p = o + p_x·v_1 + p_y·v_2; the displacement of each point is expressed with the velocity vector field of an instantaneous motion, so as to linearize the spatial motion of each coordinate system:

v(x) = x + c̄ + c × x

where x denotes the coordinates of a point, c̄ describes the translation of the point, c describes its rotation, and v(x) gives the coordinates of the point after the motion; during optimization, the position of a vertex p ∈ P_i can be written as:

p′ = v(o + p_x·v_1 + p_y·v_2)

with (c̄_i, c_i) the motion parameters of the coordinate system of P_i.
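
A sketch of the linearized motion applied to a planar vertex (the parameterization p′ = p + c̄ + c × p is inferred from the claim's description of c̄ and c):

```python
import numpy as np

def moved_vertex(o, v1, v2, px, py, c_bar, c):
    """A vertex p = o + px*v1 + py*v2 of a plane moves with the
    plane's linearized instantaneous motion (c_bar, c), giving
    p' = p + c_bar + c x p."""
    p = o + px * v1 + py * v2                  # planar coordinates -> 3D point
    return p + c_bar + np.cross(c, p)          # translation + linearized rotation
```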

Based on the adjacency relationships, the data term is defined as the distance between matching elements:

E_data = Σ_{(a,b)∈E_M} d(a′, b′)²

To keep the polygon set fitted to the input point cloud, the constraint term is used:

E_cons = Σ_l Σ_i d(p′_i(l), P_l⁰)²

wherein p_i(l) denotes all vertices of plane P_l and P_l⁰ denotes the initial state of P_l;

to satisfy the orthogonality ubiquitous in indoor scenes, the following two orthogonal terms are used:

E_orth1 = Σ_{i,j} w_ij (n_i · n_j)²

E_orth2 = Σ_l Σ_i (e_i(l) · e_{i+1}(l))²

E_orth1 acts on any pair of polygons (n_i denoting a unit polygon normal); the orthogonal-term coefficient w_ij is 1 if the absolute difference between the included angle of the two polygons and 90 degrees is less than the angle threshold, and 0 otherwise; E_orth2 optimizes the orthogonality of adjacent edges within a polygon, where e_i(l) and e_{i+1}(l) denote adjacent edges;

the degradation problem that may arise at edges is overcome by minimizing the sum of squared distances from each initial point p_i to its current vertex position p′_i; the distance term is:

E_cur = Σ_i ‖p′_i − p_i‖²

the objective function is expressed as:

E = λ_data·E_data + λ_cons·E_cons + λ_orth·(E_orth1 + E_orth2) + λ_cur·E_cur

wherein λ_data, λ_cons, λ_orth and λ_cur are all weight coefficients.
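
A sketch assembling the objective from precomputed term values (the dict keys and the e_cur helper are illustrative; the claim does not disclose numeric weights):

```python
import numpy as np

def e_cur(initial_pts, current_pts):
    """Distance term: sum of squared distances from each initial
    vertex p_i to its current position p'_i."""
    return sum(float(np.sum((p1 - p0) ** 2))
               for p0, p1 in zip(initial_pts, current_pts))

def objective(terms, weights):
    """E = l_data*E_data + l_cons*E_cons
         + l_orth*(E_orth1 + E_orth2) + l_cur*E_cur."""
    return (weights["data"] * terms["data"]
            + weights["cons"] * terms["cons"]
            + weights["orth"] * (terms["orth1"] + terms["orth2"])
            + weights["cur"] * terms["cur"])
```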

10. The structure-based indoor scene three-dimensional modeling method according to claim 1, characterized in that, on the basis of automatically creating and joining polygons, interactive editing tools are used to handle erroneous or missing polygons caused by scene occlusion and material problems; a suitable two-dimensional modeling space is selected automatically based on a segmentation plane in the point cloud, simplifying all interactive operations to approximately two-dimensional operations; the interactive operations include polygon editing, polygon drawing, aligning polygon boundaries to image boundaries, assigning materials to polygons, and the like.

CN202110361587.9A 2021-04-02 2021-04-02 Structure-Based 3D Modeling Method for Indoor Scenes Active CN113112600B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110361587.9A CN113112600B (en) 2021-04-02 2021-04-02 Structure-Based 3D Modeling Method for Indoor Scenes


Publications (2)

Publication Number Publication Date
CN113112600A true CN113112600A (en) 2021-07-13
CN113112600B CN113112600B (en) 2023-03-03

Family

ID=76713609

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110361587.9A Active CN113112600B (en) 2021-04-02 2021-04-02 Structure-Based 3D Modeling Method for Indoor Scenes

Country Status (1)

Country Link
CN (1) CN113112600B (en)



Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6429872B1 (en) * 1997-03-11 2002-08-06 Gmd-Forschungszentrum Informationstechnik Gmbh Method and apparatus for representing computer-modeled objects
US20050128196A1 (en) * 2003-10-08 2005-06-16 Popescu Voicu S. System and method for three dimensional modeling
US20060061566A1 (en) * 2004-08-18 2006-03-23 Vivek Verma Method and apparatus for performing three-dimensional computer modeling
CN109325998A (en) * 2018-10-08 2019-02-12 香港理工大学 Indoor 3D modeling method, system and related device based on point cloud data
CN109887082A (en) * 2019-01-22 2019-06-14 武汉大学 A method and device for 3D modeling of indoor buildings based on point cloud data
CN111986322A (en) * 2020-07-21 2020-11-24 西安理工大学 Point cloud indoor scene layout reconstruction method based on structural analysis

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
MAYA: "MAYA LT Help Documentation", HTTPS://HELP.AUTODESK.COM/VIEW/MAYALT/2017/CHS/ *
DING CHENGJUN ET AL.: "Boundary Extraction of Scattered Point Clouds", COMPUTER TECHNOLOGY AND DEVELOPMENT *
NIU XIAOJING ET AL.: "A Point Cloud Denoising and Smoothing Method Fusing Clustering and Filtering", COMPUTER APPLICATIONS AND SOFTWARE *

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114140575A (en) * 2021-10-21 2022-03-04 北京航空航天大学 Three-dimensional model construction method, device and equipment
CN114241124A (en) * 2021-11-17 2022-03-25 埃洛克航空科技(北京)有限公司 Method, device and equipment for determining stitching edge in three-dimensional model
CN114399583A (en) * 2021-12-03 2022-04-26 聚好看科技股份有限公司 Three-dimensional model splicing method and device based on geometry
CN117456115A (en) * 2023-12-26 2024-01-26 深圳大学 A method of merging adjacent constructed three-dimensional entities
CN117456115B (en) * 2023-12-26 2024-04-26 深圳大学 A method for merging adjacent three-dimensional entities

Also Published As

Publication number Publication date
CN113112600B (en) 2023-03-03

Similar Documents

Publication Publication Date Title
Schöps et al. 2019 Surfelmeshing: Online surfel-based mesh reconstruction
CN113112600A (en) 2021-07-13 Indoor scene three-dimensional modeling method based on structure
CN109544677B (en) 2020-12-25 Indoor scene main structure reconstruction method and system based on depth image key frame
US7737969B2 (en) 2010-06-15 System and program product for re-meshing of a three-dimensional input model using progressive implicit approximating levels
CN107123164B (en) 2020-04-28 Three-dimensional reconstruction method and system for keeping sharp features
JP6883062B2 (en) 2021-06-09 Robust merge of 3D textured meshes
CN107067473B (en) 2022-07-01 Method, device and system for reconstructing 3D modeling object
Zhang et al. 2015 Online structure analysis for real-time indoor scene reconstruction
Xiong et al. 2015 Flexible building primitives for 3D building modeling
KR101195942B1 (en) 2012-10-29 Camera calibration method and 3D object reconstruction method using the same
US8711143B2 (en) 2014-04-29 System and method for interactive image-based modeling of curved surfaces using single-view and multi-view feature curves
Di Angelo et al. 2011 A new mesh-growing algorithm for fast surface reconstruction
Vicente et al. 2013 Balloon shapes: Reconstructing and deforming objects with volume from images
KR20160070712A (en) 2016-06-20 Texturing a 3d modeled object
US9665978B2 (en) 2017-05-30 Consistent tessellation via topology-aware surface tracking
US20220261512A1 (en) 2022-08-18 Segmenting a 3d modeled object representing a mechanical part
Hu et al. 2017 Surface segmentation for polycube construction based on generalized centroidal Voronoi tessellation
Gao et al. 2024 Floor plan reconstruction from indoor 3D point clouds using iterative RANSAC line segmentation
US10282858B2 (en) 2019-05-07 Methods and systems for estimating three-dimensional information from two-dimensional concept drawings
Zell et al. 2013 Elastiface: Matching and blending textured faces
Zhang et al. 2003 Model reconstruction from cloud data
Yemez et al. 2009 Shape from silhouette using topology-adaptive mesh deformation
Shi et al. 2012 Fast and effective integration of multiple overlapping range images
Adhikary et al. 2017 Direct global editing of STL mesh model for product design and rapid prototyping
Borish 2023 Cross-sectioning

Legal Events

Date Code Title Description
2021-07-13 PB01 Publication
2021-07-30 SE01 Entry into force of request for substantive examination
2023-03-03 GR01 Patent grant