Get the point that boost::geometry::distance used - c++

I can use boost::geometry::distance to find the minimum distance between a point and a rectangle (linestring). How do I figure out the point that boost::geometry::distance used to obtain that distance?
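boost::geometry::distance only returns the scalar distance, not the point it was measured to. Newer Boost releases, if available to you, also ship a boost::geometry::closest_points algorithm that yields the nearest pair directly; failing that, one way to recover the point for a linestring is to project the query point onto each segment yourself and keep the nearest projection. A minimal sketch (the helper names are illustrative, not part of Boost):

```cpp
#include <boost/geometry.hpp>
#include <algorithm>
#include <cstddef>
#include <limits>

namespace bg = boost::geometry;
using Point      = bg::model::point<double, 2, bg::cs::cartesian>;
using Linestring = bg::model::linestring<Point>;

// Closest point to p on the segment a-b (clamped orthogonal projection).
Point closest_on_segment(const Point& p, const Point& a, const Point& b)
{
    const double ax = bg::get<0>(a), ay = bg::get<1>(a);
    const double bx = bg::get<0>(b), by = bg::get<1>(b);
    const double px = bg::get<0>(p), py = bg::get<1>(p);

    const double dx = bx - ax, dy = by - ay;
    const double len2 = dx * dx + dy * dy;
    double t = (len2 > 0.0) ? ((px - ax) * dx + (py - ay) * dy) / len2 : 0.0;
    t = std::max(0.0, std::min(1.0, t));            // stay on the segment
    return Point(ax + t * dx, ay + t * dy);
}

// Walk every segment of the linestring and keep the nearest projection.
// The distance to this point should match what bg::distance(p, ls) reports.
Point closest_point_on_linestring(const Point& p, const Linestring& ls)
{
    Point best = ls.front();
    double best_d = std::numeric_limits<double>::max();
    for (std::size_t i = 0; i + 1 < ls.size(); ++i) {
        Point c = closest_on_segment(p, ls[i], ls[i + 1]);
        double d = bg::distance(p, c);
        if (d < best_d) { best_d = d; best = c; }
    }
    return best;
}
```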

Related

Distance of a Point to a NURBS Surface

I am looking for an easy way to compute the minimum distance of a point to a NURBS surface with MATLAB. I am looking for the closest point => not an orthogonal projection.
I have read about the idea of sampling to get a starting guess and then doing Newton iteration, but at the moment this exceeds my MATLAB powers. Thanks for the help.
"I am looking for the closest point => not an orthogonal projection."
The closest point on the surface is an orthogonal projection of your point in space onto the surface. Orthogonal in the sense that the line connecting point and projection is perpendicular to the tangent plane of the surface at the projected point.
A NURBS surface is parametrized by two parameters u and v. You can also compute how the position changes as u or v changes: take the partial derivatives, which give you a pair of tangent vectors spanning the tangent plane. Now you want the difference between the point on the surface and your point in space to be orthogonal to both of these vectors, i.e. to have dot product zero. This gives you two equations, one for the u direction and one for the v direction, which let you find the u and v parameters you need.
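In symbols, with S(u, v) the surface, P the point in space, and S_u, S_v the partial derivatives, the two conditions are:

```latex
\big(P - S(u,v)\big) \cdot S_u(u,v) = 0, \qquad
\big(P - S(u,v)\big) \cdot S_v(u,v) = 0
```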
Note however that this system of equations is likely highly non-linear. So get some good computer algebra or numerics software to find all the solutions, then compute the distance for each of them to pick the minimal one.

Computing location based on a list of approximate distances [closed]

I have N points in 3D space (I think I can work out the general N-dimensional case myself) and approximate distances to these points. How can I compute my position relative to these N points?
EDIT
Please note that the distances are approximate, so the more approximate distances I have, the more accurate the result should be.
Thank you!
I would write down an equation that gives you some measure of the error associated with a possible location, and then find the location that minimizes this measure. My first attempt would be to minimize the sum of the squares of the differences between the distances measured and the distances worked out from the possible location, for each of your approximate distances. So you are minimizing something like SUM_i((sqrt((X-Ai)^2 + (Y-Bi)^2 + (Z-Ci)^2) - Di)^2), where (X,Y,Z) is the location you are trying to find, (Ai,Bi,Ci) are the coordinates of one of the objects from which you are measuring distances, and Di is the distance measured. It doesn't look very pretty, but you should at least be able to compute derivatives and then find some sort of minimization routine in a math library.
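A minimal sketch of that idea in C++ (the Beacon struct and the crude coordinate-descent minimizer are illustrative; a real implementation would hand the objective to Gauss-Newton, Levenberg-Marquardt, or a library optimizer):

```cpp
#include <array>
#include <cmath>
#include <vector>

struct Beacon { double x, y, z, dist; };   // known position and measured distance

// Sum of squared residuals for a candidate location p = (X, Y, Z).
double error(const std::vector<Beacon>& beacons, const std::array<double, 3>& p)
{
    double e = 0.0;
    for (const Beacon& b : beacons) {
        double d = std::sqrt((p[0] - b.x) * (p[0] - b.x) +
                             (p[1] - b.y) * (p[1] - b.y) +
                             (p[2] - b.z) * (p[2] - b.z));
        e += (d - b.dist) * (d - b.dist);
    }
    return e;
}

// Crude coordinate descent: try small steps along each axis, keep any step
// that lowers the error, and halve the step once nothing helps any more.
std::array<double, 3> locate(const std::vector<Beacon>& beacons,
                             std::array<double, 3> guess, double step = 100.0)
{
    while (step > 1e-6) {
        bool improved = false;
        for (int axis = 0; axis < 3; ++axis) {
            for (int sign = -1; sign <= 1; sign += 2) {
                std::array<double, 3> trial = guess;
                trial[axis] += sign * step;
                if (error(beacons, trial) < error(beacons, guess)) {
                    guess = trial;
                    improved = true;
                }
            }
        }
        if (!improved) step *= 0.5;
    }
    return guess;
}
```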
You have distances to N given points in 3D space together with approximate error bounds, so for each point you have a thick spherical shell that you must lie in. Intersect all of these shells and take the center of the intersection region as your approximate location.

Is there an algorithm to find nearby points using only polar coordinates?

Suppose I have a vector of points as polar coordinates.
Suppose one of those points acts as a probe for which I want to find all the other points within a certain distance.
Is there an algorithm to do this without converting them to Cartesian form?
You are looking for the distance formula for polar coordinates.
The distance between points (r1, a1) and (r2, a2) is:
D = sqrt(r1*r1 + r2*r2 - 2*r1*r2*cos(a1-a2))
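Applied directly, that lets you probe without ever converting to Cartesian form. A small sketch (the type and function names are illustrative); note that it compares squared distances, so no square root is needed:

```cpp
#include <cmath>
#include <vector>

struct Polar { double r, a; };   // radius and angle (radians)

// Squared distance between two polar points, from the law of cosines:
// d^2 = r1^2 + r2^2 - 2*r1*r2*cos(a1 - a2)
double dist2(const Polar& p, const Polar& q)
{
    return p.r * p.r + q.r * q.r - 2.0 * p.r * q.r * std::cos(p.a - q.a);
}

// All points within `radius` of the probe point.
std::vector<Polar> within(const std::vector<Polar>& pts,
                          const Polar& probe, double radius)
{
    std::vector<Polar> out;
    for (const Polar& p : pts)
        if (dist2(p, probe) <= radius * radius)   // compare squared distances
            out.push_back(p);
    return out;
}
```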
Be careful: if you plan to scale your algorithm up to many points, probing for nearby points is better done with a spatial index. I'm not aware of spatial indexes that work in polar coordinates, and I'm sure they would be a bit complex to implement and use. So if you have lots of points, and you probe more often than you move points, ask yourself whether you should switch to Cartesian coordinates and a spatial index.
Do the math yourself according to your typical use case:
Using Cartesian alongside polar coordinates:
Converting polar to Cartesian is done only when a point moves, and involves two trigonometric functions;
Finding points closer than a certain distance to another point can be done in roughly constant time per probe (depending on the average distance, the size of the spatial index, the number of points...), and involves nothing but additions and multiplications (not even square roots, since you compare squared distances).
Using polar coordinates only:
Scanning all points without a spatial index is O(n);
This involves one trigonometric function per comparison (thus n trig calls per probe).
Be aware that trig functions are bloody expensive in computation time.

Finding closest pair of points on a sphere

I know how to implement the O(n log n) closest-pair-of-points algorithm (Shamos and Hoey) for 2D cases (x and y). However, for a problem where latitude and longitude are given, this approach cannot be used directly. The distance between two points is calculated using the haversine formula.
I would like to know if there is some way to convert these latitudes and longitudes to their respective x and y coordinates and find the closest pair of points, or if there is another technique that can be used to do it.
I would translate them to three-dimensional coordinates and then use the divide-and-conquer approach with a plane rather than a line. This will work correctly: when only examining points on the sphere, the two closest points by arc distance (the distance walking over the surface) are also the two closest by 3-D Cartesian distance. The running time is O(n log n).
To translate to 3-D coordinates, the easiest way is to make (0,0,0) the center of the earth, so your coordinates are (cos(lat)*cos(lon), cos(lat)*sin(lon), sin(lat)). For these purposes I'm using a scale in which the radius of the Earth is 1, to simplify the calculations. If you want distance in some other unit, just multiply all quantities by the radius of the Earth measured in that unit.
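A small sketch of that conversion and of the chord distance used for comparisons (names are illustrative; angles are in radians and the sphere has radius 1):

```cpp
#include <cmath>

struct Vec3 { double x, y, z; };

// Latitude/longitude (radians) on a unit sphere -> 3-D coordinates
// centered at the sphere's center. Multiply by the Earth's radius
// if you want distances in real units.
Vec3 to_cartesian(double lat, double lon)
{
    return { std::cos(lat) * std::cos(lon),
             std::cos(lat) * std::sin(lon),
             std::sin(lat) };
}

// Chord (straight-line) distance. Ordering points by this agrees with
// ordering them by arc distance, which is all the closest-pair search needs.
double chord(const Vec3& a, const Vec3& b)
{
    double dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return std::sqrt(dx * dx + dy * dy + dz * dz);
}
```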
I should note that all this assumes that the earth is a sphere. It's not exactly one and points may actually have altitude as well, so these answers won't really be completely exact, but they will be very close to correct in almost every case.

Finding centre of rotation for a set of points [closed]

If I have an arbitrary set of points, and then the same set of points rotated by some degree, does anyone know of any algorithms to calculate/estimate where the centre of the rotation is? Or an area of study where these kinds of algorithms are needed?
I am having trouble finding any relevant information.
Thanks
Let's say you have one point (x, y) that moved to (x', y').
Then the center of rotation must lie on the perpendicular bisector of the segment (x,y)-(x',y'): the line perpendicular to that segment passing through its midpoint.
Now take another point, (x2, y2), that moved to (x'2, y'2). This gives rise to another line on which the center of rotation must lie.
Now take these two lines and compute the intersection. There you have the center of rotation.
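A sketch of that construction for a single pair of correspondences (illustrative names; each bisector is written as a*x + b*y = c with (a, b) the displacement vector, and the 2x2 system is solved by Cramer's rule):

```cpp
#include <cmath>
#include <optional>

struct Pt { double x, y; };

// Center of rotation from two correspondences p1 -> q1 and p2 -> q2.
// Returns nothing if the two bisectors are parallel (e.g. the motion is
// a pure translation, or the points were badly chosen).
std::optional<Pt> rotation_center(Pt p1, Pt q1, Pt p2, Pt q2)
{
    // Perpendicular bisector of (p, q): (q - p) . X = (q - p) . midpoint
    double a1 = q1.x - p1.x, b1 = q1.y - p1.y;
    double c1 = a1 * (p1.x + q1.x) / 2.0 + b1 * (p1.y + q1.y) / 2.0;

    double a2 = q2.x - p2.x, b2 = q2.y - p2.y;
    double c2 = a2 * (p2.x + q2.x) / 2.0 + b2 * (p2.y + q2.y) / 2.0;

    double det = a1 * b2 - a2 * b1;
    if (std::fabs(det) < 1e-12) return std::nullopt;

    return Pt{ (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det };
}
```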
Update: If you don't have the correspondence of which point went where, it shouldn't be too hard to figure out. Here is a suggestion from top of my head: Find the center of mass of the "before"-points. Order the points according to their distance from this point. Now do the same with the "after"-points. The order of the two sets should now match. (The point closest to the center of mass before rotation, should be the point closest to the center of mass after rotation.)
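A sketch of that ordering step (illustrative names; it assumes no two points lie at the same distance from the center of mass):

```cpp
#include <algorithm>
#include <cmath>
#include <vector>

struct Pt { double x, y; };

// Sort a point set by distance from its own center of mass, so that the
// i-th "before" point can be paired with the i-th "after" point.
std::vector<Pt> order_by_centroid_distance(std::vector<Pt> pts)
{
    Pt c{0.0, 0.0};
    for (const Pt& p : pts) { c.x += p.x; c.y += p.y; }
    c.x /= pts.size();
    c.y /= pts.size();

    std::sort(pts.begin(), pts.end(), [&](const Pt& a, const Pt& b) {
        return std::hypot(a.x - c.x, a.y - c.y) <
               std::hypot(b.x - c.x, b.y - c.y);
    });
    return pts;
}
```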
It would be crazy overkill for this type of problem, but I think the functionality of the generalized Hough transform for object detection at least encompasses what you want, even though it's not quite meant for this purpose.
Given an arbitrary shape created from a set of points, and another arbitrary set of points, it tries to find the shape in that second set of points even though it has been rotated, scaled, and translated. You might be able to take out the scaling and translation and get what you want.
Basically it would come down to brute-forcing possible rotation centers to see which one fits the second set of points best.
Very interesting problem. My knowledge on this is a bit out of date, but as I recall, there's some research in the use of subgraph analysis on this; that is, characterizing subsections of the set of points by the distances between the points and the variances therein, and then correlating those subgraph analyses between the before and after rotations.
This is, of course, assuming a very complex set of points with a nonuniform distribution.
You need to find some signature in your data set that allows you to identify the points of the first set (A) with those of the second set (B).
An easy way is as follows:
For every element E in A, find its two nearest points (N1, N2) and calculate the angle N1-E-N2, resulting in three values: the angle and the distances from E to N1 and N2 (ang, d1, d2).
Find 3 points in A with unique tuples (ang, d1, d2).
For every element in B calculate also the distance to its two nearest neighbors and the angle. Find the 3 points matching those selected from A.
Calculating the rotation is just a matter of geometric analysis.
Update: you need 3 points to determine the rotation in 3D space. In 2D, two will do.
Update 2: as others have commented on other posts, there may be symmetries in A that prevent you from finding the 3 unique triplets (ang, d1, d2). In that case, for each of the three selected points in A, you will have to search over all the elements of B that match its triplet until some combination results in a rotation that works for all the elements of A.
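A sketch of that signature computation (illustrative names; it assumes at least three points and uses a brute-force search for the two nearest neighbors):

```cpp
#include <algorithm>
#include <cmath>
#include <cstddef>
#include <vector>

struct Pt { double x, y; };
struct Signature { double ang, d1, d2; };   // angle at E, distances to N1 and N2

static double dist(const Pt& a, const Pt& b) { return std::hypot(a.x - b.x, a.y - b.y); }

// Signature of point i: distances to its two nearest neighbors and the
// angle N1-E-N2 between them. Matching signatures between set A and set B
// identifies candidate correspondences.
Signature signature(const std::vector<Pt>& pts, std::size_t i)
{
    std::size_t n1 = pts.size(), n2 = pts.size();   // sentinels: "not found yet"
    for (std::size_t j = 0; j < pts.size(); ++j) {
        if (j == i) continue;
        if (n1 == pts.size() || dist(pts[i], pts[j]) < dist(pts[i], pts[n1])) {
            n2 = n1;
            n1 = j;
        } else if (n2 == pts.size() || dist(pts[i], pts[j]) < dist(pts[i], pts[n2])) {
            n2 = j;
        }
    }

    double d1 = dist(pts[i], pts[n1]);
    double d2 = dist(pts[i], pts[n2]);
    double ux = pts[n1].x - pts[i].x, uy = pts[n1].y - pts[i].y;
    double vx = pts[n2].x - pts[i].x, vy = pts[n2].y - pts[i].y;
    double cosang = (ux * vx + uy * vy) / (d1 * d2);
    cosang = std::max(-1.0, std::min(1.0, cosang));  // guard against rounding
    return { std::acos(cosang), d1, d2 };
}
```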
