I have been using Augmented Reality for a couple of years to visualize 3D models, but I never thought much about the correct projection when creating them. In my current project, however, I noticed gaps between 3D models that should actually fit together.
Currently I'm using a self-written QGIS plugin to convert 2D LineStrings into 3D pipes. The first step is to reproject the 2D LineStrings into a metric coordinate system, in my case a UTM coordinate system. After that I calculate the vertices, faces, and normals of the 3D pipe. Then I save every single 3D pipe to a .obj/.mtl file.
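Here is a minimal sketch of that reprojection step, assuming pyproj and input coordinates in WGS84 (EPSG:4326); the zone EPSG:32632 and the coordinates are just examples:

```python
from pyproj import Transformer

# Reproject a 2D LineString from WGS84 (EPSG:4326) to UTM zone 32N
# (EPSG:32632). Pick the UTM zone that actually contains your data.
transformer = Transformer.from_crs("EPSG:4326", "EPSG:32632", always_xy=True)

linestring_wgs84 = [(8.40412, 49.01392), (8.40455, 49.01401)]  # (lon, lat)
linestring_utm = [transformer.transform(lon, lat) for lon, lat in linestring_wgs84]

# All further pipe calculations (vertices, faces, normals) happen in these
# metric UTM coordinates, where one unit is one metre on the ground (up to
# the UTM scale factor of ~0.9996 near the central meridian).
print(linestring_utm)
```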
As an AR SDK I'm using Wikitude, but I noticed the same behaviour described above with Metaio (which is no longer available).
I take my 3D models and place them into the AR scene with a WGS84 coordinate.
The anchor point is normally the first point of the 2D LineString.
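To make the anchoring concrete, here is a hedged sketch of how I handle it, assuming the .obj vertices are stored as metric offsets from that anchor (the coordinates are made up):

```python
from pyproj import Transformer

# Hypothetical UTM coordinates of a pipe centreline (EPSG:32632).
linestring_utm = [(456173.2, 5429523.7), (456204.6, 5429534.1)]

# Use the first point as the anchor: store the geometry as metric offsets
# from it, and convert only the anchor back to WGS84 for the AR SDK.
anchor_x, anchor_y = linestring_utm[0]
offsets = [(x - anchor_x, y - anchor_y) for x, y in linestring_utm]

to_wgs84 = Transformer.from_crs("EPSG:32632", "EPSG:4326", always_xy=True)
anchor_lon, anchor_lat = to_wgs84.transform(anchor_x, anchor_y)

print((anchor_lat, anchor_lon), offsets)
```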
Roughly it works, but not exactly. I might get better results with the pseudo-Mercator coordinate system (EPSG:3857), but I'm not sure.
I'm also not 100% sure what kind of projection is used in an AR scene. My guess is that it is a metric system (OpenGL) which is perhaps transformed into a global system like WGS84, which would explain why I can place 3D models with a WGS84 coordinate.
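To quantify my doubt about pseudo-Mercator, here is a small comparison sketch (pyproj again; the two points are made-up examples roughly 30 m apart):

```python
from pyproj import Geod, Transformer

# Compare the distance between two nearby WGS84 points as measured on the
# ellipsoid, in UTM 32N, and in pseudo-Mercator (EPSG:3857).
p1 = (8.40412, 49.01392)  # (lon, lat)
p2 = (8.40455, 49.01401)

geod = Geod(ellps="WGS84")
_, _, dist_geodesic = geod.inv(p1[0], p1[1], p2[0], p2[1])

def planar_dist(epsg):
    t = Transformer.from_crs("EPSG:4326", epsg, always_xy=True)
    x1, y1 = t.transform(*p1)
    x2, y2 = t.transform(*p2)
    return ((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5

print(dist_geodesic)              # true ground distance in metres
print(planar_dist("EPSG:32632"))  # UTM: within ~0.04% of the true distance
print(planar_dist("EPSG:3857"))   # pseudo-Mercator: inflated by 1/cos(lat), ~52% here
```

At my latitude (~49°) pseudo-Mercator stretches all planar distances by roughly 1/cos(49°) ≈ 1.52 compared to true ground distances, which is part of why I'm unsure which system the AR scene actually expects.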
To recap everything:
- In which metric coordinate system should I transform my 2D LineStrings to do my calculations, so that I get the most accurate results in an AR scene?
- What kind of projection is used in an AR scene?