I've implemented this method in JavaScript and I'm roughly 2.5% out, and I'd like to understand why.
My input data is an array of points, each represented as latitude, longitude and height above the WGS84 ellipsoid. The points were recorded by a wrist-mounted GPS device during a marathon race.
My algorithm was to convert each point to Cartesian geocentric coordinates and then compute the Euclidean distance between consecutive points (cf. Pythagoras). Cartesian geocentric is also known as Earth-Centred Earth-Fixed (ECEF), i.e. an X, Y, Z coordinate system that rotates with the Earth.
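For reference, the conversion I'm using is equivalent to the standard WGS84 geodetic-to-ECEF formula. A minimal sketch of the approach (not my exact code; geodeticToEcef and euclidean are just illustrative names):

```javascript
// WGS84 ellipsoid constants
const a = 6378137.0;              // semi-major axis (m)
const f = 1 / 298.257223563;      // flattening
const e2 = f * (2 - f);           // first eccentricity squared

// Geodetic (lat, lon in degrees; h in metres above the ellipsoid) -> ECEF X, Y, Z (metres)
function geodeticToEcef(latDeg, lonDeg, h) {
  const lat = latDeg * Math.PI / 180;
  const lon = lonDeg * Math.PI / 180;
  const sinLat = Math.sin(lat);
  const N = a / Math.sqrt(1 - e2 * sinLat * sinLat); // prime vertical radius of curvature
  return [
    (N + h) * Math.cos(lat) * Math.cos(lon),
    (N + h) * Math.cos(lat) * Math.sin(lon),
    (N * (1 - e2) + h) * sinLat,
  ];
}

// Straight-line (chord) distance between two ECEF points
function euclidean(p, q) {
  return Math.hypot(p[0] - q[0], p[1] - q[1], p[2] - q[2]);
}
```

The total route length is then just the sum of euclidean() over consecutive fixes.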
My test data is from a marathon, so the total should be very close to 42.195 km. However, the distance comes out at about 43.4 km. I've tried various approaches and nothing changes the result by more than a metre, e.g. I replaced the height data with data from the NASA SRTM mission, I set the heights to zero, etc.
Using Google, I found two worked examples in the literature where lat, lon, height had been transformed to X, Y, Z, and my transformation matches them.
What could explain this? Am I expecting too much from JavaScript's double representation? (The X, Y, Z values are very large, but the difference between two adjacent points is very small.)
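As a rough back-of-envelope check on the precision question (assuming only that JavaScript numbers are IEEE 754 doubles):

```javascript
// Spacing between adjacent doubles near the magnitude of an ECEF coordinate (~6.4e6 m)
const a = 6378137;                    // WGS84 semi-major axis (m)
console.log(a * Number.EPSILON);      // ~1.4e-9, i.e. roughly a nanometre per coordinate
```

That suggests the representable resolution of each coordinate is at the nanometre level, but I'd like confirmation that the subtraction of two nearby points isn't still losing something significant.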
My alternative is to compute the geodesic across the WGS84 ellipsoid using Vincenty's algorithm (or similar) and then combine that with the two heights to get a Euclidean distance per segment, but this seems inaccurate.
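A sketch of what I have in mind for that alternative, where geodesicDistance is a placeholder for a Vincenty-style (or Karney) routine returning metres along the ellipsoid:

```javascript
// Combine the along-ellipsoid geodesic with the height difference (Pythagoras).
// geodesicDistance() is hypothetical here; it stands in for a Vincenty implementation.
function segmentLength(p1, p2) {
  const ground = geodesicDistance(p1.lat, p1.lon, p2.lat, p2.lon); // metres on the ellipsoid
  const dh = p2.height - p1.height;                                // metres of climb/descent
  return Math.sqrt(ground * ground + dh * dh);
}
```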
Thanks in advance for your help!