PilotWolf
Well-known member
In which case, can you explain the calibration process? Assuming you're not merely talking about WAAS or DGPS, a lot of forumites, myself included, may not be aware of it.
Basically the GPS unit is calibrated to a known point or datum on the vessel - usually the centre of gravity (CoG), as it should have the least movement due to sea conditions, etc. The antenna positions are laser-measured/plotted in relation to this point, so that an antenna that is, say, 5 metres away from the datum gives (in an ideal world) the same position as one 10 metres away from the same point. The installation is then fixed/measured against a known and approved point on land - such as an OS triangulation point. The GPS is 'calibrated' by entering these measured offsets into the hydrographic or geological data-recording equipment. My point being that the primary and secondary (separate) units still show different, albeit minor, positions despite the manual corrections to remove any installation errors/differences, and as such the £200 unit from XXX Marine might not be as good as they claim or as people falsely believe.
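The offset (lever-arm) correction described above can be sketched roughly as below. This is a minimal illustration, not survey-grade software: it ignores pitch and roll, works in flat east/north metres rather than lat/long, and all the names and figures are made up for the example.

```python
import math

def antenna_to_datum(ant_east, ant_north, offset_fwd, offset_stbd, heading_deg):
    """Translate an antenna fix back to the vessel datum (e.g. the CoG)
    using a surveyed lever-arm offset. Pitch/roll ignored for brevity.

    offset_fwd / offset_stbd: metres from datum to antenna, measured in
    the vessel's body frame (forward and starboard positive).
    heading_deg: vessel heading in degrees true.
    """
    h = math.radians(heading_deg)
    # Rotate the body-frame offset into east/north components.
    d_east = offset_fwd * math.sin(h) + offset_stbd * math.cos(h)
    d_north = offset_fwd * math.cos(h) - offset_stbd * math.sin(h)
    # The datum lies "behind" the antenna by the rotated offset.
    return ant_east - d_east, ant_north - d_north

# Antenna surveyed 5 m forward of the datum; vessel heading north.
print(antenna_to_datum(100.0, 200.0, 5.0, 0.0, 0.0))   # datum is 5 m south of the fix
```

Two antennas with different surveyed offsets should, after this correction, report the same datum position; any residual difference is down to the receivers themselves, which is the point being made above.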
Sitting alongside, zoomed right in on the plotter at its largest scale, you can see the different fixes from different GPS units - usually the primary and secondary inputs.
Given that even commercial-grade plotter 'charts' are only so accurate, relying on 10 m accuracy is risky: I have regularly been 'moored' on - not next to - various quaysides around the UK and Europe. Overlaying radar onto the plotter confirms the inaccuracy of the charts (yes, I am aware of radar's limitations and errors too).
I recall a CAA-endorsed lecture about GPS jamming a good few years back (so it may not be accurate now) that equated the GPS signal strength at the receiver to a torch bulb shining from space.
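The "torch bulb from space" comparison holds up against a back-of-envelope link budget. The sketch below uses commonly quoted approximate figures for GPS L1 (not numbers from the lecture), and assumes a 0 dBi receive antenna:

```python
import math

# Back-of-envelope GPS L1 link budget (approximate, commonly quoted figures).
FREQ_GHZ = 1.57542   # GPS L1 carrier frequency
DIST_KM = 20200      # approximate orbital altitude
EIRP_DBW = 27.0      # approximate satellite EIRP on L1 C/A

# Free-space path loss: FSPL(dB) = 20*log10(d_km) + 20*log10(f_GHz) + 92.45
fspl_db = 20 * math.log10(DIST_KM) + 20 * math.log10(FREQ_GHZ) + 92.45

p_rx_dbw = EIRP_DBW - fspl_db   # received power with a 0 dBi antenna
p_rx_dbm = p_rx_dbw + 30        # convert dBW to dBm

print(f"Path loss ~{fspl_db:.1f} dB, received power ~{p_rx_dbm:.1f} dBm")
```

That works out to roughly -125 dBm at the antenna - a tiny fraction of a billionth of a watt - which is why a low-power jammer nearby can so easily swamp it.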
W.