Jun 2012
1:02pm, 19 Jun 2012
60,822 posts
|
GregP
It's like 'science', except for people who don't visit the real world. I like maths.
|
Jun 2012
1:05pm, 19 Jun 2012
2,980 posts
|
sprouty76
But it's on real-world data, so it's applied maths.
|
Jun 2012
1:07pm, 19 Jun 2012
60,824 posts
|
GregP
I'm starting to fear that I've bred with Glenners and that Spouter is my mutant offspring.
|
Jun 2012
1:08pm, 19 Jun 2012
2,982 posts
|
sprouty76
It's the only rational explanation.
|
Jun 2012
1:14pm, 19 Jun 2012
16,945 posts
|
hammerite
It's how the data is interpreted. When each piece of training software was developed, a scheme would have been set out for interpreting the data; some packages will use the same scheme, others will differ. While a tcx file contains max speed data, that doesn't mean the training software has to use it. Instead the scheme may look at the time/distance data held in the file and calculate its own results from that. One piece of software may look at the track over 3 second intervals, another over 9 seconds (these are just examples). If you travelled 30 metres in 9 seconds, the second package would calculate your max speed in mph from 30 metres in 9 seconds. However, it could be that you travelled 15 metres in the first 3 seconds, 10 metres in the next 3 and 5 metres in the last 3 - the software that calculates over 3 second intervals will take 15 metres in 3 seconds as your max speed, which will be higher than 30 metres in 9 seconds...
Does that make sense?
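To put rough numbers on it, here's a quick Python sketch of the idea above - made-up track data, and no claim that Strava or anyone else does exactly this:

    # Hypothetical track matching the example: cumulative distance logged every 3 s,
    # covering 15 m, then 10 m, then 5 m in successive 3-second chunks.
    times = [0, 3, 6, 9]        # seconds
    dists = [0, 15, 25, 30]     # cumulative metres

    MS_TO_MPH = 2.23694

    def max_speed_mph(step):
        # Use every `step`-th point, i.e. a (step * 3)-second window here.
        return max((dists[i] - dists[i - step]) / (times[i] - times[i - step])
                   for i in range(step, len(times), step)) * MS_TO_MPH

    print(max_speed_mph(1))   # 3 s windows: 15 m / 3 s -> ~11.2 mph
    print(max_speed_mph(3))   # 9 s window:  30 m / 9 s -> ~7.5 mph

Same file, two different 'max speeds'.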
With the wheel sensor I wasn't talking about a Garmin wheel sensor, more comparing, say, a Cateye's max speed with a Garmin's. The Garmin can only determine your max speed from the distance travelled between each time it polls the GPS satellites. A wheel sensor can determine speed from the time the wheel takes to rotate once, and the wheel will rotate many times between one Garmin GPS fix and the next - so chances are the max speed on a wheel-mounted speedo will be higher.
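Rough numbers for that, in the same toy Python style - the 2.1 m circumference is just a typical road wheel figure I've assumed:

    MS_TO_MPH = 2.23694
    circumference_m = 2.1     # assumed ~700c road wheel, one magnet pass per rev

    def wheel_speed_mph(rotation_time_s):
        # Speed straight from a single revolution: circumference / rotation time.
        return (circumference_m / rotation_time_s) * MS_TO_MPH

    print(wheel_speed_mph(0.23))   # one rev in 0.23 s -> ~20 mph
    print(3 / 0.23)                # ~13 revolutions inside a single 3 s GPS gap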
You may declare this to be bollocks, and I don't know how Strava interprets the data compared to any other software, but this always used to be the justification given for differences in figures between devices and software packages, and between the various software packages themselves.
|
Jun 2012
1:17pm, 19 Jun 2012
60,828 posts
|
GregP
What we need, really, is a large branch of Caffe Nero. Preferably with a whiteboard.
Why, my dear Hammer, would one (in your example) only 'use' every third data pair?
|
Jun 2012
1:18pm, 19 Jun 2012
9,289 posts
|
Nick Cook
How about a large branch of Costa?
|
Jun 2012
1:19pm, 19 Jun 2012
2,984 posts
|
sprouty76
I believe most Garmins poll the GPS every 2 or 3 seconds depending on the device, but some do it faster when there's a power meter attached.
Still doesn't explain what happened with the duration, unless they added up the differences between each pair of track points and totalled that - which would accumulate the rounding error from every single pair, rather than confining it to the two endpoints (which is what happens when you subtract the start time from the end time).
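If you want to see how badly that could go wrong, here's a toy Python sketch - I'm only guessing that each gap gets truncated to whole seconds somewhere in the pipeline:

    import math
    import random

    random.seed(42)

    # Hypothetical track: a point every ~3 s for an hour, with sub-second jitter.
    gaps = [random.uniform(2.5, 3.5) for _ in range(1200)]
    end = sum(gaps)   # true elapsed time from a start of 0.0, ~3600 s

    # Method A: truncate each per-pair difference to whole seconds, then total.
    # Every gap can lose up to a second, and the losses all stack the same way.
    summed = sum(math.floor(g) for g in gaps)

    # Method B: end minus start - only the two endpoints ever get rounded.
    end_minus_start = math.floor(end) - math.floor(0.0)

    print(summed)            # ~3000 s, i.e. about ten minutes short
    print(end_minus_start)   # ~3600 s, within a second of the truth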
|
Jun 2012
2:43pm, 19 Jun 2012
573 posts
|
ThorntonRunner
Possibly using only every third data pair comes back to the issue that each data point is only an approximation to the true position at that time. With an accuracy of +/- 10m (for example) on each point, the recorded distance travelled in 3 seconds could be very inaccurate: you cover about 10m in 3 seconds at 8 minute miling, so the error could be of the order of 10m in 10m travelled. Using every third data pair brings that down to the order of 10m in 30m. Given the accuracy stated when my Garmin locks on, I'm always surprised at how accurate the trace is when loaded up to display the route. Either there is a large margin on the error bound, or there is some reasonably clever error smoothing going on.
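A quick Python simulation of that error argument - one-dimensional, with made-up uniform errors, so purely an illustration:

    import random

    random.seed(7)

    TRUE_SPEED = 3.35    # m/s, roughly 8-minute miling
    GPS_ERROR = 10.0     # +/- metres on each fix, as in the example above

    # True position every 3 s for 15 minutes, plus independent GPS error per fix.
    true_pos = [TRUE_SPEED * 3 * i for i in range(300)]
    fixes = [p + random.uniform(-GPS_ERROR, GPS_ERROR) for p in true_pos]

    def max_speed(step):
        # Max speed (m/s) using every `step`-th fix, i.e. a (step * 3)-second window.
        return max((fixes[i] - fixes[i - step]) / (step * 3)
                   for i in range(step, len(fixes), step))

    print(max_speed(1))   # 3 s windows: the error can rival the distance covered
    print(max_speed(3))   # 9 s windows: same error spread over three times the distance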
|
Jun 2012
3:34pm, 19 Jun 2012
16,946 posts
|
hammerite
Speed, I guess, GregP. Back in the day training software sat on your desktop (I got my first Garmin at the tail end of 2004, I think), and data could only be processed as fast as your computer could manage. Over, say, a 10 mile run that might mean crunching a huge amount of data if you used a pair every 3 seconds, which would take a long time. Far less data needs crunching to analyse the same 10 mile run with a pair every 9 seconds.
Same reasoning for web-based services: data storage and processing power might lead a site to use the 9 second example, especially if they expected big demand. Such things led to this site slowing down on occasions.
I suppose processing power and storage have moved on a bit now, so this should be less of an issue. I also guess back in the day there were fewer options, so we just took the data as it was.
Try constructing the same route on various mapping sites: they'll all give slightly different elevation info, down to the data being processed in slightly different ways.
|