Comments (21)

lgraham-geocue commented on August 18, 2024

from las.

esilvia commented on August 18, 2024

During today's LWG call we added the above table to the GPS Time definition and made some wording changes. I think this section is pretty close to being complete. You may review the PDF with these changes here:

https://github.com/ASPRSorg/LAS/actions/runs/6113674176

Please do let me know if you see anything that needs to change.

esilvia commented on August 18, 2024

A suggested alternative that I came up with here:

If USGS or the LAS committee supplied a standard VLR to encode a hash table with GPS weeks indexed by point source ID then we could return to GPS Week Time. All of the shortcomings of storing time as a double-precision float would then be nullified. I have already implemented something along these lines in my software and encourage everyone in our offices to process with GPS Week Time encoding, but it will only be useful to others if it's integrated into a standard.

Karl's reaction: I would have no problem with this approach. Then we could improve the precision of time as well.

Martin's reaction: Interesting idea, but I have to disagree. This would only work when the LAS files store flight strips. But more often LAS files store tiled data and points from different flight strips that are tiled into the same tile are not necessarily from the same GPS week. I have spent a lot of time with the national LiDAR mapping project of the Philippines and the neighboring "blocks of flight lines" are often flown months or even years apart. I think the time stamp must be unique per return, not per file. Using differential coding as done by typical compressors (like open LASzip as well as by proprietary MrSID or zLAS) a continuously repeated GPS week will compress away to almost nothing.

We also want to get away from storing the time as a floating-point number. A seconds-of-the-week value stored as a floating-point number has drastically varying precision, from near-infinite at the beginning of the week (t close to 0.0) to much lower towards the end of the week (t close to 604,800.0).

esilvia commented on August 18, 2024

Martin's suggestion:
I agree that we should keep the existing 64-bit field and design the fixed-bit precision representation such that it retains as much of the same functionality as possible when interpreted as a 64-bit floating-point number (mainly: the same unique order when sorted by it). We should decide what GPS time resolution suffices. But 1/4 nanosecond seems excessive given what the sensors actually deliver. What are the highest frequencies at which GPS time stamps are taken? Much, much lower, no?

Currently the LAS format supports a spacing of 0.0596 microseconds, or 59.6 nanoseconds. Let's assume we agree that a spacing of 10 nanoseconds will suffice; then here is a very simple 64-bit representation based on the well-understood GPS week + GPS time of week concept:

typedef struct GPStime
{
    U64 week : 16;          // [in weeks]
    U64 time_of_week : 48;  // [in 10 ns]
} GPStime;

When folks merely need a unique sort or hash key (the most common use of GPS time), they may still decide to access these 64 bits of information as a single U64 integer or a single F64 double using a construct such as this:

typedef union GPStimeU64F64
{
    U64 u64;
    F64 f64;
    GPStime gpstime;
} GPStimeU64F64;

To store the 604,800 seconds of the week with 10 ns spacing we need to store numbers up to 60,480,000,000,000. This can be done with 46 bits, as 2^46 = 70,368,744,177,664. To get a 1 ns spacing (though that seems excessive to me) we could change the struct to:

typedef struct GPStime
{
    U64 week : 14;          // [in weeks]
    U64 time_of_week : 50;  // [in 1 ns]
} GPStime;

In response, Lewis expressed concern that we should retain byte compatibility between revisions of a particular version. However, this might be a viable option for a new version of LAS (1.5/2.0?).

esilvia commented on August 18, 2024

From Martin: Can we introduce a "Newly Adjusted GPS Standard Time Stamp" using another of the global encoding bits that subtracts 1.2 or 1.3 billion instead of 1 billion and still do this as a LAS 1.4 revision?

rapidlasso commented on August 18, 2024

Some related discussions on this topic are happening in the LAStools user forum.

nkules commented on August 18, 2024

A couple of comments here, since this was brought up as relevant to potentially adding a LAS 1.5:

  • I would be a big fan of accommodating a fixed precision within the point using seconds of week plus the GPS week. The issue with the other proposal, a lookup table by PSID, is that it would prohibit a flightline from crossing GPS midnight at the end of the week. Yes, it could be worked around by just rolling the value past 604,800 (which is what some vendors already do to get around SBET rollovers). However, I don't think this is as "foolproof" as including both time and GPS week in the time struct.

  • I think we would need at least 13 bits for the GPS week; 12 bits buys us about 40 more years, but 13 buys us over 100 years, which I would say is plenty of lifespan for this format or any future extensions of it. 14 is probably overkill; it just depends on whether we need to steal a bit for the SOW precision.

  • With the time precision, it would depend on what use case there may be for super-precise analyses. We have to remember that we are talking about the speed of light here, so 1 ns is roughly 30 cm, or 15 cm round trip. So in that sense, a 1 ns time precision isn't too far-fetched. It would just depend on what sort of use cases might need that high a precision. It was mentioned that maybe the source sensor data isn't very precise time-wise. I'd be curious what precision sensor vendors use in their native formats, as I suspect it might be down to the 1-5 ns scale (but I may be wrong).

esilvia commented on August 18, 2024

This came up on our last call in July, when @lgraham-geocue suggested the following (my words summarizing his):

LAS 1.4 and earlier all used the standard offset of -1e9 (1,000,000,000) for Adjusted Standard GPS Time. For LAS 1.5 we could add a GPS Time Offset field to the header that made the offset user-specified. Then the LAS is self-documenting.

I really like this idea because it's simple, easy to implement, and retains compatibility with existing point records.

However, adding this option opens the possibility of having wildly different offset values entering the ecosystem, which would potentially make interoperability between datasets extremely painful. I see two main paths to prevent this:

Option 1

Encode the offset as some large increment to make consistency easier to accomplish. For example...

  1. Encode the time offset as a uint8 (unsigned char), in increments of 500,000,000, or 500M seconds. 500 million seconds is roughly 16 years, which means each increment would cover +/- 8 years with maximum precision. The existing offset worked relatively well for the roughly 20 years centered on September 14, 2011, which is 0.0 in Adjusted Standard GPS Time, so I think this option should be fine for the next 4000 years or so.
  2. Encode the time offset as a uint8 (unsigned char), in increments of 100,000,000, or 100M seconds. Same advantages as option 1.1, but each increment results in a difference of about 3.2 years. This option would work for about 795 years without precision loss.
  3. Encode the time offset directly as a uint32 (unsigned long). This would work for ~127 years (until 2100) without precision loss. This option is easy to understand and implement but opens the possibility of people fine-tuning their offsets to their application so much that datasets collected just a few weeks apart aren't compatible without converting timestamps.

Option 2

Use the wiki to publish recommended offset values at a regular interval, such as every decade. That way, anyone doing a survey in 2020-2030 will use the same offset value and data interoperability becomes straightforward, making it unlikely to have mixed offsets for a given project or application.

These published values would be a recommendation, not a requirement, so a data buyer with a multi-year contract could dictate a specific offset value that's optimized for their particular application.

My thoughts

The two options aren't mutually exclusive. Personally, I think I like option 1.2 in combination with option 2. In this case the offset would be integer increments of 100M, but we'd publish recommended offset values for each decade that attempt to minimize precision loss for that decade. For example...

  1. 1990-2000: Recommended offset = 5 (centered at October 1995)
  2. 2000-2010: Recommended offset = 8 (centered at May 2005)
  3. 2010-2020: Recommended offset = 11 (centered at November 2014)
  4. 2020-2030: Recommended offset = 14 (centered at June 2024)

Values in between such as 6, 7, or 12 would be valid for users targeting specific time periods, but the defaults would be the recommended values. There's also nothing special about decades... we could also do 5-year recommendations.

Question

What do you guys think? I haven't done the math to know whether 100M seconds is sufficient to prevent measurable precision loss, but my hunch is that it'd be okay.

esilvia commented on August 18, 2024

Variant of option 1.2: encode the offset as a signed int8 (signed char) in increments of 100M. This would cover us for roughly 400 years before and after 1980. Making it signed would open the possibility of encoding data long before Jan 6 1980.

rapidlasso commented on August 18, 2024

I suggest making it not an 8-bit field but a 32-bit field, and specifying the offset in millions of seconds. For any LAS 1.4 to LAS 1.5 converted LiDAR using Adjusted Standard GPS Time, the number stored in that new field would then be 1000. If we need to subtract another billion seconds (or maybe only 500 million seconds) to regain precision, then this number becomes 2000 (or maybe 1500). So 1000 or 500 would be the specification-recommended increments. We could even add a simple table that suggests which offsets should be used for which years. However, should we ever need more resolution in the GPS time, we can lower these 1000 or 500 increments to a smaller number without having to change the specification. Only the recommended offset table would need to change.

esilvia commented on August 18, 2024

I suggest making it not an 8-bit field but a 32-bit field, and specifying the offset in millions of seconds. For any LAS 1.4 to LAS 1.5 converted LiDAR using Adjusted Standard GPS Time, the number stored in that new field would then be 1000. If we need to subtract another billion seconds (or maybe only 500 million seconds) to regain precision, then this number becomes 2000 (or maybe 1500). So 1000 or 500 would be the specification-recommended increments. We could even add a simple table that suggests which offsets should be used for which years. However, should we ever need more resolution in the GPS time, we can lower these 1000 or 500 increments to a smaller number without having to change the specification. Only the recommended offset table would need to change.

Thanks for the input @rapidlasso! Since you've done the math on precision loss and I haven't, I'll defer to your judgment on the recommended interval size. 1M is fine with me. I like the idea of having the option to modify precision without changing the specification.

I don't have any real objections to a 32-bit integer, but it seems to me that a 16-bit field would be more than enough. Even at the max value, 32768 * 1e6 is ~1041 years.

abellgithub commented on August 18, 2024
  1. Two "extra" bytes in the header is immaterial. Skimping is silly.
  2. Someone should provide the proposal in terms of the math/algorithm that would be applied.

esilvia commented on August 18, 2024

I finally took the time to try to recreate Martin's math from back in the day, and came up with pretty similar results. I'm relying pretty heavily on his explanation, but this spreadsheet will hopefully help folks trying to wrap their brains around this issue.

Adjusted_Time_Table_20211221.xlsx

As things stand in LAS 1.4 Adjusted Standard GPS Time, the 1e9 offset results in a zero value around September 14, 2011 (highlighted in yellow). On that date we had the maximum possible precision, and the maximum possible pulserate that can be discretely encoded.

The spreadsheet shows that we currently have around 0.1192 microsecond precision (max 8.4MHz) and will degrade to 0.2384 microsecond precision (max 4.2MHz) around 2028. It doesn't take much imagination to picture 4MHz systems in seven years, given that systems with 2MHz are already on the market.

Hope that helps people understand the problem, what we're doing about it, and how to determine which LAS 1.5 time value they'll want to use. Entering a value of 65535 results in a zero date of Sept 24, 4056, so a uint16 will be more than enough.

esilvia commented on August 18, 2024

As a matter of fact, that spreadsheet leads us straight to our recommended offset table, too. If I arbitrarily pick 0.1 microseconds (100 nanoseconds) as the target precision, then we have the constraint we need to provide recommended offsets:

  1. 500M = 1991-2000
  2. 750M = 1999-2008
  3. 1000M = 2007-2015 <--- current Adjusted Std GPS Time Offset
  4. 1250M = 2015-2023
  5. 1500M = 2023-2031
  6. 1750M = 2031-2039
  7. 2000M = 2039-2047

esilvia commented on August 18, 2024

What do you all think about the name Offset Standard GPS Time or perhaps simply Offset GPS Time as the name for this timestamp encoding method? I don't think we can use the name Adjusted Standard GPS Time since that already explicitly means an offset of 1 billion.

esilvia commented on August 18, 2024

During today's LWG call we completed a VERY rough first draft of the Offset GPS Time language for LAS 1.5. You can review the PDF here: https://github.com/ASPRSorg/LAS/actions/runs/5754635108

Specifically, we resolved that we do in fact want to add a new Global Encoding Bit to indicate that the Time Offset value in the header should be used, and that it is to be used in concert with the existing GPS Time Type Bit. That is, both bits must be set if Offset GPS Time is being used in the LAS 1.5 file; it is invalid to have the Offset Time Flag set while the GPS Time Type Bit is NOT set.

In other words....

[image: table of valid GPS Time Type Bit / Offset Time Flag combinations]

We will continue this work next month. Please feel free to add any edits or comments between now and then.

esilvia commented on August 18, 2024

Rupesh noted at the end of today's LWG call that the Adj Std GPS Time example was unclear. That was clarified in this build:

https://github.com/ASPRSorg/LAS/actions/runs/6113785609

esilvia commented on August 18, 2024

One more update in this month's LWG meeting to clarify the point's timestamp description:
https://github.com/ASPRSorg/LAS/actions/runs/6422942576

Last chance for comments! Will merge the PR #140 into the 1.5 Draft during the Nov 2023 meeting if there is no further discussion.

pchilds commented on August 18, 2024

Re: the above INVALID combination of GPS week time + offset, I don't see this as being the case. In fact, receiving a file with GPS week time and no knowledge of the particular week is highly annoying, and I'd argue there is a strong case for making the offset mandatory and making GPS week time without an offset the invalid case. The 2 bytes for a time offset expressed as a week number can cover a millennium. It would only be a matter of choosing the most suitable zero, whether it be the week of 1 Jan 1970, 6 Jan 1980, or wherever the +10^9 ends up.

esilvia commented on August 18, 2024

While I like the idea of deprecating GPS Week Time and agree with all points, I think it's a lost cause for us to fully deprecate GPS Week Time in LAS 1.x at this point. We can review the language to ensure the preference for Offset GPS Time is as strong as possible.

esilvia commented on August 18, 2024

PR#140 has been merged into the 1.5 draft branch.
