Open APIs

  • 1.  TMF677_Usage_Consumption, Quantity precision loss

    TM Forum Member
    Posted Mar 13, 2020 08:10
    Hello,

      We're mapping data consumption from our OCS system to the TMF677 API, but the API specifies Quantity as a float value, which results in precision loss for data usage amounts that the OCS measures in bytes.

    What is the TMF recommendation to overcome this issue?
    - Should we extend Quantity with another field for an extended-precision value?
    - Should we just change the TMF spec and remove the "float" format specifier (or change it to "double")?
    - Should we convert the value to MB or GB and live with the precision loss? (Right now the API is only used for customer-facing display purposes, so this could be a valid option, but it doesn't feel like a good solution given that the API might later be consumed by other services that need exact amounts without precision loss, such as charging or promotion engines.)

    The "float" type is used in other places (like Money) as well, where it is probably not the best approach either due to the possible rounding errors and precision loss... Is it valid TMF usage to ignore these format specifiers where we think it's detrimental to the API?

    Thanks for the help,
    regards,
    peter

    ------------------------------
    Péter Radics
    Vodafone
    ------------------------------


  • 2.  RE: TMF677_Usage_Consumption, Quantity precision loss

    TM Forum Member
    Posted Mar 13, 2020 10:46
    This API is being led by @Fernando Marin Diaz - at first sight it seems to be a good suggestion (I believe that Swagger supports this format:  https://swagger.io/docs/specification/data-models/data-types/)
    On the other hand, perhaps we need the flexibility for Quantity to carry multiple fundamental data types to support different use cases; a number of bytes would probably be a large integer.
    I will try to raise for internal discussion within the team.
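For illustration, a large-integer byte count could be declared like this in OpenAPI/Swagger (a hypothetical fragment for discussion, not the current TMF677 schema):

```yaml
# Hypothetical sketch: a Quantity whose amount is a 64-bit integer,
# which represents byte counts exactly up to 2**63 - 1.
Quantity:
  type: object
  properties:
    amount:
      type: integer
      format: int64   # exact for byte counts; "float" is single precision
    units:
      type: string
      example: bytes
```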

    ------------------------------
    Jonathan Goldberg
    Amdocs Management Limited
    Any opinions and statements made by me on this forum are purely personal, and do not necessarily reflect the position of the TM Forum or my employer.
    ------------------------------



  • 3.  RE: TMF677_Usage_Consumption, Quantity precision loss

    TM Forum Member
    Posted Mar 16, 2020 02:08
    Edited by Vance Shipley Mar 17, 2020 02:24
    While decimal notation is the common form of representing monetary amounts, actual floats must never be used in accounting. You should consider an amount of $0.99 USD to be 99 cents. If you need fractional cents, you must choose an acceptable precision and deal internally with the monetary values as an integer number of those units. For example, with a unit of millionths of a cent, $0.99 = 99000000.

    So technically it's not a problem; you just have to do the conversion appropriately. The problem you may encounter is a JSON codec library that automatically converts a decimal JSON number to a float type. Here you have two solutions: 1) use a different codec; 2) use a string to represent the value.

    It is my opinion that the base type Common/Money.schema.json should be changed to use type string, as it is NEVER appropriate to use a float to represent money!
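As a minimal sketch of the integer-units approach (Python here purely for illustration; the unit choice is an assumption):

```python
from decimal import Decimal

# Keep money as an integer count of a fixed fractional unit -- here
# millionths of a cent -- and convert only at the edges of the system.
UNITS_PER_DOLLAR = 100 * 1_000_000   # cents times millionths of a cent

def dollars_to_units(amount: str) -> int:
    # Parse the decimal string exactly, then scale to integer units.
    return int(Decimal(amount) * UNITS_PER_DOLLAR)

def units_to_dollars(units: int) -> str:
    return str(Decimal(units) / UNITS_PER_DOLLAR)

assert dollars_to_units("0.99") == 99_000_000   # $0.99 as in the example
```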

    ------------------------------
    Vance Shipley
    SigScale
    ------------------------------



  • 4.  RE: TMF677_Usage_Consumption, Quantity precision loss

    TM Forum Member
    Posted Mar 16, 2020 03:34
    Edited by Péter Radics Mar 16, 2020 03:34
    That was exactly my point: the API spec should not use "float" for monetary values, nor even for other Quantities.
    That said, I think changing the value type to "number" is sufficient, as most JSON libraries then use a high-precision data type (like BigDecimal) instead of a low-precision type like float.

    Regarding the Money issue: the Money type currently has `value` and `unit`, where `unit` is actually the currency. Given how money should be handled, this should be extended with a "precision" or "rate" type of field that communicates the ratio between the stored `value` and the actual monetary amount in the given currency. (To be honest, `unit` would be the best name for this new field, and the current `unit` should be `currency`, but I guess that boat has sailed already...)
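To illustrate the "number" point (a Python sketch; other languages' JSON libraries offer similar hooks):

```python
import json
from decimal import Decimal

# Python's stdlib json decoder can be told to decode every JSON number
# with a fractional part as Decimal instead of binary float, so a
# "number"-typed Quantity value survives the round trip losslessly.
payload = '{"amount": 10737418240.25, "units": "bytes"}'

exact = json.loads(payload, parse_float=Decimal)
assert exact["amount"] == Decimal("10737418240.25")
assert isinstance(exact["amount"], Decimal)
```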


    ------------------------------
    Péter Radics
    ------------------------------



  • 5.  RE: TMF677_Usage_Consumption, Quantity precision loss

    Posted Mar 16, 2020 04:41
    Peter,

    I suggest considering introducing your semi-proposed "precision" or "rate" either as "fractional_unit_decimal_logarithmic_ratio" or as "fractional_unit_binary_logarithmic_ratio".

    The "fractional_unit_binary_logarithmic_ratio" version is meant to avoid unnecessary format conversions and replace by less Energy consuming shift Operation.

    A question for you: has anyone considered the energy efficiency of TM Forum APIs, in particular APIs that are executed with extremely high frequency? In times of Fridays for Future and the fight against climate change, it may be a valid consideration.

    Lothar

    ------------------------------
    Lothar Reith
    Detecon International
    ------------------------------



  • 6.  RE: TMF677_Usage_Consumption, Quantity precision loss

    TM Forum Member
    Posted Mar 17, 2020 03:43

    Agree with Vance

    While I haven't looked at this specific TMF spec, from a bidding/quoting point of view prices should be represented accurately and in a localized manner, including translations, formatting (decimal separators ',' vs '.', number of leading zeros, decimals, etc.), rounding, units, and the divisors in the units linked to the actual amount, while taking care that the actual "price variables" are not hardcoded into, e.g., relational data models. Defining prices via properly normalized variable definitions is fundamentally different from hardcoding prices into data models, as seen in many applications, which causes challenges especially with usage-based, pay-as-you-go type values.

    It should also be noted that certain document generators can be toggled to handle numbers as strings, but if precision is lost while moving the data to and from JSON, it is nearly impossible to find out where the precision was lost.

    Is there a TMF specification that addresses the challenges of the JSON format at the principle level?

    While JSON is not strongly typed, the actual data should be stored in a strongly typed manner, also from the variable-definition point of view. Integrations should share and obey the same definitions for variables, or they will fail by design.

    And I have to say (I'm not sure this is defined anywhere) that "the price is wrong in the bill/invoice" is different from "the charge is wrong in the bill/invoice"; do we make a distinction between these? For example, the price may be 0.004 €/min while the charge is 4 €, because the number of minutes = 1000.

    Having those values broken in JSON makes things hard.
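To make the price-vs-charge example concrete (a Python sketch with exact decimal arithmetic; the figures are the ones from above):

```python
from decimal import Decimal

# The unit price and the resulting charge are different values and can
# each be wrong in a bill independently of the other.
price_per_min = Decimal("0.004")   # €/min, kept exact as a decimal string
minutes = 1000
charge = price_per_min * minutes   # computed exactly, no binary rounding
assert charge == Decimal("4.000")  # 4 €
```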

    rgrds Paavo



    ------------------------------
    Paavo Muranen
    Telia Company
    ------------------------------



  • 7.  RE: TMF677_Usage_Consumption, Quantity precision loss

    Posted Mar 17, 2020 09:00
    Hi all,

    I see lots of threads in the forum discussing challenges with implementing the published APIs.

    In this discussion thread, the topic is precision loss from binary floating-point arithmetic (i.e., float, double). MongoDB has support for lossless decimals (decimal128), so you do not need to implement unreliable workarounds. Additional details are available in this blog post: http://pauldone.blogspot.com/2018/05/databases-floating-point-precision-loss.html

    MongoDB is available as part of the Forum's Open Digital Labs and is also available as a "free tier" for everyone at https://www.mongodb.com/cloud/atlas. MongoDB Atlas is the global cloud database service for modern applications: deploy fully managed MongoDB across AWS, Azure, or GCP, with best-in-class automation and proven practices for availability, scalability, and compliance with the most demanding data security and privacy standards, plus a robust ecosystem of drivers, integrations, and tools to build applications faster and spend less time managing your database.

    I am happy to speak with any ISV or platform company about porting your BSS/OSS applications from legacy RDBMS (or more recent cloud-native) architectures to MongoDB. Please reach out to me at Eddie.sharkey@mongodb.com. I am also happy to introduce the CPs to your local account teams across the world.


    Regards,
    Eddie Sharkey


    ------------------------------
    Eddie Sharkey
    MongoDB
    ------------------------------



  • 8.  RE: TMF677_Usage_Consumption, Quantity precision loss

    TM Forum Member
    Posted Mar 23, 2020 06:21
    Hello Eddie,

      The issue here is not whether some implementation can or cannot provide lossless decimals, but that the spec mandates (or at least encourages) bad practices in the handling of monetary entities.
    Another issue is representing large-valued quantities as "float" objects, which Swagger defines as single-precision floating-point numbers - clearly not ideal.

    Your idea of using decimal128 in MongoDB is itself an "unreliable workaround", as you have to hack around what the spec says (ignoring that it mandates single-precision floats) and then use a specific product as a DB back-end...

    regards,
    peter

    ------------------------------
    Péter Radics
    ------------------------------