
srp_JsonX

edited March 2022 in SRP Utilities
This call to srp_JsonX("details", column.value, 'string array') produced the following output. The column.value variable is a @tm delimited value.

Is this the expected formatting for the output?
Just wondering about the '\t': is that a JSON formatting thingy?

PLUS: if there is such a thingy, is there a formatting for BOLDstart / BOLDend?

"Separable A - Extra Media Converters \t\t\t",
"\t - 5x LC-LC SM fibre leads\t\t",
"\t - 4x Pearle SM Fibre media converters (4x Pairs of data connections with ARm220)\t\t",
"\t\t - 4x Pearle SM SFP's\t",
"",
"Separable A - Extra Media Converters \t\t\t",
"\t - 5x LC-LC SM fibre leads\t\t",
"\t - 4x Pearle SM Fibre media converters (4x Pairs of data connections with ARm220)\t\t",
"\t\t - 4x Pearle SM SFP's\t",
""

Comments

  • Yes, \t is a tab character. That is standard JSON encoding.
  • 'Encoding'... ah, that's the 'thingy'.

    Where can I find a list of thingys... oops, encodings?
  • Thank you.
    Horizontal tab won't work for me; I need each item to be on a new line, so I will change @tm to \r\n (CRLF).
  • OK, that didn't turn out too well.

    swap @tm with '\r\n' in column.value -
    "Separable A - Extra Media Converters \t\t\t\\r\\n\t - 5x LC-LC SM fibre leads\t\t\\r\\n\t - 4x Pearle SM Fibre media converters


    I will leave it the original way and they can deal with it at the front (returned) end.
  • I could be wrong, but you shouldn't have to worry about putting the encoded characters into your string. Just use the normal characters (\09\, \0D0A\, etc.) and SRP JsonX will encode them for you.
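    As a rough sketch of that idea (assuming the rest of the JsonX setup stays as it is, and that a plain 'string' hint is acceptable once the delimiters are gone):

        * Put real carriage-return/line-feed characters (hex \0D0A\) into
        * the value instead of the literal text '\r\n'. SRP_JsonX should
        * then encode them as \r\n in the JSON output for you.
        Swap @TM With \0D0A\ In column.value
        srp_JsonX('details', column.value, 'string')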
  • Update.
    I thought the \t was the conversion of @tm.
    On a closer look I saw tabs in the data, and in the JSON output I also see a ? character.
    It appears @tm is not converted as part of an array.
    If I do:
    swap \09\ with '' in column.value
    swap @tm with @fm in column.value
    I get a perfect output:

    "details": [
    "Separable A - Extra Media Converters ",
    " - 5x LC-LC SM fibre leads",
    " - 4x Pearle SM Fibre media converters (4x Pairs of data connections with ARm220)",
    " - 4x Pearle SM SFP's",
    "",
    "Separable A - Extra Media Converters ",
    " - 5x LC-LC SM fibre leads",
    " - 4x Pearle SM Fibre media converters (4x Pairs of data connections with ARm220)",
    " - 4x Pearle SM SFP's",
    ""

  • I thought the \t was the conversion of @tm.

    No. Kevin was only pointing out that the \t in your JSON was the tab character. He was not suggesting that this was also the @TM delimiter.
    On a closer look I saw tabs in the data, and in the JSON output I also see a ? character.

    That could be the @TM delimiter. What are you using to inspect the JSON where the "?" character appeared?
  • No. Kevin was only pointing out that the \t in your JSON was the tab character. He was not suggesting that this was also the @TM delimiter.


    This was in reference to the array type, but of course I could be misreading it.

    New in 2.2.4. The Hint supports the "Array" value. When you set the hint to "Array", SRP_JsonX assumes you are passing a delimited array as the value. It will parse the array into individual elements and produce a Json array as a value. You don't have to tell SRP_JsonX what delimiter your array is using. The "Array" hint tells it to use any OI delimiter it finds (that is, @FM, @VM, @TM, etc.). You can also combine the "Array" hint with any of the above hints as well, such as "String Array" or "Bool Array". This hint is meant as a quick and convenient way to convert 1-dimensional OI delimited arrays into Json arrays in a single call. It won't work for every scenario.


    What are you using to inspect the JSON where the "?" character appeared?


    Either Postman or Notepad++
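    To illustrate the quoted "Array" hint behaviour, a small hedged sketch (made-up variable and member names; surrounding JsonX calls assumed): the hint splits on whichever OI delimiter it finds, so an @VM-delimited list and an @TM-delimited list should come out as the same shape of JSON array.

        vmList = 'alpha' : @VM : 'beta' : @VM : 'gamma'
        tmList = 'alpha' : @TM : 'beta' : @TM : 'gamma'
        * Both calls should produce ["alpha", "beta", "gamma"].
        srp_JsonX('vmDetails', vmList, 'String Array')
        srp_JsonX('tmDetails', tmList, 'String Array')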
  • It should have split on @TMs. I am using the Remove statement, which claims to recognize @TMs. Not sure why it wasn't dividing those up.
  • Actually, looking at your original post, it did divide up your data using @TMs.
  • @KevinFournier
    Ah yes, I see it did. Good to know it does work. Extra fiddling must have caused the later issue.
    Thank you for your patience.