nim-json-serialization's Introduction

nim-json-serialization

License: Apache-2.0 · License: MIT · Stability: experimental · GitHub Actions CI

Flexible JSON serialization that does not rely on run-time type information.

Overview

nim-json-serialization offers rich features on top of the nim-serialization framework. The following is a non-exhaustive list of them:

  • Decode directly into Nim data types efficiently, without an intermediate representation.
  • Able to parse the full JSON spec, including the notorious JSON number.
  • Supports stdlib JsonNode out of the box.
  • While stdlib JsonNode does not support the full spec of the JSON number, we offer JsonValueRef as an alternative.
  • Skipping a JSON value is efficient: no tokens are generated at all, yet the grammar is still checked.
    • Skipping is also free from custom serializer interference.
  • An entire JSON value can be parsed into a valid JSON document string. This string document can be parsed again without losing any information.
  • Custom serialization is easy and safe to implement with the help of many built-in parsers.
  • Non-standard features are put behind flags; you can choose which features to switch on or off.
  • Because this library is intended for use in security-demanding applications, fuzz tests make sure malicious inputs cannot crash it.
  • The user can also tweak certain limits of the lexer/parser behavior using the configuration object.
  • createJsonFlavor is a powerful way to prevent cross-contamination between different subsystems using different custom serializers for the same type.

Spec compliance

nim-json-serialization implements the RFC 8259 JSON spec and passes the standard JSON conformance test suites.

Switchable features

Many of these switchable features are widely used in various projects but are not part of standard JSON. You can control them with the following flags (see the sketch after this list):

  • allowUnknownFields[=off]: allow unknown fields to be skipped instead of throwing an error.
  • requireAllFields[=off]: if one of the required fields is missing, the serializer will throw an error.
  • escapeHex[=off]: JSON doesn't support the \xHH escape sequence, but it is common in many languages.
  • relaxedEscape[=off]: by default only '0x00'..'0x1F' may be preceded by the escape char \\; turn this on and you can escape any character.
  • portableInt[=off]: restrict integers to the range -2**53 + 1 .. +2**53 - 1.
  • trailingComma[=on]: allow a trailing comma after the last object member or array element.
  • allowComments[=on]: the JSON standard doesn't mention comments. Turn this on to parse both C-style //..EOL and /* .. */ comments.
  • leadingFraction[=on]: something like .123 is not a valid JSON number, but its widespread usage sometimes creeps into JSON documents.
  • integerPositiveSign[=on]: +123 is also not a valid JSON number, but since -123 is a valid JSON number, why not parse it safely?
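
A minimal sketch of toggling such flags, assuming allowUnknownFields and requireAllFields can be passed as named arguments to decode (other flags may instead be controlled through the configuration described in the following sections):

import json_serialization

type
  Config = object
    host: string
    port: int

let raw = """{"host": "localhost", "port": 8080, "extra": true}"""

# Assumption: these flags are accepted as named arguments here.
# Without allowUnknownFields, the unknown `extra` field would raise an error.
let cfg = Json.decode(raw, Config,
                      allowUnknownFields = true,
                      requireAllFields = true)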

Safety features

You can modify these default configurations to suit your needs.

  • nestedDepthLimit: 512: maximum depth of the nested structure, counting both object and array nesting (0 = disable).
  • arrayElementsLimit: 0: maximum number of allowed array elements (0 = disable).
  • objectMembersLimit: 0: maximum number of key-value pairs in an object (0 = disable).
  • integerDigitsLimit: 128: maximum number of digits in the integer part of a JSON number.
  • fractionDigitsLimit: 128: maximum number of digits in the fraction part of a JSON number.
  • exponentDigitsLimit: 32: maximum number of digits in the exponent part of a JSON number.
  • stringLengthLimit: 0: maximum number of bytes in a string (0 = disable).

Special types

  • JsonString: use this type if you want to parse a JSON value into a valid JSON document contained in a string (see the sketch after this list).
  • JsonVoid: use this type to skip a valid JSON value.
  • JsonNumber: use this type to parse a valid JSON number, including the fraction and exponent parts.
    • Please note that this type is generic; it supports uint64 and string as the generic parameter.
    • The generic parameter defines whether the integer and exponent parts are stored as uint64 or string.
    • If the generic parameter is uint64, overflow can happen, or the max digit limit will apply.
    • If the generic parameter is string, only the max digit limit applies.
    • The fraction part is always a string, to preserve the leading zeros of the fractional number.
  • JsonValueRef: use this type to parse any valid JSON value into something like stdlib JsonNode.
    • JsonValueRef uses JsonNumber instead of int or float like stdlib JsonNode does.
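
A small sketch of how JsonString and JsonValueRef compose; it assumes JsonString is a distinct string and that JsonValueRef takes uint64 or string as its generic parameter:

import json_serialization

# Capture an arbitrary JSON value verbatim as a valid document string.
let doc = Json.decode("""{"a": [1, 2.5e10, null]}""", JsonString)

# The captured document can be parsed again without losing information.
# Assumption: JsonValueRef[uint64] is the concrete instantiation to use here.
let value = Json.decode(string(doc), JsonValueRef[uint64])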

Flavor

While flags and limits are runtime configuration, a flavor is a powerful compile-time mechanism to prevent cross-contamination between different custom serializers operating on the same type. For example, a json-rpc subsystem and a json-rest subsystem may have different custom serializers for the same UInt256.

A JSON flavor makes sure the compiler picks the right serializer for the right subsystem. You can use useDefaultSerializationIn to add the serializers of a flavor to a specific type.

# These are the parameters you can pass to `createJsonFlavor` to create a new flavor.

  FlavorName: untyped
  mimeTypeValue = "application/json"
  automaticObjectSerialization = false
  requireAllFields = true
  omitOptionalFields = true
  allowUnknownFields = true
  skipNullFields = false

type
  OptionalFields = object
    one: Opt[string]
    two: Option[int]

createJsonFlavor OptJson
OptionalFields.useDefaultSerializationIn OptJson

omitOptionalFields is used by the Writer to ignore fields with a null value, while skipNullFields is used by the Reader to ignore fields with a null value.
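
Once created, a flavor is used just like the built-in Json format. A sketch continuing the snippet above (Opt comes from the results package, and Option field support may require importing json_serialization/std/options):

let original = OptionalFields(one: Opt.some("hello"))  # `two` is left at none
let encoded = OptJson.encode(original)                 # `two` is omitted while omitOptionalFields is on
let decoded = OptJson.decode(encoded, OptionalFields)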

Decoder example

  type
    NimServer = object
      name: string
      port: int

    MixedServer = object
      name: JsonValueRef
      port: int

    StringServer = object
      name: JsonString
      port: JsonString

  # decode into native Nim
  var nim_native = Json.decode(rawJson, NimServer)

  # decode into mixed Nim + JsonValueRef
  var nim_mixed = Json.decode(rawJson, MixedServer)

  # decode any value into string
  var nim_string = Json.decode(rawJson, StringServer)

  # decode any valid JSON
  var json_value = Json.decode(rawJson, JsonValueRef)

Load and save

  var server = Json.loadFile("filename.json", Server)
  var server_string = Json.loadFile("filename.json", JsonString)

  Json.saveFile("filename.json", server)

Objects

Decoding an object can be achieved via the parseObject template. To parse the value, you can use one of the helper functions or readValue. The readObject and readObjectFields iterators are also handy when creating a custom object parser (see the sketch after the example below).

proc readValue*(r: var JsonReader, table: var Table[string, int]) =
  parseObject(r, key):
    table[key] = r.parseInt(int)
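
The readObject iterator mentioned above offers an alternative formulation of the same parser; it decodes both the key and the value for us (a minimal sketch):

import std/tables, json_serialization

proc readValue*(r: var JsonReader, table: var Table[string, int]) =
  # readObject yields (key, value) pairs, decoding both to the requested types.
  for key, value in r.readObject(string, int):
    table[key] = value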

Sets and list-like

Similar to objects, sets and other list- or array-like data structures can be parsed using the parseArray template. It comes in two variations, indexed and non-indexed.

Built-in readValue for regular seq and array is implemented for you. No built-in readValue is provided for set or set-like types; you must overload it yourself depending on your needs.

type
  HoldArray = object
    data: array[3, int]

  HoldSeq = object
    data: seq[int]

  WelderFlag = enum
    TIG
    MIG
    MMA

  Welder = object
    flags: set[WelderFlag]

proc readValue*(r: var JsonReader, value: var HoldArray) =
  # parseArray with index, `i` can be any valid identifier
  r.parseArray(i):
    value.data[i] = r.parseInt(int)

proc readValue*(r: var JsonReader, value: var HoldSeq) =
  # parseArray without index
  r.parseArray:
    let lastPos = value.data.len
    value.data.setLen(lastPos + 1)
    readValue(r, value.data[lastPos])

proc readValue*(r: var JsonReader, value: var Welder) =
  # populating set also okay
  r.parseArray:
    value.flags.incl r.parseInt(int).WelderFlag

Custom iterators

Using these custom iterators, you can have access to sub-token elements.

customIntValueIt(r: var JsonReader; body: untyped)
customNumberValueIt(r: var JsonReader; body: untyped)
customStringValueIt(r: var JsonReader; limit: untyped; body: untyped)
customStringValueIt(r: var JsonReader; body: untyped)

Convenience iterators

readArray(r: var JsonReader, ElemType: typedesc): ElemType
readObjectFields(r: var JsonReader, KeyType: type): KeyType
readObjectFields(r: var JsonReader): string
readObject(r: var JsonReader, KeyType: type, ValueType: type): (KeyType, ValueType)
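
For example, readArray can drive an alternative parser for the HoldSeq type from the previous section (a minimal sketch):

proc readValue*(r: var JsonReader, value: var HoldSeq) =
  # readArray decodes each element into the requested type while iterating.
  for elem in r.readArray(int):
    value.data.add elem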

Helper procs

When crafting a custom serializer, use these parsers; they are safe and intuitive. Avoid using the lexer directly. An example follows the list.

tokKind(r: var JsonReader): JsonValueKind
parseString(r: var JsonReader, limit: int): string
parseString(r: var JsonReader): string
parseBool(r: var JsonReader): bool
parseNull(r: var JsonReader)
parseNumber(r: var JsonReader, T: type): JsonNumber[T: string or uint64]
parseNumber(r: var JsonReader, val: var JsonNumber)
toInt(r: var JsonReader, val: JsonNumber, T: type SomeInteger, portable: bool): T
parseInt(r: var JsonReader, T: type SomeInteger, portable: bool = false): T
toFloat(r: var JsonReader, val: JsonNumber, T: type SomeFloat): T
parseFloat(r: var JsonReader, T: type SomeFloat): T
parseAsString(r: var JsonReader, val: var string)
parseAsString(r: var JsonReader): JsonString
parseValue(r: var JsonReader, T: type): JsonValueRef[T: string or uint64]
parseValue(r: var JsonReader, val: var JsonValueRef)
parseArray(r: var JsonReader; body: untyped)
parseArray(r: var JsonReader; idx: untyped; body: untyped)
parseObject(r: var JsonReader, key: untyped, body: untyped)
parseObjectWithoutSkip(r: var JsonReader, key: untyped, body: untyped)
parseObjectSkipNullFields(r: var JsonReader, key: untyped, body: untyped)
parseObjectCustomKey(r: var JsonReader, keyAction: untyped, body: untyped)
parseJsonNode(r: var JsonReader): JsonNode
skipSingleJsValue(r: var JsonReader)
readRecordValue[T](r: var JsonReader, value: var T)
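
For example, a custom serializer that accepts either a JSON number or a numeric string can be built from tokKind, parseInt, and parseString. A sketch; the JsonValueKind member names and raiseUnexpectedValue are assumptions about the current API:

import std/strutils
import json_serialization

type Quantity = distinct uint64

proc readValue*(r: var JsonReader, value: var Quantity) =
  # Peek at the kind of the upcoming value without consuming it.
  case r.tokKind
  of JsonValueKind.Number:
    value = Quantity(r.parseInt(uint64))
  of JsonValueKind.String:
    # Assumption: numeric strings are plain base-10 unsigned integers.
    value = Quantity(parseBiggestUInt(r.parseString()))
  else:
    r.raiseUnexpectedValue("number or numeric string expected")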

Helper procs of JsonWriter

beginRecord(w: var JsonWriter, T: type)
beginRecord(w: var JsonWriter)
endRecord(w: var JsonWriter)

writeObject(w: var JsonWriter, T: type)
writeObject(w: var JsonWriter)

writeFieldName(w: var JsonWriter, name: string)
writeField(w: var JsonWriter, name: string, value: auto)

iterator stepwiseArrayCreation[C](w: var JsonWriter, collection: C): auto
writeIterable(w: var JsonWriter, collection: auto)
writeArray[T](w: var JsonWriter, elements: openArray[T])

writeNumber[F,T](w: var JsonWriter[F], value: JsonNumber[T])
writeJsonValueRef[F,T](w: var JsonWriter[F], value: JsonValueRef[T])
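
A typical custom writeValue built from these helpers, using the NimServer type from the decoder example above (a minimal sketch):

proc writeValue*(w: var JsonWriter, value: NimServer) =
  w.beginRecord()
  w.writeField("name", value.name)
  w.writeField("port", value.port)
  w.endRecord()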

License

Licensed and distributed under either of

  • MIT license: http://opensource.org/licenses/MIT

or

  • Apache License, Version 2.0: http://www.apache.org/licenses/LICENSE-2.0

at your option. These files may not be copied, modified, or distributed except according to those terms.

nim-json-serialization's People

Contributors

alehander92, arnetheduck, etan-status, gabrielmer, jangko, kdeme, menduist, mjfh, mratsim, planetis-m, stefantalpalaru, tersec, yyoncho, zah

nim-json-serialization's Issues

Test suite fails due to nim-serialization updates

As of status-im/nim-serialization#59:


/home/runner/.nimble/pkgs/serialization-0.2.0/serialization/object_serialization.nim(231, 11) readField
/home/runner/work/nim-json-serialization/nim-json-serialization/nim/lib/system/assertions.nim(38, 26) failedAssertImpl
/home/runner/work/nim-json-serialization/nim-json-serialization/nim/lib/system/assertions.nim(28, 11) raiseAssert
/home/runner/work/nim-json-serialization/nim-json-serialization/nim/lib/system/fatal.nim(54, 5) sysFatal
/home/runner/work/nim-json-serialization/nim-json-serialization/tests/test_serialization.nim(187, 26) template/generic instantiation of `executeReaderWriterTests` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization/testing/generic_suite.nim(371, 24) template/generic instantiation of `executeRoundtripTests` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization/testing/generic_suite.nim(275, 10) template/generic instantiation of `test` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization/testing/generic_suite.nim(281, 17) template/generic instantiation of `roundtrip` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization/testing/generic_suite.nim(198, 22) template/generic instantiation of `roundtripChecks` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization/testing/generic_suite.nim(165, 25) template/generic instantiation of `decode` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization.nim(49, 13) template/generic instantiation of `readValue` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization.nim(30, 9) template/generic instantiation of `readValue` from here
/home/runner/work/nim-json-serialization/nim-json-serialization/json_serialization/reader.nim(584, 28) template/generic instantiation of `readValue` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization.nim(30, 9) template/generic instantiation of `readValue` from here
/home/runner/work/nim-json-serialization/nim-json-serialization/json_serialization/reader.nim(651, 24) template/generic instantiation of `fieldReadersTable` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization/object_serialization.nim(256, 34) template/generic instantiation of `makeFieldReadersTable` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization/object_serialization.nim(218, 26) template/generic instantiation of `enumAllSerializedFields` from here
/home/runner/.nimble/pkgs/serialization-0.2.0/serialization/object_serialization.nim(152, 32) template/generic instantiation of `enumAllSerializedFieldsImpl` from here
/home/runner/work/nim-json-serialization/nim-json-serialization/nim/lib/system/fatal.nim(54, 5) Error: unhandled exception: /home/runner/.nimble/pkgs/serialization-0.2.0/serialization/object_serialization.nim(231, 19) `not isCaseObject(typeof(obj))` Case object `CaseObject` must have custom `readValue` for `ReaderType` [AssertionDefect]

Skew under version 0.1.0 causing nimble problems

This package breaks the latest version of chronicles (0.9.2) because it's not being properly versioned. It appears that the code was updated for faststreams 0.2.0, but the version of this package was not bumped. Fixing it involves removing json_serialization and reinstalling it to pick up the changes for faststreams 0.2.0 so chronicles can compile.

Error: undeclared identifier: 'findFieldIdx'

/mnt/c/Users/low/.nimble/pkgs/json_serialization-0.1.0/json_serialization/reader.nim(638, 26) Error: undeclared identifier: 'findFieldIdx' candidates (edit distance, scope distance); see '--spellSuggest': (5, 6): 'findFieldReader' [proc declared in /mnt/c/Users/low/.nimble/pkgs/serialization-0.1.0/serialization/object_serialization.nim(239, 6)]

This happens when building nimbus-eth1.

"RangeError" on 32-bit Windows

https://ci.appveyor.com/project/nimbus/nim-json-serialization/builds/25368677/job/s4lxyyer9s2qw9bc :

[Suite] lexer tests
  [OK] object with simple fields
test_lexer.nim(13)       test_lexer
utils.nim(11)            dedent
system.nim(2924)         sysFatal
    Unhandled exception: value out of range: 99999999999 [RangeError]
  [FAILED] int literal
test_lexer.nim(13)       test_lexer
utils.nim(11)            dedent
system.nim(2924)         sysFatal
    Unhandled exception: value out of range: 99999999999 [RangeError]
  [FAILED] float literal
test_lexer.nim(13)       test_lexer
utils.nim(11)            dedent
system.nim(2924)         sysFatal
    Unhandled exception: value out of range: 99999999999 [RangeError]
  [FAILED] string literal

bad unexpectedField error message generated by object/tuple reader

This line is using r.lexer.strVal:

r.raiseUnexpectedField(r.lexer.strVal, typeName)

But the intended field name has already been replaced by the preceding calls to lexer.next() and r.skipToken tkColon.

Instead of reporting the correct field name, that line generates an error message containing the field value:

(52, 15) Unexpected field '0x0' while deserializing Genesis

Compilation error with ref type

import json_serialization

type
    DummyRef* = ref object of RootObj
        a: float
var dr = new(DummyRef)
dr.a = 100225.00
echo "resref:\n\t", dr.toJson 

nim-stew\stew\shims\macros.nim(43, 28) Error: index 2 not in 0 .. -1

using case object as object field's type cannot be serialized

import
  strutils, unittest,
  serialization/object_serialization,
  serialization/testing/generic_suite,
  ../json_serialization, ./utils,
  ../json_serialization/std/[options, sets, tables]

type
  MyKind = enum
    Apple
    Banana
    
  MyCaseObject = object
    name: string
    case kind: MyKind
    of Banana: banana: int
    of Apple: apple: string
    
  MyUseCaseObject = object
    field: MyCaseObject

test "case object as field":
    let y = MyUseCaseObject(field: MyCaseObject(name: "hello", kind: Apple, apple: "world"))    
    check:
       y == Json.decode(Json.encode(y), MyUseCaseObject)

error message:

F:\projects\nim-json-serialization\tests\test_serialization.nim(142, 8) template/generic instantiation of `test` from here
F:\projects\nim-json-serialization\tests\test_serialization.nim(145, 5) template/generic instantiation of `check` from here
F:\projects\nim-json-serialization\tests\test_serialization.nim(146, 9) template/generic instantiation of `check` from here
C:\Users\Jangko\.nimble\pkgs\serialization-0.1.0\serialization.nim(41, 13) template/generic instantiation of `encodeImpl` from here
C:\Users\Jangko\.nimble\pkgs\serialization-0.1.0\serialization.nim(27, 9) template/generic instantiation of `writeValue` from here
F:\projects\nim-json-serialization\json_serialization\writer.nim(189, 10) template/generic instantiation of `enumInstanceSerializedFields` from here
F:\projects\nim-json-serialization\json_serialization\writer.nim(192, 8) template/generic instantiation of `writeFieldIMPL` from here
C:\Users\Jangko\.nimble\pkgs\serialization-0.1.0\serialization\object_serialization.nim(174, 9) template/generic instantiation of `writeValue` from here
F:\projects\nim-json-serialization\json_serialization\writer.nim(189, 10) template/generic instantiation of `enumInstanceSerializedFields` from here
C:\Users\Jangko\.nimble\pkgs\serialization-0.1.0\serialization\object_serialization.nim(42, 34) template/generic instantiation of `hasCustomPragmaFixed` from here
F:\projects\nimbus_nim\lib\system\fatal.nim(39, 5) Error: unhandled exception: C:\Users\Jangko\.nimble\pkgs\stew-0.1.0\stew\shims\macros.nim(131, 18) `false`  [AssertionError]

The problem can be traced back to hasCustomPragma and recordFields in nim-stew.
Serializing a case object is somewhat easy, but deserializing one goes in the opposite direction: it can be tricky or even hard to do properly without manual intervention.

JSON literal `null` not decodable

Parsing a JSON literal null fails with illegal storage access (lldb trace below):

import json_serialization

type
  RpcResponse* = ref object
    jsonrpc*: string
    result*: string
    id*: int
    error*: RpcError

let responseStr = callPrivateRPC("wallet_getCustomTokens", %* []) # returns {"jsonrpc":"2.0","id":0,"result":null}
let response = Json.decode(responseStr, RpcResponse) # <=== Fails here with a SIGSEGV: Illegal storage access

However, this works:

import json_serialization

type
  RpcResponse* = ref object
    jsonrpc*: string
    result*: ref string
    id*: int
    error*: RpcError

let responseStr = callPrivateRPC("wallet_getCustomTokens", %* []) # returns {"jsonrpc":"2.0","id":0,"result":null}
let response = RpcResponse(result: $(responseStr.parseJSON()["result"])) # <=== null get parsed in to a string

lldb trace for the failure:

>>> [libstatus/tokens.getCustomTokens] response: {"jsonrpc":"2.0","id":0,"result":null}
ERR 2020-10-07 15:18:47+11:00 TODO: if user is logged in, logout         topics="main" tid=1306387 file=nim_status_client.nim:109
QtQml was compiled with optimization - stepping may behave oddly; variables may not be available.
Process 23032 stopped
* thread #1, queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=EXC_I386_GPFLT)
    frame #0: 0x00000001015db6d9 QtQml`QQmlData::disconnectNotifiers() [inlined] QScopedPointer<QObjectData, QScopedPointerDeleter<QObjectData> >::operator->(this=0xaaaaaaaaaaaaaab2) const at qscopedpointer.h:118:16 [opt]
Target 0: (nim_status_client) stopped.

Strange dependency tree

I'm not sure why nim-json-serialization depends on BearSSL or Chronos; that seems very wrong.


Compilation error: cannot decode Table type

import tables, json_serialization

let tbl: Table[int, string] = [(1, "one")].toTable
let encoded = Json.encode(tbl)

debugEcho("encoded: ", encoded)

let decoded = Json.decode(encoded, Table[int, string])

This causes a compilation error:

Error: undeclared field: 'data' for type readValue.T [declared in /Users/emizzle/repos/github.com/emizzle/nim-test-env/vendor/nim-json-serialization/json_serialization/reader.nim(491, 10)] 

Repro:

git clone https://github.com/emizzle/nim-test-env
cd nim-test-env
make update && make json_ser

implicit conversion to `cstring`

/home/runner/work/nimbus-eth2/nimbus-eth2/vendor/nim-json-serialization/json_serialization/reader.nim(596, 43) Warning: implicit conversion to 'cstring' from a non-const location: r.lexer.strVal; this will become a compile time error in the future [CStringConv]

Support for ref object targets

Currently, deserialization of ref objects fails with a strange error message.

Test case:

import
  serialization, json_serialization


type
  TestSuite = object
    title: string
    test_suite: string
    test_cases: seq[TestCase]

  TestCase = ref object # Change this to object for correct support
    name: string

proc parseTestSuite*(jsonPath: string): TestSuite =
  try:
    result = Json.loadFile(jsonPath, TestSuite)
  except SerializationError as err:
    stderr.write "Json load issue for file \"", jsonPath, "\"\n"
    stderr.write err.formatMsg(jsonPath), "\n"
    quit 1

when isMainModule:
  const JsonPath = "build/json_test.json"
  let testSuite = parseTestSuite(JsonPath)

  echo testSuite
{"title":"Foo","test_suite":"MyTestSuite","test_cases":[{"name":"Foo"},{"name":"Bar"}]}

Error

Json load issue for file "build/json_test.json"
build/json_test.json(0, 57) Unexpected token 'tkCurlyLe' in place of 'array end bracker'
Error: execution of an external program failed: './build/json_test '

Need to add feature where field with null value is ignored/skipped

In json-rpc, an object field receiving a null value is valid JSON.

for example:

type
   ChildObject = object
       name: string

   MyType = object
       list: seq[int]
       child: ChildObject

The following JSON is valid, and its meaning is roughly equal to {}; the list and child fields can simply be ignored.

{"list": null, "child": null}

Using Option[seq[int]] and Option[ChildObject] is a workaround, but it's not very ergonomic.
Potentially, every field of an object can be null. This should not trigger an error in an RPC server unless the field is mandatory.
That's why it should be a feature of the object parser, not of the custom serializer, and the feature should be switchable by the user.

The implementation should parse the key and the ":" separator, and then peek at the next token category, which only needs a lookahead of (whitespace + 1).

If it is a null, the object parser should skip the value processing and just parse the null value.
If it is not a null, the object parser can continue as usual.

Theoretically, this change is also backward compatible with existing json-serialization and json-rpc usage.
But it will break implementations that demand the presence of the object field even though the value is null.
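
In the meantime, a custom parser can approximate this behavior with the existing helpers (the skipNullFields flavor option described earlier now covers the reader side). A sketch; the JsonValueKind member names are an assumption:

proc readValue*(r: var JsonReader, value: var MyType) =
  parseObject(r, key):
    if r.tokKind == JsonValueKind.Null:
      r.parseNull()              # consume the null, keep the field's default value
    else:
      case key
      of "list": readValue(r, value.list)
      of "child": readValue(r, value.child)
      else: r.skipSingleJsValue()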

Combining ref types with generics causes a compilation error

import json
import json_serialization

type
  RpcResponse*[T] = ref object
    result*: T
    id*: int
    error*: string

when isMainModule:
  echo("Hello, World!")
  let jsn = %* {
    "result": "0x0",
    "id": 0
  }
  echo Json.decode($jsn, RpcResponse[string])

Error:

/usr/local/Cellar/nim/1.2.0/nim/lib/system.nim(850, 11) Error: invalid type: 'T' in this context: 'ref RpcResponse:ObjectType' for var

Unicode support improvements

According to https://www.ietf.org/rfc/rfc4627.txt, JSON text can be encoded in UTF-8, UTF-16, or UTF-32.

The default encoding of JSON is UTF-8, and in practice the most widespread encoding is a subset of UTF-8.

The unicode text I'm speaking about is specific to JSON strings within two quotation marks, not the JSON document itself.

Currently, the unicode support in JsonReader can read escaped unicode codepoints, including surrogate pairs.
But the JsonWriter does not encode unicode codepoints.

For seriously unicode-heavy JSON text, using escaped unicode codepoints is cumbersome and inefficient.
We can improve our support for the full range of UTF-8. The choice of UTF-8 is obvious: Nim strings are encoded in UTF-8, and UTF-8 is easier to handle with existing Nim libraries.

Unfortunately, the stdlib unicode module has incorrect edge cases that we found during the development of nim-websock.
We also have a collection of UTF-8, UTF-16, and UTF-32 utilities in various repos. We can collect them and polish them in nim-stew.

Support union types

Requests in nim-json-rpc:

I will call it a union.

While we wait for nim-lang/RFCs#548 to materialize, we can provide something ahead of the Nim compiler.
Because the usage is domain specific, we don't need the full feature set the compiler would offer for handling unions.
The idea is to create special types recognized by the library, along with a set of APIs to handle the union.
The approach could be similar to nim-ssz-serialization's union, or to nim-json-serialization's flavors.
Whatever we choose, the end product should require little intervention from the user and should be similar to what the compiler offers.
When the compiler ships the new feature, migration or combined usage should be easy.

An important feature of this ability is to warn or error out if the user combines ambiguous types, e.g.

json_union(UnionType):
    uint64
    int64

Because both int64 and uint64 parse from the same JSON number, such a union is invalid for json-serialization even though it is valid Nim. The same applies to objects: if their sorted lists of field names are identical, they form an invalid union.

The hardest part may be the pattern matching that maps a literal JSON value to the target type. The order of fields can differ from the target type, but if it's valid JSON, it should still map to the target type.
