Compare commits: rust-prese...main (112 commits)
Author | SHA1 | Date |
---|---|---|
Tony Garnock-Jones | 685302f547 | |
Tony Garnock-Jones | f83e67899e | |
Tony Garnock-Jones | f2331c0e1e | |
Tony Garnock-Jones | f628e7d31f | |
Tony Garnock-Jones | b767fa4eb0 | |
Tony Garnock-Jones | 58110e7c0c | |
Tony Garnock-Jones | 58ebc93eb5 | |
Tony Garnock-Jones | f18ba9c9d4 | |
Tony Garnock-Jones | 87ecdb7efe | |
Tony Garnock-Jones | 0533840bc0 | |
Tony Garnock-Jones | 77c16df89b | |
Tony Garnock-Jones | 35e6ba2e82 | |
Tony Garnock-Jones | 536e32b0e8 | |
Tony Garnock-Jones | 9192bdea7e | |
Tony Garnock-Jones | 19ac8d16c8 | |
Tony Garnock-Jones | 7f284a9d52 | |
Tony Garnock-Jones | cadf54b927 | |
Tony Garnock-Jones | a8b300e57d | |
Tony Garnock-Jones | 4e5e64f0a6 | |
Tony Garnock-Jones | c8ce125192 | |
Tony Garnock-Jones | c9fa9c590b | |
Tony Garnock-Jones | d568fc56ce | |
Tony Garnock-Jones | 42f4672446 | |
Tony Garnock-Jones | 1c86d8b7c5 | |
Tony Garnock-Jones | 64c1090938 | |
Tony Garnock-Jones | dc61963e16 | |
Tony Garnock-Jones | a33786e469 | |
Tony Garnock-Jones | 23ba2e5a59 | |
Tony Garnock-Jones | 7b8e0ff4b6 | |
Tony Garnock-Jones | 3f7819fafa | |
Tony Garnock-Jones | f5d76a847b | |
Tony Garnock-Jones | 443406a7d7 | |
Tony Garnock-Jones | 05103e9825 | |
Tony Garnock-Jones | 4f75d6d5a3 | |
Tony Garnock-Jones | 07b7739d00 | |
Tony Garnock-Jones | 8f3d22adf1 | |
Tony Garnock-Jones | 8222675b6b | |
Tony Garnock-Jones | fca4b3a22e | |
Tony Garnock-Jones | c5dd2d749a | |
Tony Garnock-Jones | c986ca76cf | |
Tony Garnock-Jones | 3e67c75427 | |
Tony Garnock-Jones | 6bc159e3c6 | |
Tony Garnock-Jones | 99d1acdec7 | |
Tony Garnock-Jones | 3093b89f0d | |
Tony Garnock-Jones | 00c0de40ea | |
Tony Garnock-Jones | 7657952993 | |
Tony Garnock-Jones | 9ecbd0bdd1 | |
Tony Garnock-Jones | 297e1630a8 | |
Tony Garnock-Jones | 85ca0b6c0a | |
Tony Garnock-Jones | 7c9c410a9b | |
Tony Garnock-Jones | cbbc6c50c0 | |
Tony Garnock-Jones | eb4f456550 | |
Tony Garnock-Jones | 4f4ff6e108 | |
Tony Garnock-Jones | 055a7f90e9 | |
Tony Garnock-Jones | b2f6149042 | |
Tony Garnock-Jones | 1bd4a3cdb4 | |
Tony Garnock-Jones | dc0ddf95dd | |
Tony Garnock-Jones | eeace57670 | |
Tony Garnock-Jones | 0aa39da971 | |
Tony Garnock-Jones | f0815ce4eb | |
Tony Garnock-Jones | afba8a0bff | |
Tony Garnock-Jones | d9ec3bfb14 | |
Tony Garnock-Jones | 95ac4b13df | |
Tony Garnock-Jones | 3eeee5f090 | |
Tony Garnock-Jones | aeacce22fc | |
Tony Garnock-Jones | 0726684ab5 | |
Tony Garnock-Jones | f74c4ebaf0 | |
Tony Garnock-Jones | 48a063539a | |
Tony Garnock-Jones | db96fcc95a | |
Tony Garnock-Jones | cee4a25460 | |
Tony Garnock-Jones | 7948ad4260 | |
Tony Garnock-Jones | 3d3c79e617 | |
Tony Garnock-Jones | b925b53756 | |
Tony Garnock-Jones | c0289e0a05 | |
Tony Garnock-Jones | 41189f551d | |
Tony Garnock-Jones | cc8313cf25 | |
Tony Garnock-Jones | bfbff65bb6 | |
Tony Garnock-Jones | 442a987523 | |
Tony Garnock-Jones | f45b136ef5 | |
Tony Garnock-Jones | 73c6593f84 | |
Tony Garnock-Jones | a9e226f759 | |
Tony Garnock-Jones | 33db0b8718 | |
Tony Garnock-Jones | e923d87fa5 | |
Tony Garnock-Jones | 83697b0e56 | |
Tony Garnock-Jones | 1798e64615 | |
Tony Garnock-Jones | be32f9b7c8 | |
Tony Garnock-Jones | dc1b0ac54d | |
Tony Garnock-Jones | d579a0d607 | |
Tony Garnock-Jones | 7178fb0d9b | |
Tony Garnock-Jones | 4c0bd3b9d7 | |
Tony Garnock-Jones | b98f434ac9 | |
Tony Garnock-Jones | 61cec52d46 | |
Tony Garnock-Jones | f6ddf0ca3b | |
Tony Garnock-Jones | 9c7770a54f | |
Tony Garnock-Jones | cd29602761 | |
Tony Garnock-Jones | c411e47d7f | |
Tony Garnock-Jones | 897fc13054 | |
Tony Garnock-Jones | 9420cc7236 | |
Tony Garnock-Jones | e0ef236001 | |
Tony Garnock-Jones | 634b263ed2 | |
Tony Garnock-Jones | 2a6d0912b6 | |
Tony Garnock-Jones | 4a656dc929 | |
Tony Garnock-Jones | 2532b42959 | |
Tony Garnock-Jones | b12d49739c | |
Tony Garnock-Jones | aea735bb4e | |
Tony Garnock-Jones | 9b71388817 | |
Tony Garnock-Jones | b6ac046ba7 | |
Tony Garnock-Jones | 7b3731a5e4 | |
Tony Garnock-Jones | 185c233b2f | |
Tony Garnock-Jones | 22b2f162bc | |
Tony Garnock-Jones | f664399a8c | |
Tony Garnock-Jones | cd504becf7 |
Makefile (1 change)

@@ -22,4 +22,3 @@ test-all:
 	(cd implementations/javascript; npm test)
 	(cd implementations/python; make test)
 	(cd implementations/racket/preserves; make testonly)
-	(cd implementations/rust; cargo test)
NOTICE (2 changes)

@@ -1,2 +1,2 @@
 Preserves: an Expressive Data Language
-Copyright 2018-2022 Tony Garnock-Jones
+Copyright 2018-2024 Tony Garnock-Jones
README.md (18 changes)

@@ -38,14 +38,14 @@ automatic, perfect-fidelity conversion between syntaxes.
 
 #### Implementations of the data model, plus Preserves textual and binary transfer syntax
 
 | Language[^pre-alpha-implementations] | Code | Package | Docs |
-|-----------------------|------------------------------------------------------------------------------|--------------------------------------------------------------------------------|-------------------------------------------|
+|--------------------------------------|------------------------------------------------------------------------------|--------------------------------------------------------------------------------|-------------------------------------------|
 | Nim | [git.syndicate-lang.org](https://git.syndicate-lang.org/ehmry/preserves-nim) | | |
 | Python | [preserves.dev]({{page.projecttree}}/implementations/python/) | [`pip install preserves`](https://pypi.org/project/preserves/) | [docs](python/latest/) |
 | Racket | [preserves.dev]({{page.projecttree}}/implementations/racket/preserves/) | [`raco pkg install preserves`](https://pkgs.racket-lang.org/package/preserves) | |
-| Rust | [preserves.dev]({{page.projecttree}}/implementations/rust/) | [`cargo add preserves`](https://crates.io/crates/preserves) | [docs](https://docs.rs/preserves/latest/) |
+| Rust | [preserves.dev](https://gitlab.com/preserves/preserves-rs/) | [`cargo add preserves`](https://crates.io/crates/preserves) | [docs](https://docs.rs/preserves/latest/) |
 | Squeak Smalltalk | [SqueakSource](https://squeaksource.com/Preserves.html) | `Installer ss project: 'Preserves';`<br>` install: 'Preserves'` | |
 | TypeScript/JavaScript | [preserves.dev]({{page.projecttree}}/implementations/javascript/) | [`yarn add @preserves/core`](https://www.npmjs.com/package/@preserves/core) | |
 
 [^pre-alpha-implementations]: Pre-alpha implementations also exist for
 [C]({{page.projecttree}}/implementations/c/) and

@@ -88,6 +88,6 @@ Tony Garnock-Jones <tonyg@leastfixedpoint.com>
 The contents of this repository are made available to you under the
 [Apache License, version 2.0](LICENSE)
 (<http://www.apache.org/licenses/LICENSE-2.0>), and are Copyright
-2018-2022 Tony Garnock-Jones.
+2018-2024 Tony Garnock-Jones.
 
 ## Notes

@@ -105,7 +105,7 @@ A few more interesting differences:
     {"dictionaries": "as keys???"}: "well, why not?"}
 ```
 
-Preserves technically provides a few types of numbers:
+Preserves technically provides various types of numbers:
 
 ```
 # Signed Integers

@@ -114,9 +114,6 @@ Preserves technically provides a few types of numbers:
 5907212309572059846509324862304968273468909473609826340
 -5907212309572059846509324862304968273468909473609826340
 
-# Floats (Single-precision IEEE floats) (notice the trailing f)
-3.1415927f
-
 # Doubles (Double-precision IEEE floats)
 3.141592653589793
 ```
@@ -13,5 +13,5 @@ defaults:
     layout: page
 
 title: "Preserves"
-version_date: "October 2023"
-version: "0.992.0"
+version_date: "March 2024"
+version: "0.995.0"
@@ -1,5 +1,9 @@
 [
-{"version":"0.992.2","title":"0.992.2","aliases":["latest"]},
+{"version":"0.995.1","title":"0.995.1","aliases":["latest"]},
+{"version":"0.995.0","title":"0.995.0","aliases":[]},
+{"version":"0.994.0","title":"0.994.0","aliases":[]},
+{"version":"0.993.0","title":"0.993.0","aliases":[]},
+{"version":"0.992.2","title":"0.992.2","aliases":[]},
 {"version":"0.992.1","title":"0.992.1","aliases":[]},
 {"version":"0.992.0","title":"0.992.0","aliases":[]},
 {"version":"0.991.0","title":"0.991.0","aliases":[]},
@@ -5,9 +5,8 @@ For a value `V`, we write `«V»` for the binary encoding of `V`.
 «#t» = [0x81]
 
 «@W V» = [0x85] ++ «W» ++ «V»
-«#!V» = [0x86] ++ «V»
+«#:V» = [0x86] ++ «V»
 
-«V» if V ∈ Float = [0x87, 0x04] ++ binary32(V)
 «V» if V ∈ Double = [0x87, 0x08] ++ binary64(V)
 
 «V» if V ∈ SignedInteger = [0xB0] ++ varint(|intbytes(V)|) ++ intbytes(V)

@@ -29,5 +28,4 @@ For a value `V`, we write `«V»` for the binary encoding of `V`.
                        signedBigEndian(n >> 8) ++ [n & 255] otherwise
 ```
 
-The functions `binary32(F)` and `binary64(D)` yield big-endian 4- and
-8-byte IEEE 754 binary representations of `F` and `D`, respectively.
+The function `binary64(D)` yields the big-endian 8-byte IEEE 754 binary representation of `D`.
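The encoding equations in this hunk translate almost line-for-line into code. A minimal Python sketch of just the rules shown here (`#t`, doubles, and signed integers), assuming the usual base-128 little-endian varint for length prefixes; `#f`, the zero-integer special case, and all other tags are omitted:

```python
import struct

def signed_big_endian(n: int) -> bytes:
    # [n & 255] if -128 <= n <= 127; signedBigEndian(n >> 8) ++ [n & 255] otherwise
    if -128 <= n <= 127:
        return bytes([n & 255])
    return signed_big_endian(n >> 8) + bytes([n & 255])

def varint(n: int) -> bytes:
    # Assumed: little-endian base-128 groups, high bit set on all but the last.
    out = []
    while n >= 128:
        out.append((n & 127) | 128)
        n >>= 7
    out.append(n)
    return bytes(out)

def encode(v) -> bytes:
    if v is True:
        return bytes([0x81])                               # «#t» = [0x81]
    if isinstance(v, float):
        return bytes([0x87, 0x08]) + struct.pack(">d", v)  # [0x87, 0x08] ++ binary64(V)
    if isinstance(v, int):
        ib = signed_big_endian(v)                          # intbytes(V), nonzero case only
        return bytes([0xB0]) + varint(len(ib)) + ib
    raise NotImplementedError(v)

print(encode(True).hex())   # 81
print(encode(10).hex())     # b0010a
print(encode(-300).hex())   # b002fed4
```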
@@ -8,10 +8,9 @@ class="postcard-grammar binarysyntax">*V*</span>.
 
 {:.postcard-grammar.binarysyntax}
 «`@`*W* *V*» | = | `85` «*W*» «*V*»
-«`#!`*V*» | = | `86` «*V*»
+«`#:`*V*» | = | `86` «*V*»
 
 {:.postcard-grammar.binarysyntax}
-«*V*» | = | `87``04` **binary32**(*V*) | if *V* ∈ Float
 «*V*» | = | `87``08` **binary64**(*V*) | if *V* ∈ Double
 
 {:.postcard-grammar.binarysyntax}

@@ -37,10 +36,9 @@ class="postcard-grammar binarysyntax">*V*</span>.
 **signedBigEndian**(*n*) | = | <span class="outputish">*n* & 255</span> | if −128 ≤ *n* ≤ 127
  | | **signedBigEndian**(*n* >> 8) <span class="outputish">*n* & 255</span> | otherwise
 
-The functions <span class="postcard-grammar binarysyntax">**binary32**(*F*)</span> and <span
-class="postcard-grammar binarysyntax">**binary64**(*D*)</span> yield big-endian 4- and 8-byte
-IEEE 754 binary representations of <span class="postcard-grammar binarysyntax">*F*</span> and
-<span class="postcard-grammar binarysyntax">*D*</span>, respectively.
+The function <span class="postcard-grammar binarysyntax">**binary64**(*D*)</span> yields the
+big-endian 8-byte IEEE 754 binary representation of <span class="postcard-grammar
+binarysyntax">*D*</span>.
 
 <!--
 Together, <span class="postcard-grammar binarysyntax">**div**</span> and <span
|
@ -15,7 +15,7 @@ Set := `#{` Expr* Trailer ws `}`
|
||||||
|
|
||||||
Trailer := (ws Annotation)*
|
Trailer := (ws Annotation)*
|
||||||
|
|
||||||
Embedded := `#!` SimpleExpr
|
Embedded := `#:` SimpleExpr
|
||||||
Annotated := Annotation SimpleExpr
|
Annotated := Annotation SimpleExpr
|
||||||
Annotation := `@` SimpleExpr | `#` ((space | tab) linecomment) (cr | lf)
|
Annotation := `@` SimpleExpr | `#` ((space | tab | `!`) linecomment) (cr | lf)
|
||||||
```
|
```
|
||||||
|
|
|
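The change to the `Annotation` production (allowing `!` as well as space or tab after `#`) is what makes a leading shebang line lexable instead of a syntax error. A rough Python sketch of just that comment branch, with hypothetical helper names, not the shipped reader:

```python
import re

# `#` ((space | tab | `!`) linecomment) (cr | lf) — the `@` SimpleExpr branch
# of Annotation is handled elsewhere and not modelled here.
COMMENT = re.compile(r"#([ \t!])([^\r\n]*)(?:\r|\n)")

def read_comment_annotation(text: str):
    m = COMMENT.match(text)
    if m is None:
        return None
    # A `!` introducer marks an interpreter (shebang) line; space/tab a comment.
    kind = "interpreter" if m.group(1) == "!" else "comment"
    return (kind, m.group(2))

print(read_comment_annotation("# plain comment\n"))  # ('comment', 'plain comment')
print(read_comment_annotation("#!/bin/sh\n"))        # ('interpreter', '/bin/sh')
```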
@@ -18,6 +18,6 @@ The definitions of `Atom`, `ws`, and `linecomment` are as given in the Preserves
 | *Trailer* | := | (**ws** *Annotation*)<sup>⋆</sup>
 
 {:.postcard-grammar.textsyntax}
-| *Embedded* | := | `#!` *SimpleExpr*
+| *Embedded* | := | `#:` *SimpleExpr*
 | *Annotated* | := | *Annotation* *SimpleExpr*
-| *Annotation* | := | `@` *SimpleExpr* | `#` ((**space** | **tab**) *linecomment*) (**cr** | **lf**)
+| *Annotation* | := | `@` *SimpleExpr* | `#` ((**space** | **tab** | `!`) *linecomment*) (**cr** | **lf**)
@@ -9,9 +9,9 @@ Set := `#{` (commas Value)* commas `}`
 Dictionary := `{` (commas Value ws `:` Value)* commas `}`
 commas := (ws `,`)* ws
 
-Embedded := `#!` Value
+Embedded := `#:` Value
 Annotated := Annotation Value
-Annotation := `@` Value | `#` ((space | tab) linecomment) (cr | lf)
+Annotation := `@` Value | `#` ((space | tab | `!`) linecomment) (cr | lf)
 
 Atom := Boolean | ByteString | String | QuotedSymbol | Symbol | Number
 Boolean := `#t` | `#f`

@@ -21,8 +21,7 @@ ByteString := `#"` binchar* `"`
 String := `"` («any unicode scalar except `\` or `"`» | escaped | `\"`)* `"`
 QuotedSymbol := `|` («any unicode scalar except `\` or `|`» | escaped | `\|`)* `|`
 Symbol := (`A`..`Z` | `a`..`z` | `0`..`9` | sympunct | symuchar)+
-Number := Float | Double | SignedInteger
-Float := flt (`f`|`F`) | `#xf"` (ws hex hex)4 ws `"`
+Number := Double | SignedInteger
 Double := flt | `#xd"` (ws hex hex)8 ws `"`
 SignedInteger := int
 
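Under the surviving `Double` rule, the `#xd"…"` form spells a double as its eight big-endian IEEE 754 bytes. A small Python sketch of decoding that hex form (helper name hypothetical):

```python
import struct

def parse_hex_double(text: str) -> float:
    # Expects the literal form #xd"XX XX XX XX XX XX XX XX"
    # (whitespace between the hex byte pairs is optional).
    assert text.startswith('#xd"') and text.endswith('"')
    raw = bytes.fromhex("".join(text[4:-1].split()))
    assert len(raw) == 8
    return struct.unpack(">d", raw)[0]

print(parse_hex_double('#xd"3ff0000000000000"'))  # 1.0
print(parse_hex_double('#xd"400921fb54442d18"'))  # 3.141592653589793
```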
@@ -11,9 +11,9 @@
 | **commas** | := | (**ws** `,`)<sup>⋆</sup> **ws** |
 
 {:.postcard-grammar.textsyntax}
-| *Embedded* | := | `#!`*Value* |
+| *Embedded* | := | `#:`*Value* |
 | *Annotated* | := | *Annotation* *Value* |
-| *Annotation* | := | `@`*Value* |`#` ((**space** | **tab**) *linecomment*) (**cr** | **lf**) |
+| *Annotation* | := | `@`*Value* |`#` ((**space** | **tab** | `!`) *linecomment*) (**cr** | **lf**) |
 
 {:.postcard-grammar.textsyntax}
 | *Atom* | := | *Boolean* | *ByteString* | *String* | *QuotedSymbol* | *Symbol* | *Number* |

@@ -22,8 +22,7 @@
 | *String* | := | `"` (« any unicode scalar value except `\` or `"` » | *escaped* |`\"`)<sup>⋆</sup> `"` |
 | *QuotedSymbol* | := | `|` (« any unicode scalar value except `\` or `|` » | *escaped* |`\|`)<sup>⋆</sup> `|` |
 | *Symbol* | := | (`A`..`Z`|`a`..`z`|`0`..`9`| *sympunct* | *symuchar*)<sup>+</sup> |
-| *Number* | := | *Float* | *Double* | *SignedInteger* |
-| *Float* | := | *flt* (`f`|`F`) |`#xf"` (**ws** *hex* *hex*)<sup>4</sup> **ws**`"` |
+| *Number* | := | *Double* | *SignedInteger* |
 | *Double* | := | *flt* |`#xd"` (**ws** *hex* *hex*)<sup>8</sup> **ws**`"` |
 | *SignedInteger* | := | *int* |
 
@@ -1,5 +1,5 @@
 Python's strings, byte strings, integers, booleans, and double-precision floats stand directly
-for their Preserves counterparts. Wrapper objects for [Float][preserves.values.Float] and
+for their Preserves counterparts. Wrapper objects for
 [Symbol][preserves.values.Symbol] complete the suite of atomic types.
 
 Python's lists and tuples correspond to Preserves `Sequence`s, and dicts and sets to
@@ -2,7 +2,6 @@ Here are a few example values, written using the [text
 syntax](https://preserves.dev/preserves-text.html):
 
    Boolean : #t #f
-   Float : 1.0f 10.4e3f -100.6f
    Double : 1.0 10.4e3 -100.6
    Integer : 1 0 -100
    String : "Hello, world!\n"

@@ -12,6 +11,6 @@ syntax](https://preserves.dev/preserves-text.html):
    Sequence : [value1 value2 ...]
    Set : #{value1 value2 ...}
    Dictionary : {key1: value1 key2: value2 ...: ...}
-   Embedded : #!value
+   Embedded : #:value
 
 Commas are optional in sequences, sets, and dictionaries.
@@ -4,7 +4,6 @@
        | Embedded
 
 Atom = Boolean
-     | Float
      | Double
      | SignedInteger
      | String
@@ -38,7 +38,7 @@ representations of their keys.[^no-need-for-by-value]
 **Other kinds of `Value`.**
 There are no special canonicalization restrictions on
 `SignedInteger`s, `String`s, `ByteString`s, `Symbol`s, `Boolean`s,
-`Float`s, `Double`s, `Record`s, `Sequence`s, or `Embedded`s. The
+`Double`s, `Record`s, `Sequence`s, or `Embedded`s. The
 constraints given for these `Value`s in the [specification][spec]
 suffice to ensure canonicity.
 
@@ -23,10 +23,10 @@ Appropriately-labelled `Record`s denote these domain-specific data
 types.[^why-dictionaries]
 
 [^why-dictionaries]: Given `Record`'s existence, it may seem odd
-    that `Dictionary`, `Set`, `Float`, etc. are given special
+    that `Dictionary`, `Set`, `Double`, etc. are given special
     treatment. Preserves aims to offer a useful basic equivalence
     predicate to programmers, and so if a data type demands a special
-    equivalence predicate, as `Dictionary`, `Set` and `Float` all do,
+    equivalence predicate, as `Dictionary`, `Set` and `Double` all do,
     then the type should be included in the base language. Otherwise,
     it can be represented as a `Record` and treated separately.
     `Boolean`, `String` and `Symbol` are seeming exceptions. The first
@@ -99,6 +99,22 @@ the usual `@`-prefixed annotation notation can also be used.
     #x"0C0D0E0F"
 ]
 
+## Interpreter specification lines ("shebang" lines).
+
+Unix systems interpret `#!` at the beginning of an executable file specially. The text
+following `#!` on the first line is interpreted as a specification for an interpreter for the
+executable file. Preserves offers special support for `#!`, reading it similarly to a comment,
+but producing an `<interpreter ...>` annotation instead of a string.
+
+For example,
+
+    #!/usr/bin/preserves-tool convert
+    [1, 2, 3]
+
+is read as
+
+    @<interpreter "/usr/bin/preserves-tool convert"> [1, 2, 3]
+
 ## MIME-type tagged binary data.
 
 Many internet protocols use
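The shebang behaviour added in this hunk can be imitated in a few lines: peel off a leading `#!` line and surface it as an interpreter annotation alongside the remaining input. A toy Python sketch (the real readers attach a proper `<interpreter ...>` record; here it is just a tuple):

```python
def split_shebang(src: str):
    # If the input starts with `#!`, treat the first line as an interpreter
    # specification and annotate the rest of the document with it.
    if src.startswith("#!"):
        first, _, rest = src.partition("\n")
        return ("interpreter", first[2:]), rest
    return None, src

ann, body = split_shebang("#!/usr/bin/preserves-tool convert\n[1, 2, 3]\n")
print(ann)   # ('interpreter', '/usr/bin/preserves-tool convert')
print(body)  # [1, 2, 3]
```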
@@ -0,0 +1,2 @@
+/build.python/
+/*.deb
@@ -0,0 +1,10 @@
+PYTHON_PACKAGEVERSION := $(shell ../implementations/python/print-package-version)
+
+all: python3-preserves_$(PYTHON_PACKAGEVERSION)_all.deb
+
+python3-preserves_%_all.deb:
+	./build-python-deb
+
+clean:
+	rm -f *.deb
+	rm -rf build.*
@@ -0,0 +1,37 @@
+#!/bin/sh
+
+set -e
+
+PYTHON_PACKAGEVERSION=$(../implementations/python/print-package-version)
+DIRTY=
+if ! git diff-index --quiet HEAD --
+then
+    DIRTY=+
+fi
+GITSUFFIX=$(git log --date=format:%Y%m%d%H%M%S --pretty=~git%cd.%h -1)
+VERSION=${PYTHON_PACKAGEVERSION}${GITSUFFIX}${DIRTY}
+
+echo "Building deb for ${VERSION}"
+(cd ../implementations/python && . ./.envrc && make build-only)
+
+rm -rf build.python
+mkdir build.python
+
+(
+    cd build.python
+    tar -zxvf ../../implementations/python/dist/preserves-${PYTHON_PACKAGEVERSION}.tar.gz
+    (
+        cd preserves-${PYTHON_PACKAGEVERSION}
+        cp -a ../../python ./debian
+        cat > ./debian/changelog <<EOF
+preserves (${VERSION}) UNRELEASED; urgency=low
+
+  * Unofficial debian packaging of Python Preserves
+
+ -- Tony Garnock-Jones <tonyg@leastfixedpoint.com> $(date --rfc-email)
+EOF
+
+        dpkg-buildpackage
+    )
+    cp *.deb ..
+)
@@ -0,0 +1,14 @@
+Source: preserves
+Section: python
+Priority: optional
+Maintainer: Tony Garnock-Jones <tonyg@leastfixedpoint.com>
+Build-Depends: debhelper-compat (= 12), python3, dh-python, pybuild-plugin-pyproject
+Standards-Version: 3.9.3
+Homepage: https://preserves.dev/
+Vcs-Git: https://gitlab.com/preserves/preserves.git
+Vcs-Browser: https://gitlab.com/preserves/preserves
+
+Package: python3-preserves
+Architecture: all
+Depends: ${python3:Depends}
+Description: Python implementation of the Preserves data language
@@ -0,0 +1,6 @@
+#!/usr/bin/make -f
+#export DH_VERBOSE=1
+export PYBUILD_NAME=preserves
+
+%:
+	dh $@ --with python3 --buildsystem=pybuild
@@ -31,7 +31,6 @@ fi
 # Ensure that various copies of schema.prs, schema.bin, path.bin,
 # samples.pr and samples.bin are in fact identical.
 ${COMMAND} path/path.bin implementations/python/preserves/path.prb
-${COMMAND} path/path.bin implementations/rust/preserves-path/path.bin
 
 ${COMMAND} schema/schema.bin implementations/python/preserves/schema.prb
 ${COMMAND} schema/schema.prs implementations/racket/preserves/preserves-schema/schema.prs

@@ -40,11 +39,4 @@ ${COMMAND} tests/samples.bin implementations/python/tests/samples.bin
 ${COMMAND} tests/samples.pr implementations/python/tests/samples.pr
 ${COMMAND} tests/samples.pr implementations/racket/preserves/preserves/tests/samples.pr
 
-${COMMAND} _includes/what-is-preserves.md implementations/rust/preserves/doc/what-is-preserves.md
-${COMMAND} _includes/cheatsheet-binary-plaintext.md implementations/rust/preserves/doc/cheatsheet-binary-plaintext.md
-${COMMAND} _includes/cheatsheet-text-plaintext.md implementations/rust/preserves/doc/cheatsheet-text-plaintext.md
-${COMMAND} _includes/value-grammar.md implementations/rust/preserves/doc/value-grammar.md
-
-${COMMAND} _includes/what-is-preserves-schema.md implementations/rust/preserves-schema/doc/what-is-preserves-schema.md
-
 [ -z "$failed" ]
@@ -13,9 +13,9 @@ Here you may find:
 - [racket](racket/), an implementation for Racket 7.x and newer
   (though older Rackets may also work with it).
 
-- [rust](rust/), an implementation for Rust that interoperates with
-  serde.
-
 Other implementations are also available:
 
+- [Preserves for Rust](https://gitlab.com/preserves/preserves-rs/), an implementation for Rust
+  that interoperates with serde.
+
 - [Preserves for Squeak Smalltalk](https://squeaksource.com/Preserves.html)
@@ -261,7 +261,6 @@ PRESERVES_OUTOFLINE
 
 typedef enum preserves_type_tag {
     PRESERVES_BOOLEAN = 0,
-    PRESERVES_FLOAT,
     PRESERVES_DOUBLE,
 
     PRESERVES_SIGNED_INTEGER,

@@ -283,7 +282,6 @@ typedef enum preserves_type_tag {
 PRESERVES_OUTOFLINE(char const *preserves_type_tag_name(preserves_type_tag_t type), {
     switch (type) {
     case PRESERVES_BOOLEAN: return "BOOLEAN";
-    case PRESERVES_FLOAT: return "FLOAT";
     case PRESERVES_DOUBLE: return "DOUBLE";
     case PRESERVES_SIGNED_INTEGER: return "SIGNED_INTEGER";
     case PRESERVES_STRING: return "STRING";
@@ -366,7 +364,6 @@ PRESERVES_OUTOFLINE
 
 /*
   PRESERVES_BOOLEAN: repr==PRESERVES_REPR_NONE, len=0, data._boolean
-  PRESERVES_FLOAT: repr=PRESERVES_REPR_NONE, len=0, data._float
   PRESERVES_DOUBLE: repr=PRESERVES_REPR_NONE, len=0, data._double
 
   PRESERVES_SIGNED_INTEGER:
@@ -418,7 +415,6 @@ typedef struct preserves_index_entry {
 
     union {
         bool _boolean;
-        float _float;
         double _double;
         int64_t _signed;
         uint64_t _unsigned;
@@ -818,18 +814,6 @@ PRESERVES_OUTOFLINE
     uint8_t *bs = _preserves_reader_next_bytes(r, len);
     if (bs == NULL) return _preserves_reader_finish(r, PRESERVES_END_INCOMPLETE_INPUT);
     switch (len) {
-    case 4: {
-        uint32_t i;
-        memcpy(&i, bs, 4);
-        i = ntohl(i);
-        float f;
-        memcpy(&f, &i, 4);
-        RETURN_ON_FAIL(_preserves_reader_emit_entry(r, &count, (preserves_index_entry_t) {
-            .type = PRESERVES_FLOAT, .repr = PRESERVES_REPR_NONE, .len = 0, .data = {
-                ._float = f
-            }}));
-        break;
-    }
     case 8: {
         uint32_t lo, hi;
         memcpy(&hi, bs, 4);
@ -995,10 +979,6 @@ PRESERVES_IMPLEMENTATION_CHUNK
|
||||||
fprintf(f, i->data._boolean ? " #t" : " #f");
|
fprintf(f, i->data._boolean ? " #t" : " #f");
|
||||||
break;
|
break;
|
||||||
|
|
||||||
case PRESERVES_FLOAT:
|
|
||||||
fprintf(f, " %f", i->data._float);
|
|
||||||
break;
|
|
||||||
|
|
||||||
case PRESERVES_DOUBLE:
|
case PRESERVES_DOUBLE:
|
||||||
fprintf(f, " %f", i->data._double);
|
fprintf(f, " %f", i->data._double);
|
||||||
break;
|
break;
|
||||||
|
|
|
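The reader hunks above parse an `Ieee754` payload by reinterpreting big-endian bytes as a floating-point value (the C code does it with `memcpy` and `ntohl`). A standalone sketch of the same bit-reinterpretation, using `DataView` (function name hypothetical, not part of any of the libraries in this diff):

```typescript
// Reinterpret 8 big-endian bytes as an IEEE 754 double, mirroring the
// memcpy-based approach the C reader uses for the 8-byte case.
function doubleFromBytes(bs: Uint8Array): number {
    const view = new DataView(bs.buffer, bs.byteOffset, bs.byteLength);
    return view.getFloat64(0, false); // false = read big-endian
}

// 0x3FF8000000000000 is the IEEE 754 encoding of 1.5:
doubleFromBytes(Uint8Array.of(0x3f, 0xf8, 0, 0, 0, 0, 0, 0)); // → 1.5
```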
@@ -41,15 +41,6 @@ namespace Preserves {
             decodeEmbedded(decodeEmbedded)
         {}
 
-        boost::optional<float> next_float() {
-            uint8_t buf[4];
-            if (!next_chunk(buf, sizeof(buf))) return boost::none;
-            uint32_t n = buf[0] << 24 | buf[1] << 16 | buf[2] << 8 | buf[3];
-            float f;
-            memcpy(&f, &n, sizeof(f));
-            return f;
-        }
-
         boost::optional<double> next_double() {
             uint8_t buf[8];
             if (!next_chunk(buf, sizeof(buf))) return boost::none;
@@ -113,7 +104,6 @@ namespace Preserves {
                     return BinaryReader<>(i).next().map(decodeEmbedded).map(Value<T>::from_embedded);
                 case BinaryTag::Ieee754: return varint(i).flat_map([&](size_t len)-> boost::optional<Value<T>> {
                     switch (len) {
-                        case 4: return next_float().map(Value<T>::from_float);
                         case 8: return next_double().map(Value<T>::from_double);
                         default: return boost::none;
                     }
@@ -151,19 +151,6 @@ namespace Preserves {
             return (*this) << (b ? BinaryTag::True : BinaryTag::False);
         }
 
-        BinaryWriter& operator<<(float f) {
-            uint32_t n;
-            memcpy(&n, &f, sizeof(f));
-            uint8_t buf[4];
-            buf[0] = (n >> 24) & 0xff;
-            buf[1] = (n >> 16) & 0xff;
-            buf[2] = (n >> 8) & 0xff;
-            buf[3] = (n) & 0xff;
-            (*this) << BinaryTag::Ieee754;
-            put(uint8_t(sizeof(buf)));
-            return write(buf, sizeof(buf));
-        }
-
         BinaryWriter& operator<<(double d) {
             uint64_t n;
             memcpy(&n, &d, sizeof(d));
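The removed `operator<<(float)` above packed the four bytes big-endian by hand with shifts and masks. The same packing for a double, sketched with `DataView` (hypothetical helper, not the library's API):

```typescript
// Serialize a double to 8 big-endian bytes, as the writer does for the
// Ieee754 payload (the C++ code does the equivalent with shifts/masks).
function doubleToBytes(d: number): Uint8Array {
    const bs = new Uint8Array(8);
    new DataView(bs.buffer).setFloat64(0, d, false); // false = big-endian
    return bs;
}

doubleToBytes(1.5); // → bytes 0x3f 0xf8 0x00 … 0x00
```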
@@ -35,13 +35,6 @@ namespace Preserves {
         BinaryWriter& write(BinaryWriter& w) const override {
             return w << this->_value();
         });
-    PRESERVES_ATOMIC_VALUE_CLASS(Float, float, float, ValueKind::Float, as_float,
-        BinaryWriter& write(BinaryWriter& w) const override {
-            return w << this->_value();
-        }
-        boost::optional<double> as_double() const override {
-            return this->value;
-        });
     PRESERVES_ATOMIC_VALUE_CLASS(Double, double, double, ValueKind::Double, as_double,
         BinaryWriter& write(BinaryWriter& w) const override {
             return w << this->_value();
@@ -57,13 +50,6 @@ namespace Preserves {
                 return boost::none;
             }
         }
-        boost::optional<float> as_float() const override {
-            if (uint64_t(float(this->value)) == this->value) {
-                return float(this->value);
-            } else {
-                return boost::none;
-            }
-        }
         boost::optional<double> as_double() const override {
             if (uint64_t(double(this->value)) == this->value) {
                 return double(this->value);
@@ -82,13 +68,6 @@ namespace Preserves {
                 return boost::none;
             }
         }
-        boost::optional<float> as_float() const override {
-            if (int64_t(float(this->value)) == this->value) {
-                return float(this->value);
-            } else {
-                return boost::none;
-            }
-        }
         boost::optional<double> as_double() const override {
             if (int64_t(double(this->value)) == this->value) {
                 return double(this->value);
@@ -295,7 +274,6 @@ namespace Preserves {
         bool is_mutable() const override { return underlying.is_mutable(); }
 
         boost::optional<bool> as_bool() const override { return underlying.as_bool(); }
-        boost::optional<float> as_float() const override { return underlying.as_float(); }
         boost::optional<double> as_double() const override { return underlying.as_double(); }
         boost::optional<uint64_t> as_unsigned() const override { return underlying.as_unsigned(); }
         boost::optional<int64_t> as_signed() const override { return underlying.as_signed(); }
@@ -355,12 +333,6 @@ namespace Preserves {
         return Value<T>(new Boolean<T>(b));
     }
 
-    template <typename T>
-    Value<T> Value<T>::from_float(float f)
-    {
-        return Value<T>(new Float<T>(f));
-    }
-
     template <typename T>
     Value<T> Value<T>::from_double(double d)
    {
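The `as_float`/`as_double` overrides above only answer when the stored integer survives a round trip through the target precision (`int64_t(float(v)) == v`). The same exact-representability test can be sketched in TypeScript with `Math.fround`, which rounds a number to single precision (function name hypothetical):

```typescript
// Return v as a single-precision value only if it is exactly representable
// at that precision, mirroring the round-trip check in the C++ hunks above.
function asFloat(v: number): number | undefined {
    return Math.fround(v) === v ? v : undefined;
}

asFloat(16777216); // 2^24 is exactly representable as a float
asFloat(16777217); // 2^24 + 1 rounds to 2^24, so → undefined
```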
@@ -14,7 +14,6 @@ namespace Preserves {
 
     enum class ValueKind {
         Boolean,
-        Float,
         Double,
         SignedInteger,
         String,
@@ -43,7 +42,6 @@ namespace Preserves {
         std::shared_ptr<ValueImpl<T>> _impl() const { return p; }
 
         static Value from_bool(bool b);
-        static Value from_float(float f);
         static Value from_double(double d);
         static Value from_int(uint64_t i);
         static Value from_int(int64_t i);
@@ -67,11 +65,9 @@ namespace Preserves {
 
         static Value from_number(uint64_t i) { return from_int(i); }
         static Value from_number(int64_t i) { return from_int(i); }
-        static Value from_number(float f) { return from_float(f); }
         static Value from_number(double d) { return from_double(d); }
 
         static Value from(bool b) { return from_bool(b); }
-        static Value from(float f) { return from_float(f); }
         static Value from(double d) { return from_double(d); }
         static Value from(uint64_t i) { return from_int(i); }
         static Value from(unsigned i) { return from_int(uint64_t(i)); }
@@ -95,7 +91,6 @@ namespace Preserves {
         bool is_mutable() const;
 
         bool is_bool() const { return value_kind() == ValueKind::Boolean; }
-        bool is_float() const { return value_kind() == ValueKind::Float; }
         bool is_double() const { return value_kind() == ValueKind::Double; }
         bool is_int() const { return value_kind() == ValueKind::SignedInteger; }
         bool is_string() const { return value_kind() == ValueKind::String; }
@@ -109,9 +104,6 @@ namespace Preserves {
         boost::optional<bool> as_bool() const;
         bool to_bool() const { return as_bool().value(); }
 
-        boost::optional<float> as_float() const;
-        float to_float() const { return as_float().value(); }
-
         boost::optional<double> as_double() const;
         double to_double() const { return as_double().value(); }
 
@@ -175,7 +167,6 @@ namespace Preserves {
         virtual bool is_mutable() const { return false; }
 
         virtual boost::optional<bool> as_bool() const { return boost::none; }
-        virtual boost::optional<float> as_float() const { return boost::none; }
         virtual boost::optional<double> as_double() const { return boost::none; }
         virtual boost::optional<uint64_t> as_unsigned() const { return boost::none; }
         virtual boost::optional<int64_t> as_signed() const { return boost::none; }
@@ -219,7 +210,6 @@ namespace Preserves {
 #define PRESERVES_DELEGATE_CAST(t, name) \
     template <typename T> boost::optional<t> Value<T>::name() const { return p->name(); }
     PRESERVES_DELEGATE_CAST(bool, as_bool);
-    PRESERVES_DELEGATE_CAST(float, as_float);
     PRESERVES_DELEGATE_CAST(double, as_double);
     PRESERVES_DELEGATE_CAST(uint64_t, as_unsigned);
     PRESERVES_DELEGATE_CAST(int64_t, as_signed);
@@ -265,7 +255,6 @@ namespace Preserves {
             if (bKind < aKind) return false;
             switch (aKind) {
                 case ValueKind::Boolean: return a.to_bool() < b.to_bool();
-                case ValueKind::Float: return a.to_float() < b.to_float();
                 case ValueKind::Double: return a.to_double() < b.to_double();
                 case ValueKind::SignedInteger: {
                     if (auto av = a.as_signed()) {
@@ -41,7 +41,7 @@ let render
               )
               m
           ++ " }"
-      , embedded = λ(value : Text) → "#!${value}"
+      , embedded = λ(value : Text) → "#:${value}"
       }
 
 let Preserves/boolean = ./boolean.dhall
@@ -94,7 +94,7 @@ let example0 =
             )}
           ''
         ≡ ''
-        { a: 1 b: [ 2 3 ] c: { d: 1.0 e: -1.0 } d: #!#t e: <capture <_>> }
+        { a: 1 b: [ 2 3 ] c: { d: 1.0 e: -1.0 } d: #:#t e: <capture <_>> }
         ''
 
 in render
@@ -1,6 +1,6 @@
 {
   "name": "@preserves/core",
-  "version": "0.992.1",
+  "version": "0.995.206",
   "description": "Preserves data serialization format",
   "homepage": "https://gitlab.com/preserves/preserves",
   "license": "Apache-2.0",
@ -1,6 +1,6 @@
|
||||||
import { Tag } from "./constants";
|
import { Tag } from "./constants";
|
||||||
import { is, isAnnotated, IsPreservesAnnotated } from "./is";
|
import { is, isAnnotated, IsPreservesAnnotated } from "./is";
|
||||||
import type { GenericEmbedded } from "./embedded";
|
import type { Embeddable, GenericEmbedded } from "./embedded";
|
||||||
import type { Value } from "./values";
|
import type { Value } from "./values";
|
||||||
import type { Encoder, Preservable } from "./encoder";
|
import type { Encoder, Preservable } from "./encoder";
|
||||||
import type { Writer, PreserveWritable } from "./writer";
|
import type { Writer, PreserveWritable } from "./writer";
|
||||||
|
@ -52,7 +52,7 @@ export function formatPosition(p: Position | null | string): string {
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
export class Annotated<T = GenericEmbedded> implements Preservable<T>, PreserveWritable<T> {
|
export class Annotated<T extends Embeddable = GenericEmbedded> implements Preservable<T>, PreserveWritable<T> {
|
||||||
readonly annotations: Array<Value<T>>;
|
readonly annotations: Array<Value<T>>;
|
||||||
readonly pos: Position | null;
|
readonly pos: Position | null;
|
||||||
readonly item: Value<T>;
|
readonly item: Value<T>;
|
||||||
|
@ -67,7 +67,7 @@ export class Annotated<T = GenericEmbedded> implements Preservable<T>, PreserveW
|
||||||
return this;
|
return this;
|
||||||
}
|
}
|
||||||
|
|
||||||
static __from_preserve__<T>(v: Value<T>): undefined | Annotated<T> {
|
static __from_preserve__<T extends Embeddable>(v: Value<T>): undefined | Annotated<T> {
|
||||||
return isAnnotated<T>(v) ? v : void 0;
|
return isAnnotated<T>(v) ? v : void 0;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -109,21 +109,24 @@ export class Annotated<T = GenericEmbedded> implements Preservable<T>, PreserveW
|
||||||
return true;
|
return true;
|
||||||
}
|
}
|
||||||
|
|
||||||
static isAnnotated<T = GenericEmbedded>(x: any): x is Annotated<T> {
|
static isAnnotated<T extends Embeddable = GenericEmbedded>(x: any): x is Annotated<T> {
|
||||||
return isAnnotated(x);
|
return isAnnotated(x);
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
export function annotate<T = GenericEmbedded>(v0: Value<T>, ...anns: Value<T>[]): Annotated<T> {
|
export function annotate<T extends Embeddable = GenericEmbedded>(
|
||||||
|
v0: Value<T>,
|
||||||
|
...anns: Value<T>[]
|
||||||
|
): Annotated<T> {
|
||||||
const v = Annotated.isAnnotated<T>(v0) ? v0 : new Annotated(v0);
|
const v = Annotated.isAnnotated<T>(v0) ? v0 : new Annotated(v0);
|
||||||
anns.forEach((a) => v.annotations.push(a));
|
anns.forEach((a) => v.annotations.push(a));
|
||||||
return v;
|
return v;
|
||||||
}
|
}
|
||||||
|
|
||||||
export function annotations<T = GenericEmbedded>(v: Value<T>): Array<Value<T>> {
|
export function annotations<T extends Embeddable = GenericEmbedded>(v: Value<T>): Array<Value<T>> {
|
||||||
return Annotated.isAnnotated<T>(v) ? v.annotations : [];
|
return Annotated.isAnnotated<T>(v) ? v.annotations : [];
|
||||||
}
|
}
|
||||||
|
|
||||||
export function position<T = GenericEmbedded>(v: Value<T>): Position | null {
|
export function position<T extends Embeddable = GenericEmbedded>(v: Value<T>): Position | null {
|
||||||
return Annotated.isAnnotated<T>(v) ? v.pos : null;
|
return Annotated.isAnnotated<T>(v) ? v.pos : null;
|
||||||
}
|
}
|
||||||
|
|
|
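The recurring change in these TypeScript hunks is tightening every type parameter from `<T>` (or `<T = GenericEmbedded>`) to `<T extends Embeddable>`, so that `T` can no longer be instantiated with arbitrary types. A minimal standalone model of that tightening, with stand-in types (the real `Embeddable` and `GenericEmbedded` in the library are richer than these placeholders):

```typescript
// Stand-ins for the library's types, for illustration only.
interface Embeddable {}
class GenericEmbedded implements Embeddable {}

// Before the change: function f<T>(v: T) — any T accepted.
// After: the constraint plus a default, as in the diff's signatures.
function describe<T extends Embeddable = GenericEmbedded>(v: T): string {
    return v.constructor.name; // e.g. "GenericEmbedded"
}
```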
@ -0,0 +1,46 @@
|
||||||
|
const BASE64_DEC: {[key: string]: number} = {};
|
||||||
|
[... 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789'].forEach(
|
||||||
|
(c, i) => BASE64_DEC[c] = i);
|
||||||
|
BASE64_DEC['+'] = BASE64_DEC['-'] = 62;
|
||||||
|
BASE64_DEC['/'] = BASE64_DEC['_'] = 63;
|
||||||
|
|
||||||
|
export function decodeBase64(s: string): Uint8Array {
|
||||||
|
const bs = new Uint8Array(Math.floor(s.length * 3/4));
|
||||||
|
let i = 0;
|
||||||
|
let j = 0;
|
||||||
|
while (i < s.length) {
|
||||||
|
const v1 = BASE64_DEC[s[i++]];
|
||||||
|
const v2 = BASE64_DEC[s[i++]];
|
||||||
|
const v3 = BASE64_DEC[s[i++]];
|
||||||
|
const v4 = BASE64_DEC[s[i++]];
|
||||||
|
const v = (v1 << 18) | (v2 << 12) | (v3 << 6) | v4;
|
||||||
|
bs[j++] = (v >> 16) & 255;
|
||||||
|
if (v3 === void 0) break;
|
||||||
|
bs[j++] = (v >> 8) & 255;
|
||||||
|
if (v4 === void 0) break;
|
||||||
|
bs[j++] = v & 255;
|
||||||
|
}
|
||||||
|
return bs.subarray(0, j);
|
||||||
|
}
|
||||||
|
|
||||||
|
const BASE64_ENC = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/'
|
||||||
|
|
||||||
|
export function encodeBase64(bs: Uint8Array): string {
|
||||||
|
let s = '';
|
||||||
|
let buffer = 0;
|
||||||
|
let bitcount = 0;
|
||||||
|
for (let b of bs) {
|
||||||
|
buffer = ((buffer & 0x3f) << 8) | b;
|
||||||
|
bitcount += 8;
|
||||||
|
while (bitcount >= 6) {
|
||||||
|
bitcount -= 6;
|
||||||
|
const v = (buffer >> bitcount) & 0x3f;
|
||||||
|
s = s + BASE64_ENC[v];
|
||||||
|
}
|
||||||
|
}
|
||||||
|
if (bitcount > 0) {
|
||||||
|
const v = (buffer << (6 - bitcount)) & 0x3f;
|
||||||
|
s = s + BASE64_ENC[v];
|
||||||
|
}
|
||||||
|
return s;
|
||||||
|
}
|
|
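The new `encodeBase64` in the hunk above keeps a sliding bit accumulator and emits one output character per 6 bits, producing no `=` padding. A self-contained sketch of that technique (same alphabet as the diff; the function name here is hypothetical):

```typescript
const ALPHABET = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

// Unpadded base64 via a sliding bit accumulator, as in the diff's encoder.
function base64Unpadded(bs: Uint8Array): string {
    let out = '';
    let buffer = 0;   // holds not-yet-emitted bits
    let bitcount = 0; // number of valid bits in buffer
    for (const b of bs) {
        buffer = ((buffer & 0x3f) << 8) | b; // keep at most 6 leftover bits
        bitcount += 8;
        while (bitcount >= 6) {
            bitcount -= 6;
            out += ALPHABET[(buffer >> bitcount) & 0x3f];
        }
    }
    if (bitcount > 0) {
        // left-align the leftover bits in one final character; no '=' padding
        out += ALPHABET[(buffer << (6 - bitcount)) & 0x3f];
    }
    return out;
}

base64Unpadded(Uint8Array.of(77, 97, 110)); // "Man" → "TWFu"
```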
@@ -1,11 +1,12 @@
 import { Tag } from './constants';
-import { GenericEmbedded } from './embedded';
+import type { Embeddable, GenericEmbedded } from './embedded';
 import { Encoder, Preservable } from './encoder';
 import { Value } from './values';
-import { Writer, PreserveWritable } from './writer';
+import type { Writer, PreserveWritable } from './writer';
+import { decodeBase64, encodeBase64 } from './base64';
 
 const textEncoder = new TextEncoder();
-const textDecoder = new TextDecoder();
+const textDecoder = new TextDecoder('utf-8', { fatal: true });
 
 export const IsPreservesBytes = Symbol.for('IsPreservesBytes');
 
@@ -51,6 +52,22 @@ export class Bytes implements Preservable<any>, PreserveWritable<any> {
         return new Bytes(Uint8Array.of(...bytes));
     }
 
+    static fromLatin1(s: string): Bytes {
+        // Takes codepoints in [0..255] from s, treats them as bytes.
+        // Codepoints outside that range trigger an exception.
+        const result = new Bytes(s.length); // assume all the codepoints are OK
+        for (let i = 0; i < s.length; i++) {
+            const n = s.charCodeAt(i);
+            if (n >= 256) throw new Error("Codepoint out of range for 'latin1' byte encoding");
+            result._view[i] = n;
+        }
+        return result;
+    }
+
+    static fromBase64(s: string): Bytes {
+        return new Bytes(decodeBase64(s));
+    }
+
     static fromHex(s: string): Bytes {
         if (s.length & 1) throw new Error("Cannot decode odd-length hexadecimal string");
         const result = new Bytes(s.length >> 1);
@@ -131,14 +148,22 @@ export class Bytes implements Preservable<any>, PreserveWritable<any> {
         return textDecoder.decode(this._view);
     }
 
-    __as_preserve__<T = GenericEmbedded>(): Value<T> {
+    __as_preserve__<T extends Embeddable = GenericEmbedded>(): Value<T> {
         return this;
     }
 
-    static __from_preserve__<T>(v: Value<T>): undefined | Bytes {
+    static __from_preserve__<T extends Embeddable>(v: Value<T>): undefined | Bytes {
         return Bytes.isBytes(v) ? v : void 0;
     }
 
+    toLatin1(): string {
+        return String.fromCharCode.apply(null, this._view as any as number[]);
+    }
+
+    toBase64(): string {
+        return encodeBase64(this._view);
+    }
+
     toHex(digit = hexDigit): string {
         var nibbles = [];
         for (let i = 0; i < this.length; i++) {
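The new `fromLatin1`/`toLatin1` methods above treat each codepoint in [0..255] as a single byte, rejecting anything larger. A standalone sketch of that mapping (hypothetical helpers, not the `Bytes` API itself):

```typescript
// Latin-1 string → bytes: one byte per codepoint, error above 0xFF.
function latin1ToBytes(s: string): Uint8Array {
    const bs = new Uint8Array(s.length);
    for (let i = 0; i < s.length; i++) {
        const n = s.charCodeAt(i);
        if (n >= 256) throw new Error("Codepoint out of range for latin1");
        bs[i] = n;
    }
    return bs;
}

// Bytes → Latin-1 string: the inverse mapping.
function bytesToLatin1(bs: Uint8Array): string {
    return String.fromCharCode(...bs);
}
```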
@@ -1,12 +1,12 @@
 import type { Compound, Value } from "./values";
-import type { GenericEmbedded } from "./embedded";
+import type { Embeddable, GenericEmbedded } from "./embedded";
 import { Dictionary, Set } from "./dictionary";
 
-export function isCompound<T = GenericEmbedded>(x: Value<T>): x is Compound<T>
+export function isCompound<T extends Embeddable = GenericEmbedded>(x: Value<T>): x is Compound<T>
 {
     return (Array.isArray(x) || Set.isSet(x) || Dictionary.isDictionary(x));
 }
 
-export function isSequence<T = GenericEmbedded>(x: Value<T>): x is Array<Value<T>> {
+export function isSequence<T extends Embeddable = GenericEmbedded>(x: Value<T>): x is Array<Value<T>> {
     return (Array.isArray(x) && !('label' in x));
 }
@@ -1,24 +1,25 @@
 import { Annotated } from "./annotated";
 import { DecodeError, ShortPacket } from "./codec";
 import { Tag } from "./constants";
-import { Set, Dictionary } from "./dictionary";
-import { DoubleFloat, SingleFloat } from "./float";
+import { Set, Dictionary, DictionaryMap } from "./dictionary";
+import { DoubleFloat } from "./float";
 import { Record } from "./record";
 import { Bytes, BytesLike, underlying, hexDigit } from "./bytes";
 import { Value } from "./values";
 import { is } from "./is";
-import { embed, GenericEmbedded, Embedded, EmbeddedTypeDecode } from "./embedded";
-import { ReaderStateOptions } from "reader";
+import { GenericEmbedded, Embeddable, EmbeddedTypeDecode } from "./embedded";
+import { ReaderStateOptions } from "./reader";
+import { stringify } from "./text";
 
 export interface DecoderOptions {
     includeAnnotations?: boolean;
 }
 
-export interface DecoderEmbeddedOptions<T> extends DecoderOptions {
+export interface DecoderEmbeddedOptions<T extends Embeddable> extends DecoderOptions {
     embeddedDecode?: EmbeddedTypeDecode<T>;
 }
 
-export interface TypedDecoder<T> {
+export interface TypedDecoder<T extends Embeddable> {
     atEnd(): boolean;
 
     mark(): any;
@@ -26,14 +27,13 @@ export interface TypedDecoder<T> {
 
     skip(): void;
     next(): Value<T>;
-    withEmbeddedDecode<S, R>(
+    withEmbeddedDecode<S extends Embeddable, R>(
         embeddedDecode: EmbeddedTypeDecode<S>,
         body: (d: TypedDecoder<S>) => R): R;
 
     nextBoolean(): boolean | undefined;
-    nextFloat(): SingleFloat | undefined;
     nextDouble(): DoubleFloat | undefined;
-    nextEmbedded(): Embedded<T> | undefined;
+    nextEmbedded(): T | undefined;
     nextSignedInteger(): number | bigint | undefined;
     nextString(): string | undefined;
     nextByteString(): Bytes | undefined;
@@ -47,7 +47,7 @@ export interface TypedDecoder<T> {
     closeCompound(): boolean;
 }
 
-export function asLiteral<T, E extends Exclude<Value<T>, Annotated<T>>>(
+export function asLiteral<T extends Embeddable, E extends Exclude<Value<T>, Annotated<T>>>(
     actual: Value<T>,
     expected: E): E | undefined
 {
@@ -166,11 +166,11 @@ export class DecoderState {
         }
     }
 
-    wrap<T>(v: Value<T>): Value<T> {
+    wrap<T extends Embeddable>(v: Value<T>): Value<T> {
         return this.includeAnnotations ? new Annotated(v) : v;
     }
 
-    unshiftAnnotation<T>(a: Value<T>, v: Annotated<T>): Annotated<T> {
+    unshiftAnnotation<T extends Embeddable>(a: Value<T>, v: Annotated<T>): Annotated<T> {
         if (this.includeAnnotations) {
             v.annotations.unshift(a);
         }
@@ -188,7 +188,7 @@ export const neverEmbeddedTypeDecode: EmbeddedTypeDecode<never> = {
     },
 };
 
-export class Decoder<T = never> implements TypedDecoder<T> {
+export class Decoder<T extends Embeddable = never> implements TypedDecoder<T> {
     state: DecoderState;
     embeddedDecode: EmbeddedTypeDecode<T>;
 
@@ -218,13 +218,14 @@ export class Decoder<T = never> implements TypedDecoder<T> {
         return result;
     }
 
-    static dictionaryFromArray<T>(vs: Value<T>[]): Dictionary<T> {
-        const d = new Dictionary<T>();
+    static dictionaryFromArray<T extends Embeddable>(vs: Value<T>[]): Dictionary<T> {
+        const d = new DictionaryMap<T>();
         if (vs.length % 2) throw new DecodeError("Missing dictionary value");
         for (let i = 0; i < vs.length; i += 2) {
+            if (d.has(vs[i])) throw new DecodeError(`Duplicate key: ${stringify(vs[i])}`);
             d.set(vs[i], vs[i+1]);
         }
-        return d;
+        return d.simplifiedValue();
     }
 
     next(): Value<T> {
@@ -238,10 +239,9 @@ export class Decoder<T = never> implements TypedDecoder<T> {
                 const v = this.next() as Annotated<T>;
                 return this.state.unshiftAnnotation(a, v);
             }
-            case Tag.Embedded: return this.state.wrap<T>(embed(this.embeddedDecode.decode(this.state)));
+            case Tag.Embedded: return this.state.wrap<T>(this.embeddedDecode.decode(this.state));
             case Tag.Ieee754:
                 switch (this.state.varint()) {
-                    case 4: return this.state.wrap<T>(SingleFloat.fromBytes(this.state.nextbytes(4)));
                     case 8: return this.state.wrap<T>(DoubleFloat.fromBytes(this.state.nextbytes(8)));
                     default: throw new DecodeError("Invalid IEEE754 size");
                 }
@@ -255,7 +255,14 @@ export class Decoder<T = never> implements TypedDecoder<T> {
                 return this.state.wrap<T>(Record(vs[0], vs.slice(1)));
             }
             case Tag.Sequence: return this.state.wrap<T>(this.nextvalues());
-            case Tag.Set: return this.state.wrap<T>(new Set(this.nextvalues()));
+            case Tag.Set: {
+                const s = new Set<T>();
+                for (const v of this.nextvalues()) {
+                    if (s.has(v)) throw new DecodeError(`Duplicate value: ${stringify(v)}`);
+                    s.add(v);
+                }
+                return this.state.wrap<T>(s);
+            }
             case Tag.Dictionary: return this.state.wrap<T>(Decoder.dictionaryFromArray(this.nextvalues()));
             default: throw new DecodeError("Unsupported Preserves tag: " + tag);
         }
@@ -282,7 +289,7 @@ export class Decoder<T = never> implements TypedDecoder<T> {
         this.next();
     }
 
-    withEmbeddedDecode<S, R>(
+    withEmbeddedDecode<S extends Embeddable, R>(
         embeddedDecode: EmbeddedTypeDecode<S>,
         body: (d: TypedDecoder<S>) => R): R
     {
@@ -308,14 +315,6 @@ export class Decoder<T = never> implements TypedDecoder<T> {
         });
     }
 
-    nextFloat(): SingleFloat | undefined {
-        return this.skipAnnotations((reset) => {
-            if (this.state.nextbyte() !== Tag.Ieee754) return reset();
-            if (this.state.nextbyte() !== 4) return reset();
-            return SingleFloat.fromBytes(this.state.nextbytes(4));
-        });
-    }
-
     nextDouble(): DoubleFloat | undefined {
         return this.skipAnnotations((reset) => {
             if (this.state.nextbyte() !== Tag.Ieee754) return reset();
@@ -324,10 +323,10 @@ export class Decoder<T = never> implements TypedDecoder<T> {
         });
     }
 
-    nextEmbedded(): Embedded<T> | undefined {
+    nextEmbedded(): T | undefined {
         return this.skipAnnotations((reset) => {
             switch (this.state.nextbyte()) {
-                case Tag.Embedded: return embed(this.embeddedDecode.decode(this.state));
+                case Tag.Embedded: return this.embeddedDecode.decode(this.state);
                 default: return reset();
             }
         });
@@ -396,11 +395,16 @@ export class Decoder<T = never> implements TypedDecoder<T> {
     }
 }
 
-export function decode<T>(bs: BytesLike, options: DecoderEmbeddedOptions<T> = {}): Value<T> {
+export function decode<T extends Embeddable>(
+    bs: BytesLike,
+    options: DecoderEmbeddedOptions<T> = {},
+): Value<T> {
     return new Decoder(bs, options).next();
 }
 
-export function decodeWithAnnotations<T>(bs: BytesLike,
-                                         options: DecoderEmbeddedOptions<T> = {}): Annotated<T> {
+export function decodeWithAnnotations<T extends Embeddable>(
+    bs: BytesLike,
+    options: DecoderEmbeddedOptions<T> = {},
+): Annotated<T> {
     return decode(bs, { ... options, includeAnnotations: true }) as Annotated<T>;
 }
|
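The decoder methods above (`nextDouble`, `nextEmbedded`) share one pattern: read bytes speculatively inside `skipAnnotations`, and call `reset` to rewind and return `undefined` when the tag does not match. A minimal standalone sketch of that save-and-restore idea, using hypothetical names and a hypothetical tag byte rather than the library's actual `DecoderState`:

```typescript
// Speculative-read pattern: save the read position, attempt a parse,
// and rewind on mismatch (mirroring the `reset` callback above).
class ByteReader {
    index = 0;
    constructor(public readonly bytes: Uint8Array) {}
    nextbyte(): number { return this.bytes[this.index++]; }

    // Run `body`, handing it a `reset` that rewinds to the saved position
    // and signals failure by returning undefined.
    attempt<R>(body: (reset: () => undefined) => R | undefined): R | undefined {
        const saved = this.index;
        return body(() => { this.index = saved; return void 0; });
    }
}

const TAG_DOUBLE = 0x87; // hypothetical tag byte for this sketch

function tryReadDouble(r: ByteReader): number | undefined {
    return r.attempt((reset) => {
        if (r.nextbyte() !== TAG_DOUBLE) return reset();
        if (r.nextbyte() !== 8) return reset();
        const view = new DataView(r.bytes.buffer, r.bytes.byteOffset + r.index, 8);
        r.index += 8;
        return view.getFloat64(0, false); // big-endian IEEE 754 double
    });
}
```

On a mismatch the reader's position is restored, so a caller can try the next alternative against the same bytes.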
@@ -1,149 +1,404 @@
-import { Encoder, canonicalEncode, canonicalString } from "./encoder";
+import { Encoder, canonicalString } from "./encoder";
 import { Tag } from "./constants";
-import { FlexMap, FlexSet, _iterMap } from "./flex";
+import { FlexMap, FlexSet, _iterMap, IdentitySet, Equivalence, IsMap } from "./flex";
 import { Value } from "./values";
 import { Bytes } from './bytes';
-import { GenericEmbedded } from "./embedded";
+import { Embeddable, GenericEmbedded, isEmbedded } from "./embedded";
 import type { Preservable } from "./encoder";
 import type { Writer, PreserveWritable } from "./writer";
 import { annotations, Annotated } from "./annotated";
+import { Float } from "./float";
+import { JsDictionary } from "./jsdictionary";
+import { unannotate } from "./strip";

 export type DictionaryType = 'Dictionary' | 'Set';
 export const DictionaryType = Symbol.for('DictionaryType');

-export class KeyedDictionary<K extends Value<T>, V, T = GenericEmbedded> extends FlexMap<K, V>
+export type CompoundKey<T extends Embeddable> = Value<T> | (Preservable<T> & PreserveWritable<T>);
+
+export class EncodableDictionary<T extends Embeddable, K, V> extends FlexMap<K, V>
 implements Preservable<T>, PreserveWritable<T>
+{
+constructor(
+public readonly encodeK: (k: K) => CompoundKey<T>,
+public readonly encodeV: (v: V) => CompoundKey<T>,
+items?: readonly [K, V][] | Iterable<readonly [K, V]>
+) {
+super((k: K) => canonicalString(encodeK(k)), items);
+}
+
+__preserve_on__(encoder: Encoder<T>) {
+encodeDictionaryOn(this,
+encoder,
+(k, e) => e.push(this.encodeK(k)),
+(v, e) => e.push(this.encodeV(v)));
+}
+
+__preserve_text_on__(w: Writer<T>) {
+writeDictionaryOn(this,
+w,
+(k, w) => w.push(this.encodeK(k)),
+(v, w) => w.push(this.encodeV(v)));
+}
+}
+
+export class KeyedDictionary<T extends Embeddable = GenericEmbedded, K extends CompoundKey<T> = Value<T>, V = Value<T>>
+extends EncodableDictionary<T, K, V>
 {
 get [DictionaryType](): DictionaryType {
 return 'Dictionary';
 }

-static isKeyedDictionary<K extends Value<T>, V, T = GenericEmbedded>(x: any): x is KeyedDictionary<K, V, T> {
+static isKeyedDictionary<T extends Embeddable = GenericEmbedded, K extends CompoundKey<T> = Value<T>, V = Value<T>>(
+x: any,
+): x is KeyedDictionary<T, K, V> {
 return x?.[DictionaryType] === 'Dictionary';
 }

-constructor(items?: readonly [K, V][]);
-constructor(items?: Iterable<readonly [K, V]>);
-constructor(items?: Iterable<readonly [K, V]>) {
-super(canonicalString, items);
+constructor(items?: readonly [K, V][] | Iterable<readonly [K, V]>) {
+// The cast in encodeV is suuuuuuuper unsound, since V may not in fact be Encodable and Writable.
+// Don't try to encode/write dictionaries holding non-encodable/non-writable values.
+super(k => k, v => v as CompoundKey<T>, items);
 }

-mapEntries<W, S extends Value<R>, R = GenericEmbedded>(f: (entry: [K, V]) => [S, W]): KeyedDictionary<S, W, R> {
-const result = new KeyedDictionary<S, W, R>();
-for (let oldEntry of this.entries()) {
-const newEntry = f(oldEntry);
-result.set(newEntry[0], newEntry[1])
-}
-return result;
-}
-
-clone(): KeyedDictionary<K, V, T> {
+clone(): KeyedDictionary<T, K, V> {
 return new KeyedDictionary(this);
 }

 get [Symbol.toStringTag]() { return 'Dictionary'; }

-__preserve_on__(encoder: Encoder<T>) {
-if (encoder.canonical) {
-const entries = Array.from(this);
-const pieces = entries.map<[Bytes, number]>(([k, _v], i) => [canonicalEncode(k), i]);
-pieces.sort((a, b) => Bytes.compare(a[0], b[0]));
-encoder.state.emitbyte(Tag.Dictionary);
-pieces.forEach(([_encodedKey, i]) => {
-const [k, v] = entries[i];
-encoder.push(k);
-encoder.push(v as unknown as Value<T>); // Suuuuuuuper unsound
-});
-encoder.state.emitbyte(Tag.End);
-} else {
-encoder.state.emitbyte(Tag.Dictionary);
-this.forEach((v, k) => {
-encoder.push(k);
-encoder.push(v as unknown as Value<T>); // Suuuuuuuper unsound
-});
-encoder.state.emitbyte(Tag.End);
-}
-}
-
-__preserve_text_on__(w: Writer<T>) {
-w.state.writeSeq('{', '}', this.entries(), ([k, v]) => {
-w.push(k);
-if (Annotated.isAnnotated<T>(v) && (annotations(v).length > 1) && w.state.isIndenting) {
-w.state.pieces.push(':');
-w.state.indentCount++;
-w.state.writeIndent();
-w.push(v);
-w.state.indentCount--;
-} else {
-w.state.pieces.push(': ');
-w.push(v as unknown as Value<T>); // Suuuuuuuper unsound
-}
-});
-}
+equals(otherAny: any, eqv: Equivalence<V> = (v1, v2) => v1 === v2): boolean {
+const otherMap = Dictionary.asMap(otherAny);
+if (!otherMap) return false;
+return super.equals(otherMap, eqv);
+}
 }

-export class Dictionary<T = GenericEmbedded, V = Value<T>> extends KeyedDictionary<Value<T>, V, T> {
-static isDictionary<T = GenericEmbedded, V = Value<T>>(x: any): x is Dictionary<T, V> {
-return x?.[DictionaryType] === 'Dictionary';
-}
-
-static __from_preserve__<T>(v: Value<T>): undefined | Dictionary<T> {
+export type Dictionary<T extends Embeddable = GenericEmbedded, V = Value<T>> =
+JsDictionary<V> | KeyedDictionary<T, Value<T>, V>;
+
+export class DictionaryMap<T extends Embeddable = GenericEmbedded, V = Value<T>> implements Map<Value<T>, V> {
+get [IsMap](): boolean { return true; }
+
+j: JsDictionary<V> | undefined;
+k: KeyedDictionary<T, Value<T>, V> | undefined;
+
+constructor(input?: Dictionary<T, V>) {
+if (input === void 0) {
+this.j = {};
+this.k = void 0;
+} else if (DictionaryType in input) {
+this.j = void 0;
+this.k = input;
+} else {
+this.j = input;
+this.k = void 0;
+}
+}
+
+static from<T extends Embeddable = GenericEmbedded, V = Value<T>>(
+entries: [Value<T>, V][] | Iterable<[Value<T>, V]>,
+): DictionaryMap<T, V> {
+const r = new DictionaryMap<T, V>();
+for (const [key, value] of entries) r.set(key, value);
+return r;
+}
+
+clear(): void {
+if (this.j) {
+JsDictionary.clear(this.j);
+} else {
+this.k!.clear();
+}
+}
+
+delete(key: Value<T>): boolean {
+if (this.j) {
+key = unannotate(key);
+if (typeof key !== 'symbol') return false;
+return JsDictionary.remove(this.j, key);
+} else {
+return this.k!.delete(key);
+}
+}
+
+forEach(callbackfn: (value: V, key: Value<T>, map: Map<Value<T>, V>) => void, thisArg?: any): void {
+if (this.j) {
+JsDictionary.forEach(this.j, (v, k) => callbackfn.call(thisArg, v, k, this));
+} else {
+this.k!.forEach(callbackfn, thisArg);
+}
+}
+
+get(key: Value<T>): V | undefined {
+if (this.j) {
+key = unannotate(key);
+if (typeof key !== 'symbol') return void 0;
+return JsDictionary.get(this.j, key);
+} else {
+return this.k!.get(key);
+}
+}
+
+has(key: Value<T>): boolean {
+if (this.j) {
+key = unannotate(key);
+if (typeof key !== 'symbol') return false;
+return JsDictionary.has(this.j, key);
+} else {
+return this.k!.has(key);
+}
+}
+
+set(key: Value<T>, value: V): this {
+if (this.j) {
+if (typeof key === 'symbol') {
+JsDictionary.set(this.j, key, value);
+return this;
+}
+this.k = new KeyedDictionary<T, Value<T>, V>(JsDictionary.entries(this.j));
+this.j = void 0;
+}
+this.k!.set(key, value);
+return this;
+}
+
+get size(): number {
+return this.j ? JsDictionary.size(this.j) : this.k!.size;
+}
+
+entries(): IterableIterator<[Value<T>, V]> {
+return this.j ? JsDictionary.entries(this.j) : this.k!.entries();
+}
+
+keys(): IterableIterator<Value<T>> {
+return this.j ? JsDictionary.keys(this.j) : this.k!.keys();
+}
+
+values(): IterableIterator<V> {
+return this.j ? JsDictionary.values(this.j) : this.k!.values();
+}
+
+[Symbol.iterator](): IterableIterator<[Value<T>, V]> {
+return this.entries();
+}
+
+get [Symbol.toStringTag](): string {
+return 'DictionaryMap';
+}
+
+clone(): DictionaryMap<T, V> {
+return new DictionaryMap<T, V>(this.j ? JsDictionary.clone(this.j) : this.k!.clone());
+}
+
+get value(): Dictionary<T, V> {
+return this.j ?? this.k!;
+}
+
+simplify(): void {
+if (!this.j) {
+const r: JsDictionary<V> = {};
+for (const [key, value] of this.k!.entries()) {
+if (typeof key !== 'symbol') return;
+r[key.description!] = value;
+}
+this.j = r;
+this.k = void 0;
+}
+}
+
+simplifiedValue(): Dictionary<T, V> {
+this.simplify();
+return this.value;
+}
+
+asJsDictionary(): JsDictionary<V> {
+this.simplify();
+if (!this.j) throw new Error("Cannot represent general dictionary as JsDictionary");
+return this.j;
+}
+
+asKeyedDictionary(): KeyedDictionary<T, Value<T>, V> {
+return this.k ?? new KeyedDictionary<T, Value<T>, V>(JsDictionary.entries(this.j!));
+}
+}
+
+export namespace Dictionary {
+export function isDictionary<T extends Embeddable = GenericEmbedded, V = Value<T>>(
+x: any
+): x is Dictionary<T, V> {
+if (typeof x !== 'object' || x === null) return false;
+switch (x[DictionaryType]) {
+case 'Dictionary': return true;
+case void 0: return JsDictionary.isJsDictionary(x);
+default: return false;
+}
+}
+
+export function asMap<T extends Embeddable = GenericEmbedded, V = Value<T>>(
+x: Dictionary<T, V>
+): DictionaryMap<T, V>;
+export function asMap<T extends Embeddable = GenericEmbedded, V = Value<T>>(
+x: any
+): DictionaryMap<T, V> | undefined;
+export function asMap<T extends Embeddable = GenericEmbedded, V = Value<T>>(
+x: any
+): DictionaryMap<T, V> | undefined {
+return isDictionary<T, V>(x) ? new DictionaryMap(x) : void 0;
+}
+
+export function from<T extends Embeddable = GenericEmbedded, V = Value<T>>(
+entries: [Value<T>, V][] | Iterable<[Value<T>, V]>,
+): Dictionary<T, V> {
+return DictionaryMap.from(entries).simplifiedValue();
+}
+
+export function __from_preserve__<T extends Embeddable>(v: Value<T>): undefined | Dictionary<T> {
 return Dictionary.isDictionary<T>(v) ? v : void 0;
 }
 }
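`DictionaryMap` above keeps two alternative representations: a plain-object `JsDictionary` (field `j`) while every key is a symbol, promoting to a general `KeyedDictionary` (field `k`) the first time `set` sees any other key. A standalone sketch of this promote-on-demand idea, using string keys and a built-in `Map` rather than the library's actual types:

```typescript
// Promote-on-demand map: starts as a plain object while all keys are strings,
// switching to a Map the first time a non-string key arrives (mirroring the
// j/k fields of DictionaryMap above). Names here are illustrative only.
class PromotingMap<V> {
    j: Record<string, V> | undefined = {};
    k: Map<unknown, V> | undefined = void 0;

    set(key: unknown, value: V): this {
        if (this.j) {
            if (typeof key === 'string') {
                this.j[key] = value;
                return this;
            }
            // Promote: move existing entries into the general representation.
            this.k = new Map(Object.entries(this.j));
            this.j = void 0;
        }
        this.k!.set(key, value);
        return this;
    }

    get(key: unknown): V | undefined {
        if (this.j) return typeof key === 'string' ? this.j[key] : void 0;
        return this.k!.get(key);
    }

    get usingFastPath(): boolean { return this.j !== void 0; }
}
```

The payoff is the same as in `DictionaryMap`: the common case (all-symbol keys) stays in the cheap representation, and promotion happens at most once.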
-export class KeyedSet<K extends Value<T>, T = GenericEmbedded> extends FlexSet<K>
+export function encodeDictionaryOn<T extends Embeddable, K, V>(
+dict: Map<K, V>,
+encoder: Encoder<T>,
+encodeK: (k: K, encoder: Encoder<T>) => void,
+encodeV: (v: V, encoder: Encoder<T>) => void,
+) {
+if (encoder.canonical) {
+const entries = Array.from(dict);
+const canonicalEncoder = new Encoder<T>({
+canonical: true,
+embeddedEncode: encoder.embeddedEncode,
+});
+const pieces = entries.map<[Bytes, number]>(([k, _v], i) => {
+encodeK(k, canonicalEncoder);
+return [canonicalEncoder.contents(), i];
+});
+pieces.sort((a, b) => Bytes.compare(a[0], b[0]));
+encoder.grouped(Tag.Dictionary, () => pieces.forEach(([_encodedKey, i]) => {
+const [k, v] = entries[i];
+encodeK(k, encoder);
+encodeV(v, encoder);
+}));
+} else {
+encoder.grouped(Tag.Dictionary, () => dict.forEach((v, k) => {
+encodeK(k, encoder);
+encodeV(v, encoder);
+}));
+}
+}
+
+export function writeDictionaryOn<T extends Embeddable, K, V>(
+dict: Map<K, V>,
+w: Writer<T>,
+writeK: (k: K, w: Writer<T>) => void,
+writeV: (v: V, w: Writer<T>) => void,
+) {
+w.state.writeSeq('{', '}', dict.entries(), ([k, v]) => {
+writeK(k, w);
+if (Annotated.isAnnotated<T>(v) && (annotations(v).length > 1) && w.state.isIndenting) {
+w.state.pieces.push(':');
+w.state.indentCount++;
+w.state.writeIndent();
+writeV(v, w);
+w.state.indentCount--;
+} else {
+w.state.pieces.push(': ');
+writeV(v, w);
+}
+});
+}
+
+export class EncodableSet<T extends Embeddable, V> extends FlexSet<V>
 implements Preservable<T>, PreserveWritable<T>
+{
+constructor(
+public readonly encodeV: (v: V) => CompoundKey<T>,
+items?: Iterable<V>,
+) {
+super((v: V) => canonicalString(encodeV(v)), items);
+}
+
+__preserve_on__(encoder: Encoder<T>) {
+encodeSetOn(this, encoder, (v, e) => e.push(this.encodeV(v)));
+}
+
+__preserve_text_on__(w: Writer<T>) {
+writeSetOn(this, w, (v, w) => w.push(this.encodeV(v)));
+}
+}
+
+export class KeyedSet<T extends Embeddable = GenericEmbedded, K extends CompoundKey<T> = Value<T>>
+extends EncodableSet<T, K>
 {
 get [DictionaryType](): DictionaryType {
 return 'Set';
 }

-static isKeyedSet<K extends Value<T>, T = GenericEmbedded>(x: any): x is KeyedSet<K, T> {
+static isKeyedSet<T extends Embeddable = GenericEmbedded, K extends CompoundKey<T> = Value<T>>(
+x: any,
+): x is KeyedSet<T, K> {
 return x?.[DictionaryType] === 'Set';
 }

 constructor(items?: Iterable<K>) {
-super(canonicalString, items);
+super(k => k, items);
 }

-map<S extends Value<R>, R = GenericEmbedded>(f: (value: K) => S): KeyedSet<S, R> {
+map<R extends Embeddable = GenericEmbedded, S extends Value<R> = Value<R>>(
+f: (value: K) => S,
+): KeyedSet<R, S> {
 return new KeyedSet(_iterMap(this[Symbol.iterator](), f));
 }

-filter(f: (value: K) => boolean): KeyedSet<K, T> {
-const result = new KeyedSet<K, T>();
+filter(f: (value: K) => boolean): KeyedSet<T, K> {
+const result = new KeyedSet<T, K>();
 for (let k of this) if (f(k)) result.add(k);
 return result;
 }

-clone(): KeyedSet<K, T> {
+clone(): KeyedSet<T, K> {
 return new KeyedSet(this);
 }

 get [Symbol.toStringTag]() { return 'Set'; }

-__preserve_on__(encoder: Encoder<T>) {
-if (encoder.canonical) {
-const pieces = Array.from(this).map<[Bytes, K]>(k => [canonicalEncode(k), k]);
-pieces.sort((a, b) => Bytes.compare(a[0], b[0]));
-encoder.encodevalues(Tag.Set, pieces.map(e => e[1]));
-} else {
-encoder.encodevalues(Tag.Set, this);
-}
-}
-
-__preserve_text_on__(w: Writer<T>) {
-w.state.writeSeq('#{', '}', this, vv => w.push(vv));
-}
 }

-export class Set<T = GenericEmbedded> extends KeyedSet<Value<T>, T> {
-static isSet<T = GenericEmbedded>(x: any): x is Set<T> {
+export class Set<T extends Embeddable = GenericEmbedded> extends KeyedSet<T> {
+static isSet<T extends Embeddable = GenericEmbedded>(x: any): x is Set<T> {
 return x?.[DictionaryType] === 'Set';
 }

-static __from_preserve__<T>(v: Value<T>): undefined | Set<T> {
+static __from_preserve__<T extends Embeddable>(v: Value<T>): undefined | Set<T> {
 return Set.isSet<T>(v) ? v : void 0;
 }
 }

+export function encodeSetOn<T extends Embeddable, V>(
+s: IdentitySet<V>,
+encoder: Encoder<T>,
+encodeV: (v: V, encoder: Encoder<T>) => void,
+) {
+if (encoder.canonical) {
+const canonicalEncoder = new Encoder<T>({
+canonical: true,
+embeddedEncode: encoder.embeddedEncode,
+});
+const pieces = Array.from(s).map<[Bytes, V]>(v => {
+encodeV(v, canonicalEncoder);
+return [canonicalEncoder.contents(), v];
+});
+pieces.sort((a, b) => Bytes.compare(a[0], b[0]));
+encoder.grouped(Tag.Set, () => pieces.forEach(([_e, v]) => encodeV(v, encoder)));
+} else {
+encoder.grouped(Tag.Set, () => s.forEach(v => encodeV(v, encoder)));
+}
+}
+
+export function writeSetOn<T extends Embeddable, V>(
+s: IdentitySet<V>,
+w: Writer<T>,
+writeV: (v: V, w: Writer<T>) => void,
+) {
+w.state.writeSeq('#{', '}', s, vv => writeV(vv, w));
+}
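The canonical branches of `encodeDictionaryOn` and `encodeSetOn` above both sort entries by the lexicographic byte order of their canonical encodings (`Bytes.compare` over the encoded keys) before emitting them, so that equal collections always serialize to identical bytes. A standalone sketch of that sort-by-encoding step, with a plain `Uint8Array` comparator standing in for `Bytes.compare` and an illustrative encoding function:

```typescript
// Lexicographic byte comparison: shorter prefix sorts first, mirroring
// how canonical-form encoders order dictionary/set entries.
function compareBytes(a: Uint8Array, b: Uint8Array): number {
    const n = Math.min(a.length, b.length);
    for (let i = 0; i < n; i++) {
        if (a[i] !== b[i]) return a[i] < b[i] ? -1 : 1;
    }
    return a.length - b.length;
}

// Sort keys by their encoded form (the decorate-sort-undecorate pattern
// used by encodeDictionaryOn above). `encode` is supplied by the caller.
function sortByEncoding<K>(keys: K[], encode: (k: K) => Uint8Array): K[] {
    return keys
        .map<[Uint8Array, K]>(k => [encode(k), k])
        .sort((x, y) => compareBytes(x[0], y[0]))
        .map(e => e[1]);
}
```

Because the order is defined over the encodings rather than the in-memory values, two maps built by inserting the same entries in different orders still produce byte-identical canonical output.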
@@ -3,7 +3,17 @@ import type { EncoderState } from "./encoder";
 import type { Value } from "./values";
 import { ReaderStateOptions } from "./reader";

-export type EmbeddedTypeEncode<T> = {
+export const IsEmbedded = Symbol.for('IsEmbedded');
+
+export interface Embeddable {
+readonly [IsEmbedded]: true;
+}
+
+export function isEmbedded<T extends Embeddable>(v: any): v is T {
+return !!v?.[IsEmbedded];
+}
+
+export type EmbeddedTypeEncode<T extends Embeddable> = {
 encode(s: EncoderState, v: T): void;
 }

@@ -12,46 +22,22 @@ export type EmbeddedTypeDecode<T> = {
 fromValue(v: Value<GenericEmbedded>, options: ReaderStateOptions): T;
 }

-export type EmbeddedType<T> = EmbeddedTypeEncode<T> & EmbeddedTypeDecode<T>;
+export type EmbeddedType<T extends Embeddable> = EmbeddedTypeEncode<T> & EmbeddedTypeDecode<T>;

 export class Embedded<T> {
-embeddedValue: T;
-
-constructor(embeddedValue: T) {
-this.embeddedValue = embeddedValue;
-}
-
-equals(other: any, is: (a: any, b: any) => boolean) {
-return isEmbedded<T>(other) && is(this.embeddedValue, other.embeddedValue);
-}
-
-toString(): string {
-return '#!' + (this.embeddedValue as any).toString();
-}
-
-__as_preserve__<R>(): T extends R ? Value<R> : never {
-return this as any;
-}
-
-static __from_preserve__<T>(v: Value<T>): undefined | Embedded<T> {
-return isEmbedded<T>(v) ? v : void 0;
-}
-}
-
-export function embed<T>(embeddedValue: T): Embedded<T> {
-return new Embedded(embeddedValue);
-}
-
-export function isEmbedded<T>(v: Value<T>): v is Embedded<T> {
-return typeof v === 'object' && 'embeddedValue' in v;
+get [IsEmbedded](): true { return true; }
+
+constructor(public readonly value: T) {}
+
+equals(other: any): boolean {
+return typeof other === 'object' && 'value' in other && Object.is(this.value, other.value);
+}
 }

 export class GenericEmbedded {
-generic: Value;
-
-constructor(generic: Value) {
-this.generic = generic;
-}
+get [IsEmbedded](): true { return true; }
+
+constructor(public readonly generic: Value) {}

 equals(other: any, is: (a: any, b: any) => boolean) {
 return typeof other === 'object' && 'generic' in other && is(this.generic, other.generic);
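The new `IsEmbedded` marker above is a "symbol brand": classes opt in by exposing a well-known symbol property, and a type guard (`isEmbedded`) narrows on its presence instead of duck-typing a field name like the old `'embeddedValue' in v` check. A standalone sketch of the same pattern, with hypothetical names:

```typescript
// Symbol branding: a well-known symbol marks participating classes, and a
// type guard narrows on its presence (mirroring IsEmbedded/Embeddable above).
const IsTagged = Symbol.for('IsTagged'); // hypothetical brand for this sketch

interface Tagged {
    readonly [IsTagged]: true;
}

class Wrapper implements Tagged {
    get [IsTagged](): true { return true; }
    constructor(public readonly value: unknown) {}
}

function isTagged(v: any): v is Tagged {
    // Optional chaining makes the guard safe on null/undefined/primitives.
    return !!v?.[IsTagged];
}
```

Using `Symbol.for` registers the brand globally, so two copies of the module (e.g. duplicated in a bundle) still agree on what counts as tagged, which a private field-name convention cannot guarantee.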
@@ -3,17 +3,18 @@ import { Bytes, unhexDigit } from "./bytes";
 import { Value } from "./values";
 import { EncodeError } from "./codec";
 import { Record, Tuple } from "./record";
-import { GenericEmbedded, EmbeddedTypeEncode } from "./embedded";
-import type { Embedded } from "./embedded";
+import { EmbeddedTypeEncode, isEmbedded } from "./embedded";
+import type { Embeddable } from "./embedded";
+import { DictionaryMap, encodeDictionaryOn } from "./dictionary";

-export type Encodable<T> =
+export type Encodable<T extends Embeddable> =
 Value<T> | Preservable<T> | Iterable<Value<T>> | ArrayBufferView;

-export interface Preservable<T> {
+export interface Preservable<T extends Embeddable> {
 __preserve_on__(encoder: Encoder<T>): void;
 }

-export function isPreservable<T>(v: any): v is Preservable<T> {
+export function isPreservable<T extends Embeddable>(v: any): v is Preservable<T> {
 return typeof v === 'object' && v !== null && '__preserve_on__' in v && typeof v.__preserve_on__ === 'function';
 }

@@ -22,11 +23,11 @@ export interface EncoderOptions {
 includeAnnotations?: boolean;
 }

-export interface EncoderEmbeddedOptions<T> extends EncoderOptions {
+export interface EncoderEmbeddedOptions<T extends Embeddable> extends EncoderOptions {
 embeddedEncode?: EmbeddedTypeEncode<T>;
 }

-export function asLatin1(bs: Uint8Array): string {
+function asLatin1(bs: Uint8Array): string {
 return String.fromCharCode.apply(null, bs as any as number[]);
 }

@@ -199,7 +200,7 @@ export class EncoderState {
 }
 }

-export class Encoder<T = object> {
+export class Encoder<T extends Embeddable> {
 state: EncoderState;
 embeddedEncode: EmbeddedTypeEncode<T>;

@@ -218,7 +219,7 @@ export class Encoder<T = object> {
 }
 }

-withEmbeddedEncode<S>(
+withEmbeddedEncode<S extends Embeddable>(
 embeddedEncode: EmbeddedTypeEncode<S>,
 body: (e: Encoder<S>) => void): this
 {

@@ -242,14 +243,14 @@ export class Encoder<T = object> {
 return this.state.contentsString();
 }

-encodevalues(tag: Tag, items: Iterable<Value<T>>) {
+grouped(tag: Tag, f: () => void) {
 this.state.emitbyte(tag);
-for (let i of items) { this.push(i); }
+f();
 this.state.emitbyte(Tag.End);
 }

 push(v: Encodable<T>) {
-if (isPreservable<unknown>(v)) {
+if (isPreservable<any>(v)) {
 v.__preserve_on__(this);
 }
 else if (isPreservable<T>(v)) {

@@ -284,19 +285,25 @@ export class Encoder<T = object> {
 this.state.emitbyte(Tag.End);
 }
 else if (isIterable<Value<T>>(v)) {
-this.encodevalues(Tag.Sequence, v);
+this.grouped(Tag.Sequence, () => {
+for (let i of v) this.push(i);
+});
+}
+else if (isEmbedded<T>(v)) {
+this.state.emitbyte(Tag.Embedded);
+this.embeddedEncode.encode(this.state, v);
 }
 else {
-((v: Embedded<T>) => {
-this.state.emitbyte(Tag.Embedded);
-this.embeddedEncode.encode(this.state, v.embeddedValue);
-})(v);
+encodeDictionaryOn(new DictionaryMap<T>(v),
+this,
+(k, e) => e.push(k),
+(v, e) => e.push(v));
 }
 return this; // for chaining
 }
 }

-export function encode<T>(
+export function encode<T extends Embeddable>(
 v: Encodable<T>,
 options: EncoderEmbeddedOptions<T> = {}): Bytes
 {

@@ -330,7 +337,9 @@ export function canonicalString(v: Encodable<any>): string {
 }
 }

-export function encodeWithAnnotations<T>(v: Encodable<T>,
-options: EncoderEmbeddedOptions<T> = {}): Bytes {
+export function encodeWithAnnotations<T extends Embeddable>(
+v: Encodable<T>,
+options: EncoderEmbeddedOptions<T> = {},
+): Bytes {
 return encode(v, { ... options, includeAnnotations: true });
 }
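The new `Encoder.grouped(tag, f)` above generalizes the old `encodevalues`: instead of accepting only an iterable of values, it brackets an arbitrary emission callback between an opening tag byte and a closing `End` byte, which is what lets `encodeDictionaryOn`/`encodeSetOn` interleave keys and values inside a single group. A standalone sketch of that bracketing pattern, with illustrative tag values:

```typescript
// Grouped emission: bracket a callback between an opening tag byte and a
// closing End byte (the shape of Encoder.grouped above). Tag values here
// are illustrative only.
const TAG_SEQUENCE = 0xb5;
const TAG_END = 0x84;

class ByteEmitter {
    bytes: number[] = [];
    emitbyte(b: number) { this.bytes.push(b); }

    // Emit `tag`, run the body (which may emit items or nested groups),
    // then emit the closing byte.
    grouped(tag: number, f: () => void) {
        this.emitbyte(tag);
        f();
        this.emitbyte(TAG_END);
    }
}
```

Because the body is an arbitrary callback, groups nest naturally and the caller controls exactly what goes between the brackets.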
@@ -1,12 +1,12 @@
 import { Tag } from "./constants";
-import { stringify } from "./text";
 import { Value } from "./values";
-import type { GenericEmbedded } from "./embedded";
+import type { Embeddable, GenericEmbedded } from "./embedded";
 import type { Encoder, Preservable } from "./encoder";
 import type { Writer, PreserveWritable } from "./writer";
-import { Bytes, dataview, underlying } from "./bytes";
+import { Bytes, dataview } from "./bytes";

-export type FloatType = 'Single' | 'Double';
+// v Previously included 'Single'; may again in future. Also, 'Half', 'Quad'?
+export type FloatType = 'Double';
 export const FloatType = Symbol.for('FloatType');

 export abstract class Float {

@@ -16,8 +16,8 @@ export abstract class Float {
 this.value = typeof value === 'number' ? value : value.value;
 }

-toString() {
-return stringify(this);
+__preserve_text_on__(w: Writer<any>) {
+w.state.pieces.push(this.toString());
 }

 abstract toBytes(): Bytes;

@@ -38,7 +38,6 @@ export abstract class Float {
 abstract get [FloatType](): FloatType;

 static isFloat = (x: any): x is Float => x?.[FloatType] !== void 0;
-static isSingle = (x: any): x is SingleFloat => x?.[FloatType] === 'Single';
 static isDouble = (x: any): x is DoubleFloat => x?.[FloatType] === 'Double';
 }

@@ -59,79 +58,42 @@ export function floatlikeString(f: number): string {
 return s + '.0';
 }

-export class SingleFloat extends Float implements Preservable<any>, PreserveWritable<any> {
-__as_preserve__<T = GenericEmbedded>(): Value<T> {
-return this;
-}
-
-static fromBytes(bs: Bytes | DataView): SingleFloat {
-const view = dataview(bs);
-const vf = view.getInt32(0, false);
-if ((vf & 0x7f800000) === 0x7f800000) {
-// NaN or inf. Preserve quiet/signalling bit by manually expanding to double-precision.
-const sign = vf >> 31;
-const payload = vf & 0x007fffff;
-const dbs = new Bytes(8);
-const dview = dataview(dbs);
-dview.setInt16(0, (sign << 15) | 0x7ff0 | (payload >> 19), false);
-dview.setInt32(2, (payload & 0x7ffff) << 13, false);
-return new SingleFloat(dview.getFloat64(0, false));
-} else {
-return new SingleFloat(dataview(bs).getFloat32(0, false));
-}
-}
-
-static __from_preserve__<T>(v: Value<T>): undefined | SingleFloat {
-return Float.isSingle(v) ? v : void 0;
-}
+// -- These snippets are useful to keep in mind for promoting 4-byte, single-precision floats
+// -- to 8-byte, double-precision floats *while preserving NaN bit-patterns*:
+//
+// static fromBytes(bs: Bytes | DataView): SingleFloat {
+// const view = dataview(bs);
+// const vf = view.getInt32(0, false);
+// if ((vf & 0x7f800000) === 0x7f800000) {
+// // NaN or inf. Preserve quiet/signalling bit by manually expanding to double-precision.
+// const sign = vf >> 31;
+// const payload = vf & 0x007fffff;
+// const dbs = new Bytes(8);
+// const dview = dataview(dbs);
+// dview.setInt16(0, (sign << 15) | 0x7ff0 | (payload >> 19), false);
+// dview.setInt32(2, (payload & 0x7ffff) << 13, false);
+// return new SingleFloat(dview.getFloat64(0, false));
+// } else {
+// return new SingleFloat(dataview(bs).getFloat32(0, false));
+// }
+// }
+//
+// __w(v: DataView, offset: number) {
+// if (Number.isNaN(this.value)) {
+// const dbs = new Bytes(8);
+// const dview = dataview(dbs);
// dview.setFloat64(0, this.value, false);
|
||||||
|
// const sign = dview.getInt8(0) >> 7;
|
||||||
__w(v: DataView, offset: number) {
|
// const payload = (dview.getInt32(1, false) >> 5) & 0x007fffff;
|
||||||
if (Number.isNaN(this.value)) {
|
// const vf = (sign << 31) | 0x7f800000 | payload;
|
||||||
const dbs = new Bytes(8);
|
// v.setInt32(offset, vf, false);
|
||||||
const dview = dataview(dbs);
|
// } else {
|
||||||
dview.setFloat64(0, this.value, false);
|
// v.setFloat32(offset, this.value, false);
|
||||||
const sign = dview.getInt8(0) >> 7;
|
// }
|
||||||
const payload = (dview.getInt32(1, false) >> 5) & 0x007fffff;
|
// }
|
||||||
const vf = (sign << 31) | 0x7f800000 | payload;
|
|
||||||
v.setInt32(offset, vf, false);
|
|
||||||
} else {
|
|
||||||
v.setFloat32(offset, this.value, false);
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
__preserve_on__(encoder: Encoder<any>) {
|
|
||||||
encoder.state.emitbyte(Tag.Ieee754);
|
|
||||||
encoder.state.emitbyte(4);
|
|
||||||
encoder.state.makeroom(4);
|
|
||||||
this.__w(encoder.state.view, encoder.state.index);
|
|
||||||
encoder.state.index += 4;
|
|
||||||
}
|
|
||||||
|
|
||||||
toBytes(): Bytes {
|
|
||||||
const bs = new Bytes(4);
|
|
||||||
this.__w(bs.dataview(), 0);
|
|
||||||
return bs;
|
|
||||||
}
|
|
||||||
|
|
||||||
__preserve_text_on__(w: Writer<any>) {
|
|
||||||
if (Number.isFinite(this.value)) {
|
|
||||||
w.state.pieces.push(floatlikeString(this.value) + 'f');
|
|
||||||
} else {
|
|
||||||
w.state.pieces.push('#xf"', this.toBytes().toHex(), '"');
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
get [FloatType](): 'Single' {
|
|
||||||
return 'Single';
|
|
||||||
}
|
|
||||||
}
|
|
||||||
|
|
||||||
export function Single(value: number | Float): SingleFloat {
|
|
||||||
return new SingleFloat(value);
|
|
||||||
}
|
|
||||||
|
|
||||||
export class DoubleFloat extends Float implements Preservable<any>, PreserveWritable<any> {
|
export class DoubleFloat extends Float implements Preservable<any>, PreserveWritable<any> {
|
||||||
__as_preserve__<T = GenericEmbedded>(): Value<T> {
|
__as_preserve__<T extends Embeddable = GenericEmbedded>(): Value<T> {
|
||||||
return this;
|
return this;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -139,7 +101,7 @@ export class DoubleFloat extends Float implements Preservable<any>, PreserveWrit
|
||||||
return new DoubleFloat(dataview(bs).getFloat64(0, false));
|
return new DoubleFloat(dataview(bs).getFloat64(0, false));
|
||||||
}
|
}
|
||||||
|
|
||||||
static __from_preserve__<T>(v: Value<T>): undefined | DoubleFloat {
|
static __from_preserve__<T extends Embeddable>(v: Value<T>): undefined | DoubleFloat {
|
||||||
return Float.isDouble(v) ? v : void 0;
|
return Float.isDouble(v) ? v : void 0;
|
||||||
}
|
}
|
||||||
|
|
||||||
|
@ -157,11 +119,11 @@ export class DoubleFloat extends Float implements Preservable<any>, PreserveWrit
|
||||||
return bs;
|
return bs;
|
||||||
}
|
}
|
||||||
|
|
||||||
__preserve_text_on__(w: Writer<any>) {
|
toString(): string {
|
||||||
if (Number.isFinite(this.value)) {
|
if (Number.isFinite(this.value)) {
|
||||||
w.state.pieces.push(floatlikeString(this.value));
|
return floatlikeString(this.value);
|
||||||
} else {
|
} else {
|
||||||
w.state.pieces.push('#xd"', this.toBytes().toHex(), '"');
|
return '#xd"' + this.toBytes().toHex() + '"';
|
||||||
}
|
}
|
||||||
}
|
}
|
||||||
|
|
||||||
|
|
|
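The commented-out snippet kept in this diff widens a 4-byte float to an 8-byte one by hand so that NaN payload bits survive the conversion. A standalone sketch of the same bit trick over raw `DataView`s (`widenFloat32Bits` is a hypothetical helper name, not part of the library):

```typescript
// Widen a 32-bit IEEE 754 bit pattern to a 64-bit double, preserving the
// NaN payload instead of letting the FPU canonicalize it.
function widenFloat32Bits(vf: number): number {
    const out = new DataView(new ArrayBuffer(8));
    if ((vf & 0x7f800000) === 0x7f800000) {
        // NaN or infinity: copy the sign, set the 11-bit exponent to all ones,
        // and move the 23 payload bits to the top of the 52-bit mantissa.
        const sign = vf >>> 31;
        const payload = vf & 0x007fffff;
        out.setUint16(0, (sign << 15) | 0x7ff0 | (payload >>> 19), false);
        out.setUint32(2, (payload & 0x7ffff) << 13, false);
    } else {
        // Finite values widen exactly via ordinary float conversion.
        const tmp = new DataView(new ArrayBuffer(4));
        tmp.setUint32(0, vf, false);
        out.setFloat64(0, tmp.getFloat32(0, false), false);
    }
    return out.getFloat64(0, false);
}
```

For instance, `0x3f800000` is the pattern of `1.0f` and widens to `1.0`, while `0x7fc00000` is a quiet NaN whose quiet bit is carried into the double's mantissa.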
@@ -1,14 +1,13 @@
 import { Record, Tuple } from "./record";
 import { Bytes } from "./bytes";
 import { Value } from "./values";
-import { Set, Dictionary } from "./dictionary";
+import { Set, KeyedDictionary, Dictionary, DictionaryMap } from "./dictionary";
 import { annotate, Annotated } from "./annotated";
-import { Double, Float, Single } from "./float";
-import { Embedded } from "./embedded";
+import { Double, Float } from "./float";
+import { Embeddable, isEmbedded } from "./embedded";
 
 export enum ValueClass {
     Boolean,
-    Float,
     Double,
     SignedInteger,
     String,
@@ -22,11 +21,10 @@ export enum ValueClass {
     Annotated, // quasi-class
 }
 
-export type Fold<T, R = Value<T>> = (v: Value<T>) => R;
+export type Fold<T extends Embeddable, R = Value<T>> = (v: Value<T>) => R;
 
-export interface FoldMethods<T, R> {
+export interface FoldMethods<T extends Embeddable, R> {
     boolean(b: boolean): R;
-    single(f: number): R;
     double(f: number): R;
     integer(i: number | bigint): R;
     string(s: string): R;
@@ -36,46 +34,42 @@ export interface FoldMethods<T, R> {
     record(r: Record<Value<T>, Tuple<Value<T>>, T>, k: Fold<T, R>): R;
     array(a: Array<Value<T>>, k: Fold<T, R>): R;
     set(s: Set<T>, k: Fold<T, R>): R;
-    dictionary(d: Dictionary<T>, k: Fold<T, R>): R;
+    dictionary(d: DictionaryMap<T>, k: Fold<T, R>): R;
 
     annotated(a: Annotated<T>, k: Fold<T, R>): R;
 
-    embedded(t: Embedded<T>, k: Fold<T, R>): R;
+    embedded(t: T, k: Fold<T, R>): R;
 }
 
-export class VoidFold<T> implements FoldMethods<T, void> {
-    boolean(b: boolean): void {}
-    single(f: number): void {}
-    double(f: number): void {}
-    integer(i: number | bigint): void {}
-    string(s: string): void {}
-    bytes(b: Bytes): void {}
-    symbol(s: symbol): void {}
+export class VoidFold<T extends Embeddable> implements FoldMethods<T, void> {
+    boolean(_b: boolean): void {}
+    double(_f: number): void {}
+    integer(_i: number | bigint): void {}
+    string(_s: string): void {}
+    bytes(_b: Bytes): void {}
+    symbol(_s: symbol): void {}
     record(r: Record<Value<T>, Tuple<Value<T>>, T>, k: Fold<T, void>): void {
         k(r.label);
         r.forEach(k);
     }
     array(a: Value<T>[], k: Fold<T, void>): void { a.forEach(k); }
     set(s: Set<T>, k: Fold<T, void>): void { s.forEach(k); }
-    dictionary(d: Dictionary<T>, k: Fold<T, void>): void {
+    dictionary(d: DictionaryMap<T>, k: Fold<T, void>): void {
         d.forEach((value, key) => { k(key); k(value); });
     }
     annotated(a: Annotated<T>, k: Fold<T, void>): void { k(a.item); a.annotations.forEach(k); }
-    embedded(_t: Embedded<T>, _k: Fold<T, void>): void {}
+    embedded(_t: T, _k: Fold<T, void>): void {}
 }
 
-export class ForEachEmbedded<T> extends VoidFold<T> {
+export class ForEachEmbedded<T extends Embeddable> extends VoidFold<T> {
     constructor(public readonly f: (t: T, k: Fold<T, void>) => void) { super(); }
-    embedded(t: Embedded<T>, k: Fold<T, void>): void { this.f(t.embeddedValue, k); }
+    embedded(t: T, k: Fold<T, void>): void { this.f(t, k); }
 }
 
-export abstract class ValueFold<T, R = T> implements FoldMethods<T, Value<R>> {
+export abstract class ValueFold<T extends Embeddable, R extends Embeddable = T> implements FoldMethods<T, Value<R>> {
     boolean(b: boolean): Value<R> {
         return b;
     }
-    single(f: number): Value<R> {
-        return Single(f);
-    }
     double(f: number): Value<R> {
         return Double(f);
     }
@@ -100,22 +94,24 @@ export abstract class ValueFold<T, R = T> implements FoldMethods<T, Value<R>> {
     set(s: Set<T>, k: Fold<T, Value<R>>): Value<R> {
         return s.map(k);
     }
-    dictionary(d: Dictionary<T>, k: Fold<T, Value<R>>): Value<R> {
-        return d.mapEntries(([key, value]) => [k(key), k(value)]);
+    dictionary(d: DictionaryMap<T>, k: Fold<T, Value<R>>): Value<R> {
+        const result = new DictionaryMap<R>();
+        d.forEach((value, key) => result.set(k(key), k(value)));
+        return result.simplifiedValue();
     }
     annotated(a: Annotated<T>, k: Fold<T, Value<R>>): Value<R> {
         return annotate(k(a.item), ...a.annotations.map(k));
     }
-    abstract embedded(t: Embedded<T>, k: Fold<T, Value<R>>): Value<R>;
+    abstract embedded(t: T, k: Fold<T, Value<R>>): Value<R>;
 }
 
-export class IdentityFold<T> extends ValueFold<T, T> {
-    embedded(t: Embedded<T>, _k: Fold<T, Value<T>>): Value<T> {
+export class IdentityFold<T extends Embeddable> extends ValueFold<T, T> {
+    embedded(t: T, _k: Fold<T, Value<T>>): Value<T> {
         return t;
     }
 }
 
-export class MapFold<T, R> extends ValueFold<T, R> {
+export class MapFold<T extends Embeddable, R extends Embeddable> extends ValueFold<T, R> {
     readonly f: (t: T) => Value<R>;
 
     constructor(f: (t: T) => Value<R>) {
@@ -123,18 +119,18 @@ export class MapFold<T, R> extends ValueFold<T, R> {
         this.f = f;
     }
 
-    embedded(t: Embedded<T>, _k: Fold<T, Value<R>>): Value<R> {
-        return this.f(t.embeddedValue);
+    embedded(t: T, _k: Fold<T, Value<R>>): Value<R> {
+        return this.f(t);
     }
 }
 
-export function valueClass<T>(v: Value<T>): ValueClass {
+export function valueClass<T extends Embeddable>(v: Value<T>): ValueClass {
     switch (typeof v) {
         case 'boolean':
             return ValueClass.Boolean;
         case 'number':
             if (!Number.isInteger(v)) {
-                throw new Error("Non-integer number in Preserves valueClass; missing SingleFloat/DoubleFloat wrapper?");
+                throw new Error("Non-integer number in Preserves valueClass; missing Float wrapper?");
             } else {
                 return ValueClass.SignedInteger;
             }
@@ -157,12 +153,10 @@ export function valueClass<T>(v: Value<T>): ValueClass {
                 return ValueClass.Annotated;
             } else if (Bytes.isBytes(v)) {
                 return ValueClass.ByteString;
-            } else if (Float.isSingle(v)) {
-                return ValueClass.Float;
             } else if (Float.isDouble(v)) {
                 return ValueClass.Double;
             } else {
-                return ValueClass.Embedded;
+                return ((_v: T) => ValueClass.Embedded)(v);
             }
         default:
             ((_v: never): never => { throw new Error("Internal error"); })(v);
@@ -171,7 +165,7 @@ export function valueClass<T>(v: Value<T>): ValueClass {
 
 export const IDENTITY_FOLD = new IdentityFold<any>();
 
-export function fold<T, R>(v: Value<T>, o: FoldMethods<T, R>): R {
+export function fold<T extends Embeddable, R>(v: Value<T>, o: FoldMethods<T, R>): R {
     const walk = (v: Value<T>): R => {
         switch (typeof v) {
             case 'boolean':
@@ -196,18 +190,16 @@ export function fold<T, R>(v: Value<T>, o: FoldMethods<T, R>): R {
                     return o.array(v, walk);
                 } else if (Set.isSet<T>(v)) {
                     return o.set(v, walk);
-                } else if (Dictionary.isDictionary<T>(v)) {
-                    return o.dictionary(v, walk);
+                } else if (isEmbedded(v)) {
+                    return o.embedded(v, walk);
                 } else if (Annotated.isAnnotated<T>(v)) {
                     return o.annotated(v, walk);
                 } else if (Bytes.isBytes(v)) {
                     return o.bytes(v);
-                } else if (Float.isSingle(v)) {
-                    return o.single(v.value);
                 } else if (Float.isDouble(v)) {
                     return o.double(v.value);
-                } else {
-                    return o.embedded(v, walk);
+                } else if (Dictionary.isDictionary<T>(v)) {
+                    return o.dictionary(new DictionaryMap(v), walk);
                 }
             default:
                 ((_v: never): never => { throw new Error("Internal error"); })(v);
@@ -216,7 +208,7 @@ export function fold<T, R>(v: Value<T>, o: FoldMethods<T, R>): R {
     return walk(v);
 }
 
-export function mapEmbeddeds<T, R>(
+export function mapEmbeddeds<T extends Embeddable, R extends Embeddable>(
     v: Value<T>,
     f: (t: T) => Value<R>,
 ): Value<R>
@@ -224,6 +216,9 @@ export function mapEmbeddeds<T, R>(
     return fold(v, new MapFold(f));
 }
 
-export function forEachEmbedded<T>(v: Value<T>, f: (t: T, k: Fold<T, void>) => void): void {
+export function forEachEmbedded<T extends Embeddable>(
+    v: Value<T>,
+    f: (t: T, k: Fold<T, void>) => void,
+): void {
     return fold(v, new ForEachEmbedded(f));
 }
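A key change above is that `FoldMethods.embedded` now receives the embedded value `T` directly rather than an `Embedded<T>` wrapper. The fold-with-continuation shape itself can be illustrated standalone; the following is a hypothetical, greatly simplified model (tiny `Val` type, not the library's own `Value`), showing how the continuation `k` lets each method decide whether to recurse, as `ForEachEmbedded` does:

```typescript
// Simplified value type: atoms, arrays, or embedded host values.
type Val<T> = boolean | number | Val<T>[] | { embedded: T };

interface MiniFoldMethods<T, R> {
    atom(v: boolean | number): R;
    array(a: Val<T>[], k: (v: Val<T>) => R): R;
    embedded(t: T, k: (v: Val<T>) => R): R; // receives T directly, as in the patch
}

function miniFold<T, R>(v: Val<T>, o: MiniFoldMethods<T, R>): R {
    const walk = (v: Val<T>): R => {
        if (typeof v === 'boolean' || typeof v === 'number') return o.atom(v);
        if (Array.isArray(v)) return o.array(v, walk);
        return o.embedded((v as { embedded: T }).embedded, walk);
    };
    return walk(v);
}

// Usage: collect every embedded value, mirroring ForEachEmbedded/forEachEmbedded.
function embeddedValues<T>(v: Val<T>): T[] {
    const acc: T[] = [];
    miniFold<T, void>(v, {
        atom() {},
        array(a, k) { a.forEach(k); },   // recurse via the continuation
        embedded(t) { acc.push(t); },    // leaf: no recursion needed
    });
    return acc;
}
```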
@@ -1,72 +1,97 @@
-import { embed, GenericEmbedded } from "./embedded";
+import { Embeddable, GenericEmbedded, isEmbedded } from "./embedded";
 import { Bytes } from "./bytes";
 import { Record, Tuple } from "./record";
 import { Value } from "./values";
-import { Dictionary, Set } from "./dictionary";
+import { Dictionary, KeyedDictionary, Set } from "./dictionary";
+import { JsDictionary } from "./jsdictionary";
 
-export function fromJS<T = GenericEmbedded>(x: any): Value<T> {
-    switch (typeof x) {
-        case 'number':
-            if (!Number.isInteger(x)) {
-                // We require that clients be explicit about integer vs. non-integer types.
-                throw new TypeError("Refusing to autoconvert non-integer number to Single or Double");
-            }
-            // FALL THROUGH
-        case 'bigint':
-        case 'string':
-        case 'symbol':
-        case 'boolean':
-            return x;
-
-        case 'undefined':
-        case 'function':
-            break;
-
-        case 'object':
-            if (x === null) {
-                break;
-            }
-            if (typeof x.__as_preserve__ === 'function') {
-                return x.__as_preserve__();
-            }
-            if (Record.isRecord<Value<T>, Tuple<Value<T>>, T>(x)) {
-                return x;
-            }
-            if (Array.isArray(x)) {
-                return x.map<Value<T>>(fromJS);
-            }
-            if (ArrayBuffer.isView(x) || x instanceof ArrayBuffer) {
-                return Bytes.from(x);
-            }
-            if (Map.isMap(x)) {
-                const d = new Dictionary<T>();
-                x.forEach((v, k) => d.set(fromJS(k), fromJS(v)));
-                return d;
-            }
-            if (Set.isSet(x)) {
-                const s = new Set<T>();
-                x.forEach(v => s.add(fromJS(v)));
-                return s;
-            }
-            // Just... assume it's a T.
-            return embed(x as T);
-
-        default:
-            break;
-    }
-
-    throw new TypeError("Cannot represent JavaScript value as Preserves: " + x);
-}
+export interface FromJSOptions<T extends Embeddable = GenericEmbedded> {
+    onNonInteger?(n: number): Value<T> | undefined;
+}
+
+export function fromJS<T extends Embeddable = GenericEmbedded>(x: any): Value<T> {
+    return fromJS_options(x);
+}
+
+export function fromJS_options<T extends Embeddable = GenericEmbedded>(x: any, options?: FromJSOptions<T>): Value<T> {
+    function walk(x: any): Value<T> {
+        switch (typeof x) {
+            case 'number':
+                if (!Number.isInteger(x)) {
+                    // We require that clients be explicit about integer vs. non-integer types.
+                    const converted = options?.onNonInteger?.(x) ?? void 0;
+                    if (converted !== void 0) return converted;
+                    throw new TypeError("Refusing to autoconvert non-integer number to Double");
+                }
+                // FALL THROUGH
+            case 'bigint':
+            case 'string':
+            case 'symbol':
+            case 'boolean':
+                return x;
+
+            case 'undefined':
+            case 'function':
+                break;
+
+            case 'object':
+                if (x === null) {
+                    break;
+                }
+                if (typeof x.__as_preserve__ === 'function') {
+                    return x.__as_preserve__();
+                }
+                if (Record.isRecord<Value<T>, Tuple<Value<T>>, T>(x)) {
+                    return x;
+                }
+                if (Array.isArray(x)) {
+                    return x.map<Value<T>>(walk);
+                }
+                if (ArrayBuffer.isView(x) || x instanceof ArrayBuffer) {
+                    return Bytes.from(x);
+                }
+                if (Map.isMap(x)) {
+                    const d = new KeyedDictionary<T>();
+                    x.forEach((v, k) => d.set(walk(k), walk(v)));
+                    return d;
+                }
+                if (Set.isSet(x)) {
+                    const s = new Set<T>();
+                    x.forEach(v => s.add(walk(v)));
+                    return s;
+                }
+                if (isEmbedded<T>(x)) {
+                    return x;
+                }
+                // Handle plain JS objects to build a JsDictionary
+                {
+                    const r: JsDictionary<Value<T>> = {};
+                    Object.entries(x).forEach(([k, v]) => r[k] = walk(v));
+                    return r;
+                }
+
+            default:
+                break;
+        }
+
+        throw new TypeError("Cannot represent JavaScript value as Preserves: " + x);
+    }
+
+    return walk(x);
+}
 
 declare module "./dictionary" {
     namespace Dictionary {
-        export function fromJS<T = GenericEmbedded, V = GenericEmbedded>(x: object): Dictionary<T, Value<V>>;
+        export function stringMap<T extends Embeddable = GenericEmbedded>(
+            x: object
+        ): KeyedDictionary<T, string, Value<T>>;
     }
 }
 
-Dictionary.fromJS = function <T = GenericEmbedded, V = GenericEmbedded>(x: object): Dictionary<T, Value<V>> {
-    if (Dictionary.isDictionary<T, Value<V>>(x)) return x;
-    const d = new Dictionary<T, Value<V>>();
-    Object.entries(x).forEach(([key, value]) => d.set(key, fromJS(value)));
-    return d;
-};
+Dictionary.stringMap = function <T extends Embeddable = GenericEmbedded>(
+    x: object
+): KeyedDictionary<T, string, Value<T>> {
+    const r = new KeyedDictionary<T, string, Value<T>>();
+    Object.entries(x).forEach(([key, value]) => r.set(key, fromJS(value)));
+    return r;
+};
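The new `fromJS_options` threads an `onNonInteger` escape hatch through the recursive walk: the hook may rescue a non-integer number, and only if it declines does the conversion throw. A self-contained miniature of that pattern (hypothetical `convert` helper over plain JS data, standing in for the real function, which produces Preserves `Value`s):

```typescript
// Hypothetical miniature of fromJS_options: convert plain JS data, refusing
// non-integer numbers unless the caller supplies an onNonInteger hook.
interface ConvertOptions {
    onNonInteger?(n: number): unknown | undefined;
}

function convert(x: unknown, options?: ConvertOptions): unknown {
    function walk(x: unknown): unknown {
        if (typeof x === 'number' && !Number.isInteger(x)) {
            // Give the hook first refusal, then fail loudly.
            const converted = options?.onNonInteger?.(x) ?? undefined;
            if (converted !== undefined) return converted;
            throw new TypeError("Refusing to autoconvert non-integer number");
        }
        if (Array.isArray(x)) return x.map(walk);
        return x;
    }
    return walk(x);
}
```

Note how the recursion goes through the inner `walk`, not the exported entry point, so the options travel with the whole traversal; the patch restructures `fromJS` the same way.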
@@ -1,15 +1,17 @@
 export * from './runtime';
 export * as Constants from './constants';
+export * as Pexpr from './pexpr';
 
+import type { Embeddable } from './embedded';
 import type { Value } from './values';
 
 declare global {
     interface ArrayConstructor {
-        __from_preserve__<T>(v: Value<T>): undefined | Array<Value<T>>;
+        __from_preserve__<T extends Embeddable>(v: Value<T>): undefined | Array<Value<T>>;
     }
 }
 
-Array.__from_preserve__ = <T>(v: Value<T>) => {
+Array.__from_preserve__ = <T extends Embeddable>(v: Value<T>) => {
     return Array.isArray(v) ? v : void 0;
 };
@@ -1,9 +1,10 @@
-import type { GenericEmbedded } from "./embedded";
+import type { Embeddable, GenericEmbedded } from "./embedded";
 import type { Annotated } from "./annotated";
+import { Dictionary } from "./dictionary";
 
 export const IsPreservesAnnotated = Symbol.for('IsPreservesAnnotated');
 
-export function isAnnotated<T = GenericEmbedded>(x: any): x is Annotated<T>
+export function isAnnotated<T extends Embeddable = GenericEmbedded>(x: any): x is Annotated<T>
 {
     return !!x?.[IsPreservesAnnotated];
 }
@@ -30,6 +31,17 @@ export function is(a: any, b: any): boolean {
         for (let i = 0; i < a.length; i++) if (!is(a[i], b[i])) return false;
         return true;
     }
+        {
+            const aMap = Dictionary.asMap(a);
+            const bMap = Dictionary.asMap(b);
+            if (!aMap || !bMap) return false;
+            if (aMap.size !== bMap.size) return false;
+            for (const k of aMap.keys()) {
+                if (!bMap.has(k)) return false;
+                if (!is(aMap.get(k), bMap.get(k))) return false;
+            }
+            return true;
+        }
     }
     return false;
 }
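The new branch in `is` compares dictionaries through `Dictionary.asMap`: equal size, and every key of one side present and recursively `is`-equal on the other. The same loop, sketched over plain `Map`s (an assumption-laden stand-in: plain `Map` keys compare by reference, unlike Preserves' structural keys):

```typescript
// Standalone sketch of the dictionary-equality loop added to is():
// equal when the sizes match and every key agrees on its value.
function mapEquals<K, V>(
    a: Map<K, V>,
    b: Map<K, V>,
    eqv: (x: V, y: V) => boolean = (x, y) => x === y,
): boolean {
    if (a.size !== b.size) return false;
    for (const k of a.keys()) {
        if (!b.has(k)) return false;            // key missing on the other side
        if (!eqv(a.get(k)!, b.get(k)!)) return false; // values disagree
    }
    return true;
}
```

The size check up front is what makes a one-sided key scan sufficient: if every key of `a` is matched in `b` and the sizes are equal, `b` can have no extra keys.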
@@ -0,0 +1,89 @@
+import { isEmbedded } from './embedded';
+import { Equivalence, _iterMap } from './flex';
+
+export interface JsDictionary<V> {
+    [key: string]: V;
+}
+
+export namespace JsDictionary {
+    export function isJsDictionary<V>(x: any): x is JsDictionary<V> {
+        // We accept only literal objects and objects created via `new Object` as dictionaries.
+        // Furthermore, we require no function-valued `__as_preserve__` property to exist.
+        return typeof x === 'object'
+            && x !== null
+            && Object.getPrototypeOf(Object.getPrototypeOf(x)) === null
+            && typeof x.__as_preserve__ !== 'function'
+            && !isEmbedded(x);
+    }
+
+    export function from<V>(entries: Iterable<[symbol, V]>): JsDictionary<V> {
+        const r: JsDictionary<V> = {};
+        for (const [key, value] of entries) r[key.description!] = value;
+        return r;
+    }
+
+    export function clear<V>(j: JsDictionary<V>): void {
+        for (const key in j) delete j[key];
+    }
+
+    export function remove<V>(j: JsDictionary<V>, key: symbol): boolean {
+        const result = has(j, key);
+        delete j[key.description!];
+        return result;
+    }
+
+    export function forEach<V>(
+        j: JsDictionary<V>,
+        callbackfn: (value: V, key: symbol) => void,
+    ): void {
+        Object.entries(j).forEach(([key, val]) => callbackfn(val, Symbol.for(key)));
+    }
+
+    export function get<V>(j: JsDictionary<V>, key: symbol): V | undefined {
+        return j[key.description!];
+    }
+
+    export function has<V>(j: JsDictionary<V>, key: symbol): boolean {
+        return Object.hasOwnProperty.call(j, key.description!);
+    }
+
+    export function set<V>(j: JsDictionary<V>, key: symbol, value: V): JsDictionary<V> {
+        j[key.description!] = value;
+        return j;
+    }
+
+    export function size<V>(j: JsDictionary<V>): number {
+        return Object.keys(j).length;
+    }
+
+    export function entries<V>(j: JsDictionary<V>): IterableIterator<[symbol, V]> {
+        return _iterMap(Object.entries(j).values(), ([k, v]) => [Symbol.for(k), v]);
+    }
+
+    export function keys<V>(j: JsDictionary<V>): IterableIterator<symbol> {
+        return _iterMap(Object.keys(j).values(), k => Symbol.for(k));
+    }
+
+    export function values<V>(j: JsDictionary<V>): IterableIterator<V> {
+        return Object.values(j).values();
+    }
+
+    export function clone<V>(j: JsDictionary<V>): JsDictionary<V> {
+        const r: JsDictionary<V> = {};
+        Object.keys(j).forEach(k => r[k] = j[k]);
+        return r;
+    }
+
+    export function equals<V>(
+        j1: JsDictionary<V>,
+        j2: JsDictionary<V>,
+        eqv: Equivalence<V> = (v1, v2) => v1 === v2,
+    ): boolean {
+        if (size(j1) !== size(j2)) return false;
+        for (let [k, v] of entries(j1)) {
+            if (!has(j2, k)) return false;
+            if (!eqv(v, get(j2, k)!)) return false;
+        }
+        return true;
+    }
+}
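The new `JsDictionary` stores symbol keys in a plain object under their `description` string, recovering them with `Symbol.for` on the way out; this works because `Symbol.for` interns symbols by name, so the round-trip yields the identical symbol. A minimal standalone sketch of that round-trip (hypothetical `jd*` helper names, without the library's `_iterMap`):

```typescript
// A plain object used as a symbol-keyed dictionary: keys go in as their
// description string and come back out via Symbol.for interning.
type JsDict<V> = { [key: string]: V };

function jdSet<V>(j: JsDict<V>, key: symbol, value: V): JsDict<V> {
    j[key.description!] = value;
    return j;
}

function jdGet<V>(j: JsDict<V>, key: symbol): V | undefined {
    return j[key.description!];
}

function jdEntries<V>(j: JsDict<V>): [symbol, V][] {
    return Object.entries(j).map(([k, v]) => [Symbol.for(k), v] as [symbol, V]);
}
```

The scheme assumes keys are registry symbols (`Symbol.for(...)`); a unique symbol from `Symbol('x')` would collide with `Symbol.for('x')` under this encoding, which is why the library confines it to Preserves symbols.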
@ -3,13 +3,15 @@ import { Bytes } from "./bytes";
|
||||||
import { fold } from "./fold";
|
import { fold } from "./fold";
|
||||||
import { is } from "./is";
|
import { is } from "./is";
|
||||||
import { Value } from "./values";
|
import { Value } from "./values";
|
||||||
import { Set, Dictionary } from "./dictionary";
|
import { Set, Dictionary, KeyedDictionary, DictionaryMap } from "./dictionary";
|
||||||
import { Annotated } from "./annotated";
|
import { Annotated } from "./annotated";
|
||||||
import { unannotate } from "./strip";
|
import { unannotate } from "./strip";
|
||||||
import { embed, isEmbedded, Embedded } from "./embedded";
|
import { isEmbedded } from "./embedded";
|
||||||
import { isCompound } from "./compound";
|
import { isCompound } from "./compound";
|
||||||
|
import type { Embeddable } from "./embedded";
|
||||||
|
import { JsDictionary } from "./jsdictionary";
|
||||||
|
|
||||||
export function merge<T>(
|
export function merge<T extends Embeddable>(
|
||||||
mergeEmbeddeds: (a: T, b: T) => T | undefined,
|
mergeEmbeddeds: (a: T, b: T) => T | undefined,
|
||||||
item0: Value<T>,
|
item0: Value<T>,
|
||||||
... items: Array<Value<T>>): Value<T>
|
... items: Array<Value<T>>): Value<T>
|
||||||
|
@ -32,7 +34,6 @@ export function merge<T>(
|
||||||
}
|
}
|
||||||
return fold<T, Value<T>>(a, {
|
return fold<T, Value<T>>(a, {
|
||||||
boolean: die,
|
boolean: die,
|
||||||
single(_f: number) { return is(a, b) ? a : die(); },
|
|
||||||
double(_f: number) { return is(a, b) ? a : die(); },
|
double(_f: number) { return is(a, b) ? a : die(); },
|
||||||
integer: die,
|
integer: die,
|
||||||
string: die,
|
string: die,
|
||||||
|
@@ -43,33 +44,38 @@ export function merge<T>(
         if (!Record.isRecord<Value<T>, Tuple<Value<T>>, T>(b)) die();
         return Record(walk(r.label, b.label), walkMany(r, b));
     },
 
     array(a: Array<Value<T>>) {
         if (!Array.isArray(b) || Record.isRecord(b)) die();
         return walkMany(a, b);
     },
 
     set(_s: Set<T>) { die(); },
-    dictionary(d: Dictionary<T>) {
-        if (!Dictionary.isDictionary<T>(b)) die();
-        const r = new Dictionary<T>();
-        d.forEach((av,ak) => {
-            const bv = b.get(ak);
+    dictionary(aMap: DictionaryMap<T>) {
+        const bMap = Dictionary.asMap<T>(b);
+        if (bMap === void 0) die();
+        const r = new DictionaryMap<T>();
+        aMap.forEach((av,ak) => {
+            const bv = bMap.get(ak);
             r.set(ak, bv === void 0 ? av : walk(av, bv));
         });
-        b.forEach((bv, bk) => {
-            if (!d.has(bk)) r.set(bk, bv);
+        bMap.forEach((bv, bk) => {
+            if (!aMap.has(bk)) r.set(bk, bv);
         });
-        return r;
+        return r.simplifiedValue();
     },
 
     annotated(a: Annotated<T>) {
         return walk(a, unannotate(b));
     },
 
-    embedded(t: Embedded<T>) {
+    embedded(t: T) {
         if (!isEmbedded<T>(b)) die();
-        const r = mergeEmbeddeds(t.embeddedValue, b.embeddedValue);
+        const r = mergeEmbeddeds(t, b);
         if (r === void 0) die();
-        return embed(r);
+        return r;
     },
 });
 }
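The `dictionary` branch above implements a union-with-merge over two key/value maps: keys present on both sides are merged recursively, keys present on only one side are copied through. A standalone sketch of that pattern over plain `Map`s (names here are illustrative, not the library's API):

```typescript
// Sketch of the key-merging pattern used by the dictionary branch above.
// `mergeValues` stands in for the recursive `walk`; here it just requires equality.
function mergeMaps<K, V>(
    a: Map<K, V>,
    b: Map<K, V>,
    mergeValues: (x: V, y: V) => V,
): Map<K, V> {
    const r = new Map<K, V>();
    a.forEach((av, ak) => {
        const bv = b.get(ak);
        r.set(ak, bv === undefined ? av : mergeValues(av, bv));
    });
    b.forEach((bv, bk) => {
        if (!a.has(bk)) r.set(bk, bv); // keys present only in b
    });
    return r;
}

const merged = mergeMaps(
    new Map([['x', 1], ['y', 2]]),
    new Map([['y', 2], ['z', 3]]),
    (x, y) => { if (x !== y) throw new Error('merge conflict'); return x; });
```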
@@ -2,12 +2,12 @@
 import { Annotated } from './annotated';
 import { Bytes } from './bytes';
-import { Set, Dictionary } from './dictionary';
+import { Set, EncodableDictionary } from './dictionary';
 import { stringify } from './text';
 
 import * as util from 'util';
 
-[Bytes, Annotated, Set, Dictionary].forEach((C) => {
+[Bytes, Annotated, Set, EncodableDictionary].forEach((C) => {
     (C as any).prototype[util.inspect.custom] =
         function (_depth: any, _options: any) {
             return stringify(this, { indent: 2 });
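For reference, the `util.inspect.custom` hook patched in above is standard Node behaviour: installing a function under that symbol on a prototype makes `util.inspect` (and hence `console.log`) use its string result. A minimal standalone sketch with a hypothetical `Point` class:

```typescript
import * as util from 'util';

class Point {
    constructor(public x: number, public y: number) {}
}

// Same trick as in the hunk above: install a custom inspector on the prototype.
(Point.prototype as any)[util.inspect.custom] =
    function (this: Point, _depth: any, _options: any) {
        return `<point ${this.x} ${this.y}>`;
    };

const rendered = util.inspect(new Point(1, 2)); // "<point 1 2>"
```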
@@ -0,0 +1,103 @@
+import { is, isAnnotated } from './is';
+import { Bytes } from './bytes';
+import { Set, Dictionary } from './dictionary';
+import { isEmbedded } from './embedded';
+import { Float } from './float';
+import { Value } from './values';
+import { Record } from './record';
+import type { Embeddable } from './embedded';
+
+export function typeCode<T extends Embeddable>(v: Value<T>): number {
+    if (isAnnotated<T>(v)) v = v.item;
+    switch (typeof v) {
+        case 'boolean':
+            return 0;
+        case 'number':
+        case 'bigint':
+            return 3;
+        case 'string':
+            return 4;
+        case 'symbol':
+            return 6;
+        case 'object':
+            if (Float.isFloat(v)) return 2; // 1 was for single-precision floats
+            if (Bytes.isBytes(v)) return 5;
+            if (Array.isArray(v)) {
+                return ('label' in v) ? 7 : 8;
+            }
+            if (Set.isSet<T>(v)) return 9;
+            if (Dictionary.isDictionary<T>(v)) return 10;
+            if (isEmbedded(v)) return 11;
+            /* fall through */
+        default:
+            throw new Error("Invalid Value<T> in typeCode");
+    }
+}
+
+export function compare<T extends Embeddable>(
+    a: Value<T>,
+    b: Value<T>,
+    compare_embedded: (a: T, b: T) => number = (a, b) => is(a, b) ? 0 : a < b ? -1 : 1,
+): number {
+    function cmp(a: Value<T>, b: Value<T>): number {
+        if (isAnnotated<T>(a)) a = a.item;
+        if (isAnnotated<T>(b)) b = b.item;
+        const ta = typeCode(a);
+        const tb = typeCode(b);
+        if (ta < tb) return -1;
+        if (ta > tb) return 1;
+        switch (ta) {
+            case 0:
+            case 3:
+            case 4: {
+                const va = a as any;
+                const vb = b as any;
+                return va < vb ? -1 : va > vb ? 1 : 0;
+            }
+            // case 1: // was single-precision
+            case 2: {
+                const va = (a as Float).value;
+                const vb = (b as Float).value;
+                return va < vb ? -1 : va > vb ? 1 : 0;
+            }
+            case 5:
+                return Bytes.compare(a as Bytes, b as Bytes);
+            case 6: {
+                const va = (a as symbol).description!;
+                const vb = (b as symbol).description!;
+                return va < vb ? -1 : va > vb ? 1 : 0;
+            }
+            case 7: {
+                const lr = cmp((a as Record<Value<T>, Value<T>[], T>).label,
+                               (b as Record<Value<T>, Value<T>[], T>).label);
+                if (lr !== 0) return lr;
+                /* fall through */
+            }
+            case 8: {
+                const va = a as Value<T>[];
+                const vb = b as Value<T>[];
+                const l = Math.min(va.length, vb.length)
+                for (let i = 0; i < l; i++) {
+                    const c = cmp(va[i], vb[i]);
+                    if (c !== 0) return c;
+                }
+                return va.length < vb.length ? -1 : va.length > vb.length ? 1 : 0;
+            }
+            case 9: {
+                const va = Array.from(a as Set<T>).sort(cmp);
+                const vb = Array.from(b as Set<T>).sort(cmp);
+                return cmp(va, vb);
+            }
+            case 10: {
+                const va = Array.from(Dictionary.asMap<T>(a)!.entries()).sort(cmp);
+                const vb = Array.from(Dictionary.asMap<T>(b)!.entries()).sort(cmp);
+                return cmp(va, vb);
+            }
+            case 11:
+                return compare_embedded(a as T, b as T);
+            default:
+                throw new Error("Invalid typeCode: " + ta);
+        }
+    }
+    return cmp(a, b);
+}
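The new `compare` orders values first by `typeCode`, then within a type. A reduced standalone sketch covering only booleans (code 0), numbers (3), strings (4) and sequences (8), to show the two-level ordering; names and the reduced `Val` type are illustrative, not the library's:

```typescript
// Reduced sketch of the typeCode-then-contents total ordering from the new file above.
type Val = boolean | number | string | Val[];

function typeCode(v: Val): number {
    switch (typeof v) {
        case 'boolean': return 0;
        case 'number': return 3;
        case 'string': return 4;
        default: return 8; // array
    }
}

function cmp(a: Val, b: Val): number {
    const ta = typeCode(a), tb = typeCode(b);
    if (ta !== tb) return ta < tb ? -1 : 1;    // different types: order by code
    if (Array.isArray(a) && Array.isArray(b)) {
        const l = Math.min(a.length, b.length);
        for (let i = 0; i < l; i++) {          // lexicographic over elements...
            const c = cmp(a[i], b[i]);
            if (c !== 0) return c;
        }
        return a.length < b.length ? -1 : a.length > b.length ? 1 : 0; // ...then length
    }
    const x = a as any, y = b as any;          // same primitive type: native ordering
    return x < y ? -1 : x > y ? 1 : 0;
}
```

For example, `cmp(true, 0)` is `-1` because booleans sort before all numbers, regardless of value.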
@@ -0,0 +1,353 @@
+// Preserves-Expressions. https://preserves.dev/preserves-expressions.html
+
+import { ReaderBase } from './reader';
+import { Atom, Value } from './values';
+import { Position, annotate, formatPosition } from './annotated';
+import { Record as VRecord } from './record';
+import { Embeddable, GenericEmbedded } from './embedded';
+import { fromJS } from './fromjs';
+import { DictionaryMap, Set as VSet } from './dictionary';
+
+export type Expr = SimpleExpr | Punct;
+export type SimpleExpr = Atom | Compound | Embedded;
+
+export type Positioned<I> = { position: Position, item: I, annotations?: Annotations };
+
+export class Punct {
+    constructor(public text: string) {}
+    __as_preserve__(): Value { return VRecord(Symbol.for('p'), [Symbol.for(this.text)]); }
+
+    isComma(): boolean { return this.text === ','; }
+    static isComma(v: Expr): boolean { return v instanceof Punct && v.isComma(); }
+
+    isColon(n = 1): boolean { return this.text === ':'.repeat(n); }
+    static isColon(v: Expr, n = 1): boolean { return v instanceof Punct && v.isColon(n); }
+}
+
+export class Embedded {
+    constructor(public expr: SimpleExpr, public annotations?: Annotations) {}
+    __as_preserve__(): Value {
+        const v = fromJS(this.expr);
+        return new GenericEmbedded(this.annotations?.wrap(v) ?? v);
+    }
+}
+
+export class BaseCompound<I> {
+    positions: Position[] = [];
+    exprs: I[] = [];
+    annotations?: Annotations[] = void 0; // sparse array when non-void
+
+    get(i: number): Positioned<I> | undefined {
+        if (i >= this.exprs.length) return void 0;
+        return {
+            position: this.positions[i],
+            item: this.exprs[i],
+            annotations: this.annotations && this.annotations[i],
+        };
+    }
+
+    push(p: Positioned<I>): true;
+    push(expr: I, position: Position, annotations?: Annotations): true;
+    push(v: Positioned<I> | I, position?: Position, annotations?: Annotations) {
+        if (position === void 0) {
+            const p = v as Positioned<I>;
+            if (p.annotations) this._ensureAnnotations()[this.exprs.length] = p.annotations;
+            this.positions.push(p.position);
+            this.exprs.push(p.item);
+        } else {
+            if (annotations) this._ensureAnnotations()[this.exprs.length] = annotations;
+            this.positions.push(position);
+            this.exprs.push(v as I);
+        }
+        return true;
+    }
+
+    _ensureAnnotations(): Annotations[] {
+        if (this.annotations === void 0) this.annotations = [];
+        return this.annotations;
+    }
+
+    _annotationsAt(index: number): Annotations {
+        return this._ensureAnnotations()[index] ??= new Annotations();
+    }
+
+    preservesValues(): Value[] {
+        return this.exprs.map((p, i) => {
+            const v = fromJS(p);
+            if (this.annotations?.[i] !== void 0) {
+                return this.annotations[i].wrap(v);
+            } else {
+                return v;
+            }
+        });
+    }
+
+    __as_preserve__(): Value {
+        return this.preservesValues();
+    }
+
+    map<R>(f: (item: Positioned<I>, index: number) => R, offset = 0): R[] {
+        const result: R[] = [];
+        for (let i = offset; i < this.exprs.length; i++) {
+            result.push(f(this.get(i)!, i));
+        }
+        return result;
+    }
+
+    [Symbol.iterator](): IterableIterator<Positioned<I>> {
+        let c = this;
+        let i = 0;
+        return {
+            next(): IteratorResult<Positioned<I>> {
+                if (i < c.exprs.length) {
+                    return { done: false, value: c.get(i++)! };
+                } else {
+                    return { done: true, value: void 0 };
+                }
+            },
+            [Symbol.iterator]() { return c[Symbol.iterator](); }
+        };
+    }
+}
+
+export class Document extends BaseCompound<Expr> {}
+
+export class Annotations extends BaseCompound<SimpleExpr> {
+    wrap(v: Value): Value {
+        return annotate(v, ... this.preservesValues());
+    }
+}
+
+export type CompoundVariant = 'sequence' | 'record' | 'block' | 'group' | 'set';
+
+export abstract class Compound extends BaseCompound<Expr> {
+    abstract get variant(): CompoundVariant;
+    __as_preserve__(): Value {
+        const vs = this.preservesValues();
+        switch (this.variant) {
+            case 'sequence': return vs;
+            case 'record': return VRecord(Symbol.for('r'), vs);
+            case 'block': return VRecord(Symbol.for('b'), vs);
+            case 'group': return VRecord(Symbol.for('g'), vs);
+            case 'set': return VRecord(Symbol.for('s'), vs);
+        }
+    }
+}
+
+export class Sequence extends Compound {
+    get variant(): CompoundVariant { return 'sequence'; }
+}
+
+export class Record extends Compound {
+    get variant(): CompoundVariant { return 'record'; }
+}
+
+export class Block extends Compound {
+    get variant(): CompoundVariant { return 'block'; }
+}
+
+export class Group extends Compound {
+    get variant(): CompoundVariant { return 'group'; }
+}
+
+export class Set extends Compound {
+    get variant(): CompoundVariant { return 'set'; }
+}
+
+export class Reader extends ReaderBase<never> {
+    nextDocument(howMany: 'all' | 'one' = 'all'): Document {
+        const doc = new Document();
+        this.readExpr(doc);
+        if (howMany === 'all') {
+            while (this.readExpr(doc)) {}
+        }
+        return doc;
+    }
+
+    readCompound(c: Compound, terminator: string): Compound {
+        while (this.readExpr(c, terminator)) {}
+        return c;
+    }
+
+    readSimpleExpr(c: BaseCompound<SimpleExpr>): boolean {
+        return this._readInto(c, false);
+    }
+
+    readExpr(c: BaseCompound<Expr>, terminator: string | null = null): boolean {
+        return this._readInto(c as BaseCompound<SimpleExpr> /* yuck */, true, terminator);
+    }
+
+    _checkTerminator(actual: string, expected: string | null, startPos: Position): false {
+        if (actual === expected) return false;
+        this.state.error('Unexpected ' + actual, startPos);
+    }
+
+    _readInto(c: BaseCompound<SimpleExpr>, acceptPunct: boolean, terminator: string | null = null): boolean {
+        while (true) {
+            this.state.skipws();
+            if (this.state.atEnd() && terminator === null) return false;
+            const startPos = this.state.copyPos();
+            const ch = this.state.nextchar();
+            switch (ch) {
+                case '"':
+                    return c.push(this.state.readString('"'), startPos);
+                case '|':
+                    return c.push(Symbol.for(this.state.readString('|')), startPos);
+                case ';':
+                    if (acceptPunct) {
+                        return (c as BaseCompound<Expr>).push(new Punct(';'), startPos);
+                    } else {
+                        this.state.error('Semicolon is not permitted at this location', startPos);
+                    }
+                case '@':
+                    if (!this.readSimpleExpr(c._annotationsAt(c.exprs.length))) {
+                        this.state.error('Missing annotation', startPos);
+                    }
+                    continue;
+                case ':': {
+                    let colons: string = ch;
+                    while (!this.state.atEnd() && this.state.peek() === ':') {
+                        colons = colons + ':';
+                        this.state.advance();
+                    }
+                    if (acceptPunct) {
+                        return (c as BaseCompound<Expr>).push(new Punct(colons), startPos);
+                    } else {
+                        this.state.error('Colons are not permitted at this location', startPos);
+                    }
+                }
+                case '#': {
+                    const ch = this.state.nextchar();
+                    switch (ch) {
+                        case ' ': case '\t': {
+                            const here = this.state.copyPos();
+                            c._annotationsAt(c.exprs.length).push(this.state.readCommentLine(), here);
+                            continue;
+                        }
+                        case '\n': case '\r': {
+                            const here = this.state.copyPos();
+                            c._annotationsAt(c.exprs.length).push('', here);
+                            continue;
+                        }
+                        case '!': {
+                            const here = this.state.copyPos();
+                            const r = new Record();
+                            r.push(Symbol.for('interpreter'), here);
+                            r.push(this.state.readCommentLine(), here);
+                            c._annotationsAt(c.exprs.length).push(r, here);
+                            continue;
+                        }
+                        case 'f': this.state.requireDelimiter('#f'); return c.push(false, startPos);
+                        case 't': this.state.requireDelimiter('#t'); return c.push(true, startPos);
+                        case '{': return c.push(this.readCompound(new Set(), '}'), startPos);
+                        case '"': return c.push(this.state.readLiteralBinary(), startPos);
+                        case 'x': switch (this.state.nextchar()) {
+                            case '"': return c.push(this.state.readHexBinary(), startPos);
+                            case 'd': return c.push(this.state.readHexFloat(), startPos);
+                            default: this.state.error('Invalid #x syntax', startPos);
+                        }
+                        case '[': return c.push(this.state.readBase64Binary(), startPos);
+                        case ':': {
+                            const r = new BaseCompound<SimpleExpr>();
+                            if (!this.readSimpleExpr(r)) return false;
+                            const e = new Embedded(r.exprs[0], r.annotations && r.annotations[0]);
+                            return c.push(e, startPos);
+                        }
+                        default:
+                            this.state.error(`Invalid # syntax: ${ch}`, startPos);
+                    }
+                }
+                case '(': return c.push(this.readCompound(new Group(), ')'), startPos);
+                case '<': return c.push(this.readCompound(new Record(), '>'), startPos);
+                case '[': return c.push(this.readCompound(new Sequence(), ']'), startPos);
+                case '{': return c.push(this.readCompound(new Block(), '}'), startPos);
+                case ')': return this._checkTerminator(ch, terminator, startPos);
+                case '>': return this._checkTerminator(ch, terminator, startPos);
+                case ']': return this._checkTerminator(ch, terminator, startPos);
+                case '}': return this._checkTerminator(ch, terminator, startPos);
+                case ',':
+                    if (acceptPunct) {
+                        return (c as BaseCompound<Expr>).push(new Punct(','), startPos);
+                    } else {
+                        this.state.error('Comma is not permitted at this location', startPos);
+                    }
+                default:
+                    return c.push(this.state.readRawSymbolOrNumber(ch), startPos);
+            }
+        }
+    }
+}
+
+export interface AsPreservesOptions<T extends Embeddable> {
+    onGroup?: (g: Positioned<Group>) => Value<T>;
+    onEmbedded?: (e: Positioned<Expr>, walk: (p: Positioned<Expr>) => Value<T>) => Value<T>;
+    error?: (tag: string, position: Position) => Value<T>;
+}
+
+export function asPreserves<T extends Embeddable>(
+    p: Positioned<Expr>,
+    options: AsPreservesOptions<T> = {},
+): Value<T> {
+    const error = options.error ?? ((tag, position) => {
+        throw new Error(formatPosition(position) + ": " + tag);
+    });
+
+    function nonCommas(p: Compound): Positioned<Expr>[] {
+        return Array.from(p).filter(p => !Punct.isComma(p.item));
+    }
+
+    function walk(p: Positioned<Expr>): Value<T> {
+        if (p.item instanceof Punct) {
+            return error('invalid-punctuation', p.position);
+        } else if (p.item instanceof Embedded) {
+            if (options.onEmbedded) {
+                return options.onEmbedded({ position: p.position, item: p.item.expr }, walk);
+            } else {
+                return error('unexpected-embedded', p.position);
+            }
+        } else if (p.item instanceof Compound) {
+            switch (p.item.variant) {
+                case 'sequence':
+                    return nonCommas(p.item).map(walk);
+                case 'record': {
+                    const vs = nonCommas(p.item).map(walk);
+                    if (vs.length < 1) {
+                        return error('invalid-record', p.position);
+                    }
+                    const r = vs.slice(1) as unknown as VRecord<Value<T>, Value<T>[], T>;
+                    r.label = vs[0];
+                    return r;
+                }
+                case 'block': {
+                    const d = new DictionaryMap<T>();
+                    const vs = nonCommas(p.item);
+                    if ((vs.length % 3) !== 0) {
+                        return error('invalid-dictionary', p.position);
+                    }
+                    for (let i = 0; i < vs.length; i += 3) {
+                        if (!Punct.isColon(vs[i + 1].item)) {
+                            return error('missing-colon', vs[i + 1].position);
+                        }
+                        const k = walk(vs[i]);
+                        const v = walk(vs[i + 2]);
+                        d.set(k, v);
+                    }
+                    return d.simplifiedValue();
+                }
+                case 'group': {
+                    if (options.onGroup) {
+                        return options.onGroup(p as Positioned<Group>);
+                    } else {
+                        return error('unexpected-group', p.position);
+                    }
+                }
+                case 'set':
+                    return new VSet(nonCommas(p.item).map(walk));
+            }
+        } else {
+            return p.item;
+        }
+    }
+
+    return walk(p);
+}
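In `asPreserves` above, the `block` case requires the block's children, once commas are dropped, to come in `(key, ':', value)` triples. A standalone sketch of that validation with simplified stand-in types (the `Tok` shape is illustrative, not the library's):

```typescript
// Sketch of the block-to-dictionary walk above: items must come in
// (key, colon, value) triples after commas have been filtered out.
type Tok = { colon: true } | { value: string | number };

function blockToMap(items: Tok[]): Map<string | number, string | number> {
    if (items.length % 3 !== 0) throw new Error('invalid-dictionary');
    const d = new Map<string | number, string | number>();
    for (let i = 0; i < items.length; i += 3) {
        const sep = items[i + 1];
        if (!('colon' in sep)) throw new Error('missing-colon');
        const k = items[i], v = items[i + 2];
        if ('colon' in k || 'colon' in v) throw new Error('invalid-punctuation');
        d.set(k.value, v.value);
    }
    return d;
}

// Corresponds to the textual block { a: 1 b: 2 }.
const m = blockToMap([
    { value: 'a' }, { colon: true }, { value: 1 },
    { value: 'b' }, { colon: true }, { value: 2 },
]);
```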
@@ -2,15 +2,15 @@
 import type { Value } from './values';
 import { DecodeError, ShortPacket } from './codec';
-import { Dictionary, Set } from './dictionary';
+import { Dictionary, DictionaryMap, Set } from './dictionary';
 import { strip } from './strip';
 import { Bytes, unhexDigit } from './bytes';
 import { Decoder, DecoderState, neverEmbeddedTypeDecode } from './decoder';
 import { Record } from './record';
 import { Annotated, newPosition, Position, updatePosition } from './annotated';
-import { Double, DoubleFloat, FloatType, Single, SingleFloat } from './float';
+import { Double, DoubleFloat } from './float';
 import { stringify } from './text';
-import { embed, GenericEmbedded, EmbeddedTypeDecode } from './embedded';
+import { Embeddable, GenericEmbedded, EmbeddedTypeDecode } from './embedded';
 
 export interface ReaderStateOptions {
     includeAnnotations?: boolean;
@@ -24,12 +24,10 @@ export interface ReaderOptions<T> extends ReaderStateOptions {
 const MAX_SAFE_INTEGERn = BigInt(Number.MAX_SAFE_INTEGER);
 const MIN_SAFE_INTEGERn = BigInt(Number.MIN_SAFE_INTEGER);
 
-export const NUMBER_RE: RegExp = /^([-+]?\d+)(((\.\d+([eE][-+]?\d+)?)|([eE][-+]?\d+))([fF]?))?$/;
+export const NUMBER_RE: RegExp = /^([-+]?\d+)((\.\d+([eE][-+]?\d+)?)|([eE][-+]?\d+))?$/;
 // Groups:
 // 1 - integer part and sign
-// 2 - decimal part, exponent and Float marker
-// 3 - decimal part and exponent
-// 7 - Float marker
+// 2 - decimal part and exponent
 
 export class ReaderState {
     buffer: string;
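The revised `NUMBER_RE` (copied from the hunk above) no longer accepts the trailing `[fF]` single-precision marker: group 1 is the signed integer part, and group 2, when present, holds the decimal part and/or exponent. A quick demonstration of how a reader can use it to distinguish integers, doubles, and plain symbols:

```typescript
// The revised regexp from the hunk above.
const NUMBER_RE: RegExp = /^([-+]?\d+)((\.\d+([eE][-+]?\d+)?)|([eE][-+]?\d+))?$/;

// Group 2 undefined means a bare integer; anything that fails the regexp
// falls back to being read as a symbol (as in readRawSymbolOrNumber).
function classify(s: string): 'integer' | 'double' | 'symbol' {
    const m = NUMBER_RE.exec(s);
    if (!m) return 'symbol';
    return m[2] === undefined ? 'integer' : 'double';
}
```

Note that `"1.0f"` no longer matches at all, consistent with the removal of single-precision floats elsewhere in this change.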
@@ -131,20 +129,14 @@ export class ReaderState {
         }
     }
 
-    readHexFloat(precision: FloatType): SingleFloat | DoubleFloat {
+    readHexFloat(): DoubleFloat {
         const pos = this.copyPos();
         if (this.nextchar() !== '"') {
             this.error("Missing open-double-quote in hex-encoded floating-point number", pos);
         }
         const bs = this.readHexBinary();
-        switch (precision) {
-            case 'Single':
-                if (bs.length !== 4) this.error("Incorrect number of bytes in hex-encoded Float", pos);
-                return SingleFloat.fromBytes(bs);
-            case 'Double':
-                if (bs.length !== 8) this.error("Incorrect number of bytes in hex-encoded Double", pos);
-                return DoubleFloat.fromBytes(bs);
-        }
+        if (bs.length !== 8) this.error("Incorrect number of bytes in hex-encoded Double", pos);
+        return DoubleFloat.fromBytes(bs);
     }
 
     readBase64Binary(): Bytes {
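`DoubleFloat.fromBytes` above consumes the eight bytes of an IEEE 754 double. As a standalone sketch (assuming big-endian byte order; helper names are illustrative), the decoding comes down to a `DataView.getFloat64`:

```typescript
// Sketch of decoding a hex-encoded Double: eight bytes interpreted as a
// big-endian IEEE 754 float64 (cf. DoubleFloat.fromBytes in the hunk above).
function doubleFromBytes(bs: Uint8Array): number {
    if (bs.length !== 8) throw new Error('Incorrect number of bytes in hex-encoded Double');
    return new DataView(bs.buffer, bs.byteOffset, 8).getFloat64(0, false); // false = big-endian
}

function unhex(s: string): Uint8Array {
    const bs = new Uint8Array(s.length / 2);
    for (let i = 0; i < bs.length; i++) bs[i] = parseInt(s.slice(i * 2, i * 2 + 2), 16);
    return bs;
}

const v = doubleFromBytes(unhex('3ff0000000000000')); // IEEE 754 bit pattern for 1.0
```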
@@ -155,7 +147,7 @@ export class ReaderState {
             if (c === ']') break;
             acc = acc + c;
         }
-        return decodeBase64(acc);
+        return Bytes.fromBase64(acc);
     }
 
     requireDelimiter(prefix: string): void {
@@ -163,13 +155,15 @@ export class ReaderState {
             this.error(`Delimiter must follow ${prefix}`, this.pos);
         }
     }
 
+    static readonly DELIMITERS = '(){}[]<>";,@#:|';
+
     delimiterFollows(): boolean {
         if (this.atEnd()) return true;
         const ch = this.peek();
-        return ('(){}[]<>";,@#:|'.indexOf(ch) !== -1) || isSpace(ch);
+        return (ReaderState.DELIMITERS.indexOf(ch) !== -1) || isSpace(ch);
     }
 
-    readRawSymbolOrNumber<T>(acc: string): Value<T> {
+    readRawSymbolOrNumber(acc: string): number | bigint | symbol | DoubleFloat {
         while (!this.delimiterFollows()) acc = acc + this.nextchar();
         const m = NUMBER_RE.exec(acc);
         if (m) {
@@ -180,10 +174,8 @@ export class ReaderState {
             } else {
                 return Number(v);
             }
-        } else if (m[7] === '') {
-            return Double(parseFloat(m[1] + m[3]));
         } else {
-            return Single(parseFloat(m[1] + m[3]));
+            return Double(parseFloat(acc));
         }
     } else {
         return Symbol.for(acc);
@@ -258,6 +250,16 @@ export class ReaderState {
             'x',
             () => this.readHex2());
     }
+
+    readCommentLine(): string {
+        let acc = '';
+        while (true) {
+            if (this.atEnd()) return acc;
+            const c = this.nextchar();
+            if (c === '\n' || c === '\r') return acc;
+            acc = acc + c;
+        }
+    }
 }
 
 export const genericEmbeddedTypeDecode: EmbeddedTypeDecode<GenericEmbedded> = {
@@ -270,7 +272,7 @@ export const genericEmbeddedTypeDecode: EmbeddedTypeDecode<GenericEmbedded> = {
     },
 };
 
-export class Reader<T> {
+export class ReaderBase<T extends Embeddable> {
     state: ReaderState;
     embeddedType: EmbeddedTypeDecode<T>;
@@ -293,17 +295,13 @@ export class Reader<T> {
     write(data: string) {
         this.state.write(data);
     }
+}
+
+export class Reader<T extends Embeddable> extends ReaderBase<T> {
 
     readCommentLine(): Value<T> {
         const startPos = this.state.copyPos();
-        let acc = '';
-        while (true) {
-            const c = this.state.nextchar();
-            if (c === '\n' || c === '\r') {
-                return this.wrap(acc, startPos);
-            }
-            acc = acc + c;
-        }
+        return this.wrap(this.state.readCommentLine(), startPos);
     }
 
     wrap(v: Value<T>, pos: Position): Value<T> {
@@ -354,20 +352,22 @@ export class Reader<T> {
         switch (c) {
             case ' ': case '\t': return this.annotateNextWith(this.readCommentLine());
             case '\n': case '\r': return this.annotateNextWith('');
+            case '!':
+                return this.annotateNextWith(
+                    Record(Symbol.for('interpreter'), [this.readCommentLine()]));
             case 'f': this.state.requireDelimiter('#f'); return false;
             case 't': this.state.requireDelimiter('#t'); return true;
             case '{': return this.readSet();
             case '"': return this.state.readLiteralBinary();
             case 'x': switch (this.state.nextchar()) {
                 case '"': return this.state.readHexBinary();
-                case 'f': return this.state.readHexFloat('Single');
-                case 'd': return this.state.readHexFloat('Double');
+                case 'd': return this.state.readHexFloat();
                 default: this.state.error('Invalid #x syntax', startPos);
             }
             case '[': return this.state.readBase64Binary();
-            case '!': return embed(this.embeddedType.fromValue(
+            case ':': return this.embeddedType.fromValue(
                 new Reader<GenericEmbedded>(this.state, genericEmbeddedTypeDecode).next(),
-                this.state.options));
+                this.state.options);
             default:
                 this.state.error(`Invalid # syntax: ${c}`, startPos);
         }
@@ -406,22 +406,18 @@ export class Reader<T> {
     }
 
     readDictionary(): Dictionary<T> {
-        return this.seq(true,
-            new Dictionary<T>(),
-            (k, acc) => {
-                this.state.skipws();
-                switch (this.state.peek()) {
-                    case ':':
-                        if (acc.has(k)) this.state.error(
-                            `Duplicate key: ${stringify(k)}`, this.state.pos);
-                        this.state.advance();
-                        acc.set(k, this.next());
-                        break;
-                    default:
-                        this.state.error('Missing key/value separator', this.state.pos);
-                }
-            },
-            '}');
+        return this.seq(true, new DictionaryMap<T>(), (k, acc) => {
+            this.state.skipws();
+            switch (this.state.peek()) {
+                case ':':
+                    this.state.advance();
+                    if (acc.has(k)) this.state.error(`Duplicate key: ${stringify(k)}`, this.state.pos);
+                    acc.set(k, this.next());
+                    break;
+                default:
+                    this.state.error('Missing key/value separator', this.state.pos);
+            }
+        }, '}').simplifiedValue();
     }
 
     readSet(): Set<T> {
@@ -436,31 +432,6 @@ export class Reader<T> {
         }
     }
 }
 
-const BASE64: {[key: string]: number} = {};
-[... 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789'].forEach(
-    (c, i) => BASE64[c] = i);
-BASE64['+'] = BASE64['-'] = 62;
-BASE64['/'] = BASE64['_'] = 63;
-
-export function decodeBase64(s: string): Bytes {
-    const bs = new Uint8Array(Math.floor(s.length * 3/4));
-    let i = 0;
-    let j = 0;
-    while (i < s.length) {
-        const v1 = BASE64[s[i++]];
-        const v2 = BASE64[s[i++]];
-        const v3 = BASE64[s[i++]];
-        const v4 = BASE64[s[i++]];
-        const v = (v1 << 18) | (v2 << 12) | (v3 << 6) | v4;
-        bs[j++] = (v >> 16) & 255;
-        if (v3 === void 0) break;
-        bs[j++] = (v >> 8) & 255;
-        if (v4 === void 0) break;
-        bs[j++] = v & 255;
-    }
-    return Bytes.from(bs.subarray(0, j));
-}
-
 function isSpace(s: string): boolean {
     return ' \t\n\r'.indexOf(s) !== -1;
 }
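The hand-rolled `decodeBase64` deleted above accepted both the standard and the URL-safe alphabet, with or without padding; the change replaces it with a `Bytes.fromBase64` helper. A standalone sketch of equivalent behaviour on top of Node's `Buffer`, which already tolerates missing padding, with the URL-safe alphabet normalised first:

```typescript
// Sketch of a decodeBase64 replacement: normalise '-'/'_' to '+'/'/'
// and let Buffer handle the (possibly unpadded) decode.
function decodeBase64(s: string): Uint8Array {
    const normalized = s.replace(/-/g, '+').replace(/_/g, '/');
    return new Uint8Array(Buffer.from(normalized, 'base64'));
}

const bytes = decodeBase64('aGVsbG8'); // unpadded base64 of "hello"
```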
@@ -1,11 +1,10 @@
-import { GenericEmbedded } from "./embedded";
+import { Embeddable, GenericEmbedded } from "./embedded";
 import { is } from "./is";
 import { Value } from "./values";
-import { Writer } from "./writer";
 
 export type Tuple<T> = Array<T> | [T];
 
-export type Record<LabelType extends Value<T>, FieldsType extends Tuple<Value<T>>, T = GenericEmbedded>
+export type Record<LabelType extends Value<T>, FieldsType extends Tuple<Value<T>>, T extends Embeddable = GenericEmbedded>
     = FieldsType & { label: LabelType };
 
 export type RecordGetters<Fs, R> = {
@@ -15,7 +14,7 @@ export type RecordGetters<Fs, R> = {
 export type CtorTypes<Fs, Names extends Tuple<keyof Fs>> =
     { [K in keyof Names]: Fs[keyof Fs & Names[K]] } & any[];
 
-export interface RecordConstructor<L extends Value<T>, Fs, Names extends Tuple<keyof Fs>, T = GenericEmbedded> {
+export interface RecordConstructor<L extends Value<T>, Fs, Names extends Tuple<keyof Fs>, T extends Embeddable = GenericEmbedded> {
     (...fields: CtorTypes<Fs, Names>): Record<L, CtorTypes<Fs, Names>, T>;
     constructorInfo: RecordConstructorInfo<L, T>;
     isClassOf(v: any): v is Record<L, CtorTypes<Fs, Names>, T>;
@ -23,7 +22,7 @@ export interface RecordConstructor<L extends Value<T>, Fs, Names extends Tuple<k
|
||||||
fieldNumbers: { [K in string & keyof Fs]: number };
|
fieldNumbers: { [K in string & keyof Fs]: number };
|
||||||
};
|
};
|
||||||
|
|
||||||
export interface RecordConstructorInfo<L extends Value<T>, T = GenericEmbedded> {
|
export interface RecordConstructorInfo<L extends Value<T>, T extends Embeddable = GenericEmbedded> {
|
||||||
label: L;
|
label: L;
|
||||||
arity: number;
|
arity: number;
|
||||||
}
|
}
|
||||||
|
@ -48,23 +47,23 @@ export function Record<L, FieldsType extends Tuple<any>>(
|
||||||
}
|
}
|
||||||
|
|
||||||
export namespace Record {
|
export namespace Record {
|
||||||
export function isRecord<L extends Value<T>, FieldsType extends Tuple<Value<T>>, T = GenericEmbedded>(x: any): x is Record<L, FieldsType, T> {
|
export function isRecord<L extends Value<T>, FieldsType extends Tuple<Value<T>>, T extends Embeddable = GenericEmbedded>(x: any): x is Record<L, FieldsType, T> {
|
||||||
return Array.isArray(x) && 'label' in x;
|
return Array.isArray(x) && 'label' in x;
|
||||||
}
|
}
|
||||||
|
|
||||||
export function constructorInfo<L extends Value<T>, FieldsType extends Tuple<Value<T>>, T = GenericEmbedded>(
|
export function constructorInfo<L extends Value<T>, FieldsType extends Tuple<Value<T>>, T extends Embeddable = GenericEmbedded>(
|
||||||
r: Record<L, FieldsType, T>): RecordConstructorInfo<L, T>
|
r: Record<L, FieldsType, T>): RecordConstructorInfo<L, T>
|
||||||
{
|
{
|
||||||
return { label: r.label, arity: r.length };
|
return { label: r.label, arity: r.length };
|
||||||
}
|
}
|
||||||
|
|
||||||
export function isClassOf<L extends Value<T>, FieldsType extends Tuple<Value<T>>, T = GenericEmbedded>(
|
export function isClassOf<L extends Value<T>, FieldsType extends Tuple<Value<T>>, T extends Embeddable = GenericEmbedded>(
|
||||||
ci: RecordConstructorInfo<L, T>, v: any): v is Record<L, FieldsType, T>
|
ci: RecordConstructorInfo<L, T>, v: any): v is Record<L, FieldsType, T>
|
||||||
{
|
{
|
||||||
return (Record.isRecord(v)) && is(ci.label, v.label) && (ci.arity === v.length);
|
return (Record.isRecord(v)) && is(ci.label, v.label) && (ci.arity === v.length);
|
||||||
}
|
}
|
||||||
|
|
||||||
export function makeConstructor<Fs, T = GenericEmbedded>()
|
export function makeConstructor<Fs, T extends Embeddable = GenericEmbedded>()
|
||||||
: (<L extends Value<T>, Names extends Tuple<keyof Fs>>(label: L, fieldNames: Names) =>
|
: (<L extends Value<T>, Names extends Tuple<keyof Fs>>(label: L, fieldNames: Names) =>
|
||||||
RecordConstructor<L, Fs, Names, T>)
|
RecordConstructor<L, Fs, Names, T>)
|
||||||
{
|
{
|
||||||
|
|
|
@@ -1,4 +1,5 @@
 export * from './annotated';
+export * from './base64';
 export * from './bytes';
 export * from './codec';
 export * from './compound';
@@ -12,7 +13,9 @@ export * from './float';
 export * from './fold';
 export * from './fromjs';
 export * from './is';
+export * from './jsdictionary';
 export * from './merge';
+export * from './order';
 export * from './reader';
 export * from './record';
 export * from './strip';
@@ -1,18 +1,18 @@
 import { Value } from "./values";
 import { Annotated } from "./annotated";
 import { Record, Tuple } from "./record";
-import { Set, Dictionary } from "./dictionary";
+import { Set, Dictionary, DictionaryMap } from "./dictionary";
-import type { GenericEmbedded } from "./embedded";
+import type { Embeddable, GenericEmbedded } from "./embedded";
 
-export function unannotate<T = GenericEmbedded>(v: Value<T>): Value<T> {
+export function unannotate<T extends Embeddable = GenericEmbedded>(v: Value<T>): Value<T> {
     return Annotated.isAnnotated<T>(v) ? v.item : v;
 }
 
-export function peel<T = GenericEmbedded>(v: Value<T>): Value<T> {
+export function peel<T extends Embeddable = GenericEmbedded>(v: Value<T>): Value<T> {
     return strip(v, 1);
 }
 
-export function strip<T = GenericEmbedded>(
+export function strip<T extends Embeddable = GenericEmbedded>(
     v: Value<T>,
     depth: number = Infinity): Value<T>
 {
@@ -34,7 +34,9 @@ export function strip<T = GenericEmbedded>(
     } else if (Set.isSet<T>(v.item)) {
         return v.item.map(walk);
     } else if (Dictionary.isDictionary<T>(v.item)) {
-        return v.item.mapEntries((e) => [walk(e[0]), walk(e[1])]);
+        const result = new DictionaryMap<T>();
+        new DictionaryMap<T>(v.item).forEach((val, key) => result.set(walk(key), walk(val)));
+        return result.simplifiedValue();
     } else {
         return v.item;
     }
@@ -1,4 +1,4 @@
-import { Embedded, GenericEmbedded } from './embedded';
+import { Embeddable, GenericEmbedded, isEmbedded } from './embedded';
 import type { Value } from './values';
 
 import { Annotated } from './annotated';
@@ -8,22 +8,28 @@ import { Writer, WriterOptions, EmbeddedWriter, WriterState } from './writer';
 import { fromJS } from './fromjs';
 import { Reader, ReaderOptions } from './reader';
 
-export function parse<T = GenericEmbedded>(buffer: string, options?: ReaderOptions<T>): Value<T> {
+export function parse<T extends Embeddable = GenericEmbedded>(
+    buffer: string,
+    options?: ReaderOptions<T>,
+): Value<T> {
     return new Reader<T>(buffer, options).next();
 }
 
-export function parseAll<T = GenericEmbedded>(buffer: string, options?: ReaderOptions<T>): Value<T>[] {
+export function parseAll<T extends Embeddable = GenericEmbedded>(
+    buffer: string,
+    options?: ReaderOptions<T>,
+): Value<T>[] {
     return new Reader<T>(buffer, options).readToEnd();
 }
 
-export const stringifyEmbeddedWrite: EmbeddedWriter<any> = {
+export const stringifyEmbeddedWrite = {
     write(s: WriterState, v: any): void {
         if (v instanceof GenericEmbedded) {
             new Writer(s, this).push(v.generic);
         } else {
             try {
                 const j = fromJS(v);
-                if (!(j instanceof Embedded)) {
+                if (!isEmbedded(j)) {
                     new Writer(s, this).push(j);
                     return;
                 }
@@ -37,13 +43,13 @@ export const stringifyEmbeddedWrite: EmbeddedWriter<any> = {
     }
 };
 
-export function stringify<T = GenericEmbedded>(x: any, options?: WriterOptions<T>): string {
+export function stringify<T extends Embeddable = GenericEmbedded>(x: any, options?: WriterOptions<T>): string {
     options = { ... (options ?? {}) };
     options.embeddedWrite = options.embeddedWrite ?? stringifyEmbeddedWrite;
     return Writer.stringify(fromJS<T>(x), options);
 }
 
-export function preserves<T>(pieces: TemplateStringsArray, ...values: any[]): string {
+export function preserves(pieces: TemplateStringsArray, ...values: any[]): string {
     const result = [pieces[0]];
     values.forEach((v, i) => {
         result.push(stringify(v));
@@ -1,25 +1,25 @@
 // Preserves Values.
 
 import type { Bytes } from './bytes';
-import type { DoubleFloat, SingleFloat } from './float';
+import type { DoubleFloat } from './float';
 import type { Annotated } from './annotated';
-import type { Set, Dictionary } from './dictionary';
-import type { Embedded, GenericEmbedded } from './embedded';
+import type { JsDictionary } from './jsdictionary';
+import { Set, KeyedDictionary } from './dictionary';
+import type { Embeddable, GenericEmbedded } from './embedded';
 
-export type Value<T = GenericEmbedded> =
+export type Value<T extends Embeddable = GenericEmbedded> =
     | Atom
     | Compound<T>
-    | Embedded<T>
+    | T
     | Annotated<T>;
 export type Atom =
     | boolean
-    | SingleFloat
     | DoubleFloat
     | number | bigint
     | string
     | Bytes
     | symbol;
-export type Compound<T = GenericEmbedded> =
+export type Compound<T extends Embeddable = GenericEmbedded> =
     | (Array<Value<T>> | [Value<T>]) & { label: Value<T> }
     // ^ expanded from definition of Record<> in record.ts,
     // because if we use Record<Value<T>, Tuple<Value<T>>, T>,
@@ -28,4 +28,7 @@ export type Compound<T = GenericEmbedded> =
     // Value<T> to any.
     | Array<Value<T>>
     | Set<T>
-    | Dictionary<T>;
+    // v Expanded from definition of Dictionary<> in dictionary.ts,
+    // because of circular-use-of-Value<T> issues.
+    | JsDictionary<Value<T>>
+    | KeyedDictionary<T>;
@@ -1,18 +1,20 @@
 import { isAnnotated } from './is';
 import { Record, Tuple } from "./record";
-import type { GenericEmbedded, Embedded, EmbeddedTypeEncode } from "./embedded";
+import { Embeddable, GenericEmbedded, EmbeddedTypeEncode, isEmbedded } from "./embedded";
 import { Encoder, EncoderState } from "./encoder";
 import type { Value } from "./values";
 import { NUMBER_RE } from './reader';
+import { encodeBase64 } from './base64';
+import { DictionaryMap, writeDictionaryOn } from './dictionary';
 
-export type Writable<T> =
+export type Writable<T extends Embeddable> =
     Value<T> | PreserveWritable<T> | Iterable<Value<T>> | ArrayBufferView;
 
-export interface PreserveWritable<T> {
+export interface PreserveWritable<T extends Embeddable> {
     __preserve_text_on__(writer: Writer<T>): void;
 }
 
-export function isPreserveWritable<T>(v: any): v is PreserveWritable<T> {
+export function isPreserveWritable<T extends Embeddable>(v: any): v is PreserveWritable<T> {
     return typeof v === 'object' && v !== null && '__preserve_text_on__' in v && typeof v.__preserve_text_on__ === 'function';
 }
 
@@ -190,7 +192,7 @@ export class WriterState {
         this.pieces.push(buf + '"');
     }
 
-    couldBeFlat<T>(vs: Writable<T>[]): boolean {
+    couldBeFlat<T extends Embeddable>(vs: Writable<T>[]): boolean {
         let seenCompound = false;
         for (let v of vs) {
             if (Array.isArray(v) || Set.isSet(v) || Map.isMap(v)) {
@@ -205,29 +207,7 @@ export class WriterState {
     }
 }
 
-const BASE64 = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/'
-
-export function encodeBase64(bs: Uint8Array): string {
-    let s = '';
-    let buffer = 0;
-    let bitcount = 0;
-    for (let b of bs) {
-        buffer = ((buffer & 0x3f) << 8) | b;
-        bitcount += 8;
-        while (bitcount >= 6) {
-            bitcount -= 6;
-            const v = (buffer >> bitcount) & 0x3f;
-            s = s + BASE64[v];
-        }
-    }
-    if (bitcount > 0) {
-        const v = (buffer << (6 - bitcount)) & 0x3f;
-        s = s + BASE64[v];
-    }
-    return s;
-}
-
-export class Writer<T> {
+export class Writer<T extends Embeddable> {
     state: WriterState;
     embeddedWrite: EmbeddedWriter<T>;
 
@@ -246,7 +226,7 @@ export class Writer<T> {
         }
     }
 
-    static stringify<T>(v: Writable<T>, options?: WriterOptions<T>): string {
+    static stringify<T extends Embeddable>(v: Writable<T>, options?: WriterOptions<T>): string {
         const w = new Writer(options);
         w.push(v);
         return w.contents();
@@ -283,7 +263,10 @@ export class Writer<T> {
                 this.state.pieces.push('' + v);
                 break;
             case 'object':
-                if (isPreserveWritable<unknown>(v)) {
+                if (v === null) {
+                    throw new Error("Cannot encode null in Preserves Writer.push");
+                }
+                else if (isPreserveWritable<any>(v)) {
                     v.__preserve_text_on__(this);
                 }
                 else if (isPreserveWritable<T>(v)) {
@@ -316,16 +299,19 @@ export class Writer<T> {
                 else if (isIterable(v)) {
                     this.state.writeSeq('[', ']', v, vv => this.push(vv));
                 }
-                else {
-                    ((v: Embedded<T>) => {
-                        this.state.pieces.push('#!');
-                        if ('write' in this.embeddedWrite) {
-                            this.embeddedWrite.write(this.state, v.embeddedValue);
-                        } else {
-                            new Writer(this.state, genericEmbeddedTypeEncode)
-                                .push(this.embeddedWrite.toValue(v.embeddedValue));
-                        }
-                    })(v);
+                else if (isEmbedded(v)) {
+                    this.state.pieces.push('#:');
+                    if ('write' in this.embeddedWrite) {
+                        this.embeddedWrite.write(this.state, v);
+                    } else {
+                        new Writer(this.state, genericEmbeddedTypeEncode)
+                            .push(this.embeddedWrite.toValue(v));
+                    }
+                } else {
+                    writeDictionaryOn(new DictionaryMap<T>(v),
+                        this,
+                        (k, w) => w.push(k),
+                        (v, w) => w.push(v));
                 }
                 break;
             default:
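The `encodeBase64` function removed from the writer above (the diff now imports it from `./base64`) maintains a rolling bit accumulator: each input byte contributes 8 bits, and one alphabet character is emitted per 6 accumulated bits, with no `=` padding at the end. A standalone sketch of the same algorithm (names are illustrative):

```typescript
const B64_ALPHABET = 'ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/';

function encodeBase64(bs: Uint8Array): string {
    let s = '';
    let buffer = 0;   // rolling bit accumulator; only the low bits matter
    let bitcount = 0; // number of not-yet-emitted bits held in `buffer`
    for (const b of bs) {
        // Keep at most 6 leftover bits, then append the next byte.
        buffer = ((buffer & 0x3f) << 8) | b;
        bitcount += 8;
        while (bitcount >= 6) {
            bitcount -= 6;
            s += B64_ALPHABET[(buffer >> bitcount) & 0x3f];
        }
    }
    if (bitcount > 0) {
        // Left-align the trailing 2 or 4 bits into a final character.
        s += B64_ALPHABET[(buffer << (6 - bitcount)) & 0x3f];
    }
    return s; // note: no '=' padding is emitted
}
```

The unpadded output pairs naturally with the lenient decoder, which treats missing padding the same as `=`.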
@@ -82,37 +82,47 @@ describe('immutable byte arrays', () => {
 });
 
 describe('base64 decoder', () => {
-    describe('RFC4648 tests', () => {
-        it('10.0', () => expect(decodeBase64("")).is(Bytes.of()));
-        it('10.1', () => expect(decodeBase64("Zg==")).is(Bytes.of(102)));
-        it('10.2', () => expect(decodeBase64("Zm8=")).is(Bytes.of(102, 111)));
-        it('10.3', () => expect(decodeBase64("Zm9v")).is(Bytes.of(102, 111, 111)));
-        it('10.4', () => expect(decodeBase64("Zm9vYg==")).is(Bytes.of(102, 111, 111, 98)));
-        it('10.5', () => expect(decodeBase64("Zm9vYmE=")).is(Bytes.of(102, 111, 111, 98, 97)));
-        it('10.6', () => expect(decodeBase64("Zm9vYmFy")).is(Bytes.of(102, 111, 111, 98, 97, 114)));
-
-        it('10.1b', () => expect(decodeBase64("Zg")).is(Bytes.of(102)));
-        it('10.2b', () => expect(decodeBase64("Zm8")).is(Bytes.of(102, 111)));
-        it('10.4b', () => expect(decodeBase64("Zm9vYg")).is(Bytes.of(102, 111, 111, 98)));
-        it('10.5b', () => expect(decodeBase64("Zm9vYmE")).is(Bytes.of(102, 111, 111, 98, 97)));
+    const d64 = (s: string) => Bytes.from(decodeBase64(s));
+
+    describe('RFC4648 tests', () => {
+        it('10.0', () => expect(d64("")).is(Bytes.of()));
+        it('10.1', () => expect(d64("Zg==")).is(Bytes.of(102)));
+        it('10.2', () => expect(d64("Zm8=")).is(Bytes.of(102, 111)));
+        it('10.3', () => expect(d64("Zm9v")).is(Bytes.of(102, 111, 111)));
+        it('10.4', () => expect(d64("Zm9vYg==")).is(Bytes.of(102, 111, 111, 98)));
+        it('10.5', () => expect(d64("Zm9vYmE=")).is(Bytes.of(102, 111, 111, 98, 97)));
+        it('10.6', () => expect(d64("Zm9vYmFy")).is(Bytes.of(102, 111, 111, 98, 97, 114)));
+
+        it('10.1b', () => expect(d64("Zg")).is(Bytes.of(102)));
+        it('10.2b', () => expect(d64("Zm8")).is(Bytes.of(102, 111)));
+        it('10.4b', () => expect(d64("Zm9vYg")).is(Bytes.of(102, 111, 111, 98)));
+        it('10.5b', () => expect(d64("Zm9vYmE")).is(Bytes.of(102, 111, 111, 98, 97)));
     });
 
     describe('RFC4648 examples', () => {
         it('example0', () =>
-            expect(decodeBase64('FPucA9l+')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03, 0xd9, 0x7e)));
+            expect(d64('FPucA9l+')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03, 0xd9, 0x7e)));
         it('example1', () =>
-            expect(decodeBase64('FPucA9k=')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03, 0xd9)));
+            expect(d64('FPucA9k=')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03, 0xd9)));
         it('example1b', () =>
-            expect(decodeBase64('FPucA9k')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03, 0xd9)));
+            expect(d64('FPucA9k')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03, 0xd9)));
         it('example2', () =>
-            expect(decodeBase64('FPucAw==')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03)));
+            expect(d64('FPucAw==')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03)));
         it('example2b', () =>
-            expect(decodeBase64('FPucAw=')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03)));
+            expect(d64('FPucAw=')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03)));
         it('example2c', () =>
-            expect(decodeBase64('FPucAw')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03)));
+            expect(d64('FPucAw')).is(Bytes.of(0x14, 0xfb, 0x9c, 0x03)));
     });
 
     describe('Misc test cases', () => {
-        it('gQ==', () => expect(decodeBase64('gQ==')).is(Bytes.of(0x81)));
+        it('gQ==', () => expect(d64('gQ==')).is(Bytes.of(0x81)));
     });
 });
 
+describe('latin1 quasi-decoder', () => {
+    it('decodes ascii', () => expect(Bytes.fromLatin1('abc')).is(Bytes.of(97, 98, 99)));
+    it('encodes ascii', () => expect(Bytes.of(97, 98, 99).toLatin1()).is('abc'));
+    it('decodes unprintables', () => expect(Bytes.fromLatin1('\x00\x01a\xfe\xff')).is(Bytes.of(0, 1, 97, 254, 255)));
+    it('encodes unprintables', () => expect(Bytes.of(0, 1, 97, 254, 255).toLatin1()).is('\x00\x01a\xfe\xff'));
+    it('rejects out-of-bounds', () => expect(() => Bytes.fromLatin1('ac╔b')).toThrowError('Codepoint out of range'));
+});
@@ -4,7 +4,6 @@ import {
     decode, decodeWithAnnotations, encode, canonicalEncode,
     DecodeError, ShortPacket,
     Bytes, Record,
-    annotate,
     strip, peel,
     preserves,
     stringify,
@@ -16,11 +15,11 @@ import {
     EmbeddedType,
     DecoderState,
     Decoder,
-    Embedded,
-    embed,
     genericEmbeddedTypeDecode,
     genericEmbeddedTypeEncode,
     parse,
+    Embedded,
+    KeyedDictionary,
 } from '../src/index';
 const { Tag } = Constants;
 import './test-utils';
@@ -78,7 +77,7 @@ describe('parsing from subarray', () => {
 
 describe('reusing buffer space', () => {
     it('should be done safely, even with nested dictionaries', () => {
-        expect(canonicalEncode(fromJS(['aaa', Dictionary.fromJS({a: 1}), 'zzz'])).toHex()).is(
+        expect(canonicalEncode(fromJS(['aaa', Dictionary.stringMap({a: 1}), 'zzz'])).toHex()).is(
             `b5
              b103616161
              b7
@@ -90,33 +89,29 @@ describe('reusing buffer space', () => {
 });
 
 describe('encoding and decoding embeddeds', () => {
-    class LookasideEmbeddedType implements EmbeddedType<object> {
-        readonly objects: object[];
+    class LookasideEmbeddedType implements EmbeddedType<Embedded<object>> {
+        readonly objects: Embedded<object>[];
 
-        constructor(objects: object[]) {
+        constructor(objects: Embedded<object>[]) {
             this.objects = objects;
         }
 
-        decode(d: DecoderState): object {
+        decode(d: DecoderState): Embedded<object> {
             return this.fromValue(new Decoder<GenericEmbedded>(d).next());
         }
 
-        encode(e: EncoderState, v: object): void {
+        encode(e: EncoderState, v: Embedded<object>): void {
             new Encoder(e).push(this.toValue(v));
         }
 
-        equals(a: object, b: object): boolean {
-            return Object.is(a, b);
-        }
-
-        fromValue(v: Value<GenericEmbedded>): object {
+        fromValue(v: Value<GenericEmbedded>): Embedded<object> {
             if (typeof v !== 'number' || v < 0 || v >= this.objects.length) {
-                throw new Error("Unknown embedded target");
+                throw new Error(`Unknown embedded target: ${stringify(v)}`);
             }
             return this.objects[v];
         }
 
-        toValue(v: object): number {
+        toValue(v: Embedded<object>): number {
             let i = this.objects.indexOf(v);
             if (i !== -1) return i;
             this.objects.push(v);
@@ -125,8 +120,8 @@ describe('encoding and decoding embeddeds', () => {
     }
 
     it('should encode using embeddedId when no function has been supplied', () => {
-        const A1 = embed({a: 1});
-        const A2 = embed({a: 1});
+        const A1 = new Embedded({a: 1});
+        const A2 = new Embedded({a: 1});
         const bs1 = canonicalEncode(A1);
         const bs2 = canonicalEncode(A2);
         const bs3 = canonicalEncode(A1);
@@ -143,24 +138,24 @@ describe('encoding and decoding embeddeds', () => {
             .toThrow("Embeddeds not permitted at this point in Preserves document");
     });
     it('should encode properly', () => {
-        const objects: object[] = [];
+        const objects: Embedded<object>[] = [];
         const pt = new LookasideEmbeddedType(objects);
-        const A = embed({a: 1});
-        const B = embed({b: 2});
+        const A = new Embedded({a: 1});
+        const B = new Embedded({b: 2});
         expect(encode([A, B], { embeddedEncode: pt })).is(
             Bytes.from([Tag.Sequence,
                 Tag.Embedded, Tag.SignedInteger, 0,
                 Tag.Embedded, Tag.SignedInteger, 1, 1,
                 Tag.End]));
-        expect(objects).toEqual([A.embeddedValue, B.embeddedValue]);
+        expect(objects).toEqual([A, B]);
     });
     it('should decode properly', () => {
-        const objects: object[] = [];
+        const objects: Embedded<object>[] = [];
         const pt = new LookasideEmbeddedType(objects);
-        const X: Embedded<object> = embed({x: 123});
-        const Y: Embedded<object> = embed({y: 456});
-        objects.push(X.embeddedValue);
-        objects.push(Y.embeddedValue);
+        const X = new Embedded({x: 123});
+        const Y = new Embedded({y: 456});
+        objects.push(X);
+        objects.push(Y);
         expect(decode(Bytes.from([
             Tag.Sequence,
             Tag.Embedded, Tag.SignedInteger, 0,
@@ -169,17 +164,17 @@ describe('encoding and decoding embeddeds', () => {
         ]), { embeddedDecode: pt })).is([X, Y]);
     });
     it('should store embeddeds embedded in map keys correctly', () => {
-        const A1a = {a: 1};
-        const A1: Embedded<object> = embed(A1a);
-        const A2: Embedded<object> = embed({a: 1});
-        const m = new Dictionary<object, number>();
+        const A1a = new Embedded({a: 1});
+        const A1 = A1a;
+        const A2 = new Embedded({a: 1});
+        const m = new KeyedDictionary<Embedded<object>, Value<Embedded<object>>, number>();
         m.set([A1], 1);
         m.set([A2], 2);
         expect(m.get(A1)).toBeUndefined();
         expect(m.get([A1])).toBe(1);
         expect(m.get([A2])).toBe(2);
-        expect(m.get([embed({a: 1})])).toBeUndefined();
-        A1a.a = 3;
+        expect(m.get([{a: 1}])).toBeUndefined();
+        A1a.value.a = 3;
         expect(m.get([A1])).toBe(1);
     });
 });
@@ -309,7 +304,7 @@ describe('common test suite', () => {
 
     const tests = (peel(TestCases._.cases(peel(samples) as TestCases)) as
         Dictionary<GenericEmbedded>);
-    tests.forEach((t0: Value<GenericEmbedded>, tName0: Value<GenericEmbedded>) => {
+    Dictionary.asMap(tests).forEach((t0, tName0) => {
         const tName = Symbol.keyFor(strip(tName0) as symbol)!;
         const t = peel(t0) as Record<symbol, any, GenericEmbedded>;
         switch (t.label) {
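The map-key test above depends on value-based key comparison, which is what `KeyedDictionary` provides. A plain JavaScript `Map` cannot express it, because `Map` compares object keys by reference identity, so two structurally equal array keys are distinct entries. A minimal illustration of the underlying problem (plain JS semantics, not the library's API):

```typescript
// A plain Map keyed by arrays: lookups only hit on the identical reference.
const m = new Map<number[], number>();
const k = [1];
m.set(k, 1);
console.log(m.get(k));   // 1: same reference
console.log(m.get([1])); // undefined: structurally equal, but a fresh array
```

This is why the diff replaces `Dictionary<object, number>` with a `KeyedDictionary`, whose lookups go by Preserves value equality rather than identity.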
@@ -0,0 +1,45 @@
+import { Pexpr, Value, fromJS, parse, stringify } from '../src/index';
+import './test-utils';
+
+function P(s: string): Value<any>[] {
+    return parse(s, { includeAnnotations: true });
+}
+
+describe('basics', () => {
+    it('simple example', () => {
+        const r = new Pexpr.Reader('#!foo\n<bar {zot ::quux}, [a; b; c;]>');
+        const d = r.nextDocument();
+        expect(fromJS(d)).is(P(`[\n#!foo\n
+<r bar <b zot <p |::|> quux> <p |,|> [a <p |;|> b <p |;|> c <p |;|>]>]`));
+    });
+
+    it('simple group', () => {
+        const r = new Pexpr.Reader('(+ 1 2)');
+        expect(fromJS(r.nextDocument())).is(P('[<g + 1 2>]'));
+    });
+
+    it('asPreserves', () => {
+        const s = '{a: b, c: d, e: [1, 2 <r 3 4>]}';
+        const d = new Pexpr.Reader(s).nextDocument();
+        const v = Pexpr.asPreserves(d.get(0)!);
+        expect(v).is(P(s));
+    });
+});
+
+describe('trailing comments', () => {
+    it('basics 1', () => {
+        const d = new Pexpr.Reader('# a comment with nothing after').nextDocument();
+        expect(d.annotations?.[d.exprs.length].get(0)?.item).toBe('a comment with nothing after');
+    });
+
+    it('basics 2', () => {
+        const d = new Pexpr.Reader('# a comment with nothing after\n').nextDocument();
+        expect(d.annotations?.[d.exprs.length].get(0)?.item).toBe('a comment with nothing after');
+    });
+
+    it('inside a sequence', () => {
+        const d = new Pexpr.Reader('[\n1\n# a comment with nothing after\n]\n').nextDocument();
+        const seq = d.get(0)?.item as Pexpr.Compound;
+        expect(seq.annotations?.[seq.exprs.length].get(0)?.item).toBe('a comment with nothing after');
+    });
+});
@@ -1,4 +1,4 @@
-import { Bytes, Decoder, genericEmbeddedType, encode, Reader, Double } from '../src/index';
+import { Bytes, Decoder, genericEmbeddedType, encode, Reader, Double, KeyedDictionary, Value } from '../src/index';
 import './test-utils';
 
 import * as fs from 'fs';
@@ -36,4 +36,23 @@ describe('reading common test suite', () => {
         expect(new Reader('123.0').next()).toEqual(Double(123.0));
         expect(new Reader('123.00').next()).toEqual(Double(123.0));
     });
+
+    it('should produce a sensible JS object for symbol-keyed dictionaries', () => {
+        expect(new Reader('{a: 1, b: 2}').next()).toEqual({a: 1, b: 2});
+    });
+
+    it('should produce a sensible dictionary for mixed-keyed dictionaries', () => {
+        expect(new Reader('{a: 1, "b": 2}').next()).is(
+            new KeyedDictionary([[Symbol.for('a'), 1], ["b", 2]] as [Value, Value][]));
+    });
+
+    it('should produce a sensible dictionary for string-keyed dictionaries', () => {
+        expect(new Reader('{"a": 1, "b": 2}').next()).is(
+            new KeyedDictionary([["a", 1], ["b", 2]] as [Value, Value][]));
+    });
+
+    it('should produce a sensible dictionary for integer-keyed dictionaries', () => {
+        expect(new Reader('{9: 1, 8: 2}').next()).is(
+            new KeyedDictionary([[9, 1], [8, 2]] as [Value, Value][]));
+    });
 });
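The symbol-keyed dictionary tests above depend on JavaScript's global symbol registry: `Symbol.for('a')` returns the same symbol from every call site, which is what lets bare Preserves symbols behave as ordinary dictionary keys. A minimal standalone sketch of that invariant (plain JS/TS behavior, not part of the Preserves API):

```typescript
// Registry symbols are interned: repeated Symbol.for calls with the same
// description yield the identical symbol. Unregistered symbols never match.
const k1: symbol = Symbol.for('a');
const k2: symbol = Symbol.for('a');
const interned: boolean = (k1 === k2);                 // same registry entry
const fresh: boolean = (Symbol('a') === Symbol('a'));  // distinct fresh symbols
```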
@@ -1,10 +1,10 @@
-import { Value, is, preserves } from '../src/index';
+import { Value, is, preserves, Embeddable } from '../src/index';
 import '../src/node_support';
 
 declare global {
     namespace jest {
         interface Matchers<R> {
-            is<T>(expected: Value<T>): R;
+            is<T extends Embeddable>(expected: Value<T>): R;
             toThrowFilter(f: (e: Error) => boolean): R;
         }
     }
 }
@@ -1,12 +1,6 @@
-import { Single, Double, fromJS, Dictionary, IDENTITY_FOLD, fold, mapEmbeddeds, Value, embed, preserves } from '../src/index';
+import { Double, fromJS, IDENTITY_FOLD, fold, mapEmbeddeds, Value, preserves, KeyedDictionary, Embeddable, Embedded } from '../src/index';
 import './test-utils';
 
-describe('Single', () => {
-    it('should print reasonably', () => {
-        expect(Single(123.45).toString()).toEqual("123.45f");
-    });
-});
-
 describe('Double', () => {
     it('should print reasonably', () => {
         expect(Double(123.45).toString()).toEqual("123.45");
@@ -14,26 +8,26 @@ describe('Double', () => {
 });
 
 describe('fold', () => {
-    function mkv<T extends object>(t: T): Value<T> {
+    function mkv<T extends Embeddable>(t: T): Value<T> {
         return fromJS<T>([
             1,
             2,
-            new Dictionary([[[3, 4], fromJS([5, 6])],
+            new KeyedDictionary<T>([[[3, 4], fromJS([5, 6])],
                             ['a', 1],
                             ['b', true]]),
-            Single(3.4),
+            Double(3.4),
             t,
         ]);
     }
 
     it('should support identity', () => {
         const w = new Date();
-        const v = mkv(w);
+        const v = mkv(new Embedded(w));
         expect(fold(v, IDENTITY_FOLD)).is(v);
         const w1 = new Date();
-        const v1 = mkv(w1);
+        const v1 = mkv(new Embedded(w1));
         expect(fold(v, IDENTITY_FOLD)).not.is(v1);
-        expect(mapEmbeddeds(v, _t => embed(w1))).is(v1);
+        expect(mapEmbeddeds(v, _t => new Embedded(w1))).is(v1);
     });
 });
@@ -75,6 +69,12 @@ describe('is()', () => {
         expect(c).not.toBe(3);
         expect(c).not.is(3);
     });
+
+    it('should compare equivalent JsDictionary and KeyedDictionary values sensibly', () => {
+        const a = {a: 1, b: 2};
+        const b = new KeyedDictionary(
+            [[Symbol.for('a'), 1], [Symbol.for('b'), 2]] as [Value, Value][]);
+        expect(a).is(b);
+    });
 });
 
 describe('`preserves` formatter', () => {
@@ -88,4 +88,18 @@ describe('`preserves` formatter', () => {
         expect(preserves`>${BigInt("12345678123456781234567812345678")}<`)
             .toBe('>12345678123456781234567812345678<');
     });
+
+    it('should format regular JS objects', () => {
+        expect(preserves`>${({a: 1, b: 2})}<`)
+            .toBe('>{a: 1 b: 2}<');
+    });
+
+    it('should format dictionaries with string keys', () => {
+        const v = new KeyedDictionary([["a", 1], ["b", 2]]);
+        expect(preserves`>${v}<`)
+            .toBe('>{"a": 1 "b": 2}<');
+    });
+
+    it('should format dictionaries with symbol keys', () => {
+        const v = new KeyedDictionary([[Symbol.for("a"), 1], [Symbol.for("b"), 2]]);
+        expect(preserves`>${v}<`)
+            .toBe('>{a: 1 b: 2}<');
+    });
 });
@@ -1,2 +1,2 @@
 #!/usr/bin/env node
-require('../dist/bin/preserves-schema-ts.js').main(process.argv.slice(2));
+require('../lib/bin/preserves-schema-ts.js').main(process.argv.slice(2));
@@ -1,2 +1,2 @@
 #!/usr/bin/env node
-require('../dist/bin/preserves-schemac.js').main(process.argv.slice(2));
+require('../lib/bin/preserves-schemac.js').main(process.argv.slice(2));
@@ -1,6 +1,6 @@
 {
   "name": "@preserves/schema-cli",
-  "version": "0.992.1",
+  "version": "0.995.206",
   "description": "Command-line tools for Preserves Schema",
   "homepage": "https://gitlab.com/preserves/preserves",
   "license": "Apache-2.0",
@@ -11,11 +11,9 @@
   "author": "Tony Garnock-Jones <tonyg@leastfixedpoint.com>",
   "scripts": {
     "clean": "rm -rf lib dist",
-    "prepare": "yarn compile && yarn rollup",
+    "prepare": "yarn compile",
     "compile": "tsc",
     "compile:watch": "yarn compile -w",
-    "rollup": "rollup -c",
-    "rollup:watch": "yarn rollup -w",
     "test": "true",
     "veryclean": "yarn run clean && rm -rf node_modules"
   },
@@ -28,8 +26,8 @@
     "@types/minimatch": "^3.0"
   },
   "dependencies": {
-    "@preserves/core": "^0.992.1",
-    "@preserves/schema": "^0.992.1",
+    "@preserves/core": "^0.995.206",
+    "@preserves/schema": "^0.995.206",
     "chalk": "^4.1",
    "chokidar": "^3.5",
     "commander": "^7.2",
@@ -1,17 +0,0 @@
-import terser from '@rollup/plugin-terser';
-
-function cli(name) {
-    return {
-        input: `lib/bin/${name}.js`,
-        output: [{file: `dist/bin/${name}.js`, format: 'commonjs'}],
-        external: [
-            '@preserves/core',
-            '@preserves/schema',
-        ],
-    };
-}
-
-export default [
-    cli('preserves-schema-ts'),
-    cli('preserves-schemac'),
-];
@@ -24,10 +24,9 @@ export function run(options: CommandLineArguments): void {
 
     if (failures.length === 0) {
         if (options.bundle) {
-            fs.writeSync(1, underlying(canonicalEncode(M.fromBundle({
-                modules: new KeyedDictionary<M.ModulePath, M.Schema, M.InputEmbedded>(
-                    inputFiles.map(i => [i.modulePath, i.schema])),
-            }))));
+            fs.writeSync(1, underlying(canonicalEncode(M.fromBundle(M.Bundle(
+                new KeyedDictionary<M.InputEmbedded, M.ModulePath, M.Schema>(
+                    inputFiles.map(i => [i.modulePath, i.schema])))))));
         } else {
             fs.writeSync(1, underlying(canonicalEncode(M.fromSchema(inputFiles[0].schema))));
         }
@@ -9,6 +9,7 @@
     "declarationDir": "./lib",
     "esModuleInterop": true,
     "moduleResolution": "node",
+    "module": "commonjs",
     "sourceMap": true,
     "strict": true
   },
@@ -1,6 +1,6 @@
 {
   "name": "@preserves/schema",
-  "version": "0.992.1",
+  "version": "0.995.206",
   "description": "Schema support for Preserves data serialization format",
   "homepage": "https://gitlab.com/preserves/preserves",
   "license": "Apache-2.0",
@@ -13,19 +13,23 @@
   "types": "lib/index.d.ts",
   "author": "Tony Garnock-Jones <tonyg@leastfixedpoint.com>",
   "scripts": {
-    "regenerate": "rm -rf ./src/gen && yarn copy-schema && ../schema-cli/bin/preserves-schema-ts.js --output ./src/gen ./dist:schema.prs",
+    "regenerate": "rm -rf ./src/gen && yarn copy-schema && ../schema-cli/bin/preserves-schema-ts.js --output ./src/gen ./dist:*.prs",
     "clean": "rm -rf lib dist",
-    "prepare": "yarn compile && yarn rollup && yarn copy-schema",
+    "prepare": "yarn compile && yarn rollup && yarn copy-schema && cp preserves-schema-browser.js dist/",
     "compile": "tsc",
     "compile:watch": "yarn compile -w",
     "rollup": "rollup -c",
     "rollup:watch": "yarn rollup -w",
-    "copy-schema": "mkdir -p ./dist && cp -a ../../../../schema/schema.prs ./dist",
+    "copy-schema": "mkdir -p ./dist && cp -a ../../../../schema/schema.prs ../../../../schema/host.prs ./dist",
     "test": "jest",
     "test:watch": "jest --watch",
     "veryclean": "yarn run clean && rm -rf node_modules"
   },
   "dependencies": {
-    "@preserves/core": "^0.992.1"
+    "@preserves/core": "^0.995.206"
+  },
+  "devDependencies": {
+    "@types/js-beautify": "1.14",
+    "js-beautify": "1.15"
   }
 }
@@ -0,0 +1,76 @@
+(() => {
+    const I = new PreservesSchema.SchemaInterpreter();
+
+    globalThis.Schema = { __interpreter: I };
+    let schemaReady;
+    globalThis.SchemaReady = new Promise(res => schemaReady = res);
+
+    async function translateScripts() {
+
+        const schemaScripts =
+            Array.from(document.getElementsByTagName('script'))
+                .filter(s => (s.type === 'text/preserves+schema' ||
+                              s.type === 'schema'));
+
+        for (const script of schemaScripts) {
+            function complain(message, detail) {
+                const e = new Error(message);
+                e.script = script;
+                e.detail = detail;
+                console.error(e);
+            }
+
+            let sourceCodeBlob;
+            const sourceUrl = script.src || script.getAttribute('data-src') || false;
+            if (sourceUrl) {
+                const res = await fetch(sourceUrl);
+                if (res.ok) {
+                    sourceCodeBlob = new Uint8Array(await res.arrayBuffer());
+                } else {
+                    complain(`Failed to retrieve schema from ${sourceUrl}`, { res });
+                    continue;
+                }
+            } else {
+                sourceCodeBlob = new TextEncoder().encode(script.innerHTML);
+            }
+
+            const schemaName = () => {
+                const n = script.getAttribute('name');
+                if (n === null) complain(`<script type="schema"> must have name attribute`);
+                return n;
+            };
+
+            if (sourceCodeBlob[0] >= 128) {
+                // Binary Preserves blob
+                const value = Preserves.decode(sourceCodeBlob);
+                const bundle = PreservesSchema.Meta.toBundle(value);
+                if (bundle !== void 0) {
+                    const prefixStr = script.getAttribute('prefix');
+                    const bundlePrefix = (prefixStr ? prefixStr.split('.') : []).map(Symbol.for);
+                    bundle.modules.forEach((schema, path) => {
+                        const modulePath = [... bundlePrefix, ... path];
+                        I.env.set(modulePath, schema);
+                    });
+                } else {
+                    const schema = PreservesSchema.Meta.toSchema(value);
+                    if (schema !== void 0) {
+                        const modulePath = schemaName().split('.').map(Symbol.for);
+                        I.env.set(modulePath, schema);
+                    }
+                }
+            } else {
+                // Presumably text
+                const sourceCode = new TextDecoder('utf-8', { fatal: true }).decode(sourceCodeBlob);
+                const name = schemaName();
+                const schema = PreservesSchema.readSchema(sourceCode, { name });
+                const modulePath = name.split('.').map(Symbol.for);
+                I.env.set(modulePath, schema);
+            }
+        }
+
+        I.moduleTree(Schema);
+        schemaReady();
+    }
+
+    window.addEventListener('DOMContentLoaded', translateScripts);
+})();
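The browser glue file added above decides between binary and text Preserves by sniffing the first byte: binary Preserves documents begin with a tag byte at or above 0x80, while the text syntax starts with ordinary sub-0x80 characters. A small standalone sketch of that dispatch; the sample byte sequences are hypothetical, not real schema documents:

```typescript
// First-byte sniffing, as used by the script-translation loop above.
function looksLikeBinaryPreserves(blob: Uint8Array): boolean {
    return blob.length > 0 && blob[0] >= 128;
}

// 'v' is 0x76: text schema source stays below 0x80.
const textDoc = Uint8Array.from('version 1 .', c => c.charCodeAt(0));
// Hypothetical binary document beginning with a tag byte >= 0x80.
const binaryDoc = Uint8Array.of(0xb4, 0xb3, 0x06);
```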
@@ -1,11 +1,11 @@
-import { stringify } from '@preserves/core';
+import { JsDictionary, stringify } from '@preserves/core';
 import * as M from './meta';
 
 export function checkSchema(schema: M.Schema): (
     { ok: true, schema: M.Schema } | { ok: false, problems: Array<string> })
 {
     const checker = new Checker();
-    schema.definitions.forEach(checker.checkDefinition.bind(checker));
+    JsDictionary.forEach(schema.definitions, checker.checkDefinition.bind(checker));
     if (checker.problems.length > 0) {
         return { ok: false, problems: checker.problems };
     } else {
@@ -1,10 +1,10 @@
-import { encode, stringify } from "@preserves/core";
+import { encode, JsDictionary, stringify } from "@preserves/core";
 import * as M from "./meta";
 import { CompilerOptions, ModuleContext } from "./compiler/context";
-import { Formatter, block, seq, braces, opseq } from "./compiler/block";
+import { Formatter, block, seq, braces } from "./compiler/block";
 import { typeForDefinition } from "./compiler/gentype";
 import { converterForDefinition } from "./compiler/genconverter";
-import { renderType } from "./compiler/rendertype";
+import { renderTypeWithConversionMixins } from "./compiler/rendertype";
 import { genConstructor } from "./compiler/genctor";
 import { unconverterForDefinition } from "./compiler/genunconverter";
 import { sourceCodeFor } from "./compiler/value";
@@ -29,13 +29,13 @@ export function compile(
         mod.defineType(seq(`export type _embedded = `, mod.embeddedType, `;`));
     }
 
-    for (const [name, def] of schema.definitions) {
+    for (const [name, def] of JsDictionary.entries(schema.definitions)) {
         const t = typeForDefinition(mod.resolver(), def);
         const nameStr = stringify(name);
-        const resultTypeItem = mod.withAsPreserveMixinType(nameStr, t);
+        const resultTypeItem = seq(nameStr, mod.genericArgsFor(t));
 
         mod.defineType(seq(`export type ${nameStr}`, mod.genericParametersFor(t),
-                           ` = `, renderType(mod, t), `;`));
+                           ` = `, renderTypeWithConversionMixins(mod, t), `;`));
 
         if (t.kind === 'union') {
             mod.defineFunctions(nameStr, _ctx =>
@@ -49,11 +49,11 @@ export function compile(
         }
     }
 
-    for (const [name0, def] of schema.definitions) {
+    for (const [name0, def] of JsDictionary.entries(schema.definitions)) {
         const t = typeForDefinition(mod.resolver(), def);
         const name = name0 as symbol;
         const nameStr = name0.description!;
-        const resultTypeItem = mod.withAsPreserveMixinType(nameStr, t);
+        const resultTypeItem = seq(nameStr, mod.genericArgsFor(t));
 
         mod.defineFunctions(nameStr, ctx =>
             [seq(`export function as${name.description!}`, mod.genericParameters(),
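The compiler now enumerates `schema.definitions` through a `JsDictionary.entries` helper instead of Map iteration. One reason a helper is needed at all: plain `Object.entries` silently skips symbol keys, so a symbol-keyed plain-object dictionary must be walked via `Reflect.ownKeys` or similar. A standalone sketch of that pitfall (plain JS behavior with hypothetical definition names, not the JsDictionary implementation itself):

```typescript
// Symbol keys are invisible to Object.entries but visible to Reflect.ownKeys.
const defs: { [key: symbol]: string } = {
    [Symbol.for('Present')]: 'definition of Present',
    [Symbol.for('Absent')]: 'definition of Absent',
};
const viaEntries = Object.entries(defs).length;   // symbol keys are skipped
const viaOwnKeys = Reflect.ownKeys(defs).length;  // includes symbol keys
```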
@@ -1,7 +1,7 @@
-import { Dictionary, KeyedSet, FlexSet, Position, stringify } from "@preserves/core";
+import { KeyedSet, FlexSet, Position, stringify, DictionaryMap, JsDictionary } from "@preserves/core";
 import { refPosition } from "../reader";
 import * as M from "../meta";
-import { anglebrackets, block, braces, commas, formatItems, Item, keyvalue, seq, opseq } from "./block";
+import { anglebrackets, block, braces, commas, formatItems, Item, keyvalue, seq } from "./block";
 import { ANY_TYPE, RefType, Type } from "./type";
 import { renderType, variantInitFor } from "./rendertype";
 import { typeForDefinition } from "./gentype";
@@ -27,11 +27,11 @@ export class ModuleContext {
     readonly options: CompilerOptions;
     readonly embeddedType: Item;
 
-    readonly literals = new Dictionary<M.InputEmbedded, string>();
+    readonly literals = new DictionaryMap<M.InputEmbedded, string>();
     readonly preamble: Item[] = [];
     readonly typedefs: Item[] = [];
     readonly functiondefs: Item[] = [];
-    readonly imports = new KeyedSet<[M.ModulePath, string, string, string]>();
+    readonly imports = new KeyedSet<M.InputEmbedded, [M.ModulePath, string, string, string]>();
 
     constructor(
         env: M.Environment,
@@ -97,7 +97,7 @@ export class ModuleContext {
         return (ref) => this.lookup(
             ref,
             (_p, _t) => Type.ref(ref.name.description!, ref),
-            (modPath, modId, modFile, modExpr, _p, t) => {
+            (modPath, modId, modFile, modExpr, _p, _t) => {
                 this.imports.add([modPath, modId, modFile, modExpr]);
                 return Type.ref(`${modId}${modExpr}.${ref.name.description!}`, ref);
             },
@@ -137,7 +137,7 @@ export class ModuleContext {
                 null,
                 null);
         } else {
-            const p = e.schema.definitions.get(name.name);
+            const p = JsDictionary.get(e.schema.definitions, name.name);
             if (p !== void 0) {
                 let t = () => typeForDefinition(this.resolver(soughtModule), p);
                 if (name.module.length) {
@@ -158,7 +158,7 @@ export class ModuleContext {
     }
 
     genericParameters(): Item {
-        return anglebrackets(seq('_embedded = ', this.embeddedType));
+        return anglebrackets(seq('_embedded extends _.Embeddable = ', this.embeddedType));
     }
 
     genericParametersFor(t: Type): Item {
@@ -209,19 +209,6 @@ export class ModuleContext {
 
         return walk(t);
     }
-
-    withAsPreserveMixinType(name: string, t: Type): Item {
-        if (t.kind === 'unit' || t.kind === 'record' || t.kind === 'union') {
-            return opseq('any', ' & ',
-                         seq(name, this.genericArgsFor(t)),
-                         braces(seq('__as_preserve__',
-                                    this.hasEmbedded(t) ? '' : this.genericParameters(),
-                                    '()',
-                                    ': _.Value', this.genericArgs())));
-        } else {
-            return seq(name, this.genericArgsFor(t));
-        }
-    }
 }
 
 export class FunctionContext {
@@ -285,16 +272,25 @@ export class FunctionContext {
     }
 
     buildCapturedCompound(dest: string): Item {
-        const fields = [
-            ... variantInitFor(this.variantName),
-            ... this.captures.map(({ fieldName, sourceExpr }) =>
-                keyvalue(fieldName, sourceExpr)),
-            seq(`__as_preserve__() `, block(`return from${this.definitionName}(this)`))
-        ];
-        return seq(`${dest} = `, braces(... fields));
+        return seq(`${dest} = `, buildProduct(
+            this.definitionName, this.variantName, this.captures));
     }
 }
 
+export function buildProduct(
+    definitionName: string,
+    variant: string | undefined,
+    initializers: Capture[],
+): Item {
+    return braces(
+        ... variantInitFor(variant),
+        ... initializers.map(({ fieldName, sourceExpr }) => keyvalue(fieldName, sourceExpr)),
+        seq(`__as_preserve__() `, block(`return from${M.jsId(definitionName)}(this)`)),
+        seq(`__preserve_on__(e) { e.push(from${M.jsId(definitionName)}(this)); }`),
+        seq(`__preserve_text_on__(w) { w.push(from${M.jsId(definitionName)}(this)); }`),
+    );
+}
+
 export class WalkState {
     modulePath: M.ModulePath;
     readonly seen: FlexSet<M.Ref>;
@@ -1,8 +1,9 @@
 import { FunctionContext } from "./context";
 import * as M from '../meta';
-import { Item, seq } from "./block";
+import { Item, seq, parens, anglebrackets } from "./block";
 import { simpleType, typeFor } from "./gentype";
-import { ANY_TYPE, Type } from "./type";
+import { ANY_TYPE, isSymbolType, Type } from "./type";
+import { renderType } from "./rendertype";
 
 export function converterForDefinition(
     ctx: FunctionContext,
@@ -86,7 +87,7 @@ function converterForTuple(ctx: FunctionContext,
     }
 
     const lengthCheck = variablePattern === void 0
-        ? seq(` && ${src}.length === ${ps.length}`)
+        ? seq(` && ${src}.length >= ${ps.length}`)
         : ((ps.length === 0) ? '' : seq(` && ${src}.length >= ${ps.length}`));
 
     return knownArray
@@ -94,6 +95,31 @@ function converterForTuple(ctx: FunctionContext,
         : [seq(`if (_.isSequence(${src})`, lengthCheck, `) `, ctx.block(() => loop(0)))];
 }
 
+function encoderForSimplePattern(
+    ctx: FunctionContext,
+    p: M.SimplePattern,
+): Item | null {
+    switch (p._variant) {
+        case 'Ref':
+            return ctx.mod.lookup(
+                p.value,
+                (_p, t) => `from${M.jsId(p.value.name.description!)}${ctx.mod.genericArgsFor(t())}`,
+                (modPath, modId, modFile, modExpr, _p, t) => {
+                    ctx.mod.imports.add([modPath, modId, modFile, modExpr]);
+                    return `${modId}${modExpr}.from${M.jsId(p.value.name.description!)}${t ? ctx.mod.genericArgsFor(t()) : ''}`;
+                });
+        case 'embedded':
+            return `_.embed`;
+        case 'seqof': {
+            const e = encoderForSimplePattern(ctx, p.pattern);
+            if (e === null) return null;
+            return seq(`vs => vs.map`, parens(e));
+        }
+        default:
+            return null;
+    }
+}
+
 function converterFor(
     ctx: FunctionContext,
     np: M.NamedPattern,
@@ -128,7 +154,6 @@ export function converterForSimple(
             let valexp: Item = `${src}`;
             switch (p.atomKind._variant) {
                 case 'Boolean': test = `typeof ${src} === 'boolean'`; break;
-                case 'Float': test = `_.Float.isSingle(${src})`; valexp = `${src}.value`; break;
                 case 'Double': test =`_.Float.isDouble(${src})`; valexp = `${src}.value`; break;
                 case 'SignedInteger': test = `typeof ${src} === 'number'`; break;
                 case 'String': test = `typeof ${src} === 'string'`; break;
@@ -138,7 +163,7 @@ export function converterForSimple(
             return [seq(`${dest} = `, test, ` ? `, valexp, ` : void 0`)];
         }
         case 'embedded':
-            return [`${dest} = _.isEmbedded<_embedded>(${src}) ? ${src}.embeddedValue : void 0`];
+            return [`${dest} = _.isEmbedded<_embedded>(${src}) ? ${src} : void 0`];
         case 'lit':
             return [`${dest} = _.is(${src}, ${ctx.mod.literal(p.value)}) ? {} : void 0`];
 
@@ -163,9 +188,12 @@ export function converterForSimple(
         case 'setof':
             return [`${dest} = void 0`,
                     seq(`if (_.Set.isSet<_embedded>(${src})) `, ctx.block(() => {
+                        const vt = simpleType(ctx.mod.resolver(), p.pattern);
                         const v = ctx.gentempname();
                         return [
-                            seq(`${dest} = new _.KeyedSet()`),
+                            seq(`${dest} = new _.EncodableSet`,
+                                anglebrackets('_embedded', renderType(ctx.mod, vt)),
+                                parens(encoderForSimplePattern(ctx, p.pattern) ?? `v => v`)),
                             seq(`for (const ${v} of ${src}) `, ctx.block(() => [
                                 ... converterFor(ctx, M.anonymousSimplePattern(p.pattern), v, vv =>
                                     [`${dest}.add(${vv})`, `continue`]),
@@ -175,14 +203,30 @@ export function converterForSimple(
         case 'dictof':
             return [`${dest} = void 0`,
                     seq(`if (_.Dictionary.isDictionary<_embedded>(${src})) `, ctx.block(() => {
-                        const v = ctx.gentempname();
+                        const srcMap = ctx.gentempname();
+                        const resolver = ctx.mod.resolver();
+                        const kt = simpleType(resolver, p.key);
+                        const vt = simpleType(resolver, p.value);
                         const k = ctx.gentempname();
+                        const v = ctx.gentempname();
+                        const symbolKeyed = isSymbolType(kt);
                         return [
-                            seq(`${dest} = new _.KeyedDictionary()`),
-                            seq(`for (const [${k}, ${v}] of ${src}) `, ctx.block(() => [
+                            seq(`const ${srcMap} = new _.DictionaryMap(${src})`),
+                            (symbolKeyed
+                                ? seq(`${dest} = {}`)
+                                : seq(`${dest} = new _.EncodableDictionary`,
+                                      anglebrackets('_embedded', renderType(ctx.mod, kt), renderType(ctx.mod, vt)),
+                                      parens(encoderForSimplePattern(ctx, p.key) ?? `k => k`,
+                                             encoderForSimplePattern(ctx, p.value) ?? `v => v`))),
+                            seq(`for (const [${k}, ${v}] of ${srcMap}) `, ctx.block(() => [
                                 ... converterFor(ctx, M.anonymousSimplePattern(p.key), k, kk =>
                                     converterFor(ctx, M.anonymousSimplePattern(p.value), v, vv =>
-                                        [`${dest}.set(${kk}, ${vv})`, `continue`])),
+                                        [
+                                            (symbolKeyed
+                                                ? `${dest}[${kk}.description!] = ${vv}`
+                                                : `${dest}.set(${kk}, ${vv})`),
+                                            `continue`
+                                        ])),
                                 seq(`${dest} = void 0`),
                                 seq(`break`)]))];
                     }))];
@@ -217,12 +261,13 @@ function converterForCompound(
         case 'tuplePrefix':
             return converterForTuple(ctx, p.fixed, src, knownArray, p.variable, ks);
         case 'dict': {
+            const srcMap = ctx.gentempname();
             const entries = Array.from(p.entries);
             function loop(i: number): Item[] {
                 if (i < entries.length) {
                     const [k, n] = entries[i];
                     const tmpSrc = ctx.gentemp();
-                    return [seq(`if ((${tmpSrc} = ${src}.get(${ctx.mod.literal(k)})) !== void 0) `,
+                    return [seq(`if ((${tmpSrc} = ${srcMap}.get(${ctx.mod.literal(k)})) !== void 0) `,
                                 ctx.block(() =>
                                     converterFor(
                                         ctx,
@@ -233,7 +278,9 @@ function converterForCompound(
                     return ks();
                 }
             }
-            return [seq(`if (_.Dictionary.isDictionary<_embedded>(${src})) `, ctx.block(() => loop(0)))];
+            return [seq(`if (_.Dictionary.isDictionary<_embedded>(${src})) `, ctx.block(() => [
+                seq(`const ${srcMap} = new _.DictionaryMap(${src})`),
+                ... loop(0)]))];
         }
         default:
             ((_p: never) => {})(p);
@@ -1,8 +1,8 @@
 import * as M from '../meta';
-import { block, braces, Item, keyvalue, parens, seq } from "./block";
+import { block, braces, Item, parens, seq } from "./block";
 import { FieldType, SimpleType, Type } from "./type";
 import { renderType } from "./rendertype";
-import { ModuleContext } from './context';
+import { ModuleContext, buildProduct } from './context';
 
 export function genConstructor(
     mod: ModuleContext,
@@ -29,12 +29,7 @@ export function genConstructor(
         simpleValue = (variant === void 0) && (arg.kind !== 'unit');
     }
 
-    const initializers: Item[] = (variant !== void 0)
-        ? [keyvalue('_variant', JSON.stringify(variant))]
-        : [];
-    formals.forEach(([n, _t]) => initializers.push(seq(JSON.stringify(n), ': ', M.jsId(n))));
-
-    initializers.push(seq(`__as_preserve__() `, block(`return from${M.jsId(definitionName)}(this)`)));
+    const initializers = formals.map(([n, _t]) => ({ fieldName: n, sourceExpr: M.jsId(n) }));
 
     const declArgs: Array<Item> = (formals.length > 1)
         ? [seq(braces(...formals.map(f => M.jsId(f[0]))), ': ',
@@ -48,7 +43,7 @@ export function genConstructor(
         seq(`return `,
             (simpleValue
                 ? 'value'
-                : braces(...initializers))))),
+                : buildProduct(definitionName, variant, initializers))))),
     seq(`${M.jsId(name)}.schema = function () `, block(
         seq(`return `, braces(
             `schema: _schema()`,
@@ -37,7 +37,6 @@ export function simpleType(resolver: RefResolver, p: M.SimplePattern): FieldType
     case 'atom':
         switch (p.atomKind._variant) {
             case 'Boolean': return Type.ref(`boolean`, null);
-            case 'Float': return Type.ref(`number`, null);
             case 'Double': return Type.ref(`number`, null);
             case 'SignedInteger': return Type.ref(`number`, null);
             case 'String': return Type.ref(`string`, null);
@@ -40,14 +40,13 @@ function unconverterFor(ctx: FunctionContext, p: M.Pattern, src: string): Item {
             return `${src}`;
         case 'atom':
             switch (p.atomKind._variant) {
-                case 'Float': return `_.Single(${src})`;
                 case 'Double': return `_.Double(${src})`;
                 default: return `${src}`;
             }
         case 'lit':
             return ctx.mod.literal(p.value);
         case 'embedded':
-            return `_.embed(${src})`;
+            return `${src}`;
         case 'seqof':
             return seq(`${src}.map(v => `,
                        unconverterFor(ctx, M.Pattern.SimplePattern(p.pattern), 'v'),
@@ -58,8 +57,10 @@ function unconverterFor(ctx: FunctionContext, p: M.Pattern, src: string): Item {
                            unconverterFor(ctx, M.Pattern.SimplePattern(p.pattern), 'v'),
                            `)`)));
         case 'dictof':
-            return seq(`new _.Dictionary<_embedded>`, parens(seq(
-                `_.Array.from(${src}.entries()).map(([k, v]) => `,
+            return seq(`_.Dictionary.from<_embedded>`, parens(seq(
+                `_.Array.from(`,
+                M.isSymbolPattern(p.key) ? `_.JsDictionary.entries(${src})` : `${src}.entries()`,
+                `).map(([k, v]) => `,
                 brackets(
                     unconverterFor(ctx, M.Pattern.SimplePattern(p.key), 'k'),
                     unconverterFor(ctx, M.Pattern.SimplePattern(p.value), 'v')),
@@ -96,7 +97,7 @@ function unconverterFor(ctx: FunctionContext, p: M.Pattern, src: string): Item {
             }
         }
     case 'dict':
-        return seq(`new _.Dictionary<_embedded>`, parens(
+        return seq(`_.Dictionary.from<_embedded>`, parens(
             brackets(... Array.from(p.entries.entries()).map(([k, n]) =>
                 brackets(
                     ctx.mod.literal(k),
@@ -1,4 +1,4 @@
-import { SimpleType, Type } from "./type";
+import { isSymbolType, SimpleType, Type } from "./type";
 import { anglebrackets, braces, Item, keyvalue, opseq, seq } from "./block";
 import { ModuleContext } from "./context";
 
@@ -10,7 +10,7 @@ export function variantFor(variantName: string): Item {
     return keyvalue('_variant', JSON.stringify(variantName));
 }
 
-function simpleTypeFields(ctxt: ModuleContext, baseType: Type, t: SimpleType): Item[] {
+function simpleTypeFields(ctxt: ModuleContext, t: SimpleType): Item[] {
     switch (t.kind) {
         case 'unit':
             return [];
@@ -29,35 +29,54 @@ function simpleTypeFields(ctxt: ModuleContext, baseType: Type, t: SimpleType): I
 
 export function renderVariant(
     ctxt: ModuleContext,
-    baseType: Type,
     [variantName, t]: [string, SimpleType],
 ): Item {
-    let fields = simpleTypeFields(ctxt, baseType, t);
+    let fields = simpleTypeFields(ctxt, t);
     return braces(variantFor(variantName), ... fields);
 }
 
 export function renderType(ctxt: ModuleContext, t: Type): Item {
     switch (t.kind) {
         case 'union': return opseq('never', ' | ', ...
-            Array.from(t.variants).flatMap(entry => renderVariant(ctxt, t, entry)));
-        case 'unit': return braces(... simpleTypeFields(ctxt, t, t));
+            Array.from(t.variants).flatMap(entry => renderVariant(ctxt, entry)));
+        case 'unit': return braces(... simpleTypeFields(ctxt, t));
         case 'ref':
             if (t.ref === null && t.typeName === '_embedded') {
                 return t.typeName;
             } else {
                 return seq(t.typeName, ctxt.genericArgsFor(t));
             }
-        case 'set': return seq('_.KeyedSet', anglebrackets(
-            renderType(ctxt, t.type),
-            '_embedded'));
+        case 'set': return seq('_.EncodableSet', anglebrackets(
+            '_embedded',
+            renderType(ctxt, t.type)));
-        case 'dictionary': return seq('_.KeyedDictionary', anglebrackets(
-            renderType(ctxt, t.key),
-            renderType(ctxt, t.value),
-            '_embedded'));
+        case 'dictionary':
+            if (isSymbolType(t.key)) {
+                return seq('_.JsDictionary', anglebrackets(renderType(ctxt, t.value)));
+            } else {
+                return seq('_.EncodableDictionary', anglebrackets(
+                    '_embedded',
+                    renderType(ctxt, t.key),
+                    renderType(ctxt, t.value)));
+            }
         case 'array': return seq('Array', anglebrackets(renderType(ctxt, t.type)));
-        case 'record': return braces(... simpleTypeFields(ctxt, t, t));
+        case 'record': return braces(... simpleTypeFields(ctxt, t));
         default:
             ((_: never) => {})(t);
             throw new Error("Unreachable");
     }
 }
 
+export function renderTypeWithConversionMixins(ctxt: ModuleContext, t: Type): Item {
+    if (t.kind === 'unit' || t.kind === 'record' || t.kind === 'union') {
+        return opseq('any', ' & ',
+                     renderType(ctxt, t),
+                     seq('_.Preservable', ctxt.hasEmbedded(t) ? ctxt.genericArgs() : '<any>'),
+                     seq('_.PreserveWritable', ctxt.hasEmbedded(t) ? ctxt.genericArgs() : '<any>'),
+                     braces(seq('__as_preserve__',
+                                ctxt.hasEmbedded(t) ? '' : ctxt.genericParameters(),
+                                '()',
+                                ': _.Value', ctxt.genericArgs())));
+    } else {
+        return renderType(ctxt, t);
+    }
+}
@@ -35,3 +35,7 @@ export namespace Type {
 }
 
 export const ANY_TYPE: FieldType = Type.ref('_.Value', null);
 
+export function isSymbolType(ty: FieldType): ty is { kind: 'ref', typeName: 'symbol', ref: null } {
+    return ty.kind === 'ref' && ty.typeName === 'symbol' && ty.ref === null;
+}
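The `isSymbolType` predicate added here is a standard TypeScript user-defined type guard: a boolean function whose `ty is ...` return annotation lets the compiler narrow the union after a successful check. A self-contained sketch (the local `FieldType` union is a simplified stand-in for the real one in `./type`):

```typescript
// Simplified stand-in for the compiler's FieldType union.
type FieldType =
    | { kind: 'ref', typeName: string, ref: string | null }
    | { kind: 'unit' };

// A type guard in the same shape as the patch's isSymbolType:
// returning true narrows `ty` to the symbol-reference case.
function isSymbolType(ty: FieldType): ty is { kind: 'ref', typeName: 'symbol', ref: null } {
    return ty.kind === 'ref' && ty.typeName === 'symbol' && ty.ref === null;
}

const t: FieldType = { kind: 'ref', typeName: 'symbol', ref: null };
console.log(isSymbolType(t)); // true
```

After `if (isSymbolType(t.key)) { ... }`, the true branch can treat the key type as `symbol` without casts, which is what lets `renderType` pick `_.JsDictionary` for symbol-keyed maps.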
@@ -1,11 +1,10 @@
-import { Annotated, Bytes, Set, Dictionary, Fold, fold, Record, Tuple, Value, stringify, Embedded } from "@preserves/core";
+import { Annotated, Bytes, Set, Fold, fold, Record, Tuple, Value, stringify, DictionaryMap } from "@preserves/core";
 import { brackets, Item, parens, seq } from "./block";
 import * as M from '../meta';
 
 export function sourceCodeFor(v: Value<M.InputEmbedded>): Item {
     return fold(v, {
         boolean(b: boolean): Item { return b.toString(); },
-        single(f: number): Item { return f.toString(); },
         double(f: number): Item { return f.toString(); },
         integer(i: number): Item { return i.toString(); },
         string(s: string): Item { return JSON.stringify(s); },
@@ -23,8 +22,8 @@ export function sourceCodeFor(v: Value<M.InputEmbedded>): Item {
         set(s: Set<M.InputEmbedded>, k: Fold<M.InputEmbedded, Item>): Item {
             return seq('new _.Set<_.Value<_embedded>>', parens(brackets(... Array.from(s).map(k))));
         },
-        dictionary(d: Dictionary<M.InputEmbedded>, k: Fold<M.InputEmbedded, Item>): Item {
-            return seq('new _.Dictionary<_embedded>', parens(brackets(... Array.from(d).map(([kk,vv]) =>
+        dictionary(d: DictionaryMap<M.InputEmbedded>, k: Fold<M.InputEmbedded, Item>): Item {
+            return seq('_.Dictionary.from<_embedded>', parens(brackets(... Array.from(d).map(([kk,vv]) =>
                 brackets(k(kk), k(vv))))));
         },
 
@@ -32,7 +31,7 @@ export function sourceCodeFor(v: Value<M.InputEmbedded>): Item {
             return seq('_.annotate<_embedded>', parens(k(a.item), ... a.annotations.map(k)));
         },
 
-        embedded(t: Embedded<M.InputEmbedded>, _k: Fold<M.InputEmbedded, Item>): Item {
+        embedded(t: M.InputEmbedded, _k: Fold<M.InputEmbedded, Item>): Item {
             throw new Error(`Cannot emit source code for construction of embedded ${stringify(t)}`);
         },
     });
@@ -0,0 +1,716 @@
+import * as _ from "@preserves/core";
+import * as _i_schema from "./schema";
+
+export const $any = _.Symbol.for("any");
+export const $array = _.Symbol.for("array");
+export const $embedded = _.Symbol.for("embedded");
+export const $map = _.Symbol.for("map");
+export const $rec = _.Symbol.for("rec");
+export const $ref = _.Symbol.for("ref");
+export const $set = _.Symbol.for("set");
+export const $union = _.Symbol.for("union");
+export const $unit = _.Symbol.for("unit");
+
+let __schema: _.Value | null = null;
+
+export function _schema() {
+    if (__schema === null) {
+        __schema = _.decode<_.GenericEmbedded>(_.Bytes.fromHex("b4b306736368656d61b7b30776657273696f6eb00101b30b646566696e6974696f6e73b7b3054669656c64b4b3026f72b5b5b104756e6974b4b3036c6974b304756e69748484b5b103616e79b4b3036c6974b303616e798484b5b108656d626564646564b4b3036c6974b308656d6265646465648484b5b1056172726179b4b303726563b4b3036c6974b305617272617984b4b3057475706c65b5b4b3056e616d6564b307656c656d656e74b4b303726566b584b3054669656c64848484848484b5b103736574b4b303726563b4b3036c6974b30373657484b4b3057475706c65b5b4b3056e616d6564b307656c656d656e74b4b303726566b584b3054669656c64848484848484b5b1036d6170b4b303726563b4b3036c6974b3036d617084b4b3057475706c65b5b4b3056e616d6564b3036b6579b4b303726566b584b3054669656c648484b4b3056e616d6564b30576616c7565b4b303726566b584b3054669656c64848484848484b5b103726566b4b303726563b4b3036c6974b30372656684b4b3057475706c65b5b4b3056e616d6564b3046e616d65b4b303726566b5b306736368656d6184b303526566848484848484b5b10841746f6d4b696e64b4b303726566b5b306736368656d6184b30841746f6d4b696e6484848484b3065265636f7264b4b303726563b4b3036c6974b30372656384b4b3057475706c65b5b4b3056e616d6564b3066669656c6473b4b3057365716f66b4b303726566b584b30a4e616d65644669656c64848484848484b30653696d706c65b4b3026f72b5b5b1054669656c64b4b303726566b584b3054669656c648484b5b1065265636f7264b4b303726566b584b3065265636f726484848484b30756617269616e74b4b3057475706c65b5b4b3056e616d6564b3056c6162656cb4b30461746f6db30653796d626f6c8484b4b3056e616d6564b30474797065b4b303726566b584b30653696d706c6584848484b30a446566696e6974696f6eb4b3026f72b5b5b105756e696f6eb4b303726563b4b3036c6974b305756e696f6e84b4b3057475706c65b5b4b3056e616d6564b30876617269616e7473b4b3057365716f66b4b303726566b584b30756617269616e7484848484848484b5b10653696d706c65b4b303726566b584b30653696d706c6584848484b30a4e616d65644669656c64b4b3057475706c65b5b4b3056e616d6564b3046e616d65b4b30461746f6db30653796d626f6c8484b4b3056e616d6564b30474797065b4b303726566b584b3054669656c648484848484b30c656d62656464656454797065808484"));
+    };
+    return __schema;
+}
+
+export const _imports = {"schema": _i_schema}
+
+
+export type Definition = (
+    (
+        {"_variant": "union", "variants": Array<Variant>} |
+        {"_variant": "Simple", "value": Simple}
+    ) &
+    _.Preservable<any> &
+    _.PreserveWritable<any> &
+    {
+        __as_preserve__<_embedded extends _.Embeddable = _.GenericEmbedded>(): _.Value<_embedded>
+    }
+);
+
+export type Variant = (
+    {"label": symbol, "type": Simple} &
+    _.Preservable<any> &
+    _.PreserveWritable<any> &
+    {
+        __as_preserve__<_embedded extends _.Embeddable = _.GenericEmbedded>(): _.Value<_embedded>
+    }
+);
+
+export type Simple = (
+    (
+        {"_variant": "Field", "value": Field} |
+        {"_variant": "Record", "value": Record}
+    ) &
+    _.Preservable<any> &
+    _.PreserveWritable<any> &
+    {
+        __as_preserve__<_embedded extends _.Embeddable = _.GenericEmbedded>(): _.Value<_embedded>
+    }
+);
+
+export type Record = (
+    {"fields": Array<NamedField>} &
+    _.Preservable<any> &
+    _.PreserveWritable<any> &
+    {
+        __as_preserve__<_embedded extends _.Embeddable = _.GenericEmbedded>(): _.Value<_embedded>
+    }
+);
+
+export type NamedField = (
+    {"name": symbol, "type": Field} &
+    _.Preservable<any> &
+    _.PreserveWritable<any> &
+    {
+        __as_preserve__<_embedded extends _.Embeddable = _.GenericEmbedded>(): _.Value<_embedded>
+    }
+);
+
+export type Field = (
+    (
+        {"_variant": "unit"} |
+        {"_variant": "any"} |
+        {"_variant": "embedded"} |
+        {"_variant": "array", "element": Field} |
+        {"_variant": "set", "element": Field} |
+        {"_variant": "map", "key": Field, "value": Field} |
+        {"_variant": "ref", "name": _i_schema.Ref} |
+        {"_variant": "AtomKind", "value": _i_schema.AtomKind}
+    ) &
+    _.Preservable<any> &
+    _.PreserveWritable<any> &
+    {
+        __as_preserve__<_embedded extends _.Embeddable = _.GenericEmbedded>(): _.Value<_embedded>
+    }
+);
+
+
+export namespace Definition {
+    export function union(variants: Array<Variant>): Definition {
+        return {
+            "_variant": "union",
+            "variants": variants,
+            __as_preserve__() {return fromDefinition(this);},
+            __preserve_on__(e) { e.push(fromDefinition(this)); },
+            __preserve_text_on__(w) { w.push(fromDefinition(this)); }
+        };
+    };
+    union.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Definition"),
+            variant: _.Symbol.for("union")
+        };
+    };
+    export function Simple(value: Simple): Definition {
+        return {
+            "_variant": "Simple",
+            "value": value,
+            __as_preserve__() {return fromDefinition(this);},
+            __preserve_on__(e) { e.push(fromDefinition(this)); },
+            __preserve_text_on__(w) { w.push(fromDefinition(this)); }
+        };
+    };
+    Simple.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Definition"),
+            variant: _.Symbol.for("Simple")
+        };
+    };
+}
+
+export function Variant({label, type}: {label: symbol, type: Simple}): Variant {
+    return {
+        "label": label,
+        "type": type,
+        __as_preserve__() {return fromVariant(this);},
+        __preserve_on__(e) { e.push(fromVariant(this)); },
+        __preserve_text_on__(w) { w.push(fromVariant(this)); }
+    };
+}
+
+Variant.schema = function () {
+    return {schema: _schema(), imports: _imports, definitionName: _.Symbol.for("Variant")};
+}
+
+export namespace Simple {
+    export function Field(value: Field): Simple {
+        return {
+            "_variant": "Field",
+            "value": value,
+            __as_preserve__() {return fromSimple(this);},
+            __preserve_on__(e) { e.push(fromSimple(this)); },
+            __preserve_text_on__(w) { w.push(fromSimple(this)); }
+        };
+    };
+    Field.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Simple"),
+            variant: _.Symbol.for("Field")
+        };
+    };
+    export function Record(value: Record): Simple {
+        return {
+            "_variant": "Record",
+            "value": value,
+            __as_preserve__() {return fromSimple(this);},
+            __preserve_on__(e) { e.push(fromSimple(this)); },
+            __preserve_text_on__(w) { w.push(fromSimple(this)); }
+        };
+    };
+    Record.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Simple"),
+            variant: _.Symbol.for("Record")
+        };
+    };
+}
+
+export function Record(fields: Array<NamedField>): Record {
+    return {
+        "fields": fields,
+        __as_preserve__() {return fromRecord(this);},
+        __preserve_on__(e) { e.push(fromRecord(this)); },
+        __preserve_text_on__(w) { w.push(fromRecord(this)); }
+    };
+}
+
+Record.schema = function () {
+    return {schema: _schema(), imports: _imports, definitionName: _.Symbol.for("Record")};
+}
+
+export function NamedField({name, type}: {name: symbol, type: Field}): NamedField {
+    return {
+        "name": name,
+        "type": type,
+        __as_preserve__() {return fromNamedField(this);},
+        __preserve_on__(e) { e.push(fromNamedField(this)); },
+        __preserve_text_on__(w) { w.push(fromNamedField(this)); }
+    };
+}
+
+NamedField.schema = function () {
+    return {
+        schema: _schema(),
+        imports: _imports,
+        definitionName: _.Symbol.for("NamedField")
+    };
+}
+
+export namespace Field {
+    export function unit(): Field {
+        return {
+            "_variant": "unit",
+            __as_preserve__() {return fromField(this);},
+            __preserve_on__(e) { e.push(fromField(this)); },
+            __preserve_text_on__(w) { w.push(fromField(this)); }
+        };
+    };
+    unit.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Field"),
+            variant: _.Symbol.for("unit")
+        };
+    };
+    export function any(): Field {
+        return {
+            "_variant": "any",
+            __as_preserve__() {return fromField(this);},
+            __preserve_on__(e) { e.push(fromField(this)); },
+            __preserve_text_on__(w) { w.push(fromField(this)); }
+        };
+    };
+    any.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Field"),
+            variant: _.Symbol.for("any")
+        };
+    };
+    export function embedded(): Field {
+        return {
+            "_variant": "embedded",
+            __as_preserve__() {return fromField(this);},
+            __preserve_on__(e) { e.push(fromField(this)); },
+            __preserve_text_on__(w) { w.push(fromField(this)); }
+        };
+    };
+    embedded.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Field"),
+            variant: _.Symbol.for("embedded")
+        };
+    };
+    export function array(element: Field): Field {
+        return {
+            "_variant": "array",
+            "element": element,
+            __as_preserve__() {return fromField(this);},
+            __preserve_on__(e) { e.push(fromField(this)); },
+            __preserve_text_on__(w) { w.push(fromField(this)); }
+        };
+    };
+    array.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Field"),
+            variant: _.Symbol.for("array")
+        };
+    };
+    export function set(element: Field): Field {
+        return {
+            "_variant": "set",
+            "element": element,
+            __as_preserve__() {return fromField(this);},
+            __preserve_on__(e) { e.push(fromField(this)); },
+            __preserve_text_on__(w) { w.push(fromField(this)); }
+        };
+    };
+    set.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Field"),
+            variant: _.Symbol.for("set")
+        };
+    };
+    export function map({key, value}: {key: Field, value: Field}): Field {
+        return {
+            "_variant": "map",
+            "key": key,
+            "value": value,
+            __as_preserve__() {return fromField(this);},
+            __preserve_on__(e) { e.push(fromField(this)); },
+            __preserve_text_on__(w) { w.push(fromField(this)); }
+        };
+    };
+    map.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Field"),
+            variant: _.Symbol.for("map")
+        };
+    };
+    export function ref(name: _i_schema.Ref): Field {
+        return {
+            "_variant": "ref",
+            "name": name,
+            __as_preserve__() {return fromField(this);},
+            __preserve_on__(e) { e.push(fromField(this)); },
+            __preserve_text_on__(w) { w.push(fromField(this)); }
+        };
+    };
+    ref.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Field"),
+            variant: _.Symbol.for("ref")
+        };
+    };
+    export function AtomKind(value: _i_schema.AtomKind): Field {
+        return {
+            "_variant": "AtomKind",
+            "value": value,
+            __as_preserve__() {return fromField(this);},
+            __preserve_on__(e) { e.push(fromField(this)); },
+            __preserve_text_on__(w) { w.push(fromField(this)); }
+        };
+    };
+    AtomKind.schema = function () {
+        return {
+            schema: _schema(),
+            imports: _imports,
+            definitionName: _.Symbol.for("Field"),
+            variant: _.Symbol.for("AtomKind")
+        };
+    };
+}
+
+export function asDefinition<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): Definition {
+    let result = toDefinition(v);
+    if (result === void 0) throw new TypeError(`Invalid Definition: ${_.stringify(v)}`);
+    return result;
+}
+
+export function toDefinition<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): undefined | Definition {
+    let result: undefined | Definition;
+    if (_.Record.isRecord<_.Value<_embedded>, _.Tuple<_.Value<_embedded>>, _embedded>(v)) {
+        let _tmp0: ({}) | undefined;
+        _tmp0 = _.is(v.label, $union) ? {} : void 0;
+        if (_tmp0 !== void 0) {
+            let _tmp1: (Array<Variant>) | undefined;
+            _tmp1 = void 0;
+            if (_.isSequence(v[0])) {
+                _tmp1 = [];
+                for (const _tmp2 of v[0]) {
+                    let _tmp3: (Variant) | undefined;
+                    _tmp3 = toVariant(_tmp2);
+                    if (_tmp3 !== void 0) {_tmp1.push(_tmp3); continue;};
+                    _tmp1 = void 0;
+                    break;
+                };
+            };
+            if (_tmp1 !== void 0) {
+                result = {
+                    "_variant": "union",
+                    "variants": _tmp1,
+                    __as_preserve__() {return fromDefinition(this);},
+                    __preserve_on__(e) { e.push(fromDefinition(this)); },
+                    __preserve_text_on__(w) { w.push(fromDefinition(this)); }
+                };
+            };
+        };
+    };
+    if (result === void 0) {
+        let _tmp4: (Simple) | undefined;
+        _tmp4 = toSimple(v);
+        if (_tmp4 !== void 0) {
+            result = {
+                "_variant": "Simple",
+                "value": _tmp4,
+                __as_preserve__() {return fromDefinition(this);},
+                __preserve_on__(e) { e.push(fromDefinition(this)); },
+                __preserve_text_on__(w) { w.push(fromDefinition(this)); }
+            };
+        };
+    };
+    return result;
+}
+
+export namespace Definition {export const __from_preserve__ = toDefinition;}
+
+export function fromDefinition<_embedded extends _.Embeddable = _.GenericEmbedded>(_v: Definition): _.Value<_embedded> {
+    switch (_v._variant) {
+        case "union": {return _.Record($union, [_v["variants"].map(v => fromVariant<_embedded>(v))]);};
+        case "Simple": {return fromSimple<_embedded>(_v.value);};
+    };
+}
+
+export function asVariant<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): Variant {
+    let result = toVariant(v);
+    if (result === void 0) throw new TypeError(`Invalid Variant: ${_.stringify(v)}`);
+    return result;
+}
+
+export function toVariant<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): undefined | Variant {
+    let result: undefined | Variant;
+    if (_.isSequence(v) && v.length >= 2) {
+        let _tmp0: (symbol) | undefined;
+        _tmp0 = typeof v[0] === 'symbol' ? v[0] : void 0;
+        if (_tmp0 !== void 0) {
+            let _tmp1: (Simple) | undefined;
+            _tmp1 = toSimple(v[1]);
+            if (_tmp1 !== void 0) {
+                result = {
+                    "label": _tmp0,
+                    "type": _tmp1,
+                    __as_preserve__() {return fromVariant(this);},
+                    __preserve_on__(e) { e.push(fromVariant(this)); },
+                    __preserve_text_on__(w) { w.push(fromVariant(this)); }
+                };
+            };
+        };
+    };
+    return result;
+}
+
+Variant.__from_preserve__ = toVariant;
+
+export function fromVariant<_embedded extends _.Embeddable = _.GenericEmbedded>(_v: Variant): _.Value<_embedded> {return [_v["label"], fromSimple<_embedded>(_v["type"])];}
+
+export function asSimple<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): Simple {
+    let result = toSimple(v);
+    if (result === void 0) throw new TypeError(`Invalid Simple: ${_.stringify(v)}`);
+    return result;
+}
+
+export function toSimple<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): undefined | Simple {
+    let _tmp0: (Field) | undefined;
+    let result: undefined | Simple;
+    _tmp0 = toField(v);
+    if (_tmp0 !== void 0) {
+        result = {
+            "_variant": "Field",
+            "value": _tmp0,
+            __as_preserve__() {return fromSimple(this);},
+            __preserve_on__(e) { e.push(fromSimple(this)); },
+            __preserve_text_on__(w) { w.push(fromSimple(this)); }
+        };
+    };
+    if (result === void 0) {
+        let _tmp1: (Record) | undefined;
+        _tmp1 = toRecord(v);
+        if (_tmp1 !== void 0) {
+            result = {
+                "_variant": "Record",
+                "value": _tmp1,
+                __as_preserve__() {return fromSimple(this);},
+                __preserve_on__(e) { e.push(fromSimple(this)); },
+                __preserve_text_on__(w) { w.push(fromSimple(this)); }
+            };
+        };
+    };
+    return result;
+}
+
+export namespace Simple {export const __from_preserve__ = toSimple;}
+
+export function fromSimple<_embedded extends _.Embeddable = _.GenericEmbedded>(_v: Simple): _.Value<_embedded> {
+    switch (_v._variant) {
+        case "Field": {return fromField<_embedded>(_v.value);};
+        case "Record": {return fromRecord<_embedded>(_v.value);};
+    };
+}
+
+export function asRecord<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): Record {
+    let result = toRecord(v);
+    if (result === void 0) throw new TypeError(`Invalid Record: ${_.stringify(v)}`);
+    return result;
+}
+
+export function toRecord<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): undefined | Record {
|
||||||
|
let result: undefined | Record;
|
||||||
|
if (_.Record.isRecord<_.Value<_embedded>, _.Tuple<_.Value<_embedded>>, _embedded>(v)) {
|
||||||
|
let _tmp0: ({}) | undefined;
|
||||||
|
_tmp0 = _.is(v.label, $rec) ? {} : void 0;
|
||||||
|
if (_tmp0 !== void 0) {
|
||||||
|
let _tmp1: (Array<NamedField>) | undefined;
|
||||||
|
_tmp1 = void 0;
|
||||||
|
if (_.isSequence(v[0])) {
|
||||||
|
_tmp1 = [];
|
||||||
|
for (const _tmp2 of v[0]) {
|
||||||
|
let _tmp3: (NamedField) | undefined;
|
||||||
|
_tmp3 = toNamedField(_tmp2);
|
||||||
|
if (_tmp3 !== void 0) {_tmp1.push(_tmp3); continue;};
|
||||||
|
_tmp1 = void 0;
|
||||||
|
break;
|
||||||
|
};
|
||||||
|
};
|
||||||
|
if (_tmp1 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"fields": _tmp1,
|
||||||
|
__as_preserve__() {return fromRecord(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromRecord(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromRecord(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
return result;
|
||||||
|
}
|
||||||
|
|
||||||
|
Record.__from_preserve__ = toRecord;
|
||||||
|
|
||||||
|
export function fromRecord<_embedded extends _.Embeddable = _.GenericEmbedded>(_v: Record): _.Value<_embedded> {return _.Record($rec, [_v["fields"].map(v => fromNamedField<_embedded>(v))]);}
|
||||||
|
|
||||||
|
export function asNamedField<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): NamedField {
|
||||||
|
let result = toNamedField(v);
|
||||||
|
if (result === void 0) throw new TypeError(`Invalid NamedField: ${_.stringify(v)}`);
|
||||||
|
return result;
|
||||||
|
}
|
||||||
|
|
||||||
|
export function toNamedField<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): undefined | NamedField {
|
||||||
|
let result: undefined | NamedField;
|
||||||
|
if (_.isSequence(v) && v.length >= 2) {
|
||||||
|
let _tmp0: (symbol) | undefined;
|
||||||
|
_tmp0 = typeof v[0] === 'symbol' ? v[0] : void 0;
|
||||||
|
if (_tmp0 !== void 0) {
|
||||||
|
let _tmp1: (Field) | undefined;
|
||||||
|
_tmp1 = toField(v[1]);
|
||||||
|
if (_tmp1 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"name": _tmp0,
|
||||||
|
"type": _tmp1,
|
||||||
|
__as_preserve__() {return fromNamedField(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromNamedField(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromNamedField(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
return result;
|
||||||
|
}
|
||||||
|
|
||||||
|
NamedField.__from_preserve__ = toNamedField;
|
||||||
|
|
||||||
|
export function fromNamedField<_embedded extends _.Embeddable = _.GenericEmbedded>(_v: NamedField): _.Value<_embedded> {return [_v["name"], fromField<_embedded>(_v["type"])];}
|
||||||
|
|
||||||
|
export function asField<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): Field {
|
||||||
|
let result = toField(v);
|
||||||
|
if (result === void 0) throw new TypeError(`Invalid Field: ${_.stringify(v)}`);
|
||||||
|
return result;
|
||||||
|
}
|
||||||
|
|
||||||
|
export function toField<_embedded extends _.Embeddable = _.GenericEmbedded>(v: _.Value<_embedded>): undefined | Field {
|
||||||
|
let _tmp0: ({}) | undefined;
|
||||||
|
let result: undefined | Field;
|
||||||
|
_tmp0 = _.is(v, $unit) ? {} : void 0;
|
||||||
|
if (_tmp0 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"_variant": "unit",
|
||||||
|
__as_preserve__() {return fromField(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromField(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromField(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
if (result === void 0) {
|
||||||
|
let _tmp1: ({}) | undefined;
|
||||||
|
_tmp1 = _.is(v, $any) ? {} : void 0;
|
||||||
|
if (_tmp1 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"_variant": "any",
|
||||||
|
__as_preserve__() {return fromField(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromField(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromField(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
if (result === void 0) {
|
||||||
|
let _tmp2: ({}) | undefined;
|
||||||
|
_tmp2 = _.is(v, $embedded) ? {} : void 0;
|
||||||
|
if (_tmp2 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"_variant": "embedded",
|
||||||
|
__as_preserve__() {return fromField(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromField(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromField(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
if (result === void 0) {
|
||||||
|
if (_.Record.isRecord<_.Value<_embedded>, _.Tuple<_.Value<_embedded>>, _embedded>(v)) {
|
||||||
|
let _tmp3: ({}) | undefined;
|
||||||
|
_tmp3 = _.is(v.label, $array) ? {} : void 0;
|
||||||
|
if (_tmp3 !== void 0) {
|
||||||
|
let _tmp4: (Field) | undefined;
|
||||||
|
_tmp4 = toField(v[0]);
|
||||||
|
if (_tmp4 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"_variant": "array",
|
||||||
|
"element": _tmp4,
|
||||||
|
__as_preserve__() {return fromField(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromField(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromField(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
if (result === void 0) {
|
||||||
|
if (_.Record.isRecord<_.Value<_embedded>, _.Tuple<_.Value<_embedded>>, _embedded>(v)) {
|
||||||
|
let _tmp5: ({}) | undefined;
|
||||||
|
_tmp5 = _.is(v.label, $set) ? {} : void 0;
|
||||||
|
if (_tmp5 !== void 0) {
|
||||||
|
let _tmp6: (Field) | undefined;
|
||||||
|
_tmp6 = toField(v[0]);
|
||||||
|
if (_tmp6 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"_variant": "set",
|
||||||
|
"element": _tmp6,
|
||||||
|
__as_preserve__() {return fromField(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromField(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromField(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
if (result === void 0) {
|
||||||
|
if (_.Record.isRecord<_.Value<_embedded>, _.Tuple<_.Value<_embedded>>, _embedded>(v)) {
|
||||||
|
let _tmp7: ({}) | undefined;
|
||||||
|
_tmp7 = _.is(v.label, $map) ? {} : void 0;
|
||||||
|
if (_tmp7 !== void 0) {
|
||||||
|
let _tmp8: (Field) | undefined;
|
||||||
|
_tmp8 = toField(v[0]);
|
||||||
|
if (_tmp8 !== void 0) {
|
||||||
|
let _tmp9: (Field) | undefined;
|
||||||
|
_tmp9 = toField(v[1]);
|
||||||
|
if (_tmp9 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"_variant": "map",
|
||||||
|
"key": _tmp8,
|
||||||
|
"value": _tmp9,
|
||||||
|
__as_preserve__() {return fromField(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromField(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromField(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
if (result === void 0) {
|
||||||
|
if (_.Record.isRecord<_.Value<_embedded>, _.Tuple<_.Value<_embedded>>, _embedded>(v)) {
|
||||||
|
let _tmp10: ({}) | undefined;
|
||||||
|
_tmp10 = _.is(v.label, $ref) ? {} : void 0;
|
||||||
|
if (_tmp10 !== void 0) {
|
||||||
|
let _tmp11: (_i_schema.Ref) | undefined;
|
||||||
|
_tmp11 = _i_schema.toRef<_embedded>(v[0]);
|
||||||
|
if (_tmp11 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"_variant": "ref",
|
||||||
|
"name": _tmp11,
|
||||||
|
__as_preserve__() {return fromField(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromField(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromField(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
if (result === void 0) {
|
||||||
|
let _tmp12: (_i_schema.AtomKind) | undefined;
|
||||||
|
_tmp12 = _i_schema.toAtomKind<_embedded>(v);
|
||||||
|
if (_tmp12 !== void 0) {
|
||||||
|
result = {
|
||||||
|
"_variant": "AtomKind",
|
||||||
|
"value": _tmp12,
|
||||||
|
__as_preserve__() {return fromField(this);},
|
||||||
|
__preserve_on__(e) { e.push(fromField(this)); },
|
||||||
|
__preserve_text_on__(w) { w.push(fromField(this)); }
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
};
|
||||||
|
return result;
|
||||||
|
}
|
||||||
|
|
||||||
|
export namespace Field {export const __from_preserve__ = toField;}
|
||||||
|
|
||||||
|
export function fromField<_embedded extends _.Embeddable = _.GenericEmbedded>(_v: Field): _.Value<_embedded> {
|
||||||
|
switch (_v._variant) {
|
||||||
|
case "unit": {return $unit;};
|
||||||
|
case "any": {return $any;};
|
||||||
|
case "embedded": {return $embedded;};
|
||||||
|
case "array": {return _.Record($array, [fromField<_embedded>(_v["element"])]);};
|
||||||
|
case "set": {return _.Record($set, [fromField<_embedded>(_v["element"])]);};
|
||||||
|
case "map": {
|
||||||
|
return _.Record($map, [fromField<_embedded>(_v["key"]), fromField<_embedded>(_v["value"])]);
|
||||||
|
};
|
||||||
|
case "ref": {return _.Record($ref, [_i_schema.fromRef<_embedded>(_v["name"])]);};
|
||||||
|
case "AtomKind": {return _i_schema.fromAtomKind<_embedded>(_v.value);};
|
||||||
|
};
|
||||||
|
}
|
||||||
|
|
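The generated `toField` above chains alternative decoders: each candidate is tried only while `result` is still `undefined`, and the first success wins. A minimal, self-contained sketch of that decoding pattern (the `firstOf`, `decodeUnit`, and `decodeNumber` names are illustrative, not part of the generated module):

```typescript
// Each decoder returns a value on success or undefined on failure;
// the first success wins, mirroring the nested `if (result === void 0)` chain.
type Decoder<T> = (v: unknown) => T | undefined;

function firstOf<T>(...alts: Decoder<T>[]): Decoder<T> {
    return v => {
        for (const alt of alts) {
            const result = alt(v);
            if (result !== void 0) return result;
        }
        return void 0;
    };
}

// Hypothetical mini-decoders standing in for the generated $unit/$any/... checks.
const decodeUnit: Decoder<{ _variant: string, value?: number }> =
    v => v === null ? { _variant: "unit" } : void 0;
const decodeNumber: Decoder<{ _variant: string, value?: number }> =
    v => typeof v === 'number' ? { _variant: "number", value: v } : void 0;

const decode = firstOf(decodeUnit, decodeNumber);
```

The generated code unrolls this loop into nested `if`s so each alternative can bind its own `_tmpN` temporaries.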
File diff suppressed because one or more lines are too long
@@ -0,0 +1,83 @@
import { compare, Embeddable } from '@preserves/core';

import * as M from './meta';
import * as H from './gen/host';

export * from './gen/host';

export function definitionType<V extends Embeddable>(p: M.Definition<V>): H.Definition {
    switch (p._variant) {
        case 'or': return H.Definition.union([p.pattern0, p.pattern1, ... p.patternN].map(p =>
            H.Variant({ label: Symbol.for(p.variantLabel), type: patternType(p.pattern) })));
        case 'and':
            return H.Definition.Simple(productType([p.pattern0, p.pattern1, ... p.patternN]));
        case 'Pattern':
            return H.Definition.Simple(patternType(p.value));
    }
}

export function patternType<V extends Embeddable>(p: M.Pattern<V>): H.Simple {
    switch (p._variant) {
        case 'SimplePattern':
            return H.Simple.Field(fieldType(p.value));
        case 'CompoundPattern':
            return productType([M.NamedPattern.anonymous(p)]);
    }
}

export function fieldType<V extends Embeddable>(p: M.SimplePattern<V>): H.Field {
    switch (p._variant) {
        case 'any': return H.Field.any();
        case 'atom': return H.Field.AtomKind(p.atomKind);
        case 'embedded': return H.Field.embedded();
        case 'lit': return H.Field.unit();
        case 'seqof': return H.Field.array(fieldType(p.pattern));
        case 'setof': return H.Field.set(fieldType(p.pattern));
        case 'dictof': return H.Field.map({ key: fieldType(p.key), value: fieldType(p.value) });
        case 'Ref': return H.Field.ref(p.value);
    }
}

export function productType<V extends Embeddable>(ps: M.NamedPattern<V>[]): H.Simple {
    const gathered: H.NamedField[] = [];
    ps.forEach(p => gather(p, gathered));
    if (gathered.length === 0) return H.Simple.Field(H.Field.unit());
    return H.Simple.Record(H.Record(gathered));
}

function promote<V extends Embeddable>(p: M.NamedSimplePattern<V>): M.NamedPattern<V> {
    if (p._variant === 'named') return p;
    return M.NamedPattern.anonymous(M.Pattern.SimplePattern(p.value));
}

function gather<V extends Embeddable>(p: M.NamedPattern<V>, into: H.NamedField[]) {
    switch (p._variant) {
        case 'named': {
            const t = fieldType(p.value.pattern);
            if (t._variant !== 'unit') into.push(H.NamedField({ name: p.value.name, type: t }));
            break;
        }
        case 'anonymous': {
            if (p.value._variant === 'SimplePattern') return;
            const q = p.value.value;
            switch (q._variant) {
                case 'rec':
                    gather(q.label, into);
                    gather(q.fields, into);
                    break;
                case 'tuple':
                    q.patterns.forEach(p => gather(p, into));
                    break;
                case 'tuplePrefix':
                    q.fixed.forEach(p => gather(p, into));
                    gather(promote(q.variable), into);
                    break;
                case 'dict': {
                    const items = Array.from(q.entries.entries()).sort((a, b) => compare(a[0], b[0]));
                    items.forEach(([_key, p]) => gather(promote(p), into));
                    break;
                }
            }
        }
    }
}
@@ -3,6 +3,11 @@ export * from './error';
export * from './reader';
export * from './compiler';
export * from './reflection';

export { SchemaInterpreter } from './interpreter';
export * as Interpreter from './interpreter';

export * as Host from './host';
export * as Meta from './meta';
export * as Type from './compiler/type';
export * as GenType from './compiler/gentype';
@@ -0,0 +1,604 @@
import { EncodableDictionary, KeyedDictionary, Dictionary, Value, is, Record, Float, Bytes, isEmbedded, isSequence, Set, Atom, merge as plainMerge, Preservable, PreserveWritable, _iterMap, stringify, fromJS, Embeddable, DictionaryMap, JsDictionary } from '@preserves/core';
import { SchemaDefinition } from './reflection';
import * as M from './meta';
import * as H from './host';

export const UNIT: true = true;

export type Parsed<V extends Embeddable> = Atom | V | Parsed<V>[] | DictOf<V> | Bindings<V>;
export type TopParsed<V extends Embeddable> = Atom | V | Parsed<V>[] | DictOf<V> | TopBindings<V>;

export type Top<V extends Embeddable> =
    & Preservable<V>
    & PreserveWritable<V>
    & { __as_preserve__(): Value<V> };

export type DictOf<V extends Embeddable> = EncodableDictionary<V, Parsed<V>, Parsed<V>>;

export type BindingName = string;
export type Bindings<V extends Embeddable> = { [key: BindingName]: Parsed<V> };
export type TopBindings<V extends Embeddable> = Bindings<V> & Top<V>;

export type SingleConstructor<V extends Embeddable> = ((input: any) => Parsed<V>) & { schema(): SchemaDefinition };
export type MultipleConstructors<V extends Embeddable> = { [key: string]: SingleConstructor<V> };
export type DefinitionConstructors<V extends Embeddable> = SingleConstructor<V> | MultipleConstructors<V>;

export namespace Bindings {
    export function empty<V extends Embeddable>(): Bindings<V> {
        return {};
    }
    export function single<V extends Embeddable>(k: BindingName, v: Parsed<V>): Bindings<V> {
        const bs = empty<V>();
        bs[k] = v;
        return bs;
    }
    export function merge<V extends Embeddable>(... vs: Bindings<V>[]): Bindings<V> {
        const acc = empty<V>();
        for (const v of vs) {
            Object.entries(v).forEach(([kw, vw]) => acc[kw] = vw);
        }
        return acc;
    }
}
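`Bindings.merge` above copies entries left to right, so when two binding sets share a key, the later argument wins. A small standalone illustration of that right bias (plain objects standing in for `Bindings<V>`; `mergeBindings` is an illustrative name, not part of the module):

```typescript
// Later sources win, matching the forEach-assignment loop in Bindings.merge.
function mergeBindings(...vs: Record<string, unknown>[]): Record<string, unknown> {
    const acc: Record<string, unknown> = {};
    for (const v of vs) {
        Object.entries(v).forEach(([k, w]) => acc[k] = w);
    }
    return acc;
}
```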
export type DynField<V extends Embeddable> =
    | { type: 'simple', value: Parsed<V> }
    | { type: 'compound', values: Bindings<V> }
    ;
export namespace DynField {
    export function unwrap<V extends Embeddable>(f: DynField<V>): Parsed<V> {
        if (f.type === 'simple') return f.value;
        return f.values;
    }
    export function unwrap_compound<V extends Embeddable>(f: DynField<V>): Bindings<V> {
        if (f.type === 'simple') throw new Error("Cannot unwrap DynField.simple to compound fields");
        return f.values;
    }
    export function simple<V extends Embeddable>(value: Parsed<V>): DynField<V> {
        return { type: 'simple', value };
    }
    export function maybeSimple<V extends Embeddable>(value: Parsed<V> | null): DynField<V> {
        return value === null ? compound(Bindings.empty()) : simple(value);
    }
    export function compound<V extends Embeddable>(values: Bindings<V>): DynField<V> {
        return { type: 'compound', values };
    }
    export function promote<V extends Embeddable>(f: DynField<V>, key?: symbol): Bindings<V> {
        if (f.type === 'compound') return f.values;
        return key ? Bindings.single(M.jsId(key.description!), f.value) : Bindings.empty();
    }
}

function optmap<A,B>(a: A | undefined, f: (a: A) => B): B | undefined {
    if (a === void 0) return void 0;
    return f(a);
}

export type Unparseable<V extends Embeddable> = TopParsed<V>;
export type Unparser<V extends Embeddable> = (v: Parsed<V>) => Value<V>;
export type UnparserCompound<V extends Embeddable> = (v: Bindings<V>) => Value<V>;

function attachSchema<V extends Embeddable>(
    schema: M.Schema<V>,
    name: symbol,
    f: (input: any) => Parsed<V>,
    variant?: symbol,
): SingleConstructor<V> {
    const g = f as SingleConstructor<V>;
    g.schema = () => ({
        schema: fromJS(schema),
        imports: {}, // TODO
        definitionName: name,
        variant,
    });
    return g;
}

export class SchemaInterpreter<V extends Embeddable> {
    activeModule: M.ModulePath = [];
    unparserCache: { [key: string]: [Unparser<V>] } = {};

    constructor (
        public env: M.Modules<V> = new KeyedDictionary(),
        public mergeEmbeddeds: (a: V, b: V) => V | undefined = (_a, _b) => void 0,
    ) {}

    _withModule<R>(modulePath: M.ModulePath, f: () => R): R {
        const saved = this.activeModule;
        if (modulePath.length > 0) this.activeModule = modulePath;
        try {
            return f();
        } finally {
            if (modulePath.length > 0) this.activeModule = saved;
        }
    }

    _findModule(modulePath: M.ModulePath): { resolved: M.ModulePath, schema: M.Schema<V> } {
        const prefix = this.activeModule.slice();
        while (true) {
            const probe = [... prefix, ... modulePath];
            const schema = this.env.get(probe);
            if (schema !== void 0) {
                return { resolved: probe, schema };
            }
            if (prefix.length === 0) {
                throw new Error(`No such preserves-schema module: ${M.formatModulePath(modulePath)}, referred to in module ${M.formatModulePath(this.activeModule)}`);
            }
            prefix.pop();
        }
    }
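`_findModule` above resolves a module reference relative to the active module: it prepends the active module path and shortens that prefix one segment at a time until a match is found, throwing only once the prefix is exhausted. A self-contained sketch of the same search (dotted string keys and the `findModule` name are illustrative stand-ins for `M.ModulePath` and the method itself):

```typescript
// Try [prefix..., path...] with ever-shorter prefixes, nearest scope first.
function findModule<S>(
    env: Map<string, S>,
    activeModule: string[],
    modulePath: string[],
): { resolved: string[], schema: S } {
    const prefix = activeModule.slice();
    while (true) {
        const probe = [...prefix, ...modulePath];
        const schema = env.get(probe.join('.'));
        if (schema !== void 0) return { resolved: probe, schema };
        if (prefix.length === 0) {
            throw new Error(`No such module: ${modulePath.join('.')}`);
        }
        prefix.pop();
    }
}
```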
    _lookup<R>(
        modulePath: M.ModulePath,
        name: symbol,
        f: (d: M.Definition<V>, schema: M.Schema<V>) => R,
    ): R {
        const { resolved, schema } = this._findModule(modulePath);
        return this._withModule(resolved, () => {
            const definition = JsDictionary.get(schema.definitions, name);
            if (definition === void 0) {
                throw new Error(`No such preserves-schema definition: ${[... modulePath, name].map(s => s.description!).join('.')}`);
            }
            return f(definition, schema);
        });
    }

    makeTop(modulePath: M.ModulePath, name: symbol, fields: Bindings<V>): TopBindings<V> {
        const result = fields as TopBindings<V>;
        result.__as_preserve__ = () => this.unparser(modulePath, name)(result);
        result.__preserve_on__ = function (e) { e.push(this.__as_preserve__()); };
        result.__preserve_text_on__ = function (w) { w.push(this.__as_preserve__()); };
        return result;
    }

    buildConstructor(
        modulePath: M.ModulePath,
        name: symbol,
        schema: M.Schema<V>,
        ty: H.Simple,
        variant?: symbol,
    ): SingleConstructor<V> {
        const flatName = M.formatModulePath([
            ... modulePath, name, ... (variant === void 0 ? [] : [variant])]);
        const mkBase = (variant === void 0)
            ? () => ({})
            : () => ({ _variant: variant.description! });
        switch (ty._variant) {
            case 'Field': {
                const tmp =
                    ty.value._variant === 'unit'
                    ? { [flatName]: () => this.makeTop(modulePath, name, mkBase()) }
                    : (variant === void 0
                        ? { [flatName]: (value: any) => value }
                        : { [flatName]: (value: any) => this.makeTop(
                            modulePath, name, { ... mkBase(), value }) });
                return attachSchema(schema, name, tmp[flatName], variant);
            }
            case 'Record': {
                const rec = ty.value;
                if (rec.fields.length > 1) {
                    const tmp = { [flatName]: (fields: Bindings<V>) =>
                        this.makeTop(modulePath, name, { ... mkBase(), ... fields }) };
                    return attachSchema(schema, name, tmp[flatName], variant);
                } else {
                    const tmp = { [flatName]: (field: Parsed<V>) =>
                        this.makeTop(modulePath, name, {
                            ... mkBase(),
                            [M.jsId(rec.fields[0].name.description!)]: field,
                        }) };
                    return attachSchema(schema, name, tmp[flatName], variant);
                }
            }
        }
    }

    definitionConstructor(
        modulePath: M.ModulePath,
        name: symbol,
    ): DefinitionConstructors<V> {
        return this._lookup(modulePath, name, (definition, schema): DefinitionConstructors<V> => {
            const ty = H.definitionType(definition);
            if (ty._variant === 'union') {
                const multiple: MultipleConstructors<V> = {};
                ty.variants.forEach(v => {
                    multiple[M.jsId(v.label.description!)] = this.buildConstructor(
                        modulePath, name, schema, v.type, v.label);
                });
                return multiple;
            } else {
                return this.buildConstructor(modulePath, name, schema, ty.value);
            }
        });
    }

    parse(
        modulePath: M.ModulePath,
        name: symbol,
        input: Value<V>,
    ): Unparseable<V> {
        const v = this.tryParse(modulePath, name, input);
        if (v === void 0) {
            throw new TypeError(
                `Invalid ${M.formatModulePath([... modulePath, name])}: ${stringify(input)}`)
        }
        return v;
    }

    tryParse(
        modulePath: M.ModulePath,
        name: symbol,
        input: Value<V>,
    ): Unparseable<V> | undefined {
        return this._lookup(modulePath, name, definition =>
            optmap(this.parseDefinition(definition, input), result0 => {
                const ty = H.definitionType(definition);
                if (ty._variant === 'union' || ty.value._variant === 'Record') {
                    return this.makeTop(modulePath, name, result0 as Bindings<V>);
                } else {
                    if (ty.value.value._variant === 'unit') {
                        return this.makeTop(modulePath, name, {});
                    } else {
                        return result0 as Exclude<Parsed<V>, Bindings<V>>;
                    }
                }
            }));
    }

    parseDefinition(d: M.Definition<V>, input: Value<V>): Parsed<V> | undefined {
        switch (d._variant) {
            case 'or':
                return this.parseNamedAlternative(d.pattern0, input) ??
                    this.parseNamedAlternative(d.pattern1, input) ??
                    (() => {
                        for (const p of d.patternN) {
                            const r = this.parseNamedAlternative(p, input);
                            if (r !== void 0) return r;
                        }
                        return void 0;
                    })();
            case 'and': {
                const rs = [this.parseNamedPattern(d.pattern0, input),
                    this.parseNamedPattern(d.pattern1, input),
                    ... d.patternN.map(p => this.parseNamedPattern(p, input))];
                for (const r of rs) {
                    if (r === void 0) return void 0;
                }
                return Bindings.merge(... rs as Bindings<V>[]);
            }
            case 'Pattern':
                return optmap(this.parsePattern(d.value, input), DynField.unwrap);
        }
    }

    parseNamedAlternative(p: M.NamedAlternative<V>, input: Value<V>): Bindings<V> | undefined {
        return optmap(this.parsePattern(p.pattern, input), w => {
            const result = DynField.promote(w, Symbol.for('value'));
            result._variant = p.variantLabel;
            return result;
        });
    }
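In `parseDefinition` above, `or` takes the first alternative that parses, while `and` fails unless every conjunct parses and then merges their bindings. A minimal model of those two rules (the `Parser`, `orParser`, and `andParser` names are illustrative, operating on plain values rather than `Value<V>`):

```typescript
type Parser = (input: unknown) => Record<string, unknown> | undefined;

// `or`: first success wins, as in the ??-chain over pattern0/pattern1/patternN.
function orParser(...alts: Parser[]): Parser {
    return input => {
        for (const alt of alts) {
            const r = alt(input);
            if (r !== void 0) return r;
        }
        return void 0;
    };
}

// `and`: every conjunct must succeed; their bindings are merged.
function andParser(...conjuncts: Parser[]): Parser {
    return input => {
        const acc: Record<string, unknown> = {};
        for (const c of conjuncts) {
            const r = c(input);
            if (r === void 0) return void 0;
            Object.assign(acc, r);
        }
        return acc;
    };
}
```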
    parseNamedPattern(p: M.NamedPattern<V>, input: Value<V>): Bindings<V> | undefined {
        switch (p._variant) {
            case 'named':
                return optmap(this.parseSimplePattern(p.value.pattern, input),
                    w => DynField.promote(DynField.maybeSimple(w), p.value.name));
            case 'anonymous':
                return optmap(this.parsePattern(p.value, input),
                    w => DynField.promote(w));
        }
    }

    parseNamedSimplePattern(p: M.NamedSimplePattern<V>, input: Value<V>): DynField<V> | undefined {
        switch (p._variant) {
            case 'named':
                return optmap(this.parseSimplePattern(p.value.pattern, input),
                    w => DynField.compound(DynField.promote(DynField.maybeSimple(w), p.value.name)));
            case 'anonymous':
                return optmap(this.parseSimplePattern(p.value, input), DynField.maybeSimple<V>);
        }
    }

    parseSimplePattern(p: M.SimplePattern<V>, input: Value<V>): Parsed<V> | null | undefined {
        const inputIf = (b: boolean) => b ? input : void 0;
        switch (p._variant) {
            case 'any': return input;
            case 'atom': switch (p.atomKind._variant) {
                case 'Boolean': return inputIf(typeof input === 'boolean');
                case 'Double': return inputIf(Float.isDouble(input));
                case 'SignedInteger': return inputIf(typeof input === 'number' || typeof input === 'bigint');
                case 'String': return inputIf(typeof input === 'string');
                case 'ByteString': return inputIf(Bytes.isBytes(input));
                case 'Symbol': return inputIf(typeof input === 'symbol');
            }
            case 'embedded': return isEmbedded(input) ? input : void 0;
            case 'lit': return is(input, p.value) ? null : void 0;
            case 'seqof': {
                if (!isSequence(input)) return void 0;
                const result: Parsed<V>[] = [];
                for (const v of input) {
                    const w = this.parseSimplePattern(p.pattern, v);
                    if (w === void 0) return void 0;
                    if (w !== null) result.push(w);
                }
                return result;
            }
            case 'setof': {
                if (!Set.isSet<V>(input)) return void 0;
                const result: Parsed<V>[] = [];
                for (const v of input) {
                    const w = this.parseSimplePattern(p.pattern, v);
                    if (w === void 0) return void 0;
                    if (w !== null) result.push(w);
                }
                return result;
            }
            case 'dictof': {
                if (!Dictionary.isDictionary<V>(input)) return void 0;
                if (M.isSymbolPattern(p.key)) {
                    const result: Bindings<V> = {};
                    for (const [k, v] of Dictionary.asMap<V>(input)) {
                        const kw = this.parseSimplePattern(p.key, k);
                        if (kw === void 0 || typeof kw !== 'symbol') return void 0;
                        const vw = this.parseSimplePattern(p.value, v);
                        if (vw === void 0) return void 0;
                        result[kw.description!] = vw === null ? UNIT : vw;
                    }
                    return result;
                } else {
                    const result: DictOf<V> = new EncodableDictionary(
                        this.unparserSimplePattern(p.key),
                        this.unparserSimplePattern(p.value));
                    for (const [k, v] of Dictionary.asMap<V>(input)) {
                        const kw = this.parseSimplePattern(p.key, k);
                        if (kw === void 0) return void 0;
                        const vw = this.parseSimplePattern(p.value, v);
                        if (vw === void 0) return void 0;
                        result.set(kw === null ? UNIT : kw, vw === null ? UNIT : vw);
                    }
                    return result;
                }
            }
            case 'Ref': return this.tryParse(p.value.module, p.value.name, input);
        }
    }
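Note the three-valued result convention in `parseSimplePattern` above: `undefined` means failure, `null` means "matched a literal, no data to keep", and the `seqof` loop keeps only non-`null` element results. A small standalone model of that convention (the `parseElem` and `parseSeq` names are illustrative):

```typescript
// Literal markers parse to null (matched, no data), numbers parse to
// themselves, anything else fails with undefined.
const parseElem = (v: unknown): number | null | undefined =>
    v === 'sep' ? null :
    typeof v === 'number' ? v :
    void 0;

// seqof: fail if any element fails; drop null (literal) results, as the
// generated loop does with `if (w !== null) result.push(w)`.
function parseSeq(input: unknown[]): number[] | undefined {
    const result: number[] = [];
    for (const v of input) {
        const w = parseElem(v);
        if (w === void 0) return void 0;
        if (w !== null) result.push(w);
    }
    return result;
}
```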
    parseCompoundPattern(p: M.CompoundPattern<V>, input: Value<V>): Bindings<V> | undefined {
        switch (p._variant) {
            case 'rec':
                if (!Record.isRecord<Value<V>, Array<Value<V>>, V>(input)) return void 0;
                return optmap(this.parseNamedPattern(p.label, input.label),
                    lw => optmap(this.parseNamedPattern(p.fields, Array.from(input)),
                        fsw => Bindings.merge(lw, fsw)));
            case 'tuple': {
                if (!isSequence(input)) return void 0;
                if (input.length < p.patterns.length) return void 0;
                let results: Bindings<V>[] = [];
                for (let i = 0; i < p.patterns.length; i++) {
                    const w = this.parseNamedPattern(p.patterns[i], input[i]);
                    if (w === void 0) return void 0;
                    results.push(w);
                }
                return Bindings.merge(... results);
            }
            case 'tuplePrefix': {
                if (!isSequence(input)) return void 0;
                if (input.length < p.fixed.length) return void 0;
                let fixed_results: Bindings<V>[] = [];
                for (let i = 0; i < p.fixed.length; i++) {
                    const w = this.parseNamedPattern(p.fixed[i], input[i]);
                    if (w === void 0) return void 0;
                    fixed_results.push(w);
                }
                const remainder = input.slice(p.fixed.length);
                return optmap(this.parseNamedSimplePattern(p.variable, remainder), vw => {
                    const variable_results = DynField.unwrap_compound(vw);
                    return Bindings.merge(variable_results, ... fixed_results);
                });
            }
            case 'dict': {
                const inputMap = Dictionary.asMap<V>(input);
                if (!inputMap) return void 0;
                const results: Bindings<V>[] = [];
                for (const [key, vp] of p.entries) {
                    const v = inputMap.get(key);
                    if (v === void 0) return void 0;
                    const vw = this.parseNamedSimplePattern(vp, v);
                    if (vw === void 0) return void 0;
                    results.push(DynField.unwrap_compound(vw));
                }
                return Bindings.merge(... results);
            }
        }
    }

    parsePattern(p: M.Pattern<V>, input: Value<V>): DynField<V> | undefined {
        switch (p._variant) {
            case 'SimplePattern':
                return optmap(this.parseSimplePattern(p.value, input), DynField.maybeSimple);
            case 'CompoundPattern':
                return optmap(this.parseCompoundPattern(p.value, input), DynField.compound);
        }
    }

    unparse(
        modulePath: M.ModulePath,
        name: symbol,
        v: Unparseable<V>,
    ): Value<V> {
        return this.unparser(modulePath, name)(v);
    }

    unparser(modulePath: M.ModulePath, name: symbol): Unparser<V> {
|
||||||
|
return this._unparser(modulePath, name)[0];
|
||||||
|
}
|
||||||
|
|
||||||
|
_unparser(modulePath: M.ModulePath, name: symbol): [Unparser<V>] {
|
||||||
|
const key = [... modulePath.map(n => n.description!), name.description!].join('.');
|
||||||
|
if (!(key in this.unparserCache)) {
|
||||||
|
const cell: [Unparser<V>] = [null!];
|
||||||
|
this.unparserCache[key] = cell;
|
||||||
|
cell[0] = this._lookup(modulePath, name, p => this.unparserDefinition(p));
|
||||||
|
}
|
||||||
|
return this.unparserCache[key];
|
||||||
|
}
|
||||||
|
|
||||||
|
unparserDefinition(p: M.Definition<V>): Unparser<V> {
|
||||||
|
switch (p._variant) {
|
||||||
|
case 'or': {
|
||||||
|
const ups = [p.pattern0, p.pattern1, ... p.patternN].map(
|
||||||
|
p => this.unparserNamedAlternative(p));
|
||||||
|
return v => {
|
||||||
|
const bs = v as Bindings<V>;
|
||||||
|
return ups.find(up => up[0] === bs._variant)![1](bs);
|
||||||
|
};
|
||||||
|
}
|
||||||
|
case 'and': {
|
||||||
|
const ups = [p.pattern0, p.pattern1, ... p.patternN].map(
|
||||||
|
p => this.unparserNamedPattern(p));
|
||||||
|
return v => plainMerge(this.mergeEmbeddeds,
|
||||||
|
ups[0](v), ... ups.slice(1).map(up => up(v)));
|
||||||
|
}
|
||||||
|
case 'Pattern':
|
||||||
|
return this.unparserPattern(p.value);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
unparserNamedAlternative(p: M.NamedAlternative<V>): [string, UnparserCompound<V>] {
|
||||||
|
const up = this.unparserPattern(p.pattern);
|
||||||
|
const ty = H.patternType(p.pattern);
|
||||||
|
switch (ty._variant) {
|
||||||
|
case 'Field': return [p.variantLabel, bs => up(bs['value'])];
|
||||||
|
case 'Record': return [p.variantLabel, up];
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
unparserNamedPattern(p: M.NamedPattern<V>): Unparser<V> {
|
||||||
|
switch (p._variant) {
|
||||||
|
case 'named': {
|
||||||
|
const up = this.unparserSimplePattern(p.value.pattern);
|
||||||
|
const key = M.jsId(p.value.name.description!);
|
||||||
|
return v => up((v as Bindings<V>)[key]);
|
||||||
|
}
|
||||||
|
case 'anonymous':
|
||||||
|
return this.unparserPattern(p.value);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
unparserPattern(p: M.Pattern<V>): Unparser<V> {
|
||||||
|
switch (p._variant) {
|
||||||
|
case 'CompoundPattern': {
|
||||||
|
const up = this.unparserCompoundPattern(p.value);
|
||||||
|
return v => up(v as Bindings<V>);
|
||||||
|
}
|
||||||
|
case 'SimplePattern':
|
||||||
|
return this.unparserSimplePattern(p.value);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
unparserSimplePattern(p: M.SimplePattern<V>): Unparser<V> {
|
||||||
|
switch (p._variant) {
|
||||||
|
case 'any': return v => v as Value<V>; // ?!
|
||||||
|
case 'atom': return v => v as Atom;
|
||||||
|
case 'embedded': return v => v as V;
|
||||||
|
case 'lit': return _v => p.value;
|
||||||
|
case 'seqof': {
|
||||||
|
const up = this.unparserSimplePattern(p.pattern);
|
||||||
|
return vs => (vs as Parsed<V>[]).map(up);
|
||||||
|
}
|
||||||
|
case 'setof': {
|
||||||
|
const up = this.unparserSimplePattern(p.pattern);
|
||||||
|
return vs => new Set<V>((vs as Parsed<V>[]).map(up));
|
||||||
|
}
|
||||||
|
case 'dictof': {
|
||||||
|
const kp = this.unparserSimplePattern(p.key);
|
||||||
|
const vp = this.unparserSimplePattern(p.value);
|
||||||
|
return vs => {
|
||||||
|
const d = new DictionaryMap<V>();
|
||||||
|
for (const [k, v] of
|
||||||
|
(Map.isMap(vs)
|
||||||
|
? vs.entries()
|
||||||
|
: JsDictionary.entries(vs as Bindings<V>)))
|
||||||
|
{
|
||||||
|
d.set(kp(k), vp(v));
|
||||||
|
}
|
||||||
|
return M.isSymbolPattern(p.key) ? d.asJsDictionary() : d.asKeyedDictionary();
|
||||||
|
};
|
||||||
|
}
|
||||||
|
case 'Ref': {
|
||||||
|
const up = this._unparser(p.value.module, p.value.name);
|
||||||
|
return v => up[0](v as Bindings<V>);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
unparserCompoundPattern(p: M.CompoundPattern<V>): UnparserCompound<V> {
|
||||||
|
switch (p._variant) {
|
||||||
|
case 'rec': {
|
||||||
|
const lp = this.unparserNamedPattern(p.label);
|
||||||
|
const fp = this.unparserNamedPattern(p.fields);
|
||||||
|
return bs => Record(lp(bs), fp(bs) as Value<V>[]);
|
||||||
|
}
|
||||||
|
case 'tuple': {
|
||||||
|
const ups = p.patterns.map(p => this.unparserNamedPattern(p));
|
||||||
|
return bs => ups.map(up => up(bs));
|
||||||
|
}
|
||||||
|
case 'tuplePrefix': {
|
||||||
|
const fixed = p.fixed.map(p => this.unparserNamedPattern(p));
|
||||||
|
const variable = this.unparserNamedSimplePattern(p.variable);
|
||||||
|
return bs => [... fixed.map(up => up(bs)), ... variable(bs) as Value<V>[]];
|
||||||
|
}
|
||||||
|
case 'dict': {
|
||||||
|
const ups: [Value<V>, Unparser<V>][] = Array.from(p.entries.entries()).map(
|
||||||
|
([key, vp]) => [key, this.unparserNamedSimplePattern(vp)]);
|
||||||
|
return bs => {
|
||||||
|
const result = new DictionaryMap<V>();
|
||||||
|
for (const [key, up] of ups) {
|
||||||
|
result.set(key, up(bs));
|
||||||
|
}
|
||||||
|
return result.simplifiedValue();
|
||||||
|
};
|
||||||
|
}
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
unparserNamedSimplePattern(p: M.NamedSimplePattern<V>): Unparser<V> {
|
||||||
|
switch (p._variant) {
|
||||||
|
case 'named': {
|
||||||
|
const up = this.unparserSimplePattern(p.value.pattern);
|
||||||
|
const key = M.jsId(p.value.name.description!);
|
||||||
|
return v => up((v as Bindings<V>)[key]);
|
||||||
|
}
|
||||||
|
case 'anonymous':
|
||||||
|
return this.unparserSimplePattern(p.value);
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
moduleFor(modulePath: M.ModulePath): { [key: string]: any } | undefined {
|
||||||
|
const schema = this.env.get(modulePath);
|
||||||
|
if (schema === void 0) return void 0;
|
||||||
|
const mod: { [key: string]: any } = {};
|
||||||
|
JsDictionary.forEach(schema.definitions, (_d, n) => {
|
||||||
|
const definitionName = n.description!;
|
||||||
|
const definitionId = M.jsId(definitionName);
|
||||||
|
mod[`${definitionId}`] = this.definitionConstructor(modulePath, n);
|
||||||
|
mod[`from${definitionId}`] = this.unparser(modulePath, n);
|
||||||
|
mod[`to${definitionId}`] = (v: Value<V>) => this.tryParse(modulePath, n, v);
|
||||||
|
mod[`as${definitionId}`] = (v: Value<V>) => this.parse(modulePath, n, v);
|
||||||
|
});
|
||||||
|
return mod;
|
||||||
|
}
|
||||||
|
|
||||||
|
moduleTree(tree: { [key: string]: any } = {}): { [key: string]: any } {
|
||||||
|
for (const modulePath of this.env.keys()) {
|
||||||
|
let container = tree;
|
||||||
|
modulePath.slice(0, -1).forEach(n => {
|
||||||
|
if (!(n.description! in container)) container[n.description!] = {};
|
||||||
|
container = container[n.description!];
|
||||||
|
});
|
||||||
|
container[modulePath[modulePath.length - 1].description!] =
|
||||||
|
this.moduleFor(modulePath)!;
|
||||||
|
}
|
||||||
|
return tree;
|
||||||
|
}
|
||||||
|
}
|
|
@@ -1,4 +1,4 @@
-import { GenericEmbedded, is, Value } from '@preserves/core';
+import { Embeddable, GenericEmbedded, is, Value } from '@preserves/core';
 import * as M from './gen/schema';
 import { isJsKeyword } from './compiler/jskw';
 
@@ -98,3 +98,8 @@ export function namelike(x: Input): string | undefined {
     if (typeof x === 'boolean') return x ? 'true' : 'false';
     return void 0;
 }
+
+export function isSymbolPattern<T extends Embeddable>(p: M.SimplePattern<T>): boolean {
+    return p._variant === 'atom'
+        && p.atomKind._variant === 'Symbol';
+}
@@ -1,4 +1,4 @@
-import { Reader, Annotated, Dictionary, is, peel, preserves, Record, strip, Tuple, Position, position, stringify, isCompound, KeyedDictionary, annotate, annotations, isEmbedded, GenericEmbedded, genericEmbeddedTypeDecode } from '@preserves/core';
+import { Reader, Annotated, Dictionary, is, peel, preserves, Record, strip, Tuple, Position, position, stringify, isCompound, EncodableDictionary, annotate, annotations, isEmbedded, GenericEmbedded, genericEmbeddedTypeDecode, JsDictionary, KeyedDictionary } from '@preserves/core';
 import { Input, Pattern, Schema, Definition, CompoundPattern, SimplePattern } from './meta';
 import * as M from './meta';
 import { SchemaSyntaxError } from './error';
@@ -70,7 +70,7 @@ export function parseSchema(toplevelTokens: Array<Input>, options: SchemaReaderO
 {
     let version: M.Version | undefined = void 0;
     let embeddedType: M.EmbeddedTypeName = M.EmbeddedTypeName.$false();
-    let definitions = new KeyedDictionary<symbol, Definition, M.InputEmbedded>();
+    let definitions: M.Definitions = {};
 
     function process(toplevelTokens: Array<Input>): void {
         const toplevelClauses = splitBy(peel(toplevelTokens) as Array<Input>, M.DOT);
@@ -82,10 +82,10 @@ export function parseSchema(toplevelTokens: Array<Input>, options: SchemaReaderO
                 if (!M.isValidToken(name.description!)) {
                     throw new SchemaSyntaxError(preserves`Invalid definition name: ${name}`, pos);
                 }
-                if (definitions.has(name)) {
+                if (JsDictionary.has(definitions, name)) {
                     throw new SchemaSyntaxError(preserves`Duplicate definition: ${clause}`, pos);
                 }
-                definitions.set(name, parseDefinition(name, pos, clause.slice(2)));
+                JsDictionary.set(definitions, name, parseDefinition(name, pos, clause.slice(2)));
             } else if (clause.length === 2 && is(clause[0], M.$version)) {
                 version = M.asVersion(peel(clause[1]));
             } else if (clause.length === 2 && is(clause[0], M.$embeddedType)) {
@@ -225,7 +225,6 @@ function parsePattern(name: symbol, body0: Array<Input>): Pattern {
     switch (str) {
         case 'any': return ks(M.SimplePattern.any());
         case 'bool': return ks(M.SimplePattern.atom(M.AtomKind.Boolean()));
-        case 'float': return ks(M.SimplePattern.atom(M.AtomKind.Float()));
         case 'double': return ks(M.SimplePattern.atom(M.AtomKind.Double()));
         case 'int': return ks(M.SimplePattern.atom(M.AtomKind.SignedInteger()));
         case 'string': return ks(M.SimplePattern.atom(M.AtomKind.String()));
@@ -233,7 +232,7 @@ function parsePattern(name: symbol, body0: Array<Input>): Pattern {
         case 'symbol': return ks(M.SimplePattern.atom(M.AtomKind.Symbol()));
         default: {
             if (str[0] === '=') {
-                return ks(M.SimplePattern.lit(Symbol.for(str.slice(1))));
+                return ks(M.SimplePattern.lit<GenericEmbedded>(Symbol.for(str.slice(1))));
             } else if (M.isValidQid(str)) {
                 return ks(M.SimplePattern.Ref(parseRef(str, pos)));
             } else {
@@ -261,20 +260,20 @@ function parsePattern(name: symbol, body0: Array<Input>): Pattern {
             if (item.size !== 1) complain();
             const [vp] = item.values();
             return ks(M.SimplePattern.setof(walkSimple(vp)));
-        } else if (Dictionary.isDictionary<M.InputEmbedded, Input>(item)
-                   && item.size === 2
-                   && item.has(M.DOTDOTDOT))
-        {
-            const v = item.clone();
-            v.delete(M.DOTDOTDOT);
-            const [[kp, vp]] = v.entries();
-            return ks(M.SimplePattern.dictof({ key: walkSimple(kp), value: walkSimple(vp) }));
-        } else if (isCompound(item)) {
-            return kf();
-        } else if (isEmbedded(item)) {
-            return ks(M.SimplePattern.embedded(walkSimple(item.embeddedValue.generic)));
         } else {
-            return ks(M.SimplePattern.lit(strip(item)));
+            const itemMap = Dictionary.asMap<M.InputEmbedded, Input>(item);
+            if (itemMap && itemMap.size === 2 && itemMap.has(M.DOTDOTDOT)) {
+                const v = itemMap.clone();
+                v.delete(M.DOTDOTDOT);
+                const [[kp, vp]] = v.entries();
+                return ks(M.SimplePattern.dictof({ key: walkSimple(kp), value: walkSimple(vp) }));
+            } else if (isCompound(item)) {
+                return kf();
+            } else if (isEmbedded(item)) {
+                return ks(M.SimplePattern.embedded(walkSimple(item.generic)));
+            } else {
+                return ks(M.SimplePattern.lit(strip(item)));
+            }
         }
     }
 
@@ -313,19 +312,20 @@ function parsePattern(name: symbol, body0: Array<Input>): Pattern {
         });
     } else if (Array.isArray(item)) {
         return M.CompoundPattern.tuple(item.map(maybeNamed));
-    } else if (Dictionary.isDictionary<M.InputEmbedded, Input>(item) && !item.has(M.DOTDOTDOT)) {
-        return M.CompoundPattern.dict(
-            M.DictionaryEntries(item.mapEntries<M.NamedSimplePattern, Input, M.InputEmbedded>(
-                ([k, vp]) => [
-                    strip(k),
-                    _maybeNamed(
-                        M.NamedSimplePattern.named,
-                        M.NamedSimplePattern.anonymous,
-                        walkSimple,
-                        strip(k))(vp)
-                ])));
     } else {
-        complain();
+        const itemMap = Dictionary.asMap<M.InputEmbedded, Input>(item);
+        if (itemMap && !itemMap.has(M.DOTDOTDOT)) {
+            const entries = new KeyedDictionary<M.InputEmbedded, Input, M.NamedSimplePattern>();
+            itemMap.forEach((vp, k) => entries.set(
+                strip(k),
+                _maybeNamed(M.NamedSimplePattern.named,
+                            M.NamedSimplePattern.anonymous,
+                            walkSimple,
+                            strip(k))(vp)));
+            return M.CompoundPattern.dict(M.DictionaryEntries(entries));
+        } else {
+            complain();
+        }
     }
 }
 
@@ -0,0 +1,29 @@
+import { } from '@preserves/core';
+import { compile, Meta as M, readSchema } from '../src/index';
+import './test-utils';
+import { js as format } from 'js-beautify';
+
+function compileSingleSchema(source: string): string {
+    return format(compile([], [Symbol.for('test')], readSchema(`version 1 . ${source}`), {}));
+}
+
+describe('compiler', () => {
+    it('basics', () => {
+        expect(compileSingleSchema(`X = int .`)).toContain(`\nexport type X = number;\n`);
+    });
+
+    it('symbol-keyed dictionary', () => {
+        const s = compileSingleSchema(`D = { symbol:any ...:... } .`);
+        expect(s).toMatch(/^export type D.*= _.JsDictionary < _.Value < _embedded >> ;$/m);
+    });
+
+    it('string-keyed dictionary', () => {
+        const s = compileSingleSchema(`D = { string:any ...:... } .`);
+        expect(s).toMatch(/^export type D.*= _.EncodableDictionary < _embedded, string, _.Value < _embedded >> ;$/m);
+    });
+
+    it('any-keyed dictionary', () => {
+        const s = compileSingleSchema(`D = { any:any ...:... } .`);
+        expect(s).toMatch(/^export type D.*= _.EncodableDictionary < _embedded, _.Value < _embedded > , _.Value < _embedded >> ;$/m);
+    });
+});
@@ -0,0 +1,111 @@
+import { GenericEmbedded, stringify, fromJS, parse, EncodableDictionary, KeyedDictionary } from '@preserves/core';
+import { SchemaInterpreter, readSchema } from '../src/index';
+import './test-utils';
+
+describe('interpreter', () => {
+    const I = new SchemaInterpreter<GenericEmbedded>();
+
+    const X_schema = readSchema(`
+        version 1 .
+        A = <foo> .
+        B = <bar @v int> .
+        C = <zot @v int @w int> .
+        N = int .
+
+        U = <class @a int @b int> / <true @c int @d int> .
+        V = <public @n int> / <private @n int> .
+        W = <const> / <let> .
+
+        D1 = { symbol:A ...:... } .
+        D2 = { any:A ...:... } .
+        D3 = { string:A ...:... } .
+    `);
+    const X = Symbol.for('X');
+    I.env.set([X], X_schema);
+
+    const E = I.moduleTree();
+
+    it('basically works', () => {
+        const a = E.X.A();
+        const b = E.X.B(22);
+        const c = E.X.C({v: 33, w: 44});
+        const n = E.X.N(22);
+        expect(stringify(a)).toBe('<foo>');
+        expect(stringify(b)).toBe('<bar 22>');
+        expect(stringify(c)).toBe('<zot 33 44>');
+        expect(stringify(n)).toBe('22');
+        expect(parse('<foo>')).is(fromJS(a));
+        expect(parse('<bar 22>')).is(fromJS(b));
+        expect(parse('<zot 33 44>')).is(fromJS(c));
+        expect(parse('22')).is(fromJS(n));
+        expect(stringify(E.X.asA(fromJS(a)))).toBe('<foo>');
+        expect(stringify(E.X.asB(fromJS(b)))).toBe('<bar 22>');
+        expect(stringify(E.X.asC(fromJS(c)))).toBe('<zot 33 44>');
+        expect(stringify(E.X.asN(fromJS(n)))).toBe('22');
+    });
+
+    it('escapes JS keywords', () => {
+        expect(Object.keys(E.X.U)).toEqual(['$class', '$true']);
+        expect(Object.keys(E.X.V)).toEqual(['$public', '$private']);
+        expect(Object.keys(E.X.W)).toEqual(['$const', '$let']);
+    });
+
+    const STANDARD_METHODS = ['__as_preserve__', '__preserve_on__', '__preserve_text_on__'];
+
+    it('accepts correct arguments to n-ary variant ctors', () => {
+        expect(stringify(E.X.U.$class({ a: 123, b: 234 }))).toBe('<class 123 234>');
+        expect(stringify(E.X.U.$true({ c: 123, d: 234 }))).toBe('<true 123 234>');
+        expect(Object.keys(E.X.U.$class({ a: 123, b: 234 }))).toEqual([
+            '_variant', 'a', 'b', ... STANDARD_METHODS]);
+    });
+
+    it('accepts correct arguments to unary variant ctors', () => {
+        expect(stringify(E.X.V.$public(123))).toBe('<public 123>');
+        expect(stringify(E.X.V.$private(123))).toBe('<private 123>');
+        expect(Object.keys(E.X.V.$public(123))).toEqual([
+            '_variant', 'n', ... STANDARD_METHODS]);
+    });
+
+    it('accepts correct arguments to nullary variant ctors', () => {
+        expect(stringify(E.X.W.$const())).toBe('<const>');
+        expect(stringify(E.X.W.$let())).toBe('<let>');
+        expect(Object.keys(E.X.W.$const())).toEqual([
+            '_variant', ... STANDARD_METHODS]);
+    });
+
+    it('produces JsDictionary for symbol-keyed dicts', () => {
+        const v = E.X.asD1(parse('{ a: <foo>, b: <foo>}'));
+        expect(Object.keys(v)).toEqual(['a', 'b']);
+        expect(stringify(E.X.fromA(v.a))).toBe('<foo>');
+        expect(stringify(E.X.fromA(v.b))).toBe('<foo>');
+    });
+
+    it('produces EncodableDictionary for any-keyed dicts', () => {
+        const d2 = E.X.asD2(parse('{ a: <foo>, b: <foo>}'));
+        expect(d2 instanceof EncodableDictionary).toBe(true);
+        expect(Array.from(d2.keys())).toEqual([Symbol.for('a'), Symbol.for('b')]);
+        expect(fromJS(d2.get(Symbol.for('a')))).is(fromJS(E.X.A()));
+    });
+
+    it('accepts either kind of dictionary for symbol-keyed dicts', () => {
+        const v = { a: E.X.A(), b: E.X.A() };
+        expect(stringify(v)).toBe('{a: <foo> b: <foo>}');
+        expect(E.X.fromD1(v)).is(parse('{a: <foo> b: <foo>}'));
+        expect(E.X.fromD1(E.X.D1(v))).is(parse('{a: <foo> b: <foo>}'));
+        expect(E.X.fromD1(E.X.D1(new KeyedDictionary([
+            [parse('a'), parse('<foo>')],
+            [parse('b'), parse('<foo>')],
+        ])))).is(parse('{a: <foo> b: <foo>}'));
+    });
+
+    it('accepts either kind of dictionary for any-keyed dicts', () => {
+        const v = { a: E.X.A(), b: E.X.A() };
+        expect(stringify(v)).toBe('{a: <foo> b: <foo>}');
+        expect(E.X.fromD2(v)).is(parse('{a: <foo> b: <foo>}'));
+        expect(E.X.fromD2(E.X.D2(v))).is(parse('{a: <foo> b: <foo>}'));
+        expect(E.X.fromD2(E.X.D2(new KeyedDictionary([
+            [parse('a'), parse('<foo>')],
+            [parse('b'), parse('<foo>')],
+        ])))).is(parse('{a: <foo> b: <foo>}'));
+    });
+});
@@ -1,3 +1,4 @@
+import { JsDictionary } from '@preserves/core';
 import { readSchema, Meta } from '../src/index';
 
 describe('reader schema', () => {
@@ -9,13 +10,17 @@ describe('reader schema', () => {
     });
     it('is OK with an empty schema correctly versioned', () => {
         const s = readSchema('version 1 .');
-        expect(Object.getOwnPropertyNames(s.version)).toEqual(['__as_preserve__']);
-        expect(s.definitions.size).toBe(0);
+        expect(Object.getOwnPropertyNames(s.version)).toEqual([
+            '__as_preserve__',
+            '__preserve_on__',
+            '__preserve_text_on__',
+        ]);
+        expect(JsDictionary.size(s.definitions)).toBe(0);
         expect(s.embeddedType._variant).toBe('false');
     });
     it('understands patterns under embed', () => {
-        const s = readSchema('version 1 . X = #!0 .');
-        const def: Meta.Definition = s.definitions.get(Symbol.for('X'))!;
+        const s = readSchema('version 1 . X = #:0 .');
+        const def: Meta.Definition = JsDictionary.get(s.definitions, Symbol.for('X'))!;
         if (def._variant !== 'Pattern') fail('bad definition 1');
         if (def.value._variant !== 'SimplePattern') fail ('bad definition 2');
         if (def.value.value._variant !== 'embedded') fail('bad definition 3');
@@ -1,9 +1,9 @@
-import { Value, is, preserves } from '@preserves/core';
+import { Value, is, preserves, Embeddable } from '@preserves/core';
 
 declare global {
     namespace jest {
         interface Matchers<R> {
-            is<T>(expected: Value<T>): R;
+            is<T extends Embeddable>(expected: Value<T>): R;
             toThrowFilter(f: (e: Error) => boolean): R;
         }
     }
@@ -21,6 +21,5 @@ open "cd packages/schema; yarn run compile:watch"
 open "cd packages/schema; yarn run rollup:watch"
 open "cd packages/schema; yarn run test:watch"
 open "cd packages/schema-cli; yarn run compile:watch"
-open "cd packages/schema-cli; yarn run rollup:watch"
 
 tmux select-layout even-vertical
@@ -308,6 +308,18 @@
   resolved "https://registry.yarnpkg.com/@hutson/parse-repository-url/-/parse-repository-url-3.0.2.tgz#98c23c950a3d9b6c8f0daed06da6c3af06981340"
   integrity sha512-H9XAx3hc0BQHY6l+IFSWHDySypcXsvsuLhgYLUGywmJ5pswRVQJUHpOsobnLYp2ZUaUlKiKDrgWWhosOwAEM8Q==
 
+"@isaacs/cliui@^8.0.2":
+  version "8.0.2"
+  resolved "https://registry.yarnpkg.com/@isaacs/cliui/-/cliui-8.0.2.tgz#b37667b7bc181c168782259bab42474fbf52b550"
+  integrity sha512-O8jcjabXaleOG9DQ0+ARXWZBTfnP4WNAqzuiJK7ll44AmxGKv/J2M4TPjxjY3znBCfvBXFzucm1twdyFybFqEA==
+  dependencies:
+    string-width "^5.1.2"
+    string-width-cjs "npm:string-width@^4.2.0"
+    strip-ansi "^7.0.1"
+    strip-ansi-cjs "npm:strip-ansi@^6.0.1"
+    wrap-ansi "^8.1.0"
+    wrap-ansi-cjs "npm:wrap-ansi@^7.0.0"
+
 "@istanbuljs/load-nyc-config@^1.0.0":
   version "1.1.0"
   resolved "https://registry.yarnpkg.com/@istanbuljs/load-nyc-config/-/load-nyc-config-1.1.0.tgz#fd3db1d59ecf7cf121e80650bb86712f9b55eced"
@@ -1417,6 +1429,16 @@
   dependencies:
     "@octokit/openapi-types" "^12.11.0"
 
+"@one-ini/wasm@0.1.1":
+  version "0.1.1"
+  resolved "https://registry.yarnpkg.com/@one-ini/wasm/-/wasm-0.1.1.tgz#6013659736c9dbfccc96e8a9c2b3de317df39323"
+  integrity sha512-XuySG1E38YScSJoMlqovLru4KTUNSjgVTIjyh7qMX6aNN5HY5Ct5LhRJdxO79JtTzKfzV/bnWpz+zquYrISsvw==
+
+"@pkgjs/parseargs@^0.11.0":
+  version "0.11.0"
+  resolved "https://registry.yarnpkg.com/@pkgjs/parseargs/-/parseargs-0.11.0.tgz#a77ea742fab25775145434eb1d2328cf5013ac33"
+  integrity sha512-+1VkjdD0QBLPodGrJUeqarH8VAIvQODIbwh9XpP5Syisf7YoQgsJKPNFoqqLQlu+VQ/tVSshMR6loPMn8U+dPg==
+
 "@rollup/plugin-terser@^0.4":
   version "0.4.0"
   resolved "https://registry.yarnpkg.com/@rollup/plugin-terser/-/plugin-terser-0.4.0.tgz#4c76249ad337f3eb04ab409332f23717af2c1fbf"
@@ -1540,6 +1562,11 @@
   jest-matcher-utils "^27.0.0"
   pretty-format "^27.0.0"
 
+"@types/js-beautify@1.14":
+  version "1.14.3"
+  resolved "https://registry.yarnpkg.com/@types/js-beautify/-/js-beautify-1.14.3.tgz#6ced76f79935e37e0d613110dea369881d93c1ff"
+  integrity sha512-FMbQHz+qd9DoGvgLHxeqqVPaNRffpIu5ZjozwV8hf9JAGpIOzuAf4wGbRSo8LNITHqGjmmVjaMggTT5P4v4IHg==
+
 "@types/minimatch@*":
   version "5.1.2"
   resolved "https://registry.yarnpkg.com/@types/minimatch/-/minimatch-5.1.2.tgz#07508b45797cb81ec3f273011b054cd0755eddca"
@@ -1627,6 +1654,11 @@ abbrev@1:
   resolved "https://registry.yarnpkg.com/abbrev/-/abbrev-1.1.1.tgz#f8f2c887ad10bf67f634f005b6987fed3179aac8"
   integrity sha512-nne9/IiQ/hzIhY6pdDnbBtz7DjPTKrY00P/zvPSm5pOFkl6xuGrGnXn/VtTNNfNtAfZ9/1RtehkszU9qcTii0Q==
 
+abbrev@^2.0.0:
+  version "2.0.0"
+  resolved "https://registry.yarnpkg.com/abbrev/-/abbrev-2.0.0.tgz#cf59829b8b4f03f89dda2771cb7f3653828c89bf"
+  integrity sha512-6/mh1E2u2YgEsCHdY0Yx5oW+61gZU+1vXaoiHHrpKeuRNNgFvS+/jrwHiQhB5apAf5oB7UB7E19ol2R2LKH8hQ==
+
 acorn-globals@^6.0.0:
   version "6.0.0"
   resolved "https://registry.yarnpkg.com/acorn-globals/-/acorn-globals-6.0.0.tgz#46cdd39f0f8ff08a876619b55f5ac8a6dc770b45"
@@ -1716,6 +1748,11 @@ ansi-regex@^5.0.1:
  resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-5.0.1.tgz#082cb2c89c9fe8659a311a53bd6a4dc5301db304"
   integrity sha512-quJQXlTSUGL2LH9SUXo8VwsY4soanhgo6LNSm84E1LBcE8s3O0wpdiRzyR9z/ZZJMlMWv37qOOb9pdJlMUEKFQ==
 
+ansi-regex@^6.0.1:
+  version "6.0.1"
+  resolved "https://registry.yarnpkg.com/ansi-regex/-/ansi-regex-6.0.1.tgz#3183e38fae9a65d7cb5e53945cd5897d0260a06a"
+  integrity sha512-n5M855fKb2SsfMIiFFoVrABHJC8QtHwVx+mHWP3QcEqBHYienj5dHSgjbxtC0WEZXYt4wcD6zrQElDPhFuZgfA==
+
 ansi-styles@^3.2.1:
   version "3.2.1"
   resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-3.2.1.tgz#41fbb20243e50b12be0f04b8dedbf07520ce841d"
@@ -1735,6 +1772,11 @@ ansi-styles@^5.0.0:
   resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-5.2.0.tgz#07449690ad45777d1924ac2abb2fc8895dba836b"
   integrity sha512-Cxwpt2SfTzTtXcfOlzGEee8O+c+MmUgGrNiBcXnuWxuFJHe6a5Hz7qwhwe5OgaSYI0IJvkLqWX1ASG+cJOkEiA==
 
+ansi-styles@^6.1.0:
+  version "6.2.1"
+  resolved "https://registry.yarnpkg.com/ansi-styles/-/ansi-styles-6.2.1.tgz#0e62320cf99c21afff3b3012192546aacbfb05c5"
+  integrity sha512-bN798gFfQX+viw3R7yrGWRqnrN2oRkEkUjjl4JNn4E8GxxbjtG3FbrEIIY3l8/hrwUwIeCZvi4QuOTP4MErVug==
+
 anymatch@^3.0.3, anymatch@~3.1.2:
   version "3.1.3"
   resolved "https://registry.yarnpkg.com/anymatch/-/anymatch-3.1.3.tgz#790c58b19ba1720a84205b57c618d5ad8524973e"
|
||||||
|
@ -1942,6 +1984,13 @@ brace-expansion@^1.1.7:
|
||||||
balanced-match "^1.0.0"
|
balanced-match "^1.0.0"
|
||||||
concat-map "0.0.1"
|
concat-map "0.0.1"
|
||||||
|
|
||||||
|
brace-expansion@^2.0.1:
|
||||||
|
version "2.0.1"
|
||||||
|
resolved "https://registry.yarnpkg.com/brace-expansion/-/brace-expansion-2.0.1.tgz#1edc459e0f0c548486ecf9fc99f2221364b9a0ae"
|
||||||
|
integrity sha512-XnAIvQ8eM+kC6aULx6wuQiwVsnzsi9d3WxzV3FpWTGA19F621kwdbsAcFKXgKUHZWsy+mY6iL1sHTxWEFCytDA==
|
||||||
|
dependencies:
|
||||||
|
balanced-match "^1.0.0"
|
||||||
|
|
||||||
braces@^3.0.2, braces@~3.0.2:
|
braces@^3.0.2, braces@~3.0.2:
|
||||||
version "3.0.2"
|
version "3.0.2"
|
||||||
resolved "https://registry.yarnpkg.com/braces/-/braces-3.0.2.tgz#3454e1a462ee8d599e236df336cd9ea4f8afe107"
|
resolved "https://registry.yarnpkg.com/braces/-/braces-3.0.2.tgz#3454e1a462ee8d599e236df336cd9ea4f8afe107"
|
||||||
|
@ -2232,6 +2281,11 @@ combined-stream@^1.0.6, combined-stream@^1.0.8, combined-stream@~1.0.6:
|
||||||
dependencies:
|
dependencies:
|
||||||
delayed-stream "~1.0.0"
|
delayed-stream "~1.0.0"
|
||||||
|
|
||||||
|
commander@^10.0.0:
|
||||||
|
version "10.0.1"
|
||||||
|
resolved "https://registry.yarnpkg.com/commander/-/commander-10.0.1.tgz#881ee46b4f77d1c1dccc5823433aa39b022cbe06"
|
||||||
|
integrity sha512-y4Mg2tXshplEbSGzx7amzPwKKOCGuoSRP/CjEdwwk0FOGlUbq6lKuoyDZTNZkmxHdJtp54hdfY/JUrdL7Xfdug==
|
||||||
|
|
||||||
commander@^2.20.0:
|
commander@^2.20.0:
|
||||||
version "2.20.3"
|
version "2.20.3"
|
||||||
resolved "https://registry.yarnpkg.com/commander/-/commander-2.20.3.tgz#fd485e84c03eb4881c20722ba48035e8531aeb33"
|
resolved "https://registry.yarnpkg.com/commander/-/commander-2.20.3.tgz#fd485e84c03eb4881c20722ba48035e8531aeb33"
|
||||||
|
@ -2265,7 +2319,7 @@ concat-stream@^2.0.0:
|
||||||
readable-stream "^3.0.2"
|
readable-stream "^3.0.2"
|
||||||
typedarray "^0.0.6"
|
typedarray "^0.0.6"
|
||||||
|
|
||||||
config-chain@^1.1.12:
|
config-chain@^1.1.12, config-chain@^1.1.13:
|
||||||
version "1.1.13"
|
version "1.1.13"
|
||||||
resolved "https://registry.yarnpkg.com/config-chain/-/config-chain-1.1.13.tgz#fad0795aa6a6cdaff9ed1b68e9dff94372c232f4"
|
resolved "https://registry.yarnpkg.com/config-chain/-/config-chain-1.1.13.tgz#fad0795aa6a6cdaff9ed1b68e9dff94372c232f4"
|
||||||
integrity sha512-qj+f8APARXHrM0hraqXYb2/bOVSV4PvJQlNZ/DVj0QrmNM2q2euizkeuVckQ57J+W0mRH6Hvi+k50M4Jul2VRQ==
|
integrity sha512-qj+f8APARXHrM0hraqXYb2/bOVSV4PvJQlNZ/DVj0QrmNM2q2euizkeuVckQ57J+W0mRH6Hvi+k50M4Jul2VRQ==
|
||||||
|
@ -2391,7 +2445,7 @@ create-require@^1.1.0:
|
||||||
resolved "https://registry.yarnpkg.com/create-require/-/create-require-1.1.1.tgz#c1d7e8f1e5f6cfc9ff65f9cd352d37348756c333"
|
resolved "https://registry.yarnpkg.com/create-require/-/create-require-1.1.1.tgz#c1d7e8f1e5f6cfc9ff65f9cd352d37348756c333"
|
||||||
integrity sha512-dcKFX3jn0MpIaXjisoRvexIJVEKzaq7z2rZKxf+MSr9TkdmHmsU4m2lcLojrj/FHl8mk5VxMmYA+ftRkP/3oKQ==
|
integrity sha512-dcKFX3jn0MpIaXjisoRvexIJVEKzaq7z2rZKxf+MSr9TkdmHmsU4m2lcLojrj/FHl8mk5VxMmYA+ftRkP/3oKQ==
|
||||||
|
|
||||||
cross-spawn@^7.0.3:
|
cross-spawn@^7.0.0, cross-spawn@^7.0.3:
|
||||||
version "7.0.3"
|
version "7.0.3"
|
||||||
resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-7.0.3.tgz#f73a85b9d5d41d045551c177e2882d4ac85728a6"
|
resolved "https://registry.yarnpkg.com/cross-spawn/-/cross-spawn-7.0.3.tgz#f73a85b9d5d41d045551c177e2882d4ac85728a6"
|
||||||
integrity sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==
|
integrity sha512-iRDPJKUPVEND7dHPO8rkbOnPpyDygcDFtWjpeWNCgy8WP2rXcxXL8TskReQl6OrB2G7+UJrags1q15Fudc7G6w==
|
||||||
|
@ -2601,6 +2655,11 @@ dynamic-dedupe@^0.3.0:
|
||||||
dependencies:
|
dependencies:
|
||||||
xtend "^4.0.0"
|
xtend "^4.0.0"
|
||||||
|
|
||||||
|
eastasianwidth@^0.2.0:
|
||||||
|
version "0.2.0"
|
||||||
|
resolved "https://registry.yarnpkg.com/eastasianwidth/-/eastasianwidth-0.2.0.tgz#696ce2ec0aa0e6ea93a397ffcf24aa7840c827cb"
|
||||||
|
integrity sha512-I88TYZWc9XiYHRQ4/3c5rjjfgkjhLyW2luGIheGERbNQ6OY7yTybanSpDXZa8y7VUP9YmDcYa+eyq4ca7iLqWA==
|
||||||
|
|
||||||
ecc-jsbn@~0.1.1:
|
ecc-jsbn@~0.1.1:
|
||||||
version "0.1.2"
|
version "0.1.2"
|
||||||
resolved "https://registry.yarnpkg.com/ecc-jsbn/-/ecc-jsbn-0.1.2.tgz#3a83a904e54353287874c564b7549386849a98c9"
|
resolved "https://registry.yarnpkg.com/ecc-jsbn/-/ecc-jsbn-0.1.2.tgz#3a83a904e54353287874c564b7549386849a98c9"
|
||||||
|
@ -2609,6 +2668,16 @@ ecc-jsbn@~0.1.1:
|
||||||
jsbn "~0.1.0"
|
jsbn "~0.1.0"
|
||||||
safer-buffer "^2.1.0"
|
safer-buffer "^2.1.0"
|
||||||
|
|
||||||
|
editorconfig@^1.0.4:
|
||||||
|
version "1.0.4"
|
||||||
|
resolved "https://registry.yarnpkg.com/editorconfig/-/editorconfig-1.0.4.tgz#040c9a8e9a6c5288388b87c2db07028aa89f53a3"
|
||||||
|
integrity sha512-L9Qe08KWTlqYMVvMcTIvMAdl1cDUubzRNYL+WfA4bLDMHe4nemKkpmYzkznE1FwLKu0EEmy6obgQKzMJrg4x9Q==
|
||||||
|
dependencies:
|
||||||
|
"@one-ini/wasm" "0.1.1"
|
||||||
|
commander "^10.0.0"
|
||||||
|
minimatch "9.0.1"
|
||||||
|
semver "^7.5.3"
|
||||||
|
|
||||||
electron-to-chromium@^1.4.284:
|
electron-to-chromium@^1.4.284:
|
||||||
version "1.4.286"
|
version "1.4.286"
|
||||||
resolved "https://registry.yarnpkg.com/electron-to-chromium/-/electron-to-chromium-1.4.286.tgz#0e039de59135f44ab9a8ec9025e53a9135eba11f"
|
resolved "https://registry.yarnpkg.com/electron-to-chromium/-/electron-to-chromium-1.4.286.tgz#0e039de59135f44ab9a8ec9025e53a9135eba11f"
|
||||||
|
@ -2624,6 +2693,11 @@ emoji-regex@^8.0.0:
|
||||||
resolved "https://registry.yarnpkg.com/emoji-regex/-/emoji-regex-8.0.0.tgz#e818fd69ce5ccfcb404594f842963bf53164cc37"
|
resolved "https://registry.yarnpkg.com/emoji-regex/-/emoji-regex-8.0.0.tgz#e818fd69ce5ccfcb404594f842963bf53164cc37"
|
||||||
integrity sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==
|
integrity sha512-MSjYzcWNOA0ewAHpz0MxpYFvwg6yjy1NG3xteoqz644VCo/RPgnr1/GGt+ic3iJTzQ8Eu3TdM14SawnVUmGE6A==
|
||||||
|
|
||||||
|
emoji-regex@^9.2.2:
|
||||||
|
version "9.2.2"
|
||||||
|
resolved "https://registry.yarnpkg.com/emoji-regex/-/emoji-regex-9.2.2.tgz#840c8803b0d8047f4ff0cf963176b32d4ef3ed72"
|
||||||
|
integrity sha512-L18DaJsXSUk2+42pv8mLs5jJT2hqFkFE4j21wOmgbUqsZ2hL72NsUU785g9RXgo3s0ZNgVl42TiHp3ZtOv/Vyg==
|
||||||
|
|
||||||
encoding@^0.1.12:
|
encoding@^0.1.12:
|
||||||
version "0.1.13"
|
version "0.1.13"
|
||||||
resolved "https://registry.yarnpkg.com/encoding/-/encoding-0.1.13.tgz#56574afdd791f54a8e9b2785c0582a2d26210fa9"
|
resolved "https://registry.yarnpkg.com/encoding/-/encoding-0.1.13.tgz#56574afdd791f54a8e9b2785c0582a2d26210fa9"
|
||||||
|
@ -2897,6 +2971,14 @@ for-each@^0.3.3:
|
||||||
dependencies:
|
dependencies:
|
||||||
is-callable "^1.1.3"
|
is-callable "^1.1.3"
|
||||||
|
|
||||||
|
foreground-child@^3.1.0:
|
||||||
|
version "3.1.1"
|
||||||
|
resolved "https://registry.yarnpkg.com/foreground-child/-/foreground-child-3.1.1.tgz#1d173e776d75d2772fed08efe4a0de1ea1b12d0d"
|
||||||
|
integrity sha512-TMKDUnIte6bfb5nWv7V/caI169OHgvwjb7V4WkeUvbQQdjr5rWKqHFiKWb/fcOwB+CzBT+qbWjvj+DVwRskpIg==
|
||||||
|
dependencies:
|
||||||
|
cross-spawn "^7.0.0"
|
||||||
|
signal-exit "^4.0.1"
|
||||||
|
|
||||||
forever-agent@~0.6.1:
|
forever-agent@~0.6.1:
|
||||||
version "0.6.1"
|
version "0.6.1"
|
||||||
resolved "https://registry.yarnpkg.com/forever-agent/-/forever-agent-0.6.1.tgz#fbc71f0c41adeb37f96c577ad1ed42d8fdacca91"
|
resolved "https://registry.yarnpkg.com/forever-agent/-/forever-agent-0.6.1.tgz#fbc71f0c41adeb37f96c577ad1ed42d8fdacca91"
|
||||||
|
@ -3103,6 +3185,17 @@ glob-parent@^5.1.1, glob-parent@^5.1.2, glob-parent@~5.1.2:
|
||||||
dependencies:
|
dependencies:
|
||||||
is-glob "^4.0.1"
|
is-glob "^4.0.1"
|
||||||
|
|
||||||
|
glob@^10.3.3:
|
||||||
|
version "10.3.10"
|
||||||
|
resolved "https://registry.yarnpkg.com/glob/-/glob-10.3.10.tgz#0351ebb809fd187fe421ab96af83d3a70715df4b"
|
||||||
|
integrity sha512-fa46+tv1Ak0UPK1TOy/pZrIybNNt4HCv7SDzwyfiOZkvZLEbjsZkJBPtDHVshZjbecAoAGSC20MjLDG/qr679g==
|
||||||
|
dependencies:
|
||||||
|
foreground-child "^3.1.0"
|
||||||
|
jackspeak "^2.3.5"
|
||||||
|
minimatch "^9.0.1"
|
||||||
|
minipass "^5.0.0 || ^6.0.2 || ^7.0.0"
|
||||||
|
path-scurry "^1.10.1"
|
||||||
|
|
||||||
glob@^7.1, glob@^7.1.1, glob@^7.1.2, glob@^7.1.3, glob@^7.1.4, glob@^7.1.6:
|
glob@^7.1, glob@^7.1.1, glob@^7.1.2, glob@^7.1.3, glob@^7.1.4, glob@^7.1.6:
|
||||||
version "7.2.3"
|
version "7.2.3"
|
||||||
resolved "https://registry.yarnpkg.com/glob/-/glob-7.2.3.tgz#b8df0fb802bbfa8e89bd1d938b4e16578ed44f2b"
|
resolved "https://registry.yarnpkg.com/glob/-/glob-7.2.3.tgz#b8df0fb802bbfa8e89bd1d938b4e16578ed44f2b"
|
||||||
|
@ -3698,6 +3791,15 @@ istanbul-reports@^3.1.3:
|
||||||
html-escaper "^2.0.0"
|
html-escaper "^2.0.0"
|
||||||
istanbul-lib-report "^3.0.0"
|
istanbul-lib-report "^3.0.0"
|
||||||
|
|
||||||
|
jackspeak@^2.3.5:
|
||||||
|
version "2.3.6"
|
||||||
|
resolved "https://registry.yarnpkg.com/jackspeak/-/jackspeak-2.3.6.tgz#647ecc472238aee4b06ac0e461acc21a8c505ca8"
|
||||||
|
integrity sha512-N3yCS/NegsOBokc8GAdM8UcmfsKiSS8cipheD/nivzr700H+nsMOxJjQnvwOcRYVuFkdH0wGUvW2WbXGmrZGbQ==
|
||||||
|
dependencies:
|
||||||
|
"@isaacs/cliui" "^8.0.2"
|
||||||
|
optionalDependencies:
|
||||||
|
"@pkgjs/parseargs" "^0.11.0"
|
||||||
|
|
||||||
jest-changed-files@^27.5.1:
|
jest-changed-files@^27.5.1:
|
||||||
version "27.5.1"
|
version "27.5.1"
|
||||||
resolved "https://registry.yarnpkg.com/jest-changed-files/-/jest-changed-files-27.5.1.tgz#a348aed00ec9bf671cc58a66fcbe7c3dfd6a68f5"
|
resolved "https://registry.yarnpkg.com/jest-changed-files/-/jest-changed-files-27.5.1.tgz#a348aed00ec9bf671cc58a66fcbe7c3dfd6a68f5"
|
||||||
|
@ -4103,6 +4205,22 @@ jest@^27.4:
|
||||||
import-local "^3.0.2"
|
import-local "^3.0.2"
|
||||||
jest-cli "^27.5.1"
|
jest-cli "^27.5.1"
|
||||||
|
|
||||||
|
js-beautify@1.15:
|
||||||
|
version "1.15.1"
|
||||||
|
resolved "https://registry.yarnpkg.com/js-beautify/-/js-beautify-1.15.1.tgz#4695afb508c324e1084ee0b952a102023fc65b64"
|
||||||
|
integrity sha512-ESjNzSlt/sWE8sciZH8kBF8BPlwXPwhR6pWKAw8bw4Bwj+iZcnKW6ONWUutJ7eObuBZQpiIb8S7OYspWrKt7rA==
|
||||||
|
dependencies:
|
||||||
|
config-chain "^1.1.13"
|
||||||
|
editorconfig "^1.0.4"
|
||||||
|
glob "^10.3.3"
|
||||||
|
js-cookie "^3.0.5"
|
||||||
|
nopt "^7.2.0"
|
||||||
|
|
||||||
|
js-cookie@^3.0.5:
|
||||||
|
version "3.0.5"
|
||||||
|
resolved "https://registry.yarnpkg.com/js-cookie/-/js-cookie-3.0.5.tgz#0b7e2fd0c01552c58ba86e0841f94dc2557dcdbc"
|
||||||
|
integrity sha512-cEiJEAEoIbWfCZYKWhVwFuvPX1gETRYPw6LlaTKoxD3s2AkXzkCjnp6h0V77ozyqj0jakteJ4YqDJT830+lVGw==
|
||||||
|
|
||||||
js-tokens@^4.0.0:
|
js-tokens@^4.0.0:
|
||||||
version "4.0.0"
|
version "4.0.0"
|
||||||
resolved "https://registry.yarnpkg.com/js-tokens/-/js-tokens-4.0.0.tgz#19203fb59991df98e3a287050d4647cdeaf32499"
|
resolved "https://registry.yarnpkg.com/js-tokens/-/js-tokens-4.0.0.tgz#19203fb59991df98e3a287050d4647cdeaf32499"
|
||||||
|
@ -4370,6 +4488,11 @@ lru-cache@^6.0.0:
|
||||||
dependencies:
|
dependencies:
|
||||||
yallist "^4.0.0"
|
yallist "^4.0.0"
|
||||||
|
|
||||||
|
"lru-cache@^9.1.1 || ^10.0.0":
|
||||||
|
version "10.2.0"
|
||||||
|
resolved "https://registry.yarnpkg.com/lru-cache/-/lru-cache-10.2.0.tgz#0bd445ca57363465900f4d1f9bd8db343a4d95c3"
|
||||||
|
integrity sha512-2bIM8x+VAf6JT4bKAljS1qUWgMsqZRPGJS6FSahIMPVvctcNhyVp7AJu7quxOW9jwkryBReKZY5tY5JYv2n/7Q==
|
||||||
|
|
||||||
make-dir@^2.1.0:
|
make-dir@^2.1.0:
|
||||||
version "2.1.0"
|
version "2.1.0"
|
||||||
resolved "https://registry.yarnpkg.com/make-dir/-/make-dir-2.1.0.tgz#5f0310e18b8be898cc07009295a30ae41e91e6f5"
|
resolved "https://registry.yarnpkg.com/make-dir/-/make-dir-2.1.0.tgz#5f0310e18b8be898cc07009295a30ae41e91e6f5"
|
||||||
|
@ -4507,6 +4630,13 @@ min-indent@^1.0.0:
|
||||||
resolved "https://registry.yarnpkg.com/min-indent/-/min-indent-1.0.1.tgz#a63f681673b30571fbe8bc25686ae746eefa9869"
|
resolved "https://registry.yarnpkg.com/min-indent/-/min-indent-1.0.1.tgz#a63f681673b30571fbe8bc25686ae746eefa9869"
|
||||||
integrity sha512-I9jwMn07Sy/IwOj3zVkVik2JTvgpaykDZEigL6Rx6N9LbMywwUSMtxET+7lVoDLLd3O3IXwJwvuuns8UB/HeAg==
|
integrity sha512-I9jwMn07Sy/IwOj3zVkVik2JTvgpaykDZEigL6Rx6N9LbMywwUSMtxET+7lVoDLLd3O3IXwJwvuuns8UB/HeAg==
|
||||||
|
|
||||||
|
minimatch@9.0.1:
|
||||||
|
version "9.0.1"
|
||||||
|
resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-9.0.1.tgz#8a555f541cf976c622daf078bb28f29fb927c253"
|
||||||
|
integrity sha512-0jWhJpD/MdhPXwPuiRkCbfYfSKp2qnn2eOc279qI7f+osl/l+prKSrvhg157zSYvx/1nmgn2NqdT6k2Z7zSH9w==
|
||||||
|
dependencies:
|
||||||
|
brace-expansion "^2.0.1"
|
||||||
|
|
||||||
minimatch@^3.0, minimatch@^3.0.4, minimatch@^3.1.1:
|
minimatch@^3.0, minimatch@^3.0.4, minimatch@^3.1.1:
|
||||||
version "3.1.2"
|
version "3.1.2"
|
||||||
resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-3.1.2.tgz#19cd194bfd3e428f049a70817c038d89ab4be35b"
|
resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-3.1.2.tgz#19cd194bfd3e428f049a70817c038d89ab4be35b"
|
||||||
|
@ -4514,6 +4644,13 @@ minimatch@^3.0, minimatch@^3.0.4, minimatch@^3.1.1:
|
||||||
dependencies:
|
dependencies:
|
||||||
brace-expansion "^1.1.7"
|
brace-expansion "^1.1.7"
|
||||||
|
|
||||||
|
minimatch@^9.0.1:
|
||||||
|
version "9.0.3"
|
||||||
|
resolved "https://registry.yarnpkg.com/minimatch/-/minimatch-9.0.3.tgz#a6e00c3de44c3a542bfaae70abfc22420a6da825"
|
||||||
|
integrity sha512-RHiac9mvaRw0x3AYRgDC1CxAP7HTcNrrECeA8YYJeWnpo+2Q5CegtZjaotWTWxDG3UeGA1coE05iH1mPjT/2mg==
|
||||||
|
dependencies:
|
||||||
|
brace-expansion "^2.0.1"
|
||||||
|
|
||||||
minimist-options@4.1.0:
|
minimist-options@4.1.0:
|
||||||
version "4.1.0"
|
version "4.1.0"
|
||||||
resolved "https://registry.yarnpkg.com/minimist-options/-/minimist-options-4.1.0.tgz#c0655713c53a8a2ebd77ffa247d342c40f010619"
|
resolved "https://registry.yarnpkg.com/minimist-options/-/minimist-options-4.1.0.tgz#c0655713c53a8a2ebd77ffa247d342c40f010619"
|
||||||
|
@ -4595,6 +4732,11 @@ minipass@^4.0.0:
|
||||||
resolved "https://registry.yarnpkg.com/minipass/-/minipass-4.0.1.tgz#2b9408c6e81bb8b338d600fb3685e375a370a057"
|
resolved "https://registry.yarnpkg.com/minipass/-/minipass-4.0.1.tgz#2b9408c6e81bb8b338d600fb3685e375a370a057"
|
||||||
integrity sha512-V9esFpNbK0arbN3fm2sxDKqMYgIp7XtVdE4Esj+PE4Qaaxdg1wIw48ITQIOn1sc8xXSmUviVL3cyjMqPlrVkiA==
|
integrity sha512-V9esFpNbK0arbN3fm2sxDKqMYgIp7XtVdE4Esj+PE4Qaaxdg1wIw48ITQIOn1sc8xXSmUviVL3cyjMqPlrVkiA==
|
||||||
|
|
||||||
|
"minipass@^5.0.0 || ^6.0.2 || ^7.0.0":
|
||||||
|
version "7.0.4"
|
||||||
|
resolved "https://registry.yarnpkg.com/minipass/-/minipass-7.0.4.tgz#dbce03740f50a4786ba994c1fb908844d27b038c"
|
||||||
|
integrity sha512-jYofLM5Dam9279rdkWzqHozUo4ybjdZmCsDHePy5V/PbBcVMiSZR97gmAy45aqi8CK1lG2ECd356FU86avfwUQ==
|
||||||
|
|
||||||
minizlib@^1.3.3:
|
minizlib@^1.3.3:
|
||||||
version "1.3.3"
|
version "1.3.3"
|
||||||
resolved "https://registry.yarnpkg.com/minizlib/-/minizlib-1.3.3.tgz#2290de96818a34c29551c8a8d301216bd65a861d"
|
resolved "https://registry.yarnpkg.com/minizlib/-/minizlib-1.3.3.tgz#2290de96818a34c29551c8a8d301216bd65a861d"
|
||||||
|
@ -4742,6 +4884,13 @@ nopt@^5.0.0:
|
||||||
dependencies:
|
dependencies:
|
||||||
abbrev "1"
|
abbrev "1"
|
||||||
|
|
||||||
|
nopt@^7.2.0:
|
||||||
|
version "7.2.0"
|
||||||
|
resolved "https://registry.yarnpkg.com/nopt/-/nopt-7.2.0.tgz#067378c68116f602f552876194fd11f1292503d7"
|
||||||
|
integrity sha512-CVDtwCdhYIvnAzFoJ6NJ6dX3oga9/HyciQDnG1vQDjSLMeKLJ4A93ZqYKDrgYSr1FBY5/hMYC+2VCi24pgpkGA==
|
||||||
|
dependencies:
|
||||||
|
abbrev "^2.0.0"
|
||||||
|
|
||||||
normalize-package-data@^2.0.0, normalize-package-data@^2.3.2, normalize-package-data@^2.5.0:
|
normalize-package-data@^2.0.0, normalize-package-data@^2.3.2, normalize-package-data@^2.5.0:
|
||||||
version "2.5.0"
|
version "2.5.0"
|
||||||
resolved "https://registry.yarnpkg.com/normalize-package-data/-/normalize-package-data-2.5.0.tgz#e66db1838b200c1dfc233225d12cb36520e234a8"
|
resolved "https://registry.yarnpkg.com/normalize-package-data/-/normalize-package-data-2.5.0.tgz#e66db1838b200c1dfc233225d12cb36520e234a8"
|
||||||
|
@ -5158,6 +5307,14 @@ path-parse@^1.0.7:
|
||||||
resolved "https://registry.yarnpkg.com/path-parse/-/path-parse-1.0.7.tgz#fbc114b60ca42b30d9daf5858e4bd68bbedb6735"
|
resolved "https://registry.yarnpkg.com/path-parse/-/path-parse-1.0.7.tgz#fbc114b60ca42b30d9daf5858e4bd68bbedb6735"
|
||||||
integrity sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==
|
integrity sha512-LDJzPVEEEPR+y48z93A0Ed0yXb8pAByGWo/k5YYdYgpY2/2EsOsksJrq7lOHxryrVOn1ejG6oAp8ahvOIQD8sw==
|
||||||
|
|
||||||
|
path-scurry@^1.10.1:
|
||||||
|
version "1.10.1"
|
||||||
|
resolved "https://registry.yarnpkg.com/path-scurry/-/path-scurry-1.10.1.tgz#9ba6bf5aa8500fe9fd67df4f0d9483b2b0bfc698"
|
||||||
|
integrity sha512-MkhCqzzBEpPvxxQ71Md0b1Kk51W01lrYvlMzSUaIzNsODdd7mqhiimSZlr+VegAz5Z6Vzt9Xg2ttE//XBhH3EQ==
|
||||||
|
dependencies:
|
||||||
|
lru-cache "^9.1.1 || ^10.0.0"
|
||||||
|
minipass "^5.0.0 || ^6.0.2 || ^7.0.0"
|
||||||
|
|
||||||
path-type@^3.0.0:
|
path-type@^3.0.0:
|
||||||
version "3.0.0"
|
version "3.0.0"
|
||||||
resolved "https://registry.yarnpkg.com/path-type/-/path-type-3.0.0.tgz#cef31dc8e0a1a3bb0d105c0cd97cf3bf47f4e36f"
|
resolved "https://registry.yarnpkg.com/path-type/-/path-type-3.0.0.tgz#cef31dc8e0a1a3bb0d105c0cd97cf3bf47f4e36f"
|
||||||
|
@ -5667,6 +5824,13 @@ semver@^6.0.0, semver@^6.3.0:
|
||||||
resolved "https://registry.yarnpkg.com/semver/-/semver-6.3.0.tgz#ee0a64c8af5e8ceea67687b133761e1becbd1d3d"
|
resolved "https://registry.yarnpkg.com/semver/-/semver-6.3.0.tgz#ee0a64c8af5e8ceea67687b133761e1becbd1d3d"
|
||||||
integrity sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw==
|
integrity sha512-b39TBaTSfV6yBrapU89p5fKekE2m/NwnDocOVruQFS1/veMgdzuPcnOM34M6CwxW8jH/lxEa5rBoDeUwu5HHTw==
|
||||||
|
|
||||||
|
semver@^7.5.3:
|
||||||
|
version "7.6.0"
|
||||||
|
resolved "https://registry.yarnpkg.com/semver/-/semver-7.6.0.tgz#1a46a4db4bffcccd97b743b5005c8325f23d4e2d"
|
||||||
|
integrity sha512-EnwXhrlwXMk9gKu5/flx5sv/an57AkRplG3hTK68W7FRDN+k+OWBj65M7719OkA82XLBxrcX0KSHj+X5COhOVg==
|
||||||
|
dependencies:
|
||||||
|
lru-cache "^6.0.0"
|
||||||
|
|
||||||
serialize-javascript@^6.0.0:
|
serialize-javascript@^6.0.0:
|
||||||
version "6.0.1"
|
version "6.0.1"
|
||||||
resolved "https://registry.yarnpkg.com/serialize-javascript/-/serialize-javascript-6.0.1.tgz#b206efb27c3da0b0ab6b52f48d170b7996458e5c"
|
resolved "https://registry.yarnpkg.com/serialize-javascript/-/serialize-javascript-6.0.1.tgz#b206efb27c3da0b0ab6b52f48d170b7996458e5c"
|
||||||
|
@ -5712,6 +5876,11 @@ signal-exit@^3.0.0, signal-exit@^3.0.2, signal-exit@^3.0.3:
|
||||||
resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.7.tgz#a9a1767f8af84155114eaabd73f99273c8f59ad9"
|
resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-3.0.7.tgz#a9a1767f8af84155114eaabd73f99273c8f59ad9"
|
||||||
integrity sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ==
|
integrity sha512-wnD2ZE+l+SPC/uoS0vXeE9L1+0wuaMqKlfz9AMUo38JsyLSBWSFcHR1Rri62LZc12vLr1gb3jl7iwQhgwpAbGQ==
|
||||||
|
|
||||||
|
signal-exit@^4.0.1:
|
||||||
|
version "4.1.0"
|
||||||
|
resolved "https://registry.yarnpkg.com/signal-exit/-/signal-exit-4.1.0.tgz#952188c1cbd546070e2dd20d0f41c0ae0530cb04"
|
||||||
|
integrity sha512-bzyZ1e88w9O1iNJbKnOlvYTrWPDl46O1bG0D3XInv+9tkPrxrN8jUUTiFlDkkmKWgn1M6CfIA13SuGqOa9Korw==
|
||||||
|
|
||||||
sisteransi@^1.0.5:
|
sisteransi@^1.0.5:
|
||||||
version "1.0.5"
|
version "1.0.5"
|
||||||
resolved "https://registry.yarnpkg.com/sisteransi/-/sisteransi-1.0.5.tgz#134d681297756437cc05ca01370d3a7a571075ed"
|
resolved "https://registry.yarnpkg.com/sisteransi/-/sisteransi-1.0.5.tgz#134d681297756437cc05ca01370d3a7a571075ed"
|
||||||
|
@ -5887,6 +6056,15 @@ string-length@^4.0.1:
|
||||||
char-regex "^1.0.2"
|
char-regex "^1.0.2"
|
||||||
strip-ansi "^6.0.0"
|
strip-ansi "^6.0.0"
|
||||||
|
|
||||||
|
"string-width-cjs@npm:string-width@^4.2.0":
|
||||||
|
version "4.2.3"
|
||||||
|
resolved "https://registry.yarnpkg.com/string-width/-/string-width-4.2.3.tgz#269c7117d27b05ad2e536830a8ec895ef9c6d010"
|
||||||
|
integrity sha512-wKyQRQpjJ0sIp62ErSZdGsjMJWsap5oRNihHhu6G7JVO/9jIB6UyevL+tXuOqrng8j/cxKTWyWUwvSTriiZz/g==
|
||||||
|
dependencies:
|
||||||
|
emoji-regex "^8.0.0"
|
||||||
|
is-fullwidth-code-point "^3.0.0"
|
||||||
|
strip-ansi "^6.0.1"
|
||||||
|
|
||||||
string-width@^1.0.1:
|
string-width@^1.0.1:
|
||||||
version "1.0.2"
|
version "1.0.2"
|
||||||
resolved "https://registry.yarnpkg.com/string-width/-/string-width-1.0.2.tgz#118bdf5b8cdc51a2a7e70d211e07e2b0b9b107d3"
|
resolved "https://registry.yarnpkg.com/string-width/-/string-width-1.0.2.tgz#118bdf5b8cdc51a2a7e70d211e07e2b0b9b107d3"
|
||||||
|
@ -5905,6 +6083,15 @@ string-width@^1.0.1:
|
||||||
is-fullwidth-code-point "^3.0.0"
|
is-fullwidth-code-point "^3.0.0"
|
||||||
strip-ansi "^6.0.1"
|
strip-ansi "^6.0.1"
|
||||||
|
|
||||||
|
string-width@^5.0.1, string-width@^5.1.2:
|
||||||
|
version "5.1.2"
|
||||||
|
resolved "https://registry.yarnpkg.com/string-width/-/string-width-5.1.2.tgz#14f8daec6d81e7221d2a357e668cab73bdbca794"
|
||||||
|
integrity sha512-HnLOCR3vjcY8beoNLtcjZ5/nxn2afmME6lhrDrebokqMap+XbeW8n9TXpPDOqdGK5qcI3oT0GKTW6wC7EMiVqA==
|
||||||
|
dependencies:
|
||||||
|
eastasianwidth "^0.2.0"
|
||||||
|
emoji-regex "^9.2.2"
|
||||||
|
strip-ansi "^7.0.1"
|
||||||
|
|
||||||
string.prototype.trimend@^1.0.6:
|
string.prototype.trimend@^1.0.6:
|
||||||
version "1.0.6"
|
version "1.0.6"
|
||||||
resolved "https://registry.yarnpkg.com/string.prototype.trimend/-/string.prototype.trimend-1.0.6.tgz#c4a27fa026d979d79c04f17397f250a462944533"
|
resolved "https://registry.yarnpkg.com/string.prototype.trimend/-/string.prototype.trimend-1.0.6.tgz#c4a27fa026d979d79c04f17397f250a462944533"
|
||||||
|
@ -5937,6 +6124,13 @@ string_decoder@~1.1.1:
|
||||||
dependencies:
|
dependencies:
|
||||||
safe-buffer "~5.1.0"
|
safe-buffer "~5.1.0"
|
||||||
|
|
||||||
|
"strip-ansi-cjs@npm:strip-ansi@^6.0.1":
|
||||||
|
version "6.0.1"
|
||||||
|
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-6.0.1.tgz#9e26c63d30f53443e9489495b2105d37b67a85d9"
|
||||||
|
integrity sha512-Y38VPSHcqkFrCpFnQ9vuSXmquuv5oXOKpGeT6aGrr3o3Gc9AlVa6JBfUSOCnbxGGZF+/0ooI7KrPuUSztUdU5A==
|
||||||
|
dependencies:
|
||||||
|
ansi-regex "^5.0.1"
|
||||||
|
|
||||||
strip-ansi@^3.0.0, strip-ansi@^3.0.1:
|
strip-ansi@^3.0.0, strip-ansi@^3.0.1:
|
||||||
version "3.0.1"
|
version "3.0.1"
|
||||||
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-3.0.1.tgz#6a385fb8853d952d5ff05d0e8aaf94278dc63dcf"
|
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-3.0.1.tgz#6a385fb8853d952d5ff05d0e8aaf94278dc63dcf"
|
||||||
|
@ -5951,6 +6145,13 @@ strip-ansi@^6.0.0, strip-ansi@^6.0.1:
|
||||||
dependencies:
|
dependencies:
|
||||||
ansi-regex "^5.0.1"
|
ansi-regex "^5.0.1"
|
||||||
|
|
||||||
|
strip-ansi@^7.0.1:
|
||||||
|
version "7.1.0"
|
||||||
|
resolved "https://registry.yarnpkg.com/strip-ansi/-/strip-ansi-7.1.0.tgz#d5b6568ca689d8561370b0707685d22434faff45"
|
||||||
|
integrity sha512-iq6eVVI64nQQTRYq2KtEg2d2uU7LElhTJwsH4YzIHZshxlgZms/wIc4VoDQTlG/IvVIrBKG06CrZnp0qv7hkcQ==
|
||||||
|
dependencies:
|
||||||
|
ansi-regex "^6.0.1"
|
||||||
|
|
||||||
strip-bom@^3.0.0:
|
strip-bom@^3.0.0:
|
||||||
version "3.0.0"
|
version "3.0.0"
|
||||||
resolved "https://registry.yarnpkg.com/strip-bom/-/strip-bom-3.0.0.tgz#2334c18e9c759f7bdd56fdef7e9ae3d588e68ed3"
|
resolved "https://registry.yarnpkg.com/strip-bom/-/strip-bom-3.0.0.tgz#2334c18e9c759f7bdd56fdef7e9ae3d588e68ed3"
|
||||||
|
@ -6605,6 +6806,15 @@ wordwrap@^1.0.0:
|
||||||
resolved "https://registry.yarnpkg.com/wordwrap/-/wordwrap-1.0.0.tgz#27584810891456a4171c8d0226441ade90cbcaeb"
|
resolved "https://registry.yarnpkg.com/wordwrap/-/wordwrap-1.0.0.tgz#27584810891456a4171c8d0226441ade90cbcaeb"
|
||||||
integrity sha512-gvVzJFlPycKc5dZN4yPkP8w7Dc37BtP1yczEneOb4uq34pXZcvrtRTmWV8W+Ume+XCxKgbjM+nevkyFPMybd4Q==
|
integrity sha512-gvVzJFlPycKc5dZN4yPkP8w7Dc37BtP1yczEneOb4uq34pXZcvrtRTmWV8W+Ume+XCxKgbjM+nevkyFPMybd4Q==
|
||||||
|
|
||||||
|
"wrap-ansi-cjs@npm:wrap-ansi@^7.0.0":
|
||||||
|
version "7.0.0"
|
||||||
|
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-7.0.0.tgz#67e145cff510a6a6984bdf1152911d69d2eb9e43"
|
||||||
|
integrity sha512-YVGIj2kamLSTxw6NsZjoBxfSwsn0ycdesmc4p+Q21c5zPuZ1pl+NfxVdxPtdHvmNVOQ6XSYG4AUtyt/Fi7D16Q==
|
||||||
|
dependencies:
|
||||||
|
ansi-styles "^4.0.0"
|
||||||
|
string-width "^4.1.0"
|
||||||
|
strip-ansi "^6.0.0"
|
||||||
|
|
||||||
wrap-ansi@^7.0.0:
|
wrap-ansi@^7.0.0:
|
||||||
version "7.0.0"
|
version "7.0.0"
|
||||||
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-7.0.0.tgz#67e145cff510a6a6984bdf1152911d69d2eb9e43"
|
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-7.0.0.tgz#67e145cff510a6a6984bdf1152911d69d2eb9e43"
|
||||||
|
@ -6614,6 +6824,15 @@ wrap-ansi@^7.0.0:
|
||||||
string-width "^4.1.0"
|
string-width "^4.1.0"
|
||||||
strip-ansi "^6.0.0"
|
strip-ansi "^6.0.0"
|
||||||
|
|
||||||
|
wrap-ansi@^8.1.0:
|
||||||
|
version "8.1.0"
|
||||||
|
resolved "https://registry.yarnpkg.com/wrap-ansi/-/wrap-ansi-8.1.0.tgz#56dc22368ee570face1b49819975d9b9a5ead214"
|
||||||
|
integrity sha512-si7QWI6zUMq56bESFvagtmzMdGOtoxfR+Sez11Mobfc7tm+VkUckk9bW2UeffTGVUbOksxmSw0AA2gs8g71NCQ==
|
||||||
|
dependencies:
|
||||||
|
ansi-styles "^6.1.0"
|
||||||
|
string-width "^5.0.1"
|
||||||
|
strip-ansi "^7.0.1"
|
||||||
|
|
||||||
wrappy@1:
|
wrappy@1:
|
||||||
version "1.0.2"
|
version "1.0.2"
|
||||||
resolved "https://registry.yarnpkg.com/wrappy/-/wrappy-1.0.2.tgz#b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"
|
resolved "https://registry.yarnpkg.com/wrappy/-/wrappy-1.0.2.tgz#b5243d8f3ec1aa35f1364605bc0d1036e30ab69f"
|
||||||
|
|
|
@@ -1,6 +1,6 @@
 if ! [ -d .venv ]
 then
-    python -m venv .venv
+    python3 -m venv .venv
     . .venv/bin/activate
     pip install -e '.[dev]'
 else

@@ -1,6 +1,6 @@
 # This used to just be
 # PACKAGEVERSION := "`python3 setup.py --version`"
-PACKAGEVERSION := $(shell python -c 'import tomllib; print(tomllib.load(open("pyproject.toml", "rb"))["project"]["version"])')
+PACKAGEVERSION := $(shell ./print-package-version)
 
 all: test build-docs
 

@@ -7,7 +7,6 @@ The main package re-exports a subset of the exports of its constituent modules:
 - From [preserves.values][]:
     - [Annotated][preserves.values.Annotated]
     - [Embedded][preserves.values.Embedded]
-    - [Float][preserves.values.Float]
     - [ImmutableDict][preserves.values.ImmutableDict]
     - [Record][preserves.values.Record]
     - [Symbol][preserves.values.Symbol]
@@ -56,7 +55,7 @@ Finally, it provides a few utility aliases for common tasks:
 
 '''
 
-from .values import Float, Symbol, Record, ImmutableDict, Embedded, preserve
+from .values import Symbol, Record, ImmutableDict, Embedded, preserve
 
 from .values import Annotated, is_annotated, strip_annotations, annotate
 

@@ -206,7 +206,6 @@ class Decoder(BinaryCodec):
             return self.wrap(Embedded(self.decode_embedded(self.next())))
         if tag == 0x87:
             count = self.nextbyte()
-            if count == 4: return self.wrap(Float.from_bytes(self.nextbytes(4)))
             if count == 8: return self.wrap(struct.unpack('>d', self.nextbytes(8))[0])
             raise DecodeError('Invalid IEEE754 size')
         if tag == 0xb0: return self.wrap(self.nextint(self.varint()))
@@ -218,7 +217,11 @@ class Decoder(BinaryCodec):
             if not vs: raise DecodeError('Too few elements in encoded record')
             return self.wrap(Record(vs[0], vs[1:]))
         if tag == 0xb5: return self.wrap(tuple(self.nextvalues()))
-        if tag == 0xb6: return self.wrap(frozenset(self.nextvalues()))
+        if tag == 0xb6:
+            vs = self.nextvalues()
+            s = frozenset(vs)
+            if len(s) != len(vs): raise DecodeError('Duplicate value')
+            return self.wrap(s)
         if tag == 0xb7: return self.wrap(ImmutableDict.from_kvs(self.nextvalues()))
         raise DecodeError('Invalid tag: ' + hex(tag))
 

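The new `0xb6` (set) branch rejects encodings whose elements collapse when collected into a `frozenset`: building the set silently drops duplicates, so comparing lengths before and after detects them. The same check in isolation (the helper name and `ValueError` are illustrative, not the library's API):

```python
def decode_set(values):
    """Build a frozenset from decoded elements, rejecting duplicates.

    frozenset() silently deduplicates, so a length mismatch between
    the input sequence and the resulting set reveals a duplicate in
    the encoding. (Illustrative helper, not part of preserves.)
    """
    s = frozenset(values)
    if len(s) != len(values):
        raise ValueError('Duplicate value in encoded set')
    return s
```

So `decode_set([1, 2, 3])` succeeds, while `decode_set([1, 1, 2])` raises, matching the stricter behaviour the diff introduces.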
@ -33,12 +33,12 @@ import numbers
|
||||||
 from enum import Enum
 from functools import cmp_to_key
 
-from .values import preserve, Float, Embedded, Record, Symbol, cmp_floats, _unwrap
+from .values import preserve, Embedded, Record, Symbol, cmp_floats, _unwrap
 from .compat import basestring_
 
 class TypeNumber(Enum):
     BOOL = 0
-    FLOAT = 1
+    # FLOAT = 1 # single-precision
     DOUBLE = 2
     SIGNED_INTEGER = 3
     STRING = 4
@@ -57,7 +57,6 @@ def type_number(v):
         raise ValueError('type_number expects Preserves value; use preserve()')
 
     if isinstance(v, bool): return TypeNumber.BOOL
-    if isinstance(v, Float): return TypeNumber.FLOAT
     if isinstance(v, float): return TypeNumber.DOUBLE
     if isinstance(v, numbers.Number): return TypeNumber.SIGNED_INTEGER
     if isinstance(v, basestring_): return TypeNumber.STRING
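The dispatch order in this hunk matters: in Python, `bool` is a subclass of `int` (and hence of `numbers.Number`), so the `bool` test must run before the generic number test or `True` would be classified as a signed integer. A sketch of the same ordering, using a hypothetical `Kind` enum rather than the module's actual `TypeNumber`:

```python
import numbers
from enum import Enum

class Kind(Enum):
    BOOL = 0
    DOUBLE = 2
    SIGNED_INTEGER = 3

def kind_of(v):
    # bool first: isinstance(True, numbers.Number) is also True,
    # so the generic Number check would otherwise shadow it.
    if isinstance(v, bool): return Kind.BOOL
    if isinstance(v, float): return Kind.DOUBLE
    if isinstance(v, numbers.Number): return Kind.SIGNED_INTEGER
    raise ValueError('unsupported value')

print(kind_of(True))   # Kind.BOOL, not Kind.SIGNED_INTEGER
print(kind_of(7))      # Kind.SIGNED_INTEGER
```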
@@ -9,7 +9,7 @@ def map_embeddeds(f, v):
 
     ```python
     >>> map_embeddeds(lambda w: Embedded(f'w={w}'), ['a', Embedded(123), {'z': 6.0}])
-    ('a', #!'w=123', {'z': 6.0})
+    ('a', #:'w=123', {'z': 6.0})
 
     ```
     """
@@ -1,5 +1,5 @@
 ´³schema·³version°³definitions·³Axis´³orµµ±values´³rec´³lit³values„´³tupleµ„„„„µ±descendants´³rec´³lit³descendants„´³tupleµ„„„„µ±at´³rec´³lit³at„´³tupleµ´³named³key³any„„„„„µ±label´³rec´³lit³label„´³tupleµ„„„„µ±keys´³rec´³lit³keys„´³tupleµ„„„„µ±length´³rec´³lit³length„´³tupleµ„„„„µ±annotations´³rec´³lit³annotations„´³tupleµ„„„„µ±embedded´³rec´³lit³embedded„´³tupleµ„„„„µ±parse´³rec´³lit³parse„´³tupleµ´³named³module´³seqof´³atom³Symbol„„„´³named³name´³atom³Symbol„„„„„„µ±unparse´³rec´³lit³unparse„´³tupleµ´³named³module´³seqof´³atom³Symbol„„„´³named³name´³atom³Symbol„„„„„„„„³Step´³orµµ±Axis´³refµ„³Axis„„µ±Filter´³refµ„³Filter„„µ±Function´³refµ„³Function„„„„³Filter´³orµµ±nop´³rec´³lit³nop„´³tupleµ„„„„µ±compare´³rec´³lit³compare„´³tupleµ´³named³op´³refµ„³
-Comparison„„´³named³literal³any„„„„„µ±regex´³rec´³lit³regex„´³tupleµ´³named³regex´³atom³String„„„„„„µ±test´³rec´³lit³test„´³tupleµ´³named³pred´³refµ„³ Predicate„„„„„„µ±real´³rec´³lit³real„´³tupleµ„„„„µ±int´³rec´³lit³int„´³tupleµ„„„„µ±kind´³rec´³lit³kind„´³tupleµ´³named³kind´³refµ„³ ValueKind„„„„„„„„³Function´³rec´³lit³count„´³tupleµ´³named³selector´³refµ„³Selector„„„„„³Selector´³seqof´³refµ„³Step„„³ Predicate´³orµµ±Selector´³refµ„³Selector„„µ±not´³rec´³lit³not„´³tupleµ´³named³pred´³refµ„³ Predicate„„„„„„µ±or´³rec´³lit³or„´³tupleµ´³named³preds´³seqof´³refµ„³ Predicate„„„„„„„µ±and´³rec´³lit³and„´³tupleµ´³named³preds´³seqof´³refµ„³ Predicate„„„„„„„„„³ ValueKind´³orµµ±Boolean´³lit³Boolean„„µ±Float´³lit³Float„„µ±Double´³lit³Double„„µ±
SignedInteger´³lit³
SignedInteger„„µ±String´³lit³String„„µ±
+Comparison„„´³named³literal³any„„„„„µ±regex´³rec´³lit³regex„´³tupleµ´³named³regex´³atom³String„„„„„„µ±test´³rec´³lit³test„´³tupleµ´³named³pred´³refµ„³ Predicate„„„„„„µ±real´³rec´³lit³real„´³tupleµ„„„„µ±int´³rec´³lit³int„´³tupleµ„„„„µ±kind´³rec´³lit³kind„´³tupleµ´³named³kind´³refµ„³ ValueKind„„„„„„„„³Function´³rec´³lit³count„´³tupleµ´³named³selector´³refµ„³Selector„„„„„³Selector´³seqof´³refµ„³Step„„³ Predicate´³orµµ±Selector´³refµ„³Selector„„µ±not´³rec´³lit³not„´³tupleµ´³named³pred´³refµ„³ Predicate„„„„„„µ±or´³rec´³lit³or„´³tupleµ´³named³preds´³seqof´³refµ„³ Predicate„„„„„„„µ±and´³rec´³lit³and„´³tupleµ´³named³preds´³seqof´³refµ„³ Predicate„„„„„„„„„³ ValueKind´³orµµ±Boolean´³lit³Boolean„„µ±Double´³lit³Double„„µ±
SignedInteger´³lit³
SignedInteger„„µ±String´³lit³String„„µ±
 ByteString´³lit³
 ByteString„„µ±Symbol´³lit³Symbol„„µ±Record´³lit³Record„„µ±Sequence´³lit³Sequence„„µ±Set´³lit³Set„„µ±
 Dictionary´³lit³
|
Some files were not shown because too many files have changed in this diff.