
Types that reference JSONSchema are currently unusable #111

Open
brendanhay opened this issue Jan 31, 2020 · 5 comments
@brendanhay

As a concrete example, I'm attempting to convert various helm charts such as cert-manager into Dhall and there are various errors in the CustomResourceDefinition and related CustomResourceValidation types that are supposed to use JSONSchema:

  • JSONSchemaProps
    • externalDocumentation is required but has its fields marked as optional. The property is actually optional.
    • properties has a Text value. It should be of type JSONSchema, whatever that might be.
  • JSONSchemaPropsOrBool, JSONSchemaPropsOrArray, JSONSchemaPropsOrStringArray
    • All of these types are generated as {}. They should reference a recursive JSONSchema type similarly to JSONSchemaProps.properties.
  • JSON
    • Is generated as {}. Probably it should point to either Prelude.JSON.Type or Text. (I personally find the former untenable outside of automated conversion.)

Some thoughts/observations:

  • I've currently 'solved' this by inlining entire CustomResourceDefinition's YAML as Text and post-processing this to decode the Text into a JSON Value and reinsert it into the YAML/JSON AST when converting Dhall to YAML.
  • Probably the JSONSchema related types should not be generated and there should be a separate JSONSchema Dhall package that is pulled in as a dependency, since the spec is versioned and won't be subject to change on dhall-kubernetes regeneration.
  • Prelude.JSON.Type could be used as a shorter-term solution - but it's absolutely horrendous to work with when trying to represent ad-hoc JSON by hand.
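For illustration, the Text-inlining workaround described above might look something like the following sketch (the field name and YAML content here are hypothetical, not taken from cert-manager):

```dhall
-- Hypothetical sketch of the workaround: carry the schema as an opaque
-- multi-line Text literal, then decode it back into a JSON value while
-- post-processing the rendered YAML.
let openAPIV3Schema
    : Text
    = ''
      type: object
      properties:
        spec:
          type: object
      ''

in  { rawSchema = openAPIV3Schema }
```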
@Gabriella439
Contributor

Gabriella439 commented Jan 31, 2020

@brendanhay: Note that #110 just changed the types to more accurately match the OpenAPI spec. For example the externalDocumentation field you mentioned is now properly Optional in its own right:

    , externalDocs :
        Optional
          ./io.k8s.apiextensions-apiserver.pkg.apis.apiextensions.v1.ExternalDocumentation.dhall

The reason why properties has Text values is because the current generator is ignoring the additionalProperties field of the properties field's schema:

        "patternProperties": {
          "additionalProperties": {
            "$ref": "#/definitions/io.k8s.apiextensions-apiserver.pkg.apis.apiextensions.v1.JSONSchemaProps"
          },
          "type": "object"
        },

It's actually not clear to me how specifying JSONSchemaProps as additionalProperties differs from just specifying it as the schema for the field, as they do for other fields, such as this one:

        "not": {
          "$ref": "#/definitions/io.k8s.apiextensions-apiserver.pkg.apis.apiextensions.v1.JSONSchemaProps"
        },

Either way, I think this is fixable by changing JSONSchemaProps to model it as a recursive type (similar to Prelude.JSON.Type).
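As a rough illustration of that direction (a sketch only, with simplified field names, not the generator's actual output), a recursive JSONSchemaProps could use the same encoding as Prelude.JSON.Type, abstracting over its own recursive occurrences:

```dhall
-- Sketch: Boehm-Berarducci-style encoding, as in Prelude.JSON.Type.
-- Only a few illustrative fields are shown.
let JSONSchemaProps
    : Type
    = ∀(Schema : Type) →
      ∀ ( make
        : { type : Optional Text
          , properties : List { mapKey : Text, mapValue : Schema }
          , items : Optional Schema
          } →
            Schema
        ) →
        Schema

in  JSONSchemaProps
```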

The reason JSONSchemaPropsOrBool translates to {} is because that's how it's defined in the OpenAPI spec:

    "io.k8s.apiextensions-apiserver.pkg.apis.apiextensions.v1.JSONSchemaPropsOrBool": {
      "description": "JSONSchemaPropsOrBool represents JSONSchemaProps or a boolean value. Defaults to true for the boolean property."
    },

We can add an override to fix it to match the description, but you should also open an issue against the Kubernetes repository asking them to fix the OpenAPI spec for those JSONSchemaPropsOr* resources.
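For what such an override might look like (a hedged sketch, assuming some JSONSchemaProps type is available), the description suggests a simple union:

```dhall
-- Sketch of an override matching the description: either a schema or a
-- boolean.  `JSONSchemaProps` is a parameter here because its final
-- (recursive) form is still undecided.
let JSONSchemaPropsOrBool =
      λ(JSONSchemaProps : Type) → < Props : JSONSchemaProps | Bool : Bool >

in  JSONSchemaPropsOrBool
```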

I also agree that io.k8s.apiextensions-apiserver.pkg.apis.apiextensions.v1.JSON should be modeled as either Text or Prelude.JSON.Type. Personally I think the right thing to do here is Prelude.JSON.Type.

I know you think Prelude.JSON.Type is horrendous to author, but there is a trick (which we can document): you can use json-to-dhall with https://prelude.dhall-lang.org/v12.0.0/JSON/Type to convert arbitrary JSON to that type:

$ json-to-dhall https://prelude.dhall-lang.org/v12.0.0/JSON/Type <<< '{ "x": [ 1, [ { "y": true } ] ] }'
  λ(JSON : Type)
→ λ ( json
    : { array : List JSON → JSON
      , bool : Bool → JSON
      , null : JSON
      , number : Double → JSON
      , object : List { mapKey : Text, mapValue : JSON } → JSON
      , string : Text → JSON
      }
    )
→ json.object
    [ { mapKey = "x"
      , mapValue =
          json.array
            [ json.number 1.0
            , json.array
                [ json.object [ { mapKey = "y", mapValue = json.bool True } ] ]
            ]
      }
    ]

... and we could optionally have it import the Prelude, replacing the λ(json : …) → header with let json = https://prelude.dhall-lang.org/…

@brendanhay
Author

Given the state of those OpenAPI definitions, I maintain that writing (or reusing) a standalone JSON Schema Dhall package that follows the RFC is going to be smoother than attempting to fix the upstream metadata or adding generator overrides - from my own experience, but YMMV.

Regarding json-to-dhall - thanks - I've previously used the approach of manually converting to Prelude.JSON.Type via json-to-dhall, which was later removed in favour of using Text and Dhall's interpolation/templating functionality. When we edit JSON/YAML or similar structured formats we're not editing the AST directly - which is what happens if we embed Prelude.JSON.Type values and then attempt to maintain them. Alternatively, you end up maintaining the original YAML files and converting/embedding them as part of some pre/post-processing pipeline.

One small improvement would be to have json-to-dhall emit toMap and records for objects rather than the normalised [ { mapKey = ..., mapValue = ... }, ... ] form. For your example above it would result in:

    json.object (toMap
        { x = json.array
            [ json.number 1.0
            , json.array [ json.object (toMap { y = json.bool True }) ]
            ]
        })

Not a huge improvement in this small example - but in my scenario(s) I have various YAML/JSON objects that are around 5-10K lines in size, so it's potentially a win there. Although, as stated previously, you're still maintaining a JSON AST rather than a more first-class representation. (This is all moot, though, if a proper recursive JSON Schema type existed.)

If only a JSON/encode builtin for encoding arbitrary records were possible. 🥇
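Purely for illustration - no such builtin exists, which is the point of the wish - usage might look like:

```dhall
-- Hypothetical only: `JSON/encode` is not a real Dhall builtin.
JSON/encode { x = [ 1, 2 ], y = True }
```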

@Gabriella439
Contributor

@brendanhay: Yeah, we have to write a Dhall JSONSchema package anyway, since it would be pretty difficult for the dhall-kubernetes-generator to automatically infer the correct recursive type.

The non-trivial part is converting it to YAML, because that would require either:

  • dhall-to-{json,yaml,yaml-ng} having built-in support for converting JSON schema to JSON/YAML
  • Extending dhall-to-{json,yaml,yaml-ng} to auto-detect and handle recursive types

Currently I'm leaning towards the former, since it is simpler, and it seems sensible for dhall-{json,yaml}{,-ng} to natively support JSON schemas.

I also agree that {json,yaml}-to-dhall should emit toMap. The only reason they don't is because they were initially authored before the toMap keyword existed.

We might someday support building the Dhall ↔ JSON/YAML conversion functionality into the language itself, but right now the purpose of the command-line tools is to better understand what those conversions require, in order to inform the design of the corresponding language feature.

@Gabriella439
Contributor

Just updating on this: apparently JSON Schema is waaaaaay more complicated than I realized, to the point where it's easier to implement dhall-to-{json,yaml,yaml-ng} support for recursive types than to special-case support for JSON Schema.

@WillSewell
Collaborator

@brendanhay would it be possible to share an example of your workaround?
