Klarity Craft (KC)

TODO : for simplicity, this version documents KC, integrating WS and KS, and blueprints in the same place, as the separation between these components still has to be documented

Klarity Craft is a set of libraries that handles metrics in order to build the artefacts of an AI-based component using blueprints.

It works in association with Klarity Dashboard by pushing artefact payloads onto it, enabling the validation of these artefacts and initiating status and version changes (from one stage to another, or within the same stage).

Klarity Craft handles the data model behind artefact construction; KS manages storage and Klarity Workbench handles metric computation.

Klarity data model

All the objects in this diagram manage living data in Klarity. They reference two elements outside the diagram: version (for history and accountability) and configuration (which adapts how these data can be manipulated in a specific context). These elements are managed at different levels, referenced as class members in this diagram. For where a version or a configuration applies, see Configuration or TODO

  • The versions of the aggregative objects:
    • workbench for tools
    • DataTable for data elements
    • AI_Component for operations associated with functions
    • The PCIV version
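As an illustration of versions referenced as class members, a minimal sketch (all field names are assumptions, only the class names come from the list above):

```python
from dataclasses import dataclass

# Hypothetical sketch: each aggregative object carries its version
# as a class member, as described in the data-model section.

@dataclass
class Workbench:
    name: str
    version: str  # version of the tool set

@dataclass
class DataTable:
    name: str
    version: str  # version of the data elements

@dataclass
class AI_Component:
    name: str
    version: str  # version of the operations associated with functions

wb = Workbench(name="demo-workbench", version="1.0.0")
```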

Craft architecture

  • Craft will use git to manage accountability of changes and immutability of data stored in S3
    • a lib to access git from Python
    • a possible transition to Rust
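To illustrate the immutability idea, a sketch (not the actual Craft implementation) of computing the git-style content id of an object before storing it in S3; git identifies a blob by the SHA-1 of the header "blob <size>\0" followed by the content:

```python
import hashlib

def git_blob_id(content: bytes) -> str:
    """Compute the id git would assign to this content as a blob
    (SHA-1 over 'blob <size>\\0' + content). Keeping this id next to
    the S3 object lets us detect any mutation of the stored data."""
    header = b"blob %d\x00" % len(content)
    return hashlib.sha1(header + content).hexdigest()

# Matches `git hash-object` on a file containing "hello\n":
print(git_blob_id(b"hello\n"))  # ce013625030ba8dba906f756967f9e9ca394464a
```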

Craft deployment

  • Craft will always work with a git repository to manage accountability

  • several working modes

    • on a tagged version, from an MLOps runner
    • from a local clone of the project repository, with a sandbox dataspace for local computation
  • local deployment, using the CLI from the git repository of the project

    • a dedicated directory, .safenai/ in the repository root, containing the Craft configuration of the project
      • project_name
      • KD backend_url
  • the git repository will have 3 branches

    • specification
    • development
    • operation
  • on CI, for each version :

    • check which metrics / artefacts can be computed from the currently available elements
    • if a metric is already computed, do nothing
    • if a metric is not yet computed, compute it
    • if an artefact can be generated, generate it
  • the tags put into the repository shall comply with the Klarity version schema

    • add a version compliance check in a git hook
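The per-version CI pass above can be sketched as follows; the metric/artefact model (dicts with `computed` / `inputs_available` flags, an `artefact_ready` predicate) is an assumption for illustration:

```python
# Minimal sketch of the per-version CI pass: skip metrics that are
# already computed, compute the computable ones, and generate the
# artefact when its inputs are ready. Data layout is hypothetical.

def ci_pass(metrics, artefact_ready):
    """Return the list of actions this CI run would take."""
    actions = []
    for name, m in metrics.items():
        if m["computed"]:
            actions.append(f"skip {name}")            # already computed: nothing
        elif m["inputs_available"]:
            actions.append(f"compute {name}")         # computable: compute it
        else:
            actions.append(f"not_computable {name}")  # missing elements
    if artefact_ready(metrics):
        actions.append("generate artefact")
    return actions

metrics = {
    "accuracy":   {"computed": True,  "inputs_available": True},
    "robustness": {"computed": False, "inputs_available": True},
    "coverage":   {"computed": False, "inputs_available": False},
}
ready = lambda ms: all(m["computed"] or m["inputs_available"] for m in ms.values())
print(ci_pass(metrics, ready))
```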

Craft user interaction

  • CLI / web UI / Jupyter

    • Display the list of requested versions from KD (can be filtered by stage) ==> Not in MVP

    • Display the version completion status :

      • for each requested artefact :
        • The associated metrics needed (per blueprint configuration), with the following statuses

          • open : generation not yet requested
          • pending : generation requested but not yet executed
          • sandbox : computed in the sandbox
          • not_computable : elements needed for computation are missing
          • not_uptodate : already computed, but a change has tagged this metric as dirty
          • ready : already computed and available
          • immutable : computed in a previous stage and immutable
        • The artefact status

          • open : generation not yet requested
          • pending : generation requested but not yet executed
          • sandbox : computed in the sandbox
          • not_computable : elements needed for generation are missing
          • not_uptodate : already generated, but a change has tagged this artefact as dirty
          • ready : already generated and available
          • immutable : generated in a previous stage and immutable
    • create the artefacts for a version from the available metrics (or a specific artefact of the version)

      • change the artefact or metric status to pending; if the target DataTable is in the sandbox, then try to compute in the sandbox
    • create a metric output from (Scope Item, configuration, and tool)

      • for each metric, a Python module is available in a blueprint package

        • this module exposes a standard interface with the following methods
          • checkInputs : verifies the conformity of the provided inputs (returns the list of missing scope ids and configuration errors)
          • generateTraceabilityObject : gathers all traceability elements and puts them in a signed object
          • process : generates the metric output
        • to access these modules, the package shall rely on Python plugins
        • propose a generic module that enables metric tool calls from a configuration (JSON, YAML, ...) without specific development
      • such a module can be called standalone inside a Jupyter notebook, or registered in the blueprint configuration.

    • display the artefact in a Jupyter widget

      • disabling the actions dedicated to validation, but with the full real display
    • push a version to the dashboard (local dev or on a server)

      • this action is performed by CI, or by the user if the workbench is not inside Klarity
  • Generate one artefact from existing metrics
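The standard metric-module interface described above (checkInputs / generateTraceabilityObject / process) might look like this; only the three method names come from this document, everything else (required scope ids, the `threshold` configuration key, the SHA-256 digest standing in for a real signature) is an assumption for illustration:

```python
import hashlib
import json

class ExampleMetricModule:
    """Hypothetical metric module following the standard interface."""

    REQUIRED_SCOPE_IDS = {"dataset", "model"}  # assumed requirement

    def checkInputs(self, scope_items: dict, configuration: dict):
        """Verify conformity of the provided inputs; return the list of
        missing scope ids and the list of configuration errors."""
        missing = sorted(self.REQUIRED_SCOPE_IDS - scope_items.keys())
        errors = [] if "threshold" in configuration else ["missing 'threshold'"]
        return missing, errors

    def generateTraceabilityObject(self, scope_items: dict, configuration: dict):
        """Gather traceability elements and 'sign' them (a SHA-256 digest
        stands in for a real signature in this sketch)."""
        payload = json.dumps(
            {"scope": sorted(scope_items), "config": configuration},
            sort_keys=True,
        )
        return {"payload": payload,
                "signature": hashlib.sha256(payload.encode()).hexdigest()}

    def process(self, scope_items: dict, configuration: dict):
        """Generate the metric output (dummy computation)."""
        value = 0.93
        return {"value": value, "passed": value >= configuration["threshold"]}

mod = ExampleMetricModule()
missing, errors = mod.checkInputs({"dataset": "..."}, {"threshold": 0.9})
print(missing, errors)  # ['model'] []
```

A plugin mechanism (entry points, for instance) would then let the blueprint package discover such modules by name, as the plugin bullet above suggests.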

Metric computation

Metric computation configuration and implementation are available in (TODO : blueprint + common, need to identify). TODO : integrate the code to generate metrics in the repository for the demo, to factorize later. TODO : define a basic way to manage datasets in the demo.

Artefact build

Using several metrics and the provided user_content, we build the artefact payload associated with a specific version.

  • This payload can contain several Artefact Items.
  • Between Artefact Items we may have several relations to Artefact Items from other Artefacts, each with a type (satisfy, verify, refine, deriveReqt, copy, trace). ==> This will enable the generation of coverage / traceability matrices, ...
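Deriving a traceability matrix from such typed relations can be sketched as follows; the relation types come from the list above, while the tuple layout and item ids are assumptions for illustration:

```python
# Hypothetical (source item, relation type, target item) triples
# between Artefact Items.
relations = [
    ("REQ-1", "satisfy", "TEST-1"),
    ("REQ-1", "verify",  "TEST-2"),
    ("REQ-2", "refine",  "REQ-1"),
]

def traceability_matrix(relations, relation_type):
    """Map each source Artefact Item to the items it is linked to
    with the given relation type."""
    matrix = {}
    for src, rtype, dst in relations:
        if rtype == relation_type:
            matrix.setdefault(src, []).append(dst)
    return matrix

print(traceability_matrix(relations, "satisfy"))  # {'REQ-1': ['TEST-1']}
```

Coverage then falls out of the same structure: a requirement is covered when it appears as a key for the relation types that count as verification.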
