
A targeting object represents a set of rules that are evaluated in order to determine the best-fit variation for a flag. Rules are evaluated in priority order, and the first matching rule dictates the resulting output (i.e. the recommended variation).

Data Models
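
The pseudo-code in the next section assumes each rule carries a condition, an optional identity or segment to match against, and the variation to serve. A minimal sketch of that shape in TypeScript follows; the type and field names (Variation, Condition, Rule, Targeting) are illustrative assumptions inferred from this page, not a fixed schema.

// Hypothetical targeting data model, inferred from the evaluation
// pseudo-code below; names are illustrative, not a fixed schema.

interface Variation {
  key: string;      // e.g. "on", "off", or a multivariate value key
  value: unknown;   // value served to the caller when this variation is chosen
}

interface Condition {
  attribute: string;   // attribute the rule inspects, e.g. "country"
  operator: string;    // comparison operator, e.g. "equals", "in", "contains"
  values: unknown[];   // values the attribute is compared against
}

interface Rule {
  priority: number;      // lower values are evaluated first
  condition: Condition;  // predicate tested for this rule
  identity?: string;     // optional: target a single identity directly
  segment?: string;      // optional: target every member of a segment
  variation: Variation;  // variation served when the rule matches
}

// A targeting object is the flag's rules, kept in priority order,
// plus the fallthrough variation served when no rule matches.
interface Targeting {
  rules: Rule[];
  fallthroughVariation: Variation;
}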

Evaluation Engine

The ruleset evaluation engine is essentially a method that selects the variation for the first rule that is satisfied. It can be defined recursively in four lines. Note that the following is just pseudo-code intended to illustrate the core logic of flag evaluation.

evaluate(ruleset, fallthrough_variation) :: (Ruleset, Variation) -> Variation
  if (ruleset is empty) return fallthrough_variation
  
  const {condition, identity, segment, variation} = ruleset.pop()
  
  if (<condition> matches <identity | segment>) then return variation
    
  return evaluate(ruleset, fallthrough_variation)
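
For reference, the same first-match logic is sketched below as a small runnable TypeScript function over the hypothetical types shown under Data Models. The matches helper is a placeholder assumption; a real engine would evaluate the rule's condition operator against the target's attributes and segment memberships.

// Evaluation target: the entity a flag is being evaluated for.
interface Target {
  key: string;                          // identity key of the target
  attributes: Record<string, unknown>;  // attributes conditions are tested against
  segments: string[];                   // segment keys the target belongs to
}

// Placeholder predicate: checks an explicit identity or segment match first,
// then falls back to a naive attribute/value comparison.
function matches(
  condition: Condition,
  identity: string | undefined,
  segment: string | undefined,
  target: Target
): boolean {
  if (identity !== undefined) return identity === target.key;
  if (segment !== undefined) return target.segments.includes(segment);
  return condition.values.includes(target.attributes[condition.attribute]);
}

// First-match evaluator mirroring the recursive pseudo-code above.
// The ruleset is assumed to be pre-sorted in priority order.
function evaluate(ruleset: Rule[], fallthroughVariation: Variation, target: Target): Variation {
  // Base case: no rules left, serve the fallthrough variation.
  if (ruleset.length === 0) return fallthroughVariation;

  // Take the highest-priority remaining rule.
  const [{ condition, identity, segment, variation }, ...rest] = ruleset;

  // The first matching rule wins.
  if (matches(condition, identity, segment, target)) return variation;

  // Otherwise recurse over the remaining, lower-priority rules.
  return evaluate(rest, fallthroughVariation, target);
}

Destructuring the head of the list keeps the shape of the recursive pseudo-code; a production evaluator would more likely iterate over the sorted rules to avoid deep recursion on large rulesets.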
