Checkers

Assessing model fit

It is important to check that your assumptions about the data story, along with your modelling decisions, are good ones, and that there are no errors in the fitting process. Checkers in hibayes achieve this. There are a number of built-in default checkers that you can select from:

insert table

What makes up a checker?

Checkers work on each model independently, so they require a ModelAnalysisState and take an optional display argument. They return the ModelAnalysisState along with a CheckerResult (pass, fail, error or NA).

import random

from hibayes.check import Checker, CheckerResult, checker


@checker(when="after")  # (1)
def my_checker(
    chances: float = 0.42,  # (2)
) -> Checker:
    """
    A checker which fails with probability `chances`.
    """
    def check(state, display):
        rng = random.uniform(0, 1)

        inference_data = state.inference_data  # (3)

        state.add_diagnostic("rng", rng)  # record a diagnostic value for later inspection

        return state, "pass" if rng > chances else "fail"
    return check
1. Does this check run before or after the model has been fitted? The checker decorator registers the function so it can be accessed simply by its string name, but it also enforces an agreed-upon interface.
2. Any arguments you might want the user to pass through the config.
3. You can access anything in the ModelAnalysisState.
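The decorated factory returns the inner check function, so you can also exercise it directly while developing it. The following is a minimal sketch, not the definitive hibayes workflow: `state` here stands for a hypothetical ModelAnalysisState obtained from a fitted model, and in a real run you would typically reference the checker by its registered string name and pass `chances` through the config instead.

# Minimal sketch of calling the checker directly.
# `state` is a hypothetical ModelAnalysisState from a fitted model;
# in a normal run hibayes invokes registered checkers for you.
check = my_checker(chances=0.1)          # build the check function with a custom threshold
state, result = check(state, display=None)
print(result)                            # "pass" roughly 90% of the time, otherwise "fail"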

How to interpret results

If you are using the hibayes display, CheckerResults are displayed live in the Checker Result tab, with an F, a green dot, a yellow E, and a grey dot representing fail, pass, error, and NA respectively.

[Figure: checker result display]

You can also inspect each model's directory in the AnalysisState. Diagnostic plots are saved there, and diagnostic summaries are stored in diagnostics.json.
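Since diagnostics.json is plain JSON, you can load it with the standard library to see what each checker recorded. A minimal sketch, assuming a hypothetical output path; the actual directory layout comes from the AnalysisState for each model.

import json
from pathlib import Path

# Hypothetical model directory inside your analysis output;
# the real path is recorded in the AnalysisState for each model.
model_dir = Path("outputs/my_model")

with open(model_dir / "diagnostics.json") as f:
    diagnostics = json.load(f)

print(diagnostics)  # e.g. the "rng" value stored by my_checker above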