
How do I set up human review?

A how-to guide on using Human in the Loop (HITL) in Levity

Human review, also known as Human in the Loop (HITL), is how you use human feedback to optimize your AI block's 'brain'.

Essentially, this works just as you would train an employee: if an unfamiliar file, perhaps even in an unfamiliar language, lands on their desk, they will ask for your feedback. The next time they encounter a similar file, they will likely know what to do without asking.
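Conceptually, HITL is a routing decision: confident predictions are applied automatically, while uncertain ones are handed to a person. Here is a minimal sketch of that idea (illustrative only; the threshold value and function are assumptions, not Levity's actual implementation):

```python
# Minimal sketch of Human-in-the-Loop routing (illustrative only;
# not Levity's actual implementation).

CONFIDENCE_THRESHOLD = 0.80  # assumed value; in Levity this comes from your settings


def route_prediction(label: str, confidence: float) -> str:
    """Auto-apply confident predictions; send uncertain ones to a human."""
    if confidence >= CONFIDENCE_THRESHOLD:
        return f"auto-labeled as '{label}'"
    return "sent to human review"


print(route_prediction("invoice", 0.95))  # confident -> auto-labeled
print(route_prediction("invoice", 0.40))  # uncertain -> human review
```

The human's answers then become new training data, which is how the block "learns" from each review.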


You can set up different rules for human review for different AI blocks. To implement human review in Levity, just follow these steps:

Step 1: Open/create the AI Block

You might already have a block you wish to set up Human Review for. If so, open the AI block and click the Human Review tab at the top. If you still need to create one, check out our guide on making your own AI block.

Step 2: Choose your Human Review option

Here you have three options:

  • No human review: The prediction is assigned the label with the highest confidence score, with no human intervention.

  • Standard human review: Set up basic error minimization by choosing the maximum error rate you want to allow. The model will then estimate how much of your data will have to be reviewed by a human.

  • Advanced settings: Set up custom error settings for each label. These settings let you predetermine, per label, how many false positives and false negatives you want to allow. For more explanation, check out our blog post on the topic.
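To make the difference between the standard and advanced options concrete, here is a hedged sketch of per-label review rules (the data structure, label names, and numbers are illustrative assumptions, not Levity's configuration format):

```python
# Illustrative sketch of per-label review rules (assumed structure;
# not Levity's actual configuration format).

# Advanced settings: each label gets its own confidence threshold,
# so stricter labels send more predictions to human review.
advanced_settings = {
    "invoice": {"min_confidence": 0.90},  # strict: few mistakes tolerated
    "receipt": {"min_confidence": 0.70},  # more tolerant label
}

DEFAULT_MIN_CONFIDENCE = 0.80  # fallback, akin to a single standard setting


def needs_review(label: str, confidence: float) -> bool:
    """Flag a prediction for human review if it falls below its label's threshold."""
    rule = advanced_settings.get(label, {"min_confidence": DEFAULT_MIN_CONFIDENCE})
    return confidence < rule["min_confidence"]
```

With a single shared threshold you get the standard option; varying it per label is the essence of the advanced option.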

Step 3: Select your preferences


For example, click "Standard human review" and set your maximum error rate, keeping in mind the trade-off between model accuracy and manual human labor. The model will learn from your decisions and become more accurate over time, so you may want to come back later and lower the maximum error rate.
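The trade-off can be illustrated with a quick calculation: the stricter the threshold, the larger the share of predictions a human must review. The confidence values below are made-up examples:

```python
# Estimate the share of predictions that would go to human review
# at different confidence thresholds (made-up confidence values).

confidences = [0.99, 0.95, 0.91, 0.85, 0.72, 0.60, 0.55]


def review_fraction(threshold: float) -> float:
    """Fraction of predictions falling below the threshold."""
    flagged = sum(1 for c in confidences if c < threshold)
    return flagged / len(confidences)


print(review_fraction(0.70))  # lenient threshold -> less manual work
print(review_fraction(0.90))  # strict threshold -> more manual work
```

As the model improves, more predictions clear the threshold, so the same error rate costs you less manual review.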

You can now either do the reviewing by yourself or set up a team on Slack to help you out.

Step 4: Add reviewers to your workflow via Slack

  1. Click the Add Reviewers button, which will prompt you to connect your Slack workspace (if not already done) and allow Levity to interact with it. 

  2. Select the labelers from your Slack workspace and click on Done.

  3. Make sure to click Save selection changes on the main Human Review screen to save your settings.

This will send an interactive message to reviewers through Slack, from which they can label edge cases.
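Under the hood, a review request like this is just a structured chat message with one button per label. As a rough illustration, here is how such a message could be expressed in Slack's public Block Kit format (this is an assumption for illustration, not Levity's actual payload):

```python
# Rough sketch of an interactive Slack review message using the public
# Block Kit format (illustrative; not Levity's actual payload).

def build_review_message(item_name: str, labels: list[str]) -> dict:
    """Build a Block Kit payload with one button per candidate label."""
    return {
        "text": f"Please review: {item_name}",
        "blocks": [
            {
                "type": "section",
                "text": {
                    "type": "mrkdwn",
                    "text": f"*Human review needed* for `{item_name}`",
                },
            },
            {
                "type": "actions",
                "elements": [
                    {
                        "type": "button",
                        "text": {"type": "plain_text", "text": label},
                        "value": label,
                    }
                    for label in labels
                ],
            },
        ],
    }


msg = build_review_message("document_42.pdf", ["Invoice", "Receipt"])
```

Clicking a button is what records the reviewer's label for that edge case.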


Step 5: Test and adjust as needed

If you want to test the labeling feature, upload a difficult data point (one that falls below the confidence threshold you selected in your settings) to the "Testing" tab. This will trigger the interactive Slack labeling message.