
[SC 9567] Add text support in the test result object #349

Merged
AnilSorathiya merged 2 commits into main from
anilsorathiya/sc-9567/add-support-for-text-output-in-the-test-result
Apr 10, 2025

Conversation

@AnilSorathiya (Contributor) commented Apr 9, 2025

External Release Notes

We now support text output in test results.
If a test returns a string, it is used as the test result's description. In that case, the automatic generation of the description is skipped and the returned text is used directly.

import matplotlib.pyplot as plt
import pandas as pd
from sklearn import metrics

import validmind as vm


@vm.test("my_custom_tests.MyCustomTest")
def my_custom_test(dataset, model):
    """
    A custom test that plots a confusion matrix and returns a custom description.
    """
    y_true = dataset.y
    y_pred = dataset.y_pred(model)

    confusion_matrix = metrics.confusion_matrix(y_true, y_pred)

    cm_display = metrics.ConfusionMatrixDisplay(
        confusion_matrix=confusion_matrix, display_labels=[False, True]
    )
    cm_display.plot()

    plt.close()  # close the plot to avoid displaying it inline

    # Return a figure, a text description, and a table; the string is used
    # as the test result's description instead of an auto-generated one.
    return (
        cm_display.figure_,
        "Test Description - Confusion Matrix",
        pd.DataFrame({"Value": [1, 2, 3]}),
    )

Run test

from validmind.tests import run_test

result = run_test(
    "my_custom_tests.MyCustomTest",
    inputs={"model": "model", "dataset": "test_dataset"},
)
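
A hedged follow-up: assuming the returned result object exposes a description attribute and a log() method, as the PR summary below suggests (neither is shown on this page), the custom text could then be inspected and logged:

# Attribute and method names are assumed from the PR summary; the real API may differ.
print(result.description)  # expected to contain "Test Description - Confusion Matrix"
result.log()  # logging should now include the description alongside tables and figures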

Output

(Screenshot of the rendered test result output.)

AnilSorathiya added the documentation (Improvements or additions to documentation), enhancement (New feature or request), and highlight (Feature to be curated in the release notes) labels on Apr 9, 2025.
@johnwalz97 (Contributor) left a comment

lgtm

@juanmleng (Contributor) commented:

Looks great. Perhaps worth adding this example to notebooks/code_samples/custom_tests/implement_custom_tests.ipynb

@AnilSorathiya (Contributor, Author) replied:

> Looks great. Perhaps worth adding this example to notebooks/code_samples/custom_tests/implement_custom_tests.ipynb

Thanks. I am testing this through the implement_custom_tests.ipynb notebook and will push the updated notebook.

@github-actions (Contributor) commented:

PR Summary

This pull request introduces enhancements to the custom test framework by allowing users to define custom descriptions for their tests. The changes include:

  1. Notebook Update: A new section titled 'Custom Test: Description' has been added to the implement_custom_tests.ipynb notebook. This section demonstrates how to write a custom test that returns a string description, specifically for a confusion matrix in a binary classification model.

  2. Output Handling: A new StringOutputHandler class has been introduced in validmind/tests/output.py. This handler processes string outputs, converting them to HTML if they are not already in HTML format, and assigns them to the test result's description (see the sketch after this list).

  3. Test Result Description Logic: Modifications in validmind/tests/run.py ensure that a custom description is used if provided. If no custom description is available, the system generates one using existing logic.

  4. Logging Enhancements: The logging logic in validmind/vm_models/result/result.py has been updated to include descriptions in the logging process if they are present, alongside tables and figures.
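
The handler's source is not shown on this page, so the following is only a minimal sketch of what a string-output handler like the one described in item 2 might do, assuming the markdown package is used for the conversion and that the result object exposes a description attribute:

import markdown


class StringOutputHandler:
    """Illustrative sketch only; the actual validmind implementation may differ."""

    def process(self, output: str, result) -> None:
        text = output.strip()
        # Convert Markdown to HTML unless the string already looks like markup.
        if not text.startswith("<"):
            text = markdown.markdown(text)
        # Assign the text as the test result's description, which suppresses
        # the auto-generated description (see item 3).
        result.description = text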

Test Suggestions

  • Test the new custom test description feature by creating a test that returns a string description and verify it appears correctly in the UI (see the sketch after this list).
  • Ensure that the StringOutputHandler correctly processes and converts markdown strings to HTML.
  • Verify that existing tests without custom descriptions still generate descriptions as expected.
  • Check that the logging functionality includes descriptions when they are present.
  • Test the backward compatibility by running existing tests to ensure no breakage occurs due to the new changes.
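
A rough, hedged sketch of the first suggestion, reusing the custom test defined above; the description attribute is an assumption based on the PR summary, not a confirmed API:

from validmind.tests import run_test


def test_custom_description_is_used():
    # Run the custom test defined earlier, which returns a string alongside
    # its figure and table.
    result = run_test(
        "my_custom_tests.MyCustomTest",
        inputs={"model": "model", "dataset": "test_dataset"},
    )
    # The returned string should be used as the result's description
    # (attribute name assumed) instead of an auto-generated one.
    assert "Confusion Matrix" in result.description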

AnilSorathiya merged commit 50aeb99 into main on Apr 10, 2025.
6 checks passed
@cachafla (Contributor) commented:

Late to the party but very nice 😊

johnwalz97 deleted the anilsorathiya/sc-9567/add-support-for-text-output-in-the-test-result branch on August 20, 2025.