OmniXAI: An Explanation Toolbox

Introduction

OmniXAI (short for Omni eXplainable AI) is a Python library for explainable AI (XAI), offering omni-way explainable AI and interpretable machine learning capabilities to address many pain points in explaining decisions made by machine learning models in practice. OmniXAI aims to be a one-stop comprehensive library that makes explainable AI easy for data scientists, ML researchers and practitioners who need explanations for various types of data, models and explanation methods at different stages of the ML process.

OmniXAI includes a rich family of explanation methods integrated in a unified interface. It supports multiple data types (tabular data, images, text, time series), multiple types of ML models (traditional ML in Scikit-learn and deep learning models in PyTorch/TensorFlow), and a range of diverse explanation methods, both "model-specific" and "model-agnostic" (such as feature-attribution explanation, counterfactual explanation, gradient-based explanation, etc.). For practitioners, OmniXAI provides an easy-to-use unified interface to generate explanations for their applications by writing only a few lines of code, as well as a GUI dashboard for visualization, offering more insights about model decisions.

omnixai.data: This package contains the classes for representing tabular, image, text, and time series data. For example, the explainers for tabular data use an instance of the Tabular class as one of their inputs. The library provides simple constructors for creating instances of these classes from numpy arrays, pandas dataframes, etc.

omnixai.preprocessing: This package contains various pre-processing functions for different feature types:
- One-hot encoding and ordinal encoding for categorical features.
- KBins, standard normalization, min-max normalization, rescaling, NaN-filling for continuous-valued features.
- A pre-processing module for tabular data.
- Rescaling, normalization, resizing for image data.
- The TF-IDF transformation and token-to-id transformation for text data.
- A pre-processing pipeline combining multiple pre-processing functions together.

For tabular data, TabularTransform provides a convenient way for feature pre-processing. One can simply use this class to transform the raw data into the training/test dataset that a particular machine learning model can consume.
omnixai.explainers: This is the main package in the library, which contains all the supported explainers. The explainers are categorized into the following groups:
- omnixai.explainers.data: It is for data exploration/analysis, including feature correlation analysis, feature imbalance analysis, feature selection, etc.
- omnixai.explainers.prediction: It computes the performance metrics for classification and regression tasks.
- omnixai.explainers.tabular: It contains the explainers for tabular data, e.g., global explanations such as PDP, local explanations such as LIME, SHAP, MACE.
- omnixai.explainers.vision: It contains the explainers for vision tasks, e.g., integrated-gradient, Grad-CAM, contrastive explanation, counterfactual explanation.
- omnixai.explainers.nlp: It contains the explainers for NLP tasks, e.g., LIME, integrated-gradient.
- omnixai.explainers.timeseries: It contains the explainers for time series tasks, e.g., SHAP, MACE.
- omnixai.explainers.vision_language: It contains the explainers for vision-language tasks, e.g., IG, GradCAM.
- omnixai.explainers.ranking: It contains the explainers for ranking tasks, e.g., ValidityRankingExplainer.

For "tabular", "vision", "nlp" and "timeseries", the explainers are further categorized into "model-agnostic", "model-specific" and "counterfactual". A "model-agnostic" explainer can handle black-box ML models, i.e., it only requires a prediction function without knowing the model's internals. A "model-specific" explainer requires some information about the ML model, e.g., whether the model is differentiable, or whether it is a linear model or a tree-based model.