# Models
Workbench supports multiple model frameworks, all through the same API. Just change the `model_framework` parameter (or `model_class` for scikit-learn) and everything else stays the same: training, deployment, inference, and confidence scoring.
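The idea behind a single swappable parameter can be sketched as a simple dispatch table. The trainer functions and registry below are illustrative stand-ins, not Workbench internals:

```python
# Toy sketch of framework dispatch: one entry point, one swappable parameter.
# Trainer functions and the FRAMEWORKS registry are illustrative, not Workbench code.

def train_xgboost(target, features):
    return f"xgboost model: {target} ~ {len(features)} features"

def train_pytorch(target, features):
    return f"pytorch model: {target} ~ {len(features)} features"

FRAMEWORKS = {
    "xgboost": train_xgboost,
    "pytorch": train_pytorch,
}

def to_model(model_framework, target_column, feature_list):
    """Same call shape regardless of framework; only the dispatch key changes."""
    trainer = FRAMEWORKS[model_framework]
    return trainer(target_column, feature_list)

print(to_model("xgboost", "target", ["f1", "f2"]))
print(to_model("pytorch", "target", ["f1", "f2"]))
```

Because every framework sits behind the same call shape, downstream steps (deployment, inference) never need to know which framework produced the model.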
## Available Frameworks
| Framework | Description | Guide |
|---|---|---|
| XGBoost | Gradient boosted trees on molecular descriptors | XGBoost Models |
| Scikit-Learn | Any scikit-learn estimator (RandomForest, KMeans, etc.) | Scikit-Learn Models |
| PyTorch | Neural network on molecular descriptors | PyTorch Models |
| Fingerprint | Count fingerprint models for molecular similarity | Fingerprint Models |
| ChemProp | Message Passing Neural Network on molecular graphs | ChemProp Models |
| Meta Model | Ensemble aggregating multiple endpoints | Meta Models |
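For the Fingerprint row, similarity between count fingerprints is typically measured with a Tanimoto-style coefficient. Here is a minimal pure-Python sketch using dict-based fingerprints (illustrative data, not the Workbench implementation):

```python
# Count-fingerprint Tanimoto similarity: sum of per-bit minima over sum of maxima.
# The fingerprints below are illustrative dicts {bit_id: count}, not real molecules.

def tanimoto_counts(fp_a, fp_b):
    bits = set(fp_a) | set(fp_b)
    num = sum(min(fp_a.get(b, 0), fp_b.get(b, 0)) for b in bits)
    den = sum(max(fp_a.get(b, 0), fp_b.get(b, 0)) for b in bits)
    return num / den if den else 0.0

fp1 = {101: 2, 205: 1, 317: 3}
fp2 = {101: 1, 317: 3, 422: 2}

print(tanimoto_counts(fp1, fp2))  # min-sum 4 / max-sum 8 = 0.5
```

Count fingerprints (as opposed to binary ones) preserve how many times each substructure occurs, which this min/max formulation takes into account.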
## Quick Example
```python
from workbench.api import FeatureSet, ModelType, ModelFramework

fs = FeatureSet("my_features")

# XGBoost (default framework)
xgb_model = fs.to_model(
    name="my-xgb-model",
    model_type=ModelType.REGRESSOR,
    model_framework=ModelFramework.XGBOOST,
    target_column="target",
    feature_list=fs.feature_columns,
)

# PyTorch (same API, different framework)
pytorch_model = fs.to_model(
    name="my-pytorch-model",
    model_type=ModelType.REGRESSOR,
    model_framework=ModelFramework.PYTORCH,
    target_column="target",
    feature_list=fs.feature_columns,
)

# Deploy Endpoints
xgb_end = xgb_model.to_endpoint()
xgb_end.auto_inference()

pytorch_end = pytorch_model.to_endpoint()
pytorch_end.auto_inference()
```
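The Meta Model framework in the table above aggregates predictions across multiple models. As a toy sketch of one common aggregation strategy (mean over per-model predictions; illustrative names, not Workbench internals):

```python
# Toy meta-model aggregation: average per-sample predictions from several models.

def aggregate_mean(predictions_by_model):
    """predictions_by_model: list of equal-length per-model prediction lists."""
    n_models = len(predictions_by_model)
    return [sum(col) / n_models for col in zip(*predictions_by_model)]

xgb_preds = [1.0, 2.0, 3.0]      # predictions from one model
torch_preds = [3.0, 2.0, 1.0]    # predictions from another

print(aggregate_mean([xgb_preds, torch_preds]))  # [2.0, 2.0, 2.0]
```

Averaging is only one choice; ensembles may also weight models or aggregate per endpoint, depending on configuration.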
## Questions?

The SuperCowPowers team is happy to answer any questions you may have about AWS® and Workbench.
- Support: workbench@supercowpowers.com
- Discord: Join us on Discord
- Website: supercowpowers.com