merge_ebm produces broken classifiers #576

Open
DerWeh opened this issue Oct 1, 2024 · 0 comments
DerWeh commented Oct 1, 2024

The result of merge_ebms is not a valid classifier, as attributes are missing. The repr raises an AttributeError, which makes debugging horrible.

The following is a minimal reproducible example:

from sklearn.datasets import load_iris
from interpret.glassbox import ExplainableBoostingClassifier, merge_ebms

X, y = load_iris(return_X_y=True)

# Fit two identically configured EBMs and merge them.
clf1 = ExplainableBoostingClassifier(interactions=0, outer_bags=2)
clf1.fit(X, y)
clf2 = ExplainableBoostingClassifier(interactions=0, outer_bags=2)
clf2.fit(X, y)

clf = merge_ebms([clf1, clf2])
repr(clf)  # raises AttributeError

This results in AttributeError: 'ExplainableBoostingClassifier' object has no attribute 'cyclic_progress'.

A hotfix is to simply copy the missing attributes from the first classifier in the list:

# Copy over any hyperparameters that merge_ebms failed to set.
for attr, val in clf1.get_params(deep=False).items():
    if not hasattr(clf, attr):
        setattr(clf, attr, val)

The question is what the desired strategy for fixing this error is. (I haven't delved into merge_ebms to know what's going on.) Another option would be to set the parameters on which all classifiers agree, and to set every other parameter to None; a sketch follows. This avoids immediate AttributeErrors but might cause problems later, in case None is not a valid value for a parameter.
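For illustration, a minimal sketch of that consensus strategy (merge_params is a hypothetical helper, not part of interpret):

def merge_params(ebms, merged):
    """Set params shared by all EBMs on the merged model; None on disagreement."""
    param_sets = [ebm.get_params(deep=False) for ebm in ebms]
    for attr in param_sets[0]:
        values = [params[attr] for params in param_sets]
        # Keep a value only if every input classifier agrees on it.
        agreed = values[0] if all(v == values[0] for v in values[1:]) else None
        setattr(merged, attr, agreed)

merge_params([clf1, clf2], clf)  # clf from the example above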
