These don't necessarily meet all your preferences, but there's also:
TreeNet, a commercialization and extension of Jerome Friedman's original gradient boosting implementation. It's not open source, but we've found it to work pretty well.
For R packages, refer to the link above, which covers gradient boosting for regression tasks.
from sklearn.ensemble import AdaBoostClassifier
from sklearn.ensemble import AdaBoostRegressor
from sklearn.tree import DecisionTreeClassifier
import numpy as np

# Base learner for boosting (defaults to a fully grown tree here).
dt = DecisionTreeClassifier()
# Note: in scikit-learn >= 1.2 the base_estimator argument is renamed to estimator.
clf = AdaBoostClassifier(n_estimators=100, base_estimator=dt, learning_rate=1)
# Fit on random toy data: 10 samples, 3 features, binary labels.
clf.fit(X=np.random.rand(10, 3),
        y=np.random.randint(2, size=(10,)))
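AdaBoostRegressor is imported above but never used; as a minimal sketch (random toy data, arbitrary parameter values), the regression version follows the same pattern:

from sklearn.ensemble import AdaBoostRegressor
import numpy as np

# Same API as the classifier, but with continuous targets.
reg = AdaBoostRegressor(n_estimators=100, learning_rate=1)
reg.fit(X=np.random.rand(10, 3),
        y=np.random.rand(10))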
You can tune the parameters to optimize the performance of these algorithms. The key parameters, as used in the snippets here, are n_estimators, learning_rate, and the complexity of the base estimator (e.g. its max_depth); a grid-search sketch follows.
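As a minimal sketch of how that tuning might look with scikit-learn's GridSearchCV (the grid values and toy data here are arbitrary, not recommendations):

from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import GridSearchCV
import numpy as np

X = np.random.rand(100, 3)          # toy data: 100 samples, 3 features
y = np.random.randint(2, size=100)  # binary labels

# Search over the two main AdaBoost knobs; the candidate values are arbitrary.
param_grid = {
    "n_estimators": [50, 100, 200],
    "learning_rate": [0.1, 0.5, 1.0],
}
search = GridSearchCV(AdaBoostClassifier(), param_grid, cv=3)
search.fit(X, y)
print(search.best_params_)          # best combination found by cross-validation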
One can also go through Gradient Boosting for more details:
from sklearn.ensemble import GradientBoostingClassifier  # for classification
from sklearn.ensemble import GradientBoostingRegressor   # for regression
import numpy as np

# max_depth=1 makes each boosting stage a decision stump.
clf = GradientBoostingClassifier(n_estimators=100, learning_rate=1.0, max_depth=1)
# Fit on random toy data: 10 samples, 3 features, binary labels.
clf.fit(X=np.random.rand(10, 3),
        y=np.random.randint(2, size=(10,)))
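Likewise, GradientBoostingRegressor is imported but not exercised above; a minimal regression sketch on the same kind of random toy data (arbitrary parameter values) would be:

from sklearn.ensemble import GradientBoostingRegressor
import numpy as np

reg = GradientBoostingRegressor(n_estimators=100, learning_rate=0.1, max_depth=1)
reg.fit(X=np.random.rand(10, 3),
        y=np.random.rand(10))                 # continuous targets
print(reg.predict(np.random.rand(2, 3)))      # predictions for two new samples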