Train and evaluate regression models¶

Regression is a commonly used kind of machine learning for predicting numeric values. This notebook covers when to use regression models and how to train and evaluate them using Scikit-Learn.

  • toc: true
  • badges: true
  • comments: true
  • categories: [jupyter]
  • image: images/chart-preview.png

Regression is where models predict a number.¶

In machine learning, the goal of regression is to create a model that can predict a numeric, quantifiable value, such as a price, amount, size, or other scalar number. Regression is a statistical technique of fundamental importance to science because of its ease of interpretation, robustness, and speed of calculation. Regression models provide an excellent foundation for understanding how more complex machine learning techniques work. In real-world situations, particularly when little data is available, regression models are very useful for making predictions. For example, if a company that rents bicycles wants to predict the expected number of rentals on a given day in the future, a regression model can predict this number. Such a model could be trained on existing records of daily rentals along with the season, day of the week, and other conditions recorded on each day.
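The idea can be shown in a few lines. This is a minimal sketch using made-up numbers (the temperatures and rental counts are illustrative, not from the dataset used later): fit a line that predicts daily rentals from temperature, then predict for an unseen day.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical training data: temperature (feature) vs. rentals (label)
temps = np.array([[5.0], [10.0], [15.0], [20.0], [25.0]])
rentals = np.array([120, 240, 370, 490, 610])

# Fit a linear model and predict rentals for an unseen temperature
model = LinearRegression().fit(temps, rentals)
predicted = model.predict([[18.0]])
print(round(predicted[0]))
```

Because the relationship here is nearly linear, the model interpolates a plausible rental count for a temperature it never saw during training.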

Regression can fit many kinds of relationships, including those with multiple factors, and those where the importance of one factor depends on another.

Experimenting with models¶

  • Regression models are often chosen because they work with small data samples, are robust, are easy to interpret, and come in many varieties. Linear regression is the simplest form of regression, and places no limit on the number of features used. Its variants are often named for the number of features used and the shape of the curve that is fitted.
  • Decision trees take a step-by-step approach to predicting a variable. In our bicycle example, a decision tree might first split examples between Spring/Summer and Autumn/Winter, then make a prediction based on the day of the week: Spring/Summer Mondays might have a rental rate of 100 per day, while Autumn/Winter Mondays might have a rate of only 20 per day.
  • Ensemble algorithms construct not just one decision tree but many trees, allowing better predictions on more complex data. Ensemble algorithms such as Random Forest are widely used in machine learning and science because of their strong predictive ability. Data scientists often experiment with different models. In the following exercise, we'll experiment with different types of models to compare how they perform on the same data.
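The step-by-step decision-tree logic from the bicycle example can be written out by hand. This is a hand-coded sketch, not a learned tree; the season/weekday splits and rental rates are the illustrative figures from the text:

```python
# Hand-written decision "tree" mirroring the bicycle example above:
# split on season first, then refine the prediction by weekday.
def predict_rentals(season: str, weekday: str) -> int:
    if season in ("spring", "summer"):
        # Warmer half of the year: higher demand overall
        return 100 if weekday == "monday" else 150
    else:
        # Autumn/winter: lower demand overall
        return 20 if weekday == "monday" else 40

print(predict_rentals("summer", "monday"))   # Spring/Summer Monday
print(predict_rentals("winter", "monday"))   # Autumn/Winter Monday
```

A real decision-tree learner finds these split points and leaf values automatically from the training data, as we'll see later in this notebook.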
In [2]:
# Import modules we'll need for this notebook
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, r2_score
from sklearn.model_selection import train_test_split
import numpy as np
import matplotlib.pyplot as plt
%matplotlib inline

# load the training dataset
bike_data_path = "https://raw.githubusercontent.com/MicrosoftDocs/mslearn-introduction-to-machine-learning/main/Data/ml-basics/daily-bike-share.csv"
bike_data = pd.read_csv(bike_data_path)
bike_data['day'] = pd.DatetimeIndex(bike_data['dteday']).day
numeric_features = ['temp', 'atemp', 'hum', 'windspeed']
categorical_features = ['season','mnth','holiday','weekday','workingday','weathersit', 'day']
print(bike_data[numeric_features + ['rentals']].describe())
print(bike_data.head())


# Separate features and labels. The result is numpy arrays:
# X contains the features, y contains the labels.
X, y = bike_data[['season','mnth', 'holiday','weekday','workingday','weathersit','temp', 'atemp', 'hum', 'windspeed']].values, bike_data['rentals'].values

# Split data 70%-30% into training set and test set
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.30, random_state=0)

print ('Training Set: %d rows\nTest Set: %d rows' % (X_train.shape[0], X_test.shape[0]))
   instant    dteday  season  yr  mnth  holiday  weekday  workingday  \
0        1  1/1/2011       1   0     1        0        6           0   
1        2  1/2/2011       1   0     1        0        0           0   
2        3  1/3/2011       1   0     1        0        1           1   
3        4  1/4/2011       1   0     1        0        2           1   
4        5  1/5/2011       1   0     1        0        3           1   

   weathersit      temp     atemp       hum  windspeed  rentals  day  
0           2  0.344167  0.363625  0.805833   0.160446      331    1  
1           2  0.363478  0.353739  0.696087   0.248539      131    2  
2           1  0.196364  0.189405  0.437273   0.248309      120    3  
3           1  0.200000  0.212122  0.590435   0.160296      108    4  
4           1  0.226957  0.229270  0.436957   0.186900       82    5  
Training Set: 511 rows
Test Set: 220 rows

Experiment with Algorithms¶

The linear regression algorithm we used last time to train the model has some predictive capability, but there are many kinds of regression algorithms we could try, including:

  • Linear algorithms: Not just the Linear Regression algorithm we used above (which is technically an Ordinary Least Squares algorithm), but other variants such as Lasso and Ridge.
  • Tree-based algorithms: Algorithms that build a decision tree to reach a prediction.
  • Ensemble algorithms: Algorithms that combine the outputs of multiple base algorithms to improve generalizability.

Note: For a full list of Scikit-Learn estimators that encapsulate algorithms for supervised machine learning, see the Scikit-Learn documentation. There are many algorithms to choose from, but for most real-world scenarios, the Scikit-Learn estimator cheat sheet can help you find a suitable starting point.
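The linear variants mentioned above all share Scikit-Learn's fit/predict interface, so trying another estimator is a one-line change. A minimal sketch on synthetic data (the data, alpha values, and coefficients here are illustrative, not from the bike dataset):

```python
import numpy as np
from sklearn.linear_model import Ridge, Lasso
from sklearn.metrics import r2_score
from sklearn.model_selection import train_test_split

# Synthetic regression data: 5 features, only 3 of which matter
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
y = X @ np.array([3.0, -2.0, 0.0, 0.0, 1.5]) + rng.normal(scale=0.5, size=200)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Swapping the estimator is the only change between models
scores = {}
for estimator in (Ridge(alpha=1.0), Lasso(alpha=0.1)):
    model = estimator.fit(X_train, y_train)
    scores[type(model).__name__] = r2_score(y_test, model.predict(X_test))
    print(type(model).__name__, round(scores[type(model).__name__], 3))
```

Both regularized variants shrink coefficients toward zero; Lasso can drive some exactly to zero, which acts as a form of feature selection.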

Try Another Linear Algorithm¶

Let's try training our regression model by using a Lasso algorithm. We can do this by just changing the estimator in the training code.

In [3]:
from sklearn.linear_model import Lasso

# Fit a lasso model on the training set
model = Lasso().fit(X_train, y_train)
print (model, "\n")

# Evaluate the model using the test data
predictions = model.predict(X_test)
mse = mean_squared_error(y_test, predictions)
print("MSE:", mse)
rmse = np.sqrt(mse)
print("RMSE:", rmse)
r2 = r2_score(y_test, predictions)
print("R2:", r2)

# Plot predicted vs actual
plt.scatter(y_test, predictions)
plt.xlabel('Actual Labels')
plt.ylabel('Predicted Labels')
plt.title('Daily Bike Share Predictions')
# overlay the regression line
z = np.polyfit(y_test, predictions, 1)
p = np.poly1d(z)
plt.plot(y_test,p(y_test), color='magenta')
plt.show()
Lasso() 

MSE: 201155.70593338404
RMSE: 448.5038527519959
R2: 0.6056468637824488

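The MSE, RMSE, and R² values reported above can be computed directly from their definitions. A small check on illustrative numbers (not the model's actual predictions) makes explicit what each metric measures:

```python
import numpy as np

# Illustrative actual and predicted label values
actual = np.array([100.0, 200.0, 300.0])
predicted = np.array([110.0, 190.0, 330.0])

mse = np.mean((actual - predicted) ** 2)         # average squared error
rmse = np.sqrt(mse)                              # same units as the label
ss_res = np.sum((actual - predicted) ** 2)       # residual sum of squares
ss_tot = np.sum((actual - actual.mean()) ** 2)   # total sum of squares
r2 = 1 - ss_res / ss_tot                         # fraction of variance explained

print(round(mse, 2), round(rmse, 2), round(r2, 3))
```

RMSE is often the easiest to interpret because it is in the same units as the label (rentals per day), while R² close to 1 indicates the model explains most of the variance in the data.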
Try a Decision Tree Algorithm¶

As an alternative to a linear model, there's a category of machine learning algorithms that takes a tree-based approach: the features in the dataset are examined in a series of evaluations, each of which produces a branch in a decision tree based on a feature value. At the end of each series of branches are leaf nodes holding the predicted label value.

It's easiest to see how this works with an example. Let's train a Decision Tree regression model using the bike rental data. After training the model, the code below will print the model definition and a text representation of the tree it uses to predict label values.

In [4]:
from sklearn.tree import DecisionTreeRegressor
from sklearn.tree import export_text

# Train the model
model = DecisionTreeRegressor().fit(X_train, y_train)
print (model, "\n")

# Visualize the model tree
tree = export_text(model)
print(tree)
DecisionTreeRegressor() 

|--- feature_6 <= 0.45
|   |--- feature_4 <= 0.50
|   |   |--- feature_7 <= 0.32
|   |   |   |--- feature_8 <= 0.41
|   |   |   |   |--- feature_1 <= 2.50
|   |   |   |   |   |--- feature_6 <= 0.29
|   |   |   |   |   |   |--- feature_8 <= 0.36
|   |   |   |   |   |   |   |--- value: [558.00]
|   |   |   |   |   |   |--- feature_8 >  0.36
|   |   |   |   |   |   |   |--- value: [515.00]
|   |   |   |   |   |--- feature_6 >  0.29
|   |   |   |   |   |   |--- value: [317.00]
|   |   |   |   |--- feature_1 >  2.50
|   |   |   |   |   |--- feature_8 <= 0.40
|   |   |   |   |   |   |--- feature_9 <= 0.22
|   |   |   |   |   |   |   |--- value: [981.00]
|   |   |   |   |   |   |--- feature_9 >  0.22
|   |   |   |   |   |   |   |--- value: [968.00]
|   |   |   |   |   |--- feature_8 >  0.40
|   |   |   |   |   |   |--- feature_3 <= 3.00
|   |   |   |   |   |   |   |--- value: [710.00]
|   |   |   |   |   |   |--- feature_3 >  3.00
|   |   |   |   |   |   |   |--- value: [532.00]
|   |   |   |--- feature_8 >  0.41
|   |   |   |   |--- feature_7 <= 0.25
|   |   |   |   |   |--- feature_6 <= 0.18
|   |   |   |   |   |   |--- feature_8 <= 0.43
|   |   |   |   |   |   |   |--- value: [284.00]
|   |   |   |   |   |   |--- feature_8 >  0.43
|   |   |   |   |   |   |   |--- feature_7 <= 0.10
|   |   |   |   |   |   |   |   |--- value: [150.00]
|   |   |   |   |   |   |   |--- feature_7 >  0.10
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.21
|   |   |   |   |   |   |   |   |   |--- value: [117.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.21
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.50
|   |   |   |   |   |   |   |   |   |   |--- value: [73.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.50
|   |   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.17
|   |   |   |   |   |   |   |   |   |   |   |--- value: [68.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_7 >  0.17
|   |   |   |   |   |   |   |   |   |   |   |--- value: [67.00]
|   |   |   |   |   |--- feature_6 >  0.18
|   |   |   |   |   |   |--- feature_9 <= 0.17
|   |   |   |   |   |   |   |--- feature_3 <= 3.00
|   |   |   |   |   |   |   |   |--- value: [140.00]
|   |   |   |   |   |   |   |--- feature_3 >  3.00
|   |   |   |   |   |   |   |   |--- value: [123.00]
|   |   |   |   |   |   |--- feature_9 >  0.17
|   |   |   |   |   |   |   |--- feature_9 <= 0.19
|   |   |   |   |   |   |   |   |--- value: [333.00]
|   |   |   |   |   |   |   |--- feature_9 >  0.19
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.53
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.21
|   |   |   |   |   |   |   |   |   |   |--- value: [251.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.21
|   |   |   |   |   |   |   |   |   |   |--- feature_3 <= 3.50
|   |   |   |   |   |   |   |   |   |   |   |--- value: [217.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_3 >  3.50
|   |   |   |   |   |   |   |   |   |   |   |--- value: [205.00]
|   |   |   |   |   |   |   |   |--- feature_8 >  0.53
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.24
|   |   |   |   |   |   |   |   |   |   |--- value: [288.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.24
|   |   |   |   |   |   |   |   |   |   |--- value: [275.00]
|   |   |   |   |--- feature_7 >  0.25
|   |   |   |   |   |--- feature_9 <= 0.11
|   |   |   |   |   |   |--- value: [706.00]
|   |   |   |   |   |--- feature_9 >  0.11
|   |   |   |   |   |   |--- feature_8 <= 0.54
|   |   |   |   |   |   |   |--- feature_5 <= 1.50
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.26
|   |   |   |   |   |   |   |   |   |--- value: [309.00]
|   |   |   |   |   |   |   |   |--- feature_7 >  0.26
|   |   |   |   |   |   |   |   |   |--- feature_0 <= 2.50
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.26
|   |   |   |   |   |   |   |   |   |   |   |--- value: [408.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.26
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_0 >  2.50
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.48
|   |   |   |   |   |   |   |   |   |   |   |--- value: [440.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.48
|   |   |   |   |   |   |   |   |   |   |   |--- value: [502.00]
|   |   |   |   |   |   |   |--- feature_5 >  1.50
|   |   |   |   |   |   |   |   |--- value: [618.00]
|   |   |   |   |   |   |--- feature_8 >  0.54
|   |   |   |   |   |   |   |--- feature_6 <= 0.29
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.28
|   |   |   |   |   |   |   |   |   |--- value: [318.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.28
|   |   |   |   |   |   |   |   |   |--- value: [354.00]
|   |   |   |   |   |   |   |--- feature_6 >  0.29
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.29
|   |   |   |   |   |   |   |   |   |--- value: [195.00]
|   |   |   |   |   |   |   |   |--- feature_7 >  0.29
|   |   |   |   |   |   |   |   |   |--- value: [155.00]
|   |   |--- feature_7 >  0.32
|   |   |   |--- feature_9 <= 0.25
|   |   |   |   |--- feature_6 <= 0.37
|   |   |   |   |   |--- feature_7 <= 0.36
|   |   |   |   |   |   |--- feature_6 <= 0.36
|   |   |   |   |   |   |   |--- feature_1 <= 10.50
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.33
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.21
|   |   |   |   |   |   |   |   |   |   |--- feature_1 <= 3.50
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1047.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_1 >  3.50
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.21
|   |   |   |   |   |   |   |   |   |   |--- value: [724.00]
|   |   |   |   |   |   |   |   |--- feature_7 >  0.33
|   |   |   |   |   |   |   |   |   |--- feature_3 <= 3.00
|   |   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.35
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_7 >  0.35
|   |   |   |   |   |   |   |   |   |   |   |--- value: [694.00]
|   |   |   |   |   |   |   |   |   |--- feature_3 >  3.00
|   |   |   |   |   |   |   |   |   |   |--- value: [879.00]
|   |   |   |   |   |   |   |--- feature_1 >  10.50
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.57
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.22
|   |   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.33
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1156.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_7 >  0.33
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.22
|   |   |   |   |   |   |   |   |   |   |--- value: [943.00]
|   |   |   |   |   |   |   |   |--- feature_8 >  0.57
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.10
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.66
|   |   |   |   |   |   |   |   |   |   |   |--- value: [955.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.66
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.10
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.33
|   |   |   |   |   |   |   |   |   |   |   |--- value: [767.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.33
|   |   |   |   |   |   |   |   |   |   |   |--- value: [922.00]
|   |   |   |   |   |   |--- feature_6 >  0.36
|   |   |   |   |   |   |   |--- value: [1658.00]
|   |   |   |   |   |--- feature_7 >  0.36
|   |   |   |   |   |   |--- feature_6 <= 0.35
|   |   |   |   |   |   |   |--- value: [331.00]
|   |   |   |   |   |   |--- feature_6 >  0.35
|   |   |   |   |   |   |   |--- feature_3 <= 2.00
|   |   |   |   |   |   |   |   |--- value: [538.00]
|   |   |   |   |   |   |   |--- feature_3 >  2.00
|   |   |   |   |   |   |   |   |--- value: [560.00]
|   |   |   |   |--- feature_6 >  0.37
|   |   |   |   |   |--- feature_9 <= 0.24
|   |   |   |   |   |   |--- feature_9 <= 0.15
|   |   |   |   |   |   |   |--- feature_8 <= 0.67
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.41
|   |   |   |   |   |   |   |   |   |--- value: [2252.00]
|   |   |   |   |   |   |   |   |--- feature_7 >  0.41
|   |   |   |   |   |   |   |   |   |--- value: [2290.00]
|   |   |   |   |   |   |   |--- feature_8 >  0.67
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.40
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.08
|   |   |   |   |   |   |   |   |   |   |--- value: [1249.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.08
|   |   |   |   |   |   |   |   |   |   |--- value: [1153.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.40
|   |   |   |   |   |   |   |   |   |--- value: [1619.00]
|   |   |   |   |   |   |--- feature_9 >  0.15
|   |   |   |   |   |   |   |--- feature_6 <= 0.38
|   |   |   |   |   |   |   |   |--- value: [1651.00]
|   |   |   |   |   |   |   |--- feature_6 >  0.38
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.20
|   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.42
|   |   |   |   |   |   |   |   |   |   |--- feature_1 <= 2.00
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1070.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_1 >  2.00
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_6 >  0.42
|   |   |   |   |   |   |   |   |   |   |--- value: [1188.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.20
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.22
|   |   |   |   |   |   |   |   |   |   |--- value: [665.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.22
|   |   |   |   |   |   |   |   |   |   |--- value: [642.00]
|   |   |   |   |   |--- feature_9 >  0.24
|   |   |   |   |   |   |--- value: [2301.00]
|   |   |   |--- feature_9 >  0.25
|   |   |   |   |--- feature_8 <= 0.81
|   |   |   |   |   |--- feature_1 <= 2.50
|   |   |   |   |   |   |--- value: [397.00]
|   |   |   |   |   |--- feature_1 >  2.50
|   |   |   |   |   |   |--- feature_5 <= 1.50
|   |   |   |   |   |   |   |--- value: [982.00]
|   |   |   |   |   |   |--- feature_5 >  1.50
|   |   |   |   |   |   |   |--- feature_8 <= 0.77
|   |   |   |   |   |   |   |   |--- value: [480.00]
|   |   |   |   |   |   |   |--- feature_8 >  0.77
|   |   |   |   |   |   |   |   |--- value: [640.00]
|   |   |   |   |--- feature_8 >  0.81
|   |   |   |   |   |--- feature_7 <= 0.41
|   |   |   |   |   |   |--- value: [120.00]
|   |   |   |   |   |--- feature_7 >  0.41
|   |   |   |   |   |   |--- value: [121.00]
|   |--- feature_4 >  0.50
|   |   |--- feature_6 <= 0.34
|   |   |   |--- feature_1 <= 2.50
|   |   |   |   |--- feature_7 <= 0.29
|   |   |   |   |   |--- feature_7 <= 0.19
|   |   |   |   |   |   |--- feature_7 <= 0.14
|   |   |   |   |   |   |   |--- feature_3 <= 2.50
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.12
|   |   |   |   |   |   |   |   |   |--- value: [86.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.12
|   |   |   |   |   |   |   |   |   |--- value: [89.00]
|   |   |   |   |   |   |   |--- feature_3 >  2.50
|   |   |   |   |   |   |   |   |--- value: [95.00]
|   |   |   |   |   |   |--- feature_7 >  0.14
|   |   |   |   |   |   |   |--- feature_8 <= 0.46
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.18
|   |   |   |   |   |   |   |   |   |--- value: [75.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.18
|   |   |   |   |   |   |   |   |   |--- value: [61.00]
|   |   |   |   |   |   |   |--- feature_8 >  0.46
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.30
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.49
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.26
|   |   |   |   |   |   |   |   |   |   |   |--- value: [41.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.26
|   |   |   |   |   |   |   |   |   |   |   |--- value: [38.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.49
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.17
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.17
|   |   |   |   |   |   |   |   |   |   |   |--- value: [42.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.30
|   |   |   |   |   |   |   |   |   |--- value: [25.00]
|   |   |   |   |   |--- feature_7 >  0.19
|   |   |   |   |   |   |--- feature_3 <= 4.50
|   |   |   |   |   |   |   |--- feature_9 <= 0.26
|   |   |   |   |   |   |   |   |--- feature_3 <= 2.50
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.15
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.66
|   |   |   |   |   |   |   |   |   |   |   |--- value: [186.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.66
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.15
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.19
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.19
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |--- feature_3 >  2.50
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.51
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.22
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.22
|   |   |   |   |   |   |   |   |   |   |   |--- value: [82.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.51
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.20
|   |   |   |   |   |   |   |   |   |   |   |--- value: [15.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.20
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 4
|   |   |   |   |   |   |   |--- feature_9 >  0.26
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.82
|   |   |   |   |   |   |   |   |   |--- feature_3 <= 2.50
|   |   |   |   |   |   |   |   |   |   |--- value: [64.00]
|   |   |   |   |   |   |   |   |   |--- feature_3 >  2.50
|   |   |   |   |   |   |   |   |   |   |--- value: [72.00]
|   |   |   |   |   |   |   |   |--- feature_8 >  0.82
|   |   |   |   |   |   |   |   |   |--- value: [34.00]
|   |   |   |   |   |   |--- feature_3 >  4.50
|   |   |   |   |   |   |   |--- feature_8 <= 0.47
|   |   |   |   |   |   |   |   |--- value: [115.00]
|   |   |   |   |   |   |   |--- feature_8 >  0.47
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.23
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.14
|   |   |   |   |   |   |   |   |   |   |--- value: [149.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.14
|   |   |   |   |   |   |   |   |   |   |--- value: [148.00]
|   |   |   |   |   |   |   |   |--- feature_7 >  0.23
|   |   |   |   |   |   |   |   |   |--- value: [174.00]
|   |   |   |   |--- feature_7 >  0.29
|   |   |   |   |   |--- feature_3 <= 4.50
|   |   |   |   |   |   |--- feature_5 <= 1.50
|   |   |   |   |   |   |   |--- feature_6 <= 0.31
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.17
|   |   |   |   |   |   |   |   |   |--- value: [206.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.17
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.62
|   |   |   |   |   |   |   |   |   |   |--- value: [163.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.62
|   |   |   |   |   |   |   |   |   |   |--- value: [173.00]
|   |   |   |   |   |   |   |--- feature_6 >  0.31
|   |   |   |   |   |   |   |   |--- value: [218.00]
|   |   |   |   |   |   |--- feature_5 >  1.50
|   |   |   |   |   |   |   |--- feature_8 <= 0.63
|   |   |   |   |   |   |   |   |--- value: [135.00]
|   |   |   |   |   |   |   |--- feature_8 >  0.63
|   |   |   |   |   |   |   |   |--- value: [74.00]
|   |   |   |   |   |--- feature_3 >  4.50
|   |   |   |   |   |   |--- feature_7 <= 0.30
|   |   |   |   |   |   |   |--- value: [227.00]
|   |   |   |   |   |   |--- feature_7 >  0.30
|   |   |   |   |   |   |   |--- feature_7 <= 0.32
|   |   |   |   |   |   |   |   |--- value: [310.00]
|   |   |   |   |   |   |   |--- feature_7 >  0.32
|   |   |   |   |   |   |   |   |--- value: [307.00]
|   |   |   |--- feature_1 >  2.50
|   |   |   |   |--- feature_9 <= 0.20
|   |   |   |   |   |--- feature_9 <= 0.12
|   |   |   |   |   |   |--- feature_9 <= 0.06
|   |   |   |   |   |   |   |--- feature_7 <= 0.33
|   |   |   |   |   |   |   |   |--- value: [362.00]
|   |   |   |   |   |   |   |--- feature_7 >  0.33
|   |   |   |   |   |   |   |   |--- value: [337.00]
|   |   |   |   |   |   |--- feature_9 >  0.06
|   |   |   |   |   |   |   |--- feature_9 <= 0.08
|   |   |   |   |   |   |   |   |--- feature_5 <= 1.50
|   |   |   |   |   |   |   |   |   |--- value: [143.00]
|   |   |   |   |   |   |   |   |--- feature_5 >  1.50
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.30
|   |   |   |   |   |   |   |   |   |   |--- value: [174.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.30
|   |   |   |   |   |   |   |   |   |   |--- value: [178.00]
|   |   |   |   |   |   |   |--- feature_9 >  0.08
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.31
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.56
|   |   |   |   |   |   |   |   |   |   |--- value: [243.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.56
|   |   |   |   |   |   |   |   |   |   |--- value: [254.00]
|   |   |   |   |   |   |   |   |--- feature_7 >  0.31
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.66
|   |   |   |   |   |   |   |   |   |   |--- value: [268.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.66
|   |   |   |   |   |   |   |   |   |   |--- value: [261.00]
|   |   |   |   |   |--- feature_9 >  0.12
|   |   |   |   |   |   |--- feature_8 <= 0.64
|   |   |   |   |   |   |   |--- feature_8 <= 0.45
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.29
|   |   |   |   |   |   |   |   |   |--- value: [245.00]
|   |   |   |   |   |   |   |   |--- feature_7 >  0.29
|   |   |   |   |   |   |   |   |   |--- value: [316.00]
|   |   |   |   |   |   |   |--- feature_8 >  0.45
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.63
|   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.30
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.17
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.17
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_6 >  0.30
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.52
|   |   |   |   |   |   |   |   |   |   |   |--- value: [359.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.52
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |--- feature_8 >  0.63
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.64
|   |   |   |   |   |   |   |   |   |   |--- value: [491.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.64
|   |   |   |   |   |   |   |   |   |   |--- value: [429.00]
|   |   |   |   |   |   |--- feature_8 >  0.64
|   |   |   |   |   |   |   |--- feature_8 <= 0.65
|   |   |   |   |   |   |   |   |--- value: [168.00]
|   |   |   |   |   |   |   |--- feature_8 >  0.65
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.72
|   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.32
|   |   |   |   |   |   |   |   |   |   |--- value: [289.00]
|   |   |   |   |   |   |   |   |   |--- feature_6 >  0.32
|   |   |   |   |   |   |   |   |   |   |--- value: [314.00]
|   |   |   |   |   |   |   |   |--- feature_8 >  0.72
|   |   |   |   |   |   |   |   |   |--- value: [349.00]
|   |   |   |   |--- feature_9 >  0.20
|   |   |   |   |   |--- feature_8 <= 0.78
|   |   |   |   |   |   |--- feature_9 <= 0.23
|   |   |   |   |   |   |   |--- feature_9 <= 0.21
|   |   |   |   |   |   |   |   |--- feature_1 <= 7.00
|   |   |   |   |   |   |   |   |   |--- feature_3 <= 1.50
|   |   |   |   |   |   |   |   |   |   |--- value: [222.00]
|   |   |   |   |   |   |   |   |   |--- feature_3 >  1.50
|   |   |   |   |   |   |   |   |   |   |--- value: [221.00]
|   |   |   |   |   |   |   |   |--- feature_1 >  7.00
|   |   |   |   |   |   |   |   |   |--- value: [198.00]
|   |   |   |   |   |   |   |--- feature_9 >  0.21
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.28
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.23
|   |   |   |   |   |   |   |   |   |   |--- value: [123.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.23
|   |   |   |   |   |   |   |   |   |   |--- value: [137.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.28
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.30
|   |   |   |   |   |   |   |   |   |   |--- value: [191.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.30
|   |   |   |   |   |   |   |   |   |   |--- value: [177.00]
|   |   |   |   |   |   |--- feature_9 >  0.23
|   |   |   |   |   |   |   |--- feature_3 <= 4.50
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.56
|   |   |   |   |   |   |   |   |   |--- feature_0 <= 1.50
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.26
|   |   |   |   |   |   |   |   |   |   |   |--- value: [203.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.26
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |--- feature_0 >  1.50
|   |   |   |   |   |   |   |   |   |   |--- feature_0 <= 3.00
|   |   |   |   |   |   |   |   |   |   |   |--- value: [317.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_0 >  3.00
|   |   |   |   |   |   |   |   |   |   |   |--- value: [326.00]
|   |   |   |   |   |   |   |   |--- feature_8 >  0.56
|   |   |   |   |   |   |   |   |   |--- feature_0 <= 3.00
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.29
|   |   |   |   |   |   |   |   |   |   |   |--- value: [247.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.29
|   |   |   |   |   |   |   |   |   |   |   |--- value: [195.00]
|   |   |   |   |   |   |   |   |   |--- feature_0 >  3.00
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.58
|   |   |   |   |   |   |   |   |   |   |   |--- value: [139.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.58
|   |   |   |   |   |   |   |   |   |   |   |--- value: [150.00]
|   |   |   |   |   |   |   |--- feature_3 >  4.50
|   |   |   |   |   |   |   |   |--- feature_1 <= 7.00
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.29
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.24
|   |   |   |   |   |   |   |   |   |   |   |--- value: [300.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.24
|   |   |   |   |   |   |   |   |   |   |   |--- value: [307.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.29
|   |   |   |   |   |   |   |   |   |   |--- value: [247.00]
|   |   |   |   |   |   |   |   |--- feature_1 >  7.00
|   |   |   |   |   |   |   |   |   |--- value: [456.00]
|   |   |   |   |   |--- feature_8 >  0.78
|   |   |   |   |   |   |--- feature_0 <= 1.50
|   |   |   |   |   |   |   |--- value: [9.00]
|   |   |   |   |   |   |--- feature_0 >  1.50
|   |   |   |   |   |   |   |--- feature_3 <= 3.00
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.30
|   |   |   |   |   |   |   |   |   |--- value: [123.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.30
|   |   |   |   |   |   |   |   |   |--- value: [87.00]
|   |   |   |   |   |   |   |--- feature_3 >  3.00
|   |   |   |   |   |   |   |   |--- feature_5 <= 2.50
|   |   |   |   |   |   |   |   |   |--- value: [166.00]
|   |   |   |   |   |   |   |   |--- feature_5 >  2.50
|   |   |   |   |   |   |   |   |   |--- value: [179.00]
|   |   |--- feature_6 >  0.34
|   |   |   |--- feature_3 <= 4.50
|   |   |   |   |--- feature_8 <= 0.48
|   |   |   |   |   |--- feature_6 <= 0.42
|   |   |   |   |   |   |--- feature_8 <= 0.44
|   |   |   |   |   |   |   |--- feature_3 <= 1.50
|   |   |   |   |   |   |   |   |--- value: [208.00]
|   |   |   |   |   |   |   |--- feature_3 >  1.50
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.23
|   |   |   |   |   |   |   |   |   |--- value: [229.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.23
|   |   |   |   |   |   |   |   |   |--- feature_3 <= 3.00
|   |   |   |   |   |   |   |   |   |   |--- value: [324.00]
|   |   |   |   |   |   |   |   |   |--- feature_3 >  3.00
|   |   |   |   |   |   |   |   |   |   |--- value: [340.00]
|   |   |   |   |   |   |--- feature_8 >  0.44
|   |   |   |   |   |   |   |--- feature_3 <= 3.50
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.47
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.46
|   |   |   |   |   |   |   |   |   |   |--- value: [518.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.46
|   |   |   |   |   |   |   |   |   |   |--- value: [482.00]
|   |   |   |   |   |   |   |   |--- feature_8 >  0.47
|   |   |   |   |   |   |   |   |   |--- value: [413.00]
|   |   |   |   |   |   |   |--- feature_3 >  3.50
|   |   |   |   |   |   |   |   |--- value: [663.00]
|   |   |   |   |   |--- feature_6 >  0.42
|   |   |   |   |   |   |--- feature_8 <= 0.40
|   |   |   |   |   |   |   |--- value: [1192.00]
|   |   |   |   |   |   |--- feature_8 >  0.40
|   |   |   |   |   |   |   |--- feature_9 <= 0.32
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.44
|   |   |   |   |   |   |   |   |   |--- value: [834.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.44
|   |   |   |   |   |   |   |   |   |--- value: [819.00]
|   |   |   |   |   |   |   |--- feature_9 >  0.32
|   |   |   |   |   |   |   |   |--- value: [795.00]
|   |   |   |   |--- feature_8 >  0.48
|   |   |   |   |   |--- feature_5 <= 1.50
|   |   |   |   |   |   |--- feature_8 <= 0.55
|   |   |   |   |   |   |   |--- feature_1 <= 2.50
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.21
|   |   |   |   |   |   |   |   |   |--- feature_3 <= 2.50
|   |   |   |   |   |   |   |   |   |   |--- value: [199.00]
|   |   |   |   |   |   |   |   |   |--- feature_3 >  2.50
|   |   |   |   |   |   |   |   |   |   |--- value: [141.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.21
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.25
|   |   |   |   |   |   |   |   |   |   |--- value: [259.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.25
|   |   |   |   |   |   |   |   |   |   |--- value: [253.00]
|   |   |   |   |   |   |   |--- feature_1 >  2.50
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.33
|   |   |   |   |   |   |   |   |   |--- value: [331.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.33
|   |   |   |   |   |   |   |   |   |--- value: [432.00]
|   |   |   |   |   |   |--- feature_8 >  0.55
|   |   |   |   |   |   |   |--- feature_8 <= 0.65
|   |   |   |   |   |   |   |   |--- feature_1 <= 3.50
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.60
|   |   |   |   |   |   |   |   |   |   |--- value: [394.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.60
|   |   |   |   |   |   |   |   |   |   |--- value: [460.00]
|   |   |   |   |   |   |   |   |--- feature_1 >  3.50
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.40
|   |   |   |   |   |   |   |   |   |   |--- value: [615.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.40
|   |   |   |   |   |   |   |   |   |   |--- value: [571.00]
|   |   |   |   |   |   |   |--- feature_8 >  0.65
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.10
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.07
|   |   |   |   |   |   |   |   |   |   |--- value: [305.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.07
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.72
|   |   |   |   |   |   |   |   |   |   |   |--- value: [370.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.72
|   |   |   |   |   |   |   |   |   |   |   |--- value: [376.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.10
|   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.37
|   |   |   |   |   |   |   |   |   |   |--- value: [439.00]
|   |   |   |   |   |   |   |   |   |--- feature_6 >  0.37
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.67
|   |   |   |   |   |   |   |   |   |   |   |--- value: [433.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.67
|   |   |   |   |   |   |   |   |   |   |   |--- value: [410.00]
|   |   |   |   |   |--- feature_5 >  1.50
|   |   |   |   |   |   |--- feature_7 <= 0.39
|   |   |   |   |   |   |   |--- feature_9 <= 0.17
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.39
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.63
|   |   |   |   |   |   |   |   |   |   |--- value: [466.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.63
|   |   |   |   |   |   |   |   |   |   |--- value: [534.00]
|   |   |   |   |   |   |   |   |--- feature_7 >  0.39
|   |   |   |   |   |   |   |   |   |--- value: [330.00]
|   |   |   |   |   |   |   |--- feature_9 >  0.17
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.35
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.82
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.37
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.37
|   |   |   |   |   |   |   |   |   |   |   |--- value: [269.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.82
|   |   |   |   |   |   |   |   |   |   |--- value: [203.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.35
|   |   |   |   |   |   |   |   |   |--- value: [127.00]
|   |   |   |   |   |   |--- feature_7 >  0.39
|   |   |   |   |   |   |   |--- feature_9 <= 0.26
|   |   |   |   |   |   |   |   |--- feature_1 <= 3.00
|   |   |   |   |   |   |   |   |   |--- value: [190.00]
|   |   |   |   |   |   |   |   |--- feature_1 >  3.00
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.40
|   |   |   |   |   |   |   |   |   |   |--- value: [233.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.40
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.42
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.42
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |--- feature_9 >  0.26
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.75
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.69
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.33
|   |   |   |   |   |   |   |   |   |   |   |--- value: [181.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.33
|   |   |   |   |   |   |   |   |   |   |   |--- value: [167.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.69
|   |   |   |   |   |   |   |   |   |   |--- value: [255.00]
|   |   |   |   |   |   |   |   |--- feature_8 >  0.75
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.88
|   |   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.42
|   |   |   |   |   |   |   |   |   |   |   |--- value: [81.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_7 >  0.42
|   |   |   |   |   |   |   |   |   |   |   |--- value: [112.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.88
|   |   |   |   |   |   |   |   |   |   |--- feature_3 <= 2.00
|   |   |   |   |   |   |   |   |   |   |   |--- value: [2.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_3 >  2.00
|   |   |   |   |   |   |   |   |   |   |   |--- value: [50.00]
|   |   |   |--- feature_3 >  4.50
|   |   |   |   |--- feature_8 <= 0.39
|   |   |   |   |   |--- value: [1807.00]
|   |   |   |   |--- feature_8 >  0.39
|   |   |   |   |   |--- feature_5 <= 1.50
|   |   |   |   |   |   |--- feature_6 <= 0.37
|   |   |   |   |   |   |   |--- feature_9 <= 0.19
|   |   |   |   |   |   |   |   |--- value: [484.00]
|   |   |   |   |   |   |   |--- feature_9 >  0.19
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.24
|   |   |   |   |   |   |   |   |   |--- value: [709.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.24
|   |   |   |   |   |   |   |   |   |--- value: [618.00]
|   |   |   |   |   |   |--- feature_6 >  0.37
|   |   |   |   |   |   |   |--- feature_6 <= 0.37
|   |   |   |   |   |   |   |   |--- value: [1603.00]
|   |   |   |   |   |   |   |--- feature_6 >  0.37
|   |   |   |   |   |   |   |   |--- value: [1095.00]
|   |   |   |   |   |--- feature_5 >  1.50
|   |   |   |   |   |   |--- feature_7 <= 0.37
|   |   |   |   |   |   |   |--- feature_1 <= 7.50
|   |   |   |   |   |   |   |   |--- value: [246.00]
|   |   |   |   |   |   |   |--- feature_1 >  7.50
|   |   |   |   |   |   |   |   |--- value: [178.00]
|   |   |   |   |   |   |--- feature_7 >  0.37
|   |   |   |   |   |   |   |--- feature_7 <= 0.39
|   |   |   |   |   |   |   |   |--- value: [796.00]
|   |   |   |   |   |   |   |--- feature_7 >  0.39
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.73
|   |   |   |   |   |   |   |   |   |--- feature_0 <= 2.50
|   |   |   |   |   |   |   |   |   |   |--- value: [447.00]
|   |   |   |   |   |   |   |   |   |--- feature_0 >  2.50
|   |   |   |   |   |   |   |   |   |   |--- value: [470.00]
|   |   |   |   |   |   |   |   |--- feature_8 >  0.73
|   |   |   |   |   |   |   |   |   |--- value: [548.00]
|--- feature_6 >  0.45
|   |--- feature_4 <= 0.50
|   |   |--- feature_8 <= 0.83
|   |   |   |--- feature_1 <= 10.50
|   |   |   |   |--- feature_9 <= 0.28
|   |   |   |   |   |--- feature_7 <= 0.77
|   |   |   |   |   |   |--- feature_8 <= 0.67
|   |   |   |   |   |   |   |--- feature_9 <= 0.09
|   |   |   |   |   |   |   |   |--- value: [3065.00]
|   |   |   |   |   |   |   |--- feature_9 >  0.09
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.66
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.15
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.14
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 8
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.14
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.15
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.16
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 4
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.16
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 10
|   |   |   |   |   |   |   |   |--- feature_8 >  0.66
|   |   |   |   |   |   |   |   |   |--- value: [3031.00]
|   |   |   |   |   |   |--- feature_8 >  0.67
|   |   |   |   |   |   |   |--- feature_7 <= 0.67
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.51
|   |   |   |   |   |   |   |   |   |--- feature_0 <= 1.50
|   |   |   |   |   |   |   |   |   |   |--- value: [2207.00]
|   |   |   |   |   |   |   |   |   |--- feature_0 >  1.50
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.14
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1138.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.14
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |--- feature_6 >  0.51
|   |   |   |   |   |   |   |   |   |--- feature_1 <= 4.00
|   |   |   |   |   |   |   |   |   |   |--- value: [3155.00]
|   |   |   |   |   |   |   |   |   |--- feature_1 >  4.00
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.69
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 7
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.69
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 6
|   |   |   |   |   |   |   |--- feature_7 >  0.67
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.74
|   |   |   |   |   |   |   |   |   |--- feature_3 <= 0.50
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.76
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1298.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.76
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1249.00]
|   |   |   |   |   |   |   |   |   |--- feature_3 >  0.50
|   |   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.67
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1549.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_7 >  0.67
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1521.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.74
|   |   |   |   |   |   |   |   |   |--- value: [1920.00]
|   |   |   |   |   |--- feature_7 >  0.77
|   |   |   |   |   |   |--- feature_9 <= 0.13
|   |   |   |   |   |   |   |--- value: [1203.00]
|   |   |   |   |   |   |--- feature_9 >  0.13
|   |   |   |   |   |   |   |--- value: [987.00]
|   |   |   |   |--- feature_9 >  0.28
|   |   |   |   |   |--- feature_6 <= 0.47
|   |   |   |   |   |   |--- value: [1558.00]
|   |   |   |   |   |--- feature_6 >  0.47
|   |   |   |   |   |   |--- feature_2 <= 0.50
|   |   |   |   |   |   |   |--- value: [998.00]
|   |   |   |   |   |   |--- feature_2 >  0.50
|   |   |   |   |   |   |   |--- value: [1198.00]
|   |   |   |--- feature_1 >  10.50
|   |   |   |   |--- feature_9 <= 0.18
|   |   |   |   |   |--- value: [1097.00]
|   |   |   |   |--- feature_9 >  0.18
|   |   |   |   |   |--- value: [787.00]
|   |   |--- feature_8 >  0.83
|   |   |   |--- feature_9 <= 0.30
|   |   |   |   |--- feature_3 <= 3.00
|   |   |   |   |   |--- feature_0 <= 2.50
|   |   |   |   |   |   |--- value: [1582.00]
|   |   |   |   |   |--- feature_0 >  2.50
|   |   |   |   |   |   |--- value: [1483.00]
|   |   |   |   |--- feature_3 >  3.00
|   |   |   |   |   |--- feature_9 <= 0.21
|   |   |   |   |   |   |--- feature_7 <= 0.49
|   |   |   |   |   |   |   |--- value: [1033.00]
|   |   |   |   |   |   |--- feature_7 >  0.49
|   |   |   |   |   |   |   |--- value: [902.00]
|   |   |   |   |   |--- feature_9 >  0.21
|   |   |   |   |   |   |--- value: [1462.00]
|   |   |   |--- feature_9 >  0.30
|   |   |   |   |--- value: [226.00]
|   |--- feature_4 >  0.50
|   |   |--- feature_8 <= 0.83
|   |   |   |--- feature_3 <= 4.50
|   |   |   |   |--- feature_7 <= 0.53
|   |   |   |   |   |--- feature_9 <= 0.10
|   |   |   |   |   |   |--- feature_9 <= 0.10
|   |   |   |   |   |   |   |--- feature_6 <= 0.53
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.09
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.07
|   |   |   |   |   |   |   |   |   |   |--- value: [846.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.07
|   |   |   |   |   |   |   |   |   |   |--- value: [830.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.09
|   |   |   |   |   |   |   |   |   |--- value: [763.00]
|   |   |   |   |   |   |   |--- feature_6 >  0.53
|   |   |   |   |   |   |   |   |--- value: [1122.00]
|   |   |   |   |   |   |--- feature_9 >  0.10
|   |   |   |   |   |   |   |--- value: [1348.00]
|   |   |   |   |   |--- feature_9 >  0.10
|   |   |   |   |   |   |--- feature_9 <= 0.23
|   |   |   |   |   |   |   |--- feature_3 <= 2.50
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.64
|   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.47
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.52
|   |   |   |   |   |   |   |   |   |   |   |--- value: [838.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.52
|   |   |   |   |   |   |   |   |   |   |   |--- value: [922.00]
|   |   |   |   |   |   |   |   |   |--- feature_6 >  0.47
|   |   |   |   |   |   |   |   |   |   |--- feature_0 <= 3.00
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |   |--- feature_0 >  3.00
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |--- feature_8 >  0.64
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.14
|   |   |   |   |   |   |   |   |   |   |--- feature_5 <= 1.50
|   |   |   |   |   |   |   |   |   |   |   |--- value: [699.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_5 >  1.50
|   |   |   |   |   |   |   |   |   |   |   |--- value: [637.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.14
|   |   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.48
|   |   |   |   |   |   |   |   |   |   |   |--- value: [486.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_7 >  0.48
|   |   |   |   |   |   |   |   |   |   |   |--- value: [409.00]
|   |   |   |   |   |   |   |--- feature_3 >  2.50
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.50
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.13
|   |   |   |   |   |   |   |   |   |   |--- value: [655.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.13
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.46
|   |   |   |   |   |   |   |   |   |   |   |--- value: [516.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.46
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 4
|   |   |   |   |   |   |   |   |--- feature_7 >  0.50
|   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.54
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.53
|   |   |   |   |   |   |   |   |   |   |   |--- value: [735.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.53
|   |   |   |   |   |   |   |   |   |   |   |--- value: [695.00]
|   |   |   |   |   |   |   |   |   |--- feature_6 >  0.54
|   |   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.53
|   |   |   |   |   |   |   |   |   |   |   |--- value: [559.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_7 >  0.53
|   |   |   |   |   |   |   |   |   |   |   |--- value: [550.00]
|   |   |   |   |   |   |--- feature_9 >  0.23
|   |   |   |   |   |   |   |--- feature_6 <= 0.48
|   |   |   |   |   |   |   |   |--- feature_7 <= 0.46
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.45
|   |   |   |   |   |   |   |   |   |   |--- value: [614.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.45
|   |   |   |   |   |   |   |   |   |   |--- value: [745.00]
|   |   |   |   |   |   |   |   |--- feature_7 >  0.46
|   |   |   |   |   |   |   |   |   |--- value: [471.00]
|   |   |   |   |   |   |   |--- feature_6 >  0.48
|   |   |   |   |   |   |   |   |--- feature_5 <= 1.50
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.34
|   |   |   |   |   |   |   |   |   |   |--- value: [834.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.34
|   |   |   |   |   |   |   |   |   |   |--- value: [905.00]
|   |   |   |   |   |   |   |   |--- feature_5 >  1.50
|   |   |   |   |   |   |   |   |   |--- value: [1008.00]
|   |   |   |   |--- feature_7 >  0.53
|   |   |   |   |   |--- feature_9 <= 0.13
|   |   |   |   |   |   |--- feature_7 <= 0.72
|   |   |   |   |   |   |   |--- feature_6 <= 0.68
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.82
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.09
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.66
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.66
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.09
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.64
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 6
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.64
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |--- feature_8 >  0.82
|   |   |   |   |   |   |   |   |   |--- value: [1334.00]
|   |   |   |   |   |   |   |--- feature_6 >  0.68
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.12
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.64
|   |   |   |   |   |   |   |   |   |   |--- value: [1177.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.64
|   |   |   |   |   |   |   |   |   |   |--- feature_3 <= 3.50
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_3 >  3.50
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1363.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.12
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.13
|   |   |   |   |   |   |   |   |   |   |--- feature_1 <= 7.50
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_1 >  7.50
|   |   |   |   |   |   |   |   |   |   |   |--- value: [989.00]
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.13
|   |   |   |   |   |   |   |   |   |   |--- value: [1233.00]
|   |   |   |   |   |   |--- feature_7 >  0.72
|   |   |   |   |   |   |   |--- feature_6 <= 0.78
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.76
|   |   |   |   |   |   |   |   |   |--- value: [568.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.76
|   |   |   |   |   |   |   |   |   |--- value: [673.00]
|   |   |   |   |   |   |   |--- feature_6 >  0.78
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.12
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.55
|   |   |   |   |   |   |   |   |   |   |--- value: [921.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.55
|   |   |   |   |   |   |   |   |   |   |--- value: [872.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.12
|   |   |   |   |   |   |   |   |   |--- value: [778.00]
|   |   |   |   |   |--- feature_9 >  0.13
|   |   |   |   |   |   |--- feature_8 <= 0.53
|   |   |   |   |   |   |   |--- feature_7 <= 0.73
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.15
|   |   |   |   |   |   |   |   |   |--- value: [1281.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.15
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.51
|   |   |   |   |   |   |   |   |   |   |--- feature_1 <= 6.50
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 6
|   |   |   |   |   |   |   |   |   |   |--- feature_1 >  6.50
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 4
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.51
|   |   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.61
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1242.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_7 >  0.61
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1032.00]
|   |   |   |   |   |   |   |--- feature_7 >  0.73
|   |   |   |   |   |   |   |   |--- value: [1405.00]
|   |   |   |   |   |   |--- feature_8 >  0.53
|   |   |   |   |   |   |   |--- feature_5 <= 1.50
|   |   |   |   |   |   |   |   |--- feature_9 <= 0.17
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.17
|   |   |   |   |   |   |   |   |   |   |--- feature_3 <= 3.50
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 7
|   |   |   |   |   |   |   |   |   |   |--- feature_3 >  3.50
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.17
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.65
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1198.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.65
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1128.00]
|   |   |   |   |   |   |   |   |--- feature_9 >  0.17
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.73
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.72
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 7
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.72
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 5
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.73
|   |   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.57
|   |   |   |   |   |   |   |   |   |   |   |--- value: [662.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_8 >  0.57
|   |   |   |   |   |   |   |   |   |   |   |--- value: [606.00]
|   |   |   |   |   |   |   |--- feature_5 >  1.50
|   |   |   |   |   |   |   |   |--- feature_3 <= 1.50
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.76
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.57
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.57
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 4
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.76
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.20
|   |   |   |   |   |   |   |   |   |   |   |--- value: [653.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.20
|   |   |   |   |   |   |   |   |   |   |   |--- value: [630.00]
|   |   |   |   |   |   |   |   |--- feature_3 >  1.50
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.19
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.15
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 5
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.15
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.19
|   |   |   |   |   |   |   |   |   |   |--- feature_1 <= 8.00
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 5
|   |   |   |   |   |   |   |   |   |   |--- feature_1 >  8.00
|   |   |   |   |   |   |   |   |   |   |   |--- value: [428.00]
|   |   |   |--- feature_3 >  4.50
|   |   |   |   |--- feature_1 <= 4.00
|   |   |   |   |   |--- value: [2469.00]
|   |   |   |   |--- feature_1 >  4.00
|   |   |   |   |   |--- feature_8 <= 0.72
|   |   |   |   |   |   |--- feature_9 <= 0.12
|   |   |   |   |   |   |   |--- feature_9 <= 0.08
|   |   |   |   |   |   |   |   |--- value: [1325.00]
|   |   |   |   |   |   |   |--- feature_9 >  0.08
|   |   |   |   |   |   |   |   |--- feature_8 <= 0.65
|   |   |   |   |   |   |   |   |   |--- feature_7 <= 0.62
|   |   |   |   |   |   |   |   |   |   |--- value: [1516.00]
|   |   |   |   |   |   |   |   |   |--- feature_7 >  0.62
|   |   |   |   |   |   |   |   |   |   |--- value: [1511.00]
|   |   |   |   |   |   |   |   |--- feature_8 >  0.65
|   |   |   |   |   |   |   |   |   |--- value: [1379.00]
|   |   |   |   |   |   |--- feature_9 >  0.12
|   |   |   |   |   |   |   |--- feature_6 <= 0.83
|   |   |   |   |   |   |   |   |--- feature_1 <= 6.50
|   |   |   |   |   |   |   |   |   |--- feature_8 <= 0.52
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.25
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.25
|   |   |   |   |   |   |   |   |   |   |   |--- value: [898.00]
|   |   |   |   |   |   |   |   |   |--- feature_8 >  0.52
|   |   |   |   |   |   |   |   |   |   |--- feature_0 <= 2.50
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |   |--- feature_0 >  2.50
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |--- feature_1 >  6.50
|   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.72
|   |   |   |   |   |   |   |   |   |   |--- feature_6 <= 0.69
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_6 >  0.69
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 3
|   |   |   |   |   |   |   |   |   |--- feature_6 >  0.72
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.16
|   |   |   |   |   |   |   |   |   |   |   |--- value: [1366.00]
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.16
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |--- feature_6 >  0.83
|   |   |   |   |   |   |   |   |--- feature_1 <= 6.50
|   |   |   |   |   |   |   |   |   |--- value: [829.00]
|   |   |   |   |   |   |   |   |--- feature_1 >  6.50
|   |   |   |   |   |   |   |   |   |--- value: [670.00]
|   |   |   |   |   |--- feature_8 >  0.72
|   |   |   |   |   |   |--- feature_9 <= 0.20
|   |   |   |   |   |   |   |--- feature_1 <= 5.50
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.61
|   |   |   |   |   |   |   |   |   |--- value: [909.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.61
|   |   |   |   |   |   |   |   |   |--- value: [1417.00]
|   |   |   |   |   |   |   |--- feature_1 >  5.50
|   |   |   |   |   |   |   |   |--- feature_6 <= 0.55
|   |   |   |   |   |   |   |   |   |--- value: [1182.00]
|   |   |   |   |   |   |   |   |--- feature_6 >  0.55
|   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.16
|   |   |   |   |   |   |   |   |   |   |--- feature_9 <= 0.14
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |   |--- feature_9 >  0.14
|   |   |   |   |   |   |   |   |   |   |   |--- truncated branch of depth 2
|   |   |   |   |   |   |   |   |   |--- feature_9 >  0.16
|   |   |   |   |   |   |   |   |   |   |--- value: [1045.00]
|   |   |   |   |   |   |--- feature_9 >  0.20
|   |   |   |   |   |   |   |--- feature_9 <= 0.23
|   |   |   |   |   |   |   |   |--- value: [529.00]
|   |   |   |   |   |   |   |--- feature_9 >  0.23
|   |   |   |   |   |   |   |   |--- value: [533.00]
|   |   |--- feature_8 >  0.83
|   |   |   |--- feature_5 <= 2.50
|   |   |   |   |--- feature_8 <= 0.94
|   |   |   |   |   |--- feature_1 <= 5.50
|   |   |   |   |   |   |--- feature_6 <= 0.53
|   |   |   |   |   |   |   |--- value: [692.00]
|   |   |   |   |   |   |--- feature_6 >  0.53
|   |   |   |   |   |   |   |--- feature_3 <= 2.50
|   |   |   |   |   |   |   |   |--- value: [678.00]
|   |   |   |   |   |   |   |--- feature_3 >  2.50
|   |   |   |   |   |   |   |   |--- feature_1 <= 4.50
|   |   |   |   |   |   |   |   |   |--- value: [547.00]
|   |   |   |   |   |   |   |   |--- feature_1 >  4.50
|   |   |   |   |   |   |   |   |   |--- value: [536.00]
|   |   |   |   |   |--- feature_1 >  5.50
|   |   |   |   |   |   |--- feature_8 <= 0.89
|   |   |   |   |   |   |   |--- feature_7 <= 0.55
|   |   |   |   |   |   |   |   |--- value: [438.00]
|   |   |   |   |   |   |   |--- feature_7 >  0.55
|   |   |   |   |   |   |   |   |--- feature_3 <= 2.50
|   |   |   |   |   |   |   |   |   |--- value: [477.00]
|   |   |   |   |   |   |   |   |--- feature_3 >  2.50
|   |   |   |   |   |   |   |   |   |--- value: [480.00]
|   |   |   |   |   |   |--- feature_8 >  0.89
|   |   |   |   |   |   |   |--- value: [555.00]
|   |   |   |   |--- feature_8 >  0.94
|   |   |   |   |   |--- value: [258.00]
|   |   |   |--- feature_5 >  2.50
|   |   |   |   |--- feature_8 <= 0.93
|   |   |   |   |   |--- feature_7 <= 0.53
|   |   |   |   |   |   |--- feature_7 <= 0.51
|   |   |   |   |   |   |   |--- value: [254.00]
|   |   |   |   |   |   |--- feature_7 >  0.51
|   |   |   |   |   |   |   |--- feature_3 <= 2.50
|   |   |   |   |   |   |   |   |--- value: [204.00]
|   |   |   |   |   |   |   |--- feature_3 >  2.50
|   |   |   |   |   |   |   |   |--- value: [217.00]
|   |   |   |   |   |--- feature_7 >  0.53
|   |   |   |   |   |   |--- value: [315.00]
|   |   |   |   |--- feature_8 >  0.93
|   |   |   |   |   |--- value: [126.00]

So now we have a tree-based model; but is it any good? Let's evaluate it with the test data.

In [5]:
# Evaluate the model using the test data
predictions = model.predict(X_test)
mse = mean_squared_error(y_test, predictions)
print("MSE:", mse)
rmse = np.sqrt(mse)
print("RMSE:", rmse)
r2 = r2_score(y_test, predictions)
print("R2:", r2)

# Plot predicted vs actual
plt.scatter(y_test, predictions)
plt.xlabel('Actual Labels')
plt.ylabel('Predicted Labels')
plt.title('Daily Bike Share Predictions')
# overlay the regression line
z = np.polyfit(y_test, predictions, 1)
p = np.poly1d(z)
plt.plot(y_test,p(y_test), color='magenta')
plt.show()
MSE: 233156.95454545456
RMSE: 482.8632876347658
R2: 0.5429104243934175

The tree-based model doesn't seem to have improved over the linear model, so what else could we try?

Try an Ensemble Algorithm¶

Ensemble algorithms work by combining multiple base estimators to produce an optimal model, either by applying an aggregate function to a collection of base models (sometimes referred to as bagging) or by building a sequence of models, each of which builds on the previous one to improve predictive performance (referred to as boosting).
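Before jumping to Random Forest, here's a minimal sketch of the bagging idea using Scikit-Learn's BaggingRegressor, which by default averages many decision trees trained on bootstrap samples. The synthetic dataset here is illustrative only, not the bike-rentals data:

```python
from sklearn.ensemble import BaggingRegressor
from sklearn.datasets import make_regression

# Synthetic regression data stands in for the bike-rental features
X, y = make_regression(n_samples=200, n_features=5, noise=10, random_state=0)

# Bagging: fit many trees on bootstrap samples of the training data,
# then average their predictions to reduce variance
bagged = BaggingRegressor(n_estimators=20, random_state=0).fit(X, y)
print(bagged.predict(X[:3]).shape)  # one averaged prediction per row
```

A Random Forest takes this a step further by also randomizing which features each tree considers at each split.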

For example, let's try a Random Forest model, which applies an averaging function to multiple Decision Tree models for a better overall model.

In [6]:
from sklearn.ensemble import RandomForestRegressor

# Train the model
model = RandomForestRegressor().fit(X_train, y_train)
print (model, "\n")

# Evaluate the model using the test data
predictions = model.predict(X_test)
mse = mean_squared_error(y_test, predictions)
print("MSE:", mse)
rmse = np.sqrt(mse)
print("RMSE:", rmse)
r2 = r2_score(y_test, predictions)
print("R2:", r2)

# Plot predicted vs actual
plt.scatter(y_test, predictions)
plt.xlabel('Actual Labels')
plt.ylabel('Predicted Labels')
plt.title('Daily Bike Share Predictions')
# overlay the regression line
z = np.polyfit(y_test, predictions, 1)
p = np.poly1d(z)
plt.plot(y_test,p(y_test), color='magenta')
plt.show()
RandomForestRegressor() 

MSE: 111953.86751999999
RMSE: 334.5950799399178
R2: 0.7805214693595766

For good measure, let's also try a boosting ensemble algorithm. We'll use a Gradient Boosting estimator, which like a Random Forest algorithm builds multiple trees, but instead of building them all independently and taking the average result, each tree is built on the outputs of the previous one in an attempt to incrementally reduce the loss (error) in the model.

In [7]:
# Train the model
from sklearn.ensemble import GradientBoostingRegressor

# Fit a Gradient Boosting model on the training set
model = GradientBoostingRegressor().fit(X_train, y_train)
print (model, "\n")

# Evaluate the model using the test data
predictions = model.predict(X_test)
mse = mean_squared_error(y_test, predictions)
print("MSE:", mse)
rmse = np.sqrt(mse)
print("RMSE:", rmse)
r2 = r2_score(y_test, predictions)
print("R2:", r2)

# Plot predicted vs actual
plt.scatter(y_test, predictions)
plt.xlabel('Actual Labels')
plt.ylabel('Predicted Labels')
plt.title('Daily Bike Share Predictions')
# overlay the regression line
z = np.polyfit(y_test, predictions, 1)
p = np.poly1d(z)
plt.plot(y_test,p(y_test), color='magenta')
plt.show()
GradientBoostingRegressor() 

MSE: 104146.7555827192
RMSE: 322.7177645911659
R2: 0.7958268223098317

Summary¶

Here we've tried a number of new regression algorithms to improve performance over a simple linear model. Next, we'll look at 'tuning' these algorithms to improve performance further.

Improve models with hyperparameters¶

Simple models with small datasets can often be fit in a single step, while larger datasets and more complex models must be fit by repeatedly using the model with training data and comparing the output with the expected label. If the prediction is accurate enough, we consider the model trained. If not, we adjust the model slightly and loop again.

Hyperparameters are values that change the way the model is fit during these loops. Learning rate, for example, is a hyperparameter that sets how much a model is adjusted during each training cycle. A high learning rate means a model can be trained faster, but if it's too high, the adjustments can be so large that the model is never 'finely tuned' and ends up less than optimal.
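To make the learning-rate trade-off concrete, here's a small sketch on synthetic data (the dataset and values are illustrative, not from the bike-rentals example). With a fixed number of trees, a very small learning rate underfits, while a well-chosen one learns effectively:

```python
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.datasets import make_regression
from sklearn.model_selection import train_test_split

# Illustrative synthetic data
X, y = make_regression(n_samples=300, n_features=4, noise=15, random_state=1)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)

# Same number of boosting rounds, different step sizes per round
scores = {}
for lr in (0.01, 0.1, 1.0):
    m = GradientBoostingRegressor(learning_rate=lr, n_estimators=50,
                                  random_state=1).fit(X_tr, y_tr)
    scores[lr] = m.score(X_te, y_te)  # R2 on the test split
    print(lr, round(scores[lr], 3))
```

With only 50 trees, the tiny 0.01 learning rate leaves the model barely adjusted from its initial guess, which is why its R2 lags well behind.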

Preprocessing data¶

Preprocessing refers to changes you make to your data before it is passed to the model. We have previously read that preprocessing can involve cleaning your dataset. While this is important, preprocessing can also include changing the format of your data, so it's easier for the model to use. For example, data described as ‘red’, ‘orange’, ‘yellow’, ‘lime’, and ‘green’, may work better if converted into a format more native to computers, such as numbers stating the amount of red and the amount of green.

Scaling features¶

The most common preprocessing step is to scale features so they fall between zero and one. For example, the weight of a bike and the distance a person travels on a bike may be two very different numbers, but scaling both to between zero and one allows models to learn more effectively from the data.

Using categories as features¶

In machine learning, you can also use categorical features such as 'bicycle', 'skateboard’ or 'car'. These features are represented by 0 or 1 values in one-hot vectors - vectors that have a 0 or 1 for each possible value. For example, bicycle, skateboard, and car might respectively be (1,0,0), (0,1,0), and (0,0,1).
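Scikit-Learn's OneHotEncoder produces these vectors directly. Note that it orders categories alphabetically, so the exact positions may differ from the (1,0,0)/(0,1,0)/(0,0,1) ordering above:

```python
import numpy as np
from sklearn.preprocessing import OneHotEncoder

vehicles = np.array([['bicycle'], ['skateboard'], ['car']])

# Categories are sorted alphabetically: bicycle, car, skateboard
encoder = OneHotEncoder()
encoded = encoder.fit_transform(vehicles).toarray()
print(encoder.categories_)
print(encoded)  # exactly one 1 per row, marking that row's category
```

Each row has a single 1 in the column for its category and 0 everywhere else, which is what "one-hot" means.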

Regression - Optimize and save models¶

Previously, we used complex regression models to look at the relationship between features of a bike rentals dataset. In this notebook, we'll see if we can improve the performance of these models even further.

Let's start by loading the bicycle sharing data as a Pandas DataFrame and viewing the first few rows. As usual, we'll also split our data into training and test datasets.

Optimize Hyperparameters¶

Take a look at the GradientBoostingRegressor estimator definition in the output above, and note that it, like the other estimators we tried previously, includes a large number of parameters that control the way the model is trained. In machine learning, the term parameters refers to values that can be determined from data; values that you specify to affect the behavior of a training algorithm are more correctly referred to as hyperparameters.

The specific hyperparameters for an estimator vary based on the algorithm that the estimator encapsulates. In the case of the GradientBoostingRegressor estimator, the algorithm is an ensemble that combines multiple decision trees to create an overall predictive model. You can learn about the hyperparameters for this estimator in the Scikit-Learn documentation.

We won't go into the details of each hyperparameter here, but they work together to affect the way the algorithm trains a model. In many cases, the default values provided by Scikit-Learn will work well; but there may be some advantage in modifying hyperparameters to get better predictive performance or reduce training time.

So how do you know what hyperparameter values you should use? Well, in the absence of a deep understanding of how the underlying algorithm works, you'll need to experiment. Fortunately, Scikit-Learn provides a way to tune hyperparameters by trying multiple combinations and finding the best result for a given performance metric.

Let's try using a grid search approach to try combinations from a grid of possible values for the learning_rate and n_estimators hyperparameters of the GradientBoostingRegressor estimator.

In [8]:
from sklearn.model_selection import GridSearchCV
from sklearn.metrics import make_scorer, r2_score

# Use a Gradient Boosting algorithm
alg = GradientBoostingRegressor()

# Try these hyperparameter values
params = {
 'learning_rate': [0.1, 0.5, 1.0],
 'n_estimators' : [50, 100, 150]
 }

# Find the best hyperparameter combination to optimize the R2 metric
score = make_scorer(r2_score)
gridsearch = GridSearchCV(alg, params, scoring=score, cv=3, return_train_score=True)
gridsearch.fit(X_train, y_train)
print("Best parameter combination:", gridsearch.best_params_, "\n")

# Get the best model
model=gridsearch.best_estimator_
print(model, "\n")

# Evaluate the model using the test data
predictions = model.predict(X_test)
mse = mean_squared_error(y_test, predictions)
print("MSE:", mse)
rmse = np.sqrt(mse)
print("RMSE:", rmse)
r2 = r2_score(y_test, predictions)
print("R2:", r2)

# Plot predicted vs actual
plt.scatter(y_test, predictions)
plt.xlabel('Actual Labels')
plt.ylabel('Predicted Labels')
plt.title('Daily Bike Share Predictions')
# overlay the regression line
z = np.polyfit(y_test, predictions, 1)
p = np.poly1d(z)
plt.plot(y_test,p(y_test), color='magenta')
plt.show()
Best parameter combination: {'learning_rate': 0.1, 'n_estimators': 150} 

GradientBoostingRegressor(n_estimators=150) 

MSE: 104315.13945974117
RMSE: 322.97854334265173
R2: 0.7954967162873157

Note: The use of random values in the Gradient Boosting algorithm results in slightly different metrics each time. In this case, the best model produced by hyperparameter tuning is unlikely to be significantly better than one trained with the default hyperparameter values; but it's still useful to know about the hyperparameter tuning technique!
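If you want repeatable metrics from run to run, you can pin the algorithm's randomness with the random_state parameter. A quick sketch on synthetic (illustrative) data:

```python
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.datasets import make_regression

# Illustrative synthetic data
X, y = make_regression(n_samples=100, n_features=3, noise=5, random_state=0)

# Fixing random_state makes repeated fits produce identical predictions
a = GradientBoostingRegressor(random_state=42).fit(X, y).predict(X[:5])
b = GradientBoostingRegressor(random_state=42).fit(X, y).predict(X[:5])
print((a == b).all())  # True
```

This is useful when comparing hyperparameter combinations, so that differences in metrics reflect the hyperparameters rather than random variation.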

In [9]:
# Train the model
from sklearn.compose import ColumnTransformer
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler, OneHotEncoder
from sklearn.linear_model import LinearRegression
import numpy as np

# Define preprocessing for numeric columns (scale them)
numeric_features = [6,7,8,9]
numeric_transformer = Pipeline(steps=[
    ('scaler', StandardScaler())])

# Define preprocessing for categorical features (encode them)
categorical_features = [0,1,2,3,4,5]
categorical_transformer = Pipeline(steps=[
    ('onehot', OneHotEncoder(handle_unknown='ignore'))])

# Combine preprocessing steps
preprocessor = ColumnTransformer(
    transformers=[
        ('num', numeric_transformer, numeric_features),
        ('cat', categorical_transformer, categorical_features)])

# Create preprocessing and training pipeline
pipeline = Pipeline(steps=[('preprocessor', preprocessor),
                           ('regressor', GradientBoostingRegressor())])


# fit the pipeline to train a Gradient Boosting model on the training set
model = pipeline.fit(X_train, y_train)
print (model)
Pipeline(steps=[('preprocessor',
                 ColumnTransformer(transformers=[('num',
                                                  Pipeline(steps=[('scaler',
                                                                   StandardScaler())]),
                                                  [6, 7, 8, 9]),
                                                 ('cat',
                                                  Pipeline(steps=[('onehot',
                                                                   OneHotEncoder(handle_unknown='ignore'))]),
                                                  [0, 1, 2, 3, 4, 5])])),
                ('regressor', GradientBoostingRegressor())])

OK, the model is trained, including the preprocessing steps. Let's see how it performs with the validation data.

In [10]:
# Get predictions
predictions = model.predict(X_test)

# Display metrics
mse = mean_squared_error(y_test, predictions)
print("MSE:", mse)
rmse = np.sqrt(mse)
print("RMSE:", rmse)
r2 = r2_score(y_test, predictions)
print("R2:", r2)

# Plot predicted vs actual
plt.scatter(y_test, predictions)
plt.xlabel('Actual Labels')
plt.ylabel('Predicted Labels')
plt.title('Daily Bike Share Predictions')
z = np.polyfit(y_test, predictions, 1)
p = np.poly1d(z)
plt.plot(y_test,p(y_test), color='magenta')
plt.show()
MSE: 104987.61656110545
RMSE: 324.0179262959157
R2: 0.7941783671372035

The pipeline is composed of the transformations and the algorithm used to train the model. To try an alternative algorithm you can just change that step to a different kind of estimator.

In [11]:
# Use a different estimator in the pipeline
pipeline = Pipeline(steps=[('preprocessor', preprocessor),
                           ('regressor', RandomForestRegressor())])


# fit the pipeline to train a Random Forest model on the training set
model = pipeline.fit(X_train, y_train)
print (model, "\n")

# Get predictions
predictions = model.predict(X_test)

# Display metrics
mse = mean_squared_error(y_test, predictions)
print("MSE:", mse)
rmse = np.sqrt(mse)
print("RMSE:", rmse)
r2 = r2_score(y_test, predictions)
print("R2:", r2)

# Plot predicted vs actual
plt.scatter(y_test, predictions)
plt.xlabel('Actual Labels')
plt.ylabel('Predicted Labels')
plt.title('Daily Bike Share Predictions - Preprocessed')
z = np.polyfit(y_test, predictions, 1)
p = np.poly1d(z)
plt.plot(y_test,p(y_test), color='magenta')
plt.show()
Pipeline(steps=[('preprocessor',
                 ColumnTransformer(transformers=[('num',
                                                  Pipeline(steps=[('scaler',
                                                                   StandardScaler())]),
                                                  [6, 7, 8, 9]),
                                                 ('cat',
                                                  Pipeline(steps=[('onehot',
                                                                   OneHotEncoder(handle_unknown='ignore'))]),
                                                  [0, 1, 2, 3, 4, 5])])),
                ('regressor', RandomForestRegressor())]) 

MSE: 102732.05699
RMSE: 320.51841911191315
R2: 0.7986002501092208

Bonus question:¶

You train a regression model using scikit-learn. When you evaluate it with test data, you determine that the model achieves an R-squared metric of 0.95. What does this metric tell you about the model?

The model explains most of the variance in the label values: an R-squared of 0.95 means roughly 95% of the variance in the actual values is accounted for by the model's predictions.
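R-squared can be computed by hand from its definition, 1 minus the ratio of the residual sum of squares to the total sum of squares. The values below are made up for illustration:

```python
import numpy as np
from sklearn.metrics import r2_score

# Illustrative actual labels and model predictions
actual = np.array([100.0, 200.0, 300.0, 400.0])
predicted = np.array([110.0, 190.0, 310.0, 390.0])

# R^2 = 1 - (sum of squared residuals / total sum of squares)
ss_res = np.sum((actual - predicted) ** 2)
ss_tot = np.sum((actual - np.mean(actual)) ** 2)
print(1 - ss_res / ss_tot)             # manual computation
print(r2_score(actual, predicted))     # Scikit-Learn agrees
```

Both lines print the same value, confirming that r2_score implements this formula.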
