Wavelet decompositions of Random Forests - smoothness analysis, sparse approximation and applications
Authors: Oren Elisha, Shai Dekel
JMLR 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | For our experimental results, we implemented C# code that supports RF construction, Besov index analysis, wavelet decompositions of RF and applications such as wavelet-based VI, etc. (source code is available, see link in (Wavelet RF code)). Most datasets are taken from the UCI repository (UCI repository), which allows us to compare our results to previous work. In Figure 9 we observe the rate-distortion performance measured on validation points in a five-fold cross-validation of M-term wavelet approximation and standard RF, as trees are added. It can be seen that for functions that are smoother in the weak-type sense (e.g. higher α), wavelet approximation outperforms the standard RF. Table 1 below shows an extensive list of more datasets. |
| Researcher Affiliation | Collaboration | Oren Elisha, School of Mathematical Sciences, University of Tel-Aviv, and GE Global Research, Israel. Shai Dekel, School of Mathematical Sciences, University of Tel-Aviv, and GE Global Research. |
| Pseudocode | No | The paper describes methodologies through textual descriptions and mathematical formulations but does not include any clearly labeled pseudocode or algorithm blocks with structured steps. |
| Open Source Code | Yes | For our experimental results, we implemented C# code that supports RF construction, Besov index analysis, wavelet decompositions of RF and applications such as wavelet-based VI, etc. (source code is available, see link in (Wavelet RF code)). Wavelet-based Random Forest source code, https://github.com/orenelis/WaveletsForest.git. |
| Open Datasets | Yes | Most datasets are taken from the UCI repository (UCI repository), which allows us to compare our results to previous work. UCI machine learning repository, http://archive.ics.uci.edu/ml/. |
| Dataset Splits | Yes | In Figure 9 we observe the rate-distortion performance measured on validation points in a five-fold cross-validation of M-term wavelet approximation and standard RF, as trees are added. We perform RF construction with 1000 trees and five-fold cross-validation. M was selected automatically using 10 percent of the training set. |
| Hardware Specification | Yes | The algorithms are executed on the Amazon Web Services cloud, using up to 120 CPUs. |
| Software Dependencies | No | For our experimental results, we implemented C# code that supports RF construction, Besov index analysis, wavelet decompositions of RF and applications such as wavelet-based VI, etc. (source code is available, see link in (Wavelet RF code)). The paper mentions C# code but does not specify any version numbers for the language or any libraries used, which is required for reproducible software dependencies. |
| Experiment Setup | Yes | We generated an RF with 100 decision trees, 80% bagging, and the n hyper-parameter. We perform RF construction with 1000 trees and five-fold cross-validation. M was selected automatically using 10 percent of the training set. |
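
The table above quotes the paper's protocol of M-term wavelet approximation with M chosen on a 10% held-out split. As a minimal sketch of that idea — using a classical discrete Haar transform on a 1-D signal as a stand-in for the paper's RF-node wavelets, with all signal shapes, sizes, and the candidate M values being illustrative assumptions rather than the paper's setup:

```python
import numpy as np

def haar_forward(x):
    """Orthonormal Haar wavelet transform (length must be a power of 2)."""
    x = np.asarray(x, dtype=float).copy()
    coeffs = []
    while len(x) > 1:
        s = (x[0::2] + x[1::2]) / np.sqrt(2)  # smooth (scaling) part
        d = (x[0::2] - x[1::2]) / np.sqrt(2)  # detail (wavelet) part
        coeffs.append(d)
        x = s
    coeffs.append(x)  # final scaling coefficient
    return np.concatenate(coeffs[::-1])

def haar_inverse(c):
    """Invert haar_forward."""
    c = np.asarray(c, dtype=float)
    x, pos, length = c[:1], 1, 1
    while pos < len(c):
        d = c[pos:pos + length]
        y = np.empty(2 * length)
        y[0::2] = (x + d) / np.sqrt(2)
        y[1::2] = (x - d) / np.sqrt(2)
        x, pos, length = y, pos + length, 2 * length
    return x

def m_term_approx(c, M):
    """Keep only the M largest-magnitude coefficients, then reconstruct."""
    keep = np.argsort(np.abs(c))[::-1][:M]
    truncated = np.zeros_like(c)
    truncated[keep] = c[keep]
    return haar_inverse(truncated)

# Illustrative piecewise-smooth target, observed with noise.
rng = np.random.default_rng(0)
n = 256
t = np.linspace(0.0, 1.0, n)
signal = np.where(t < 0.5, np.sin(4 * np.pi * t), 0.3)
noisy = signal + 0.05 * rng.standard_normal(n)

# Hold out ~10% of the points for selecting M, echoing the paper's
# "M was selected automatically using 10 percent of the training set".
# For illustration we score against the clean signal at those points;
# a real pipeline would score against held-out labels instead.
val_idx = rng.choice(n, size=n // 10, replace=False)
c = haar_forward(noisy)
errors = {M: float(np.mean((m_term_approx(c, M)[val_idx] - signal[val_idx]) ** 2))
          for M in (4, 8, 16, 32, 64, 128, 256)}
best_M = min(errors, key=errors.get)
```

The sweep mirrors the rate-distortion trade-off the table's Figure 9 quote describes: small M under-fits, full M retains all the noise, and the validation split picks the intermediate budget.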