Learning Continuous Time Bayesian Networks in Non-stationary Domains
Authors: Simone Villa, Fabio Stella
JAIR 2016
| Reproducibility Variable | Result | LLM Response |
|---|---|---|
| Research Type | Experimental | A set of numerical experiments on synthetic data is used to compare the effectiveness of non-stationary continuous time Bayesian networks to that of non-stationary dynamic Bayesian networks. Furthermore, the performance achieved by non-stationary continuous time Bayesian networks is compared to that achieved by state-of-the-art algorithms on four real-world datasets, namely drosophila, saccharomyces cerevisiae, songbird and macroeconomics. |
| Researcher Affiliation | Academia | Simone Villa (EMAIL), Fabio Stella (EMAIL), Department of Informatics, Systems and Communication, University of Milano-Bicocca, Viale Sarca 336, 20126 Milan, Italy |
| Pseudocode | Yes | Algorithm 1: Learn KTTX; Algorithm 2: Learn KNEX; Algorithm 3: Tentative Allocation; Algorithm 4: Learn UNEX; Algorithm 5: Split Merge |
| Open Source Code | No | We acknowledge the precious help of Alex Hartemink who let us use the nsdbn jar executable program for learning ns DBN models. Furthermore, he also provided the drosophila and songbird datasets. |
| Open Datasets | Yes | The performance achieved by non-stationary continuous time Bayesian networks is compared to that achieved by state-of-the-art algorithms on four real-world datasets, namely drosophila, saccharomyces cerevisiae, songbird and macroeconomics. The saccharomyces cerevisiae dataset is obtained from a synthetic regulatory network with 5 genes in saccharomyces cerevisiae (Cantone, Marucci, Iorio, Ricci, Belcastro, Bansal, Santini, di Bernardo, di Bernardo, & Cosma, 2009). |
| Dataset Splits | No | The paper does not provide specific train/test/validation dataset splits, for either the synthetic or the real-world datasets. |
| Hardware Specification | No | The paper does not explicitly describe the specific hardware used to run its experiments (e.g., GPU/CPU models, memory amounts). |
| Software Dependencies | No | The paper mentions using the "nsdbn jar executable" which is a third-party tool, but it does not specify any software components or libraries with version numbers that were used in the authors' own implementation. |
| Experiment Setup | Yes | ns CTBN were learned by using the following parameter settings: Iters = 1,000, CT0 = 1,000, ζ = 0.8, z = 3, σ = 1, sp = 0.3, mp = 0.3, α = 1 and τ = 0.1, using the BDeu metric. Furthermore, for ns DBN and ns CTBN we set the maximum number of parents to 4. Structural learning experiments were performed with λc = {1, 2, 4} and λe = {5, 10, 15} for ns CTBN, and with λs = {1, 2, 4} and λm = {10, 50, 100} for ns DBN. The network inference task was performed by learning ns CTBN under the UNE setting with the following parameter values: λc = {0.2, 0.4, 1, 2} and λe = {0.5, 1, 2, 5}. Furthermore, we set the maximum number of parents to 2, the number of iterations to 1,000 and the number of runs to 100. |
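The parameter values quoted in the Experiment Setup row can be collected into a configuration sketch for anyone attempting a reimplementation. The paper releases no code, so all key names below are hypothetical; only the values are taken from the reported settings (the reading of sp/mp as split/merge probabilities is an assumption based on the Split Merge algorithm):

```python
# Illustrative configuration sketch of the reported ns CTBN settings.
# Key names are hypothetical; values restate the paper's reported parameters.

ns_ctbn_structural_learning = {
    "Iters": 1000,                 # number of iterations
    "CT0": 1000,                   # initial constant reported as CT0
    "zeta": 0.8,
    "z": 3,
    "sigma": 1,
    "sp": 0.3,                     # split probability (assumption)
    "mp": 0.3,                     # merge probability (assumption)
    "alpha": 1,
    "tau": 0.1,
    "score": "BDeu",
    "max_parents": 4,              # shared with the ns DBN baseline
    "lambda_c": [1, 2, 4],
    "lambda_e": [5, 10, 15],
}

ns_ctbn_network_inference = {
    "setting": "UNE",
    "lambda_c": [0.2, 0.4, 1, 2],
    "lambda_e": [0.5, 1, 2, 5],
    "max_parents": 2,
    "iterations": 1000,
    "runs": 100,
}
```

This layout makes the two experimental regimes explicit: a structural-learning sweep over coarser λ grids with up to 4 parents, and a network-inference sweep over finer λ grids with at most 2 parents.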