1 %feature("docstring") gum::learning::BNLearner 2 " 3 BNLearner(filename,inducedTypes=True) -> BNLearner 4 Parameters: 5 * **filename** (*str*) -- the file to learn from 6 * **inducedTypes** (*Bool*) -- whether BNLearner should try to automatically find the type of each variable 7 8 BNLearner(filename,src) -> BNLearner 9 Parameters: 10 * **filename** (*str*) -- the file to learn from 11 * **src** (*pyAgrum.BayesNet*) -- the Bayesian network used to find those modalities 12 13 BNLearner(learner) -> BNLearner 14 Parameters: 15 * **learner** (*pyAgrum.BNLearner*) -- the BNLearner to copy 16 " 17 18 %feature("docstring") gum::learning::BNLearner::learnBN 19 " 20 learn a BayesNet from a file (must have read the db before) 21 22 Returns 23 ------- 24 pyAgrum.BayesNet 25 the learned BayesNet 26 " 27 28 %feature("docstring") gum::learning::BNLearner::learnParameters 29 " 30 learns a BN (its parameters) when its structure is known. 31 32 Parameters 33 ---------- 34 dag : pyAgrum.DAG 35 bn : pyAgrum.BayesNet 36 take_into_account_score : bool 37 The dag passed in argument may have been learnt from a structure learning. In this case, if the score used to learn the structure has an implicit apriori (like K2 which has a 1-smoothing apriori), it is important to also take into account this implicit apriori for parameter learning. 
    By default, if a score exists, the parameters are learned by taking into account both the apriori specified by the useAprioriXXX() methods and the implicit apriori of the score; otherwise, only the apriori specified by useAprioriXXX() is taken into account.

Returns
-------
pyAgrum.BayesNet
    the learned BayesNet

Raises
------
pyAgrum.MissingVariableInDatabase
    If a variable of the BN is not found in the database
pyAgrum.UnknownLabelInDatabase
    If a label found in the database does not correspond to the variable
"

%feature("docstring") gum::learning::BNLearner::setInitialDAG
"
Parameters
----------
dag : pyAgrum.DAG
    an initial DAG structure
"

%feature("docstring") gum::learning::genericBNLearner::useEM
"
Indicates if we use EM for parameter learning.

Parameters
----------
epsilon : double
    if epsilon=0.0, EM is not used;
    if epsilon>0, EM is used and stops when the sum of the cumulative squared errors on the parameters is less than epsilon.
"

%feature("docstring") gum::learning::BNLearner::useMIIC
"
Indicate that we wish to use MIIC.
"

%feature("docstring") gum::learning::BNLearner::use3off2
"
Indicate that we wish to use 3off2.
79 " 80 81 %feature("docstring") gum::learning::BNLearner::useNMLCorrection 82 " 83 Indicate that we wish to use the NML correction for 3off2 or MIIC 84 " 85 86 %feature("docstring") gum::learning::BNLearner::useMDLCorrection 87 " 88 Indicate that we wish to use the MDL correction for 3off2 or MIIC 89 " 90 91 %feature("docstring") gum::learning::BNLearner::useNoCorrection 92 " 93 Indicate that we wish to use the NoCorr correction for 3off2 or MIIC 94 " 95 96 %feature("docstring") gum::learning::BNLearner::learnMixedStructure 97 " 98 Warnings 99 -------- 100 learner must be using 3off2 or MIIC algorithm 101 102 Returns 103 ------- 104 pyAgrum.EssentialGraph 105 the learned structure as an EssentialGraph 106 " 107 108 %feature("docstring") gum::learning::BNLearner::latentVariables 109 " 110 Warnings 111 -------- 112 learner must be using 3off2 or MIIC algorithm 113 114 Returns 115 ------- 116 list 117 the list of latent variables 118 " 119 120 121 %feature("docstring") gum::learning::genericBNLearner::setSliceOrder 122 " 123 Set a partial order on the nodes. 124 125 Parameters 126 ---------- 127 l : list 128 a list of sequences (composed of ids of rows or string) 129 " 130 131 132 %feature("docstring") gum::learning::BNLearner::useAprioriDirichlet 133 " 134 Use the Dirichlet apriori. 135 136 Parameters 137 ---------- 138 filename : str 139 the Dirichlet related database 140 " 141 142 %feature("docstring") gum::learning::BNLearner::useAprioriSmoothing 143 " 144 Use the apriori smoothing. 145 146 Parameters 147 ---------- 148 weight : double 149 pass in argument a weight if you wish to assign a weight to the smoothing, else the current weight of the learner will be used. 150 " 151 152 %feature("docstring") gum::learning::BNLearner::useGreedyHillClimbing 153 " 154 Indicate that we wish to use a greedy hill climbing algorithm. 
155 " 156 157 %feature("docstring") gum::learning::genericBNLearner::useK2 158 " 159 Indicate to use the K2 algorithm (which needs a total ordering of the variables). 160 161 Parameters 162 ---------- 163 order : list[int or str] 164 sequences of (ids or name) 165 " 166 167 %feature("docstring") gum::learning::genericBNLearner::useLocalSearchWithTabuList 168 " 169 Indicate that we wish to use a local search with tabu list 170 171 Parameters 172 ---------- 173 tabu_size : int 174 The size of the tabu list 175 176 nb_decrease : int 177 The max number of changes decreasing the score consecutively that we allow to apply 178 " 179 180 %feature("docstring") gum::learning::genericBNLearner::hasMissingValues 181 " 182 Indicates whether there are missing values in the database. 183 184 Returns 185 ------- 186 bool 187 True if there are some missing values in the database. 188 " 189 190 %feature("docstring") gum::learning::BNLearner::useNoApriori 191 " 192 Use no apriori. 193 " 194 195 %feature("docstring") gum::learning::genericBNLearner::useAprioriBDeu 196 " 197 The BDeu apriori adds weight to all the cells of the counting tables. 198 In other words, it adds weight rows in the database with equally probable 199 values. 200 201 Parameters 202 ---------- 203 weight : double 204 the apriori weight 205 " 206 207 %feature("docstring") gum::learning::BNLearner::useScoreAIC 208 " 209 Indicate that we wish to use an AIC score. 210 " 211 212 %feature("docstring") gum::learning::BNLearner::useScoreBD 213 " 214 Indicate that we wish to use a BD score. 215 " 216 217 %feature("docstring") gum::learning::BNLearner::useScoreBDeu 218 " 219 Indicate that we wish to use a BDeu score. 220 " 221 222 %feature("docstring") gum::learning::BNLearner::useScoreBIC 223 " 224 Indicate that we wish to use a BIC score. 225 " 226 227 %feature("docstring") gum::learning::BNLearner::useScoreK2 228 " 229 Indicate that we wish to use a K2 score. 
230 " 231 232 %feature("docstring") gum::learning::BNLearner::useScoreLog2Likelihood 233 " 234 Indicate that we wish to use a Log2Likelihood score. 235 " 236 237 238 %feature("docstring") gum::learning::genericBNLearner::idFromName 239 " 240 Parameters 241 ---------- 242 var_names : str 243 a variable's name 244 245 Returns 246 ------- 247 int 248 the column id corresponding to a variable name 249 250 Raises 251 ------ 252 pyAgrum.MissingVariableInDatabase 253 If a variable of the BN is not found in the database. 254 " 255 256 %feature("docstring") gum::learning::genericBNLearner::learnDAG 257 " 258 learn a structure from a file 259 260 Returns 261 ------- 262 pyAgrum.DAG 263 the learned DAG 264 " 265 266 267 %feature("docstring") gum::learning::genericBNLearner::erasePossibleEdge 268 " 269 Allow the 2 arcs to be added if necessary. 270 271 Parameters 272 ---------- 273 arc : pyAgrum 274 an arc 275 head : 276 a variable's id (int) 277 tail : 278 a variable's id (int) 279 head : 280 a variable's name (str) 281 tail : 282 a variable's name (str) 283 " 284 %feature("docstring") gum::learning::genericBNLearner::eraseForbiddenArc 285 " 286 Allow the arc to be added if necessary. 287 288 Parameters 289 ---------- 290 arc : pyAgrum 291 an arc 292 head : 293 a variable's id (int) 294 tail : 295 a variable's id (int) 296 head : 297 a variable's name (str) 298 tail : 299 a variable's name (str) 300 " 301 302 %feature("docstring") gum::learning::genericBNLearner::eraseMandatoryArc 303 " 304 Parameters 305 ---------- 306 arc : pyAgrum 307 an arc 308 head : 309 a variable's id (int) 310 tail : 311 a variable's id (int) 312 head : 313 a variable's name (str) 314 tail : 315 a variable's name (str) 316 " 317 318 %feature("docstring") gum::learning::genericBNLearner::addForbiddenArc 319 " 320 The arc in parameters won't be added. 

Parameters
----------
arc : pyAgrum.Arc
    an arc
head :
    a variable's id (int)
tail :
    a variable's id (int)
head :
    a variable's name (str)
tail :
    a variable's name (str)
"

%feature("docstring") gum::learning::genericBNLearner::addMandatoryArc
"
Allows adding prior structural knowledge.

Parameters
----------
arc : pyAgrum.Arc
    an arc
head :
    a variable's id (int)
tail :
    a variable's id (int)
head :
    a variable's name (str)
tail :
    a variable's name (str)

Raises
------
pyAgrum.InvalidDirectedCycle
    If the added arc creates a directed cycle in the DAG
"


%feature("docstring") gum::learning::BNLearner::modalities
"
Returns
-------
vector<pos,size>
    the number of modalities of the database's variables
"

%feature("docstring") gum::learning::genericBNLearner::nameFromId
"
Parameters
----------
id : int
    a node id

Returns
-------
str
    the variable's name
"

%feature("docstring") gum::learning::genericBNLearner::names
"
Returns
-------
List[str]
    the names of the variables in the database
"


%feature("docstring") gum::learning::BNLearner::setMaxIndegree
"
Parameters
----------
max_indegree : int
    the maximum number of parents
"

%feature("docstring") gum::learning::genericBNLearner::setDatabaseWeight
"
Set the database weight, which is given as an equivalent sample size.

Parameters
----------
weight : double
    the database weight
"

%feature("docstring") gum::learning::BNLearner::chi2
"
chi2 computes the chi2 statistic and p-value for two columns, given a list of other columns.

Parameters
----------
name1 : str
    the name of the first column

name2 : str
    the name of the second column

knowing : [str]
    the list of names of conditioning columns

Returns
-------
statistic, pvalue
    the chi2 statistic and the associated p-value as a tuple
"


%feature("docstring") gum::learning::BNLearner::G2
"
G2 computes the G2 statistic and p-value for two columns, given a list of other columns.

Parameters
----------
name1 : str
    the name of the first column

name2 : str
    the name of the second column

knowing : [str]
    the list of names of conditioning columns

Returns
-------
statistic, pvalue
    the G2 statistic and the associated p-value as a tuple
"

%feature("docstring") gum::learning::genericBNLearner::logLikelihood
"
logLikelihood computes the log-likelihood of the columns in vars, given the columns in the (optional) list knowing.

Parameters
----------
vars : List[str]
    the names of the columns of interest

knowing : List[str]
    the (optional) list of names of conditioning columns

Returns
-------
double
    the log-likelihood (base 2)
"

%feature("docstring") gum::learning::genericBNLearner::nbRows
"
Return the number of rows in the database.

Returns
-------
int
    the number of rows in the database
"


%feature("docstring") gum::learning::genericBNLearner::nbCols
"
Return the number of columns in the database.

Returns
-------
int
    the number of columns in the database
"
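The docstrings above describe the structure- and parameter-learning API. A minimal end-to-end usage sketch (illustrative only, not part of the SWIG interface; the file name `smoking.csv`, its column names, and the toy data are assumptions, and the learning step is skipped when pyAgrum is not installed):

```python
import csv
import random

# Build a tiny two-column database so the example is self-contained:
# "cancer" depends weakly on "smoking" in this toy data.
random.seed(0)
rows = [["smoking", "cancer"]]
for _ in range(200):
    smoking = random.choice(["yes", "no"])
    cancer = "yes" if random.random() < (0.3 if smoking == "yes" else 0.1) else "no"
    rows.append([smoking, cancer])

with open("smoking.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

try:
    import pyAgrum as gum

    learner = gum.BNLearner("smoking.csv")   # read the database
    learner.useGreedyHillClimbing()          # score-based structure search
    learner.useAprioriSmoothing(1)           # 1-smoothing apriori
    bn = learner.learnBN()                   # structure + parameters
    print(sorted(bn.names()))
except ImportError:
    print("pyAgrum not installed; skipping the learning step")
```

The same database could instead be used with learnDAG() to get only the structure, or with setInitialDAG()/learnParameters() when the structure is already known.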
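The chi2, G2, and logLikelihood docstrings describe the learner's built-in independence tests. A hedged sketch of how they might be called (illustrative only; the file name `indep.csv` and column names `A`, `B`, `C` are made up for the example, and the tests are skipped when pyAgrum is not installed):

```python
import csv
import random

# Toy database: B is independent of A, while C is a copy of A.
random.seed(1)
with open("indep.csv", "w", newline="") as f:
    w = csv.writer(f)
    w.writerow(["A", "B", "C"])
    for _ in range(300):
        a = random.choice(["0", "1"])
        b = random.choice(["0", "1"])
        w.writerow([a, b, a])  # C copies A, so A and C are strongly dependent

try:
    import pyAgrum as gum

    learner = gum.BNLearner("indep.csv")
    stat_ab, p_ab = learner.chi2("A", "B")             # expect a high p-value
    stat_ac, p_ac = learner.chi2("A", "C")             # expect a p-value near 0
    stat_cond, p_cond = learner.chi2("A", "B", ["C"])  # conditional test
    print(p_ab, p_ac, p_cond)
except ImportError:
    print("pyAgrum not installed; skipping the tests")
```

G2 has the same call shape as chi2 and returns the (statistic, p-value) tuple for the G2 test instead.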