**Looney posted on 07-02-2009 6:57 PM**

Hi All,

I am a newbie to the field of machine learning. I wish to create a model that, on the basis of about 20 different random variables (18 booleans and 2 discrete ints), can predict another boolean. I wish to train this model from a huge data set pulled out of a database, so I guess I have to use a shared-variables-based chunk-reading model.
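To make the setup concrete, here is roughly the kind of chunked training loop I have in mind, sketched in plain Python with a simple logistic-regression model rather than any particular library (the `read_chunk` function, the synthetic data, and the learning rate are all just my invention, standing in for the real database reads):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def read_chunk(n=500):
    """Stand-in for one database read: 18 boolean features + 2 small ints."""
    bools = rng.integers(0, 2, size=(n, 18)).astype(float)
    ints = rng.integers(0, 10, size=(n, 2)).astype(float)
    X = np.hstack([bools, ints])
    # Hypothetical ground truth: the boolean target depends on a few features.
    y = (2 * X[:, 0] - X[:, 1] + 0.3 * X[:, 18] - 1.5 > 0).astype(float)
    return X, y

w = np.zeros(20)           # one weight per input variable
b = 0.0
lr = 0.05
for _ in range(200):       # one gradient step per chunk pulled from the DB
    X, y = read_chunk()
    p = sigmoid(X @ w + b)
    w -= lr * (X.T @ (p - y)) / len(y)
    b -= lr * np.mean(p - y)

# Evaluate on a fresh chunk held out from training.
X_test, y_test = read_chunk()
acc = np.mean((sigmoid(X_test @ w + b) > 0.5) == y_test)
print(f"held-out accuracy: {acc:.2f}")
```

The point is only that the model never needs the whole data set in memory at once: each chunk updates the learned parameters and is then discarded.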

- Which type of model, such as a Bayes Point Machine (single- or multi-class), an HMM, or a simple Markov network, would best suit my modeling needs? In case it is a multi-class Bayes Point Machine, how many classes and variables would I need, and what is the difference between the two?
- Also, in my environment I'd like to have separate applications for training and testing the model, so I'd really like to pull out the prior probabilities and all the learned information, store them in a database, and restore them to the learned model when the testing application is instantiated. Is this possible? Which of the samples demonstrates this restoration ability (the source of the learned parameters does not matter)? Or would directly compiling the learned parameters into the shared-variable model code be the way to go?
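To make the second question concrete, here is roughly the round trip I have in mind, sketched in plain Python (the JSON layout, the weight values, and the `predict` helper are all just my invention; in reality the blob would live in a database column and the parameters would come from real training):

```python
import json
import numpy as np

# Hypothetical learned parameters produced by the training application.
w = np.array([0.4, -0.2, 0.1] + [0.0] * 15 + [0.3, -0.1])   # 20 weights
b = -1.5

# Training app: serialize the learned state to a string,
# e.g. for a TEXT/VARCHAR column in the database.
blob = json.dumps({"weights": w.tolist(), "bias": b})

# Testing app: load the string back and rebuild the predictor.
state = json.loads(blob)
w2 = np.array(state["weights"])
b2 = state["bias"]

def predict(weights, bias, x):
    """Boolean prediction from a linear score."""
    return bool(x @ weights + bias > 0)

# One example input row: 18 booleans followed by 2 ints.
x = np.array([1.0] * 18 + [3.0, 7.0])
assert predict(w, b, x) == predict(w2, b2, x)
print("restored model agrees with original")
```

The question, essentially, is whether the learned quantities in the real framework can be pulled out and pushed back in like this, or whether they only exist inside the compiled model.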

My apologies if these questions are asking the obvious, but I would highly appreciate any help, guidance, or pointers, so that while I read my "AI: A Modern Approach" and get educated, I can pay more attention to the specifics I would directly be using to build my model.

Kind Regards

Newbie