We are working with a complex application, namely a physical measurement in a lab, that has approximately 230 different input parameters, many of which are ranges or multi-valued.
The application produces a single output, which is then verified in an external (physical) process. At the end of that process each test is marked as "success" or "fail"; that is, despite the many input parameters, the output is assessed as a simple boolean.
When a test fails, the parameters are 'loosened' slightly and the test is re-run.
We have about 20,000 entries in our database, covering both "success" and "fail" outcomes, and we are considering a machine learning application to help in two areas (a rough sketch of how we picture the data follows the list):
1) Initial selection of optimum parameters
2) Suggestions for how to tune the parameters after a "fail"
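To make the setup concrete, here is a rough sketch of how we currently picture use case 1: each historical run becomes one row of parameter values plus a success/fail label, and a classifier estimates the probability of success for a candidate parameter set. The file name and column names are invented, and our range/multi-valued parameters would first have to be flattened into numeric columns.

```python
# Sketch only: assumes our ~20,000 runs have been exported to a flat numeric
# table "measurement_runs.csv" with one column per parameter plus an "outcome"
# column holding "success" or "fail".
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

df = pd.read_csv("measurement_runs.csv")          # hypothetical export of our records
X = df.drop(columns=["outcome"])                  # the ~230 input parameters
y = (df["outcome"] == "success").astype(int)      # boolean target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=0
)

clf = RandomForestClassifier(n_estimators=500, random_state=0)
clf.fit(X_train, y_train)

# A hold-out AUC would tell us whether the parameters carry enough signal
# to be useful for suggesting settings at all.
print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))
```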
Many of the input parameters are strongly correlated with one another.
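Because of that, I assume any approach would start by checking how much redundancy there actually is in the parameter set; a minimal sketch, again on the hypothetical numeric table from above, might be:

```python
# Sketch only: look for redundancy among the ~230 parameters before modelling.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = pd.read_csv("measurement_runs.csv").drop(columns=["outcome"])

# Pairs of parameters with |correlation| > 0.9 are candidates to merge or drop.
corr = X.corr().abs()
redundant = [(a, b) for a in corr.columns for b in corr.columns
             if a < b and corr.loc[a, b] > 0.9]
print(f"{len(redundant)} highly correlated parameter pairs")

# PCA gives a feel for the effective dimensionality of the input space.
pca = PCA().fit(StandardScaler().fit_transform(X))
print("Components needed for 95% of the variance:",
      (pca.explained_variance_ratio_.cumsum() < 0.95).sum() + 1)
```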
I studied computer science in the mid-90s, when the focus was mostly on expert systems and neural networks. We also have access to some free CPU hours on Microsoft Azure Machine Learning.
What type of machine learning would fit these use cases?