That said, using a neural net might be an interesting way to develop a new control scheme.
Edit: I'm reading the paper and I think that's what Ryan was talking about, although it was unclear in the way he worded it.
No, I think it's more like: you use a neural network when you don't know the fundamental governing equations. You use the network to derive its own governing equations from a given set of input data. The network not only has to use the governing equations, it also has to figure out what they are.
In most cases where modeling is too complex or too tough for engineering applications, we know the governing equations, and the governing equations and their application are, in and of themselves, too complex for computational solutions to be generated in a reasonable time. And that's when you know and understand the system, and why it's behaving as it is. If you don't know that, as a neural network won't, then you have added a HUGE mess of computation that your network has to do: it has to figure out what the problem is that it needs to solve, whereas we already know how to solve the problem.
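A toy sketch of that extra burden. Here the "governing equation" is a made-up one-coefficient decay law, dx/dt = -0.5x, and the fit has to recover the coefficient from sampled data alone; it's a deliberately tiny stand-in for what a data-driven model must do at vastly greater cost:

```python
import math

# Hypothetical toy system: pretend we do NOT know the governing
# equation dx/dt = -0.5 * x, and must recover it from data alone.
dt = 0.01
t = [i * dt for i in range(401)]
x = [math.exp(-0.5 * ti) for ti in t]   # "measured" trajectory

# Finite-difference estimate of dx/dt from the data.
dxdt = [(x[i + 1] - x[i - 1]) / (2 * dt) for i in range(1, len(x) - 1)]
xs = x[1:-1]

# Fit dx/dt = a * x by one-dimensional least squares: the simplest
# possible "discover the governing equation" step.
a = sum(xi * di for xi, di in zip(xs, dxdt)) / sum(xi * xi for xi in xs)

print(round(a, 2))   # recovers the coefficient we pretended not to know
```

When you already know the equation, that whole discovery step vanishes and you only pay for solving it; when you don't, the real version of this step (nonlinear terms, many coupled variables, noise) is where the HUGE mess of computation lands.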
I agree with all of this, with the pithy addendum that it is by their very kludginess that nets are useful at all. A lot of the criticisms of them as a toy have been fixed by Nvidia bothering to write cuDNN.
My assumption in the original post is that Gordon actually has an interesting problem on his hands in the remaining noise that he can't dial out with his pre-cut-and-compensate approach. (A feedback mechanism, which is superior to feedforward mechanisms, which tend to have divergence problems.)
Something like taking all of your input data and then looking at the accuracy error in an REML-fit model analysis could be useful for poking around to see whether anything you're ALREADY MEASURING could be assigned further error. I'm pretty sure linearity would be baked into that, and it's computationally very cheap. The attractive part about doing it with online machine learning is that you wouldn't have to fuck around with orthogonal DOE-by-hand nonsense and could just gradient descend your way to a local optimum.
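A minimal sketch of the gradient-descent half of that (not REML itself, just the online fit). The channel names and coefficients below are invented and the residual is simulated, but it shows online descent assigning the remaining error to inputs you're already measuring, one observation at a time, with no designed experiment:

```python
import random

random.seed(0)

# Hypothetical setup: two channels already being measured (say, spindle
# temp and ambient temp, both centered), and a residual error that
# secretly depends on them linearly plus noise. All values invented.
def sample():
    t_spindle = random.uniform(-1, 1)
    t_ambient = random.uniform(-1, 1)
    residual = 0.8 * t_spindle - 0.3 * t_ambient + random.gauss(0, 0.01)
    return (t_spindle, t_ambient), residual

# Online gradient descent on squared error: one cheap update per
# observation, descending toward a local optimum of the linear fit.
w = [0.0, 0.0]
lr = 0.1
for _ in range(5000):
    xs, y = sample()
    pred = sum(wi * xi for wi, xi in zip(w, xs))
    err = pred - y
    for i, xi in enumerate(xs):
        w[i] -= lr * err * xi

print([round(wi, 1) for wi in w])   # weights approach the hidden 0.8, -0.3
```

The recovered weights tell you which measured channel is carrying the assignable error; channels whose weights stay near zero aren't explaining anything.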
Ah, got it. I think I'm caught up now.
Modeling, meshing, boundary conditioning, and initializing an entire diamond turning machine, and then modeling every interaction that shows how such a machine changes size based on fractional degrees of temperature gradient, and the performance of the cooling systems... I mean, it's all technically possible. It's a solvable problem.
It's just computationally so complex it might as well be unsolvable.