1. In logistic regression, gradient descent is usually used to minimize the cost function, nudging the parameters toward better values on each iteration. Neural networks do something related (but not quite the same): back propagation computes the gradient of the cost with respect to every weight in the network, and those gradients drive the same kind of descent update on the parameters. A minimal sketch of the logistic regression case is given after this list.
2. Simple logistic regression is not well suited to large, complex problems, because the number of features (and feature combinations) you would need blows up. Neural networks handle complex nonlinear hypotheses much better, even when the feature space is huge; see the second sketch after this list for a toy nonlinear example.
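For point 1, here is a minimal sketch of logistic regression fitted by batch gradient descent, assuming NumPy; the toy data, learning rate, and iteration count are my own illustrative choices, not from the original post.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def gradient_descent(X, y, lr=0.1, iters=1000):
    """Fit logistic regression weights by batch gradient descent.

    X: (m, n) feature matrix, y: (m,) labels in {0, 1}.
    """
    m, n = X.shape
    w = np.zeros(n)
    b = 0.0
    for _ in range(iters):
        p = sigmoid(X @ w + b)          # predicted probabilities
        grad_w = X.T @ (p - y) / m      # gradient of the cross-entropy cost w.r.t. w
        grad_b = np.mean(p - y)         # gradient w.r.t. the bias term
        w -= lr * grad_w                # step toward lower cost
        b -= lr * grad_b
    return w, b

# Toy, linearly separable example (values made up for illustration).
X = np.array([[0.0], [1.0], [2.0], [3.0]])
y = np.array([0, 0, 1, 1])
w, b = gradient_descent(X, y)
print(sigmoid(X @ w + b))  # probabilities move toward 0, 0, 1, 1
```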
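For point 2, a toy sketch of back propagation on XOR, a nonlinear problem that plain logistic regression cannot fit but a single hidden layer can. The hidden-layer size, random seed, and hyperparameters are assumptions made purely for illustration.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR is not linearly separable, so logistic regression alone fails here.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(0)
W1 = rng.normal(size=(2, 4))   # input -> hidden weights (hidden size 4 is arbitrary)
b1 = np.zeros((1, 4))
W2 = rng.normal(size=(4, 1))   # hidden -> output weights
b2 = np.zeros((1, 1))

lr = 1.0
for _ in range(10000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # backward pass: propagate the cost gradient from the output back to the input layer
    d_out = p - y                          # gradient at the output (sigmoid + cross-entropy)
    d_hid = (d_out @ W2.T) * h * (1 - h)   # gradient pushed back through the hidden layer

    # gradient descent update, the same rule as in logistic regression
    W2 -= lr * h.T @ d_out / len(X)
    b2 -= lr * d_out.mean(axis=0, keepdims=True)
    W1 -= lr * X.T @ d_hid / len(X)
    b1 -= lr * d_hid.mean(axis=0, keepdims=True)

print(p.round(2))  # should approach [[0], [1], [1], [0]]
```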