ece4580:module_recognition
  - Train the SVM as you did before on the 4_9_3000 dataset with a linear kernel. Compare the results with different kernels.
  - Train the SVM on your car and face datasets with a linear kernel. Compare the results with different kernels. Which dataset improves more with different kernels? Why?
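The kernel comparison above can be sketched as follows. This is a minimal illustration assuming scikit-learn; a synthetic dataset stands in for the 4_9_3000 / car / face data, which are not reproduced here.

```python
# Sketch: comparing SVM kernels (assumes scikit-learn; synthetic data
# stands in for the course datasets).
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Synthetic binary classification problem as a placeholder dataset.
X, y = make_classification(n_samples=600, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Train one SVM per kernel and record held-out accuracy.
scores = {}
for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel, C=1.0).fit(X_tr, y_tr)
    scores[kernel] = clf.score(X_te, y_te)
print(scores)
```

Whether the non-linear kernels help depends on how linearly separable the features are, which is why the two datasets in the exercise may respond differently.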
+ | |||
+ | |||
+ | |||
+ | |||
__**Week #4: Cross Validation**__ \\
  - Besides the kernel, another important part of applying an SVM correctly is selecting good hyper-parameters.
  - If you checked the SVM GUI [[https:// |
  - Please quickly review the SVM material again: what is the C parameter here?
  - One way to select a good hyper-parameter is to apply "cross validation".
  - Please read carefully about [[https:// |
  - Now apply cross validation, using 10% of your data for validation. Try it on mnist_49_3000. Test C = 10^-2, 10^-1, 1, 10^1, 10^2; which one gives you the best performance?
  - What is the difference between K-fold and LOO (leave-one-out) cross validation?
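The C sweep above can be sketched with a grid search over 10 folds (each fold holds out ~10% of the data, matching the exercise). This assumes scikit-learn; a synthetic dataset stands in for mnist_49_3000.

```python
# Sketch: selecting C by 10-fold cross validation (assumes scikit-learn;
# synthetic data stands in for mnist_49_3000).
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# C grid from the exercise: 10^-2, 10^-1, 1, 10^1, 10^2.
grid = GridSearchCV(
    SVC(kernel="linear"),
    {"C": [1e-2, 1e-1, 1, 1e1, 1e2]},
    cv=10,  # 10 folds, so each validation split is ~10% of the data
)
grid.fit(X, y)
print(grid.best_params_, round(grid.best_score_, 3))
```

For the K-fold vs LOO question: LOO is K-fold taken to the extreme of K = n, so every fold holds out exactly one sample — nearly unbiased but expensive (n model fits) and typically higher variance than, say, 10-fold.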
---------------
ece4580/module_recognition.1491704674.txt.gz · Last modified: 2024/08/20 21:38 (external edit)