  
  
__**Week #3: Kernel Trick**__ \\
  - So far we've applied SVM to our dataset to distinguish cars from faces. We are going to learn more about SVM to see how it works.
  - If you play with the nice SVM GUI [[https://www.csie.ntu.edu.tw/~cjlin/libsvm/|here]] by generating some points in different colors, you can observe that the separating hyperplanes are not always straight lines. Why is that?
  - Train SVM as you did before on this 4_9_3000 dataset with a linear kernel. Compare the results with different kernels.
  - Train SVM on your car and face dataset with a linear kernel. Compare the results with different kernels. Which dataset improves more with different kernels? Why?
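The kernel comparison above can be sketched as follows. This is a minimal illustration, assuming scikit-learn (whose SVC wraps the libsvm library linked above); the course datasets (mnist_49_3000, car/face) are not bundled here, so a synthetic two-ring dataset stands in to make the kernel's effect visible.

```python
# Sketch: compare SVM kernels on data that is NOT linearly separable.
# Assumes scikit-learn; make_circles is a stand-in for the course datasets.
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Two concentric rings: no straight line can separate the classes,
# so the linear kernel should do poorly and the RBF kernel well.
X, y = make_circles(n_samples=600, noise=0.1, factor=0.4, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for kernel in ("linear", "poly", "rbf"):
    clf = SVC(kernel=kernel).fit(X_train, y_train)
    print(f"{kernel:7s} test accuracy = {clf.score(X_test, y_test):.3f}")
```

On a dataset that is already close to linearly separable, the gap between kernels shrinks — which is the point of the question about which dataset benefits more.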


__**Week #4: Cross Validation**__ \\
  - Besides the kernel, applying SVM correctly also requires selecting good hyper-parameters.
  - If you checked the SVM GUI [[https://www.csie.ntu.edu.tw/~cjlin/libsvm/|here]], you might have noticed that different values of C can lead to different performance.
  - Please quickly review the SVM material again, and explain what the C parameter is here.
  - One way to select good hyper-parameters is to apply "cross validation".
  - Please read carefully about [[https://www.cs.cmu.edu/~schneide/tut5/node42.html|Cross Validation]]. The idea is mainly to leave some data untouched and use it to test your selected parameter, then repeat this step using a different untouched portion of your training data.
  - Now, apply cross validation, holding out 10% of your data each time. Try it on mnist_49_3000. Test C = 10^-2, 10^-1, 1, 10^1, 10^2; which one gives you the best performance?
  - What is the difference between K-fold and LOO (leave-one-out)?
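The C-selection procedure above can be sketched as follows, again assuming scikit-learn. Holding out 10% per round corresponds to 10-fold cross validation; synthetic data stands in for mnist_49_3000, so the particular winning C here is illustrative, not the expected answer for the assignment.

```python
# Sketch: pick C for a linear-kernel SVM by 10-fold cross validation.
# Assumes scikit-learn; make_classification stands in for mnist_49_3000.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=20, random_state=0)

# cv=10 leaves 10% of the data untouched in each round, as in the task.
scores = {}
for C in [1e-2, 1e-1, 1, 1e1, 1e2]:
    scores[C] = cross_val_score(SVC(kernel="linear", C=C), X, y, cv=10).mean()
    print(f"C = {C:>6}: mean CV accuracy = {scores[C]:.3f}")

best_C = max(scores, key=scores.get)
print("best C:", best_C)
```

K-fold with K equal to the number of samples is exactly LOO: every round tests on a single held-out point, which is cheap per fold but requires as many fits as there are samples.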
 +
---------------
;#;
[[ECE4580:Modules|ECE4580 Learning Modules]]
;#;
ece4580/module_recognition.1491685291.txt.gz · Last modified: 2024/08/20 21:38 (external edit)