-------------------------
===== Module #2: Alternative Classifiers =====
The vanilla Bag-of-Words algorithm utilizes the //k nearest neighbors// (kNN) algorithm to output the final decision.
  - You have to generate two files, one for training and one for testing.
  - Use the commands you learned last week, report your SVM accuracy.
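
The two-file workflow above can be sketched in Python with scikit-learn (an assumption: the module itself uses the libSVM tools directly, and the synthetic two-class features below merely stand in for your car/face Bag-of-Words histograms):

```python
# A sketch of the two-file workflow in Python with scikit-learn
# (an assumption: the module itself uses the libSVM tools directly).
# Synthetic two-class features stand in for the car/face Bag-of-Words histograms.
import numpy as np
from sklearn.datasets import dump_svmlight_file, load_svmlight_file
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Fake "histograms": class 0 centered at 0.3, class 1 at 0.7
X = np.vstack([rng.normal(0.3, 0.1, (100, 20)),
               rng.normal(0.7, 0.1, (100, 20))])
y = np.array([0] * 100 + [1] * 100)

# One file for training, one for testing, in libSVM's sparse text format
dump_svmlight_file(X[::2], y[::2], "train.libsvm")
dump_svmlight_file(X[1::2], y[1::2], "test.libsvm")

# Train on one file, report accuracy on the other
X_tr, y_tr = load_svmlight_file("train.libsvm")
X_te, y_te = load_svmlight_file("test.libsvm")
clf = SVC(kernel="linear").fit(X_tr, y_tr)
print("test accuracy:", clf.score(X_te, y_te))
```

The same two text files can be fed directly to libSVM's own `svm-train` and `svm-predict` tools, since `dump_svmlight_file` writes the libSVM sparse format.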
__**Week #3: Kernel Trick**__ \\
  - So far we've applied SVM to our dataset to distinguish cars from faces. Now we are going to learn more about how SVM works.
  - If you played with the nice SVM GUI [[https://
  - Actually, one of the good properties of SVM is that you can apply different "kernels".
  - Please read carefully about [[http://
  - Now, figure out how to select different kernels in libSVM. Write down your answer.
  - Here we provide an easier dataset [[https://
  - You can use the following commands to visualize your data:
  - >> load mnist_49_3000;
  - >> [d,n] = size(x);
  - >> i = 1; % index of image to be visualized
  - >> imagesc(reshape(x(:,i), 28, 28)); % assumes the usual 28x28 MNIST digits
  - Train SVM as you did before on the mnist_49_3000 dataset with the linear kernel. Compare the results with different kernels.
  - Train SVM on your car and face dataset with the linear kernel. Compare the results with different kernels. Which dataset improves more with different kernels? Why?
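
To see why kernel choice matters more on some datasets than others, here is a hedged Python/scikit-learn sketch (scikit-learn is an assumption; in libSVM the kernel is chosen with the -t option, 0 = linear, 2 = RBF). The concentric-circles toy data stands in for a dataset that no straight line can separate:

```python
# Sketch: comparing a linear and an RBF kernel on data that is not
# linearly separable. scikit-learn is an assumption; in libSVM the
# kernel is chosen with the -t option (0 = linear, 2 = RBF).
from sklearn.datasets import make_circles
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

# Concentric circles: no straight line can separate the two classes
X, y = make_circles(n_samples=400, factor=0.3, noise=0.05, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

accs = {}
for kernel in ["linear", "rbf"]:
    accs[kernel] = SVC(kernel=kernel).fit(X_tr, y_tr).score(X_te, y_te)
    print(kernel, "accuracy:", accs[kernel])
```

On data that is already near-linearly separable the two kernels score similarly; on the circles the RBF kernel wins by a wide margin, which is the intuition behind the last question above.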
__**Week #4: Cross Validation**__ \\
  - Besides the kernel, another important part of applying SVM correctly is selecting good hyper-parameters.
  - If you checked the SVM GUI [[https://
  - Please quickly review the SVM material again, and explain what the C parameter is here.
  - One way to select good hyper-parameters is to apply "cross validation".
  - Please read carefully about [[https://
  - Now, apply cross validation by holding out 10% of your data. Try it on mnist_49_3000. Test C = 10^-2, 10^-1, 1, 10^1, 10^2; which one gives the best performance?
  - What is the difference between K-fold and LOO (leave-one-out) cross validation?
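
The C sweep above can be sketched as follows (again a scikit-learn sketch, which is an assumption; libSVM exposes the same idea via "svm-train -v n" for n-fold cross validation, and the synthetic data below is only a stand-in for mnist_49_3000):

```python
# Sketch: choosing C by 10-fold cross validation, so each fold holds
# out 10% of the data as in the exercise. scikit-learn stands in for
# libSVM ("svm-train -v 10"), and make_classification stands in for
# the mnist_49_3000 dataset.
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, n_informative=5,
                           flip_y=0.1, random_state=0)

best_C, best_acc = None, 0.0
for C in [1e-2, 1e-1, 1, 1e1, 1e2]:
    # Mean accuracy over the 10 held-out folds for this C
    acc = cross_val_score(SVC(kernel="linear", C=C), X, y, cv=10).mean()
    print("C = %g: mean CV accuracy %.3f" % (C, acc))
    if acc > best_acc:
        best_C, best_acc = C, acc
print("best C:", best_C)
```

Setting cv equal to the number of samples turns K-fold into LOO: every fold holds out a single example, which gives a nearly unbiased but expensive, high-variance estimate.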
---------------
;#;
[[ECE4580:
;#;