ece4580:module_recognition
-------------------------
===== Module #2: Alternative Classifiers =====
The vanilla Bag-of-Words algorithm utilizes the //k nearest neighbors// (kNN) algorithm to output the final decision.
Here we will explore support vector machines (SVM) as a means to perform the final decision.
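To make the contrast concrete, here is a minimal conceptual sketch of the two decision rules; all variable names (trainHists, trainLabels, newHist, k, w, b) are hypothetical and are not part of the course code. kNN must compare a new image against every training image, while a trained linear SVM only evaluates the learned hyperplane.

<code matlab>
% Hypothetical variables: trainHists (n x d) and trainLabels (n x 1) hold the training set,
% newHist (1 x d) is the histogram of a new image, k is the number of neighbors,
% and w (d x 1), b come from an already-trained linear SVM.

% kNN decision: distance to every training example, then a majority vote over the k nearest.
dists = sum((trainHists - repmat(newHist, size(trainHists, 1), 1)).^2, 2);
[~, order] = sort(dists);
knnLabel = mode(trainLabels(order(1:k)));

% Linear SVM decision: a single inner product against the learned hyperplane.
svmLabel = sign(newHist * w + b);
</code>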
__**Week #1: understanding SVM**__ \\
  - Go to [[https://
  - See the square in the middle? Select a color by clicking on the "
  - Here we would like to use an SVM classifier to train a model that predicts whether an incoming image is a car or a face by drawing a hyperplane in feature space, so that we no longer have to compare each new image to all of the training images (the k-nearest-neighbor approach in module #
  - Go to "
  - Extract the compressed file, open Matlab, and browse to the folder, say C:\libsvm
  - Now type the command >> mex -setup
  - After you select your compiler for MEX-files, get into the /matlab folder >> cd('
  - Compile it >> make
  - Okay, now you should have the SVM libraries on your computer! You can add the path with >> addpath('
  - Now we'd like to run a toy example from [[http://
  - >> labels = double(rand(10,
  - >> data = rand(10,5);
  - >> model = svmtrain(labels,
  - Now you should be able to understand the basic usage. Please carefully read the README file in the folder and run the example. Output the results of accuracy_l and accuracy_p to demonstrate that you have run the example yourself. It is very important that you have correctly installed libSVM; we will use libSVM for the next task! (A completed sketch of the toy example appears after this list.)
  - (optional) For anyone who is interested in SVM, I highly recommend [[https://
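If the truncated commands above are hard to follow, here is one possible completion as a minimal sketch. The option string '-s 0 -t 2 -c 1 -g 0.1' and the random 0/1 labels are illustrative assumptions, not necessarily the exact values from the linked example.

<code matlab>
labels = double(rand(10,1) > 0.5);      % 10 random binary labels (0/1), stored as doubles
data   = rand(10,5);                    % 10 instances with 5 random features each
model  = svmtrain(labels, data, '-s 0 -t 2 -c 1 -g 0.1');    % C-SVC with an RBF kernel (assumed options)
[predicted, accuracy, dec_values] = svmpredict(labels, data, model);   % evaluate on the same toy data
</code>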
__**Week #2: apply SVM to car and face dataset**__ \\
  - If you did the toy example in libSVM and understood the commands correctly, you are now ready to apply this powerful library to our previous dataset: car and face!
  - In the toy example, the data is a list of feature vectors, each paired with its true label. Now we would like to use our bag-of-words features here.
  - Generate bag-of-words features for car and face as you did in your previous tasks (collect SIFT features, run k-means, and compute the histogram for each image; we use the histogram as the feature for each image).
  - Mimicking the toy example, assign one label to each car image (and a different label to each face image). Also, for each image, use its histogram as the feature, just as in the toy example.
  - You have to generate two files, one for training and one for testing.
  - Use the commands you learned last week and report your SVM accuracy. (A minimal training/testing sketch is given after this list.)
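As a rough guide, the training/testing step might look like the sketch below. The variable names (trainHists, testHists, nCarTrain, etc.), the label convention (+1 for car, -1 for face), and the '-t 0 -c 1' options are assumptions for illustration only.

<code matlab>
% Assumed inputs: trainHists (nTrain x k) and testHists (nTest x k) are bag-of-words
% histograms, one image per row; labels use +1 for car and -1 for face (assumed convention).
trainLabels = [ones(nCarTrain, 1); -ones(nFaceTrain, 1)];
testLabels  = [ones(nCarTest, 1);  -ones(nFaceTest, 1)];

model = svmtrain(trainLabels, double(trainHists), '-t 0 -c 1');        % linear kernel, C = 1 (assumed)
[predLabels, accuracy, ~] = svmpredict(testLabels, double(testHists), model);
fprintf('SVM test accuracy: %.2f%%\n', accuracy(1));                   % accuracy(1) is percent correct
</code>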
__**Week #3: Kernel Trick**__ \\
  - So far we've applied SVM to our dataset to distinguish cars from faces. We are now going to learn more about SVM to see how it works.
  - If you played with the nice SVM GUI [[https://
  - Actually, one of the good properties of SVM is that you can apply different "
  - Please read carefully about [[http://
  - Now, figure out how you can select different kernels in libSVM. Write down your answer.
  - Here we provide an easier dataset [[https://
  - You can use the following commands to visualize your data:
  - >> load mnist_49_3000;
  - >> [d,n] = size(x);
  - >> i = 1; % index of image to be visualized
  - >> imagesc(reshape(x(:,
  - Train an SVM as you did before on this mnist_49_3000 dataset with a linear kernel. Compare the results with different kernels.
  - Train an SVM on your car and face dataset with a linear kernel. Compare the results with different kernels. Which dataset improves more with different kernels? Why? (A kernel-comparison sketch is given after this list.)
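The sketch below shows one way to finish the visualization commands and compare a linear kernel ('-t 0') against an RBF kernel ('-t 2') on mnist_49_3000. It assumes the file provides x (one d-dimensional image per column) and labels y, that the images are square, and it uses an arbitrary 2000-image training split; adjust these assumptions to the actual dataset.

<code matlab>
load mnist_49_3000;                                   % assumed to provide x (d x n) and labels y
[d, n] = size(x);
i = 1;                                                % index of the image to visualize
imagesc(reshape(x(:, i), sqrt(d), sqrt(d)));          % assumes square images
colormap gray;

y = double(y(:));                                     % labels as a double column vector
Xtrain = x(:, 1:2000)';   ytrain = y(1:2000);         % assumed split: first 2000 images for training
Xtest  = x(:, 2001:n)';   ytest  = y(2001:n);         % remaining images for testing

modelLin = svmtrain(ytrain, Xtrain, '-t 0');          % linear kernel
modelRbf = svmtrain(ytrain, Xtrain, '-t 2');          % RBF kernel with default parameters
[~, accLin, ~] = svmpredict(ytest, Xtest, modelLin);
[~, accRbf, ~] = svmpredict(ytest, Xtest, modelRbf);
fprintf('linear: %.2f%%   rbf: %.2f%%\n', accLin(1), accRbf(1));
</code>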
__**Week #4: Cross Validation**__ \\
  - Besides the kernel, another important part of applying SVM correctly is selecting good hyper-parameters.
  - If you checked the SVM GUI [[https://
  - Please quickly review the SVM material again and explain: what is the C parameter here?
  - One way to select a good hyper-parameter is to apply "cross validation"
  - Please read carefully about [[https://
  - Now, apply cross validation, using 10% of your data for validation. Try it on mnist_49_3000. Test C = 10^-2, 10^-1, 1, 10^1, 10^2; which one gives you the best performance? (A cross-validation sketch is given after this list.)
  - What is the difference between K-fold and LOO (leave-one-out) cross validation?
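Here is a minimal sketch of the C search using libSVM's built-in cross validation: when the '-v n' option is passed, svmtrain returns the n-fold cross-validation accuracy instead of a model. Ten folds correspond to holding out 10% of the data in each round; the linear kernel and the variable names ytrain/Xtrain (reusing the split from the previous sketch) are assumptions.

<code matlab>
Cvals = 10.^(-2:2);                                   % C = 10^-2, 10^-1, 1, 10^1, 10^2
cvAcc = zeros(size(Cvals));
for j = 1:numel(Cvals)
    opts = sprintf('-t 0 -c %g -v 10', Cvals(j));     % linear kernel (assumed), 10-fold CV
    cvAcc(j) = svmtrain(ytrain, Xtrain, opts);        % with '-v', the return value is the CV accuracy
end
[bestAcc, bestIdx] = max(cvAcc);
fprintf('Best C = %g (CV accuracy %.2f%%)\n', Cvals(bestIdx), bestAcc);
</code>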
---------------
;#;
[[ECE4580:
;#;