ece4580:module_classification
====== Classification ======
Classification involves determining, for a given input, which of a set of known categories it belongs to.

This set of learning modules goes a little backwards relative to how our understanding of classification has evolved. It starts with an unsupervised method for performing feature descriptor estimation, where machine learning tools are used to come up with the identifying features. The later methods explored are somewhat more engineered solutions that have been shown to work well for some problems.
----------
===== Module #1: Digit Classification Using Stacked Autoencoders =====
/*
Andrew Ng
*/
The classical artificial intelligence/machine learning example of classification is digit classification: given an image of a handwritten digit, decide which of the ten digits (0-9) it depicts.

This set of activities comes courtesy of Prof. Andrew Ng at Stanford University. We will be using his [[http://ufldl.stanford.edu/|UFLDL tutorial]] materials.
__**Week #1: Sparse Autoencoder**__ \\
Implement the 'Sparse Autoencoder' exercise from the tutorial, which trains a single-layer autoencoder to learn identifying features without supervision.

__**Week #2: Data Pre-Processing and Classification**__ \\
Complete the 'Vectorized Implementation' exercise, then work through the tutorial's data pre-processing and classification exercises on the MNIST digit data.

__**Week #3: Feature Learning and Classification**__ \\
Complete the tutorial's feature learning exercise, which uses the features learned by the autoencoder as the input representation for a digit classifier.

__**Week #4: Deep Neural Networks**__ \\
Now, we are going to use the autoencoder repeatedly to create a deep neural network: each layer's learned features become the input to the next autoencoder, yielding a stacked autoencoder that is then trained for digit classification.
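The single-autoencoder building block that gets stacked can be sketched compactly. This is not the course's Matlab code: it is a hypothetical NumPy illustration, trained on made-up toy data, and it omits the sparsity penalty the exercise uses. Stacking amounts to feeding one trained autoencoder's hidden activations into the next autoencoder as its input.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_autoencoder(X, n_hidden, lr=0.5, epochs=2000, seed=0):
    """One-hidden-layer autoencoder x -> h -> x_hat, trained by plain batch
    gradient descent on the squared reconstruction error."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W1 = rng.normal(0.0, 0.1, (d, n_hidden)); b1 = np.zeros(n_hidden)
    W2 = rng.normal(0.0, 0.1, (n_hidden, d)); b2 = np.zeros(d)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)                    # encode
        Xhat = sigmoid(H @ W2 + b2)                 # decode
        dZ2 = (Xhat - X) * Xhat * (1.0 - Xhat)      # output-layer delta
        dZ1 = (dZ2 @ W2.T) * H * (1.0 - H)          # hidden-layer delta
        W2 -= lr * (H.T @ dZ2) / n;  b2 -= lr * dZ2.mean(axis=0)
        W1 -= lr * (X.T @ dZ1) / n;  b1 -= lr * dZ1.mean(axis=0)
    return W1, b1, W2, b2

def reconstruct(X, params):
    W1, b1, W2, b2 = params
    return sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2)

# toy "images": 50 samples of 6 values in (0, 1)
rng = np.random.default_rng(1)
X = rng.uniform(0.2, 0.8, (50, 6))
trained = train_autoencoder(X, n_hidden=10)
untrained = train_autoencoder(X, n_hidden=10, epochs=0)
mse_trained = ((reconstruct(X, trained) - X) ** 2).mean()
mse_untrained = ((reconstruct(X, untrained) - X) ** 2).mean()
```

After training, the hidden activations `sigmoid(X @ W1 + b1)` are the learned feature descriptors; a second autoencoder trained on them gives the next layer of the stack.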
__**Week #5: Character Classification with Deep Neural Networks**__ \\
Now you have a very nice classification system for digits. Can we classify other, similar data, such as characters, using the same approach?
Take your deep-network feature learning and classification procedure and train it with characters instead of digits. For characters, we would like to use [[https://
- Use the feature space you already learned from the digits as your neural network's feature descriptor function.
- Start over from scratch, and learn a feature space for characters with autoencoders.
- Use the new character feature space to train a regression classifier. Obtain the accuracy and confusion matrix.
- Report your results and a comparison of the two approaches.
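The confusion matrix requested above simply counts how often each true class is predicted as each class; accuracy is the fraction of counts on the diagonal. A minimal NumPy sketch with made-up toy labels:

```python
import numpy as np

def confusion_matrix(y_true, y_pred, n_classes):
    """Rows = true class, columns = predicted class."""
    M = np.zeros((n_classes, n_classes), dtype=int)
    for t, p in zip(y_true, y_pred):
        M[t, p] += 1
    return M

# toy labels for a 3-class problem
y_true = np.array([0, 0, 1, 1, 2, 2])
y_pred = np.array([0, 1, 1, 1, 2, 0])
M = confusion_matrix(y_true, y_pred, 3)
accuracy = np.trace(M) / M.sum()   # correct predictions / all predictions
```

Off-diagonal entries show which classes get confused with which, which is exactly the comparison the report asks for.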
----------------
===== Module #2: Engineered Features: Bag-of-Words =====
Before people got into the automatic learning of feature spaces, it was common to hand-craft a feature space, or to come up with a mechanism for creating feature spaces that generalized well.
/*
(2) bag-of-words classifier: http://
They were short courses on ICCV 2005.
*/
/*
Other related material, maybe more classic.
*/

__**Week #1: Clustering to Define Words**__ \\
- Study [[https://
- Download (or clone) the clustering skeleton code [[https://
- Implement the k-means clustering algorithm working in RGB space by following the algorithmic steps. You are welcome to implement it from scratch, without the skeleton code.
- Test your algorithm on segmenting the image //
- Try different random initializations and show the corresponding results.
- Comment on your different segmentation results.

//Matlab Notes:// Matlab has several functions that can assist with the calculations so that you do not have to process the data in for loops.
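The algorithmic steps (assign each pixel to its nearest center, then move each center to the mean of its assigned pixels) can be sketched language-agnostically. The course work is in Matlab; this is a hypothetical NumPy version operating on RGB values, using a made-up two-blob "image" in place of real image data:

```python
import numpy as np

def kmeans(pixels, k, iters=20, seed=0):
    """Basic k-means on an (N, 3) array of RGB pixel values.
    Returns (labels, centers)."""
    rng = np.random.default_rng(seed)
    # initialize centers at k randomly chosen pixels
    centers = pixels[rng.choice(len(pixels), k, replace=False)].astype(float)
    for _ in range(iters):
        # assign each pixel to its nearest center (squared distance)
        d2 = ((pixels[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        # move each center to the mean of its assigned pixels
        for j in range(k):
            if np.any(labels == j):
                centers[j] = pixels[labels == j].mean(axis=0)
    return labels, centers

# toy "image": two well-separated RGB blobs (dark and light pixels)
rng = np.random.default_rng(1)
dark = rng.uniform(0.0, 0.2, (100, 3))
light = rng.uniform(0.8, 1.0, (100, 3))
pixels = np.vstack([dark, light])
labels, centers = kmeans(pixels, k=2)
```

Changing the `seed` changes the random initialization, which is exactly the experiment the exercise asks you to run on a real image.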
+ | |||
__**Week #2: Object Recognition**__\\

- Study the [[https://
- We begin by implementing a simple but powerful recognition system to classify //faces// and //cars//.
- Check [[https://
- In our implementation,
- Now, use the first 40 images in both categories for training.
- Extract SIFT features from each image.
- Derive k codewords with the k-means clustering from week 1.
- Compute the histogram of codewords using [[https://
- Use the remaining 50 images in both categories to test your implementation.
- Report the accuracy and computation time for different values of k.
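Once the SIFT descriptors and the k-means codewords are in hand, the per-image bag-of-words histogram step can be sketched as follows. This is a hypothetical NumPy illustration with made-up 2-D toy descriptors and codewords, not the course's Matlab code:

```python
import numpy as np

def bow_histogram(descriptors, codewords):
    """Quantize local descriptors (N, d) against k codewords (k, d) and
    return a normalized k-bin histogram: the image's bag-of-words."""
    # squared distance from every descriptor to every codeword
    d2 = ((descriptors[:, None, :] - codewords[None, :, :]) ** 2).sum(axis=2)
    assignments = d2.argmin(axis=1)   # nearest codeword per descriptor
    hist = np.bincount(assignments, minlength=len(codewords)).astype(float)
    return hist / hist.sum()          # normalize so images of different sizes compare

# toy codewords and descriptors (real ones would be 128-D SIFT vectors)
codewords = np.array([[0.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
descriptors = np.array([[0.1, 0.0], [0.9, 1.1], [1.0, 0.9], [0.1, 0.1]])
h = bow_histogram(descriptors, codewords)
```

Each training and test image becomes one such k-dimensional histogram, which is the feature vector the classifier is trained on.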
__**Week #3: Spatial Pyramid Matching (SPM)**__\\
Usually objects have different properties across spatial scales, even though they may appear similar at any one given scale.
- Study [[https://
- We will implement a simplified version of SPM based on your week 2 system.
- First, for each training image, divide it equally into a 2 × 2 grid of spatial bins.
- Second, for each of the 4 bins, extract the SIFT features and compute the histograms of codewords as in week 2.
- Third, concatenate the 4 histogram vectors in a fixed order. (Hint: the resulting vector has 4k dimensions.)
- Fourth, concatenate the whole-image vector from week 2 with this vector (weight both by 0.5 before concatenating).
- Finally, use this 5k-dimensional representation to re-run the training and testing.
- Compare the results from week 3 and week 2. Explain what you observe.
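The concatenation steps above can be sketched compactly. This is a hypothetical NumPy illustration that assumes the four per-quadrant histograms and the whole-image histogram (each of length k) have already been computed:

```python
import numpy as np

def spm_feature(quadrant_hists, whole_hist):
    """Simplified 2-level spatial pyramid: concatenate the four quadrant
    histograms (each length k) with the whole-image histogram, weighting
    both levels by 0.5. The result has length 5k."""
    level1 = 0.5 * np.concatenate(quadrant_hists)   # 4k dimensions
    level0 = 0.5 * whole_hist                       # k dimensions
    return np.concatenate([level1, level0])

# toy histograms with k = 3 codewords
k = 3
quads = [np.full(k, 0.2), np.full(k, 0.2), np.full(k, 0.2), np.full(k, 0.4)]
whole = np.full(k, 1.0 / 3.0)
f = spm_feature(quads, whole)
```

Training and testing then proceed exactly as in week 2, just with this 5k-dimensional vector in place of the single k-bin histogram.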
-----------------
ece4580/module_classification.1485256173.txt.gz · Last modified: 2024/08/20 21:38 (external edit)