What should I do this week?

Post date: Sep 16, 2011 11:09:23 PM

Umm...

  1. When the evaluation of UCM is done, post the results on the CNEL website
    1. Write to my team, cc Hagai
    2. Put the F-measure in the table
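The F-measure for that table is just the harmonic mean of boundary precision and recall; a minimal sketch (the precision/recall numbers below are hypothetical placeholders, not our results):

```python
def f_measure(precision, recall):
    """Harmonic mean of precision and recall (the usual boundary F-measure)."""
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# hypothetical numbers, just to show the shape of a table entry
print(round(f_measure(0.73, 0.69), 2))  # 0.71
```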
  2. Analyze the superpixel boundaries
    1. Plot the cumulative contour length from the coarsest to the finest scale, and expect a jump on the curve.
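A quick sketch of the quantity behind that plot, assuming each scale's contour is a binary boundary map (the maps below are random placeholders; in practice they would come from the UCM hierarchy):

```python
import numpy as np

def cumulative_contour_length(boundary_maps):
    """Boundary-pixel count accumulated from coarsest to finest scale.

    A jump between consecutive scales marks where many new contour
    pixels enter the hierarchy.
    """
    lengths = [np.count_nonzero(b) for b in boundary_maps]
    return np.cumsum(lengths)

# random placeholder boundary maps, ordered coarsest to finest
rng = np.random.default_rng(0)
maps = [rng.random((64, 64)) < p for p in (0.01, 0.03, 0.10)]
curve = cumulative_contour_length(maps)
print(curve)  # non-decreasing; this curve is what gets plotted
```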
  3. Run PEDT from Amos
    1. [The code cannot be run!!!]
    2. If we cannot run the code, we might want to use the results included with the dataset to compare against those from our work.
    3. Ask P'Pew to run the code?
  4. Or run IR from Sinisa [no file is available]
  5. Selecting the number of components is extremely important to the result! So, we gotta pick a good method.
    1. GMM-BIC <-- I'm experimenting with this, but need to tune the alpha for the dataset
      1. Conjecture: the good alpha depends on the number of samples and the dimensionality? For example, given the same dataset, if the feature dimensionality or the number of samples change, would the optimal alpha change accordingly?
      2. It might be that, in high dimension, both BIC and Figueiredo tend to over-segment the data!!! How about doing the model selection in a lower dimension, like 3-D? Yes, that seems to be true: at 3-D, BIC-alpha1 works fine and BIC-alpha7 under-segments. However, when D = 10, even BIC-alpha7 over-segments!!! This hints that model selection may be prone to over-segmentation in high dimension.
    2. Figueiredo GMM <-- the code is NOT robust, so I don't use it!!
    3. Lossy GMM from Yang
    4. k-nn random walk
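For 5.1, a minimal sketch of alpha-weighted BIC model selection using scikit-learn (an assumption: our own code may weight the penalty differently; the toy data and alpha values here are illustrative only):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def select_k_bic(X, k_max=10, alpha=1.0, seed=0):
    """Pick the number of GMM components by an alpha-weighted BIC.

    alpha > 1 penalizes model size more heavily (a guard against the
    over-segmentation seen in high dimension); alpha = 1 is plain BIC.
    """
    n = len(X)
    best_k, best_score = 1, np.inf
    for k in range(1, k_max + 1):
        gm = GaussianMixture(n_components=k, random_state=seed).fit(X)
        log_lik = gm.score(X) * n               # total log-likelihood
        penalty = gm.bic(X) + 2 * log_lik       # = (#params) * log(n)
        score = -2 * log_lik + alpha * penalty  # alpha-weighted BIC
        if score < best_score:
            best_k, best_score = k, score
    return best_k

# toy 3-cluster data in 3-D, a stand-in for superpixel features
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(c, 0.2, size=(100, 3)) for c in (0.0, 3.0, 6.0)])
print(select_k_bic(X, alpha=1.0))  # expect 3 on this well-separated toy data
```

Raising alpha should only shrink (never grow) the chosen number of components, which is the knob the tuning in 5.1 is about.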
  6. Dimensionality reduction [Now, we are not using PCA at all, since sLuv is sufficient]
    1. PCA10
    2. SNE10. Note that tSNE is for visualization!!!
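PCA10 in 6.1 just projects the features onto the top 10 principal components; a minimal sketch with scikit-learn (the feature matrix here is a random placeholder):

```python
import numpy as np
from sklearn.decomposition import PCA

# placeholder feature matrix: 500 superpixels x 30-D features
rng = np.random.default_rng(0)
feats = rng.random((500, 30))

# PCA10: keep the 10 directions of largest variance
pca = PCA(n_components=10)
feats10 = pca.fit_transform(feats)
print(feats10.shape)  # (500, 10)
```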
  7. Test the MWIS from Sinisa, and run MWIS on the contours produced by DDT or superGMM.
  8. Use UCM as the superpixel for DDT --> UCM-DDT ***** [Done, it works very well]. Right now, here is what we should do:
    1. Try DDT+UCM with other settings, say N_sup = 5, 10, 20, 60, 240. [The combination works fine]
    2. In fact, the number at each level can be set using the importance of each segment measured by D_KL
    3. Evaluate DDT+UCM vs. the UCM used in the DDT; this way we can say that DDT does improve UCM. Unfortunately, the equivalent UCM (F = 0.71) outperforms both DDT and UCM2!!!
    4. And here we found that the BSDS300 evaluation seems to favor noisy segmentations and penalize sparse contours!!!
  9. When superKmean_allfeat is done --> evaluate with Yang's code everything in "/home/student1/MATLABcodes/Collection_Algorithms_results/BSDS300_data/superGMM_superKmean"
    1. Now we have a table computed for each method, so we can indicate which algorithm/scale is best
    2. Find the optimal scale of the result (#classes and #superpixels) --> take that as the optimal scale
  10. When evaluating with Yang's criteria, we should do a majority-vote contour of superGMM, DDT, and GMiND
    1. One problem is that our superpixels do not overlap 100%, so we may need the UCM first! [UCM has been used!]
    2. Once we have overlapping superpixels, we do the majority-vote contour
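A minimal sketch of that majority vote, assuming the methods' contour maps already live on one shared pixel grid (which is exactly why the common UCM base matters):

```python
import numpy as np

def majority_vote_contour(contour_maps):
    """Keep a boundary pixel if more than half of the methods mark it.

    Assumes all maps share one pixel grid, i.e. the superpixels
    overlap exactly (hence using UCM as the common base).
    """
    stack = np.stack([m.astype(bool) for m in contour_maps])
    return stack.sum(axis=0) > stack.shape[0] / 2

# three tiny placeholder contour maps standing in for superGMM, DDT, GMiND
a = np.array([[1, 0], [1, 1]])
b = np.array([[1, 0], [0, 1]])
c = np.array([[0, 0], [1, 1]])
print(majority_vote_contour([a, b, c]).astype(int))
# [[1 0]
#  [1 1]]
```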
  11. From superGMM, find a good combination from Yang's results --> take those as the optimal weights to combine the multi-scale contours --> test the result with the BSDS300 eval code.
  12. Pick the best feature from the superGMM framework [we are doing it right now]
  13. Explore the feature extraction codes in BSDS
  14. Put the PCA-histogram features into UCM-DDT