[1] S. J. Pan, Q. Yang. A survey on transfer learning. IEEE Transactions on Knowledge and Data Engineering, vol. 22, no. 10, pp. 1345-1359, 2010.
[2] M. R. Rashedur, R. M. Fazle. Using and comparing different decision tree classification techniques for mining ICDDR,B Hospital Surveillance data. Expert Systems with Applications, vol. 38, no. 9, pp. 11421-11436, 2011.
[3] M. Wasikowski, X. W. Chen. Combating the small sample class imbalance problem using feature selection. IEEE Transactions on Knowledge and Data Engineering, vol. 22, no. 10, pp. 1388-1400, 2010.
[4] Q. B. Song, J. J. Ni, G. T. Wang. A fast clustering-based feature subset selection algorithm for high-dimensional data. IEEE Transactions on Knowledge and Data Engineering, vol. 25, no. 1, pp. 1-14, 2013.
[5] A. J. Ferreira, M. A. T. Figueiredo. Efficient feature selection filters for high-dimensional data. Pattern Recognition Letters, vol. 33, no. 13, pp. 1794-1804, 2012.
[6] J. Wu, L. Chen, Y. P. Feng, Z. B. Zheng, M. C. Zhou, Z. H. Wu. Predicting quality of service for selection by neighborhood-based collaborative filtering. IEEE Transactions on Systems, Man, and Cybernetics: Systems, vol. 43, no. 2, pp. 428-439, 2012.
[7] C. P. Hou, F. P. Nie, X. Li, D. Yi, Y. Wu. Joint embedding learning and sparse regression: A framework for unsupervised feature selection. IEEE Transactions on Cybernetics, vol. 44, no. 6, pp. 793-804, 2014.
[8] P. Bermejo, L. de la Ossa, J. A. Gámez, J. M. Puerta. Fast wrapper feature subset selection in high-dimensional datasets by means of filter re-ranking. Knowledge-Based Systems, vol. 25, no. 1, pp. 35-44, 2012.
[9] S. Atulji, G. Shameek, V. K. Jayaraman. Hybrid biogeography based simultaneous feature selection and MHC class I peptide binding prediction using support vector machines and random forests. Journal of Immunological Methods, vol. 387, no. 1-2, pp. 284-292, 2013.
[10] H. L. Wei, S. Billings. Feature subset selection and ranking for data dimensionality reduction. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 29, no. 1, pp. 162-166, 2007.
[11] N. X. Vinh, J. Bailey. Comments on supervised feature selection by clustering using conditional mutual information-based distances. Pattern Recognition, vol. 46, no. 4, pp. 1220-1225, 2013.
[12] I. A. Gheyas, L. S. Smith. Feature subset selection in large dimensionality domains. Pattern Recognition, vol. 43, no. 1, pp. 5-13, 2010.
[13] M. Hall. Correlation-based Feature Selection for Machine Learning, Ph.D. dissertation, The University of Waikato, New Zealand, 1999.
[14] L. Yu, H. Liu. Efficient feature selection via analysis of relevance and redundancy. Journal of Machine Learning Research, vol. 5, pp. 1205-1224, 2004.
[15] M. Dash, H. Liu, H. Motoda. Consistency based feature selection. In Proceedings of the 4th Pacific-Asia Conference on Knowledge Discovery and Data Mining, Kyoto, Japan, pp. 98-109, 2000.
[16] H. Peng, F. Long, C. Ding. Feature selection based on mutual information: Criteria of max-dependency, max-relevance, and min-redundancy. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 27, no. 8, pp. 1226-1238, 2005.
[17] H. Uğuz. A two-stage feature selection method for text categorization by using information gain, principal component analysis and genetic algorithm. Knowledge-Based Systems, vol. 24, no. 7, pp. 1024-1032, 2011.
[18] M. Robnik-Šikonja, I. Kononenko. Theoretical and empirical analysis of ReliefF and RReliefF. Machine Learning, vol. 53, no. 1-2, pp. 23-69, 2003.
[19] Y. Sun, S. Todorovic, S. Goodison. Local-learning-based feature selection for high-dimensional data analysis. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 32, no. 9, pp. 1610-1626, 2010.
[20] P. Wang, C. Sanin, E. Szczerbicki. Prediction based on integration of decisional DNA and a feature selection algorithm RELIEF-F. Cybernetics and Systems, vol. 44, no. 3, pp. 173-183, 2013.
[21] H. W. Liu, J. G. Sun, L. Liu, H. J. Zhang. Feature selection with dynamic mutual information. Pattern Recognition, vol. 42, no. 7, pp. 1330-1339, 2009.
[22] M. C. Lee. Using support vector machine with a hybrid feature selection method to the stock trend prediction. Expert Systems with Applications, vol. 36, no. 8, pp. 10896-10904, 2009.
[23] P. Mitra, C. A. Murthy, S. K. Pal. Unsupervised feature selection using feature similarity. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 3, pp. 301-312, 2002.
[24] J. Handl, J. Knowles. Feature subset selection in unsupervised learning via multiobjective optimization. International Journal of Computational Intelligence Research, vol. 2, no. 3, pp. 217-238, 2006.
[25] H. Liu, L. Yu. Toward integrating feature selection algorithms for classification and clustering. IEEE Transactions on Knowledge and Data Engineering, vol. 17, no. 4, pp. 491-502, 2005.
[26] S. García, J. Luengo, J. A. Sáez, V. López, F. Herrera. A survey of discretization techniques: Taxonomy and empirical analysis in supervised learning. IEEE Transactions on Knowledge and Data Engineering, vol. 25, no. 4, pp. 734-750, 2013.
[27] S. A. A. Balamurugan, R. Rajaram. Effective and efficient feature selection for large-scale data using Bayes' theorem. International Journal of Automation and Computing, vol. 6, no. 1, pp. 62-71, 2009.
[28] J. A. Mangai, V. S. Kumar, S. A. alias Balamurugan. A novel feature selection framework for automatic web page classification. International Journal of Automation and Computing, vol. 9, no. 4, pp. 442-448, 2012.
[29] H. J. Huang, C. N. Hsu. Bayesian classification for data from the same unknown class. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics, vol. 32, no. 2, pp. 137-145, 2002.
[30] S. Ruggieri. Efficient C4.5. IEEE Transactions on Knowledge and Data Engineering, vol. 14, no. 2, pp. 438-444, 2002.
[31] K. Polat, S. Güneş. A novel hybrid intelligent method based on C4.5 decision tree classifier and one-against-all approach for multi-class classification problems. Expert Systems with Applications, vol. 36, no. 2, pp. 1587-1592, 2009.
[32] W. W. Cheng, E. Hüllermeier. Combining instance-based learning and logistic regression for multilabel classification. Machine Learning, vol. 76, no. 3, pp. 211-225, 2009.
[33] H. J. Ho, S. Pyne, T. I. Lin. Maximum likelihood inference for mixtures of skew Student-t-normal distributions through practical EM-type algorithms. Statistics and Computing, vol. 22, no. 1, pp. 287-299, 2012.
[34] M. H. C. Law, M. A. T. Figueiredo, A. K. Jain. Simultaneous feature selection and clustering using mixture models. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 26, no. 9, pp. 1154-1166, 2004.
[35] T. W. Lee, M. S. Lewicki, T. J. Sejnowski. ICA mixture models for unsupervised classification of non-Gaussian classes and automatic context switching in blind signal separation. IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 22, no. 10, pp. 1078-1089, 2000.
[36] M. Hall, E. Frank, G. Holmes, B. Pfahringer, P. Reutemann, I. H. Witten. The WEKA data mining software: An update. ACM SIGKDD Explorations Newsletter, vol. 11, no. 1, pp. 10-18, 2009.