In this paper, we propose a new methodology for extracting and formalizing the topics under discussion at a specific point in time, using a set of keywords automatically extracted from online news articles. We first extract keywords with TF-IDF, selected through a series of comparative experiments on various statistical weighting schemes that measure the importance of individual words in a large collection of texts. To compute the semantic relations between the extracted keywords effectively, we construct word embedding vectors from a separately collected corpus of about 1,000,000 news articles. Each extracted keyword is thereby represented as a numerical vector, and the vectors are clustered with the K-means algorithm. An in-depth qualitative analysis of the resulting keyword clusters shows that most clusters are semantically coherent enough to serve as well-defined topics to which labels can easily be assigned.