IJSRP, Volume 4, Issue 1, January 2014 Edition [ISSN 2250-3153]
Suresh Kumar, Mrs. D. Saravanapriya
Abstract:
Text sequences are ubiquitous, and multiple text sequences are often related to each other by sharing common topics. The interactions among these sequences provide additional information for deriving more meaningful topics. Discovering valuable knowledge from text sequences involves extracting topics that carry both semantic and temporal information. Existing methods rely on the fundamental assumption that different sequences are always synchronous in time; in practice, however, documents from different sequences covering the same topic may carry different time stamps, and there is no guarantee that articles on the same topic are indexed by the same time stamps. The key idea of this work is a generative topic model that exploits the correlation between the semantic and temporal information in the sequences. The model extracts a set of common topics from the given sequences while adjusting their original time stamps: it alternates between topic extraction and time synchronization to optimize a unified objective function, and the proposed method is guaranteed to reach a local optimum.
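The alternating optimization described above can be illustrated with a minimal sketch. The code below is not the authors' model; it assumes a simplified setting in which each document is a bag-of-words vector, each "topic" is a per-time-bin word distribution, and the unified objective is the total log-likelihood of documents under their assigned bins. All function and variable names (fit_topics, synchronize, alternating_fit, doc_word, etc.) are hypothetical.

```python
import numpy as np

def fit_topics(doc_word, time_stamps, n_bins, alpha=1e-2):
    """Topic-extraction step: estimate a smoothed word distribution
    per time bin from the documents currently assigned to that bin."""
    n_words = doc_word.shape[1]
    topics = np.full((n_bins, n_words), alpha)
    for d, t in enumerate(time_stamps):
        topics[t] += doc_word[d]
    return topics / topics.sum(axis=1, keepdims=True)

def synchronize(doc_word, topics):
    """Time-synchronization step: reassign each document to the time bin
    whose word distribution gives it the highest log-likelihood."""
    scores = doc_word @ np.log(topics).T    # (n_docs, n_bins)
    return scores.argmax(axis=1)

def objective(doc_word, topics, time_stamps):
    """Unified objective: total log-likelihood of documents
    under their assigned time bins."""
    log_topics = np.log(topics)
    return sum(doc_word[d] @ log_topics[t] for d, t in enumerate(time_stamps))

def alternating_fit(doc_word, init_stamps, n_bins, max_iter=50):
    """Coordinate ascent: alternate topic extraction and time
    synchronization; neither step decreases the objective, so the
    loop converges to a local optimum."""
    stamps = np.array(init_stamps)
    prev = -np.inf
    for _ in range(max_iter):
        topics = fit_topics(doc_word, stamps, n_bins)
        stamps = synchronize(doc_word, topics)
        cur = objective(doc_word, topics, stamps)
        if cur <= prev + 1e-9:
            break
        prev = cur
    return topics, stamps
```

Each iteration holds one block of variables fixed while optimizing the other, which is why the guarantee in the abstract is a local optimum rather than a global one.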