This early career project aims to build an automated tool that can plug into a full-text bibliographic database, extract the citation statements about a cited article, separate substantial citations from perfunctory ones, and categorize substantial citation opinions by their purposes (e.g., comparison, critique), subjects (e.g., methods, results), and tones (e.g., positive, negative, neutral). This Citation Opinion Retrieval and Analysis tool, abbreviated as CORA, will save librarians and researchers a significant amount of time in finding the most useful comments among a large number of citations. CORA will also provide a new, qualitative approach for assessing research impact, and it will help monitor the quality of scientific publications by making it easier to identify citation bias and inaccurate citations in the re-organized citations.
About Bei Yu
Bei Yu is an Associate Professor at the School of Information Studies at Syracuse University. Her research area is applied natural language processing, especially sentiment and opinion analysis.
Before joining Syracuse, she was a postdoctoral researcher at the Kellogg School of Management at Northwestern University. Dr. Yu earned her PhD from the Graduate School of Library and Information Science at the University of Illinois at Urbana-Champaign. She holds both BS and MS degrees in computer science.