
Reading papers (5)

Continual Learning for Generative Retrieval over Dynamic Corpora  Task: So far, these tasks have assumed a static document collection. In many practical scenarios, however, document collections are dynamic, where new documents are continuously added to the corpus. Contributions: We put forward a novel Continual-LEarner for generatiVE Retrieval (CLEVER) model and make two major contributions. To encode new documents into docids with low computational cost, we presen.. 2023. 9. 7.
G-Eval  Motivation: The quality of texts generated by natural language generation (NLG) systems is hard to measure automatically. Conventional reference-based metrics, such as BLEU and ROUGE, have been shown to have relatively low correlation with human judgments, especially for tasks that require creativity and diversity. G-EVAL: In this paper, we propose G-EVAL, a framework of using LLMs with chain-of-thou.. 2023. 8. 16.
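One concrete idea from G-Eval is to score with the expectation over the LLM's probabilities for each rating token, rather than taking only the single most likely rating. A minimal sketch, assuming a 1-5 rating scale and a hypothetical `score_probs` dict of token probabilities obtained from an LLM (how to obtain them is API-specific and not shown here):

```python
def weighted_score(score_probs):
    """G-Eval-style final score: the expectation over the probabilities
    the LLM assigns to each rating token (here, ratings 1-5).
    Normalizes in case the probabilities do not sum to exactly 1."""
    total = sum(score_probs.values())
    return sum(s * p for s, p in score_probs.items()) / total

# Hypothetical probabilities an LLM might assign to each rating token.
probs = {1: 0.05, 2: 0.10, 3: 0.20, 4: 0.40, 5: 0.25}
print(weighted_score(probs))  # 3.7
```

This probability-weighted summation yields finer-grained, more continuous scores than the raw argmax rating, which is one reason G-Eval correlates better with human judgments.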
[Paper Review] SEAL: Autoregressive Search Engines: Generating Substrings as Document In knowledge-intensive tasks such as ODQA, autoregressive models are the de facto standard for the reader component, but despite several attempts they have not yet taken hold in retrieval systems. This work proposes generative retrieval that combines an autoregressive model with an FM-Index, and achieves SoTA on several datasets from KILT. Paper: https://arxiv.org/pdf/2204.10628.pdf Abstract A knowledge-intensive language task requires, over a given corpus, the correct answer together with its supporting eviden.. 2023. 3. 26.
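SEAL's core constraint is that the model may only generate n-grams that actually occur in the corpus; the FM-Index is what makes checking this cheap at every decoding step. A toy sketch of that idea, with a naive linear scan standing in for the FM-Index (function names and the tiny corpus are illustrative, not from the paper's code):

```python
def allowed_next_tokens(prefix, corpus, vocab):
    """SEAL-style constrained decoding (toy version): a token is allowed
    only if prefix + token still occurs verbatim somewhere in the corpus.
    A real FM-Index answers this query in near-constant time per token;
    here we simply scan every document."""
    return {t for t in vocab if any(prefix + t in doc for doc in corpus)}

def retrieve(ngram, corpus):
    """Map a fully generated n-gram back to the documents containing it."""
    return [i for i, doc in enumerate(corpus) if ngram in doc]

corpus = ["the cat sat on the mat", "the dog ran home"]
print(allowed_next_tokens("the c", corpus, ["at", "og", "xx"]))  # {'at'}
print(retrieve("cat sat", corpus))  # [0]
```

The point of the sketch: because every generated substring is guaranteed to exist in the corpus, decoding can never hallucinate an identifier, and the same index that constrains generation also performs the final document lookup.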
[Paper Summary] SOM-DST: Efficient Dialogue State Tracking by Selectively Overwriting Memory SOM-DST OVERVIEW Existing approach: Traditional neural DST approaches assume that all candidate slot-value pairs are given in advance, i.e., they perform predefined ontology-based DST. Problems: it is often difficult to obtain the ontology in advance, especially in a real scenario; predefined ontology-based DST cannot handle previously unseen slot values; the approach does not scale since it has to go over al.. 2022. 2. 4.
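The "selective overwriting" in SOM-DST means the model predicts one operation per slot (CARRYOVER, DELETE, DONTCARE, UPDATE) and generates new values only for the slots marked UPDATE, instead of re-predicting every slot value each turn. A minimal sketch of how the predicted operations would be applied to the previous dialogue state (the slot names and helper function are illustrative, not from the paper's code):

```python
def apply_operations(prev_state, ops, new_values):
    """SOM-DST-style selective overwrite: apply the predicted state
    operation to each slot; only UPDATE slots consume a generated value."""
    state = dict(prev_state)
    values = iter(new_values)  # generated values, in slot order
    for slot, op in ops.items():
        if op == "CARRYOVER":        # keep the previous value unchanged
            continue
        elif op == "DELETE":         # drop the slot from the state
            state.pop(slot, None)
        elif op == "DONTCARE":       # user has no preference for this slot
            state[slot] = "dontcare"
        elif op == "UPDATE":         # overwrite with a newly generated value
            state[slot] = next(values)
    return state

prev = {"hotel-area": "east", "hotel-parking": "yes"}
ops = {"hotel-area": "CARRYOVER",
       "hotel-parking": "DELETE",
       "hotel-pricerange": "UPDATE"}
print(apply_operations(prev, ops, ["cheap"]))
# {'hotel-area': 'east', 'hotel-pricerange': 'cheap'}
```

This is what makes SOM-DST efficient: the expensive value generation runs only for the (usually few) UPDATE slots, while everything else is carried over from memory.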