Attention Is All You Need
Paper • 1706.03762 • Published
This is a collection of papers that are useful for studying LLMs.
Note Basic understanding of the reinforcement fine-tuning method.
Note Great paper for understanding context and LLMs.
Note RAG Methodology
Note FlashAttention
Note Optional: Think about data.
Note We should think about which components of the Transformer model are truly important.
Note Llama 2 paper
Note Mistral 7B paper
Note A new metric for evaluating LLMs.