There is a lot of knowledge we can pick up from the internet, or even from a work-focused social site like LinkedIn!
Here is a summary of key Generative AI terms. Generative AI is the branch of AI that doesn't just learn from data or optimize a model's predictions; it generates something entirely new, like text or images.
- LLM (Large Language Model): AI systems trained on vast datasets to generate human-like text for content and conversational AI.
- Transformers: Neural network architecture using self-attention for processing sequential data, enabling breakthroughs in language understanding.
- Prompt Engineering: Crafting AI input instructions to achieve desired outputs effectively.
- Fine-tuning: Adapting pre-trained models for specific tasks using specialized datasets.
- Embeddings: Numerical representations of text or data in high-dimensional space for semantic search and efficient AI processing.
- RAG (Retrieval Augmented Generation): Combining knowledge retrieval with AI generation for factual responses.
- Tokens: Basic units of text processing, like words or subwords.
- Hallucination: When AI generates factually incorrect but plausible content.
- Zero-shot: The ability of AI to perform tasks without specific training, using general knowledge.
- Chain-of-Thought: Encouraging step-by-step reasoning to improve accuracy.
- Context Window: The maximum text an AI model can process in one interaction.
- Temperature: A parameter controlling randomness in AI responses, balancing creativity and determinism.
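To make the Temperature term concrete, here is a toy sketch in plain Python (no real model involved; the logits are made-up numbers) showing how temperature reshapes a token probability distribution before sampling:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Convert raw scores (logits) to probabilities, scaled by temperature.

    Lower temperature sharpens the distribution (more deterministic output);
    higher temperature flattens it (more varied, 'creative' output).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Toy scores for three candidate next tokens (hypothetical values).
logits = [2.0, 1.0, 0.1]
cold = softmax_with_temperature(logits, 0.5)  # top token dominates
hot = softmax_with_temperature(logits, 2.0)   # probabilities even out
print(cold, hot)
```

With temperature 0.5 the highest-scoring token gets the bulk of the probability mass, while at 2.0 the three options become much closer, which is exactly the creativity-versus-determinism trade-off described above.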
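The Embeddings entry can also be illustrated with a tiny sketch. The 4-dimensional vectors below are invented for demonstration (real embedding models use hundreds or thousands of dimensions), but the cosine-similarity comparison is the standard way semantic search ranks them:

```python
import math

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: 1.0 means same direction
    (semantically similar), values near 0 mean unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings: similar words get similar vectors.
king = [0.90, 0.80, 0.10, 0.30]
queen = [0.88, 0.82, 0.15, 0.28]
banana = [0.10, 0.05, 0.90, 0.70]

print(cosine_similarity(king, queen))   # high: related meanings
print(cosine_similarity(king, banana))  # lower: unrelated meanings
```

This is why embeddings enable semantic search: the query and every document are embedded once, and ranking is just a similarity comparison in that vector space.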
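Finally, the RAG idea can be sketched end to end. This toy version uses naive keyword overlap in place of a real retriever, and the documents and prompt template are made up, but the shape is the same: retrieve relevant text first, then hand it to the model as context:

```python
def retrieve(query, documents, top_k=1):
    """Rank documents by naive keyword overlap with the query.
    (A real RAG system would use embeddings and a vector database instead.)"""
    query_words = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda d: len(query_words & set(d.lower().split())),
        reverse=True,
    )
    return scored[:top_k]

docs = [
    "The context window is the maximum text a model can process.",
    "Temperature controls randomness in AI responses.",
]

question = "what does temperature control"
context = retrieve(question, docs)[0]

# The retrieved passage is injected into the prompt so the model
# can ground its answer in it instead of hallucinating.
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {question}"
print(prompt)
```

Grounding the generation step in retrieved text is what makes RAG responses more factual than asking the model from memory alone.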