Questions run a semantic search over your indexed documents. Retrieval uses local MiniLM embeddings, so nothing leaves your machine at this stage. Only chunks whose similarity score meets or exceeds your configured threshold, plus each matched file's AI summary, are sent to the LLM (gpt-4o-mini when using OpenAI). Choose the provider by setting LLM_PROVIDER=openai or LLM_PROVIDER=groq in .env.local. Answers are rendered as Markdown.
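The retrieval gate described above can be sketched as a cosine-similarity filter over chunk embeddings. This is an illustrative sketch, not the app's actual code: the function names (`cosine_similarity`, `select_chunks`) and the chunk dictionary shape are assumptions made for the example.

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two embedding vectors (e.g. MiniLM outputs).
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def select_chunks(query_vec, chunks, threshold):
    # Keep only chunks whose similarity to the query meets the threshold,
    # ranked best-first; these (plus file summaries) become the LLM context.
    scored = [(cosine_similarity(query_vec, c["embedding"]), c) for c in chunks]
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [c for score, c in scored if score >= threshold]
```

A chunk scoring below the threshold is simply never sent to the provider, which keeps the prompt small and limits what leaves the local index.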