Quantifying prompt quality using information theory: entropy and mutual information analysis of 1,800 LLM generations
Updated
Nov 20, 2025 - Jupyter Notebook
A new package that takes user-provided text (such as a blog post title or a short article snippet) and generates a structured summary highlighting key advantages or claims. It uses an LLM to analyze the text.
lawhead-extractor parses legal headlines, extracting parties, claim type, and outcome using an LLM combined with pattern matching for accuracy.
Stock Analysis Dashboard featuring Risk, Fundamental, Sentiment, and Technical analysis, plus AI-powered insights with a rating score, summary table, overall evaluation, and detailed breakdown of each analysis type.
Analysis of emergent behavior in real human–AI dialogues.
A new package that processes user complaints or descriptions about logging systems, extracting structured insights such as common pain points, root causes, or improvement suggestions. It uses an LLM to perform the extraction.
AI Text Slop: A Quantitative Study of Stylistic Convergence Across Six Language Models in Japanese Technical Writing
A Python-based tool for comparing translated .docx documents against their original versions. It highlights differences, calculates similarity metrics, and generates detailed comparison reports, including suggested corrections.
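A document-comparison tool like the one above typically reduces to two primitives: a similarity score between the two texts and a line-level diff. A minimal sketch using Python's standard-library difflib (not the tool's actual implementation, and ignoring .docx parsing, which it would handle separately):

```python
import difflib

def similarity_ratio(original: str, translated: str) -> float:
    """Similarity in [0, 1] between two strings (SequenceMatcher ratio)."""
    return difflib.SequenceMatcher(None, original, translated).ratio()

def diff_report(original_lines, revised_lines):
    """Unified-diff listing of changed lines between two line sequences."""
    return list(difflib.unified_diff(original_lines, revised_lines, lineterm=""))
```

For example, `similarity_ratio("abcd", "abce")` returns 0.75, since three of four characters match on each side.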
A Python CLI tool that collects and analyzes Discourse forum discussions using Claude AI to identify common problems, categorize issues by severity, and provide natural language querying of forum insights.
🔍 Analyze user feedback on logging systems with Logference, extracting insights that identify pain points and improve efficiency.
📊 Explore how Shannon entropy and mutual information can quantify prompt quality in generative AI systems across various temperature settings.
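The entropy-and-mutual-information approach named in the title and in the entry above boils down to two estimates over generated tokens: the Shannon entropy of a token distribution, and the mutual information between paired discrete variables (e.g. prompt category and output token). A minimal sketch of the plug-in estimators, assuming tokenized inputs, and not taken from either project's code:

```python
import math
from collections import Counter

def shannon_entropy(tokens) -> float:
    """Shannon entropy (bits) of the empirical token distribution."""
    counts = Counter(tokens)
    n = len(tokens)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def mutual_information(xs, ys) -> float:
    """I(X; Y) in bits, estimated from paired samples via joint counts."""
    n = len(xs)
    joint = Counter(zip(xs, ys))
    px, py = Counter(xs), Counter(ys)
    return sum(
        (c / n) * math.log2((c * n) / (px[x] * py[y]))
        for (x, y), c in joint.items()
    )
```

With two tokens appearing equally often, the entropy is exactly 1 bit; for independent variables the estimated mutual information is 0. Sweeping temperature and re-estimating these quantities is one plausible way to produce the kind of comparison the study describes.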