TechSummly


A Python package for structured and reliable interactions with language models to analyze and summarize technical discussions. Given a detailed technical input, TechSummly processes the text to generate a clear, structured summary highlighting key points, potential issues, and insights. The output is consistently formatted for easy parsing and downstream processing.

Features

  • Structured output using regex pattern matching
  • Support for multiple language models via LangChain
  • Automatic retries for reliable extraction
  • Predefined prompt templates for technical summarization

Installation

Install the package using pip:

pip install techsummly

Usage

Using the Default LLM (ChatLLM7)

By default, TechSummly uses the ChatLLM7 model. You can use it without providing an API key for limited usage, or provide your own for higher rate limits.

from techsummly import techsummly

user_input = "Your detailed technical input here..."
response = techsummly(user_input)
print(response)

Using a Custom API Key for LLM7

You can pass your LLM7 API key directly or set it as an environment variable.

from techsummly import techsummly

user_input = "Your technical input..."
response = techsummly(user_input, api_key="your_api_key_here")

Or set the environment variable:

export LLM7_API_KEY="your_api_key_here"
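
If you prefer to stay in Python, you can set the same variable from code before calling the function; a minimal sketch (the key value is a placeholder):

import os

os.environ["LLM7_API_KEY"] = "your_api_key_here"  # must be set before techsummly is called

from techsummly import techsummly

response = techsummly("Your technical input...")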

Using Other Language Models

TechSummly supports any LangChain-compatible chat model. Here are examples for popular providers:

OpenAI

from langchain_openai import ChatOpenAI
from techsummly import techsummly

llm = ChatOpenAI()  # OPENAI_API_KEY is read from the environment
user_input = "Your technical input..."
response = techsummly(user_input, llm=llm)

Anthropic

from langchain_anthropic import ChatAnthropic
from techsummly import techsummly

llm = ChatAnthropic(model="claude-3-5-sonnet-20241022")  # a model name is required; ANTHROPIC_API_KEY is read from the environment
user_input = "Your technical input..."
response = techsummly(user_input, llm=llm)

Google Generative AI

from langchain_google_genai import ChatGoogleGenerativeAI
from techsummly import techsummly

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # a model name is required; GOOGLE_API_KEY is read from the environment
user_input = "Your technical input..."
response = techsummly(user_input, llm=llm)

Parameters

  • user_input (str): The technical text to process.
  • llm (Optional[BaseChatModel]): A LangChain chat model instance. Defaults to ChatLLM7.
  • api_key (Optional[str]): API key for LLM7. If not provided, the package checks the LLM7_API_KEY environment variable.

Default Model

TechSummly uses ChatLLM7 by default. The free tier rate limits are sufficient for most use cases. For higher limits, get a free API key by registering at https://token.llm7.io/.

Error Handling

If the language model fails to produce a response matching the expected pattern, a RuntimeError is raised with details.
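
Since extraction failures surface as a RuntimeError, you can guard calls and handle the failure explicitly; a minimal sketch (the input text is a placeholder):

from techsummly import techsummly

user_input = "Your detailed technical input here..."

try:
    response = techsummly(user_input)
except RuntimeError as exc:
    # raised when the model output does not match the expected pattern
    print(f"Summarization failed: {exc}")
else:
    print(response)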

Contributing

Found a bug or have a feature request? Please open an issue on GitHub.

Author

Eugene Evstafev
Email: hi@euegne.plus
GitHub: chigwell