chigwell/techspec-extract


techspec-extract

License: MIT

A Python package that processes user-supplied descriptions of technical products or components and generates structured summaries or specifications. It uses an LLM to interpret the input text and extract key details such as features, return status, or related media references, enabling consistent data extraction for product management, customer support, or inventory tracking, without handling the actual media files.

Installation

pip install techspec_extract

Usage

from techspec_extract import techspec_extract

user_input = "Your user input text here"
response = techspec_extract(user_input)
print(response)

Parameters

  • user_input (str): The user input text to process.
  • llm (Optional[BaseChatModel]): The LangChain LLM instance to use. If not provided, the default ChatLLM7 will be used.
  • api_key (Optional[str]): The API key for LLM7. If not provided, the environment variable LLM7_API_KEY will be used.
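The API-key resolution described above can be sketched as follows. This is a hypothetical helper for illustration only; `resolve_api_key` is not part of the package:

```python
import os

def resolve_api_key(api_key=None):
    # An explicit api_key argument takes precedence; otherwise
    # fall back to the LLM7_API_KEY environment variable.
    return api_key if api_key is not None else os.environ.get("LLM7_API_KEY")

os.environ["LLM7_API_KEY"] = "env-key"
print(resolve_api_key("explicit-key"))  # the explicit argument wins
print(resolve_api_key())                # falls back to the environment variable
```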

Using Different LLMs

You can pass your own LangChain LLM instance if you want to use a different provider. Here are examples of how to use different LLMs:

Using OpenAI

from langchain_openai import ChatOpenAI
from techspec_extract import techspec_extract

llm = ChatOpenAI()
response = techspec_extract(user_input, llm=llm)

Using Anthropic

from langchain_anthropic import ChatAnthropic
from techspec_extract import techspec_extract

llm = ChatAnthropic(model="claude-3-haiku-20240307")  # ChatAnthropic requires an explicit model name
response = techspec_extract(user_input, llm=llm)

Using Google

from langchain_google_genai import ChatGoogleGenerativeAI
from techspec_extract import techspec_extract

llm = ChatGoogleGenerativeAI(model="gemini-1.5-flash")  # ChatGoogleGenerativeAI requires an explicit model name
response = techspec_extract(user_input, llm=llm)

Rate Limits

The default rate limits for LLM7 free tier are sufficient for most use cases of this package. If you want higher rate limits for LLM7, you can pass your own API key via the environment variable LLM7_API_KEY or directly via the api_key parameter:

from techspec_extract import techspec_extract

user_input = "Your user input text here"
response = techspec_extract(user_input, api_key="your_api_key")

You can get a free API key by registering at LLM7.
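Alternatively, set the environment variable before calling the function; the key value below is a placeholder:

```python
import os

# techspec_extract falls back to LLM7_API_KEY when no api_key argument is passed
os.environ["LLM7_API_KEY"] = "your_api_key"  # placeholder value
```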

Issues

If you encounter any issues, please report them on the GitHub issues page.

Author