diff --git a/CHANGELOG.md b/CHANGELOG.md
index f5dd3e3..1928786 100644
--- a/CHANGELOG.md
+++ b/CHANGELOG.md
@@ -1,45 +1,5 @@
 # Changelog
 
-## 1.0.1
-
-### Added
-
-- **`llm_gateway_generate_text()` UDF wrapper for AI-powered DataFrame transformations.**
-
-  New method on proxy providers to generate AI completions in DataFrame operations via a built-in UDF.
-
-  ```python
-  from datacustomcode import Client
-  from pyspark.sql.functions import col
-
-  client = Client()
-
-  # Generate summaries in a DataFrame column
-  df = df.withColumn(
-      "summary",
-      client._proxy.llm_gateway_generate_text(
-          "Summarize {company}: revenue={revenue}, CEO={ceo}",
-          {
-              "company": col("company"),
-              "revenue": col("revenue"),
-              "ceo": col("ceo")
-          },
-          llmModelId="sfdc_ai__DefaultGPT4Omni",
-          maxTokens=200
-      )
-  )
-  ```
-
-  **Local Development:** Returns placeholder string (doesn't execute)
-  **Production:** Calls a built-in UDF
-
-  **Parameters:**
-
-  - `template` (str): Prompt template with {placeholder} syntax
-  - `values` (dict or Column): Dict mapping placeholders to Columns, or pre-built named_struct
-  - `llmModelId` (str): Model identifier (required, e.g., "sfdc_ai__DefaultGPT4Omni")
-  - `maxTokens` (int): Maximum tokens that will be spent on this query
-
 ## 1.0.0
 
 ### Breaking Changes
diff --git a/README.md b/README.md
index fe2e226..49ea0b9 100644
--- a/README.md
+++ b/README.md
@@ -169,34 +169,6 @@ client.write_to_dlo('output_DLO')
 > [!WARNING]
 > Currently we only support reading from DMOs and writing to DMOs or reading from DLOs and writing to DLOs, but they cannot mix.
 
-### LLM Gateway
-
-Generate AI completions in DataFrame transformations using the LLM gateway UDF.
-
-```python
-from datacustomcode import Client
-from pyspark.sql.functions import col
-
-client = Client()
-
-# Use template with placeholders
-df = df.withColumn(
-    "summary",
-    client._proxy.llm_gateway_generate_text(
-        "Summarize {company}: revenue={revenue}, CEO={ceo}",
-        {
-            "company": col("company"),
-            "revenue": col("revenue"),
-            "ceo": col("ceo")
-        },
-        llmModelId="sfdc_ai__DefaultGPT4Omni",
-        maxTokens=200
-    )
-)
-```
-
-> [!WARNING]
-> This method returns a placeholder string in local development. It only makes an LLM call and spends tokens when deployed, where it calls the real LLM Gateway service via a built-in UDF.
 
 ## CLI