Simple integration of OpenAI and Claude AI models with Google Sheets through custom functions.
- Open your Google Sheet
- Go to Extensions > Apps Script
- Copy and paste the following files into your Apps Script project:
  - llmToCellO.gs (OpenAI integration)
  - llmToCellC.gs (Claude integration)
  - apiKeys.gs (API key management)
  - menuHandler.gs (UI menu creation)
  - KeyMenu.html (API key setup interface)
- Save the project
- Refresh your Google Sheet
- After installation, you'll see a new "LLM Menu" in your Google Sheets menu bar
- Click on "LLM Menu > APIs Setup"
- Enter your API keys:
- OpenAI API key (get from OpenAI Platform)
- Claude API key (get from Anthropic Console)
- Click "Save"
Basic usage:
=LLM(A1, "Summarize this text")
Full syntax:
=LLM(inputText, prompt, model, temperature)
Parameters:
- inputText: Content to process (required)
- prompt: Instructions for the AI model (required)
- model: AI model to use (optional, default: "gpt-4o-mini")
- temperature: Creativity setting 0-1 (optional, default: 0)
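Under the hood, a custom function like this typically assembles a Chat Completions request body and sends it from Apps Script. The sketch below is a hypothetical illustration (not the actual llmToCellO.gs code) of how the parameters above might map onto such a request; whether the prompt is sent as a system or user message is an assumption:

```javascript
// Hypothetical sketch: build the request body a custom function like LLM()
// might send to the OpenAI Chat Completions API. Defaults mirror the
// parameter list above; this is not the project's actual implementation.
function buildOpenAIRequest(inputText, prompt, model, temperature) {
  return {
    model: model || "gpt-4o-mini",                            // default model
    temperature: temperature === undefined ? 0 : temperature, // default 0
    messages: [
      { role: "system", content: prompt },          // instructions for the AI
      { role: "user", content: String(inputText) }  // cell content to process
    ]
  };
}
```

In Apps Script, a body like this would be JSON-encoded and posted with `UrlFetchApp.fetch`, using the API key saved via the APIs Setup menu.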
Basic usage:
=LLMC(A1, "You are a helpful assistant.")
Full syntax:
=LLMC(inputText, systemPrompt, userPromptStructure, assistantStart, stopSequences, model, max_tokens, temperature)
Parameters:
- inputText: Content to process (required)
- systemPrompt: Instructions for the model's role (required)
- userPromptStructure: XML-tagged prompt with {inputText} placeholder (optional, not used in Response Generator)
- assistantStart: Starting text for the response (optional)
- stopSequences: Array of sequences to stop generation (optional)
- model: Claude model to use (optional, default: "claude-3-7-sonnet-20250219")
- max_tokens: Maximum tokens to generate (optional, default: 1024)
- temperature: Creativity setting 0-1 (optional, default: 0)
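These parameters line up with the fields of an Anthropic Messages API request. The following is a hypothetical sketch of how LLMC might assemble one (not the actual llmToCellC.gs code): the placeholder substitution and the exact message layout are assumptions based on the parameter descriptions above.

```javascript
// Hypothetical sketch: build the Messages API body a function like LLMC()
// might send to Claude. Names and defaults follow the parameter list above.
function buildClaudeRequest(inputText, systemPrompt, userPromptStructure,
                            assistantStart, stopSequences, model,
                            maxTokens, temperature) {
  // Substitute the cell content into the optional XML-tagged template.
  const userContent = userPromptStructure
    ? userPromptStructure.replace("{inputText}", inputText)
    : String(inputText);
  const messages = [{ role: "user", content: userContent }];
  if (assistantStart) {
    // Prefilling the assistant turn steers how the response begins.
    messages.push({ role: "assistant", content: assistantStart });
  }
  const body = {
    model: model || "claude-3-7-sonnet-20250219",
    max_tokens: maxTokens || 1024,
    temperature: temperature === undefined ? 0 : temperature,
    system: systemPrompt,       // role instructions go in the top-level field
    messages: messages
  };
  if (stopSequences && stopSequences.length) {
    body.stop_sequences = stopSequences;
  }
  return body;
}
```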
- Click on "LLM Menu > Response generation"
- Fill in the following cells in your sheet:
- I3: Input text to process
- K3: System prompt/instructions (include role, instructions, and any other context for the model)
- M3: (Optional) Assistant's response start text
- N3: (Optional) Stop sequences (comma-separated or JSON array)
- O3: (Optional) Model name (default: claude-3-7-sonnet-20250219)
- P3: (Optional) Max tokens (default: 1024)
- Q3: (Optional) Temperature (default: 0)
Note: The system prompt in K3 should contain all instructions for the model, including its role and any specific formatting requirements. The input text from I3 will be sent directly as the user message content.
- The generated response will appear in cell I23
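Since N3 accepts stop sequences either comma-separated or as a JSON array, the handler has to normalize both forms into one array. A helper along these lines (hypothetical, not necessarily the project's exact code) could do that:

```javascript
// Hypothetical helper: normalize the N3 cell value ('a, b' or '["a","b"]')
// into an array of stop sequences.
function parseStopSequences(cellValue) {
  if (!cellValue) return [];
  const text = String(cellValue).trim();
  if (text.startsWith("[")) {
    try {
      return JSON.parse(text); // already a JSON array
    } catch (e) {
      // malformed JSON: fall through and treat it as comma-separated
    }
  }
  return text.split(",")
             .map(function (s) { return s.trim(); })
             .filter(function (s) { return s.length > 0; });
}
```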
This method is particularly useful when:
- You need to process multiple inputs in sequence
- You want to keep a record of your prompts and responses
- You prefer a visual interface over formula syntax
- The cell layout can be adapted to specific tasks as needed
This approach also bypasses the cell-content limit that applies to formula arguments (below 2048 tokens), allowing significantly longer inputs and responses.
For more detailed documentation, see:
- LLMC_Documentation.md - Complete function documentation
- LLMC_CheatSheet.md - Quick reference guide
Access documentation through the menu: "LLM Menu > About OpenAI" or "LLM Menu > About Claude"
This project is licensed under the Apache License, Version 2.0 - see the LICENSE file for details.
- Google Apps Script
  - Copyright (c) Google LLC
  - Licensed under the Apache License 2.0
  - Terms of Service
- Anthropic Claude API
  - Copyright (c) Anthropic
  - Licensed under Anthropic's Terms of Service
  - Terms of Service
- This project
  - Copyright (c) 2025 Pavel Kravets
  - Licensed under the Apache License, Version 2.0
  - See LICENSE file for details
  - You can freely use, modify, and distribute this software
  - You can use the code for commercial purposes
  - You can create and distribute closed source modifications
  - You must include the original copyright notice and license
  - You must state significant changes made to the code
  - The full license text is available at apache.org/licenses/LICENSE-2.0