kgnite is a command-line utility for working with Kaggle datasets, competitions, notebooks, models, downloads, uploads, and competition workflows.
It combines:
- the Kaggle CLI for search, listings, metadata, competition actions, and notebook source pulls
- kagglehub for dataset, competition, model, and notebook-output downloads, plus direct Python-side uploads
kgnite depends on Kaggle authentication already being available on your machine.
Recommended auth method:
KAGGLE_API_TOKEN
Compatible fallback methods:
- ~/.kaggle/kaggle.json
- KAGGLE_USERNAME + KAGGLE_KEY
For this project, KAGGLE_API_TOKEN is the preferred option.
Set it in your shell:
export KAGGLE_API_TOKEN="your_token_here"

To make it permanent in zsh, add this to ~/.zshrc:

export KAGGLE_API_TOKEN="your_token_here"

Then reload:

source ~/.zshrc

For the kaggle.json fallback, create ~/.kaggle/kaggle.json with contents like:
{
"username": "your_kaggle_username",
"key": "your_kaggle_api_key"
}

Then secure it:

chmod 600 ~/.kaggle/kaggle.json

Check your Kaggle CLI:
kaggle --version
kaggle config view

Check kgnite's view of auth:
kgnite doctor
kgnite doctor --json

You need:
- Python 3.11+
- Kaggle CLI installed and available on PATH
- internet access
- valid Kaggle authentication
Optional but useful:
- zsh or bash
- write access to /usr/local/bin if you want a system-style installation
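If you want to set up the kaggle.json fallback programmatically, the steps above (write the JSON, then chmod 600) can be sketched in Python. The helper name is ours, not part of kgnite; the demo writes to a throwaway directory rather than your real ~/.kaggle:

```python
import json
import os
import stat
import tempfile

def write_kaggle_json(config_dir: str, username: str, key: str) -> str:
    """Write a kaggle.json credentials file with owner-only permissions."""
    os.makedirs(config_dir, exist_ok=True)
    path = os.path.join(config_dir, "kaggle.json")
    with open(path, "w") as fh:
        json.dump({"username": username, "key": key}, fh, indent=2)
    os.chmod(path, 0o600)  # same effect as `chmod 600 ~/.kaggle/kaggle.json`
    return path

if __name__ == "__main__":
    # Demo against a temporary directory, not the real ~/.kaggle.
    with tempfile.TemporaryDirectory() as tmp:
        p = write_kaggle_json(tmp, "your_kaggle_username", "your_kaggle_api_key")
        print(oct(stat.S_IMODE(os.stat(p).st_mode)))  # 0o600
```

To use it for real, pass os.path.expanduser("~/.kaggle") as config_dir.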
There are two main ways to install it.
This is the recommended install for normal use.
From the project root:
cd /Users/rajeshsingh/myprojects/kgnite
bash scripts/install.sh

What this does:
- Creates an isolated virtualenv in ~/.local/share/kgnite/venv
- Installs kgnite into that virtualenv
- Tries to create a launcher at /usr/local/bin/kgnite
If /usr/local/bin is writable, you are done.
Verify:
kgnite --help
kgnite doctor
kgnite completions

If /usr/local/bin is not writable, the installer prints a follow-up command.
Example:
sudo install -m 755 "/Users/rajeshsingh/.local/share/kgnite/kgnite-launcher" "/usr/local/bin/kgnite"

Then refresh command lookup:
hash -r
kgnite --help
kgnite completions

If you prefer a user-only install:
cd /Users/rajeshsingh/myprojects/kgnite
KGNITE_BIN_DIR="$HOME/.local/bin" bash scripts/install.sh

Then make sure that directory is in your PATH.
For the current shell:
export PATH="$HOME/.local/bin:$PATH"
hash -r
kgnite --help
kgnite completions

To make it permanent in zsh, add this to ~/.zshrc:
export PATH="$HOME/.local/bin:$PATH"

Then reload:
source ~/.zshrc
hash -r

Use this only if you are actively editing the project:
cd /Users/rajeshsingh/myprojects/kgnite
python3 -m venv .venv
source .venv/bin/activate
pip install -e .

❯ kgnite
usage: kgnite [-h] {usage,completions,doctor,search,info,files,download,pull-notebook,submit,leaderboard,submissions,upload-dataset,upload-model,browse} ...
Unified Kaggle helper for search, metadata, files, downloads, and notebook pulls.
positional arguments:
{usage,completions,doctor,search,info,files,download,pull-notebook,submit,leaderboard,submissions,upload-dataset,upload-model,browse}
usage Show example workflows and common command patterns.
completions Install or print shell completion setup.
doctor Inspect local Kaggle/KaggleHub availability and authentication state.
search Search Kaggle resources.
info Show metadata and related information for a resource.
files List files for a dataset, competition, notebook, or model version.
download Download Kaggle assets using kagglehub.
pull-notebook Pull notebook source files via the Kaggle CLI.
submit Submit a file or notebook run to a Kaggle competition.
leaderboard Show or download a competition leaderboard.
submissions List your submissions for a competition.
upload-dataset Upload or version a dataset. Use --handle for kagglehub upload, or rely on metadata files for kaggle CLI mode.
upload-model Upload a model variation/version. Use --handle for kagglehub upload, or CLI metadata mode for create/update.
browse Interactive terminal workflow for search -> inspect -> download.
options:
-h, --help show this help message and exit
Common workflows:
kgnite doctor
kgnite completions
kgnite search datasets "vision transformer" --sort-by votes
kgnite info dataset zillow/zecon
kgnite files competition titanic
kgnite download dataset zillow/zecon --output-dir ./downloads
kgnite pull-notebook owner/notebook --output-dir ./notebooks
kgnite submit titanic --file ./submission.csv --message "baseline"
kgnite leaderboard titanic --show
kgnite upload-dataset ./my-dataset --handle me/my-dataset --message "v1"
kgnite upload-model ./my-model --handle me/model/pytorch/base --message "v1"
kgnite browse
After installation:
kgnite
kgnite doctor
kgnite usage
kgnite completions

Behavior:
- kgnite with no arguments prints help and common usage examples
- missing required arguments print command-specific examples
- kgnite doctor shows Kaggle auth and runtime status
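The auth state that kgnite doctor reports can be approximated in a few lines. This is a sketch of the preference order documented earlier (KAGGLE_API_TOKEN, then kaggle.json, then KAGGLE_USERNAME + KAGGLE_KEY), not kgnite's actual implementation:

```python
import os

def detect_kaggle_auth(env=os.environ, kaggle_json="~/.kaggle/kaggle.json"):
    """Return which auth method is visible, in this project's preferred order.

    Illustrative only; the real `kgnite doctor` may inspect more state.
    """
    if env.get("KAGGLE_API_TOKEN"):
        return "KAGGLE_API_TOKEN"
    if os.path.exists(os.path.expanduser(kaggle_json)):
        return "kaggle.json"
    if env.get("KAGGLE_USERNAME") and env.get("KAGGLE_KEY"):
        return "KAGGLE_USERNAME+KAGGLE_KEY"
    return None
```

Passing env and the kaggle.json path as parameters keeps the check testable without touching your real environment.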
Top-level help:
kgnite --help
kgnite

Examples-only help:
kgnite usage

Command help:
kgnite search --help
kgnite info --help
kgnite download --help
kgnite completions --help
kgnite submit --help
kgnite browse --help

kgnite can now manage completions directly.
Run:
kgnite completions

What it does:
- Detects your shell from $SHELL
- Generates the right completion file for bash or zsh
- Writes it to:
  - ~/.local/share/kgnite/completions/kgnite.bash for Bash
  - ~/.local/share/kgnite/completions/_kgnite for Zsh
- Prints the exact command to refresh completions immediately
- Prints the exact config line(s) to add for persistence
Examples:
kgnite completions
kgnite completions --shell zsh
kgnite completions --shell bash
kgnite completions --print

If you reinstall or update kgnite, run kgnite completions again to refresh the generated completion file.
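The shell detection and target files described above can be sketched as a small lookup. The paths are copied from the list above; the helper itself is illustrative, not kgnite's code:

```python
import os

# Completion file targets as documented above.
COMPLETION_FILES = {
    "bash": "~/.local/share/kgnite/completions/kgnite.bash",
    "zsh": "~/.local/share/kgnite/completions/_kgnite",
}

def completion_file(shell_env: str) -> str:
    """Map a $SHELL value like /bin/zsh to the completion file kgnite writes."""
    shell = os.path.basename(shell_env)
    try:
        return COMPLETION_FILES[shell]
    except KeyError:
        raise ValueError(f"unsupported shell for completions: {shell!r}") from None
```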
Supported groups:
datasets, competitions, kernels, models
Examples:
kgnite search datasets "vision transformer" --sort-by votes
kgnite search competitions llm --category playground --page-size 20
kgnite search kernels rag --language python --kernel-type notebook
kgnite search models gemma --owner google
kgnite search datasets titanic --json

Supported resource types:
dataset, competition, notebook, model
Examples:
kgnite info dataset zillow/zecon
kgnite info competition titanic
kgnite info notebook kaggle/getting-started-with-ai4code
kgnite info model google/gemma/pytorch/2b
kgnite info model google/gemma/pytorch/2b/3
kgnite info dataset heptapod/titanic --json

kgnite files dataset zillow/zecon
kgnite files competition titanic
kgnite files notebook kaggle/getting-started-with-ai4code
kgnite files model google/gemma/pytorch/2b/3
kgnite files competition titanic --json

Supported download targets:
dataset, competition, model, notebook-output
Examples:
kgnite download dataset zillow/zecon --output-dir ./downloads
kgnite download dataset zillow/zecon --path data.csv --output-dir ./downloads
kgnite download competition titanic --output-dir ./downloads
kgnite download model google/gemma/pytorch/2b/3 --output-dir ./models
kgnite download notebook-output kaggle/getting-started-with-ai4code --output-dir ./nb-output

Notebook source and notebook output are different operations.
kgnite pull-notebook kaggle/getting-started-with-ai4code --output-dir ./notebooks

kgnite submit titanic --file ./submission.csv --message "baseline v1"

kgnite submit some-code-competition \
--kernel yourname/your-notebook \
--version 3 \
--message "submit notebook version 3"

kgnite leaderboard titanic --show
kgnite leaderboard titanic --show --page-size 50
kgnite leaderboard titanic --download --output-dir ./leaderboards

kgnite submissions titanic
kgnite submissions titanic --json

If Kaggle rejects submission-history access, kgnite now returns a clearer explanation instead of only a raw API error. Typical causes:
- you have not joined the competition yet
- you have not accepted the competition rules
- you have no submissions yet
- Kaggle API access for submissions is restricted for that competition/account state
Handle-driven upload through kagglehub:
kgnite upload-dataset ./my-dataset \
--handle yourname/my-dataset \
--message "initial upload"Kaggle CLI metadata-folder upload:
kgnite upload-dataset ./my-dataset --public
kgnite upload-dataset ./my-dataset --version --message "new rows for march"

Notes:
- handle mode is simpler if you already know the dataset target
- CLI mode expects Kaggle dataset metadata files in the folder
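In handle mode the target is always a two-part <owner>/<dataset> handle. A hedged sketch of validating it before handing off to kagglehub; the helper is ours, and the exact kagglehub.dataset_upload signature may vary across kagglehub versions, so the call is shown only in the docstring:

```python
def dataset_upload_args(handle: str, folder: str, message: str) -> dict:
    """Validate a <owner>/<dataset> handle and assemble upload arguments.

    The eventual call (assumed; requires Kaggle auth and a recent kagglehub)
    would be roughly:
        kagglehub.dataset_upload(handle, folder, version_notes=message)
    """
    owner, sep, name = handle.partition("/")
    if not owner or not sep or not name or "/" in name:
        raise ValueError(f"handle must look like <owner>/<dataset>, got {handle!r}")
    return {"handle": handle, "local_dataset_dir": folder, "version_notes": message}
```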
Handle-driven upload through kagglehub:
kgnite upload-model ./my-model \
--handle yourname/my-model/pytorch/base \
--license-name Apache-2.0 \
--message "initial model version"Kaggle CLI metadata-folder mode:
kgnite upload-model ./my-model --action create
kgnite upload-model ./my-model --action update

Notes:
- kagglehub mode is best for direct artifact uploads
- CLI mode expects Kaggle model metadata files in the folder
Run:
kgnite browse

What browse does:
- Shows a resource menu
- Accepts either:
  - a number: 1, 2, 3, 4
  - a resource name: datasets, competitions, kernels, models
  - a direct search query like llm or gemma
- Shows search results
- Lets you pick a result number
- Lets you choose an action
Resource menu:
1. datasets
2. competitions
3. kernels
4. models
Examples:
kgnite browse
1
titanic
1
info
kgnite browse
datasets
titanic
1
download
kgnite browse
llm
1
info
When you choose a download action in browse, kgnite asks where to save the files.
Options:
- Default cache folder
- Current folder
- Custom folder
Default base folders:
- datasets: ~/.cache/kagglehub/datasets
- competitions: ~/.cache/kagglehub/competitions
- kernels: ~/.cache/kagglehub/notebooks
- models: ~/.cache/kagglehub/models
kgnite then creates a resource-specific subfolder under the chosen base path, so downloads do not collide with existing files.
Examples:
- dataset: ~/.cache/kagglehub/datasets/<owner>/<dataset>
- competition: ~/.cache/kagglehub/competitions/<competition>
- kernel output: ~/.cache/kagglehub/notebooks/<owner>/<notebook>
- notebook source pull into current folder: ./<owner>/<notebook>/...
If you changed the project code and want the installed tool updated:
cd /Users/rajeshsingh/myprojects/kgnite
bash scripts/install.sh

If you use /usr/local/bin, rerun the printed sudo install ... command only if needed.
kgnite doctor

cd /Users/rajeshsingh/myprojects/kgnite
bash scripts/uninstall.sh

If the launcher is in /usr/local/bin and cannot be removed without elevated privileges, the uninstall script tells you the sudo rm ... command to run.
You can also use:
make install
make reinstall
make uninstall
make dev
make completions

- search, listings, metadata, and competition actions primarily wrap the Kaggle CLI
- downloads primarily use kagglehub
- notebook source pulls and notebook output downloads remain separate because Kaggle treats them as separate resources
- browse is interactive terminal guidance, not a full TUI
- upload-model CLI mode only wraps metadata-based model create/update flows
- model downloads are most reliable with full handles like <owner>/<model>/<framework>/<variation>/<version>
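That full-handle format can be checked with a small parser before running a model download. This is illustrative; kgnite may parse handles differently:

```python
from typing import NamedTuple, Optional

class ModelHandle(NamedTuple):
    owner: str
    model: str
    framework: str
    variation: str
    version: Optional[int]

def parse_model_handle(handle: str) -> ModelHandle:
    """Parse <owner>/<model>/<framework>/<variation>[/<version>]."""
    parts = handle.split("/")
    if len(parts) == 4:
        return ModelHandle(*parts, version=None)
    if len(parts) == 5 and parts[4].isdigit():
        return ModelHandle(*parts[:4], version=int(parts[4]))
    raise ValueError(
        f"expected <owner>/<model>/<framework>/<variation>[/<version>], got {handle!r}"
    )
```

For example, google/gemma/pytorch/2b/3 parses to owner google, model gemma, framework pytorch, variation 2b, version 3.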