1 change: 1 addition & 0 deletions Community/huggingface-hub/DISPLAY.json
@@ -0,0 +1 @@
{"icon":"download","tags":["huggingface","models","datasets","ai"]}
68 changes: 68 additions & 0 deletions Community/huggingface-hub/SKILL.md
@@ -0,0 +1,68 @@
---
name: huggingface-hub
description: Access Hugging Face Hub models, datasets, and spaces via the huggingface_hub Python library. Use when you need to list, search, download, or upload HF assets.
compatibility: Created for Zo Computer
metadata:
author: jaknyfe.zo.computer
---

# Hugging Face Hub Skill

Interact with Hugging Face Hub — browse models/datasets/spaces, download files, upload artifacts, and more.

## Prerequisites

Install the library:
```bash
pip install huggingface_hub
```

Save your HF token to [Settings > Advanced](/?t=settings&s=advanced) as `HF_TOKEN` (get one at https://huggingface.co/settings/tokens).
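The wrapper script resolves the token from the `HF_TOKEN` environment variable, falling back to a secrets file. A minimal sketch of that lookup (the `resolve_hf_token` name and default path mirror the script's behavior, but treat this as illustrative):

```python
import os

def resolve_hf_token(secret_path: str = "/etc/secrets/HF_TOKEN") -> str:
    """Return the HF token from the environment, falling back to a secrets file."""
    token = os.environ.get("HF_TOKEN", "")
    if not token:
        try:
            with open(secret_path) as f:
                token = f.read().strip()
        except OSError:
            # No secrets file available; callers treat "" as anonymous access
            pass
    return token
```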

## Usage

```bash
python3 /home/workspace/Skills/huggingface-hub/scripts/hf_hub.py <command> [options]
```

### Commands

| Command | Description |
|---------|-------------|
| `list-models` | List models. Options: `--search`, `--sort`, `--direction`, `--limit` |
| `list-datasets` | List datasets. Options: `--search`, `--sort`, `--direction`, `--limit` |
| `list-spaces` | List spaces. Options: `--search`, `--sort`, `--direction`, `--limit` |
| `model-info` | Get model info. Options: `--model` |
| `dataset-info` | Get dataset info. Options: `--dataset` |
| `download-file` | Download a file from a repo. Options: `--repo-id`, `--filename`, `--revision` |
| `upload-file` | Upload a file. Options: `--local-path`, `--repo-id`, `--repo-type`, `--path-in-repo` |
| `whoami` | Show authenticated user info |

### Examples

```bash
# Search for text-to-image models
python3 .../hf_hub.py list-models --search "text-to-image" --limit 10

# Download a model file
python3 .../hf_hub.py download-file --repo-id "stabilityai/stable-diffusion-xl-base-1.0" --filename "sd_xl_base_1.0.safetensors"

# Upload a file
python3 .../hf_hub.py upload-file --local-path ./model.pt --repo-id "your-username/my-model" --repo-type "model"

# List top-rated datasets
python3 .../hf_hub.py list-datasets --sort "likes" --direction desc --limit 5
```

## API Reference

This skill wraps `huggingface_hub`. Key functions:
- `list_models`, `list_datasets`, `list_spaces` — browse hub
- `HfApi`, `hf_hub_download` — authentication and file ops
- `InferenceClient` — run inference on hosted models/spaces

## Notes

- Downloads cache to `~/.cache/huggingface/`
- `repo-type` values: `model`, `dataset`, `space`
- Default revision is `main`
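The cache location above can be overridden via environment variables; a simplified sketch of the default resolution (real installs also honor finer-grained overrides such as `HF_HUB_CACHE`, so treat this as an approximation):

```python
import os
from pathlib import Path

def default_hf_cache() -> Path:
    """Approximate Hugging Face cache root, honoring the HF_HOME override."""
    hf_home = os.environ.get("HF_HOME")
    if hf_home:
        return Path(hf_home)
    # Default documented in the Notes section above
    return Path.home() / ".cache" / "huggingface"
```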
168 changes: 168 additions & 0 deletions Community/huggingface-hub/scripts/hf_hub.py
@@ -0,0 +1,168 @@
#!/usr/bin/env python3
"""Hugging Face Hub CLI tool. Usage: python3 hf_hub.py <command> [options]"""

import argparse
import os
import sys
from pathlib import Path

try:
    from huggingface_hub import (
        HfApi, hf_hub_download,
        list_models, list_datasets, list_spaces,
        model_info, dataset_info,
    )
except ImportError:
    sys.stderr.write("ERROR: huggingface_hub not installed. Run: pip install huggingface_hub\n")
    sys.exit(1)


def get_token() -> str:
    token = os.environ.get("HF_TOKEN", "")
    if not token:
        try:
            with open("/etc/secrets/HF_TOKEN", "r") as f:
                token = f.read().strip()
        except Exception:
            pass
    return token


def get_api() -> HfApi:
    token = get_token()
    return HfApi(token=token if token else None)


def cmd_whoami(args):
    api = get_api()
    info = api.whoami()
    print(f"Username: {info['name']}")
    print(f"Full name: {info.get('fullname', 'N/A')}")
    print(f"Email: {info.get('email', 'N/A')}")
    # whoami() returns organizations as dicts; join their names, not the dicts
    orgs = info.get("orgs", [])
    print(f"Organizations: {', '.join(o.get('name', str(o)) for o in orgs)}")


def _print_listing(list_fn, args):
    """Shared driver for the three list-* commands."""
    kwargs = {"sort": args.sort, "direction": args.direction, "limit": args.limit}
    if args.search:
        kwargs["search"] = args.search
    kwargs = {k: v for k, v in kwargs.items() if v is not None}
    for item in list_fn(**kwargs):
        print(f"{item.id} likes={item.likes}")


def cmd_list_models(args):
    _print_listing(list_models, args)


def cmd_list_datasets(args):
    _print_listing(list_datasets, args)


def cmd_list_spaces(args):
    _print_listing(list_spaces, args)


def cmd_model_info(args):
    info = model_info(args.model)
    print(f"Model ID: {info.id}")
    print(f"Downloads: {getattr(info, 'downloads', 'N/A')}")
    print(f"Likes: {info.likes}")
    print(f"Tags: {', '.join(info.tags)}")
    print(f"Pipeline tag: {getattr(info, 'pipeline_tag', 'N/A')}")
    print(f"Created at: {info.created_at}")
    print(f"Last modified: {info.last_modified}")


def cmd_dataset_info(args):
    info = dataset_info(args.dataset)
    print(f"Dataset ID: {info.id}")
    print(f"Downloads: {getattr(info, 'downloads', 'N/A')}")
    print(f"Likes: {info.likes}")
    print(f"Tags: {', '.join(info.tags)}")
    print(f"Created at: {info.created_at}")
    print(f"Last modified: {info.last_modified}")


def cmd_download_file(args):
    path = hf_hub_download(
        repo_id=args.repo_id,
        filename=args.filename,
        revision=args.revision or "main",
        token=get_token() or None,
    )
    print(path)


def cmd_upload_file(args):
    api = get_api()
    # Resolve the destination path once so the confirmation message is accurate
    # even when --path-in-repo is omitted
    path_in_repo = args.path_in_repo or Path(args.local_path).name
    api.upload_file(
        path_or_fileobj=args.local_path,
        path_in_repo=path_in_repo,
        repo_id=args.repo_id,
        repo_type=args.repo_type or "model",
    )
    print(f"Uploaded {args.local_path} to {args.repo_id}/{path_in_repo}")


def build_parser():
    parser = argparse.ArgumentParser(description="Hugging Face Hub CLI")
    sub = parser.add_subparsers(dest="command")
    sub.add_parser("whoami", help="Show authenticated user info")
    # The three list-* subcommands share the same options
    for name, help_text in [("list-models", "List models"),
                            ("list-datasets", "List datasets"),
                            ("list-spaces", "List spaces")]:
        p = sub.add_parser(name, help=help_text)
        p.add_argument("--search")
        p.add_argument("--sort", default=None)
        p.add_argument("--direction", default=None)
        p.add_argument("--limit", type=int, default=20)
    p = sub.add_parser("model-info", help="Get model info")
    p.add_argument("--model", required=True)
    p = sub.add_parser("dataset-info", help="Get dataset info")
    p.add_argument("--dataset", required=True)
    p = sub.add_parser("download-file", help="Download a file from a repo")
    p.add_argument("--repo-id", required=True)
    p.add_argument("--filename", required=True)
    p.add_argument("--revision", default=None)
    p = sub.add_parser("upload-file", help="Upload a file")
    p.add_argument("--local-path", required=True)
    p.add_argument("--repo-id", required=True)
    p.add_argument("--repo-type", default="model")
    p.add_argument("--path-in-repo", default=None)
    return parser


COMMANDS = {
    "whoami": cmd_whoami,
    "list-models": cmd_list_models,
    "list-datasets": cmd_list_datasets,
    "list-spaces": cmd_list_spaces,
    "model-info": cmd_model_info,
    "dataset-info": cmd_dataset_info,
    "download-file": cmd_download_file,
    "upload-file": cmd_upload_file,
}


if __name__ == "__main__":
    parser = build_parser()
    args = parser.parse_args()
    if args.command is None:
        parser.print_help()
        sys.exit(1)
    COMMANDS[args.command](args)
4 changes: 4 additions & 0 deletions External/content360/DISPLAY.json
@@ -0,0 +1,4 @@
{
"icon": "share",
"tags": ["social-media", "content", "notion", "automation"]
}
79 changes: 79 additions & 0 deletions External/content360/README.md
@@ -0,0 +1,79 @@
# Content360 Integration

Syncs posts from a **Notion content calendar** to **Content360** (app.content360.io) for scheduling across Facebook, LinkedIn, X, Instagram, YouTube, TikTok, and Pinterest.

## Setup

### 1. Content360

- Log in at https://app.content360.io
- Go to **Profile → Access Tokens** and create a new token
- Note your **Workspace UUID** from the URL (e.g. `https://app.content360.io/os/{workspace}/posts`)
- Note your **login email and password** for session-based auth

### 2. Notion

- Create a Notion integration at https://www.notion.so/my-integrations
- Share your content calendar database with the integration
- Note the database ID from the URL

### 3. Set Secrets (Zo → Settings → Advanced → Secrets)

```
CONTENT360_EMAIL=you@example.com
CONTENT360_PASSWORD=yourpassword
CONTENT360_API_KEY=your-bearer-token
CONTENT360_ORG_ID=your-workspace-uuid
NOTION_API_KEY=your-notion-integration-key
NOTION_DATABASE_ID=your-database-id
```

### 4. Install Dependencies

```bash
pip install requests
```

## Usage

```bash
# Dry run — see what would be synced
python3 scripts/content360_sync.py --dry-run

# Real sync
python3 scripts/content360_sync.py

# Filter by platform
python3 scripts/content360_sync.py --platforms facebook,linkedin,tiktok
```

## Notion Database Schema

The script expects a Notion database with these properties:

| Property | Type | Description |
|---|---|---|
| `Posted` | Checkbox | Set to true after syncing |
| `Schedule` | Date | Optional — set to schedule instead of draft |
| `Platform` | Select | facebook, linkedin, x, instagram, youtube, tiktok, pinterest |
| `Caption` | Rich Text | Main post content |
| `Hook` | Rich Text | Opening hook/line |
| `CTA` | Rich Text | Call to action |
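Notion returns each rich-text property as a list of fragments, each carrying a `plain_text` field. A tiny helper (hypothetical name, not necessarily how the sync script implements it) to flatten fields like `Caption` or `Hook`:

```python
def rich_text_to_str(prop: dict) -> str:
    """Concatenate the plain_text fragments of a Notion rich_text property."""
    return "".join(part.get("plain_text", "") for part in prop.get("rich_text", []))
```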

## How It Works

1. Fetches all social accounts from Content360
2. Queries Notion for unscheduled posts (Posted = false)
3. Creates each post as a draft in Content360, mapping Notion Platform → Content360 account
4. Marks synced posts as "Posted" in Notion
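Step 3's platform mapping can be sketched as a simple case-insensitive lookup. Note the `provider` field name on Content360 account objects is an assumption for illustration, not confirmed by their API:

```python
def map_platform_to_account(platform, accounts):
    """Pick the first account whose provider matches the Notion Platform value.

    `provider` is an assumed field name on Content360 account objects.
    Returns None when no account matches.
    """
    platform = platform.strip().lower()
    for acct in accounts:
        if acct.get("provider", "").lower() == platform:
            return acct
    return None
```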

## API Notes

Content360 uses **Inertia.js + Laravel** — all routes are under `/os/` and require:

- `Authorization: Bearer {token}` header
- `X-Inertia: true` header
- `X-Requested-With: XMLHttpRequest` header
- `Accept: application/json` header

The `X-Inertia-Version` value returned in each response must be echoed back on subsequent requests; the sync script handles this automatically.
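A minimal sketch of assembling that header set (the helper name is hypothetical; the sync script's actual implementation may differ):

```python
def inertia_headers(token, version=None):
    """Build the header set Content360's Inertia endpoints expect."""
    headers = {
        "Authorization": f"Bearer {token}",
        "X-Inertia": "true",
        "X-Requested-With": "XMLHttpRequest",
        "Accept": "application/json",
    }
    if version:
        # Echo the version string taken from the previous response
        headers["X-Inertia-Version"] = version
    return headers
```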