
vbvr-meta-generator-template

New-style single-task generator repo: metadata first, with an optional --render flag for media. Intended as the in-house successor to VBVR-DataFactory-Staging/template-data-generator, with vbvr-meta v0 fields aligned to Evalkit-oriented delivery (generator key, semantic_ground_truth, scoring_contract, provenance, optional generic_declarative_render).

This repository is a runnable skeleton with a trivial demo task (random rectangle on white). Replace REPLACE_ME__EVALKIT_TASK_GENERATOR_KEY and task logic before production use.

Quick start

cd vbvr-meta-generator-template
python3 -m venv .venv && source .venv/bin/activate   # optional
pip install -r requirements.txt

# Default: metadata.json only (no opencv required if you skip --render)
python3 examples/generate.py --num-samples 3 --seed 42 --output-dir ./out_meta

# Optional: also write first_frame.png, final_frame.png, ground_truth.mp4
python3 examples/generate.py --num-samples 2 --seed 1 --output-dir ./out_full --render

Metadata replay (no generator import)

After generating a sample, point the minimal renderer at its metadata.json:

python3 tools/replay_metadata.py \
  --json ./out_meta/template_demo_task/template_demo_00000000/metadata.json \
  --output-dir ./rendered

This template’s declarative spec only emits filled_rectangle layers. Real tasks (e.g. G-39) add more primitives; extend tools/replay_metadata.py accordingly.
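Extending the replay with a new primitive can be sketched as a small dispatch over layer types. This is a hedged illustration, not the template's actual code: the layer field names used here (type, bbox_xyxy, color_rgb) are assumptions, so match them against the generic_declarative_render section of a real metadata.json before copying anything into tools/replay_metadata.py.

```python
from PIL import Image, ImageDraw

def render_layer(draw: ImageDraw.ImageDraw, layer: dict) -> None:
    """Render one declarative layer. Field names here are hypothetical."""
    kind = layer["type"]
    if kind == "filled_rectangle":
        # The only primitive the demo task emits.
        draw.rectangle(layer["bbox_xyxy"], fill=tuple(layer["color_rgb"]))
    elif kind == "filled_circle":
        # Example of a new primitive a real task might add.
        draw.ellipse(layer["bbox_xyxy"], fill=tuple(layer["color_rgb"]))
    else:
        raise ValueError(f"unsupported layer type: {kind}")

def render_canvas(width: int, height: int, layers: list[dict]) -> Image.Image:
    """Replay a list of layers onto a white canvas, bottom layer first."""
    img = Image.new("RGB", (width, height), "white")
    draw = ImageDraw.Draw(img)
    for layer in layers:
        render_layer(draw, layer)
    return img
```

The pattern keeps each primitive a small, testable branch; unknown types fail loudly rather than rendering a silently wrong frame.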

Local eval (no Evalkit for demo)

python3 eval/verify_meta_contract.py
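The core of such a smoke check can be sketched as a required-key diff against the vbvr-meta v0 top-level keys listed below. This is a minimal illustration, not the contents of eval/verify_meta_contract.py; the helper names are made up, and the real script may check nested shapes as well.

```python
import json
from pathlib import Path

# Required vbvr-meta v0 top-level keys, as listed in this README.
REQUIRED_KEYS = {
    "task_id", "generator", "schema_version", "timestamp",
    "parameters", "param_hash", "generation",
    "semantic_ground_truth", "scoring_contract", "provenance",
}

def check_metadata(meta: dict) -> list[str]:
    """Return the sorted names of required keys missing from a metadata dict."""
    return sorted(REQUIRED_KEYS - meta.keys())

def check_metadata_file(path: str) -> list[str]:
    """Load a metadata.json and report its missing required keys."""
    return check_metadata(json.loads(Path(path).read_text()))
```

An empty result means the contract's top level is satisfied; anything else names exactly what a downstream consumer would fail on.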

Layout

Path                           Role
core/                          Framework: base generator, metadata builder, image/video IO, declarative_spec.py (task-specific export)
src/                           Customize: config.py, prompts.py, generator.py
examples/generate.py           CLI: meta-only by default; --render for media
tools/replay_metadata.py       PIL + cv2 replay from generic_declarative_render
eval/                          Meta + replay smoke tests (no Evalkit; see eval/EVAL.md); real tasks can mirror zip_pilot_G39_attention_shift/eval/
schemas/metadata.example.json  Shape reference
docs/CUSTOMIZE.md              Fork checklist

vbvr-meta v0 top-level keys (produced by demo)

  • task_id, generator, schema_version, timestamp, parameters, param_hash, generation
  • semantic_ground_truth (includes task_summary_en / task_summary_zh, video_temporal, interpretation)
  • scoring_contract (demo uses evaluator_class: null until you wire Evalkit)
  • generic_declarative_render (schema generic_declarative_canvas_v1)
  • provenance
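The key list above can be pictured as a record like the following. This is an illustrative sketch only: the values are invented and the nested shapes beyond the README's key list are guesses; schemas/metadata.example.json is the authoritative reference.

```python
# Illustrative vbvr-meta v0 record; values invented, nested shapes guessed.
example_meta = {
    "task_id": "template_demo_00000000",
    "generator": "REPLACE_ME__EVALKIT_TASK_GENERATOR_KEY",
    "schema_version": "vbvr-meta-v0",
    "timestamp": "1970-01-01T00:00:00Z",
    "parameters": {"seed": 42},
    "param_hash": "0" * 8,  # placeholder, not the real hashing scheme
    "generation": {"num_samples": 1},
    "semantic_ground_truth": {
        "task_summary_en": "A random rectangle on a white canvas.",
        "task_summary_zh": "白色画布上的随机矩形。",
        "video_temporal": {},
        "interpretation": {},
    },
    "scoring_contract": {"evaluator_class": None},  # null until Evalkit is wired
    "generic_declarative_render": {"schema": "generic_declarative_canvas_v1"},
    "provenance": {},
}
```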

Relation to zip pilot (G-39)

The multi-folder vendor zip (0005, legacy tree, evaluator snapshots) lives in a separate delivery layout (see zip_pilot_G39_attention_shift in your workspace). This repo is the minimal Git template you push directly or instantiate via Use template; copy patterns from G-39’s 04_NEW_GENERATOR_WORK when implementing a real task.

License

Same usage as sibling VBVR templates — add your org’s LICENSE when publishing.
