EEschematic is an AI agent for automatic schematic generation in analog integrated-circuit design. Built on a Multimodal Large Language Model (MLLM), it bridges the gap between SPICE-based textual netlists and human-readable circuit schematics, integrating textual, visual, and symbolic modalities to generate schematic diagrams that preserve both functionality and readability. The agent translates SPICE netlists into editable schematic files and employs a Visual Chain-of-Thought (VCoT) strategy to iteratively refine symbol placement and wiring, improving circuit symmetry and clarity.
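To make the input side concrete, here is a minimal sketch of parsing a SPICE netlist into component records, the kind of textual description the agent starts from. The parser, the inverter netlist, and the record layout are illustrative assumptions, not EEschematic's actual code.

```python
def parse_spice(netlist: str):
    """Parse SPICE element lines into simple component records (illustrative)."""
    components = []
    for line in netlist.strip().splitlines():
        line = line.strip()
        if not line or line.startswith("*"):  # skip blanks and comment lines
            continue
        tokens = line.split()
        name = tokens[0]
        if name[0].upper() == "M":   # MOSFET: drain gate source bulk, then model/params
            nodes, rest = tokens[1:5], tokens[5:]
        else:                        # two-terminal elements (R, C, V, ...)
            nodes, rest = tokens[1:3], tokens[3:]
        components.append({"name": name, "nodes": nodes, "params": rest})
    return components

inverter = """
* CMOS inverter
M1 out in vdd vdd pmos W=2u L=0.18u
M2 out in gnd gnd nmos W=1u L=0.18u
"""
parts = parse_spice(inverter)
print([p["name"] for p in parts])  # → ['M1', 'M2']
```

The records capture connectivity (which nets each terminal touches) but no geometry; producing readable placement and wiring from them is exactly the gap the agent addresses.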
- Multimodal Schematic Generation: Translates SPICE netlists into schematic diagrams by integrating textual, visual, and symbolic reasoning through a multimodal LLM.
- Visual Chain-of-Thought (VCoT): Employs iterative visual reasoning to refine component placement and wiring, ensuring schematic symmetry and visual clarity.
- Few-Shot Substructure Learning: Uses six analog subcircuit examples as few-shot references, enabling generalization to diverse analog topologies.
- Editable Schematic Output: Produces schematic diagrams in a human-editable JSON-like format, allowing manual refinement or further automated editing.
- Bridging Text and Visual Understanding: Provides circuit designers with an interpretable bridge between symbolic SPICE descriptions and schematic-level visualization.
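The editable output described above can be pictured as follows. The actual schema EEschematic emits is not specified here, so the field names (`symbol`, `position`, `rotation`, `wires`) are hypothetical; the sketch only shows why a JSON-like representation supports both manual tweaks and round-trip automated editing.

```python
import json

# Hypothetical JSON-like schematic for a CMOS inverter: symbols with
# placement coordinates plus explicit wire segments. Field names are
# illustrative assumptions, not the published format.
schematic = {
    "components": [
        {"name": "M1", "symbol": "pmos", "position": [100, 40],  "rotation": 0},
        {"name": "M2", "symbol": "nmos", "position": [100, 120], "rotation": 0},
    ],
    "wires": [
        {"net": "out", "points": [[120, 60], [120, 100]]},  # drain-to-drain
        {"net": "in",  "points": [[60, 60],  [60, 100]]},   # shared gate
    ],
}

text = json.dumps(schematic, indent=2)   # human-editable text a designer can tweak
reloaded = json.loads(text)              # round-trips cleanly for further tooling
print(reloaded["components"][0]["symbol"])  # → pmos
```

Because placement is explicit data rather than rendered pixels, an iterative refinement loop (such as VCoT) can adjust coordinates, e.g. mirroring x-positions to enforce symmetry, and re-render after each step.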
Chang Liu is supported by the Peter Denyer PhD Scholarship at The University of Edinburgh.
