I took on a challenge: design a carrier board for the LattePanda Mu compute module with the Intel Core i3-N305 x86 processor and 16GB of LPDDR5 RAM.
One reason is that Intel makes drivers for their iGPU, and this module fits a robot dog in both size and power budget.
Another reason is that I tried to find AI accelerators for the Pi, but it's too early for that. None of the half a dozen startups I queried answered, and the Hailo has just 2GB of memory.

Cost is likely going to be around 300€ for 8GB accelerator modules when they eventually come out. Meanwhile, Intel mobile processors come with 16GB of RAM and drivers for iGPU acceleration, and cost less. None of the upcoming accelerators even bother using GDDR6 or GDDR7 memory, which is baffling to me. If they stick to LPDDR5 they don't have much of an edge over APUs, if any. They can do clever L3-cache tricks, but that is no substitute for simply having faster RAM. I guess GDDR makes the PCB more expensive and there isn't good memory-controller IP to choose from, so they stick with LPDDR5.
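To put rough numbers behind that point, here is a back-of-the-envelope peak-bandwidth comparison. The bus widths and the GDDR6 per-pin rate are my own assumptions for illustration, not figures from any specific accelerator datasheet:

```python
# Rough peak-bandwidth comparison: LPDDR5-4800 vs. a hypothetical GDDR6 setup.
# Bus widths (128-bit) and the 16 Gbps/pin GDDR6 rate are assumptions.

def bandwidth_gb_s(transfer_rate_mt_s: float, bus_width_bits: int) -> float:
    """Peak bandwidth in GB/s: transfers per second times bytes per transfer."""
    return transfer_rate_mt_s * 1e6 * (bus_width_bits / 8) / 1e9

lpddr5 = bandwidth_gb_s(4800, 128)   # LPDDR5-4800 on an assumed 128-bit bus
gddr6  = bandwidth_gb_s(16000, 128)  # GDDR6 at 16 Gbps/pin, assumed 128-bit bus

print(f"LPDDR5-4800, 128-bit: {lpddr5:.1f} GB/s")   # 76.8 GB/s
print(f"GDDR6 16Gbps, 128-bit: {gddr6:.1f} GB/s")   # 256.0 GB/s
```

Even under these generous assumptions, LPDDR5 leaves a 3x or more bandwidth gap on the table, which matters a lot for memory-bound LLM inference.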
The underlying goal: I want to build a pupper with local AI intelligence and do useful applications with it.
My plan is to make a robot dog board, but with enough power to run the AI locally.
Is anyone interested in contributing on the software side? In theory, as long as you target x86 with Intel UHD Graphics on Linux, it should work on the LattePanda Mu.
Features I'm planning:
- 18650 batteries
- Pupper motors
- Multiple ESP32s for motor control, plus AT commands for the WiFi radio
- CSI connector for a Raspberry Pi camera
- Ubuntu 22.04
- LLM brain with local audio, text, and image models
- OpenVINO
- MCP
- ROS2
- Audio array
- IMU
- USB dongle for LTE connectivity
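On the ESP32 WiFi item above: the idea is that the host talks to the ESP32 over a serial link using Espressif's AT firmware. A minimal sketch of the host-side framing, assuming the standard AT conventions (CRLF-terminated commands, responses ending in a final `OK` or `ERROR` line); the transport layer (e.g. a serial port) and the function names are my own:

```python
# Host-side framing for ESP32 AT commands (transport omitted; the
# helper names here are illustrative, not from any library).

def frame_at(cmd: str) -> bytes:
    """ESP32 AT firmware expects commands terminated with CRLF."""
    return (cmd + "\r\n").encode("ascii")

def parse_at_response(raw: bytes) -> tuple[bool, list[str]]:
    """Return (ok, payload lines) from a raw response buffer.
    AT responses end with a final 'OK' or 'ERROR' status line."""
    lines = [ln.strip() for ln in raw.decode("ascii", "replace").splitlines()
             if ln.strip()]
    ok = bool(lines) and lines[-1] == "OK"
    return ok, lines[:-1] if lines else []

# Example: join an access point (placeholder credentials).
wifi_join = frame_at('AT+CWJAP="myssid","mypass"')
ok, payload = parse_at_response(
    b'AT+CWJAP="myssid","mypass"\r\nWIFI CONNECTED\r\n\r\nOK\r\n')
```

This keeps the x86 side free of WiFi driver concerns: the LattePanda just shuffles text over UART while the ESP32 owns the radio.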

| Component | Content | Specs |
| --- | --- | --- |
| Processor | CPU | Intel® Processor N100 / Intel® Core™ i3-N305 |
| | Cores / Threads | 4-Core, 4-Thread / 8-Core, 8-Thread |
| | Max Turbo Frequency | 3.4GHz / 3.8GHz |
| | Cache | 6MB |
| | TDP | 6-35W / 9-35W configurable |
| | Graphics | Intel® UHD Graphics |
| | Max Dynamic Frequency | 750MHz / 1.25GHz |
| | Execution Units | 24 / 32 |
| Memory | Capacity | 8GB LPDDR5 4800MT/s / 16GB LPDDR5 4800MT/s |
| | ECC Memory Support | With In-band ECC |
| Storage | eMMC | 64GB eMMC 5.1, Speed: HS400 |
| Expansion | Expansion Capability | Up to 9x PCIe 3.0 lanes, up to 4x USB 3.2 10Gbps, up to 2x SATA 6Gbps, 8x USB 2.0, 4x UART, 4x I2C, expandable to 64x GPIO |
| Display | eDP | 1x eDP 1.4 (onboard), up to 4K 60Hz |
| | HDMI | 3x HDMI 2.0/DisplayPort 1.4, up to 4K 60Hz |
| | Max Outputs | 3 outputs; max resolution 4096 x 2160 @ 60Hz |
| Operating System | Microsoft Windows | Windows 10, Windows 11 |
| | Linux | Ubuntu 22.04 & 24.04 |
| BIOS | BIOS Info | AMI UEFI 128Mbit SPI BIOS |
| Power | Power Supply | DC 9-20V |
| | RTC Battery | 3V |
| Environment | Operating Temperature | 0~60℃ |
| | Relative Humidity | 0%~80% RH |
| Dimension | Form Factor | 60mm x 69.6mm |
| Model (SKU) | DFR1146 | LattePanda Mu (N100, 8GB RAM, 64GB eMMC) |
| | DFR1146-ENT | LattePanda Mu (N100, 8GB RAM, 64GB eMMC, with Win11 Enterprise License) |
| | DFR1147 | LattePanda Mu (N100, 16GB RAM, 64GB eMMC) |
| | DFR1149 | LattePanda Mu (N305, 16GB RAM, 64GB eMMC) |
| Compliance | Certification | FCC, CE, UKCA, RoHS, KC, Safety |