Releases: zetic-ai/ZeticMLangeiOS
1.5.14
1.5.13
Updated
- Added Lite tier support
- Improved Gemma 4 support for LLM
- Fixed an issue preventing LLM models from running in some builds
1.5.12
Updated
- Add Lite tier to the enum
- Update LLM to support Gemma 4
Known Issues
- LLM initialization may fail in some distributed iOS builds, preventing chat or model execution.
- If affected, update to the latest release.
1.5.11
Updated
Same as 1.5.10
1.5.10
Updated
- Update `ZeticMLangeLLMModel` API
  - Support selecting `apType` (`.CPU`/`.GPU`) together with `quantType` when `target` is `.LLAMA_CPP`
  - Add `initOption` to configure LLM initialization options
  - Move `kvCacheCleanupPolicy` from the initializer arguments into `initOption`
  - Support configuring context length with `initOption.nCtx`
TBD
- Update docs about the new `ZeticMLangeLLMModel` API
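The 1.5.10 options above could be wired up roughly as follows. This is a sketch only: apart from the names taken from the notes (`apType`, `quantType`, `target`, `initOption`, `nCtx`, `kvCacheCleanupPolicy`, `.LLAMA_CPP`), every type, enum case, and default below is an assumption, not the SDK's confirmed API.

```swift
// Hypothetical sketch: stand-in types mirroring the option names from the
// release notes. The real ZeticMLange SDK surface may differ.
enum APType { case CPU, GPU }                        // processing-unit selection
enum QuantType { case q4, q8 }                       // assumed quantization choices
enum Target { case LLAMA_CPP }                       // backend target named in the notes
enum KVCacheCleanupPolicy { case onDeinit, manual }  // assumed cases

struct LLMInitOption {
    var nCtx: Int = 2048                             // context length, per the notes
    var kvCacheCleanupPolicy: KVCacheCleanupPolicy = .onDeinit
}

// An initializer in the shape the notes describe: apType and quantType chosen
// together when the target is .LLAMA_CPP, with initOption replacing the old
// kvCacheCleanupPolicy initializer argument.
struct ZeticMLangeLLMModelSketch {
    init(target: Target, apType: APType, quantType: QuantType,
         initOption: LLMInitOption) { /* real SDK call would go here */ }
}

let model = ZeticMLangeLLMModelSketch(
    target: .LLAMA_CPP,
    apType: .GPU,
    quantType: .q4,
    initOption: LLMInitOption(nCtx: 4096, kvCacheCleanupPolicy: .manual)
)
```

Moving `kvCacheCleanupPolicy` into `initOption` keeps the initializer signature stable as more LLM options (such as `nCtx`) are added.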
1.5.9
Updated
- Add `cacheHandlingPolicy` and `ModelCacheManager`
- Update default `modelMode` in `ZeticMLangeLLMModel`
TBD
- Update docs about cache handling
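The cache-handling additions in 1.5.9 might be used along these lines. Only the names `cacheHandlingPolicy` and `ModelCacheManager` appear in the notes; the policy cases and methods below are illustrative assumptions.

```swift
// Hypothetical sketch: assumed policy cases and methods for illustration only.
enum CacheHandlingPolicy { case keep, clearOnUpdate }

final class ModelCacheManagerSketch {
    var policy: CacheHandlingPolicy
    init(policy: CacheHandlingPolicy) { self.policy = policy }
    func clear() { /* remove cached model files from disk */ }
}

let cache = ModelCacheManagerSketch(policy: .clearOnUpdate)
cache.clear()  // e.g. free disk space before fetching a new model build
```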
1.5.8
Release version 1.5.8
1.5.7
Full Changelog: 1.5.6...1.5.7
1.5.6
Full Changelog: 1.5.5...1.5.6
1.5.5
Full Changelog: 1.5.4...1.5.5