[FEATURE] - Improve React Native Performance/Options #38

Open
@GantMan

Description

One of the benefits of React Native is that it isn't confined to JavaScript like the browser.

While the browser components can only use TensorFlow.js models, the React Native implementation could also use TFLite models for a speed boost.

I'd like to adjust the React Native component to be configurable to accept a TFJS model or a TFLite model.

TODO: Make AILabImage able to accept a configuration that selects a TFLite model.

Right now, AILabImage for React Native is hard-wired to use a TFJS model:
https://github.com/infinitered/ai-lab/blob/master/packages/ai-lab-native/src/components/AILabNativeImage/AILabNativeImage.tsx#L33

Step 1:

The AILabImage can be rewired to take a model as a parameter like the web version does.
https://github.com/infinitered/ai-lab/blob/master/packages/ai-lab/src/components/AILabImage/AILabImage.tsx#L12
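A hypothetical sketch of what a model-agnostic parameter could look like. All names here are illustrative, not the actual ai-lab API; the idea is just that a discriminated union lets the component branch between the two inference paths:

```typescript
// Hypothetical sketch: a discriminated union so the native AILabImage can
// accept either a TFJS model or a TFLite model reference. The names
// (ModelConfig, describeInferencePath) are illustrative, not ai-lab's API.
type ModelConfig =
  | { kind: 'tfjs'; model: { predict: (input: unknown) => unknown } }
  | { kind: 'tflite'; modelPath: string }; // e.g. a bundled .tflite asset

// The component would branch on `kind` to pick the inference path.
function describeInferencePath(config: ModelConfig): string {
  switch (config.kind) {
    case 'tfjs':
      return 'run model.predict() on the JS side via tfjs-react-native';
    case 'tflite':
      return `delegate ${config.modelPath} to native TFLite code`;
  }
}
```

Because the union is tagged, TypeScript can verify the branch is exhaustive, and the web version's existing `model` prop maps naturally onto the `tfjs` arm.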

Step 2:

Native code would then be implemented. It should be wired up to another SSD, such as https://tfhub.dev/tensorflow/lite-model/ssd_mobilenet_v1/1/default/1, via a React Native bridge down to native code.

There are example projects all over GitHub of people wiring TFLite to React Native. The goal here would be to wire this up so the model can be passed into AILab in the same way as a TFJS model.
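One way the JS side of that bridge could be shaped (a sketch under assumptions, not a definitive implementation): define the contract the native module would fulfill, so the component only ever talks to a typed interface. In a real app the interface would be backed by a native module (e.g. via React Native's `NativeModules`); here a mock stands in so the shape is concrete. The `TFLiteDetector` name, methods, and `Detection` fields are all hypothetical:

```typescript
// Hypothetical contract for a native TFLite SSD bridge. A real implementation
// would be registered as a React Native native module; the mock below stands
// in for it so the shape is concrete and testable.
interface Detection {
  label: string;
  score: number;
  box: [number, number, number, number]; // [x, y, width, height], normalized
}

interface TFLiteDetector {
  loadModel(path: string): Promise<void>;
  detect(imageUri: string): Promise<Detection[]>;
}

// Mock implementation illustrating the contract the native side would fulfill.
const mockDetector: TFLiteDetector = {
  async loadModel(_path) {
    // Native side would load the .tflite model into an interpreter here.
  },
  async detect(_imageUri) {
    // Native side would run SSD inference and marshal results back over the bridge.
    return [{ label: 'person', score: 0.92, box: [0.1, 0.2, 0.3, 0.4] }];
  },
};

// The component's TFLite path would look roughly like this.
async function runDetection(
  detector: TFLiteDetector,
  modelPath: string,
  imageUri: string
): Promise<Detection[]> {
  await detector.loadModel(modelPath);
  return detector.detect(imageUri);
}
```

Keeping the interface on the JS side means AILabImage can stay agnostic about which TFLite wiring a project uses, as long as the backing module satisfies the contract.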
