Hi,
I’m trying to do something very simple as a “hello world” example.
I have a very simple .tflite model that just multiplies 1 input by 2.
I want to run inference on it on the NPU, just to test the integration.
Everything is done in native code, with a simple Android application wrapper containing one button that triggers my native function.
I compile and link my library against tensorflowlite, neuronusdk_adapter, and neuron_buffer_allocator.
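For context, my link setup looks roughly like this. This is a minimal CMake sketch: the target name `hello_npu` and the source file are placeholders, and the exact library names are just the ones I mentioned above.

```cmake
# Minimal sketch of the native library's link configuration (names illustrative).
add_library(hello_npu SHARED hello_npu.cpp)

target_link_libraries(hello_npu
    tensorflowlite            # stock TensorFlow Lite runtime
    neuronusdk_adapter        # NeuroPilot uSDK adapter
    neuron_buffer_allocator   # Neuron buffer allocator
    log)                      # Android logging
```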
The app crashes somewhere inside libneuronusdk_adapter.mtk.so during its initialization.
Now to my question: we are running this on a production build on a Dimensity 9400.
Could the crash be because the basic SDK doesn't give us access to the Neuron Runtime and its runtime API? Do I need to link against them as well?
Our company has an NDA in place, but I have no idea where to get NP Basic.
I’m using the NeuroPilotTFLiteShim.h to interface with NP.
Am I missing something? Is my approach wrong for what I'm trying to do?
Any pointers would be welcome.