Hi all,
I want to know: if we write an inference plugin for deep learning models in Python versus C++, what is the difference in speed?
As we all know, C/C++ is faster than Python, and that is certainly true when we implement an algorithm from scratch. But in this case the inference code runs through a framework that has a C/C++ backend with a Python interface (like TensorFlow): the Python GStreamer plugin would just call into that backend, and the main computation would still run in C/C++. In that situation, how different are the speeds of a C/C++ plugin and a Python plugin?
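To make the question concrete, my rough mental model is that the Python plugin mostly pays a small per-call overhead at the language boundary, while the inference itself runs at C/C++ speed either way. A sketch of measuring that boundary overhead with only the standard library (`backend_infer` and the 10 ms per-frame figure are placeholders I made up, not real framework numbers):

```python
import timeit

def backend_infer():
    # Stand-in for a call into a C/C++ backend (e.g. a TensorFlow
    # session run). The real work would happen on the C/C++ side;
    # here we only model the Python-side call boundary.
    pass

n = 100_000
# Average cost of crossing the Python call boundary, in seconds per call.
overhead_s = timeit.timeit(backend_infer, number=n) / n

# Assumed inference time dominated by the backend: 10 ms per frame
# (a made-up but plausible order of magnitude for a DNN).
inference_s = 0.010
fraction = overhead_s / inference_s
print(f"Python call overhead: {overhead_s * 1e6:.2f} us/call "
      f"({fraction:.4%} of an assumed 10 ms inference)")
```

If that fraction really is tiny, the per-buffer Python overhead in the plugin should be negligible compared to the inference itself, which is essentially what I am asking someone to confirm or refute.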