Running K-Nearest Neighbors on TensorFlow Lite

Experiment setup

  • Enroll training data to TfKNN
  • Run nearest neighbors search using TfKNN
  • Export tflite model from TfKNN
  • Load tflite model and run nearest neighbors search using TfliteKNN
  • Compare nearest neighbors search results generated from TfKNN and TfliteKNN

Background configurations

  • The experiment uses TensorFlow 2.4.0
  • There are a couple of constants used for illustration:
      • K = 3: the number of neighbors returned for each KNN search
      • N_FEATURES = 2: the number of features per input sample
      • N_SAMPLES = 1000: the number of samples (dataset size)
      • N_CENTERS = 5: the number of clusters in the synthetic data
      • RANDOM_STATE = 0: seed value for a deterministic experiment
      • TEST_SIZE = 0.3: fraction of the data held out for the test set
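Under these constants, the setup can be sketched as follows; this assumes the synthetic data comes from scikit-learn's `make_blobs()` (named later in the article) and a standard `train_test_split`:

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split

# Constants from the experiment
K = 3             # number of neighbors returned per KNN search
N_FEATURES = 2    # number of features per sample
N_SAMPLES = 1000  # dataset size
N_CENTERS = 5     # number of clusters in the synthetic data
RANDOM_STATE = 0  # seed for a deterministic experiment
TEST_SIZE = 0.3   # fraction of data held out for the test set

# Synthetic blob dataset and train/test split
X, y = make_blobs(n_samples=N_SAMPLES, n_features=N_FEATURES,
                  centers=N_CENTERS, random_state=RANDOM_STATE)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=TEST_SIZE, random_state=RANDOM_STATE)

print(X_train.shape, X_test.shape)  # (700, 2) (300, 2)
```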

TensorFlow KNN

  • TfKNN takes the training data ( train_tensor ) as an attribute, so the search operation can run at inference time.
  • The distance function used in TfKNN is the l2 (Euclidean) distance.
  • TfKNN.neighbors is the function that performs the KNN search; after TF Lite conversion, it is also the method executed by the tflite model.
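The article does not show the class itself, so here is a minimal sketch of how such a TfKNN could look: the training data is stored as an attribute (and thus baked into the exported model), and `neighbors` computes l2 distances and takes the top K. The class and parameter names are assumptions:

```python
import tensorflow as tf

class TfKNN(tf.Module):
    """Hypothetical sketch: KNN search module that enrolls training
    data as an attribute so it ships with the exported tflite model."""

    def __init__(self, train_tensor, k=3):
        super().__init__()
        # Enrolled training data, stored as a constant
        self.train_tensor = tf.constant(train_tensor, dtype=tf.float32)
        self.k = k

    @tf.function(input_signature=[tf.TensorSpec([None, 2], tf.float32)])
    def neighbors(self, queries):
        # Pairwise l2 distances, shape [n_queries, n_train]
        dists = tf.norm(
            queries[:, None, :] - self.train_tensor[None, :, :], axis=-1)
        # top_k finds the largest values, so negate to get smallest distances
        neg_dists, indices = tf.math.top_k(-dists, k=self.k)
        return -neg_dists, indices
```

A query point's nearest enrolled neighbors then come back as `(distances, indices)`, both of shape `[n_queries, k]`.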

TensorFlow Lite KNN
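The export and tflite-side search can be sketched end to end. This is an assumed minimal version (tiny made-up enrolled data, fixed query shape) using the standard `TFLiteConverter` and `Interpreter` APIs; since the converted model has two outputs (distances and indices) whose ordering is not guaranteed, they are distinguished here by dtype:

```python
import numpy as np
import tensorflow as tf

# Tiny made-up enrolled data for illustration (not the article's dataset)
train = tf.constant([[0., 0.], [1., 1.], [5., 5.]], tf.float32)
K = 2

@tf.function(input_signature=[tf.TensorSpec([1, 2], tf.float32)])
def neighbors(queries):
    # l2 distances from each query to each enrolled point
    dists = tf.norm(queries[:, None, :] - train[None, :, :], axis=-1)
    neg_dists, indices = tf.math.top_k(-dists, k=K)
    return -neg_dists, indices

# Export: convert the concrete function to a tflite flatbuffer
concrete = neighbors.get_concrete_function()
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete])
tflite_model = converter.convert()

# Load the flatbuffer and run the same search via the tflite Interpreter
interp = tf.lite.Interpreter(model_content=tflite_model)
interp.allocate_tensors()
inp = interp.get_input_details()[0]
interp.set_tensor(inp["index"], np.array([[0.9, 0.9]], np.float32))
interp.invoke()
outputs = {o["dtype"]: interp.get_tensor(o["index"])
           for o in interp.get_output_details()}
# outputs[np.int32]   -> neighbor indices
# outputs[np.float32] -> neighbor distances
```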

Evaluation

Dataset

The dataset is generated by `make_blobs()`.

Evaluation process

  • Step 1: enroll the training data into TfKNN
  • Step 2: export the tflite model from TfKNN
  • Step 3: run the KNN search on both TfKNN and TfliteKNN
  • Step 4: compare the search results on the test data from both implementations
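Step 4 amounts to an exact comparison of the returned neighbor indices plus a tolerance-based comparison of the distances (which are floats and may differ slightly after conversion). A sketch with made-up result arrays:

```python
import numpy as np

# Made-up search results for illustration: [n_queries, K] neighbor indices
tf_indices = np.array([[0, 2, 1], [3, 1, 4]])
tflite_indices = np.array([[0, 2, 1], [3, 1, 4]])

# Matching distances; tflite output may drift by tiny float error
tf_dists = np.array([[0.1, 0.5, 0.9], [0.2, 0.4, 0.8]])
tflite_dists = tf_dists + 1e-7

# Indices must match exactly; distances within a small tolerance
indices_match = np.array_equal(tf_indices, tflite_indices)
dists_match = np.allclose(tf_dists, tflite_dists, atol=1e-5)
print(indices_match, dists_match)  # True True
```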

Extra step — use KNN for clustering

TF KNN accuracy: 0.9433333333333334
TFLite KNN accuracy: 0.9433333333333334
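The accuracies above come from using the neighbor search as a classifier: each test point receives the majority label among its K nearest training neighbors. A sketch of that vote, with hypothetical label arrays:

```python
import numpy as np

def majority_vote(neighbor_labels):
    """Predict one label per query as the most frequent label
    among its K nearest neighbors (neighbor_labels: [n_queries, K])."""
    return np.array([np.bincount(row).argmax() for row in neighbor_labels])

# Hypothetical labels of the 3 nearest training neighbors per test sample
neighbor_labels = np.array([[1, 1, 0], [2, 2, 2], [0, 1, 1]])
pred = majority_vote(neighbor_labels)
print(pred)  # [1 2 1]

# Accuracy against hypothetical ground-truth labels
accuracy = (pred == np.array([1, 2, 0])).mean()
```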
