Running K-Nearest Neighbors on TensorFlow Lite

Experiment setup

Background configurations

- K = 3: the number of neighbors found for each KNN search
- N_FEATURES = 2: the number of features for input data
- N_SAMPLES = 1000: the number of samples (dataset size)
- N_CENTERS = 5: the number of clusters drawn in the synthetic data
- RANDOM_STATE = 0: seed value for a deterministic experiment
- TEST_SIZE = 0.3: fraction of the data split off as the test set
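These parameters can be collected as module-level constants; a minimal sketch of such a configuration (names follow the list above):

```python
# Experiment configuration, matching the values described above.
K = 3              # number of neighbors found for each KNN search
N_FEATURES = 2     # number of features for input data
N_SAMPLES = 1000   # number of samples (dataset size)
N_CENTERS = 5      # number of clusters drawn in the synthetic data
RANDOM_STATE = 0   # seed value for a deterministic experiment
TEST_SIZE = 0.3    # fraction of the data split off as the test set
```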

TensorFlow KNN
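KNN has no training phase, so in TensorFlow it reduces to a distance computation plus a top-k lookup and a majority vote. The following is a minimal sketch of such an implementation (not necessarily the author's exact code; `knn_predict` and its parameters are illustrative names):

```python
import tensorflow as tf

def knn_predict(train_x, train_y, query_x, k=3, n_classes=None):
    """Label each query point by majority vote among its k nearest training points."""
    if n_classes is None:
        n_classes = int(tf.reduce_max(train_y)) + 1
    # Pairwise Euclidean distances, shape (n_queries, n_train).
    dists = tf.norm(query_x[:, None, :] - train_x[None, :, :], axis=-1)
    # top_k returns the largest values, so negate to get the k smallest distances.
    _, idx = tf.math.top_k(-dists, k=k)
    neighbor_labels = tf.gather(train_y, idx)  # shape (n_queries, k)
    # Majority vote: sum one-hot label vectors over the k neighbors.
    votes = tf.reduce_sum(tf.one_hot(neighbor_labels, depth=n_classes), axis=1)
    return tf.argmax(votes, axis=1)

# Tiny usage example with two well-separated clusters.
train_x = tf.constant([[0., 0.], [0., 1.], [5., 5.], [5., 6.]])
train_y = tf.constant([0, 0, 1, 1])
preds = knn_predict(train_x, train_y, tf.constant([[0., 0.5], [5., 5.5]]), k=3)
```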

TensorFlow Lite KNN
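One way to get the same search onto TensorFlow Lite is to bake the training set into the graph as constants and convert a `tf.function` with `TFLiteConverter.from_concrete_functions`, so the TFLite model takes only query points as input. A minimal sketch under those assumptions (shapes, values, and the `KNNModule` class are illustrative, not the author's code):

```python
import tensorflow as tf

class KNNModule(tf.Module):
    """KNN with the training data captured as graph constants."""

    def __init__(self, train_x, train_y, k, n_classes):
        self.train_x = tf.constant(train_x, tf.float32)
        self.train_y = tf.constant(train_y, tf.int32)
        self.k = k
        self.n_classes = n_classes

    @tf.function(input_signature=[tf.TensorSpec([1, 2], tf.float32)])
    def predict(self, query):
        # Distances from the query to every stored training point.
        dists = tf.norm(query[:, None, :] - self.train_x[None, :, :], axis=-1)
        _, idx = tf.math.top_k(-dists, k=self.k)
        votes = tf.reduce_sum(
            tf.one_hot(tf.gather(self.train_y, idx), depth=self.n_classes), axis=1)
        return tf.argmax(votes, axis=1)

module = KNNModule([[0., 0.], [0., 1.], [5., 5.], [5., 6.]],
                   [0, 0, 1, 1], k=3, n_classes=2)

# Convert the concrete function to a TFLite flatbuffer.
converter = tf.lite.TFLiteConverter.from_concrete_functions(
    [module.predict.get_concrete_function()], module)
tflite_model = converter.convert()

# Run the converted model with the TFLite interpreter.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]
interpreter.set_tensor(inp["index"], tf.constant([[5., 5.5]]).numpy())
interpreter.invoke()
pred = interpreter.get_tensor(out["index"])  # predicted class label
```

Baking the training set into the model keeps the TFLite file self-contained, at the cost of re-converting whenever the reference data changes.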

Evaluation

Dataset

Dataset generated by `make_blobs()`
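With the configuration above, the dataset generation and train/test split might look like this (a sketch using scikit-learn's `make_blobs` and `train_test_split`):

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split

# Synthetic blobs with the experiment's configuration:
# 1000 samples, 2 features, 5 cluster centers, fixed seed.
X, y = make_blobs(n_samples=1000, n_features=2, centers=5, random_state=0)

# Hold out 30% of the data for evaluation.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)
```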

Evaluation process
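Evaluation reduces to comparing each model's predicted labels against the held-out test labels. A minimal accuracy helper (an assumption, not the author's exact code):

```python
import numpy as np

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground-truth labels."""
    return float(np.mean(np.asarray(y_true) == np.asarray(y_pred)))

# Usage: 3 of 4 predictions match, so accuracy is 0.75.
acc = accuracy([0, 1, 1, 1], [0, 1, 0, 1])
```

Running the same helper on both the TensorFlow and TensorFlow Lite predictions lets the two implementations be compared directly on the same test split.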

Extra step — use KNN for clustering
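One reading of "KNN for clustering" is assigning new, unlabeled points to the existing clusters by a nearest-neighbor vote over already-labeled data. A hedged sketch of that idea using scikit-learn's `KNeighborsClassifier` (an illustration, not necessarily what the author did):

```python
from sklearn.datasets import make_blobs
from sklearn.neighbors import KNeighborsClassifier

# Labeled blob data acts as the cluster reference set.
X, y = make_blobs(n_samples=1000, n_features=2, centers=5, random_state=0)
knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)

# New points are "clustered" by voting among their nearest labeled neighbors.
new_points = [[0.0, 0.0], [5.0, 5.0]]
cluster_ids = knn.predict(new_points)  # one cluster id per new point
```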

TF KNN accuracy: 0.9433333333333334
TFLite KNN accuracy: 0.9433333333333334
