Running K-Nearest Neighbors on Tensorflow Lite

Experiment setup

Background configurations

K = 3: the number of neighbors found for each KNN search
N_FEATURES = 2: the number of features for input data
N_SAMPLES = 1000: the number of samples (dataset size)
N_CENTERS = 5: the number of clusters drawn from the synthetic data
RANDOM_STATE = 0: seed value for deterministic experiments
TEST_SIZE = 0.3: ratio of data size split for the test set
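For reference, this configuration can be written out as plain Python constants (a minimal sketch; the names simply mirror the list above):

```python
K = 3             # number of neighbors per KNN query
N_FEATURES = 2    # number of features per sample
N_SAMPLES = 1000  # dataset size
N_CENTERS = 5     # number of blob centers in the synthetic data
RANDOM_STATE = 0  # seed for reproducibility
TEST_SIZE = 0.3   # fraction of samples held out for the test set
```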

Tensorflow KNN
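A brute-force KNN classifier can be expressed directly with TensorFlow ops. The sketch below is an assumed implementation (not necessarily the article's exact code) and reuses the constants defined above: it computes all pairwise distances, picks the `k` closest training points, and takes a majority vote.

```python
import tensorflow as tf

def knn_predict(train_x, train_y, query_x, k=K, n_classes=N_CENTERS):
    """Classify each query point by majority vote among its k nearest training points."""
    # Squared Euclidean distance between every query and every training sample:
    # query_x (Q, F), train_x (N, F) -> distances (Q, N).
    diff = tf.expand_dims(query_x, 1) - tf.expand_dims(train_x, 0)
    distances = tf.reduce_sum(tf.square(diff), axis=-1)
    # Indices of the k smallest distances (top_k returns the largest, so negate).
    _, knn_idx = tf.math.top_k(-distances, k=k)
    # Gather the neighbors' labels and take the majority class via one-hot vote counts.
    knn_labels = tf.gather(train_y, knn_idx)                        # (Q, k)
    votes = tf.reduce_sum(tf.one_hot(knn_labels, depth=n_classes), axis=1)
    return tf.argmax(votes, axis=-1)
```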

Tensorflow Lite KNN
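Because TFLite executes a frozen graph, one plausible workflow (an assumption, not the article's confirmed code) is to bake the training set into a `tf.Module` as constants, export the prediction function with a fixed input signature, and convert it with `TFLiteConverter`. The sketch below assumes `X_train`, `y_train`, and `X_test` from the dataset step:

```python
import tensorflow as tf

class KNNModule(tf.Module):
    """Brute-force KNN with the training set baked in as constants."""

    def __init__(self, train_x, train_y, k, n_classes):
        super().__init__()
        self.train_x = tf.constant(train_x, dtype=tf.float32)
        self.train_y = tf.constant(train_y, dtype=tf.int32)
        self.k = k
        self.n_classes = n_classes

    @tf.function(input_signature=[tf.TensorSpec([None, N_FEATURES], tf.float32)])
    def predict(self, query_x):
        diff = tf.expand_dims(query_x, 1) - tf.expand_dims(self.train_x, 0)
        distances = tf.reduce_sum(tf.square(diff), axis=-1)
        _, knn_idx = tf.math.top_k(-distances, k=self.k)
        knn_labels = tf.gather(self.train_y, knn_idx)
        votes = tf.reduce_sum(tf.one_hot(knn_labels, depth=self.n_classes), axis=1)
        return tf.argmax(votes, axis=-1)

# Convert the concrete function to a TFLite flatbuffer.
knn_module = KNNModule(X_train, y_train, k=K, n_classes=N_CENTERS)
concrete_fn = knn_module.predict.get_concrete_function()
converter = tf.lite.TFLiteConverter.from_concrete_functions([concrete_fn], knn_module)
tflite_model = converter.convert()

# Run inference with the TFLite interpreter on the whole test set at once.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
input_index = interpreter.get_input_details()[0]["index"]
interpreter.resize_tensor_input(input_index, X_test.shape)
interpreter.allocate_tensors()
interpreter.set_tensor(input_index, X_test.astype("float32"))
interpreter.invoke()
tflite_pred = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
```

The ops used here (`top_k`, `gather`, `one_hot`, `argmax`) all map to built-in TFLite operators, so no custom ops or Select TF ops should be needed.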

Evaluation

Dataset

Dataset generated by `make_blobs()`
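With the configuration above, generating and splitting the dataset looks like this (a sketch using scikit-learn's `make_blobs()` and `train_test_split()`):

```python
from sklearn.datasets import make_blobs
from sklearn.model_selection import train_test_split

# 1000 two-dimensional samples drawn from 5 blob centers, split 70/30.
X, y = make_blobs(n_samples=N_SAMPLES, n_features=N_FEATURES,
                  centers=N_CENTERS, random_state=RANDOM_STATE)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=TEST_SIZE, random_state=RANDOM_STATE)
```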

Evaluation process
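Both back-ends are scored against the same held-out labels. A minimal sketch, assuming the variables and function names from the snippets above:

```python
from sklearn.metrics import accuracy_score
import tensorflow as tf

# Predictions from the plain-TensorFlow KNN.
tf_pred = knn_predict(tf.constant(X_train, tf.float32),
                      tf.constant(y_train, tf.int32),
                      tf.constant(X_test, tf.float32)).numpy()

# tflite_pred comes from the TFLite interpreter run above.
print("Tf KNN accuracy:", accuracy_score(y_test, tf_pred))
print("TfLite KNN accuracy:", accuracy_score(y_test, tflite_pred))
```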

Extra step — use KNN for clustering
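One possible reading of this extra step (an assumption, not the article's stated method) is to let the KNN predictor assign fresh, unlabeled points to a cluster, i.e. the majority class of their nearest labeled neighbors:

```python
import numpy as np
import tensorflow as tf

# Hypothetical example: sample 200 new points inside the data range and
# treat the predicted class of each point as its cluster assignment.
rng = np.random.RandomState(RANDOM_STATE)
new_points = rng.uniform(low=X.min(axis=0), high=X.max(axis=0),
                         size=(200, N_FEATURES)).astype("float32")
cluster_ids = knn_predict(tf.constant(X_train, tf.float32),
                          tf.constant(y_train, tf.int32),
                          tf.constant(new_points)).numpy()
print(np.bincount(cluster_ids))  # how many new points land in each cluster
```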

Tf KNN accuracy: 0.9433333333333334
TfLite KNN accuracy: 0.9433333333333334
