Getting started with k-NN regression in C#
K-Nearest Neighbors (k-NN) regression is a simple yet effective technique for predicting numeric values. This post demonstrates how to implement k-NN regression in C# from scratch.
For simplicity, we use a small synthetic dataset:
// Training data: two input features followed by the target value
double[][] trainData = new double[][]
{
    new double[] { 1.0, 1.1, 100.0 },
    new double[] { 2.0, 2.2, 200.0 },
    new double[] { 3.0, 3.3, 300.0 },
    new double[] { 4.0, 4.4, 400.0 }
};

// Test data
double[] testInput = { 2.5, 2.6 };
Each row is one training instance: two input features followed by the output value in the last column.
k-NN Regression Implementation
The algorithm calculates the Euclidean distance between the test point and training instances, selects the k nearest neighbors, and averages their output values.
using System;
using System.Linq;

// Uses a C# 12 primary constructor; data and k are captured for use in Predict.
class KNNRegressor(double[][] data, int k)
{
    public double Predict(double[] input)
    {
        var neighbors = data
            .Select(row => new
            {
                Distance = EuclideanDistance(input, row),
                Value = row[^1] // the last column holds the output value
            })
            .OrderBy(x => x.Distance)
            .Take(k)
            .Select(x => x.Value);

        return neighbors.Average();
    }

    // Distance over the feature columns only: the loop runs over a.Length
    // (the test input's features), so the trailing output value in b is ignored.
    static double EuclideanDistance(double[] a, double[] b)
    {
        double sum = 0;
        for (int i = 0; i < a.Length; i++)
            sum += Math.Pow(a[i] - b[i], 2);
        return Math.Sqrt(sum);
    }
}
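A note on the design: the LINQ pipeline sorts every training row by distance on each call to Predict, which costs O(n log n) per prediction. For a dataset this small that is perfectly adequate, and it keeps the code short.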
Example usage
var knn = new KNNRegressor(trainData, k: 3);
double prediction = knn.Predict(testInput);
Console.WriteLine($"Predicted value: {prediction}");
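For this dataset, the three nearest neighbors of (2.5, 2.6) are the rows with outputs 200, 300, and 100, so the program prints a predicted value of 200 (their average).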
Evaluation
To assess the model, compare its predictions with known actual values and compute the mean absolute error (MAE). With a single test instance, the MAE is simply the absolute error of that one prediction.
double[] actualValues = { 250.0 }; // Ground truth for the test input
double mae = Math.Abs(actualValues[0] - prediction);
Console.WriteLine($"Mean Absolute Error: {mae}");
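To evaluate on more than one instance, average the absolute errors over a held-out test set. Here is a minimal sketch; the testInputs and actuals values below are made-up illustrations, not part of the dataset above:
// Hypothetical held-out test set: inputs and their ground-truth outputs
// (illustrative values, not derived from the training data).
double[][] testInputs =
{
    new double[] { 1.5, 1.6 },
    new double[] { 3.5, 3.8 }
};
double[] actuals = { 150.0, 350.0 };

double totalError = 0.0;
for (int i = 0; i < testInputs.Length; i++)
    totalError += Math.Abs(actuals[i] - knn.Predict(testInputs[i]));

Console.WriteLine($"MAE: {totalError / testInputs.Length}");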
Conclusion
This implementation demonstrates how to apply k-NN regression in C# with minimal dependencies.
While simple, k-NN is effective on small datasets and serves as a good baseline for more complex models. Because Euclidean distance is sensitive to feature scale, normalizing the inputs is usually worthwhile on real data.