ML in Network Packages

Timeframe:
Spring 2018 – Fall 2019

Students:
Bikramjit DasGupta

Faculty in Collaboration:
Dr. Stan McClellan

Overview:
The objective of this project is to analyze network packets with machine and deep learning as an efficient approach to detecting the success or failure of network communication.


Stages

Phase 1:
By studying bandwidth and round-trip time statistics, data analytics can be used to classify three characteristic phenomena in wireless signal use: decreases in bandwidth due to signal over-saturation, signal attenuation due to increasing distance, and signal improvement due to decreasing distance. Using a K-Means algorithm, bandwidth and round-trip time trends were clustered correctly by signal-loss type with 99.98% accuracy on 10,000 validation samples.
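The clustering setup can be sketched as follows. This is a minimal illustration assuming scikit-learn; the synthetic (bandwidth, round-trip time) samples and the cluster centers are invented stand-ins for the project's actual measurements, not its real dataset.

```python
# Illustrative K-Means clustering of (bandwidth, RTT) feature vectors.
# The three synthetic clusters mimic the three phenomena described above:
# over-saturation, attenuation with distance, and signal improvement.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Hypothetical features: [bandwidth (Mbps), round-trip time (ms)]
oversaturation = rng.normal([5, 80], [1, 5], size=(100, 2))   # low bw, high RTT
attenuation    = rng.normal([20, 40], [2, 4], size=(100, 2))  # moderate bw and RTT
improvement    = rng.normal([50, 10], [3, 2], size=(100, 2))  # high bw, low RTT
X = np.vstack([oversaturation, attenuation, improvement])

# Fit three clusters, one per signal-loss type
kmeans = KMeans(n_clusters=3, n_init=10, random_state=0).fit(X)
labels = kmeans.predict(X)
```

In practice, each fitted cluster would then be mapped back to a signal-loss type and validated against labeled trends.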

Phase 2:
The study compares the performance and accuracy of four machine learning algorithms in classifying four characteristic phenomena in wireless signal use: decreases in bandwidth due to signal over-saturation, signal improvement due to a device moving closer to the wireless source, signal attenuation due to increasing distance, and congestion caused by competition with high-intensity cross-traffic on a switch. With large enough data samples, an SVM with a moderately high C parameter yielded the smallest 95% confidence intervals when compared to the other machine learning algorithms.
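A minimal sketch of this kind of comparison, assuming scikit-learn: a four-class SVM with a moderately high C, with a 95% confidence interval on its accuracy computed via the normal approximation. The synthetic dataset and the choice C=10 are illustrative assumptions, not the project's actual data or tuning.

```python
# Illustrative SVM classification with a moderately high C, plus a 95%
# confidence interval on test accuracy. The four synthetic classes stand
# in for the four signal phenomena described above.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=6, n_informative=4,
                           n_classes=4, n_clusters_per_class=1, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Higher C penalizes margin violations more heavily (harder margin)
clf = SVC(C=10.0, kernel="rbf").fit(X_tr, y_tr)
acc = clf.score(X_te, y_te)

# 95% CI half-width via the normal approximation to a binomial proportion;
# a smaller half-width means a tighter interval around the accuracy estimate
n = len(y_te)
half_width = 1.96 * np.sqrt(acc * (1 - acc) / n)
```

The same interval computation applied to each candidate algorithm gives the interval widths being compared in the study.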

Phase 3:
By comparing an RNN-LSTM and a CNN-LSTM against Jacobson's Algorithm, both the RNN-LSTM and the CNN-LSTM were shown to provide a better RTT estimate. Replacing the predictor used by Jacobson's Algorithm with a neural-network predictor reduced the number of segment retransmissions by more than 90%.
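For context, Jacobson's Algorithm (as standardized in RFC 6298) estimates a smoothed RTT and its variance with exponentially weighted moving averages, then derives the retransmission timeout (RTO) from them. A sketch of this classical baseline, using the RFC's constants:

```python
# Jacobson's RTT estimator: the classical predictor that the neural-network
# predictors were compared against. Constants alpha=1/8, beta=1/4 and the
# RTO formula SRTT + 4*RTTVAR follow RFC 6298.
def jacobson_rto(rtt_samples, alpha=1/8, beta=1/4):
    """Return (smoothed RTT, retransmission timeout) after the given samples."""
    srtt = rttvar = rto = None
    for r in rtt_samples:
        if srtt is None:
            # First measurement initializes both estimators
            srtt, rttvar = r, r / 2
        else:
            # Update variance first (using the old SRTT), then the mean
            rttvar = (1 - beta) * rttvar + beta * abs(srtt - r)
            srtt = (1 - alpha) * srtt + alpha * r
        rto = srtt + 4 * rttvar
    return srtt, rto
```

A neural predictor replaces this fixed-weight average with a learned estimate of the next RTT, which is what drove the reduction in retransmissions reported above.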


Publications:

Thesis:

Twitter: @GroupHipe