The increasing complexity of optical networks, which must support a multitude of services, generates massive amounts of data. Moreover, any interruption, even a momentary one, can cause substantial data loss and degrade the customer experience. The use of machine learning in various domains has been proposed and tested over the past decades. In this study we propose a tool, based on machine learning algorithms, for estimating the transmission quality of optical connections before they are established in the network (a sketch of this idea is given below). Next-generation networks must guarantee a high level of service at any geographical location, on both fixed and mobile equipment. The Internet of Things (IoT) will connect billions of devices and sensors. The target latency of 5G networks is one millisecond, compared with 50 ms for current systems. This is important because minimised latency makes near-real-time communication possible, for example between two unmanned vehicles travelling in tandem at relatively high speed, or for virtual reality applications. The main aim of our paper is to review previously published work on next-generation antennas in this area, and to examine how to study and model the different algorithms of the IoT and of sixth-generation (6G) antennas.
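As a minimal sketch of the proposed quality-of-transmission (QoT) estimation idea, the example below trains a binary classifier to predict whether a candidate lightpath will be feasible before it is provisioned. The feature set (path length, span count, launch power, modulation order), the synthetic labelling rule, and the choice of a random-forest model are all illustrative assumptions, not the paper's actual method; a real deployment would train on measured or simulated network data.

```python
# Hypothetical sketch: ML-based QoT estimation for candidate lightpaths.
# Feature names, thresholds, and the synthetic data are assumptions for
# illustration only.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n = 2000

# Assumed per-lightpath features: total length (km), number of spans,
# launch power (dBm), and modulation order (bits/symbol).
length_km = rng.uniform(50, 3000, n)
num_spans = np.ceil(length_km / 80)        # assume ~80 km amplifier spans
launch_dbm = rng.uniform(-2, 3, n)
mod_order = rng.choice([2, 3, 4, 6], n)    # QPSK ... 64-QAM

X = np.column_stack([length_km, num_spans, launch_dbm, mod_order])

# Synthetic label: 1 if a crude SNR-margin proxy is positive.
# In practice labels would come from field measurements or simulation.
snr_proxy = 35 - 10 * np.log10(length_km) - 1.5 * mod_order + launch_dbm
y = (snr_proxy + rng.normal(0, 1.0, n) > 8).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25,
                                          random_state=0)
clf = RandomForestClassifier(n_estimators=200, random_state=0)
clf.fit(X_tr, y_tr)

print("held-out accuracy:", accuracy_score(y_te, clf.predict(X_te)))
# Before establishing a new connection, an operator could query
# clf.predict_proba(features) and provision only paths predicted feasible.
```

The design choice here is deliberately simple: treating QoT estimation as binary classification lets the network controller make a go/no-go decision per candidate path without running a full physical-layer simulation at connection-setup time.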