Reliable multicast has been applied to large-scale content delivery systems for distributing digital content to a large number of users without data loss. Performance evaluation of reliable multicast is useful for estimating delivery time in actual delivery systems as well as for designing communication protocols and systems. This paper evaluates the implementation and performance of RMTP, a reliable multicast protocol for bulk-data transfer, for several types of content delivery services. The system configuration, including operational functions such as delivery scheduling, is examined on the basis of the performance evaluation. Furthermore, research issues that arise when reliable multicast is applied to emerging broadband network services are discussed.
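As a rough illustration of why delivery-time estimation favors multicast at scale, the following Python sketch compares sequential unicast delivery with a single multicast transmission plus loss-driven retransmissions. The file size, bandwidth, loss rate, and the simple 1/(1 − p) retransmission model are assumptions chosen for illustration only; they are not the evaluation model used for RMTP in this paper.

```python
# Back-of-envelope delivery-time estimate (illustrative only; not the
# evaluation model used for RMTP in this paper).

def unicast_delivery_time(size_bits: float, bandwidth_bps: float, receivers: int) -> float:
    """Sequential unicast: the sender transmits the whole file once per receiver."""
    return receivers * size_bits / bandwidth_bps

def multicast_delivery_time(size_bits: float, bandwidth_bps: float, loss_rate: float) -> float:
    """Reliable multicast: one transmission plus retransmission overhead,
    assuming each packet needs 1 / (1 - loss_rate) transmissions on average."""
    return (size_bits / bandwidth_bps) * (1.0 + loss_rate / (1.0 - loss_rate))

if __name__ == "__main__":
    size = 700e6 * 8   # hypothetical 700-MB file, in bits
    bw = 10e6          # hypothetical 10-Mbit/s bottleneck bandwidth
    print(f"sequential unicast to 1000 receivers: {unicast_delivery_time(size, bw, 1000):,.0f} s")
    print(f"reliable multicast at 1% packet loss: {multicast_delivery_time(size, bw, 0.01):,.0f} s")
```

Under these assumed numbers the unicast time grows linearly with the number of receivers, while the multicast time is dominated by a single transmission plus a small retransmission overhead, which is the intuition behind using reliable multicast for large-scale distribution.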
The number of Internet of Things (IoT) devices has recently expanded thanks to advances in sensing technology, and the launch of 5G (fifth-generation mobile communication) networks has given a growing number of data sources broadband connectivity. In addition, advances in artificial intelligence (AI) technologies are enabling high-speed processing of data far exceeding the cognitive and processing abilities of humans. As a result, the amount of data generated throughout the world is increasing steadily, and this growth is expected not only to continue but to accelerate in the years to come.

Going beyond the use of data only within closed organizations such as companies, NTT seeks to achieve a data-centric society in which massive amounts of data are widely distributed, beyond the traditional borders of industries and fields, at ultrahigh speeds between autonomously operating AI-based systems. Such a society will enable the creation of totally new value and the development of solutions to social problems through novel combinations of data and expertise.
Problems in achieving a data-centric society

However, there are two main problems that must be solved to achieve this data-centric society.
Limits imposed by the current data-processing architecture

Data processing is currently executed by individual systems (silos) that differ in terms of purpose and processing method. This silo-oriented architecture results in many copies of the same data being held across those systems. In a data-centric society in which volumes of data much greater than current levels are exchanged at ultrahigh speeds between many more entities (humans, systems, devices, etc. that distribute data) than today, this situation will only intensify, which means that the following problems will arise if the current data-processing architecture continues to be used without modification; a simple sketch of the resulting duplication follows the list below.
• Storage and network performance/capacity will come under pressure.
• The number of data-processing flows that must be managed will increase, making data management increasingly complex.
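To make the duplication problem concrete, the following Python sketch (a hypothetical example, not a description of any actual NTT system) shows how a silo-oriented architecture ends up holding a separate copy of the same source data in each purpose-specific system, so both storage consumption and the number of data-processing flows to manage grow with the number of silos.

```python
# Hypothetical illustration of the silo-oriented architecture described above:
# each purpose-specific system ingests and stores its own copy of the same data.

source_record = {"sensor_id": "cam-001", "timestamp": "2024-01-01T00:00:00Z", "payload": "..."}

silos = {
    "billing_system": [],
    "analytics_system": [],
    "monitoring_system": [],
}

# Every silo keeps a private copy, so storage use and the number of
# data-processing flows to manage both grow with the number of silos.
for name, store in silos.items():
    store.append(dict(source_record))  # independent copy per system

print(f"copies of the same record: {sum(len(s) for s in silos.values())}")
print(f"data-processing flows to manage: {len(silos)}")
```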