D4.1 Large Scale IoT Crawling, Indexing and Ranking

Jul 4, 2019

This deliverable summarises the work of Tasks T2.3 and T4.1. It first presents the requirements for an information model for the IoTCrawler framework, derived from the scenarios developed in deliverable D2.1. A review of the state of the art on ontology models and quality calculation then prepares the development of an information model that fulfils these requirements. The main aspects considered are the general annotation of data sources, including time and geo information, privacy, and quality of information. In addition, the deliverable shows how the information model and the additional quality information can be used by the indexing and crawling mechanisms of the framework.
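To illustrate the kind of annotation such an information model is meant to capture, the sketch below shows a minimal Python data structure for a crawled data source with time, geo, privacy, and quality-of-information fields. All field names and values here are illustrative assumptions and do not correspond to the actual ontology terms chosen in the deliverable.

```python
# Illustrative sketch only: field names and structure are assumptions,
# not the actual IoTCrawler information model or its ontology terms.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Dict

@dataclass
class DataSourceAnnotation:
    """Annotation record for a crawled IoT data source (illustrative only)."""
    source_id: str                 # unique identifier of the stream or sensor
    observed_property: str         # what the source measures, e.g. "temperature"
    latitude: float                # geo information
    longitude: float
    last_observation: datetime     # time information
    privacy_level: str = "public"  # coarse privacy label; values are assumed
    quality: Dict[str, float] = field(default_factory=dict)  # QoI metric -> score in [0, 1]

example = DataSourceAnnotation(
    source_id="urn:example:sensor:42",
    observed_property="temperature",
    latitude=51.96,
    longitude=7.63,
    last_observation=datetime(2019, 7, 1, 12, 0, tzinfo=timezone.utc),
    quality={"completeness": 0.9, "timeliness": 0.7},
)
```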

The deliverable then details the calculation of quality of information and its metrics, followed by early implementations and demonstrations. Overall, it establishes a common basis for understanding information crawled with the IoTCrawler framework and shows how the process of searching for data with IoTCrawler can be supported by additional quality-of-information metadata, ultimately enhancing the user experience and improving the results of machine-initiated data queries.
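To make the role of quality of information in search concrete, the following sketch computes a simple timeliness metric and ranks candidate sources by an aggregated quality score. Both the linear-decay formula and the equal-weight aggregation are assumptions made for this example, not the metric definitions given in the deliverable.

```python
# Illustrative sketch only: the timeliness formula and equal-weight aggregation
# are assumptions for demonstration, not the QoI metric definitions of D4.1.
from datetime import datetime, timezone
from typing import Dict, Optional

def timeliness(last_observation: datetime, max_age_s: float = 3600.0,
               now: Optional[datetime] = None) -> float:
    """Score in [0, 1]: 1.0 for a fresh observation, decaying linearly to 0."""
    now = now or datetime.now(timezone.utc)
    age_s = (now - last_observation).total_seconds()
    return max(0.0, 1.0 - age_s / max_age_s)

def overall_qoi(metrics: Dict[str, float]) -> float:
    """Aggregate per-metric scores into a single ranking value (equal weights assumed)."""
    return sum(metrics.values()) / len(metrics) if metrics else 0.0

# Rank candidate sources for a query by their aggregated QoI score.
candidates = [
    {"id": "urn:example:sensor:42", "qoi": {"completeness": 0.9, "timeliness": 0.7}},
    {"id": "urn:example:sensor:43", "qoi": {"completeness": 0.6, "timeliness": 0.95}},
]
ranked = sorted(candidates, key=lambda c: overall_qoi(c["qoi"]), reverse=True)
```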


Hien Truong

NEC
