Cooperation Project 2
Coordinator: Prof. Luiz Gustavo Fernandes
Extracting knowledge from big data has become paramount in a world where data about virtually every aspect of our lives can be collected on a regular basis. The techniques employed for knowledge extraction make it possible, for instance, to analyze large and complex data sets at a deeper level, even when the data were not originally collected for that purpose.
The main challenges in this area relate to the following aspects:
(i) scalability of algorithms for processing big data;
(ii) reliability of scalable systems;
(iii) ability to visualize these data so that humans can perceive them more effectively; and
(iv) use of parallel computing techniques for processing these data in real time.
This project seeks to establish international collaborations that advance the state of the art on each of the aforementioned challenges through strategic partnerships with renowned research laboratories at leading universities in this area. The collaboration with the University of Pisa (Italy) investigates parallelization techniques, combined with adaptive and autonomous computing algorithms, to improve the efficiency of real-time big data analysis. The collaboration with the University of Lugano (Switzerland) seeks to develop new approaches to improve the scalability and reliability of distributed systems through the use of State Machine Replication (SMR) techniques.
These investigations are expected to culminate in the development of protocols that allow distributed systems to operate correctly even in the presence of arbitrary or malicious behavior by computational processes. Lastly, the partnership with Dalhousie University (Canada) explores two distinct areas. The first involves the development of new visualization techniques that highlight key aspects of data sets throughout the analysis process, with a focus on the pre-processing stage. The second addresses the development of artificial intelligence techniques for recognizing the goals underlying human behavior from real-life big data, without requiring a human specialist to devise a plan-recognition model.
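To illustrate the core idea behind the SMR techniques mentioned above, the following is a minimal sketch, not the project's actual protocol: each replica is a deterministic state machine, and an atomic (total-order) broadcast, trivially simulated here by a loop, delivers the same commands in the same order to every replica, so all correct replicas converge to the same state. All names (`Replica`, `atomic_broadcast`) are illustrative assumptions.

```python
# Minimal State Machine Replication (SMR) sketch.
# Assumption: commands are delivered to all replicas in the same total
# order (in a real system this ordering comes from a consensus protocol).

class Replica:
    def __init__(self):
        self.state = 0          # deterministic state machine: a counter
        self.log = []           # ordered log of delivered commands

    def deliver(self, command):
        """Apply a command delivered by the total-order broadcast."""
        self.log.append(command)
        op, arg = command
        if op == "set":
            self.state = arg
        elif op == "add":
            self.state += arg

def atomic_broadcast(replicas, commands):
    """Stand-in for consensus-based total-order broadcast: every
    replica receives every command in the same order."""
    for cmd in commands:
        for r in replicas:
            r.deliver(cmd)

replicas = [Replica() for _ in range(3)]
atomic_broadcast(replicas, [("set", 10), ("add", 5), ("add", -2)])
# Because delivery order is identical and replicas are deterministic,
# all replicas end with state 13 and identical logs.
```

Tolerating the arbitrary or malicious behavior discussed above requires replacing this trivial broadcast with a Byzantine fault-tolerant ordering protocol, which is precisely the kind of protocol the collaboration aims to develop.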