IBM designs energy-efficient method to validate data at record speed
The amount of data generated by electronic devices is growing exponentially, and storing it is not the only challenge facing the digital world: checking its validity is equally important, and typically demands even more time and resources than accumulating it does. In a recent record-breaking experiment on the fourth most powerful supercomputer, IBM researchers analyzed 9 terabytes of data in less than 20 minutes. The same task would normally have taken the system a full day; the speedup came from a new method for checking data integrity designed by the IBM researchers. That is not all: the method used only 1% of the energy normally required. Data analysis is needed in fields ranging from traffic and water management to financial planning, and this method should make such models both more efficient and more accurate.
The method lowers the complexity of validating data while making optimal use of all the available computing resources.
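The article does not describe IBM's algorithm, so the sketch below is only a generic illustration of what data-integrity checking means in practice: hashing data in fixed-size chunks so that later corruption can be detected and localized without re-reading everything by hand. The function names and chunk size are illustrative assumptions, not part of IBM's method.

```python
import hashlib

def chunk_digests(data: bytes, chunk_size: int = 1024) -> list[str]:
    # Hash each fixed-size chunk separately so corruption can be
    # localized to a specific chunk rather than just "somewhere".
    return [
        hashlib.sha256(data[i:i + chunk_size]).hexdigest()
        for i in range(0, len(data), chunk_size)
    ]

def validate(data: bytes, expected: list[str],
             chunk_size: int = 1024) -> list[int]:
    # Return the indices of chunks whose digest no longer matches.
    actual = chunk_digests(data, chunk_size)
    return [i for i, (a, e) in enumerate(zip(actual, expected)) if a != e]

# Record digests for the original data, then detect a corrupted chunk.
original = bytes(range(256)) * 16        # 4 KiB of sample data
reference = chunk_digests(original)

corrupted = bytearray(original)
corrupted[2048] ^= 0xFF                  # flip one bit inside the third chunk
print(validate(bytes(corrupted), reference))  # → [2]
```

Real large-scale validators differ mainly in how they parallelize and schedule this kind of work across a supercomputer's nodes, which is presumably where IBM's efficiency gains come from.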