This post builds upon an earlier effort to detect the possible ransomware corruption of data files by comparing hashes for those files, computed on different operating systems and/or at different points in time. The earlier post took the approach of calculating hashes for all files in a folder or, more likely, in an entire data drive. That approach could have several advantages in some situations. First, it could detect ransomware that might encrypt files gradually, so as to avoid attracting the user's attention to constant drive activity. Also, it could detect file rot, where memory errors or disk read errors cause files to change from their original form. In addition, a full-drive hash could highlight instances where files simply went missing, such that no hash comparison was possible.

A later post observed that that approach might be overkill, insofar as ransomware evidently tended to encrypt entire drives rapidly: a check of just a fraction of files on a drive might suffice to detect it. A full list of hashes for all files could still be useful in later comparisons, to verify that nothing on the drive had changed. But hashing an entire drive, on a regular basis, could impose a lot of wear and tear. Each run could take hours: depending upon the algorithm selected, hashing files could be much slower than merely producing a list of files. Full-drive comparisons could also be difficult when the number of files exceeded the capacity of a spreadsheet (e.g., ~1 million rows, in the case of Microsoft Excel). Even when the full file list did fit, the spreadsheet could be very slow to produce its calculations, and its memory demands could cause it, other programs, or the operating system to crash.
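The workflow described above, hashing either every file or a random fraction of them and then comparing against an earlier run, can be sketched in a few lines of Python. This is a minimal illustration, not the script from the earlier posts; the function names and the `sample_fraction` parameter are my own inventions for this sketch.

```python
import hashlib
import random
from pathlib import Path

def hash_file(path, algorithm="sha256", chunk_size=1 << 20):
    """Hash one file in chunks, to avoid loading large files into memory."""
    h = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def hash_tree(root, sample_fraction=1.0):
    """Hash all files under root, or a random fraction of them.

    sample_fraction < 1.0 implements the "check just a fraction of files"
    idea from the later post; 1.0 gives the full-drive hash list.
    """
    files = [p for p in Path(root).rglob("*") if p.is_file()]
    if sample_fraction < 1.0:
        files = random.sample(files, max(1, int(len(files) * sample_fraction)))
    return {str(p): hash_file(p) for p in files}

def compare_hashes(old, new):
    """Report files whose hashes changed, and files that went missing.

    Note: missing-file detection is only meaningful when both runs
    hashed the full tree, not a random sample.
    """
    changed = [p for p in new if p in old and new[p] != old[p]]
    missing = [p for p in old if p not in new]
    return changed, missing
```

Comparing two such hash dictionaries in code avoids the spreadsheet row limits and memory problems mentioned above, since the lists never need to be loaded into Excel at all.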