R-Studio Technician
Do you need R-Studio?
USB Stabilizer can be purchased either as a standalone tool or as part of a complete data recovery suite together with R-Studio Technician. When purchased standalone, the sole function of USB Stabilizer is to add hardware read-instability handling to any third-party Windows software, so you would still need data recovery or forensics software to perform functions like imaging or file recovery. As such, buying USB Stabilizer standalone makes sense only for established forensics or data recovery professionals who already have various software tools to pair it with.
Which software-level features are most important when working with unstable drives?
Everything stems from the fact that unstable drives will work for only a finite time before failing completely, so the most important features are those that focus the recovery effort on the most valuable data while minimizing drive stress:
- When working with unstable drives it is important to never read the same sectors twice, to avoid unnecessarily stressing the drive. This is done by maintaining a sector map and a runtime image file. Whenever a sector is successfully read, it is immediately backed up to the runtime image file, and its status is recorded in the sector map. On any subsequent read request for the same sector, the software consults the sector map, sees that the sector has already been imaged, and reads from the image file instead of touching the unstable source drive again. This way, no matter what recovery procedure is run, the software seamlessly works to complete the same image and never has to read anything twice (a minimal sketch of this follows the list).
- With the increased capacity and fragility of modern storage devices, it is often impossible to recover the entire drive. Too many unstable drives simply cannot survive the process and fail partway through the recovery. The situation can be improved by recovering only the important files first and the rest of the drive afterwards, if possible. This way, if the drive fails during the recovery process, at least the most important files are safe. This is done by reading and parsing only filesystem metadata at first, and then using that information to generate a file tree that allows targeting specific files. It is also critical for the software to have a strong algorithm for assembling corrupt filesystem metadata, as metadata on unstable drives is usually at least partially corrupted by bad sectors (sketched after this list).
- Files are frequently fragmented and scattered all around the drive. Traditionally, file recovery was done one file at a time, so the software would read every fragment of the first file before moving on to the next. The problem with this approach is that it is quite harsh on HDDs, as the read/write heads are forced to constantly jump around the platters to seek out all of these small file fragments. A much better method is to image all selected files at the same time as a single sequential recovery process: essentially performing a regular sequential image while skipping all sectors that do not belong to the selected files. This works much faster and puts less stress on the drive (see the single-sweep sketch after this list).
- If it is impossible to focus the recovery effort on specific files, for example because filesystem metadata is entirely missing, then imaging the entire drive becomes a necessity in order to perform raw recovery. (Raw recovery is the process where the software carves out files based on the hex signatures of their headers instead of relying on filesystem metadata.) The most time-consuming and risky part of imaging is reading the unstable or degraded areas of the drive, as doing so may cause the drive to fail entirely. In the initial passes, it is best to aggressively skip over bad areas and focus on recovering only the healthy sections. Bad areas are then read during later passes, ensuring that if the drive does fail, at least the good data has already been retrieved. This is far more effective than the traditional linear imaging process, which is likely to get stuck on a bad area and miss all data beyond that point (see the multi-pass sketch after this list).
- FileVault and BitLocker are the native encryption methods built into macOS and Windows. They are aggressively marketed and become more popular each year, often to the detriment of the users themselves. The software tool should be able to decrypt them (with a known password) on the fly to allow targeted file recovery. Without this, the entire drive would need to be imaged before decryption could even be attempted, which may not be possible for unstable drives (see the decryption sketch after this list).
- Unstable drives usually have bad sectors that corrupt some files. It is crucial for the software to generate a report that clearly identifies which files are affected by these bad sectors. This functionality is essential not only for forensic investigations but also for data recovery service providers, who need to communicate recovery results to their customers effectively (see the report sketch after this list).
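To make the first point concrete, here is a minimal Python sketch of a sector map combined with a runtime image file. The file paths, the in-memory set standing in for a persistent sector map, and the read_sector interface are illustrative assumptions, not R-Studio's actual implementation:

```python
import os

SECTOR_SIZE = 512

class RuntimeImage:
    """Read-through sector cache: every sector fetched from the unstable
    source drive is saved once, then served from the image file forever."""

    def __init__(self, source_path, image_path):
        self.source = open(source_path, "rb")
        mode = "r+b" if os.path.exists(image_path) else "w+b"
        self.image = open(image_path, mode)
        self.imaged = set()  # sector map; a real tool persists this as a bitmap

    def read_sector(self, lba):
        if lba in self.imaged:
            # Already captured: serve from the image, never touch the drive.
            self.image.seek(lba * SECTOR_SIZE)
            return self.image.read(SECTOR_SIZE)
        # First request: one read from the unstable source...
        self.source.seek(lba * SECTOR_SIZE)
        data = self.source.read(SECTOR_SIZE)
        # ...backed up immediately and recorded in the sector map.
        self.image.seek(lba * SECTOR_SIZE)
        self.image.write(data)
        self.imaged.add(lba)
        return data
```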
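For targeted recovery, the key data structure is the file tree assembled from whatever filesystem metadata survives. The toy record format below is purely an assumption for illustration; real filesystems such as NTFS or APFS are far more involved. The point is that records whose parents were lost to bad sectors are attached to a virtual folder rather than silently dropped, so they can still be selected for recovery:

```python
# Toy metadata records: (record_id, parent_id, name); None marks a record
# lost to an unreadable sector.
records = [
    (5, None, ""),           # root directory
    (10, 5, "Documents"),
    (11, 10, "report.docx"),
    None,                    # record destroyed by a bad sector
    (13, 99, "photo.jpg"),   # its parent (record 99) was never recovered
]

def build_tree(records):
    nodes = {}
    for rec in records:
        if rec is None:
            continue              # corrupt record: skip, keep parsing
        rid, pid, name = rec
        nodes[rid] = {"name": name, "parent": pid, "children": []}
    orphans = []                  # files whose parent directory is missing
    for rid, node in nodes.items():
        pid = node["parent"]
        if pid is None:
            continue              # root entry has no parent
        if pid in nodes:
            nodes[pid]["children"].append(rid)
        else:
            orphans.append(rid)   # shown under a virtual "$LostFiles" folder
    return nodes, orphans

nodes, orphans = build_tree(records)
print(orphans)  # [13] -> photo.jpg remains selectable despite the damage
```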
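Single-pass imaging of multiple selected files reduces to sorting all of their extents by starting sector and sweeping the drive once. A hedged sketch follows; the extent map and the read_sector callback are assumed to come from the metadata parsing step above:

```python
def image_selected(extent_map, read_sector):
    """extent_map: {filename: [(start_lba, sector_count), ...]}.
    Reads every fragment of every selected file in one ascending sweep,
    so the heads move front-to-back once instead of chasing fragments
    file by file."""
    sweep = sorted((start, count, name)
                   for name, extents in extent_map.items()
                   for start, count in extents)
    recovered = {name: {} for name in extent_map}
    for start, count, name in sweep:
        for lba in range(start, start + count):
            recovered[name][lba] = read_sector(lba)
    return recovered
```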
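The multi-pass strategy for whole-drive imaging can be sketched as follows; the same general idea is behind tools like GNU ddrescue. The try_read callback and the skip distance are illustrative assumptions, and real tools tune these heuristics much more carefully:

```python
def multipass_image(total_sectors, try_read, first_skip=2048):
    """try_read(lba) -> bytes, or None on a read error (assumed callback).
    Early passes harvest healthy regions fast, leaping far ahead whenever
    a read fails; later passes revisit the skipped regions with a smaller
    leap, down to single sectors."""
    good = {}
    pending = set(range(total_sectors))
    skip = first_skip
    while pending and skip >= 1:
        skip_until = 0
        for lba in sorted(pending):
            if lba < skip_until:
                continue                 # inside a region leapt over this pass
            data = try_read(lba)
            if data is None:
                skip_until = lba + skip  # leave the bad region for a later pass
            else:
                good[lba] = data
                pending.discard(lba)
        skip //= 2                       # tighten the leap on each pass
    return good, sorted(pending)         # pending = confirmed unreadable
```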
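On-the-fly decryption amounts to inserting a decryption step between the sector reader and everything above it. The sketch below uses AES-XTS via the `cryptography` package with the sector number as the tweak, the general scheme used by modern full-disk encryption; deriving the key from the user's password and parsing the actual BitLocker/FileVault volume formats are deliberately out of scope:

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

SECTOR_SIZE = 512

def decrypt_sector(xts_key, lba, ciphertext):
    """Decrypt a single sector with AES-XTS, using the sector number as
    the XTS tweak. `xts_key` is an already-derived 64-byte key; how it is
    derived from the password and volume metadata is format-specific and
    omitted here."""
    tweak = lba.to_bytes(16, "little")
    decryptor = Cipher(algorithms.AES(xts_key), modes.XTS(tweak)).decryptor()
    return decryptor.update(ciphertext) + decryptor.finalize()
```

Because each sector decrypts independently, a layer like this can sit directly on top of the runtime image from the first sketch, which is what makes targeted file recovery possible on encrypted volumes without imaging the whole drive first.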
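Finally, the bad-sector report is a straightforward intersection of the unreadable-sector list with the per-file extents. The data shapes below are the same illustrative ones used in the earlier sketches:

```python
def affected_files(extent_map, bad_sectors):
    """extent_map: {filename: [(start_lba, sector_count), ...]};
    bad_sectors: set of LBAs that could not be read.
    Returns {filename: unreadable_sector_count} for the recovery report."""
    report = {}
    for name, extents in extent_map.items():
        hits = sum(1 for start, count in extents
                   for lba in range(start, start + count)
                   if lba in bad_sectors)
        if hits:
            report[name] = hits
    return report

extent_map = {"report.docx": [(1000, 8)], "photo.jpg": [(5000, 64)]}
print(affected_files(extent_map, {1003, 5010, 5011}))
# {'report.docx': 1, 'photo.jpg': 2}
```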
All of these features are present in quality data recovery software like R-Studio Technician, which can optionally be bundled with USB Stabilizer.