What is the data-centric storage concept?

Last Update Time: 2021-05-08 10:25:48

Is storage just a matter of pushing bits? Moving vast quantities of 0s and 1s from here to there, with only the requirement that the data arrive intact and on time? Not everyone sees it that way. The lack of intelligent, efficient data storage tools is a serious problem for those responsible for storing and protecting enterprise data.

Consider the data stored on disk, flash memory, magnetic tape, or other media. To the storage system, those bits and bytes are like pieces on a checkerboard, endlessly shuffled from square to square. But that is not how we see them. To us, those bits and bytes are the corporate data we run our business on.

Yet most enterprises are still stuck with the oldest kind of storage tools, which treat information as strings of data with no intrinsic value.

Backup is a typical example of brute force applied without regard for the value of the data. Most backup processes, including the latest flat backup technology, view data according to a plan laid out 40 years ago. The newer methods are certainly more efficient and keep the data center tidy, but they still make no intelligent assessment of what the data actually represents or how important it is to the enterprise.

In an ideal scenario, the data itself would be smart enough to understand how it is used, who uses it, when it is active, and whether it has practical value beyond its direct application.

But until we build that kind of intelligence into the data, even efficiency improvements such as flat backup remain a form of blindness: they still shuffle around a pile of bits with no sense of what the data means.

Consider how the application-centric or data-centric concept is taking shape and how it could reshape the data center, and you will find that only a handful of vendors offer storage that does more than move data around opaquely. We have reached a point where this lack of insight threatens to squander those advances. As long as data is treated as a commodity rather than the valuable information it represents, opportunities for putting it to better use will keep being missed.

But there is still hope.

Object storage, which keeps extended, writable metadata together with the objects it stores, is promising. Of course, populating that metadata is the responsibility of the applications and users that create and use the data objects, but the potential benefits are undeniable. For example, a data object's retention period can be marked based on its file type, content, creator, modifier, and so on. That metadata can in turn tell the storage system, or a data-management application, that the object must not be copied to the cloud, or should be deleted or archived on a specific date.
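The idea can be sketched in a few lines. The following is a minimal, hypothetical illustration (the `StoredObject` class, the metadata keys such as `retain_until` and `sensitive`, and the policy functions are all invented for this example, not taken from any real product): an object carries extended metadata, and simple policy checks read that metadata to decide what operations are allowed.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class StoredObject:
    """A hypothetical stored object carrying extended metadata."""
    key: str
    data: bytes
    metadata: dict = field(default_factory=dict)  # e.g. creator, file type, retention date

def may_delete(obj: StoredObject, today: date) -> bool:
    """Policy: an object may only be deleted after its retention date has passed."""
    retain_until = obj.metadata.get("retain_until")
    return retain_until is None or today >= retain_until

def may_copy_to_cloud(obj: StoredObject) -> bool:
    """Policy: objects flagged as sensitive must stay on premises."""
    return not obj.metadata.get("sensitive", False)

# A contract from the legal department: retain for ten years, never copy off-site.
contract = StoredObject(
    key="contracts/2021/acme.pdf",
    data=b"...",
    metadata={
        "creator": "legal",
        "file_type": "pdf",
        "retain_until": date(2031, 5, 8),
        "sensitive": True,
    },
)

print(may_delete(contract, date(2021, 5, 8)))   # False: still under retention
print(may_copy_to_cloud(contract))              # False: flagged sensitive
```

The point of the sketch is that the decision logic lives in policy functions that read the object's own metadata, so the storage layer can enforce retention or placement rules without any manual intervention.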

This kind of intelligence can eliminate a great deal of manual work and effectively reduce the number of retained data copies. The smarter the data, the smarter its handling: data operations become policy-driven events.

A small number of vendors have begun building new data storage tools to address these problems.

The DataGravity Discovery series of storage arrays has intelligence built in. Beyond storing data, the system can attach detailed information to individual files to track and control access to sensitive information, apply policies to control data retention, and produce detailed reports on file-related activity. DataGravity's products are a step toward data intelligence.

Another storage vendor, Qumulo, recently launched its data-aware, scale-out NAS products. The system can analyze the data stored on it and use metadata to classify files and other objects. It also provides detailed performance information, drilling down to specific clients and data paths and reporting any potential bottlenecks.

Tarmin builds its data-defined storage on top of object storage, exposing a global namespace over object stores that may span multiple locations. Processes such as data tiering, archiving, retention, and encryption can all be controlled by policy.

Newcomer Primary Data adds intelligence of its own with a separate namespace that can extend across DAS, NAS, block, or cloud storage. It uses policy to control data placement and holds a large body of file data along with its metadata, which is accessible through an API provided by Primary Data. User-created policies can govern a range of permissions and activities, including the use of files, directories, and volumes, and can restrict operations such as file copying.

These developments are headed in the right direction and deserve attention. Until systems and processes can fully exploit the value of data, we will keep storing more of it, backing up more of it, and sometimes losing the important parts. Intelligent storage depends on intelligent data and intelligent storage tools.

 

If you want to know more, our website offers product specifications for data-centric storage; visit ALLICDATA ELECTRONICS LIMITED for more information.