Our Focus on Latency and Why it’s Important
Latency. End-users might not know what to call it, but they know it when they experience it. That wait for an ATM to respond when a user is checking their account balance is a type of latency. When a webpage doesn’t instantly appear in a browser, that is latency. Video editors experiencing micro-pauses or frame skipping while trying to create are being subjected to latency.
The annoyances latency causes matter, but its effects can be far more critical. In the medical industry, for example, latency could delay a doctor's access to a patient record or other critical data in a life-threatening situation, or cause the feed from a security camera to drop frames at a crucial moment.
Latency is defined as the time between a request and a response. In a storage system, it is the time from when a host application issues a request to the time an HBA or network card returns a response from the storage system.
Latency is not throughput. Throughput, typically measured in MB/sec, describes how quickly data can be moved to and from the drives; latency describes how smoothly that data moves. For example, if you transfer 1000MB of data in 100 seconds, the throughput is 10MB/sec. But that figure tells you nothing about how long each command took to transfer its 1MB of data: some commands may take longer, and some may complete more quickly.
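To make the distinction concrete, here is a minimal Python sketch (illustrative only, not ATTO code; all numbers are assumed). Two workloads each move 1000MB in 100 seconds, so both report 10MB/sec throughput, yet one has steady per-command latency while the other stalls badly on some commands.

```python
# Two hypothetical workloads: each entry is the latency, in seconds,
# of one 1MB transfer command. Both total 100 seconds for 1000MB.
from statistics import mean

steady_run = [0.1] * 1000                    # every command takes 100 ms
spiky_run = [0.05] * 900 + [0.55] * 100      # most are fast, some stall

for name, latencies in (("steady", steady_run), ("spiky", spiky_run)):
    total_time = sum(latencies)              # 100 s in both cases
    throughput = len(latencies) / total_time # MB/sec, since each command moves 1MB
    print(f"{name}: throughput={throughput:.1f} MB/s, "
          f"mean latency={mean(latencies) * 1000:.0f} ms, "
          f"worst latency={max(latencies) * 1000:.0f} ms")
```

Both runs print the same 10 MB/s throughput and even the same mean latency, but the worst-case command latency differs by more than five times, which is exactly the kind of micro-pause a user feels.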
As data sets grow, security threats intensify and user expectations rise, system architects are taking latency far more seriously than ever. Because of the nature of digital technology there will always be some latency, however minute. A brute-speed approach to eliminating latency works only in very limited applications. The best approach we have found is to manage latency and make it as consistent as possible.
ATTO specializes in technology and solutions that reduce storage and network latency. By focusing on latency we improve the user experience and workflow outcomes, increase productivity and the ability to respond to real-time scenarios faster, and enhance the integrity and resiliency of data transferred across a network. We have developed very effective technology to combat latency, and we use it in every product in the ATTO ecosystem.
ATTO Advanced Data Streaming™ (ADS) is a proprietary technology built into ATTO host adapters, designed to manage latency in high-bandwidth work environments. ADS provides controlled acceleration of data transfers by combining features that move large amounts of data faster and more efficiently, maintaining the highest consistent performance.
Where data streams without ADS show transfer-rate peaks and drops, ADS-managed streams are stabilized and smooth, delivering far better I/O response times than unmanaged streams.
When users need a low latency solution they often turn to Fibre Channel. Fibre Channel is a key technology for existing and new networks that support demanding, high-performance workloads and is crucial for high bit rate and bit depth applications. It provides deterministic throughput and reliability as signature features.
However, as mentioned, we build latency management technology into every product, so whether it is Fibre Channel, Ethernet, SAS, Thunderbolt™ or NVMe, an ATTO product will always provide better performance.
Read more about latency.