Retailers Should Address IT System Performance Before Adding More Data-Intensive Technologies
The 2019 Retail Technology Report illustrates a major issue with information technology in the retail space — investing in more data-intensive technology without addressing IT system performance can cause serious problems, including system slowdowns and crashes.
In retail, either of those scenarios can result in lost customers and lost revenue. This issue is being masked by the increased investments in technology to enable omnichannel operations, such as order processing and inventory management systems.
While these investments will help satisfy the need for better customer relationships and sell-through, they will fail miserably if poor system performance drives away customers and prospects. In other words, it’s no use handling big data unless it’s also fast data.
The key to powering a seamless customer experience is IT system performance, which is essential to processing and analyzing massive amounts of data for a multitude of essential functions:
- While average retail inventory accuracy is about 65 percent, omnichannel fulfillment requires at least 95 percent accuracy to provide customers a seamless experience.
- Along with accurate inventory, omnichannel retailers also must integrate data from the supply chain, CRM, credit and collections, marketing, and sensor networks.
- As retailers incorporate advanced technologies like artificial intelligence (AI) and Internet of Things (IoT), IT systems must be able to process and analyze even more data.
What's often overlooked is that processing and analyzing data depend on the overall system's input/output (I/O) performance, also known as throughput. Even as the IT industry advances with faster networks, faster memory, faster processors and more bandwidth, poor I/O performance negates those gains.
As noted in Total Retail's 2019 Retail Technology Report, many retailers know that integrating new technology with existing IT systems is a challenge. They likely are not aware that there are cost-effective software solutions to address application performance at the operating system, file system and storage levels, boosting performance 30 percent to 50 percent or more without a hardware or network upgrade.
Traditional thinking about IT investments goes like this: we need more computing power, so we buy more systems; we need faster network speeds, so we increase bandwidth and buy the hardware that goes with it; we need more storage, so we buy more hardware. Costs continue to rise in proportion to demand for the three fundamentals: applications, uptime and speed.
However, there are solutions that can help contain IT costs. Data center infrastructure management (DCIM) software has become an effective tool for analyzing and then reducing the overall cost of IT. In fact, the U.S. Data Center Optimization Initiative claims to have saved nearly $2 billion since 2016.
Other solutions that don’t require new hardware to improve performance and extend the life of existing systems are also available.
Many large organizations performing data analytics require a computer system to access multiple and widespread databases, pulling information together through millions of I/O operations. The system’s analytic capability is dependent on the efficiency of those operations, which in turn is dependent on the efficiency of the computer’s operating environment.
In the Windows environment especially (which runs about 80 percent of the world's computers), I/O performance degrades over time. This degradation, which can cut a system's overall throughput by 50 percent or more, happens in any storage environment: inefficiencies in the way the server hands off data to storage penalize performance in any data center, whether in the cloud or on-premises. And it gets worse in a virtualized computing environment, where the many systems sending I/O up and down the stack to and from storage create tiny, fractured, random I/O, resulting in a "noisy" environment that slows down application performance. Left untreated, it only worsens over time.
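The arithmetic behind this fragmentation penalty is straightforward: each I/O operation carries a fixed overhead, so moving the same data in smaller pieces multiplies the total cost. The sketch below is a purely illustrative Python calculation (the block sizes are assumptions, not measurements of Windows or any specific storage stack) showing how fragmented 4 KiB I/O requires hundreds of times more operations than 1 MiB sequential I/O to move the same gigabyte.

```python
# Illustrative sketch: why tiny, fractured I/O degrades throughput.
# Block sizes and the per-operation-overhead model are assumptions for
# illustration only, not measurements of any real system.

def io_ops_needed(total_bytes: int, block_bytes: int) -> int:
    """Number of I/O operations required to move total_bytes in block_bytes chunks."""
    return -(-total_bytes // block_bytes)  # ceiling division

GiB = 1024 ** 3
fragmented = io_ops_needed(GiB, 4 * 1024)      # 4 KiB fragments
sequential = io_ops_needed(GiB, 1024 * 1024)   # 1 MiB blocks

print(fragmented)               # 262144 operations
print(sequential)               # 1024 operations
print(fragmented // sequential) # 256x more operations for the same data
```

Since each operation pays a fixed latency cost regardless of size, a 256x increase in operation count translates directly into lost throughput, which is why consolidating fragmented I/O in software can recover performance without new hardware.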
Even experienced IT professionals mistakenly think that new hardware will solve these problems. Since data is so essential to running organizations, they're tempted to throw money at the problem by buying expensive new hardware. While additional hardware can temporarily mask this degradation, targeted software can improve system throughput by 30 percent to 50 percent or more. Software like this has the advantage of being nondisruptive (i.e., no ripping and replacing hardware), and it can be transparent to end users as it's added in the background. Thus, a software solution can handle more data by eliminating overhead; increase performance at a much, much lower cost; and extend the life of existing systems.
As retailers address omnichannel operations investments in technology, they must not ignore the concurrent need for optimal systems performance.
James D'Arezzo is CEO of Condusiv Technologies, the world leader in I/O reduction solutions for virtual and physical server environments.