Infrastructure and Data Management
Edge Computing is one of the top areas of IT investment, especially in light of companywide digital transformation initiatives such as the Internet of Things (IoT). Computing technologies that enable the timely delivery and analysis of data and resources to people and things are collectively referred to as Edge Computing. Edge Computing moves compute to where the data is generated, and therefore shortens the "time to value," that is, the time it takes to glean insight from that data.
At first glance, Edge Computing appears very homogeneous, encompassing all activities performed outside of the "Core," typically the location of the primary corporatewide IT (information technology) infrastructure. In fact, Edge Computing is a multitiered mix of assets arranged in a use-case- and workload-centric fashion. An "intelligent" Edge Tier is a crucial link between the Core and Endpoints, providing distributed compute, data persistence, and network aggregation…
Digital transformation (referred to as DX, for short) is a journey that most organizations have to undertake in order to future-proof their existence. Organizations embark on DX initiatives to achieve business growth, gain competitive advantage, and deliver improved customer experiences. For many organizations, applications and workloads are at the center of their DX efforts, given that so much of their business is conducted and enhanced via applications. For these organizations, cloud is a crucial foundation for their DX initiatives. Cloud shapes enterprise IT service strategies and, therefore, influences the outcome of DX initiatives. Organizations that fully embrace the distributed infrastructure model supported by cloud and hybrid IT treat IT as a core competency – one that drives new sources of competitive differentiation while also supporting ongoing business processes.
This is a series of blogs that explores the use of NPS and addresses some of the criticisms that have been leveled against it over the years. NPS can be an effective tool in an enterprise storage vendor's arsenal, but how effectively a vendor designs and implements its "NPS program" will determine its success with it.
More reaction to criticisms leveled against the use of NPS to evaluate customer experience…
Experience and NPS score rarely match. For an NPS score (or any tool, for that matter) to have predictive value, it should consistently forecast some outcome. With NPS, the assumption is that a high number indicates lots of happy customers, and prospective customers looking at that number might expect that if they purchase the product or service, they too will become happy customers. I don't know that I would…
Over the last several years, there have been several startup enterprise storage vendors that have built a program around the Net Promoter Score (NPS) customer satisfaction metric that has generated a significantly differentiated customer experience. Early vendors to use this metric very successfully were Nimble Storage (now HPE), Nutanix, and Pure Storage. I looked at the NPS metric back in 2016 with a document entitled Why Enterprise Storage Managers Need to Understand the Net Promoter Score (I…
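The mechanics behind the score itself are simple and well documented: survey respondents rate their likelihood to recommend on a 0–10 scale, 9s and 10s count as promoters, 0 through 6 count as detractors, and NPS is the percentage of promoters minus the percentage of detractors. A minimal sketch of that calculation (the function name and rounding choice are mine, not part of any vendor's program):

```python
def nps(scores):
    """Compute a Net Promoter Score from 0-10 survey responses.

    Promoters score 9-10, detractors 0-6; passives (7-8) count
    toward the response total but toward neither bucket.
    """
    if not scores:
        raise ValueError("no responses")
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    # NPS is expressed on a -100..+100 scale.
    return round(100 * (promoters - detractors) / len(scores))
```

For example, `nps([10, 9, 9, 8, 7, 6, 3, 10])` returns 25: four promoters and two detractors out of eight responses. Note how the passives (7s and 8s) dilute the score without moving it, which is one reason two vendors with very different customer bases can report similar numbers.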
The IDC AFA MarketScape evaluated 10 vendors' enterprise storage platforms on their ability to meet requirements for dense mixed workload consolidation that includes at least some mission-critical applications. In this rapidly maturing market, there is still much to differentiate vendors. This document should provide food for thought for both customers and vendors alike.
In mid-December, IDC released the IDC MarketScape: Worldwide All-Flash Array 2017 Vendor Assessment (IDC, December 2017). Given the state of market maturity in the AFA space, it was necessary to focus the assessment narrowly on arrays that were specifically sold for dense mixed workload consolidation that included at least some mission-critical applications. Many AFA vendors now have a broad portfolio of AFA platforms, targeting each at different types of workloads and customers. Other…
I had a chance to spend a few days at the Flash Memory Summit in Santa Clara this year, and this blog highlights some of the recent announcements in the AFA space from the show. NVMe was a major theme of the show, and we are seeing more enterprise storage vendors announce NVMe-based features, products and roadmaps.
At the Flash Memory Summit at the Santa Clara Convention Center this year, NVMe technology was a mainstream theme. IDC research indicates that 48% of enterprises already have NVMe deployed in some manner in their IT shops, but 99%+ of this is local storage that was purchased aftermarket and configured into PCIe slots on commodity x86 servers. While there were several rack scale flash vendors at the show (Apeiron Data Systems, E8 Storage, Excelero), the rack scale flash market is still an…
Last week, I explored some of the key issues and core benefits that are prompting enterprises to move to more flexible and cost-effective composable infrastructures. As I pointed out in Part 1 of this blog, composable infrastructure technologies from vendors like TidalScale are designed to address many of the most pressing issues in today’s data centers, such as the rapid growth of data, the challenge of accommodating unpredictable workloads with traditional servers and rack systems, and the inherent inefficiency and outright waste that comes from provisioning servers that cannot meet the needs of new-generation applications, or that are dedicated to running just one application. In this part, I will review the role of software-defined resources in ensuring that composable data centers are a realistic and cost-effective end goal for enterprise digital transformation.
How essential are software-defined resources to the composable data center?
A self-imposed mandate across most industries to transform digitally is fueling demand for a new infrastructure architecture. Lines of business are mandating that IT organizations adopt a software-defined and service-centric approach to speed up IT provisioning, optimize application performance, and increase IT efficiency. The endgame will be to run the business with processes and operations that are…
Software-defined storage (SDS) is a high-growth area that is bringing some strong benefits – better agility, easier storage management, and reduced capex – to those IT organizations with the requisite skill to deploy it effectively. Some recent acquisitions in the SDS market – Nutanix bought PernixData and Red Hat bought Permabit – highlight a cautionary adage I often heard when working with venture capitalists in the past, one that is particularly relevant for software products. When evaluating the future prospects of a funding opportunity, VCs want to understand whether a new business idea is a standalone product or is really just a feature that will quickly be integrated into a platform (presumably owned and shipped by someone else).
New approaches to infrastructure design are required for businesses to keep up with the amount of data that is generated, and whose timely analysis is of paramount importance for the business to remain competitive in the digital economy. Newer approaches to infrastructure must focus on efficiency to minimize budgetary shocks on IT departments, and agility to respond to business needs on-demand. Businesses are embracing new-generation applications to prepare themselves for the future, while maintaining current-gen applications that support revenue-generating operations.
Composable infrastructure technologies from vendors like TidalScale are designed with these key objectives in mind. They are designed to support both current- and new-generation applications, thus enabling IT to better serve revenue-generating operations while also supporting the business's foray into the future. Crucially, composable solutions are software defined, and they maximize return on investment in server hardware by pooling compute, memory, and disk resources for maximum efficiency, utilization, and visibility across the entire datacenter, not just a cluster of servers.
How does composable infrastructure add value in today’s modern IT environment?
Current-generation IT infrastructure can be rigid and siloed, making it difficult for IT to deliver quickly on the demands of new-generation applications going into production. As businesses embrace NGAs, they are adopting an application-centric approach to IT — building environments that require new levels of scale, automation, and flexibility. This model means a shift from a static and inflexible infrastructure to…
IoT is rapidly bridging the IT–OT divide. Data is no longer just under the purview of IT. Smart and connected devices, which are under the purview of OT, enable data collection, control, and actuation, and enable additional IT-centric applications. The need to collect, store, and analyze data in a cost-efficient and timely manner means that IT and OT architecture and operations models need to converge and coexist. Software-defined OT (SD-OT) and IT–OT convergence are part of an "Intelligent Edge." Converged IT/OT systems minimize data transfer between the core and edge, and carry out OT and IT functions seamlessly. SD-OT moves OT functions into software running on industry-standard hardware. OT control and data acquisition functions are network-based and can be performed from the Core or anywhere at the Edge. Converging IT and OT means running IT and OT software on the same core and edge infrastructure tier, and possibly on the same physical hardware.
This is an excerpt from an IDC Perspective posted on idc.com on the topic of SD-OT and the Intelligent Edge.
Firms embark on Digital Transformation (DX) initiatives by embracing a data-driven, analytics-first approach to improve business processes and increase operational efficiencies; better understand customer behavior and preferences; build deeper customer, supplier, and/or partner relationships; and, more importantly, be prudent about how they gain insight from the data they…
About this channel
IDC's Infrastructure and Data Management Blog is the home for IDC storage analysts to share their thoughts on technology, market and industry trends, announcements, movers and shakers, innovative ideas, and recent research.
- Ashish Nadkarni